Sample records for source code modifications

  1. SiC JFET Transistor Circuit Model for Extreme Temperature Range

    NASA Technical Reports Server (NTRS)

    Neudeck, Philip G.

    2008-01-01

    A technique for simulating extreme-temperature operation of integrated circuits that incorporate silicon carbide (SiC) junction field-effect transistors (JFETs) has been developed. The technique involves modification of NGSPICE, which is an open-source version of the popular Simulation Program with Integrated Circuit Emphasis (SPICE) general-purpose analog-integrated-circuit-simulating software. NGSPICE in its unmodified form is used for simulating and designing circuits made from silicon-based transistors that operate at or near room temperature. Two rapid modifications of NGSPICE source code enable SiC JFETs to be simulated to 500 °C using the well-known Level 1 model for silicon metal oxide semiconductor field-effect transistors (MOSFETs). First, the default value of the MOSFET surface potential must be changed. In the unmodified source code, this parameter has a value of 0.6, which corresponds to slightly more than half the bandgap of silicon. In NGSPICE modified to simulate SiC JFETs, this parameter is changed to a value of 1.6, corresponding to slightly more than half the bandgap of SiC. The second modification consists of changing the temperature dependence of MOSFET transconductance and saturation parameters. The unmodified NGSPICE source code implements a T^(-1.5) temperature dependence for these parameters. In order to mimic the temperature behavior of experimental SiC JFETs, a T^(-1.3) temperature dependence must be implemented in the NGSPICE source code. Following these two simple modifications, the Level 1 MOSFET model of the NGSPICE circuit simulation program reasonably approximates the measured high-temperature behavior of experimental SiC JFETs properly operated with zero or reverse bias applied to the gate terminal. Modification of additional silicon parameters in the NGSPICE source code was not necessary to model experimental SiC JFET current-voltage performance across the entire temperature range from 25 to 500 °C.
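    The two changes described above can be sketched in a few lines. This is an illustrative Python model of the Level 1 temperature scaling, not actual NGSPICE source (NGSPICE itself is written in C); the function and constant names are hypothetical.

```python
# Illustrative model of the two NGSPICE changes described above;
# names are hypothetical, not taken from the NGSPICE sources.
T_NOM = 300.15  # SPICE nominal temperature in kelvin (27 C)

def scale_transconductance(kp_nom, temp_k, exponent=-1.5):
    """Scale the transconductance/saturation parameters with temperature.

    Unmodified NGSPICE applies a (T/Tnom)^-1.5 law; the SiC JFET
    modification changes the exponent to -1.3.
    """
    return kp_nom * (temp_k / T_NOM) ** exponent

# Default MOSFET surface potential: 0.6 (about half the Si bandgap);
# the SiC modification raises it to 1.6 (about half the SiC bandgap).
PHI_SILICON, PHI_SIC = 0.6, 1.6

T_500C = 773.15
kp_si = scale_transconductance(2.0e-5, T_500C)         # stock silicon law
kp_sic = scale_transconductance(2.0e-5, T_500C, -1.3)  # SiC law
```

    Because the -1.3 exponent decays more slowly, the modified model predicts less parameter roll-off at 500 °C than the stock silicon law.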

  2. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
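    The DPCM idea underlying the scheme can be illustrated briefly; this is a generic sketch of predictive coding, not the authors' edge-preserving algorithm:

```python
# Generic lossless DPCM sketch: encode each sample as its difference
# from the previous sample; decoding accumulates the residuals.
def dpcm_encode(samples):
    prev, residuals = 0, []
    for x in samples:
        residuals.append(x - prev)  # residuals are small for smooth data
        prev = x
    return residuals

def dpcm_decode(residuals):
    prev, out = 0, []
    for r in residuals:
        prev += r                   # exact reconstruction: lossless
        out.append(prev)
    return out

data = [10, 12, 13, 13, 11]
assert dpcm_decode(dpcm_encode(data)) == data
```

    A lossy variant quantizes the residuals before transmission, which is where the lossy/lossless switch in such schemes comes from.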

  3. FEDEF: A High Level Architecture Federate Development Framework

    DTIC Science & Technology

    2010-09-01

    require code changes for operability between HLA specifications. Configuration of federate requirements such as publications, subscriptions, time ... management, and management protocol should occur outside of federate source code, allowing for federate reusability without code modification and re

  4. Image authentication using distributed source coding.

    PubMed

    Lin, Yao-Chung; Varodayan, David; Girod, Bernd

    2012-01-01

    We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.

  5. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    ERIC Educational Resources Information Center

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is getting more necessary in the course of program design in college education. However, some students plagiarize others' work and submit it with only minor modifications, and it is not easy for teachers to judge whether source code has been plagiarized. Traditional detection algorithms cannot fit this…
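    A minimal token-overlap measure of the kind such detectors build on can be sketched as follows (illustrative only; the paper's algorithm is more elaborate, and the function names here are hypothetical):

```python
# Jaccard similarity over token n-grams: lightly modified copies keep
# most n-grams intact, so they still score close to 1.0.
def ngrams(tokens, n=3):
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(code_a, code_b, n=3):
    a, b = ngrams(code_a.split(), n), ngrams(code_b.split(), n)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)
```

    Real systems additionally normalize identifiers and comments so that renaming variables, the most common "little modification", does not lower the score.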

  6. Mod3DMT and EMTF: Free Software for MT Data Processing and Inversion

    NASA Astrophysics Data System (ADS)

    Egbert, G. D.; Kelbert, A.; Meqbel, N. M.

    2017-12-01

    "ModEM" was developed at Oregon State University as a modular system for inversion of electromagnetic (EM) geophysical data (Egbert and Kelbert, 2012; Kelbert et al., 2014). Although designed for more general (frequency domain) EM applications, and originally intended as a testbed for exploring inversion search and regularization strategies, our own initial uses of ModEM were for 3-D imaging of the deep crust and upper mantle at large scales. Since 2013 we have offered a version of the source code suitable for 3D magnetotelluric (MT) inversion on an "as is, user beware" basis for free for non-commercial applications. This version, which we refer to as Mod3DMT, has since been widely used by the international MT community. Over 250 users have registered to download the source code, and at least 50 MT studies in the refereed literature, covering locations around the globe at a range of spatial scales, cite use of ModEM for 3D inversion. For over 30 years I have also made MT processing software available for free use. In this presentation, I will discuss my experience with these freely available (but perhaps not truly open-source) computer codes. Although users are allowed to make modifications to the codes (on conditions that they provide a copy of the modified version) only a handful of users have tried to make any modification, and only rarely are modifications even reported, much less provided back to the developers.

  7. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.

  8. Flow Instability Tests for a Particle Bed Reactor Nuclear Thermal Rocket Fuel Element

    DTIC Science & Technology

    1993-05-01

    2.0 with GWBASIC or higher (DOS 5.0 was installed on the machine). Since the source code was written in BASIC, it was easy to make modifications...

  9. Locally adaptive vector quantization: Data compression with feature preservation

    NASA Technical Reports Server (NTRS)

    Cheung, K. M.; Sayano, M.

    1992-01-01

    A study of a locally adaptive vector quantization (LAVQ) algorithm for data compression is presented. This algorithm provides high-speed one-pass compression, is fully adaptable to any data source, and does not require a priori knowledge of the source statistics; LAVQ is therefore a universal data compression algorithm. The basic algorithm and several modifications to improve performance are discussed. These modifications are nonlinear quantization, coarse quantization of the codebook, and lossless compression of the output. Performance of LAVQ on various images using irreversible (lossy) coding is comparable to that of the Linde-Buzo-Gray algorithm, but LAVQ has a much higher speed; thus this algorithm has potential for real-time video compression. Unlike most other image compression algorithms, LAVQ preserves fine detail in images. LAVQ's performance as a lossless data compression algorithm is comparable to that of Lempel-Ziv-based algorithms, but LAVQ uses far less memory during the coding process.
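    The one-pass adaptive idea can be sketched as follows (a toy, not the paper's LAVQ; threshold-based codebook growth is one simple way to adapt without prior source statistics):

```python
# Toy one-pass adaptive VQ: code each vector by its nearest codeword,
# adding the vector itself as a new codeword when nothing is close.
def quantize(vectors, threshold):
    codebook, indices = [], []
    for v in vectors:
        best, best_d = None, float("inf")
        for i, c in enumerate(codebook):
            d = sum((a - b) ** 2 for a, b in zip(v, c))
            if d < best_d:
                best, best_d = i, d
        if best is None or best_d > threshold:
            codebook.append(v)           # adapt to the source on the fly
            best = len(codebook) - 1
        indices.append(best)
    return codebook, indices
```

    Because the codebook grows only where the data demands it, fine detail (vectors far from every existing codeword) is preserved rather than averaged away.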

  10. Distributed single source coding with side information

    NASA Astrophysics Data System (ADS)

    Vila-Forcen, Jose E.; Koval, Oleksiy; Voloshynovskiy, Sviatoslav V.

    2004-01-01

    In this paper we advocate an image compression technique in the scope of the distributed source coding framework. The novelty of the proposed approach is twofold: classical image compression is considered from the position of source coding with side information and, contrary to existing scenarios where side information is given explicitly, side information is created based on a deterministic approximation of local image features. We consider an image in the transform domain as a realization of a source with a bounded codebook of symbols, where each symbol represents a particular edge shape. The codebook is image independent and plays the role of an auxiliary source. Due to the partial availability of side information at both encoder and decoder, we treat our problem as a modification of the Berger-Flynn-Gray problem and investigate the possible gain over solutions where side information is either unavailable or available only at the decoder. Finally, we present a practical compression algorithm for passport photo images based on our concept that demonstrates superior performance in the very low bit rate regime.
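    The core idea of coding with side information can be shown with a toy coset code (illustrative; the paper's scheme operates on transform-domain edge approximations, not small integers):

```python
# Toy source coding with side information: send only x's coset index;
# the decoder picks the coset member nearest its side information y.
def encode(x, cosets=4):
    return x % cosets                  # 2 bits instead of 3 for x in 0..7

def decode(syndrome, y, alphabet=8, cosets=4):
    candidates = range(syndrome, alphabet, cosets)
    return min(candidates, key=lambda c: abs(c - y))

x, y = 6, 5                            # y is correlated with x
assert decode(encode(x), y) == x       # recovered despite the short code
```

    Decoding succeeds whenever the correlation noise |x - y| is smaller than half the coset spacing, which is exactly the rate/robustness trade-off such schemes exploit.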

  11. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
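    The essence of code verification against a known solution can be shown in miniature (a generic order-of-accuracy check, not EVA itself):

```python
# Verify a 2nd-order central difference by checking that its observed
# order of accuracy matches its formal order.
import math

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

def observed_order(f, dfdx, x, h):
    e1 = abs(central_diff(f, x, h) - dfdx(x))
    e2 = abs(central_diff(f, x, h / 2) - dfdx(x))
    return math.log(e1 / e2) / math.log(2)   # halving h should quarter e

p = observed_order(math.sin, math.cos, 1.0, 0.1)
assert 1.9 < p < 2.1   # scheme behaves at its formal order: verified
```

    EVA and MMS both generalize this idea to full PDE codes; the abstract's point is that EVA does so without injecting extra source terms into the code under test.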

  13. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
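    Late integration as described can be sketched in a few lines (an illustrative toy, not the paper's meta-learner, which is trained on held-out predictions):

```python
# Toy late data integration: one probability per data source for a
# given ICD-9-CM code, fused by a meta-level weighting.
def meta_weights(val_accuracies):
    total = sum(val_accuracies)
    return [a / total for a in val_accuracies]   # weight reliable sources up

def late_integrate(source_probs, weights):
    return sum(w * p for w, p in zip(weights, source_probs))

# e.g. the structured-data model validated better than the text model
w = meta_weights([0.80, 0.60])
fused = late_integrate([0.9, 0.2], w)   # one fused score per code
```

    Early integration would instead concatenate the structured and textual features and train a single model on the combined vector.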

  14. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    NASA Astrophysics Data System (ADS)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards for code quality requires automated tools that relieve developers of repetitive and tedious tasks such as compilation on different platforms and configurations, unit and end-to-end testing, and generation of distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited; testing developed code on more than the developer's PC is therefore a task that is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request or by email on success or failure with detailed reports, requesting improvements to the modifications when necessary. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps entry barriers to getting involved in the project low and permits an agile development process concentrating on feature additions rather than software maintenance procedures.

  15. Python Source Code Plagiarism Attacks on Introductory Programming Course Assignments

    ERIC Educational Resources Information Center

    Karnalim, Oscar

    2017-01-01

    This paper empirically lists Python plagiarism attacks found in Introductory Programming course assignments for undergraduate students. Based on our observation of 400 suspected plagiarism cases, there are 35 plagiarism attacks that have been conducted by students. It starts with comment & whitespace modification as…

  16. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper promotes research toward a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based process is similar to a precursor one-dimensional combustor simulation that was formatted as FORTRAN 77 source code. The previous simulation process required modification of the FORTRAN 77 source code, compiling, and linking whenever a new combustor simulation executable file was created. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation and also includes results for a default simulation included with the source code.

  17. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
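    The flavor of such trace-free cache prediction can be conveyed with a small direct-mapped cache model driven by the address stream a loop nest implies (an illustrative sketch, not the paper's methodology):

```python
# Count misses in a direct-mapped cache for a synthetic address stream.
def misses(addresses, n_lines=16, block=8):
    tags = [None] * n_lines
    count = 0
    for a in addresses:
        line = (a // block) % n_lines
        tag = a // (block * n_lines)
        if tags[line] != tag:          # cold or conflict miss
            tags[line] = tag
            count += 1
    return count

N = 64
row_major = [i * N + j for i in range(N) for j in range(N)]
col_major = [i * N + j for j in range(N) for i in range(N)]
assert misses(row_major) < misses(col_major)   # locality pays off
```

    This is the kind of "what-if" question (row-major versus column-major traversal) the methodology answers without gathering full address traces from the real machine.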

  18. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications utilizing scientific software in the past 5 years that is available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software either through citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows to cite codes is a barrier to effectively implementing the emerging citation norms.
Generated on-demand attributions on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.

  19. 40 CFR 52.1824 - Review of new sources and modifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) North Dakota § 52.1824... the language contained in the North Dakota Administrative Code on the use of the EPA “Guideline on Air Quality Models” as supplemented by the “North Dakota Guideline for Air Quality Modeling Analysis”. In a...

  20. 40 CFR 52.1824 - Review of new sources and modifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) North Dakota § 52.1824... the language contained in the North Dakota Administrative Code on the use of the EPA “Guideline on Air Quality Models” as supplemented by the “North Dakota Guideline for Air Quality Modeling Analysis”. In a...

  21. 40 CFR 52.1824 - Review of new sources and modifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) North Dakota § 52.1824... the language contained in the North Dakota Administrative Code on the use of the EPA “Guideline on Air Quality Models” as supplemented by the “North Dakota Guideline for Air Quality Modeling Analysis”. In a...

  22. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.

  23. Evaluation of Diagnostic Codes in Morbidity and Mortality Data Sources for Heat-Related Illness Surveillance

    PubMed Central

    Watkins, Sharon

    2017-01-01

    Objectives: The primary objective of this study was to identify patients with heat-related illness (HRI) using codes for heat-related injury diagnosis and external cause of injury in 3 administrative data sets: emergency department (ED) visit records, hospital discharge records, and death certificates. Methods: We obtained data on ED visits, hospitalizations, and deaths for Florida residents for May 1 through October 31, 2005-2012. To identify patients with HRI, we used codes from the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to search data on ED visits and hospitalizations and codes from the International Classification of Diseases, Tenth Revision (ICD-10) to search data on deaths. We stratified the results by data source and whether the HRI was work related. Results: We identified 23 981 ED visits, 4816 hospitalizations, and 140 deaths in patients with non–work-related HRI and 2979 ED visits, 415 hospitalizations, and 23 deaths in patients with work-related HRI. The most common diagnosis codes among patients were for severe HRI (heat exhaustion or heatstroke). The proportion of patients with a severe HRI diagnosis increased with data source severity. If ICD-9-CM code E900.1 and ICD-10 code W92 (excessive heat of man-made origin) were used as exclusion criteria for HRI, 5.0% of patients with non–work-related deaths, 3.0% of patients with work-related ED visits, and 1.7% of patients with work-related hospitalizations would have been removed. Conclusions: Using multiple data sources and all diagnosis fields may improve the sensitivity of HRI surveillance. Future studies should evaluate the impact of converting ICD-9-CM to ICD-10-CM codes on HRI surveillance of ED visits and hospitalizations. PMID:28379784
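    The case-definition step can be sketched as follows (illustrative; the study's actual code lists are broader, and the constants below are assumptions based on the codes named in the abstract):

```python
# Flag a record as heat-related illness (HRI) from its diagnosis codes.
HEAT_PREFIX = "992"    # ICD-9-CM 992.x, effects of heat and light (assumed)
MAN_MADE = "E900.1"    # excessive heat of man-made origin (from the abstract)

def is_hri(diagnosis_codes, exclude_man_made=False):
    if exclude_man_made and MAN_MADE in diagnosis_codes:
        return False   # the exclusion scenario evaluated in the study
    return any(c.startswith(HEAT_PREFIX) for c in diagnosis_codes)

assert is_hri(["992.0", "276.51"]) is True     # heatstroke + another code
assert is_hri(["486"]) is False                # no heat-related code
assert is_hri(["992.5", "E900.1"], exclude_man_made=True) is False
```

    Searching all diagnosis fields rather than only the principal one is what the abstract means by improving the sensitivity of surveillance.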

  24. Sulfur Dioxide (SO2) Emissions From Onshore Natural Gas Processing for Which Construction, Reconstruction, or Modification Commenced After January 20, 1984, and on or Before August 23, 2011: New Source Performance Standards (NSPS)

    EPA Pesticide Factsheets

    Learn more about the NSPS regulation for SO2 emissions from onshore natural gas processing by reading the rule history, rule summary, federal register notices, and the Code of Federal Regulations subpart.

  25. The influence of a wall function on turbine blade heat transfer prediction

    NASA Technical Reports Server (NTRS)

    Whitaker, Kevin W.

    1989-01-01

    The second phase of a continuing investigation to improve the prediction of turbine blade heat transfer coefficients was completed. The present study specifically investigated how a numeric wall function in the turbulence model of a two-dimensional boundary layer code, STAN5, affected heat transfer prediction capabilities. Several sources of inaccuracy in the wall function were identified and then corrected or improved. Heat transfer coefficient predictions were then obtained using each of the modifications to determine its effect. Results indicated that the modifications made to the wall function can significantly affect the prediction of heat transfer coefficients on turbine blades. The improvement in accuracy due to the modifications remains inconclusive and is still being investigated.
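    For context, the log-law wall function at the heart of such models looks like this (a generic textbook form with commonly used constants, not STAN5's actual implementation):

```python
# Standard smooth-wall log-law wall function: u+ = ln(E*y+)/kappa in
# the log layer, u+ = y+ in the viscous sublayer.
import math

KAPPA, E = 0.41, 9.0       # von Karman constant and smooth-wall
                           # parameter; typical values, assumed here

def u_plus(y_plus):
    if y_plus < 11.0:      # approximate sublayer/log-layer crossover
        return y_plus
    return math.log(E * y_plus) / KAPPA
```

    Errors in the crossover point, the constants, or the implied thermal wall function propagate directly into predicted heat transfer coefficients, which is why correcting the wall function matters.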

  26. DATAMAP upgrade version 4.0

    NASA Technical Reports Server (NTRS)

    Watts, Michael E.; Dejpour, Shabob R.

    1989-01-01

    The changes made on the data analysis and management program DATAMAP (Data from Aeromechanics Test and Analytics - Management and Analysis Package) are detailed. These changes are made to Version 3.07 (released February, 1981) and are called Version 4.0. Version 4.0 improvements were performed by Sterling Software under contract to NASA Ames Research Center. The increased capabilities instituted in this version include the breakout of the source code into modules for ease of modification, addition of a more accurate curve fit routine, ability to handle higher frequency data, additional data analysis features, and improvements in the functionality of existing features. These modifications will allow DATAMAP to be used on more data sets and will make future modifications and additions easier to implement.

  27. Supersonic propulsion simulation by incorporating component models in the large perturbation inlet (LAPIN) computer code

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Richard, Jacques C.

    1991-01-01

    An approach to simulating the internal flows of supersonic propulsion systems is presented. The approach is based on a fairly simple modification of the Large Perturbation Inlet (LAPIN) computer code. LAPIN uses a quasi-one dimensional, inviscid, unsteady formulation of the continuity, momentum, and energy equations. The equations are solved using a shock capturing, finite difference algorithm. The original code, developed for simulating supersonic inlets, includes engineering models of unstart/restart, bleed, bypass, and variable duct geometry, by means of source terms in the equations. The source terms also provide a mechanism for incorporating, with the inlet, propulsion system components such as compressor stages, combustors, and turbine stages. This requires each component to be distributed axially over a number of grid points. Because of the distributed nature of such components, this representation should be more accurate than a lumped parameter model. Components can be modeled by performance map(s), which in turn are used to compute the source terms. The general approach is described. Then, simulation of a compressor/fan stage is discussed to show the approach in detail.
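    The source-term mechanism can be illustrated with a minimal 1-D update (a sketch under simplifying assumptions, not LAPIN's actual shock-capturing scheme):

```python
# Explicit update of a 1-D conservation law with an added source term
# s[i]; distributing s over several grid points is how a component
# (bleed, compressor stage, combustor heat addition) enters the duct.
def step(u, flux, source, dx, dt):
    new = u[:]                        # boundary values held fixed
    for i in range(1, len(u) - 1):
        dflux = (flux(u[i + 1]) - flux(u[i - 1])) / (2 * dx)
        new[i] = u[i] + dt * (-dflux + source[i])
    return new
```

    A compressor stage modeled from a performance map would fill `source[i]` over the axial stations the stage occupies, rather than acting as a single lumped jump.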

  28. Equipment Leaks of Volatile Organic Compounds From Onshore Natural Gas Processing Plants for Which Construction, Reconstruction, or Modification Commenced After January 20, 1984, and on or Before August 23, 2011: New Source Performance Standards (NSPS)

    EPA Pesticide Factsheets

    Learn about the NSPS regulation for equipment leaks of Volatile Organic Compounds (VOC) from onshore natural gas processing plants by reading the rule summary, rule history, federal register citations, and the Code of Federal Regulations.

  29. Evaluation of Grid Modification Methods for On- and Off-Track Sonic Boom Analysis

    NASA Technical Reports Server (NTRS)

    Nayani, Sudheer N.; Campbell, Richard L.

    2013-01-01

    Grid modification methods have been under development at NASA to enable better predictions of low boom pressure signatures from supersonic aircraft. As part of this effort, two new codes, Stretched and Sheared Grid - Modified (SSG) and Boom Grid (BG), have been developed in the past year. The CFD results from these codes have been compared with those from the earlier grid modification codes Stretched and Sheared Grid (SSGRID) and Mach Cone Aligned Prism (MCAP), and also with the available experimental results. NASA's unstructured grid suite of software TetrUSS and the automatic sourcing code AUTOSRC were used for base grid generation and flow solutions. The BG method has been evaluated on three wind tunnel models. Pressure signatures have been obtained up to two body lengths below a Gulfstream aircraft wind tunnel model. Good agreement with the wind tunnel results has been obtained for both on-track and off-track (up to 53 degrees) cases. On-track pressure signatures up to ten body lengths below a Straight Line Segmented Leading Edge (SLSLE) wind tunnel model have been extracted. Good agreement with the wind tunnel results has been obtained. Pressure signatures have been obtained at 1.5 body lengths below a Lockheed Martin aircraft wind tunnel model. Good agreement with the wind tunnel results has been obtained for both on-track and off-track (up to 40 degrees) cases. Grid sensitivity studies have been carried out to investigate any grid size related issues. Methods have been evaluated for fully turbulent, mixed laminar/turbulent and fully laminar flow conditions.

  10. IonRayTrace: An HF Propagation Model for Communications and Radar Applications

    DTIC Science & Technology

    2014-12-01

    ...for modeling the impact of ionosphere variability on detection algorithms. Modification of IonRayTrace's source code to include flexible gridding and... Ionosphere for its environmental background [3]. IonRayTrace's operation is summarized briefly in Section 3. However, the scope of this document is primarily...

  11. Prediction of effects of wing contour modifications on low-speed maximum lift and transonic performance for the EA-6B aircraft

    NASA Technical Reports Server (NTRS)

    Allison, Dennis O.; Waggoner, E. G.

    1990-01-01

    Computational predictions of the effects of wing contour modifications on maximum lift and transonic performance were made and verified against low-speed and transonic wind tunnel data. This effort was part of a program to improve the maneuvering capability of the EA-6B electronic countermeasures aircraft, which evolved from the A-6 attack aircraft. The predictions were based on results from three computer codes, all of which include viscous effects: MCARF, a 2-D subsonic panel code; TAWFIVE, a transonic full potential code; and WBPPW, a transonic small disturbance potential flow code. The modifications were previously designed with the aid of these and other codes. The wing modifications consist of contour changes to the leading-edge slats and trailing-edge flaps and were designed for increased maximum lift with minimal effect on transonic performance. The predictions of the effects of the modifications are presented, with emphasis on verification through comparisons with wind tunnel data from the National Transonic Facility. Attention is focused on increments in low-speed maximum lift and increments in transonic lift, pitching moment, and drag resulting from the contour modifications.

  12. Boltzmann Transport Code Update: Parallelization and Integrated Design Updates

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.

    2003-01-01

    The ongoing effort to develop a web site for radiation analysis is expected to result in increased usage of the High Charge and Energy Transport Code HZETRN, so the requested calculations should be performed quickly and efficiently. This raised the question, "Could the implementation of parallel processing speed up the required calculations?" To answer this question, two modifications of the HZETRN computer code were created. The first modification used the shield materials Al(2219), then polyethylene, and then Al(2219); the modified Fortran code was labeled 1SSTRN.F. The second modification considered the shield materials CO2 and Martian regolith; this modified Fortran code was labeled MARSTRN.F.
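    The parallelization question above can be illustrated with a minimal sketch: when each energy group (or ray) is an independent transport calculation, the work fans out naturally to worker processes. Everything here is hypothetical illustration; `transport_step` is a toy stand-in, not the real HZETRN solver (which is Fortran).

    ```python
    import math
    from multiprocessing import Pool

    def transport_step(energy_group):
        """Toy stand-in for one independent transport calculation
        (hypothetical; the real HZETRN solver is Fortran)."""
        mu = 0.1 * (energy_group + 1)       # fake attenuation coefficient
        return energy_group, math.exp(-mu)  # flux after a unit-thickness slab

    def run_parallel(n_groups, workers=4):
        """Fan the independent per-group calculations out to worker processes."""
        with Pool(workers) as pool:
            return dict(pool.map(transport_step, range(n_groups)))

    if __name__ == "__main__":
        fluxes = run_parallel(8)
        print(len(fluxes))
    ```

    Whether such a decomposition actually speeds up HZETRN depends on how independent the transport sweeps really are; the sketch only shows the embarrassingly parallel case.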

  13. Code Development in Coupled PARCS/RELAP5 for Supercritical Water Reactor

    DOE PAGES

    Hu, Po; Wilson, Paul

    2014-01-01

    A new capability is added to the existing coupled code package PARCS/RELAP5 in order to analyze the SCWR design under supercritical pressure with separated water coolant and moderator channels. This expansion is carried out in both codes. In PARCS, modification is focused on extending the water property tables to supercritical pressure, modifying the variable mapping input file and the related code module for processing thermal-hydraulic information from the separated coolant/moderator channels, and modifying the neutronics feedback module to handle the separated channels. In RELAP5, modification is focused on incorporating more accurate water properties near SCWR operation/transient pressures and temperatures into the code. Confirming tests of the modifications are presented, and the major results from the extended code package are summarized.
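    Extending a water property table past the critical point amounts to adding rows and interpolating over the wider range, which can be pictured with a small sketch. The numbers below are purely illustrative placeholders, not IAPWS steam-table data, and the function name is our own.

    ```python
    import bisect

    # Illustrative pressure/density pairs extended past the critical
    # pressure (~22.1 MPa); NOT real steam-table data.
    PRESSURES = [10.0, 15.0, 20.0, 22.1, 25.0, 30.0]        # MPa
    DENSITIES = [688.4, 603.5, 490.2, 322.0, 167.0, 127.0]  # kg/m^3

    def density(p_mpa):
        """Linearly interpolate density from the extended table."""
        if not PRESSURES[0] <= p_mpa <= PRESSURES[-1]:
            raise ValueError("pressure outside extended table range")
        i = min(bisect.bisect_right(PRESSURES, p_mpa) - 1, len(PRESSURES) - 2)
        frac = (p_mpa - PRESSURES[i]) / (PRESSURES[i + 1] - PRESSURES[i])
        return DENSITIES[i] + frac * (DENSITIES[i + 1] - DENSITIES[i])
    ```

    Near the critical point the properties vary steeply, which is why the paper stresses incorporating more accurate data there; a real implementation would use a much finer table or an equation-of-state formulation rather than coarse linear interpolation.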

  14. Intrinsic Radiation Source Generation with the ISC Package: Data Comparisons and Benchmarking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solomon, Clell J. Jr.

    The characterization of radioactive emissions from unstable isotopes (intrinsic radiation) is necessary for shielding and radiological-dose calculations involving radioactive materials. While most radiation transport codes, e.g., MCNP [X-5 Monte Carlo Team, 2003], provide the capability to input user-prescribed source definitions, such as radioactive emissions, they do not provide the capability to calculate the correct radioactive-source definition from the material compositions. Special modifications to MCNP have been developed in the past to allow the user to specify an intrinsic source, but these modifications have not been implemented into the primary source base [Estes et al., 1988]. To facilitate the description of the intrinsic radiation source from a material with a specific composition, the Intrinsic Source Constructor library (LIBISC) and the MCNP Intrinsic Source Constructor (MISC) utility have been written; the combination of LIBISC and MISC is herein referred to as the ISC package. LIBISC is a statically linkable C++ library that provides the functionality needed to construct the intrinsic-radiation source generated by a material. Furthermore, LIBISC provides the ability to use different particle-emission, radioactive-decay, and natural-abundance databases, allowing the user flexibility in the specification of the source if one database is preferred over others. LIBISC also provides functionality for aging materials and for producing a thick-target bremsstrahlung photon source approximation from the electron emissions. The MISC utility links to LIBISC and facilitates the description of intrinsic-radiation sources in a format directly usable with the MCNP transport code. Through a series of input keywords and arguments, the MISC user can specify the material, age the material if desired, and produce a source description of the radioactive emissions from the material in an MCNP-readable format. Further details on using the MISC utility can be obtained from the user guide [Solomon, 2012]. The remainder of this report presents a discussion of the databases available to LIBISC and MISC, a discussion of the models employed by LIBISC, a comparison of the thick-target bremsstrahlung model employed, a benchmark comparison to plutonium and depleted-uranium spheres, and a comparison of the available particle-emission databases.
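    The "aging" step such a library performs can be sketched for the single-isotope case with simple exponential decay; a decay chain would require the Bateman equations. The function below is our illustration of the idea, not LIBISC code.

    ```python
    import math

    def aged_activity(n0_atoms, half_life_s, age_s):
        """Activity A(t) = lambda * N0 * exp(-lambda * t) of one isotope
        after aging a material for `age_s` seconds (single-isotope sketch;
        decay chains need the Bateman equations)."""
        lam = math.log(2.0) / half_life_s
        return lam * n0_atoms * math.exp(-lam * age_s)
    ```

    An intrinsic-source constructor would evaluate this for every unstable isotope in the material composition and then attach the corresponding emission spectra from the chosen particle-emission database.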

  15. Use of computer code for dose distribution studies in A 60CO industrial irradiator

    NASA Astrophysics Data System (ADS)

    Piña-Villalpando, G.; Sloan, D. P.

    1995-09-01

    This paper presents a benchmark comparison between calculated and experimental absorbed dose values for a typical product in a 60Co industrial irradiator located at ININ, México. The irradiator is a two-level, two-layer system with an overlapping product configuration and an activity of around 300 kCi. Experimental values were obtained from routine dosimetry using red acrylic pellets. The typical product was packages of Petri dishes with an apparent density of 0.13 g/cm3; that product was chosen because of its uniform size, large quantity, and low density. The minimum dose was fixed at 15 kGy. Calculated values were obtained from the QAD-CGGP code. This code uses a point-kernel technique; build-up factors are fitted by geometric progression, and combinatorial geometry is used for the system description. The main modifications to the code were related to source simulation: point sources were used instead of pencil sources, and an energy spectrum and an anisotropic emission spectrum were included. For the maximum dose, the calculated value (18.2 kGy) was 8% higher than the experimental average value (16.8 kGy); for the minimum dose, the calculated value (13.8 kGy) was 3% lower than the experimental average value (14.3 kGy).
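    The point-kernel technique behind such a code can be sketched in a few lines: the dose rate at a receptor is a sum of kernels S·B(μr)·exp(−μr)/(4πr²) over discrete point sources. The linear buildup factor below is a placeholder of our own; QAD-CGGP actually uses a geometric-progression fit, and all numbers are illustrative.

    ```python
    import math

    def point_kernel_dose(sources, receptor, mu, buildup=lambda mur: 1.0 + mur):
        """Point-kernel sum S * B(mu*r) * exp(-mu*r) / (4*pi*r^2) over
        discrete point sources. The linear buildup factor is a placeholder;
        QAD-CGGP uses a geometric-progression fit."""
        total = 0.0
        for x, y, z, strength in sources:
            r = math.dist((x, y, z), receptor)
            mur = mu * r
            total += strength * buildup(mur) * math.exp(-mur) / (4.0 * math.pi * r * r)
        return total
    ```

    Replacing line ("pencil") sources with many point sources, as the paper describes, simply means the list of sources grows; the kernel sum itself is unchanged.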

  16. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  17. Lack of agreement in pediatric emergency department discharge diagnoses from clinical and administrative data sources.

    PubMed

    Gorelick, Marc H; Knight, Stacey; Alessandrini, Evaline A; Stanley, Rachel M; Chamberlain, James M; Kuppermann, Nathan; Alpern, Elizabeth R

    2007-07-01

    Diagnosis information from existing data sources is used commonly for epidemiologic, administrative, and research purposes. The quality of such data for emergency department (ED) visits is unknown. To determine the agreement on final diagnoses between two sources, electronic administrative sources and manually abstracted medical records, for pediatric ED visits in a multicenter network. This was a cross-sectional study at 19 EDs nationwide. The authors obtained data from two sources at each ED during a three-month period in 2003: administrative sources for all visits, and abstracted records for randomly selected visits during ten days over the study period. Records were matched using unique identifiers and probabilistic linkage. The authors recorded up to three diagnoses from each abstracted medical record and up to ten from the administrative data source. Diagnoses were grouped into 104 groups using a modification of the Clinical Classification System. A total of 8,860 abstracted records had at least one valid diagnosis code (with a total of 12,895 diagnoses) and were successfully matched to records in the administrative source. Overall, 67% (95% confidence interval = 66% to 68%) of diagnoses from the administrative and abstracted sources were within the same diagnosis group. Agreement varied by site, ranging from 54% to 77%. Agreement varied substantially by diagnosis group; there was no difference by method of linkage. Clustering clinically similar diagnosis groups improved agreement between administrative and abstracted data sources. ED diagnoses retrieved from electronic administrative sources and manual chart review frequently disagree, even if similar diagnosis codes are grouped. Agreement varies by institution and by diagnosis. Further work is needed to improve the accuracy of diagnosis coding; development of a grouping system specific to pediatric emergency care may be beneficial.
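    The group-level agreement measure used in studies like this one can be sketched as a lookup-and-compare. The code-to-group table below is a tiny hypothetical stand-in; the study itself used a modification of the Clinical Classification System with 104 groups.

    ```python
    # Hypothetical ICD-code-to-group table; the study used a modified
    # Clinical Classification System with 104 groups.
    GROUPS = {"462": "pharyngitis", "034.0": "pharyngitis",
              "486": "pneumonia", "780.60": "fever"}

    def group_agreement(admin_codes, abstracted_codes):
        """Fraction of abstracted diagnoses whose group also appears among
        the administrative diagnoses for the same visit."""
        admin_groups = {GROUPS.get(c) for c in admin_codes} - {None}
        if not abstracted_codes:
            return 0.0
        hits = sum(GROUPS.get(c) in admin_groups for c in abstracted_codes)
        return hits / len(abstracted_codes)
    ```

    Note how two different codes mapping to the same group ("462" and "034.0") count as agreement, which is exactly why clustering clinically similar groups raises the measured agreement.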

  18. Update to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1994-01-01

    This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.

  19. Development of a computer code to calculate the distribution of radionuclides within the human body by the biokinetic models of the ICRP.

    PubMed

    Matsumoto, Masaki; Yamanaka, Tsuneyasu; Hayakawa, Nobuhiro; Iwai, Satoshi; Sugiura, Nobuyuki

    2015-03-01

    This paper describes the Basic Radionuclide vAlue for Internal Dosimetry (BRAID) code, which was developed to calculate the time-dependent activity distribution in each organ and tissue characterised by the biokinetic compartmental models provided by the International Commission on Radiological Protection (ICRP). Translocation from one compartment to the next is taken to be governed by first-order kinetics, formulated as a system of first-order differential equations. In the source program of this code, conservation equations are solved for the mass balance that describes the transfer of a radionuclide between compartments. This code is applicable to the evaluation of the radioactivity of nuclides in an organ or tissue without modification of the source program. It is also easy to handle revisions of the biokinetic model or the application of a uniquely defined model by a user, because this code is designed so that all information on the biokinetic model structure is imported from an input file. Sample calculations are performed with the ICRP model, and the results are compared with analytic solutions using simple models. It is suggested that this code provides sufficient results for dose estimation and the interpretation of monitoring data. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
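    First-order compartmental kinetics of the kind described here reduces to a linear ODE system, dN_i/dt = Σ_j k(j→i)N_j − (Σ_j k(i→j) + λ)N_i. The explicit-Euler step below is our simplified sketch of that mass balance; a production code would read the transfer coefficients from an input file, as BRAID does, and use a stiff solver rather than Euler stepping.

    ```python
    def step_compartments(n, rates, lam, dt):
        """One explicit-Euler step of the first-order kinetics
        dN_i/dt = sum_j k(j->i) N_j - (sum_j k(i->j) + lam) N_i.
        `rates[(i, j)]` is the transfer coefficient from compartment i to j;
        `lam` is the radioactive decay constant (0 for a stable nuclide)."""
        nxt = []
        for i in range(len(n)):
            inflow = sum(k * n[src] for (src, dst), k in rates.items() if dst == i)
            outflow = sum(k for (src, dst), k in rates.items() if src == i)
            nxt.append(n[i] + dt * (inflow - (outflow + lam) * n[i]))
        return nxt
    ```

    With λ = 0 the step conserves total content exactly, which mirrors the mass-balance check the paper performs against analytic solutions of simple models.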

  20. Modification of codes NUALGAM and BREMRAD, Volume 1

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Huang, R.; Firstenberg, H.

    1971-01-01

    The NUGAM2 code predicts forward and backward angular energy differential and integrated distributions for gamma photons and fluorescent radiation emerging from finite laminar transport media. It determines buildup and albedo data for scientific research and engineering purposes; it also predicts the emission characteristics of finite radioisotope sources. The results are shown to be in very good agreement with available published data. The code predicts data for many situations in which no published data is available in the energy range up to 5 MeV. The NUGAM3 code predicts the pulse height response of inorganic (NaI and CsI) scintillation detectors to gamma photons. Because it allows the scintillator to be clad and mounted on a photomultiplier as in the experimental or industrial application, it is a more practical and thus useful code than others previously reported. Results are in excellent agreement with published Monte Carlo and experimental data in the energy range up to 4.5 MeV.

  1. Single-Nucleosome Mapping of Histone Modifications in S. cerevisiae

    PubMed Central

    Kim, Minkyu; Buratowski, Stephen; Schreiber, Stuart L; Friedman, Nir

    2005-01-01

    Covalent modification of histone proteins plays a role in virtually every process on eukaryotic DNA, from transcription to DNA repair. Many different residues can be covalently modified, and it has been suggested that these modifications occur in a great number of independent, meaningful combinations. Published low-resolution microarray studies on the combinatorial complexity of histone modification patterns suffer from confounding effects caused by the averaging of modification levels over multiple nucleosomes. To overcome this problem, we used a high-resolution tiled microarray with single-nucleosome resolution to investigate the occurrence of combinations of 12 histone modifications on thousands of nucleosomes in actively growing S. cerevisiae. We found that histone modifications do not occur independently; there are roughly two groups of co-occurring modifications. One group of lysine acetylations shows a sharply defined domain of two hypo-acetylated nucleosomes, adjacent to the transcriptional start site, whose occurrence does not correlate with transcription levels. The other group consists of modifications occurring in gradients through the coding regions of genes in a pattern associated with transcription. We found no evidence for a deterministic code of many discrete states, but instead we saw blended, continuous patterns that distinguish nucleosomes at one location (e.g., promoter nucleosomes) from those at another location (e.g., over the 3′ ends of coding regions). These results are consistent with the idea of a simple, redundant histone code, in which multiple modifications share the same role. PMID:16122352

  2. Functional Interplay between Small Non-Coding RNAs and RNA Modification in the Brain.

    PubMed

    Leighton, Laura J; Bredy, Timothy W

    2018-06-07

    Small non-coding RNAs are essential for transcription, translation and gene regulation in all cell types, but are particularly important in neurons, with known roles in neurodevelopment, neuroplasticity and neurological disease. Many small non-coding RNAs are directly involved in the post-transcriptional modification of other RNA species, while others are themselves substrates for modification, or are functionally modulated by modification of their target RNAs. In this review, we explore the known and potential functions of several distinct classes of small non-coding RNAs in the mammalian brain, focusing on the newly recognised interplay between the epitranscriptome and the activity of small RNAs. We discuss the potential for this relationship to influence the spatial and temporal dynamics of gene activation in the brain, and predict that further research in the field of epitranscriptomics will identify interactions between small RNAs and RNA modifications which are essential for higher order brain functions such as learning and memory.

  3. Applang - A DSL for specification of mobile applications for android platform based on textX

    NASA Astrophysics Data System (ADS)

    Kosanović, Milan; Dejanović, Igor; Milosavljević, Gordana

    2016-06-01

    Mobile platforms have become a ubiquitous part of our daily lives, putting more pressure on software developers to develop more applications faster and with support for different mobile operating systems. To foster the faster development of mobile services and applications and to support various mobile operating systems, new software development approaches must be adopted. Domain-Specific Languages (DSLs) are a viable approach that promises to solve the problem of target platform diversity as well as to facilitate rapid application development and shorter time-to-market. This paper presents Applang, a DSL for the specification of mobile applications for the Android platform, based on the textX meta-language. The application is described using the Applang DSL, and the source code for a target platform is automatically generated by the provided code generator. The same application defined in a single Applang source can be transformed to various targets with little or no manual modification.

  4. Easily extensible unix software for spectral analysis, display, modification, and synthesis of musical sounds

    NASA Astrophysics Data System (ADS)

    Beauchamp, James W.

    2002-11-01

    Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones, or successions of them, and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder ("pvan") and a frequency-tracking analyzer ("mqan"), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint "tracks"; however, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. "monan" is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments and serving as a basis for developing new synthesis algorithms.

  5. Fast tandem mass spectra-based protein identification regardless of the number of spectra or potential modifications examined.

    PubMed

    Falkner, Jayson; Andrews, Philip

    2005-05-15

    Comparing tandem mass spectra (MSMS) against a known dataset of protein sequences is a common method for identifying unknown proteins; however, the processing of MSMS by current software often limits certain applications, including comprehensive coverage of post-translational modifications, non-specific searches and real-time searches to allow result-dependent instrument control. This problem deserves attention as new mass spectrometers provide the ability for higher throughput and as known protein datasets rapidly grow in size. New software algorithms need to be devised in order to address the performance issues of conventional MSMS protein dataset-based protein identification. This paper describes a novel algorithm based on converting a collection of monoisotopic, centroided spectra to a new data structure, named 'peptide finite state machine' (PFSM), which may be used to rapidly search a known dataset of protein sequences, regardless of the number of spectra searched or the number of potential modifications examined. The algorithm is verified using a set of commercially available tryptic digest protein standards analyzed using an ABI 4700 MALDI TOFTOF mass spectrometer, and a free, open source PFSM implementation. It is illustrated that a PFSM can accurately search large collections of spectra against large datasets of protein sequences (e.g. NCBI nr) using a regular desktop PC; however, this paper only details the method for identifying peptide and subsequently protein candidates from a dataset of known protein sequences. The concept of using a PFSM as a peptide pre-screening technique for MSMS-based search engines is validated by using PFSM with Mascot and XTandem. Complete source code, documentation and examples for the reference PFSM implementation are freely available at the Proteome Commons, http://www.proteomecommons.org and source code may be used both commercially and non-commercially as long as the original authors are credited for their work.
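    The idea of compiling a search into a finite state machine can be illustrated with a trie over residue characters: building the machine once makes scanning cheap regardless of how many patterns it holds. This is a crude sketch of the concept only; the published PFSM operates on spectrum data, not on plain sequence strings as here.

    ```python
    def build_pfsm(peptides):
        """Build a trie-style state machine over residue characters
        (a sketch of the finite-state-machine idea, not the published
        PFSM algorithm, which is built from spectra)."""
        root = {}
        for pep in peptides:
            node = root
            for aa in pep:
                node = node.setdefault(aa, {})
            node["$"] = pep  # accepting state remembers the peptide
        return root

    def scan(protein, fsm):
        """Return every peptide in the machine that occurs in `protein`."""
        hits = set()
        for start in range(len(protein)):
            node = fsm
            for aa in protein[start:]:
                if aa not in node:
                    break
                node = node[aa]
                if "$" in node:
                    hits.add(node["$"])
        return hits
    ```

    The key property motivating the paper survives even in this toy: once the machine is built, scan time does not grow with the number of patterns (or, in the real PFSM, with the number of spectra or modifications) encoded in it.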

  6. ATLAS Test Program Generator II (AGEN II). Volume I. Executive Software System.

    DTIC Science & Technology

    1980-08-01

    ...features. C. To provide detailed descriptions of each of the system components and modules and their corresponding flowcharts. D. To describe methods of... contains the FORTRAN source code listings to enable the programmer to do the expansions and modifications. The methods and details of adding another... characteristics of the network. The top-down implementation method is therefore suggested. This method starts at the top by designing the IVT modules in...

  7. Coupling of an aeroacoustic model and a parabolic equation code for long range wind turbine noise propagation

    NASA Astrophysics Data System (ADS)

    Cotté, B.

    2018-05-01

    This study proposes to couple a source model based on Amiet's theory and a parabolic equation code in order to model wind turbine noise emission and propagation in an inhomogeneous atmosphere. Two broadband noise generation mechanisms are considered, namely trailing edge noise and turbulent inflow noise. The effects of wind shear and atmospheric turbulence are taken into account using the Monin-Obukhov similarity theory. The coupling approach, based on the backpropagation method to preserve the directivity of the aeroacoustic sources, is validated by comparison with an analytical solution for the propagation over a finite impedance ground in a homogeneous atmosphere. The influence of refraction effects is then analyzed for different directions of propagation. The spectrum modification related to the ground effect and the presence of a shadow zone for upwind receivers are emphasized. The validity of the point source approximation that is often used in wind turbine noise propagation models is finally assessed. This approximation exaggerates the interference dips in the spectra, and is not able to correctly predict the amplitude modulation.
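    A parabolic-equation march needs a starting field on a vertical plane, and for a directive source that field must carry the directivity pattern. The snippet below imposes the free-field pressure D(θ)·exp(ikr)/r of a directive point source on the starting plane; it is a simplified stand-in of our own, not the paper's backpropagation procedure, which instead propagates the source field back to the starting plane to preserve directivity.

    ```python
    import cmath
    import math

    def starting_field(source_height, x_start, z_grid, k,
                       directivity=lambda theta: 1.0):
        """Free-field starting plane for a PE march: D(theta)*exp(i*k*r)/r
        sampled on a vertical grid at range x_start (simplified sketch,
        not the backpropagation starter used in the paper)."""
        field = []
        for z in z_grid:
            r = math.hypot(x_start, z - source_height)
            theta = math.atan2(z - source_height, x_start)
            field.append(directivity(theta) * cmath.exp(1j * k * r) / r)
        return field
    ```

    With a uniform directivity this reduces to a monopole starter, which is exactly the point-source approximation whose limits the study assesses.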

  8. Development of a new version of the Vehicle Protection Factor Code (VPF3)

    NASA Astrophysics Data System (ADS)

    Jamieson, Terrance J.

    1990-10-01

    The Vehicle Protection Factor (VPF) Code is an engineering tool for estimating radiation protection afforded by armoured vehicles and other structures exposed to neutron and gamma ray radiation from fission, thermonuclear, and fusion sources. A number of suggestions for modifications have been offered by users of early versions of the code. These include: implementing some of the more advanced features of the air transport rating code, ATR5, used to perform the air over ground radiation transport analyses; allowing the ability to study specific vehicle orientations within the free field; implementing an adjoint transport scheme to reduce the number of transport runs required; investigating the possibility of accelerating the transport scheme; and upgrading the computer automated design (CAD) package used by VPF. The generation of radiation free field fluences for infinite air geometries as required for aircraft analysis can be accomplished by using ATR with the air over ground correction factors disabled. Analysis of the effects of fallout bearing debris clouds on aircraft will require additional modelling of VPF.

  9. Using the NASA GRC Sectored-One-Dimensional Combustor Simulation

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Mehta, Vishal R.

    2014-01-01

    The document is a user manual for the NASA GRC Sectored-One-Dimensional (S-1-D) Combustor Simulation. It consists of three sections. The first is a very brief outline of the mathematical and numerical background of the code, along with a description of the non-dimensional variables on which it operates. The second section describes how to run the code and includes an explanation of the input file. The input file contains the parameters necessary to establish an operating point as well as the associated boundary conditions (i.e., how it is fed and terminated) of a geometrically configured combustor. It also describes the code output. The third section describes the configuration process and utilizes a specific example combustor to do so. Configuration consists of geometrically describing the combustor (section lengths, axial locations, and cross-sectional areas) and locating the fuel injection point and flame region. Configuration requires modifying the source code and recompiling. As such, an executable utility is included with the code which will guide the requisite modifications and ensure that they are done correctly.

  10. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

    A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with much fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
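    The shift-and-stack decoding described above can be sketched directly: shift the recorded trace to each impact time and sum, so energy arriving at a fixed delay after every impact stacks coherently while uncorrelated noise does not. The sampling rate and trace below are toy values for illustration.

    ```python
    def shift_and_stack(record, impact_times, sr):
        """Decode a coded-impact record by shifting the trace to each
        recorded impact time and summing; reflections at a fixed delay
        after every impact add coherently."""
        stacked = [0.0] * len(record)
        for t in impact_times:
            shift = int(round(t * sr))
            for i in range(len(record) - shift):
                stacked[i] += record[i + shift]
        return stacked
    ```

    This is why the method needs no cross-correlation: the impact times recorded on the auxiliary seismograph channel supply the shifts directly.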

  11. Rapid Prediction of Unsteady Three-Dimensional Viscous Flows in Turbopump Geometries

    NASA Technical Reports Server (NTRS)

    Dorney, Daniel J.

    1998-01-01

    A program is underway to improve the efficiency of a three-dimensional Navier-Stokes code and generalize it for nozzle and turbopump geometries. Code modifications have included the implementation of parallel processing software, incorporation of new physical models and generalization of the multiblock capability. The final report contains details of code modifications, numerical results for several nozzle and turbopump geometries, and the implementation of the parallelization software.

  12. Traceability Through Automatic Program Generation

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Green, Jeff

    2003-01-01

    Program synthesis is a technique for automatically deriving programs from specifications of their behavior. One of the arguments made in favour of program synthesis is that it allows one to trace from the specification to the program. One way in which traceability information can be derived is to augment the program synthesis system so that manipulations and calculations it carries out during the synthesis process are annotated with information on what the manipulations and calculations were and why they were made. This information is then accumulated throughout the synthesis process, at the end of which, every artifact produced by the synthesis is annotated with a complete history relating it to every other artifact (including the source specification) which influenced its construction. This approach requires modification of the entire synthesis system - which is labor-intensive and hard to do without influencing its behavior. In this paper, we introduce a novel, lightweight technique for deriving traceability from a program specification to the corresponding synthesized code. Once a program has been successfully synthesized from a specification, small changes are systematically made to the specification and the effects on the synthesized program observed. We have partially automated the technique and applied it in an experiment to one of our program synthesis systems, AUTOFILTER, and to the GNU C compiler, GCC. The results are promising: 1. Manual inspection of the results indicates that most of the connections derived from the source (a specification in the case of AUTOFILTER, C source code in the case of GCC) to its generated target (C source code in the case of AUTOFILTER, assembly language code in the case of GCC) are correct. 2. Around half of the lines in the target can be traced to at least one line of the source. 3. Small changes in the source often induce only small changes in the target.
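    The perturbation idea is simple enough to sketch end-to-end: mutate one source line at a time, regenerate, and record which target lines change. The harness below is our illustration of that loop with an arbitrary deterministic translator, not the partially automated tool applied to AUTOFILTER and GCC in the paper.

    ```python
    def trace(source_lines, generate):
        """Link source lines to target lines by perturbing one source line
        at a time, regenerating, and recording which target lines change.
        `generate` is any deterministic source -> target translator."""
        base = generate(source_lines)
        links = {}
        for i, line in enumerate(source_lines):
            mutated = list(source_lines)
            mutated[i] = line + " PERTURBED"
            changed = [j for j, (a, b) in enumerate(zip(base, generate(mutated)))
                       if a != b]
            links[i] = changed
        return links
    ```

    The technique's appeal is that it treats the generator as a black box, which is precisely why no modification of the synthesis system is required.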

  13. NAS Parallel Benchmarks. 2.4

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We describe a new problem size, called Class D, for the NAS Parallel Benchmarks (NPB), whose MPI source code implementation is being released as NPB 2.4. A brief rationale is given for how the new class is derived. We also describe the modifications made to the MPI (Message Passing Interface) implementation to allow the new class to be run on systems with 32-bit integers, and with moderate amounts of memory. Finally, we give the verification values for the new problem size.
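    The 32-bit integer concern can be made concrete with a small sketch; the grid dimensions below are illustrative stand-ins at roughly Class D scale, not the official NPB sizes, and the remedy shown is a generic one rather than the specific NPB 2.4 modifications:

```python
# Illustrative grid at roughly Class D scale (NOT the official NPB sizes).
nx, ny, nz = 2048, 1024, 1024
total_points = nx * ny * nz        # 2**31 grid points

INT32_MAX = 2**31 - 1              # largest signed 32-bit integer

# A flat point counter held in a signed 32-bit integer would overflow,
# since the total number of points exceeds INT32_MAX:
needs_64bit = total_points > INT32_MAX

# Generic remedy (the actual NPB 2.4 changes may differ): keep the
# per-dimension indices (i, j, k), each of which fits easily in 32 bits,
# and widen the arithmetic only where a flat offset is required.
def flat_index(i, j, k):
    # In Fortran or C this product must be evaluated in 64-bit arithmetic.
    return (i * ny + j) * nz + k
```

    Keeping per-dimension indices narrow and widening only the flattening arithmetic is a common way to run such problem sizes on systems without 64-bit default integers.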

  14. Evidence for the implication of the histone code in building the genome structure.

    PubMed

    Prakash, Kirti; Fournier, David

    2018-02-01

    Histones are punctuated with small chemical modifications that alter their interaction with DNA. One attractive hypothesis stipulates that certain combinations of these histone modifications may function, alone or together, as part of a predictive histone code to provide ground rules for chromatin folding. We consider four features that relate histone modifications to chromatin folding: charge neutralisation, molecular specificity, robustness and evolvability. Next, we present evidence for the association among different histone modifications at various levels of chromatin organisation and show how these relationships relate to functions such as transcription, replication and cell division. Finally, we propose a model in which the histone code can set critical checkpoints for chromatin to fold reversibly between different orders of organisation in response to a biological stimulus. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. NOTE: A Monte Carlo study of dose rate distribution around the specially asymmetric CSM3-a 137Cs source

    NASA Astrophysics Data System (ADS)

    Pérez-Calatayud, J.; Lliso, F.; Ballester, F.; Serrano, M. A.; Lluch, J. L.; Limami, Y.; Puchades, V.; Casal, E.

    2001-07-01

    The CSM3 137Cs type stainless-steel encapsulated source is widely used in manually afterloaded low-dose-rate brachytherapy. A specially asymmetric source, CSM3-a, has been designed by CIS Bio International (France) by substituting an inactive material for the eyelet-side seed of the CSM3 source. This modification allows a uniform dose level over the upper vaginal surface when this `linear' source is inserted at the top of dome vaginal applicators. In this study the Monte Carlo GEANT3 simulation code, incorporating the source geometry in detail, was used to investigate the dosimetric characteristics of this special CSM3-a 137Cs brachytherapy source. The absolute dose rate distribution in water around the source was calculated and is presented in the form of an along-away table. Comparisons of Sievert-integral-type calculations with the Monte Carlo results are discussed.

  16. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program that converts standard 4-byte IBM floating point SEGY files to ASCII xyz format. The program can then optionally plot the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using GNU's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu. Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
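    The heart of such a conversion is decoding the IBM 4-byte floating point format (1 sign bit, 7-bit base-16 exponent with bias 64, 24-bit fraction). The program itself is C++; the sketch below is an independent Python illustration of the format, not the of99-126 source:

```python
import struct

def ibm32_to_float(b):
    """Decode a 4-byte big-endian IBM single-precision float.

    Layout: 1 sign bit, 7-bit base-16 exponent (excess-64 bias),
    24-bit fraction interpreted as 0.f, i.e.
    value = (-1)**sign * (fraction / 2**24) * 16**(exponent - 64).
    """
    (u,) = struct.unpack(">I", b)
    sign = -1.0 if u >> 31 else 1.0
    exponent = (u >> 24) & 0x7F
    fraction = u & 0x00FFFFFF
    if fraction == 0:
        return 0.0
    return sign * (fraction / float(1 << 24)) * 16.0 ** (exponent - 64)

# 0x42640000: exponent 0x42 (66), fraction 0x640000 -> 0.390625 * 16**2
value = ibm32_to_float(b"\x42\x64\x00\x00")   # 100.0
```

    Applied sample by sample within each trace, the same decode yields the amplitude values that end up in the ASCII xyz output.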

  17. Enhancement of the CAVE computer code

    NASA Astrophysics Data System (ADS)

    Rathjen, K. A.; Burk, H. O.

    1983-12-01

    The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for a two dimensional scramjet engine sidewall, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.

  18. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for the interactive display and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray-tracing results from other software. The main features are graphical editing of nodes and fast adjustment of the display (stations and phases). It can be extended by user-defined shell scripts and linked to phase-picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X, and provides a version-controlled source code repository for community development (https://sourceforge.net/projects/pray-plot-rayinvr/).

  19. Rapid Prediction of Unsteady Three-Dimensional Viscous Flows in Turbopump Geometries

    NASA Technical Reports Server (NTRS)

    Dorney, Daniel J.

    1998-01-01

    A program is underway to improve the efficiency of a three-dimensional Navier-Stokes code and generalize it for nozzle and turbopump geometries. Code modifications will include the implementation of parallel processing software, incorporating new physical models and generalizing the multi-block capability to allow the simultaneous simulation of nozzle and turbopump configurations. The current report contains details of code modifications, numerical results of several flow simulations and the status of the parallelization effort.

  20. Three-dimensional Monte-Carlo simulation of gamma-ray scattering and production in the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.J.

    1989-05-15

    Monte Carlo codes have been developed to simulate gamma-ray scattering and production in the atmosphere. The scattering code simulates interactions of low-energy gamma rays (20 to several hundred keV) from an astronomical point source in the atmosphere; a modified code also simulates scattering in a spacecraft. Four incident spectra, typical of gamma-ray bursts, solar flares, and the Crab pulsar, and 511 keV line radiation have been studied. These simulations are consistent with observations of solar flare radiation scattered from the atmosphere. The production code simulates the interactions of cosmic rays which produce high-energy (above 10 MeV) photons and electrons. It has been used to calculate gamma-ray and electron albedo intensities at Palestine, Texas and at the equator; the results agree with observations in most respects. With minor modifications this code can be used to calculate intensities of other high-energy particles. Both codes are fully three-dimensional, incorporating a curved atmosphere; the production code also incorporates the variation with both zenith and azimuth of the incident cosmic-ray intensity due to geomagnetic effects. These effects are clearly reflected in the calculated albedo by intensity contrasts between the horizon and nadir, and between the east and west horizons.

  1. Enhancement of the CAVE computer code. [aerodynamic heating package for nose cones and scramjet engine sidewalls

    NASA Technical Reports Server (NTRS)

    Rathjen, K. A.; Burk, H. O.

    1983-01-01

    The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for a two dimensional scramjet engine sidewall, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.

  2. Some Experimental and Monte Carlo Investigations of the Plastic Scintillators for the Current Mode Measurements at Pulsed Neutron Sources

    NASA Astrophysics Data System (ADS)

    Rogov, A.; Pepyolyshev, Yu.; Carta, M.; d'Angelo, A.

    The scintillation detector (SD) is widely used in neutron and gamma spectrometry in count mode. Organic scintillators operating in count mode have been investigated rather well. Usually they are applied to measure the amplitude and time distributions of pulses caused by single interaction events of neutrons or gammas with the scintillator material. In a large area of scientific research, however, scintillation detectors can alternatively be used in current mode, by recording the average current from the detector: for example, in measurements of the neutron pulse shape at pulsed reactors or other pulsed neutron sources. To obtain a rather large volume of experimental data at pulsed neutron sources, it is necessary to use a current-mode detector for the registration of fast neutrons. Many parameters of the SD change with the transition from count mode to current mode; for example, the detector efficiency is different in the two modes. Many effects connected with timing accuracy become substantial. Moreover, for the registration of solely fast neutrons, as is required in many measurements in the mixed radiation field of a pulsed neutron source, the SD efficiency has to be determined with a gamma-radiation shield present. Until now there have been no calculations or experimental data on SD current-mode operation. The response functions of the detectors can be either measured in high-precision reference fields or calculated by computer simulation. We have used the MCNP code [1] and carried out some experiments to investigate the performance of plastic scintillators in current mode. There are numerous programs that perform simulations similar to the MCNP code: for example, [2-4] for neutrons and [5-8] for photons. However, the other well-known codes (SCINFUL, NRESP4, SANDYL, EGS4) have more stringent restrictions on the source, geometry and detector characteristics. In the MCNP code many of these restrictions are absent, and one need only write special additions for proton and electron recoil and for the transfer of deposited energy to light output. These code modifications allow all processes in the organic scintillator that influence the light yield to be taken into account.
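    The additions for converting recoil energy to light output typically implement a semi-empirical quenching law. A common choice for plastic scintillators is Birks' law, sketched below with an illustrative Birks constant; this is not the authors' actual MCNP modification:

```python
def birks_light_yield(de_steps, dedx_steps, S=1.0, kB=0.126):
    """Scintillation light from a charged-particle track using Birks'
    semi-empirical quenching law:  dL = S * dE / (1 + kB * dE/dx).

    de_steps   : energy deposited in each track step
    dedx_steps : stopping power dE/dx in each step (units matching kB)
    S          : absolute scintillation efficiency (light per unit energy)
    kB         : Birks constant (illustrative plastic-scintillator value)
    """
    return sum(S * de / (1.0 + kB * dedx)
               for de, dedx in zip(de_steps, dedx_steps))

# Electrons (low dE/dx) scintillate almost in proportion to energy;
# recoil protons (high dE/dx) are quenched and give less light per MeV.
light_e = birks_light_yield([1.0], [2.0])    # electron-like step
light_p = birks_light_yield([1.0], [80.0])   # proton-like step
```

    Because dE/dx is far larger for recoil protons than for electrons of the same energy, the quenching term suppresses the proton light yield, which is why the neutron and gamma responses of a plastic scintillator differ.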

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brochard, J.; Charras, T.; Ghoudi, M.

    Modifications to a computer code for ductile fracture assessment of piping systems with postulated circumferential through-wall cracks under static or dynamic loading are very briefly described. The modifications extend the capabilities of the CASTEM2000 code to the determination of fracture parameters under creep conditions. The main advantage of the approach is that thermal loads can be evaluated as secondary stresses. The code is applicable to piping systems for which crack propagation predictions differ significantly depending on whether thermal stresses are considered as primary or secondary stresses.

  4. Numerical simulation of the baking of porous anode carbon in a vertical flue ring furnace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, M.; Melaaen, M.C.

    The interaction of pitch pyrolysis in porous anode carbon during heating and volatiles combustion in the flue gas channel has been analyzed to gain insight into the anode baking process. A two-dimensional geometry of a flue gas channel adjacent to a porous flue gas wall, packing coke, and an anode was used for studying the effect of heating rate on temperature gradients and internal gas pressure in the anodes. The mathematical model included porous heat and mass transfer, pitch pyrolysis, combustion of volatiles, radiation, and turbulent channel flow. The mathematical model was developed through source code modification of the computational fluid dynamics code FLUENT. The model was useful for studying the effects of heating rate, geometry, and anode properties.

  5. Kablammo: an interactive, web-based BLAST results visualizer.

    PubMed

    Wintersinger, Jeff A; Wasmuth, James D

    2015-04-15

    Kablammo is a web-based application that produces interactive, vector-based visualizations of sequence alignments generated by BLAST. These visualizations can illustrate many features, including shared protein domains, chromosome structural modifications and genome misassembly. Kablammo can be used at http://kablammo.wasmuthlab.org. For a local installation, the source code and instructions are available under the MIT license at http://github.com/jwintersinger/kablammo. jeff@wintersinger.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Squeezed Back-to-Back Correlation of $D^0\bar{D}^0$ in Relativistic Heavy-Ion Collisions

    NASA Astrophysics Data System (ADS)

    Yang, Ai-Geng; Zhang, Yong; Cheng, Luan; Sun, Hao; Zhang, Wei-Ning

    2018-05-01

    We investigate the squeezed back-to-back correlation (BBC) of $D^0\bar{D}^0$ in relativistic heavy-ion collisions, using the in-medium mass modification calculated with a self-energy in a hot pion gas and the source space-time distributions provided by the viscous hydrodynamic code VISH2+1. It is found that the BBC of $D^0\bar{D}^0$ is significant in peripheral Au+Au collisions at the RHIC energy. A possible way to detect the BBC in experiment is presented.

  7. BigBWA: approaching the Burrows-Wheeler aligner to Big Data technologies.

    PubMed

    Abuín, José M; Pichel, Juan C; Pena, Tomás F; Amigo, Jorge

    2015-12-15

    BigBWA is a new tool that uses the Big Data technology Hadoop to boost the performance of the Burrows-Wheeler aligner (BWA). Important reductions in the execution times were observed when using this tool. In addition, BigBWA is fault tolerant and it does not require any modification of the original BWA source code. BigBWA is available at the project GitHub repository: https://github.com/citiususc/BigBWA. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Coding update of the SMFM definition of low risk for cesarean delivery from ICD-9-CM to ICD-10-CM.

    PubMed

    Armstrong, Joanne; McDermott, Patricia; Saade, George R; Srinivas, Sindhu K

    2017-07-01

    In 2015, the Society for Maternal-Fetal Medicine developed a low risk for cesarean delivery definition based on administrative claims-based diagnosis codes described by the International Classification of Diseases, Ninth Revision, Clinical Modification. The Society for Maternal-Fetal Medicine definition is a clinical enrichment of 2 available measures from the Joint Commission and the Agency for Healthcare Research and Quality measures. The Society for Maternal-Fetal Medicine measure excludes diagnosis codes that represent clinically relevant risk factors that are absolute or relative contraindications to vaginal birth while retaining diagnosis codes such as labor disorders that are discretionary risk factors for cesarean delivery. The introduction of the International Statistical Classification of Diseases, 10th Revision, Clinical Modification in October 2015 expanded the number of available diagnosis codes and enabled a greater depth and breadth of clinical description. These coding improvements further enhance the clinical validity of the Society for Maternal-Fetal Medicine definition and its potential utility in tracking progress toward the goal of safely lowering the US cesarean delivery rate. This report updates the Society for Maternal-Fetal Medicine definition of low risk for cesarean delivery using International Statistical Classification of Diseases, 10th Revision, Clinical Modification coding. Copyright © 2017. Published by Elsevier Inc.

  9. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code, HASC (High Angle of Attack Stability and Control), has been improved in several areas, validated, and documented. The improved code includes refined methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and a validation of HASC. Appendices include a discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  10. Learning from a provisioning site: code of conduct compliance and behaviour of whale sharks in Oslob, Cebu, Philippines

    PubMed Central

    Araujo, Gonzalo; Penketh, Luke; Heath, Anna; McCoy, Emer; Labaja, Jessica; Lucey, Anna; Ponzo, Alessandro

    2015-01-01

    While shark-based tourism is a rapidly growing global industry, there is ongoing controversy about the effects of provisioning on the target species. This study investigated the effect of feeding on whale sharks (Rhincodon typus) at a provisioning site in Oslob, Cebu, in terms of arrival time, avoidance and feeding behaviour using photo-identification and focal follows. Additionally, compliance to the code of conduct in place was monitored to assess tourism pressure on the whale sharks. Newly identified sharks gradually arrived earlier to the provisioning site after their initial sighting, indicating that the animals learn to associate the site with food rewards. Whale sharks with a long resighting history showed anticipatory behaviour and were recorded at the site on average 5 min after the arrival of feeder boats. Results from a generalised linear mixed model indicated that animals with a longer resighting history were less likely to show avoidance behaviour to touches or boat contact. Similarly, sequential data on feeding behaviour was modelled using a generalised estimating equations approach, which suggested that experienced whale sharks were more likely to display vertical feeding behaviour. It was proposed that the continuous source of food provides a strong incentive for the modification of behaviours, i.e., learning, through conditioning. Whale sharks are large opportunistic filter feeders in a mainly oligotrophic environment, where the ability to use novel food sources by modifying their behaviour could be of great advantage. Non-compliance to the code of conduct in terms of minimum distance to the shark (2 m) increased from 79% in 2012 to 97% in 2014, suggesting a high tourism pressure on the whale sharks in Oslob. The long-term effects of the observed behavioural modifications along with the high tourism pressure remain unknown. 
However, management plans are traditionally based on the precautionary principle, which aims to take preventive actions even if data on cause and effect are still inconclusive. Hence, an improved enforcement of the code of conduct coupled with a reduction in the conditioning of the whale sharks through provisioning were proposed to minimise the impacts on whale sharks in Oslob. PMID:26644984
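    The modelling step, relating resighting history to a binary behavioural response, can be sketched with synthetic data. Under an independence working correlation, GEE point estimates coincide with ordinary logistic regression, which the numpy-only sketch below fits by Newton-Raphson; the data and effect size are simulated, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: one row per feeding observation, grouped by shark.
n_sharks, n_obs = 30, 20
resights = np.repeat(rng.integers(1, 100, size=n_sharks), n_obs)
true_logit = -2.0 + 0.04 * resights          # simulated effect size
vertical = rng.random(resights.size) < 1 / (1 + np.exp(-true_logit))

# Logistic regression fitted by Newton-Raphson (IRLS).  With an
# independence working correlation, GEE point estimates reduce to this.
X = np.column_stack([np.ones(resights.size), resights.astype(float)])
y = vertical.astype(float)
beta = np.zeros(2)
for _ in range(25):
    eta = np.clip(X @ beta, -30.0, 30.0)     # guard against overflow
    p = 1.0 / (1.0 + np.exp(-eta))
    w = p * (1.0 - p)
    grad = X.T @ (y - p)
    hess = (X * w[:, None]).T @ X
    beta = beta + np.linalg.solve(hess, grad)

slope = beta[1]   # estimated effect of resighting history (positive)
```

    A full analysis would use a GEE implementation (e.g. statsmodels) with an exchangeable working correlation to obtain cluster-robust standard errors; the point here is only the shape of the computation.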

  11. Learning from a provisioning site: code of conduct compliance and behaviour of whale sharks in Oslob, Cebu, Philippines.

    PubMed

    Schleimer, Anna; Araujo, Gonzalo; Penketh, Luke; Heath, Anna; McCoy, Emer; Labaja, Jessica; Lucey, Anna; Ponzo, Alessandro

    2015-01-01

    While shark-based tourism is a rapidly growing global industry, there is ongoing controversy about the effects of provisioning on the target species. This study investigated the effect of feeding on whale sharks (Rhincodon typus) at a provisioning site in Oslob, Cebu, in terms of arrival time, avoidance and feeding behaviour using photo-identification and focal follows. Additionally, compliance to the code of conduct in place was monitored to assess tourism pressure on the whale sharks. Newly identified sharks gradually arrived earlier to the provisioning site after their initial sighting, indicating that the animals learn to associate the site with food rewards. Whale sharks with a long resighting history showed anticipatory behaviour and were recorded at the site on average 5 min after the arrival of feeder boats. Results from a generalised linear mixed model indicated that animals with a longer resighting history were less likely to show avoidance behaviour to touches or boat contact. Similarly, sequential data on feeding behaviour was modelled using a generalised estimating equations approach, which suggested that experienced whale sharks were more likely to display vertical feeding behaviour. It was proposed that the continuous source of food provides a strong incentive for the modification of behaviours, i.e., learning, through conditioning. Whale sharks are large opportunistic filter feeders in a mainly oligotrophic environment, where the ability to use novel food sources by modifying their behaviour could be of great advantage. Non-compliance to the code of conduct in terms of minimum distance to the shark (2 m) increased from 79% in 2012 to 97% in 2014, suggesting a high tourism pressure on the whale sharks in Oslob. The long-term effects of the observed behavioural modifications along with the high tourism pressure remain unknown. 
However, management plans are traditionally based on the precautionary principle, which aims to take preventive actions even if data on cause and effect are still inconclusive. Hence, an improved enforcement of the code of conduct coupled with a reduction in the conditioning of the whale sharks through provisioning were proposed to minimise the impacts on whale sharks in Oslob.

  12. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V effort, which requires computational efficiency. This has motivated various improvements, including angular parallelization, outer iteration acceleration, and development of peripheral tools. To guide future improvements to the code's efficiency, a better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL's Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluating several possible sources of variability, this resulted in a communication model and a parallel portion model. The former's accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
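    The shape of such a model can be sketched generically as a compute term that shrinks with rank count plus a communication term that grows with it; all coefficients below are illustrative placeholders, not Falcon measurements or THOR's fitted parameters:

```python
def predicted_runtime(cells, angles, groups, ranks,
                      t_cell=2.0e-7,       # s per cell-angle-group (illustrative)
                      latency=2.0e-6,      # s per message (illustrative)
                      bandwidth=5.0e9,     # bytes/s per link (illustrative)
                      msg_bytes=8 * 1024,  # boundary data per exchange
                      exchanges=4):        # exchanges per sweep
    """Toy parallel performance model: T = T_comp / ranks + T_comm.

    T_comp scales with total work (cells * angles * groups); T_comm
    assumes the number of boundary exchanges grows like sqrt(ranks),
    each exchange costing latency + message size / bandwidth.
    """
    t_comp = cells * angles * groups * t_cell / ranks
    t_comm = exchanges * ranks ** 0.5 * (latency + msg_bytes / bandwidth)
    return t_comp + t_comm

# Strong scaling on a fixed problem: compute shrinks with rank count
# while communication grows, so communication eventually dominates.
times = {p: predicted_runtime(1_000_000, 64, 8, p) for p in (1, 16, 256, 4096)}
```

    Fitting the coefficients to measured timings, per code component and per link type, is what turns this skeleton into a usable PPM; the model then predicts where added ranks stop paying for themselves.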

  13. Air-kerma strength determination of a miniature x-ray source for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Davis, Stephen D.

    A miniature x-ray source has been developed by Xoft Inc. for high dose-rate brachytherapy treatments. The source is contained in a 5.4 mm diameter water-cooling catheter. The source voltage can be adjusted from 40 kV to 50 kV and the beam current is adjustable up to 300 μA. Electrons are accelerated toward a tungsten-coated anode to produce a lightly-filtered bremsstrahlung photon spectrum. The sources were initially used for early-stage breast cancer treatment using a balloon applicator. More recently, Xoft Inc. has developed vaginal and surface applicators. The miniature x-ray sources have been characterized using a modification of the American Association of Physicists in Medicine Task Group No. 43 formalism normally used for radioactive brachytherapy sources. Primary measurements of air kerma were performed using free-air ionization chambers at the University of Wisconsin (UW) and the National Institute of Standards and Technology (NIST). The measurements at UW were used to calibrate a well-type ionization chamber for clinical verification of source strength. Accurate knowledge of the emitted photon spectrum was necessary to calculate the corrections required to determine air-kerma strength, defined in vacuo. Theoretical predictions of the photon spectrum were calculated using three separate Monte Carlo codes: MCNP5, EGSnrc, and PENELOPE. Each code used different implementations of the underlying radiological physics. Benchmark studies were performed to investigate these differences in detail. The most important variation among the codes was found to be the calculation of fluorescence photon production following electron-induced vacancies in the L shell of tungsten atoms. The low-energy tungsten L-shell fluorescence photons have little clinical significance at the treatment distance, but could have a large impact on air-kerma measurements.
Calculated photon spectra were compared to spectra measured with high-purity germanium spectroscopy systems at both UW and NIST. The effects of escaped germanium fluorescence photons and Compton-scattered photons were taken into account for the UW measurements. The photon spectrum calculated using the PENELOPE Monte Carlo code had the best agreement with the spectrum measured at NIST. Corrections were applied to the free-air chamber measurements to arrive at an air-kerma strength determination for the miniature x-ray sources.

  14. The histone modifications governing TFF1 transcription mediated by estrogen receptor.

    PubMed

    Li, Yanyan; Sun, Luyang; Zhang, Yu; Wang, Dandan; Wang, Feng; Liang, Jing; Gui, Bin; Shang, Yongfeng

    2011-04-22

    Transcription regulation by histone modifications is a major contributing factor to the structural and functional diversity in biology. These modifications are encrypted as histone codes or histone languages and function to establish and maintain heritable epigenetic codes that define the identity and the fate of the cell. Despite recent advances revealing numerous histone modifications associated with transcription regulation, how such modifications dictate the process of transcription is not fully understood. Here we describe spatial and temporal analyses of the histone modifications that are introduced during estrogen receptor α (ERα)-activated transcription. We demonstrated that aborting RNA polymerase II caused a disruption of the histone modifications that are associated with transcription elongation but had a minimal effect on modifications deposited during transcription initiation. We also found that the histone H3S10 phosphorylation mark is catalyzed by mitogen- and stress-activated protein kinase 1 (MSK1) and is recognized by a 14-3-3ζ/14-3-3ε heterodimer through its interaction with H3K4 trimethyltransferase SMYD3 and the p52 subunit of TFIIH. We showed that H3S10 phosphorylation is a prerequisite for H3K4 trimethylation. In addition, we demonstrated that SET8/PR-Set7/KMT5A is required for ERα-regulated transcription and its catalyzed H4K20 monomethylation is implicated in both transcription initiation and elongation. Our experiments provide a relatively comprehensive analysis of histone modifications associated with ERα-regulated transcription and define the biological meaning of several key components of the histone code that governs ERα-regulated transcription.

  15. Standardizing texture and facies codes for a process-based classification of clastic sediment and rock

    USGS Publications Warehouse

    Farrell, K.M.; Harris, W.B.; Mallinson, D.J.; Culver, S.J.; Riggs, S.R.; Pierson, J.; ,; Lautier, J.C.

    2012-01-01

    Proposed here is a universally applicable, texturally based classification of clastic sediment that is independent of composition, cementation, and geologic environment, is closely allied to process sedimentology, and applies to all compartments in the source-to-sink system. The classification is contingent on defining the term "clastic" so that it is independent of composition or origin and includes any particles or grains that are subject to erosion, transportation, and deposition. To accommodate this, modifications to Folk's (1980) texturally based classification are proposed that apply new assumptions and define a broader array of textural fields. The revised ternary diagrams include additional textural fields that better define poorly sorted and coarse-grained deposits, so that all end members (gravel, sand, and mud size fractions) are included in textural codes. Revised textural fields, or classes, are based on a strict adherence to volumetric estimates of the percentages of the gravel, sand, and mud size grain populations, which by definition must sum to 100%. The new classification ensures that descriptors are applied consistently to all end members in the ternary diagram (gravel, sand, and mud) according to several rules, and that none of the end members are ignored. These modifications provide a basis for standardizing vertical displays of texture in graphic logs, lithofacies codes, and their derivatives, hydrofacies. Hydrofacies codes are nondirectional permeability indicators that predict aquifer or reservoir potential. Folk's (1980) ternary diagram for fine-grained clastic sediments (sand, silt, and clay size fractions) is also revised to preserve consistency with the revised diagram for gravel, sand, and mud.
Standardizing texture ensures that the principles of process sedimentology are consistently applied to compositionally variable rock sequences, such as mixed carbonate-siliciclastic ramp settings, and the extreme ends of depositional systems.
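The volumetric constraints described above (three end members that must sum to 100% by volume, with no end member ignored) can be sketched in code. This is a hypothetical illustration: the adjective ordering follows common Folk-style usage (e.g. "muddy sandy gravel"), but the 10% modifier cutoff is an assumption of this sketch, not one of the paper's revised field boundaries.

```python
# Folk-style adjective forms for the three clastic end members
ADJ = {"gravel": "gravelly", "sand": "sandy", "mud": "muddy"}

def textural_code(gravel, sand, mud):
    """Hypothetical textural-code sketch: percentages are volumetric and must
    sum to 100%; the 10% modifier cutoff is illustrative only."""
    if abs(gravel + sand + mud - 100.0) > 1e-9:
        raise ValueError("gravel + sand + mud must sum to 100% by volume")
    # Rank the end members: the most abundant becomes the root noun, and
    # subordinate fractions above the cutoff become adjectival modifiers,
    # least abundant furthest from the noun.
    ranked = sorted([(gravel, "gravel"), (sand, "sand"), (mud, "mud")], reverse=True)
    root = ranked[0][1]
    mods = [ADJ[name] for pct, name in reversed(ranked[1:]) if pct >= 10.0]
    return " ".join(mods + [root])
```

For example, `textural_code(70, 20, 10)` yields "muddy sandy gravel", while a sample with no gravel and 15% mud classifies as "muddy sand".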

  16. Gold emissivities for hydrocode applications

    NASA Astrophysics Data System (ADS)

    Bowen, C.; Wagon, F.; Galmiche, D.; Loiseau, P.; Dattolo, E.; Babonneau, D.

    2004-10-01

    The Radiom model [M. Busquet, Phys Fluids B 5, 4191 (1993)] is designed to provide a radiative-hydrodynamic code with non-local thermodynamic equilibrium (non-LTE) data efficiently by using LTE tables. Comparison with benchmark data [M. Klapisch and A. Bar-Shalom, J. Quant. Spectrosc. Radiat. Transf. 58, 687 (1997)] has shown Radiom to be inaccurate far from LTE and for heavy ions. In particular, the emissivity was found to be strongly underestimated. A recent algorithm, Gondor [C. Bowen and P. Kaiser, J. Quant. Spectrosc. Radiat. Transf. 81, 85 (2003)], was introduced to improve the gold non-LTE ionization and corresponding opacity. It relies on fitting the collisional ionization rate to reproduce benchmark data given by the Averroès superconfiguration code [O. Peyrusse, J. Phys. B 33, 4303 (2000)]. Gondor is extended here to gold emissivity calculations, with two simple modifications of the two-level atom line source function used by Radiom: (a) a larger collisional excitation rate and (b) the addition of a Planckian source term, fitted to spectrally integrated Averroès emissivity data. This approach improves the agreement between experiments and hydrodynamic simulations.
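For orientation, the two-level-atom line source function that these modifications target has the textbook form below (standard radiative-transfer notation, not reproduced from the paper; stimulated emission is neglected in the thermalization parameter):

```latex
S_\nu = (1 - \epsilon)\,\bar{J}_\nu + \epsilon\, B_\nu(T_e),
\qquad
\epsilon \approx \frac{C_{21}}{C_{21} + A_{21}}
```

where \(\bar{J}_\nu\) is the mean intensity, \(B_\nu\) the Planck function, \(A_{21}\) the spontaneous emission rate, and \(C_{21}\) the collisional de-excitation rate. Read in these terms, modification (a) raises \(C_{21}\) and hence \(\epsilon\), while modification (b) amounts to adding an extra fitted Planckian term to \(S_\nu\).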

  17. Ground motion models used in the 2014 U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.

    2015-01-01

    The National Seismic Hazard Maps (NSHMs) are an important component of seismic design regulations in the United States. This paper compares hazard computed with the new suite of ground motion models (GMMs) against hazard computed with the suite of GMMs applied in the previous version of the maps; the new source characterization models are used for both cases. A previous paper (Rezaeian et al. 2014) discussed the five NGA-West2 GMMs used for shallow crustal earthquakes in the Western United States (WUS), which are also summarized here. Our focus in this paper is on GMMs for earthquakes in stable continental regions in the Central and Eastern United States (CEUS), as well as subduction interface and deep intraslab earthquakes. We consider building code hazard levels for peak ground acceleration (PGA) and 0.2-s and 1.0-s spectral accelerations (SAs) on uniform firm-rock site conditions. At most U.S. sites, the GMM modifications in the updated maps changed hazard by 5% to 20% in the WUS, decreased hazard by 5% to 20% in the CEUS, changed hazard by 5% to 15% for subduction interface earthquakes, and, for deep intraslab earthquakes, produced decreases of up to 50% and increases of up to 30%. These modifications were combined with changes resulting from modifications in the source characterization models to obtain the new hazard maps.

  18. Harnessing epigenome modifications for better crops

    USDA-ARS?s Scientific Manuscript database

    Chemical DNA modifications such as methylation influence translation of the DNA code to specific genetic outcomes. While such modifications can be heritable, others are transient, and their overall contribution to plant genetic diversity remains intriguing but uncertain. The focus of this article is...

  19. Assessment and Mitigation of Radiation, EMP, Debris & Shrapnel Impacts at Megajoule-Class Laser Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eder, D C; Anderson, R W; Bailey, D S

    2009-10-05

    The generation of neutron/gamma radiation, electromagnetic pulses (EMP), debris, and shrapnel at megajoule-class laser facilities (NIF and LMJ) impacts experiments conducted at these facilities. The complex 3D numerical codes used to assess these impacts range from an established code that required minor modifications (MCNP, which calculates neutron and gamma radiation levels in complex geometries), through a code that required significant modifications to treat new phenomena (EMSolve, which calculates EMP from electrons escaping from laser targets), to a new code, ALE-AMR, that is being developed through a joint collaboration between LLNL, CEA, and UC (UCSD, UCLA, and LBL) for debris and shrapnel modelling.

  20. MELCOR/CONTAIN LMR Implementation Report. FY14 Progress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, Larry L; Louie, David L.Y.

    2014-10-01

    This report describes the preliminary implementation of sodium thermophysical properties and the design documentation for the sodium models of CONTAIN-LMR to be implemented in MELCOR 2.1. In the past year, the implementation included two separate sets of sodium properties from two different sources. The first source is based on previous work by Idaho National Laboratory, which modified MELCOR to include the liquid lithium equation of state as a working fluid for nuclear fusion safety research. To minimize the impact on MELCOR, the fusion safety database (FSD) was implemented by using detection of the data input file as the means of invoking the FSD. The FSD methodology has been adopted for the current work, but it may be subject to modification as the project continues. The second source uses properties generated for the SIMMER code. Preliminary testing and results from this implementation of the sodium properties are given. This year, the design document for the CONTAIN-LMR sodium models, such as the two-condensable option, sodium spray fire, and sodium pool fire, is being developed; it is intended to serve as a guide for the MELCOR implementation. In addition, the CONTAIN-LMR code used was based on an early version of the CONTAIN code, so many physical models developed since that version may not be captured. Although CONTAIN 2, which represents the latest development of CONTAIN, contains some sodium-specific models, these are incomplete; CONTAIN 2, with all sodium models implemented from CONTAIN-LMR, should therefore be used as a comparison code for MELCOR. This comparison should be completed early next year, while the sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use.

  1. Performance Study of Monte Carlo Codes on Xeon Phi Coprocessors — Testing MCNP 6.1 and Profiling ARCHER Geometry Module on the FS7ONNi Problem

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Wolfe, Noah; Lin, Hui; Zieb, Kris; Ji, Wei; Caracappa, Peter; Carothers, Christopher; Xu, X. George

    2017-09-01

    This paper contains two parts revolving around Monte Carlo transport simulation on Intel Many Integrated Core coprocessors (MIC, also known as Xeon Phi). (1) MCNP 6.1 was recompiled into multithreading (OpenMP) and multiprocessing (MPI) forms, respectively, without modification to the source code. The new codes were tested on a 60-core 5110P MIC. The test case was FS7ONNi, a radiation shielding problem used in MCNP's verification and validation suite. Both codes were observed to run slower on the MIC than on a 6-core X5650 CPU, by a factor of 4 for the MPI code and, abnormally, 20 for the OpenMP code, and both exhibited limited strong-scaling capability. (2) We have recently added a Constructive Solid Geometry (CSG) module to our ARCHER code to provide better support for geometry modelling in radiation shielding simulation. The functions of this module are frequently called in the particle random walk process. To identify the performance bottleneck we developed a CSG proxy application and profiled the code using the geometry data from FS7ONNi. The profiling data showed that the code was primarily memory-latency bound on the MIC. This study suggests that despite the low initial porting effort, Monte Carlo codes do not naturally lend themselves to the MIC platform, just as with GPUs, and that the memory latency problem needs to be addressed in order to achieve a decent performance gain.

  2. 75 FR 6252 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49... Transportation, Inc. seeks approval of the proposed modification of the bridge tender controlled signals to...

  3. A Web-based open-source database for the distribution of hyperspectral signatures

    NASA Astrophysics Data System (ADS)

    Ferwerda, J. G.; Jones, S. D.; Du, Pei-Jun

    2006-10-01

    With the coming of age of field spectroscopy as a non-destructive means to collect information on the physiology of vegetation, there is a need for storage of signatures and, more importantly, their metadata. Without the proper organisation of metadata, the signatures themselves are of limited use. In order to facilitate re-distribution of data, a database for the storage and distribution of hyperspectral signatures and their metadata was designed. The database was built using open-source software and can be used by the hyperspectral community to share their data. Data are uploaded through a simple web-based interface, and the database recognizes major file formats by ASD, GER, and International Spectronics. The database source code is available for download through the hyperspectral.info web domain, and we happily invite suggestions for additions and modifications to the database, to be submitted through the online forums on the same website.

  4. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
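To make the idea concrete: with a linear code of parity-check matrix H, the compressed representation of a source block x is its syndrome s = Hx (mod 2), and the decompressor outputs the lowest-weight sequence in the coset indexed by s. Below is a minimal sketch using the (7,4) Hamming code, chosen purely for illustration; the paper does not commit to this particular code.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j holds the binary
# expansion of j+1, so every nonzero syndrome names a unique bit position.
H = np.array([[int(b) for b in f"{c:03b}"] for c in range(1, 8)]).T  # shape (3, 7)

def compress(x):
    """Treat the 7-bit source block x as an 'error pattern'; its 3-bit
    syndrome is the compressed data (7 digits in, 3 digits out)."""
    return H.dot(x) % 2

def decompress(s):
    """Recover the minimum-weight pattern (weight <= 1) with syndrome s."""
    x = np.zeros(7, dtype=int)
    pos = int("".join(str(b) for b in s), 2)  # read the syndrome as a binary number
    if pos:
        x[pos - 1] = 1  # by construction of H this is the coset leader
    return x
```

This toy scheme is lossless only when each 7-bit block contains at most a single 1, i.e. for sufficiently sparse low-entropy sources; longer codes matched to the source statistics push the rate toward the entropy, and the paper's universal generalization addresses ensembles of sources.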

  5. MedlinePlus Connect

    MedlinePlus

    ... code requests: Problems/Diagnoses • ICD-9-CM (International Classification of Disease, 9 th edition, Clinical Modification) • ICD-10-CM (International Classification of Disease, 10 th edition, Clinical Modification) • SNOMED ...

  6. Issues in Developing a Surveillance Case Definition for Nonfatal Suicide Attempt and Intentional Self-harm Using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) Coded Data.

    PubMed

    Hedegaard, Holly; Schoenbaum, Michael; Claassen, Cynthia; Crosby, Alex; Holland, Kristin; Proescholdbell, Scott

    2018-02-01

    Suicide and intentional self-harm are among the leading causes of death in the United States. To study this public health issue, epidemiologists and researchers often analyze data coded using the International Classification of Diseases (ICD). Prior to October 1, 2015, health care organizations and providers used the clinical modification of the Ninth Revision of ICD (ICD-9-CM) to report medical information in electronic claims data. The transition in October 2015 to use of the clinical modification of the Tenth Revision of ICD (ICD-10-CM) resulted in the need to update methods and selection criteria previously developed for ICD-9-CM coded data. This report provides guidance on the use of ICD-10-CM codes to identify cases of nonfatal suicide attempts and intentional self-harm in ICD-10-CM coded data sets. ICD-10-CM codes for nonfatal suicide attempts and intentional self-harm include: X71-X83, intentional self-harm due to drowning and submersion, firearms, explosive or thermal material, sharp or blunt objects, jumping from a high place, jumping or lying in front of a moving object, crashing of motor vehicle, and other specified means; T36-T50 with a 6th character of 2 (except for T36.9, T37.9, T39.9, T41.4, T42.7, T43.9, T45.9, T47.9, and T49.9, which are included if the 5th character is 2), intentional self-harm due to drug poisoning (overdose); T51-T65 with a 6th character of 2 (except for T51.9, T52.9, T53.9, T54.9, T56.9, T57.9, T58.0, T58.1, T58.9, T59.9, T60.9, T61.0, T61.1, T61.9, T62.9, T63.9, T64.0, T64.8, and T65.9, which are included if the 5th character is 2), intentional self-harm due to toxic effects of nonmedicinal substances; T71 with a 6th character of 2, intentional self-harm due to asphyxiation, suffocation, strangulation; and T14.91, Suicide attempt. Issues to consider when selecting records for nonfatal suicide attempts and intentional self-harm from ICD-10-CM coded administrative data sets are also discussed. 
All material appearing in this report is in the public domain and may be reproduced or copied without permission; citation as to source, however, is appreciated.
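The selection rules above lend themselves to a short checker. The sketch below illustrates the stated logic only; it is not a validated surveillance implementation, and real-world use would require the full ICD-10-CM code set and record-level context.

```python
def is_self_harm(icd10cm):
    """Sketch of the surveillance case definition described above; codes may
    be given with or without the dot (e.g. 'T39.012A'). Illustrative only."""
    code = icd10cm.upper().replace(".", "")
    if code == "T1491":  # T14.91, Suicide attempt
        return True
    if code.startswith("X") and code[1:3].isdigit() and 71 <= int(code[1:3]) <= 83:
        return True  # X71-X83, intentional self-harm by external cause
    # Subcategories whose intent character ('2' = intentional self-harm) sits
    # in the 5th position instead of the usual 6th (the exceptions listed above).
    fifth_char = {"T369", "T379", "T399", "T414", "T427", "T439", "T459", "T479",
                  "T499", "T519", "T529", "T539", "T549", "T569", "T579", "T580",
                  "T581", "T589", "T599", "T609", "T610", "T611", "T619", "T629",
                  "T639", "T640", "T648", "T659"}
    if code.startswith("T") and code[1:3].isdigit():
        cat = int(code[1:3])
        if 36 <= cat <= 65 or cat == 71:  # poisonings, toxic effects, asphyxiation
            if code[:4] in fifth_char:
                return len(code) >= 5 and code[4] == "2"
            return len(code) >= 6 and code[5] == "2"
    return False
```

For instance, a poisoning code with intent character 2 in the 6th position (T39.012A) matches, while the same code with intent 1 (accidental) does not.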

  7. Incidence Rates and Trend of Serious Farm-Related Injury in Minnesota, 2000-2011.

    PubMed

    Landsteiner, Adrienne M K; McGovern, Patricia M; Alexander, Bruce H; Lindgren, Paula G; Williams, Allan N

    2015-01-01

    Only about 2% of Minnesota's workers were employed in agriculture for the years 2005-2012, yet this small portion of the workforce accounted for 31% of the 563 work-related deaths that occurred in Minnesota during that same time period. Agricultural fatalities in Minnesota and elsewhere are well documented; however, nonfatal injuries are not. To explore the burden of injury, Minnesota hospital discharge data were used to examine rates and trends of farm injury for the years 2000-2011. Cases were identified through the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM), injury codes and external cause of injury codes (E codes). Probable cases were defined as E code E849.1 (occurred on a farm) or E919.0 (involving agricultural machinery). Possible cases were based on five less specific E codes primarily involving animals or pesticides. Multiple data sources were used to estimate the agricultural population. An annual average of over 500 cases was identified as probable, whereas 2,000 cases were identified as possible. Trend analysis of all identified cases indicated a small but significant average annual increase of 1.5% for the time period 2000-2011. Probable cases were predominantly male (81.5%), whereas possible cases were predominantly female (63.9%). The average age of an injury case was 38.5 years, with the majority of injuries occurring in late summer and fall months. Despite the undercount of less serious injuries, hospital discharge data provide a meaningful data source for the identification and surveillance of nonfatal agricultural injuries. These methods could be utilized by other states for ongoing surveillance of nonfatal agricultural injuries.
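The probable/possible case rule can be sketched in a few lines. The two probable E codes are taken from the abstract; the five less-specific "possible" codes are not enumerated there, so the sketch deliberately takes them as a caller-supplied set rather than inventing them.

```python
# The two 'probable' E codes named in the abstract:
PROBABLE = {"E849.1", "E919.0"}  # occurred on a farm; agricultural machinery

def classify(e_codes, possible_codes):
    """Return 'probable', 'possible', or None for one discharge record.
    possible_codes: the study's five less-specific E codes (not listed in
    the abstract, so they must be supplied by the caller)."""
    codes = {c.upper() for c in e_codes}
    if codes & PROBABLE:
        return "probable"  # probable takes precedence over possible
    if codes & {c.upper() for c in possible_codes}:
        return "possible"
    return None
```

A record carrying E849.1 classifies as probable regardless of what else it carries; a record matching only a caller-supplied possible code classifies as possible.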

  8. Report of experiments and evidence for ASC L2 milestone 4467 : demonstration of a legacy application's path to exascale.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke

    2012-03-01

    This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012, and describes their impact on ASC applications. Most contributions are implemented at lower software levels, allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467, Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determining where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. 
The results were that applications can expect to encounter performance issues related to the computing environment, system software, and algorithms. Careful profiling of runtime performance, combined with knowledge of system software and application source code, will be needed to identify the source of an issue.

  9. A Numerical Study of the Non-Ideal Behavior, Parameters, and Novel Applications of an Electrothermal Plasma Source

    NASA Astrophysics Data System (ADS)

    Winfrey, A. Leigh

    Electrothermal plasma sources have numerous applications, including hypervelocity launchers, fusion reactor pellet injection, and space propulsion systems. The time evolution of important plasma parameters at the source exit is important in determining the suitability of the source for different applications. In this study a capillary discharge code has been modified to incorporate non-ideal behavior by using an exact analytical model for the Coulomb logarithm in the plasma electrical conductivity formula. Actual discharge currents from electrothermal plasma experiments were used, and code results for both ideal and non-ideal plasma models were compared to experimental data, specifically the ablated mass from the capillary and the electrical conductivity as measured by the discharge current and the voltage. Electrothermal plasma sources operating in the ablation-controlled arc regime use discharge currents with pulse lengths between 100 µs and 1 ms. Faster, longer, or extended flat-top pulses can also be generated to satisfy various applications of ET sources. Extension of the peak current for up to an additional 1000 µs was tested. Calculations for non-ideal and ideal plasma models show that extended flat-top pulses produce more ablated mass, which scales linearly with increased pulse length while other parameters remain almost constant. A new configuration of the PIPE source has been proposed in order to investigate the formation of plasmas from mixed materials. The electrothermal segmented plasma source can be used for studies related to surface coatings, surface modification, ion implantation, materials synthesis, and the physics of complex mixed plasmas. This source is a capillary discharge in which the ablation liner is made from segments of different materials instead of a single sleeve. This system should allow for the modeling and characterization of the growth plasma, as it provides all materials needed for fabrication through the same method. 
An ablation-free capillary discharge computer code has been developed to model plasma flow and the acceleration of pellets for fusion fueling in magnetic fusion reactors. Two case studies, with and without ablation and with different source configurations, have been examined here. Velocities necessary for fusion fueling have been achieved. New additions to the code model incorporate radial heat and energy transfer and move ETFLOW towards being a 2-D model of the plasma flow. This semi-2-D approach gives a view of the behavior of the plasma inside the capillary as it is affected by important physical parameters, such as radial thermal heat conduction, and of their effect on wall ablation.

  10. 75 FR 6251 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49... Company (BNSF) seeks approval of the proposed modification to the traffic control signal system over the...

  11. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.

  12. Parallelization of sequential Gaussian, indicator and direct simulation algorithms

    NASA Astrophysics Data System (ADS)

    Nunes, Ruben; Almeida, José A.

    2010-08-01

    Improving the performance and robustness of algorithms on new high-performance parallel computing architectures is a key issue in efficiently performing 2D and 3D studies with large amounts of data. In geostatistics, sequential simulation algorithms are good candidates for parallelization. When compared with other computational applications in geosciences (such as fluid flow simulators), sequential simulation software is not extremely computationally intensive, but parallelization can make it more efficient and creates alternatives for its integration in inverse modelling approaches. This paper describes the implementation and benchmarking of parallel versions of the three classic sequential simulation algorithms: direct sequential simulation (DSS), sequential indicator simulation (SIS) and sequential Gaussian simulation (SGS). For this purpose, the source used was GSLIB, but the entire code was extensively modified to take into account the parallelization approach and was also rewritten in the C programming language. The paper also explains the parallelization strategy and the main modifications in detail. Regarding the integration of secondary information, the DSS algorithm is able to perform simple kriging with local means, kriging with an external drift and collocated cokriging with both local and global correlations. SIS includes a local correction of probabilities. Finally, a brief comparison is presented of simulation results using one, two and four processors. All performance tests were carried out on 2D soil data samples. The source code is completely open source and easy to read. It should be noted that the code is only fully compatible with Microsoft Visual C and should be adapted for other systems/compilers.
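The common skeleton of DSS, SIS and SGS is a random path over the grid, a local kriging system at each node, and a draw from the resulting conditional distribution; it is this per-node loop that parallelization must tackle. Below is a deliberately minimal 1-D sequential Gaussian simulation sketch (simple kriging with zero mean and an exponential covariance), offered as an illustration of the algorithm family only, not as the rewritten GSLIB/C code described in the paper.

```python
import numpy as np

def sgs_1d(n, rng, vrange=5.0, sill=1.0, max_neigh=8):
    """Minimal 1-D sequential Gaussian simulation (simple kriging, zero mean)."""
    cov = lambda h: sill * np.exp(-3.0 * np.abs(h) / vrange)  # exponential model
    values = np.full(n, np.nan)
    for idx in rng.permutation(n):                 # random visiting path
        known = np.flatnonzero(~np.isnan(values))
        if known.size == 0:                        # first node: unconditional draw
            values[idx] = rng.normal(0.0, np.sqrt(sill))
            continue
        near = known[np.argsort(np.abs(known - idx))][:max_neigh]
        C = cov(near[:, None] - near[None, :])     # data-to-data covariances
        c = cov(near - idx)                        # data-to-node covariances
        w = np.linalg.solve(C, c)                  # simple-kriging weights
        mean = w @ values[near]
        var = max(sill - w @ c, 1e-12)             # simple-kriging variance
        values[idx] = rng.normal(mean, np.sqrt(var))
    return values
```

Parallelizing this loop is nontrivial precisely because each draw conditions on earlier draws along the path, which is why the parallelization strategy itself is a central contribution of the paper.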

  13. Coding paediatric outpatient data to provide health planners with information on children with chronic conditions and disabilities.

    PubMed

    Craig, Elizabeth; Kerr, Neal; McDonald, Gabrielle

    2017-03-01

    In New Zealand, there is a paucity of information on children with chronic conditions and disabilities (CCD). One reason is that many are managed in hospital outpatient settings, where diagnostic coding of health-care events does not occur. This study explores the feasibility of coding paediatric outpatient data to provide health planners with information on children with CCD. Thirty-seven clinicians from six District Health Boards (DHBs) trialled coding over 12 weeks. In five DHBs, the International Classification of Diseases and Related Health Problems, 10th Edition, Australian Modification (ICD-10-AM) and Systematised Nomenclature of Medicine Clinical Terms (SNOMED-CT) were trialled for 6 weeks each. In one DHB, ICD-10-AM was trialled for 12 weeks. A random sample (30%) of ICD-10-AM coded events was also coded by clinical coders. A mix of paper and electronic methods was used. In total, 2,604 outpatient events were coded in ICD-10-AM and 693 in SNOMED-CT. Dual coding occurred for 770 (29.6%) ICD-10-AM events. Overall, 34% of ICD-10-AM and 40% of SNOMED-CT events were for developmental and behavioural disorders. Chronic medical conditions were also common. Clinicians were concerned about the workload impacts, particularly for paper-based methods. Coders were concerned about clinicians' adherence to coding guidelines and the poor quality of documentation in some notes. Coded outpatient data could provide planners with a rich source of information on children with CCD. However, coding is also resource intensive, so its costs need to be weighed against the costs of managing a much larger health budget using very limited information. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).

  14. User's guide to SEAWAT; a computer program for simulation of three-dimensional variable-density ground-water flow

    USGS Publications Warehouse

    Guo, Weixing; Langevin, C.D.

    2002-01-01

    This report documents a computer program (SEAWAT) that simulates variable-density, transient, ground-water flow in three dimensions. The source code for SEAWAT was developed by combining MODFLOW and MT3DMS into a single program that solves the coupled flow and solute-transport equations. The SEAWAT code follows a modular structure, and thus, new capabilities can be added with only minor modifications to the main program. SEAWAT reads and writes standard MODFLOW and MT3DMS data sets, although some extra input may be required for some SEAWAT simulations. This means that many of the existing pre- and post-processors can be used to create input data sets and analyze simulation results. Users familiar with MODFLOW and MT3DMS should have little difficulty applying SEAWAT to problems of variable-density ground-water flow.
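The coupling hinges on making fluid density a function of solute concentration. A linear equation of state of the following form is commonly associated with SEAWAT; the slope value quoted here is the one typically cited for seawater-like salt and should be treated as an assumption of this note, not as a value taken from this report:

```latex
\rho = \rho_f + \frac{\partial \rho}{\partial C}\, C,
\qquad
\frac{\partial \rho}{\partial C} \approx 0.7143
```

where \(\rho_f\) is the freshwater density and \(C\) the solute concentration. In this arrangement, the MODFLOW-derived flow solution uses the concentration-dependent density while MT3DMS updates \(C\) at each transport step.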

  15. Modification of orthogonal tRNAs: unexpected consequences for sense codon reassignment.

    PubMed

    Biddle, Wil; Schmitt, Margaret A; Fisk, John D

    2016-12-01

    Breaking the degeneracy of the genetic code via sense codon reassignment has emerged as a way to incorporate multiple copies of multiple non-canonical amino acids into a protein of interest. Here, we report the modification of a normally orthogonal tRNA by a host enzyme and show that this adventitious modification has a direct impact on the activity of the orthogonal tRNA in translation. We observed nearly equal decoding of both histidine codons, CAU and CAC, by an engineered orthogonal M. jannaschii tRNA with an AUG anticodon, tRNA-Opt(AUG). We suspected a modification of the tRNA-Opt(AUG) anticodon was responsible for the anomalous lack of codon discrimination and demonstrate that adenosine 34 of tRNA-Opt(AUG) is converted to inosine. We identified tRNA-Opt(AUG) anticodon loop variants that increase reassignment of the histidine CAU codon, decrease incorporation in response to the histidine CAC codon, and improve cell health and growth profiles. Recognizing tRNA modification as both a potential pitfall and an avenue of directed alteration will be important as the field of genetic code engineering continues to infiltrate the genetic codes of diverse organisms. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. 40 CFR 52.736 - Review of new sources and modifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sources and modifications. (a) [Reserved] (b) The rules submitted by the State on March 24, 1988, to... Sources Construction and Modification as effective March 22, 1991. The moratorium on construction and...

  17. Determining Epigenetic Targets: A Beginner's Guide to Identifying Genome Functionality Through Database Analysis.

    PubMed

    Hay, Elizabeth A; Cowie, Philip; MacKenzie, Alasdair

    2017-01-01

    There can now be little doubt that the cis-regulatory genome represents the largest information source within the human genome essential for health. In addition to containing up to five times more information than the coding genome, the cis-regulatory genome also acts as a major reservoir of disease-associated polymorphic variation. The cis-regulatory genome, which is comprised of enhancers, silencers, promoters, and insulators, also acts as a major functional target for epigenetic modification including DNA methylation and chromatin modifications. These epigenetic modifications impact the ability of cis-regulatory sequences to maintain tissue-specific and inducible expression of genes that preserve health. There has been limited ability to identify and characterize the functional components of this huge and largely misunderstood part of the human genome that, for decades, was ignored as "Junk" DNA. In an attempt to address this deficit, the current chapter will first describe methods of identifying and characterizing functional elements of the cis-regulatory genome at a genome-wide level using databases such as ENCODE, the UCSC browser, and NCBI. We will then explore the databases on the UCSC genome browser, which provides access to DNA methylation and chromatin modification datasets. Finally, we will describe how we can superimpose the huge volume of study data contained in the NCBI archives onto that contained within the UCSC browser in order to glean relevant in vivo study data for any locus within the genome. An ability to access and utilize these information sources will become essential to informing the future design of experiments and subsequent determination of the role of epigenetics in health and disease and will form a critical step in our development of personalized medicine.

  18. Power-on performance predictions for a complete generic hypersonic vehicle configuration

    NASA Technical Reports Server (NTRS)

    Bennett, Bradford C.

    1991-01-01

    The Compressible Navier-Stokes (CNS) code was developed to compute external hypersonic flow fields and has been applied to various external hypersonic flow applications. Here, the CNS code was modified to compute hypersonic internal flow fields. Calculations were performed on a Mach 18 sidewall compression inlet and on the Lewis Mach 5 inlet. The use of the ARC3D diagonal algorithm for internal flows was evaluated on the Mach 5 inlet flow. The initial modifications to the CNS code involved generalizing the boundary conditions, adding viscous terms in the second crossflow direction, and modifying the Baldwin-Lomax turbulence model for corner flows.

  19. Recent Updates to the MELCOR 1.8.2 Code for ITER Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrill, Brad J

    This report documents recent changes made to the MELCOR 1.8.2 computer code for application to the International Thermonuclear Experimental Reactor (ITER), as required by ITER Task Agreement ITA 81-18. There are four areas of change documented by this report. The first is the addition to this code of a model for transporting HTO. The second is the updating of the material oxidation correlations to match those specified in the ITER Safety Analysis Data List (SADL). The third replaces a modification to an aerosol transport subroutine that specified the nominal aerosol density internally with one that allows the user to specify this density through user input. The fourth corrects an error that existed in an air condensation subroutine of previous versions of this modified MELCOR code. The appendices of this report contain FORTRAN listings of the coding for these modifications.

  20. Modeling high-temperature superconductors and metallic alloys on the Intel IPSC/860

    NASA Astrophysics Data System (ADS)

    Geist, G. A.; Peyton, B. W.; Shelton, W. A.; Stocks, G. M.

    Oak Ridge National Laboratory has embarked on several computational Grand Challenges, which require the close cooperation of physicists, mathematicians, and computer scientists. One of these projects is the determination of the material properties of alloys from first principles and, in particular, the electronic structure of high-temperature superconductors. While the present focus of the project is on superconductivity, the approach is general enough to permit study of other properties of metallic alloys such as strength and magnetic properties. This paper describes the progress to date on this project. We include a description of a self-consistent KKR-CPA method, parallelization of the model, and the incorporation of a dynamic load balancing scheme into the algorithm. We also describe the development and performance of a consolidated KKR-CPA code capable of running on CRAYs, workstations, and several parallel computers without source code modification. Performance of this code on the Intel iPSC/860 is also compared to a CRAY 2, CRAY YMP, and several workstations. Finally, some density of state calculations of two perovskite superconductors are given.
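    The abstract above mentions incorporating a dynamic load balancing scheme into the parallel KKR-CPA algorithm. As a minimal sketch of what dynamic load balancing means in this setting, the fragment below has workers pull tasks from a shared queue as they finish, so faster processors stay busy; it is illustrative Python, not the KKR-CPA code's actual scheme, and the per-task computation is a stand-in.

    ```python
    # Dynamic load-balancing sketch: workers pull per-site tasks from a
    # shared queue as they become free, rather than receiving a fixed
    # static partition of the work up front.
    import queue
    import threading

    tasks = queue.Queue()
    for k in range(20):
        tasks.put(k)                  # e.g. one task per lattice site

    results = {}
    lock = threading.Lock()

    def worker():
        while True:
            try:
                k = tasks.get_nowait()
            except queue.Empty:
                return                # queue drained: this worker is done
            val = k * k               # stand-in for a per-site scattering solve
            with lock:
                results[k] = val

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    assert results == {k: k * k for k in range(20)}
    ```

    The benefit over a static split is that uneven task costs (or uneven processor speeds) no longer leave some workers idle while others finish their share.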

  1. The Role of Sulforaphane in Epigenetic Mechanisms, Including Interdependence between Histone Modification and DNA Methylation

    PubMed Central

    Kaufman-Szymczyk, Agnieszka; Majewski, Grzegorz; Lubecka-Pietruszewska, Katarzyna; Fabianowska-Majewska, Krystyna

    2015-01-01

    Carcinogenesis as well as cancer progression result from genetic and epigenetic changes of the genome that lead to dysregulation of the transcriptional activity of genes. Epigenetic mechanisms in cancer cells comprise (i) post-translational histone modification (i.e., deacetylation and methylation); (ii) global DNA hypomethylation; (iii) promoter hypermethylation of tumour suppressor genes and genes important for cell cycle regulation, cell differentiation and apoptosis; and (iv) post-transcriptional regulation of gene expression by non-coding microRNA. These epigenetic aberrations can be readily reversible and responsive to both synthetic agents and natural components of the diet. One source of such dietary components is cruciferous vegetables, which contain high levels of glucosinolates and, after enzymatic hydrolysis, deliver sulforaphane and other bioactive isothiocyanates that are involved in the effective up-regulation of the transcriptional activity of certain genes and in the restoration of active chromatin structure. Thus consumption of cruciferous vegetables, as a source of isothiocyanates, seems potentially useful as an effective cancer-preventive factor or as a source of nutrients improving the efficacy of standard chemotherapies. In this review an attempt is made to elucidate the role of sulforaphane in the regulation of gene promoter activity, through direct down-regulation of histone deacetylase activity and indirect alteration of gene promoter methylation; the influence of sulforaphane on non-coding microRNA is not a subject of this review. PMID:26703571

  2. Core Physics and Kinetics Calculations for the Fissioning Plasma Core Reactor

    NASA Technical Reports Server (NTRS)

    Butler, C.; Albright, D.

    2007-01-01

    Highly efficient, compact nuclear reactors would provide high-specific-impulse spacecraft propulsion. This analysis and numerical simulation effort has focused on the technical feasibility issues related to the nuclear design characteristics of a novel reactor design. The Fissioning Plasma Core Reactor (FPCR) is a shockwave-driven gaseous-core nuclear reactor that uses magnetohydrodynamic effects to generate electric power for propulsion. The nuclear design of the system depends on two major calculations: core physics calculations and kinetics calculations. Presently, core physics calculations have concentrated on the use of the MCNP4C code; however, initial results from other codes such as COMBINE/VENTURE and SCALE4a are also shown. Several significant modifications were made to the ISR-developed QCALC1 kinetics analysis code. These modifications include testing the state of the core materials, an improved calculation of the material properties of the core, the addition of an adiabatic core temperature model, and improvement of the first-order reactivity correction model. The accuracy of these modifications has been verified, and the accuracy of the point-core kinetics model used by the QCALC1 code has also been validated. Previously calculated kinetics results for the FPCR were described in the ISR report, "QCALC1: A Code for FPCR Kinetics Model Feasibility Analysis," dated June 1, 2002.
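    The abstract refers to a point-core kinetics model. QCALC1's internals are not given here, but the general form such codes integrate is the point kinetics equations; the sketch below uses a single delayed-neutron group with illustrative parameter values (beta, lam, Lam are generic, not taken from QCALC1) and a simple explicit Euler step.

    ```python
    # One-delayed-group point kinetics sketch (illustrative parameters):
    #   dn/dt = ((rho - beta)/Lam) * n + lam * C
    #   dC/dt = (beta/Lam) * n - lam * C
    # n = neutron population, C = delayed-neutron precursor concentration.
    beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)

    def step(n, C, rho, dt):
        dn = ((rho - beta) / Lam) * n + lam * C
        dC = (beta / Lam) * n - lam * C
        return n + dt * dn, C + dt * dC

    # At rho = 0, the critical steady state C = beta*n/(lam*Lam) is preserved.
    n, C = 1.0, beta / (lam * Lam)
    for _ in range(100_000):
        n, C = step(n, C, rho=0.0, dt=1e-6)
    assert abs(n - 1.0) < 1e-6

    # A small positive reactivity insertion makes the power rise.
    n2, C2 = 1.0, beta / (lam * Lam)
    for _ in range(100_000):
        n2, C2 = step(n2, C2, rho=0.1 * beta, dt=1e-6)
    assert n2 > 1.0
    ```

    A production code would use a stiff integrator and several precursor groups; the point here is only the structure of the point-core model being validated in the abstract.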

  3. 78 FR 12823 - Notice of Joint Application for Approval of Discontinuance or Modification of a Railroad Signal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-25

    ... of Joint Application for Approval of Discontinuance or Modification of a Railroad Signal System In... Administration (FRA) seeking approval for the discontinuance or modification of a signal system. FRA assigned the.... David B. Olson, Chief Engineer Communication and Signals, 500 Water Street, Speed Code J-350...

  4. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome-source-coding is formulated which provides robustly effective, distortionless coding of source ensembles.
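    The idea above can be made concrete with the smallest standard example: under the assumption that source blocks are sparse (here, weight at most 1), the (7,4) Hamming code's parity-check matrix maps each 7-bit block to a 3-bit syndrome, and the decoder inverts this exactly because every weight-≤1 "error pattern" has a distinct syndrome. This is a toy sketch of the mechanism, not the paper's universal scheme.

    ```python
    # Syndrome source coding sketch with the (7,4) Hamming code.
    # A sparse binary source block (weight <= 1) is treated as an error
    # pattern; its 3-bit syndrome is the compressed representation.

    # Parity-check matrix H: column j is the binary expansion of j+1, so
    # every single-bit pattern has a distinct nonzero syndrome.
    H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(3)]

    def syndrome(block):
        """Compress: 7 source bits -> 3 syndrome bits (mod-2 inner products)."""
        return [sum(h * b for h, b in zip(row, block)) % 2 for row in H]

    def decode(s):
        """Decompress: map the syndrome back to the unique weight-<=1 pattern."""
        pos = sum(bit << i for i, bit in enumerate(s))  # 0 means all-zero block
        block = [0] * 7
        if pos:
            block[pos - 1] = 1
        return block

    for src in ([0] * 7, [0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1]):
        assert decode(syndrome(src)) == src  # exact recovery for sparse blocks
    ```

    The compression rate here is 3/7 compressed digits per source digit; the paper's result is that, with suitable codes, this rate can approach the source entropy with vanishing distortion.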

  5. 3D Field Modifications of Core Neutral Fueling In the EMC3-EIRENE Code

    NASA Astrophysics Data System (ADS)

    Waters, Ian; Frerichs, Heinke; Schmitz, Oliver; Ahn, Joon-Wook; Canal, Gustavo; Evans, Todd; Feng, Yuehe; Kaye, Stanley; Maingi, Rajesh; Soukhanovskii, Vsevolod

    2017-10-01

    The application of 3-D magnetic field perturbations to the edge plasmas of tokamaks has long been seen as a viable way to control damaging Edge Localized Modes (ELMs). These 3-D fields have also been correlated with a density drop in the core plasmas of tokamaks; known as `pump-out'. While pump-out is typically explained as the result of enhanced outward transport, degraded fueling of the core may also play a role. By altering the temperature and density of the plasma edge, 3-D fields will impact the distribution function of high energy neutral particles produced through ion-neutral energy exchange processes. Starved of the deeply penetrating neutral source, the core density will decrease. Numerical studies carried out with the EMC3-EIRENE code on National Spherical Tokamak eXperiment-Upgrade (NSTX-U) equilibria show that this change to core fueling by high energy neutrals may be a significant contributor to the overall particle balance in the NSTX-U tokamak: deep core (Ψ < 0.5) fueling from neutral ionization sources is decreased by 40-60% with RMPs. This work was funded by the US Department of Energy under Grant DE-SC0012315.

  6. 21 CFR 640.74 - Modification of Source Plasma.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Modification of Source Plasma. 640.74 Section 640...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.74 Modification of Source Plasma. (a) Upon approval by the Director, Center for Biologics Evaluation and Research, Food and...

  7. 21 CFR 640.74 - Modification of Source Plasma.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Modification of Source Plasma. 640.74 Section 640...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.74 Modification of Source Plasma. Link to an amendment published at 77 FR 18, Jan. 3, 2012. (a) Upon approval by the...

  8. 21 CFR 640.74 - Modification of Source Plasma.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Modification of Source Plasma. 640.74 Section 640...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.74 Modification of Source Plasma. (a) Upon approval by the Director, Center for Biologics Evaluation and Research, Food and...

  9. 21 CFR 640.74 - Modification of Source Plasma.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Modification of Source Plasma. 640.74 Section 640...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.74 Modification of Source Plasma. (a) Upon approval by the Director, Center for Biologics Evaluation and Research, Food and...

  10. 21 CFR 640.74 - Modification of Source Plasma.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Modification of Source Plasma. 640.74 Section 640...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.74 Modification of Source Plasma. (a) Upon approval by the Director, Center for Biologics Evaluation and Research, Food and...

  11. Prediction of phosphorus loads in an artificially drained lowland catchment using a modified SWAT model

    NASA Astrophysics Data System (ADS)

    Bauwe, Andreas; Eckhardt, Kai-Uwe; Lennartz, Bernd

    2017-04-01

    Eutrophication is still one of the main environmental problems in the Baltic Sea. Currently, agricultural diffuse sources constitute the major portion of phosphorus (P) fluxes to the Baltic Sea and have to be reduced to achieve the HELCOM targets and improve the ecological status. Eco-hydrological models are suitable tools to identify sources of nutrients and possible measures aiming at reducing nutrient loads into surface waters. In this study, the Soil and Water Assessment Tool (SWAT) was applied to the Warnow river basin (3300 km2), the second largest watershed in Germany discharging into the Baltic Sea. The Warnow river basin is located in northeastern Germany and characterized by lowlands with a high proportion of artificially drained areas. The aims of this study were (i) to estimate P loadings for individual flow fractions (point sources, surface runoff, tile flow, groundwater flow), spatially distributed at the sub-basin scale, and, since the official version of SWAT does not allow for the modeling of P in tile drains, (ii) to test two different approaches to simulating P in tile drains by changing the SWAT source code. The SWAT source code was modified so that (i) the soluble P concentration of the groundwater was transferred to the tile water and (ii) the soluble P in the soil was transferred to the tiles. The SWAT model was first calibrated (2002-2011) and validated (1992-2001) for stream flow at 7 headwater catchments at a daily time scale. Based on this, the stream flow at the outlet of the Warnow river basin was simulated. Performance statistics indicated at least satisfactory model results for each sub-basin. Breaking down the discharge into flow constituents, it becomes apparent that stream flow is mainly governed by groundwater and tile flow. Due to the topographic situation with gentle slopes, surface runoff played only a minor role. Results further indicate that the prediction of soluble P loads was improved by the modified SWAT versions. Major sources of P in rivers are groundwater and tile flow. P was also released by surface runoff during large storm events when sediment was eroded into the rivers. The contributions of point sources in terms of waste water treatment plants to the overall P loading were low. The modifications made to the SWAT source code should be considered a starting point to simulate P loads in artificially drained landscapes more precisely. Further testing and development of the code is required.
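    The two source-code changes described above can be reduced to a single illustrative time step. The functions, variable names, and numbers below are assumptions for illustration only (SWAT's actual FORTRAN variables and routing are more involved): approach (i) carries the groundwater soluble-P concentration into tile water, while approach (ii) exports soluble P from the soil pool in proportion to tile flow.

    ```python
    # Sketch of the two tile-drain P routing approaches, one time step.
    # Units: flow in mm over 1 ha; 1 mm * 1 mg/L over 1 ha = 0.01 kg/ha.

    def tile_p_from_groundwater(tile_flow_mm, gw_solp_mg_per_l):
        # approach (i): tile water carries the groundwater soluble-P concentration
        return tile_flow_mm * gw_solp_mg_per_l * 0.01   # kg/ha

    def tile_p_from_soil(tile_flow_mm, soil_solp_kg_ha, soil_water_mm):
        # approach (ii): mobile soluble P in the soil layer leaves with tile flow,
        # in proportion to the fraction of soil water routed to the tiles
        return soil_solp_kg_ha * min(1.0, tile_flow_mm / soil_water_mm)

    load_i = tile_p_from_groundwater(tile_flow_mm=5.0, gw_solp_mg_per_l=0.1)
    load_ii = tile_p_from_soil(tile_flow_mm=5.0, soil_solp_kg_ha=0.4, soil_water_mm=50.0)
    assert abs(load_i - 0.005) < 1e-12   # 5 mm at 0.1 mg/L -> 0.005 kg/ha
    assert abs(load_ii - 0.04) < 1e-12   # 10% of a 0.4 kg/ha soluble pool
    ```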

  12. CESAR: A Code for Nuclear Fuel and Waste Characterisation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vidal, J.M.; Grouiller, J.P.; Launay, A.

    2006-07-01

    CESAR (Simplified Evolution Code Applied to Reprocessing) is a depletion code developed through a joint program between CEA and COGEMA. In the late 1980s, the first use of this code dealt with nuclear measurement at the laboratories of the La Hague reprocessing plant. The use of CESAR was then extended to the characterisation of all entrance materials and, via tracers, of all produced waste. The code can distinguish more than 100 heavy nuclides, 200 fission products and 100 activation products, and it can characterise both the fuel and the structural material of the fuel. CESAR can also make depletion calculations from 3 months to 1 million years of cooling time. Between 2003 and 2005, the fifth version of the code was developed. The modifications related to the harmonisation of the code's nuclear data with the JEF2.2 nuclear data file. This paper describes the code and explains its extensive use at the La Hague reprocessing plant and in prospective studies. The second part focuses on the modifications of the latest version, and describes the application field and the qualification of the code. Many companies and the IAEA use CESAR today. CESAR offers a very user-friendly Graphical User Interface. (authors)

  13. Epigenetics of Peripheral B-Cell Differentiation and the Antibody Response

    PubMed Central

    Zan, Hong; Casali, Paolo

    2015-01-01

    Epigenetic modifications, such as histone post-translational modifications, DNA methylation, and alteration of gene expression by non-coding RNAs, including microRNAs (miRNAs) and long non-coding RNAs (lncRNAs), are heritable changes that are independent of the genomic DNA sequence. These regulate gene activities and, therefore, cellular functions. Epigenetic modifications act in concert with transcription factors and play critical roles in B cell development and differentiation, thereby modulating antibody responses to foreign- and self-antigens. Upon antigen encounter by mature B cells in the periphery, alterations of these lymphocytes' epigenetic landscape are induced by the same stimuli that drive the antibody response. Such alterations instruct B cells to undergo immunoglobulin (Ig) class switch DNA recombination (CSR) and somatic hypermutation (SHM), as well as differentiation to memory B cells or long-lived plasma cells for immune memory. Inducible histone modifications, together with DNA methylation and miRNAs, modulate the transcriptome, particularly the expression of activation-induced cytidine deaminase, which is essential for CSR and SHM, and factors central to plasma cell differentiation, such as B lymphocyte-induced maturation protein-1. These inducible B cell-intrinsic epigenetic marks guide the maturation of antibody responses. Combinatorial histone modifications also function as histone codes to target CSR and, possibly, SHM machinery to the Ig loci by recruiting specific adaptors that can stabilize CSR/SHM factors. In addition, lncRNAs, such as the recently reported lncRNA-CSR and an lncRNA generated through transcription of the S region that forms G-quadruplex structures, are also important for CSR targeting. 
Epigenetic dysregulation in B cells, including the aberrant expression of non-coding RNAs and alterations of histone modifications and DNA methylation, can result in aberrant antibody responses to foreign antigens, such as those on microbial pathogens, and generation of pathogenic autoantibodies, IgE in allergic reactions, as well as B cell neoplasia. Epigenetic marks would be attractive targets for new therapeutics for autoimmune and allergic diseases, and B cell malignancies. PMID:26697022

  14. 78 FR 78705 - Airworthiness Directives; Airbus Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ..., and equipment to perform this type of modification, repair, and access. UAL also stated that certain... Association (ATA) of America Code 25, Equipment/ Furnishings; and Code 53, Fuselage. (e) Reason This AD was...

  15. Experiences with Cray multi-tasking

    NASA Technical Reports Server (NTRS)

    Miya, E. N.

    1985-01-01

    The issues involved in modifying an existing code for multitasking are explored. They include Cray extensions to FORTRAN, an examination of the application code under study, the design of workable modifications, specific code modifications to the VAX and Cray versions, and performance and efficiency results. The finished product is a faster, fully synchronous, parallel version of the original program. A production program was partitioned by hand to run on two CPUs. Loop splitting multitasks three key subroutines. Simply dividing a subroutine's data and control structure down the middle is not safe: simple division produces results that are inconsistent with uniprocessor runs. The safest way to partition the code is to transfer one block of loops at a time and check the results of each on a test case. Other issues include debugging and performance: task startup and maintenance (e.g., synchronization) are potentially expensive.
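    The loop-splitting pattern described above, including the practice of checking each partitioned loop against a uniprocessor run, can be sketched as follows. This is illustrative Python with two threads; the original work used Cray FORTRAN multitasking primitives, and only loops whose iterations are independent are safe to split this way.

    ```python
    # Loop-splitting sketch: split an independent-iteration loop across two
    # tasks, then verify the result against the serial (uniprocessor) run.
    import threading

    N = 1000
    a = [0.0] * N

    def serial():
        """Reference uniprocessor computation of the same loop."""
        return [i * 0.5 + 2.0 for i in range(N)]

    def worker(lo, hi):
        for i in range(lo, hi):    # iterations are independent: safe to split
            a[i] = i * 0.5 + 2.0

    # Split the index range down the middle, one half per task.
    t1 = threading.Thread(target=worker, args=(0, N // 2))
    t2 = threading.Thread(target=worker, args=(N // 2, N))
    t1.start(); t2.start()
    t1.join(); t2.join()           # synchronize before using the results

    assert a == serial()           # matches the uniprocessor run
    ```

    The unsafe case the abstract warns about arises when the halves share updated state (e.g., a recurrence or an accumulator), so that naively dividing the loop changes the answer relative to the serial run.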

  16. Pediatric severe sepsis in U.S. children's hospitals.

    PubMed

    Balamuth, Fran; Weiss, Scott L; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Hayes, Katie; Gaieski, David; Hall, Matt; Shah, Samir S; Alpern, Elizabeth R

    2014-11-01

    To compare the prevalence, resource utilization, and mortality for pediatric severe sepsis identified using two established identification strategies. Observational cohort study from 2004 to 2012. Forty-four pediatric hospitals contributing data to the Pediatric Health Information Systems database. Children 18 years old or younger. We identified patients with severe sepsis or septic shock by using two International Classification of Diseases, 9th edition, Clinical Modification-based coding strategies: 1) combinations of International Classification of Diseases, 9th edition, Clinical Modification codes for infection plus organ dysfunction (combination code cohort); 2) International Classification of Diseases, 9th edition, Clinical Modification codes for severe sepsis and septic shock (sepsis code cohort). Outcomes included prevalence of severe sepsis, as well as hospital and ICU length of stay, and mortality. Outcomes were compared between the two cohorts examining aggregate differences over the study period and trends over time. The combination code cohort identified 176,124 hospitalizations (3.1% of all hospitalizations), whereas the sepsis code cohort identified 25,236 hospitalizations (0.45%), a seven-fold difference. Between 2004 and 2012, the prevalence of sepsis increased from 3.7% to 4.4% using the combination code cohort and from 0.4% to 0.7% using the sepsis code cohort (p < 0.001 for trend in each cohort). Length of stay (hospital and ICU) and costs decreased in both cohorts over the study period (p < 0.001). Overall, hospital mortality was higher in the sepsis code cohort than the combination code cohort (21.2% [95% CI, 20.7-21.8] vs 8.2% [95% CI, 8.0-8.3]). Over the 9-year study period, there was an absolute reduction in mortality of 10.9% (p < 0.001) in the sepsis code cohort and 3.8% (p < 0.001) in the combination code cohort. Prevalence of pediatric severe sepsis increased in the studied U.S. 
children's hospitals over the past 9 years, whereas resource utilization and mortality decreased. Epidemiologic estimates of pediatric severe sepsis varied up to seven-fold depending on the strategy used for case ascertainment.
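    The two case-ascertainment strategies compared above amount to two predicates over a hospitalization's discharge codes. The sketch below shows the logic on toy records; the ICD-9-CM code sets are small placeholders for illustration, not the study's actual code lists.

    ```python
    # Two ICD-9-CM-based severe-sepsis ascertainment strategies on toy
    # discharge records.  Code sets are illustrative placeholders only.
    INFECTION = {"038.9"}                  # e.g. septicemia, unspecified
    ORGAN_DYSFUNCTION = {"518.81"}         # e.g. acute respiratory failure
    SEVERE_SEPSIS = {"995.92", "785.52"}   # explicit severe sepsis / septic shock

    def combination_cohort(codes):
        """Strategy 1: any infection code AND any organ-dysfunction code."""
        s = set(codes)
        return bool(s & INFECTION) and bool(s & ORGAN_DYSFUNCTION)

    def sepsis_code_cohort(codes):
        """Strategy 2: an explicit severe sepsis / septic shock code."""
        return bool(set(codes) & SEVERE_SEPSIS)

    records = [
        ["038.9", "518.81"],   # infection + organ dysfunction, no explicit code
        ["995.92"],            # explicit severe sepsis code only
        ["401.9"],             # neither (hypertension)
    ]
    assert [combination_cohort(r) for r in records] == [True, False, False]
    assert [sepsis_code_cohort(r) for r in records] == [False, True, False]
    ```

    The first record illustrates why the combination strategy casts a wider net: it captures admissions that clinically resemble severe sepsis even when no explicit sepsis code was assigned, which is consistent with the seven-fold prevalence difference reported above.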

  17. 75 FR 71487 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ... Discontinuance or Modification of a Railroad Signal System Pursuant to Title 49 Code of Federal Regulations (CFR... Administration (FRA) seeking approval for the discontinuance or modification of a signal system, as detailed below. Docket Number FRA-2010-0159 Applicant: BNSF Railway, Mr. James LeVere, AVP Signals, BNSF Railway...

  18. 76 FR 4416 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... Discontinuance or Modification of a Railroad Signal System Pursuant to Title 49 Code of Federal Regulations (CFR... Administration (FRA) seeking approval for the discontinuance or modification of the signal system. [Docket Number...--Signal/Comm./TCO, 1400 Douglas Street, Mail Stop 0910, Omaha, Nebraska 68179. The Union Pacific Railroad...

  19. 75 FR 29805 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-27

    ... Discontinuance or Modification of a Railroad Signal System or Relief Pursuant to title 49 Code of Federal... Administration (FRA) seeking approval for the discontinuance or modification of the signal system, or relief as... Trump, AVP Engineering--Signal/Comm/TCO, 1400 Douglas Street, STOP 0910, Omaha, Nebraska 68179. The...

  20. 75 FR 47880 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-09

    ... Discontinuance or Modification of a Railroad Signal System or Relief Pursuant to Title 49 Code of Federal... Administration (FRA) seeking approval for the discontinuance or modification of the signal system or relief from... & Louisville Railway, Inc., Mr. C. D. Edwards, General Supervisor of Signals & Structures, 1500 Kentucky Avenue...

  1. 75 FR 43610 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Discontinuance or Modification of a Railroad Signal System Pursuant to Title 49 Code of Federal Regulations (CFR... Administration (FRA), seeking approval for the discontinuance or modification of the signal system or relief from... Transportation, Inc., Mr. Joseph S. Ivanyo, Chief Engineer, Communications and Signals, 500 Water Street, SC J...

  2. 76 FR 21943 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-19

    ... Discontinuance or Modification of a Railroad Signal System Pursuant to Title 49 Code of Federal Regulations (CFR... Administration (FRA) seeking approval for the discontinuance or modification of the signal system, as detailed... block signal system (ABS) on three sections of the Roseburg Subdivision and on one section of the...

  3. 75 FR 21717 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-26

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49... signals shall be provided, relative to CN's EJ&E Griffith Connection project involving the Matteson...

  4. Study of the modifications needed for effective operation NASTRAN on IBM virtual storage computers

    NASA Technical Reports Server (NTRS)

    Mccormick, C. W.; Render, K. H.

    1975-01-01

    The necessary modifications were determined to make NASTRAN operational under virtual storage operating systems (VS1 and VS2). Suggested changes are presented which will make NASTRAN operate more efficiently under these systems. Estimates of the cost and time involved in design, coding, and implementation of all suggested modifications are included.

  5. Indoor Fast Neutron Generator for Biophysical and Electronic Applications

    NASA Astrophysics Data System (ADS)

    Cannuli, A.; Caccamo, M. T.; Marchese, N.; Tomarchio, E. A.; Pace, C.; Magazù, S.

    2018-05-01

    This study focuses on an indoor fast neutron generator for biophysical and electronic applications. More specifically, the findings obtained from several simulations with the MCNP Monte Carlo code, necessary for the realization of a shield for indoor measurements, are presented, together with an evaluation of the neutron spectrum modification caused by the shielding. Fast neutron generators are a valid and interesting available source of neutrons, increasingly employed in a wide range of research fields in science and engineering. The portable pulsed neutron source employed is an MP320 Thermo Scientific neutron generator, able to generate 2.5 MeV neutrons with a neutron yield of 2.0 × 10^6 n/s, a pulse rate of 250 Hz to 20 kHz, and a duty factor varying from 5% to 100%. The neutron generator, based on deuterium-deuterium nuclear fusion reactions, is employed in conjunction with a solid-state photon detector made of n-type high-purity germanium (PINS-GMX by ORTEC), and it is mainly addressed to biophysical and electronic studies. The present study proposes a shield for indoor applications of the MP320 neutron generator, with a particular analysis of neutron transport simulated with the Monte Carlo code, and describes the two main lines of research in which the source will be used.

  6. Long non-coding RNA produced by RNA polymerase V determines boundaries of heterochromatin

    PubMed Central

    Böhmdorfer, Gudrun; Sethuraman, Shriya; Rowley, M Jordan; Krzyszton, Michal; Rothi, M Hafiz; Bouzit, Lilia; Wierzbicki, Andrzej T

    2016-01-01

    RNA-mediated transcriptional gene silencing is a conserved process where small RNAs target transposons and other sequences for repression by establishing chromatin modifications. A central element of this process are long non-coding RNAs (lncRNA), which in Arabidopsis thaliana are produced by a specialized RNA polymerase known as Pol V. Here we show that non-coding transcription by Pol V is controlled by preexisting chromatin modifications located within the transcribed regions. Most Pol V transcripts are associated with AGO4 but are not sliced by AGO4. Pol V-dependent DNA methylation is established on both strands of DNA and is tightly restricted to Pol V-transcribed regions. This indicates that chromatin modifications are established in close proximity to Pol V. Finally, Pol V transcription is preferentially enriched on edges of silenced transposable elements, where Pol V transcribes into TEs. We propose that Pol V may play an important role in the determination of heterochromatin boundaries. DOI: http://dx.doi.org/10.7554/eLife.19092.001 PMID:27779094

  7. Pseudouridine profiling reveals regulated mRNA pseudouridylation in yeast and human cells

    PubMed Central

    Carlile, Thomas M.; Rojas-Duran, Maria F.; Zinshteyn, Boris; Shin, Hakyung; Bartoli, Kristen M.; Gilbert, Wendy V.

    2014-01-01

    Post-transcriptional modification of RNA nucleosides occurs in all living organisms. Pseudouridine, the most abundant modified nucleoside in non-coding RNAs1, enhances the function of transfer RNA and ribosomal RNA by stabilizing RNA structure2-8. mRNAs were not known to contain pseudouridine, but artificial pseudouridylation dramatically affects mRNA function: it changes the genetic code by facilitating non-canonical base pairing in the ribosome decoding center9,10. However, without evidence of naturally occurring mRNA pseudouridylation, its physiological significance was unclear. Here we present a comprehensive analysis of pseudouridylation in yeast and human RNAs using Pseudo-seq, a genome-wide, single-nucleotide-resolution method for pseudouridine identification. Pseudo-seq accurately identifies known modification sites as well as 100 novel sites in non-coding RNAs, and reveals hundreds of pseudouridylated sites in mRNAs. Genetic analysis allowed us to assign most of the new modification sites to one of seven conserved pseudouridine synthases, Pus1-4, 6, 7 and 9. Notably, the majority of pseudouridines in mRNA are regulated in response to environmental signals, such as nutrient deprivation in yeast and serum starvation in human cells. These results suggest a mechanism for the rapid and regulated rewiring of the genetic code through inducible mRNA modifications. Our findings reveal unanticipated roles for pseudouridylation and provide a resource for identifying the targets of pseudouridine synthases implicated in human disease11-13. PMID:25192136

  8. Reformation of Regulatory Technical Standards for Nuclear Power Generation Equipments in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikio Kurihara; Masahiro Aoki; Yu Maruyama

    2006-07-01

    Comprehensive reformation of the regulatory system has been introduced in Japan in order to apply recent technical progress in a timely manner. 'The Technical Standards for Nuclear Power Generation Equipments', known as the Ordinance No.622) of the Ministry of International Trade and Industry, which is used for detailed design, construction and operating stage of Nuclear Power Plants, was being modified to performance specifications with the consensus codes and standards being used as prescriptive specifications, in order to facilitate prompt review of the Ordinance with response to technological innovation. The activities on modification were performed by the Nuclear and Industrial Safetymore » Agency (NISA), the regulatory body in Japan, with support of the Japan Nuclear Energy Safety Organization (JNES), a technical support organization. The revised Ordinance No.62 was issued on July 1, 2005 and is enforced from January 1 2006. During the period from the issuance to the enforcement, JNES carried out to prepare enforceable regulatory guide which complies with each provisions of the Ordinance No.62, and also made technical assessment to endorse the applicability of consensus codes and standards, in response to NISA's request. Some consensus codes and standards were re-assessed since they were already used in regulatory review of the construction plan submitted by licensee. Other consensus codes and standards were newly assessed for endorsement. In case that proper consensus code or standards were not prepared, details of regulatory requirements were described in the regulatory guide as immediate measures. At the same time, appropriate standards developing bodies were requested to prepare those consensus code or standards. Supplementary note which provides background information on the modification, applicable examples etc. was prepared for convenience to the users of the Ordinance No. 62. 
This paper describes the modification activities and their results, following NISA's presentation at ICONE-13, which introduced the framework of the performance specifications and the modification process of Ordinance No. 62. (authors)

  9. Optical Surface Analysis Code (OSAC) 7.0

    NASA Technical Reports Server (NTRS)

    Glenn, P.

    1998-01-01

    The purpose of this modification to the Optical Surface Analysis Code (OSAC) is to upgrade the PSF program to allow the user to get proper diffracted energy normalization even when deliberately obscuring rays with internal obscurations.

  10. Modified NASA-Lewis chemical equilibrium code for MHD applications

    NASA Technical Reports Server (NTRS)

    Sacks, R. A.; Geyer, H. K.; Grammel, S. J.; Doss, E. D.

    1979-01-01

    A substantially modified version of the NASA-Lewis Chemical Equilibrium Code was recently developed. The modifications were designed to extend the power and convenience of the Code as a tool for performing combustor analysis for MHD systems studies. The effect of the programming details is described from a user point of view.

  11. Identification of ICD Codes Suggestive of Child Maltreatment

    ERIC Educational Resources Information Center

    Schnitzer, Patricia G.; Slusher, Paula L.; Kruse, Robin L.; Tarleton, Molly M.

    2011-01-01

    Objective: In order to be reimbursed for the care they provide, hospitals in the United States are required to use a standard system to code all discharge diagnoses: the International Classification of Disease, 9th Revision, Clinical Modification (ICD-9). Although ICD-9 codes specific for child maltreatment exist, they do not identify all…

  12. Smoothed Particle Hydrodynamic Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-10-05

    This code is a highly modular framework for developing smoothed particle hydrodynamic (SPH) simulations running on parallel platforms. The compartmentalization of the code allows for rapid development of new SPH applications and modifications of existing algorithms. The compartmentalization also allows changes in one part of the code used by many applications to instantly be made available to all applications.
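The modular design the abstract describes can be illustrated with a minimal sketch (all names hypothetical, not taken from the actual simulator): SPH applications share pluggable components such as a smoothing kernel, so a change to one shared component is immediately available to every application built on the framework.

```python
# Hypothetical sketch of a compartmentalized SPH framework: the kernel is a
# shared, swappable component used by every density/force routine.
class CubicSplineKernel:
    """Standard 1D cubic-spline SPH smoothing kernel (a shared component)."""
    def __init__(self, h):
        self.h = h                      # smoothing length
        self.sigma = 2.0 / (3.0 * h)    # 1D normalization constant

    def w(self, r):
        q = abs(r) / self.h
        if q < 1.0:
            return self.sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
        if q < 2.0:
            return self.sigma * 0.25 * (2.0 - q)**3
        return 0.0

def density(x_i, neighbor_positions, masses, kernel):
    """SPH density estimate: rho_i = sum_j m_j * W(x_i - x_j, h)."""
    return sum(m * kernel.w(x_i - x_j)
               for x_j, m in zip(neighbor_positions, masses))

# Any application calling density() automatically picks up a modified kernel.
kernel = CubicSplineKernel(h=1.0)
rho = density(0.0, [-0.5, 0.0, 0.5], [1.0, 1.0, 1.0], kernel)
```

Swapping `CubicSplineKernel` for another kernel class with the same `w` interface changes every application at once, which is the benefit the abstract attributes to compartmentalization.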

  13. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.

  14. 75 FR 54220 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49... Engineer Signals, New Jersey Transit, One Penn Plaza East, Newark, New Jersey 07105-2246. The New Jersey...

  15. 75 FR 26838 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-12

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49..., Mr. William E. Van Trump, AVP Engineering -- Signal/Comm/TCO, 1400 Douglas Street, STOP 0910, Omaha...

  16. 75 FR 5638 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-03

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49... Signal System (ABS) on the entire railroad line between, but not including, the point of ownership at the...

  17. 75 FR 6252 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... Discontinuance or Modification of a Railroad Signal System or Relief From the Requirements of Title 49 Code of... approval for the discontinuance or modification of the signal system or relief from the requirements of 49... the conversion of dispatcher controlled holdout signals, 96L and 96R, to automatic signals, 8221 and...

  18. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    PubMed

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.

  19. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are Swift, a code for three-dimensional (3D) multiblock analysis, and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  20. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now holds over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  1. Three-dimensional water droplet trajectory code validation using an ECS inlet geometry

    NASA Technical Reports Server (NTRS)

    Breer, Marlin D.; Goodman, Mark P.

    1993-01-01

    A task was completed under NASA contract, the purpose of which was to validate a three-dimensional particle trajectory code with existing test data obtained from the Icing Research Tunnel at NASA-LeRC. The geometry analyzed was a flush-mounted environmental control system (ECS) inlet. Results of the study indicated good overall agreement between analytical predictions and wind tunnel test results at most flight conditions. Difficulties were encountered when predicting impingement characteristics of the droplets less than or equal to 13.5 microns in diameter. This difficulty was corrected to some degree by modifications to a module of the particle trajectory code; however, additional modifications will be required to accurately predict impingement characteristics of smaller droplets.

  2. Assessment of the prevailing physics codes: LEOPARD, LASER, and EPRI-CELL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lan, J.S.

    1981-01-01

    In order to analyze core performance and fuel management, it is necessary to verify reactor physics codes in great detail. This kind of work not only serves the purpose of understanding and controlling the characteristics of each code, but also ensures the reliability as codes continually change due to constant modifications and machine transfers. This paper will present the results of a comprehensive verification of three code packages - LEOPARD, LASER, and EPRI-CELL.

  3. Artificial Intelligence, DNA Mimicry, and Human Health.

    PubMed

    Stefano, George B; Kream, Richard M

    2017-08-14

    The molecular evolution of genomic DNA across diverse plant and animal phyla involved dynamic registrations of sequence modifications to maintain existential homeostasis to increasingly complex patterns of environmental stressors. As an essential corollary, driver effects of positive evolutionary pressure are hypothesized to effect concerted modifications of genomic DNA sequences to meet expanded platforms of regulatory controls for successful implementation of advanced physiological requirements. It is also clearly apparent that preservation of updated registries of advantageous modifications of genomic DNA sequences requires coordinate expansion of convergent cellular proofreading/error correction mechanisms that are encoded by reciprocally modified genomic DNA. Computational expansion of operationally defined DNA memory extends to coordinate modification of coding and previously under-emphasized noncoding regions that now appear to represent essential reservoirs of untapped genetic information amenable to evolutionary driven recruitment into the realm of biologically active domains. Additionally, expansion of DNA memory potential via chemical modification and activation of noncoding sequences is targeted to vertical augmentation and integration of an expanded cadre of transcriptional and epigenetic regulatory factors affecting linear coding of protein amino acid sequences within open reading frames.

  4. 40 CFR 52.233 - Review of new sources and modifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 3 2013-07-01 2013-07-01 false Review of new sources and modifications. 52.233 Section 52.233 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.233 Review of new sources and modifications. (a) The following...

  5. 40 CFR 52.233 - Review of new sources and modifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 3 2014-07-01 2014-07-01 false Review of new sources and modifications. 52.233 Section 52.233 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.233 Review of new sources and modifications. (a) The following...

  6. 40 CFR 52.233 - Review of new sources and modifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 3 2012-07-01 2012-07-01 false Review of new sources and modifications. 52.233 Section 52.233 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.233 Review of new sources and modifications. (a) The following...

  7. Identifying Pediatric Severe Sepsis and Septic Shock: Accuracy of Diagnosis Codes.

    PubMed

    Balamuth, Fran; Weiss, Scott L; Hall, Matt; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Centkowski, Sierra; Baumer-Mouradian, Shannon; Weiser, Jason; Hayes, Katie; Shah, Samir S; Alpern, Elizabeth R

    2015-12-01

    To evaluate accuracy of 2 established administrative methods of identifying children with sepsis using a medical record review reference standard. Multicenter retrospective study at 6 US children's hospitals. Subjects were children >60 days to <19 years of age and identified in 4 groups based on International Classification of Diseases, Ninth Revision, Clinical Modification codes: (1) severe sepsis/septic shock (sepsis codes); (2) infection plus organ dysfunction (combination codes); (3) subjects without codes for infection, organ dysfunction, or severe sepsis; and (4) infection but not severe sepsis or organ dysfunction. Combination codes were allowed, but not required within the sepsis codes group. We determined the presence of reference standard severe sepsis according to consensus criteria. Logistic regression was performed to determine whether addition of codes for sepsis therapies improved case identification. A total of 130 out of 432 subjects met the reference standard for severe sepsis. Sepsis codes had sensitivity 73% (95% CI 70-86), specificity 92% (95% CI 87-95), and positive predictive value 79% (95% CI 70-86). Combination codes had sensitivity 15% (95% CI 9-22), specificity 71% (95% CI 65-76), and positive predictive value 18% (95% CI 11-27). Slight improvements in model characteristics were observed when codes for vasoactive medications and endotracheal intubation were added to sepsis codes (c-statistic 0.83 vs 0.87, P = .008). Sepsis-specific International Classification of Diseases, Ninth Revision, Clinical Modification codes identify pediatric patients with severe sepsis in administrative data more accurately than a combination of codes for infection plus organ dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
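The accuracy measures reported above follow directly from a 2x2 confusion matrix. A minimal sketch, with hypothetical cell counts chosen only so the margins match the study's 130 reference-positive subjects among 432 (the study's actual counts are not given here):

```python
# Sensitivity, specificity, and positive predictive value from a 2x2 table.
# tp/fp/fn/tn counts below are illustrative, not taken from the study.
def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positives / all reference-positive
        "specificity": tn / (tn + fp),   # true negatives / all reference-negative
        "ppv": tp / (tp + fp),           # true positives / all code-positive
    }

# 95 + 35 = 130 reference-positive; 95 + 25 + 35 + 277 = 432 subjects total.
m = diagnostics(tp=95, fp=25, fn=35, tn=277)
```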

  8. Space Station Furnace Facility Management Information System (SSFF-MIS) Development

    NASA Technical Reports Server (NTRS)

    Meade, Robert M.

    1996-01-01

    This report summarizes the chronology, results, and lessons learned from the development of the SSFF-MIS. This system has been nearly two years in development and has yielded some valuable insights into specialized MIS development. General: In December of 1994, the Camber Corporation and Science Applications International Corporation (SAIC) were contracted to design, develop, and implement a MIS for Marshall Space Flight Center's Space Station Furnace Facility Project. The system was to be accessible from both IBM-compatible PC and Macintosh platforms. The system was required to contain data manually entered into the MIS as well as data imported from other MSFC sources. Electronic interfaces were established for each data source and retrieval was to be performed at prescribed time intervals. The SOW requirement that predominantly drove the development software selection was the dual-platform (IBM-PC and Macintosh) requirement. The requirement that the system would be maintained by Government personnel influenced the selection of commercial off-the-shelf software because of its inherent stability and readily available documentation and support. Microsoft FoxPro Professional 2.6 for Windows and Macintosh was selected as the development tool. This is a software development tool that has been in use for many years. It is stable and powerful. Microsoft has since released the replacement for this product, Microsoft Visual FoxPro, but at the time of this development, it was only available on the Windows platform. The initial contract included the requirement for capabilities relating to the Work- and Organizational Breakdown Structures, cost (plan and actuals), workforce (plan and actuals), critical path scheduling, trend analysis, procurements and contracts, interface to manufacturing, Safety and Mission Assurance, risk analysis, and technical performance indicators. It also required full documentation of the system and training of users. 
During the course of the contract, the requirements for Safety and Mission Assurance interface, risk analysis, and technical performance indicators were deleted. Additional capabilities were added as reflected in the Contract Chronology below. Modification 4 added the requirement for Support Contractor manpower data, the ability to manually input data not imported from other sources, a general 'health' indicator screen, and remote usage. Modification 6 included the ability to change the level of planning of Civil Service Manpower at any time and the ability to manually enter Op Codes in the manufacturing data where such codes were not provided by the EMPACS database. Modification 9 included a number of changes to report contents and formats. Modification 11 required the preparation of a detailed System Design Document.

  9. 78 FR 58460 - Revision of Air Quality Implementation Plan; California; Placer County Air Pollution Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ...EPA is finalizing a limited approval and limited disapproval of two permitting rules submitted by California as a revision to the Placer County Air Pollution Control District (PCAPCD) and Feather River Air Quality Management District (FRAQMD) portion of the California State Implementation Plan (SIP). These revisions were proposed in the Federal Register on February 22, 2013 and concern construction and modification of stationary sources of air pollution within each District. We are approving local rules that regulate these emission sources under the Clean Air Act as amended in 1990 (CAA). Final approval of these rules makes the rules federally enforceable and corrects program deficiencies identified in a previous EPA rulemaking (76 FR 44809, July 27, 2011). EPA is also making a technical amendment to the Code of Federal Regulations (CFR) to reflect this previous rulemaking, which removed an obsolete provision from the California SIP.

  10. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  11. Modification of the fault logic circuit of a high-energy linear accelerator to accommodate selectively coded, large-field wedges.

    PubMed

    Miller, R W; van de Geijn, J

    1987-01-01

    A modification to the fault logic circuit that controls the collimator (COLL) fault is described. This modification permits the use of large-field wedges by adding an additional input into the reference voltage that determines the fault condition. The resistor controlling the amount of additional voltage is carried on board each wedge, within the wedge plug. This allows each wedge to determine its own, individual field size limit. Additionally, if no coding resistor is provided, the factory-supplied reference voltage is used, which sets the maximum allowable field size to 15 cm. This permits the use of factory-supplied wedges in conjunction with selected, large-field wedges, allowing proper sensing of the field size maximum in all conditions.

  12. Dnmt2 mediates intergenerational transmission of paternally acquired metabolic disorders through sperm small non-coding RNAs.

    PubMed

    Zhang, Yunfang; Zhang, Xudong; Shi, Junchao; Tuorto, Francesca; Li, Xin; Liu, Yusheng; Liebers, Reinhard; Zhang, Liwen; Qu, Yongcun; Qian, Jingjing; Pahima, Maya; Liu, Ying; Yan, Menghong; Cao, Zhonghong; Lei, Xiaohua; Cao, Yujing; Peng, Hongying; Liu, Shichao; Wang, Yue; Zheng, Huili; Woolsey, Rebekah; Quilici, David; Zhai, Qiwei; Li, Lei; Zhou, Tong; Yan, Wei; Lyko, Frank; Zhang, Ying; Zhou, Qi; Duan, Enkui; Chen, Qi

    2018-05-01

    The discovery of RNAs (for example, messenger RNAs, non-coding RNAs) in sperm has opened the possibility that sperm may function by delivering additional paternal information aside from solely providing the DNA [1]. Increasing evidence now suggests that sperm small non-coding RNAs (sncRNAs) can mediate intergenerational transmission of paternally acquired phenotypes, including mental stress [2,3] and metabolic disorders [4-6]. How sperm sncRNAs encode paternal information remains unclear, but the mechanism may involve RNA modifications. Here we show that deletion of a mouse tRNA methyltransferase, DNMT2, abolished sperm sncRNA-mediated transmission of high-fat-diet-induced metabolic disorders to offspring. Dnmt2 deletion prevented the elevation of RNA modifications (m5C, m2G) in sperm 30-40 nt RNA fractions that are induced by a high-fat diet. Also, Dnmt2 deletion altered the sperm small RNA expression profile, including levels of tRNA-derived small RNAs and rRNA-derived small RNAs, which might be essential in composing a sperm RNA 'coding signature' that is needed for paternal epigenetic memory. Finally, we show that Dnmt2-mediated m5C contributes to the secondary structure and biological properties of sncRNAs, implicating sperm RNA modifications as an additional layer of paternal hereditary information.

  13. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  14. Bounded-Angle Iterative Decoding of LDPC Codes

    NASA Technical Reports Server (NTRS)

    Dolinar, Samuel; Andrews, Kenneth; Pollara, Fabrizio; Divsalar, Dariush

    2009-01-01

    Bounded-angle iterative decoding is a modified version of conventional iterative decoding, conceived as a means of reducing undetected-error rates for short low-density parity-check (LDPC) codes. For a given code, bounded-angle iterative decoding can be implemented by means of a simple modification of the decoder algorithm, without redesigning the code. Bounded-angle iterative decoding is based on a representation of received words and code words as vectors in an n-dimensional Euclidean space (where n is an integer).
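The geometric idea in this abstract, treating received words and code words as vectors in n-dimensional Euclidean space, can be sketched in a few lines. This is an illustration of the angle test only, under an assumed BPSK bit-to-vector mapping; the acceptance bound and function names are hypothetical, not the authors' exact scheme:

```python
# Sketch: accept a decoded codeword only if the angle between the received
# vector and the codeword vector (bits mapped to +/-1) is within a bound;
# otherwise flag a detected decoding failure instead of an undetected error.
import math

def angle(u, v):
    """Angle in radians between two vectors in R^n."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def bounded_angle_accept(received, codeword_bits, max_angle):
    """Map bits {0,1} to {+1,-1} (BPSK) and test the angle bound."""
    bpsk = [1.0 - 2.0 * b for b in codeword_bits]
    return angle(received, bpsk) <= max_angle

r = [0.9, -1.1, 0.8, -0.7]   # noisy observation of the codeword 0101
print(bounded_angle_accept(r, [0, 1, 0, 1], math.radians(30)))  # True (~10 deg)
print(bounded_angle_accept(r, [1, 0, 1, 0], math.radians(30)))  # False (~170 deg)
```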

  15. Joint source-channel coding for motion-compensated DCT-based SNR scalable video.

    PubMed

    Kondi, Lisimachos P; Ishtiaq, Faisal; Katsaggelos, Aggelos K

    2002-01-01

    In this paper, we develop an approach toward joint source-channel coding for motion-compensated DCT-based scalable video coding and transmission. A framework for the optimal selection of the source and channel coding rates over all scalable layers is presented such that the overall distortion is minimized. The algorithm utilizes universal rate distortion characteristics which are obtained experimentally and show the sensitivity of the source encoder and decoder to channel errors. The proposed algorithm allocates the available bit rate between scalable layers and, within each layer, between source and channel coding. We present the results of this rate allocation algorithm for video transmission over a wireless channel using the H.263 Version 2 signal-to-noise ratio (SNR) scalable codec for source coding and rate-compatible punctured convolutional (RCPC) codes for channel coding. We discuss the performance of the algorithm with respect to the channel conditions, coding methodologies, layer rates, and number of layers.
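The allocation step described above can be illustrated with a toy sketch: given experimentally measured distortion values for candidate (source rate, channel rate) pairs in each scalable layer, search for the combination that minimizes total distortion within a bit-rate budget. The rate/distortion numbers below are invented for illustration and the exhaustive search stands in for whatever optimization the paper actually uses:

```python
# Toy joint source-channel rate allocation over scalable layers.
import itertools

# (source_kbps, channel_kbps) -> distortion, per layer (hypothetical tables
# standing in for the experimentally obtained rate-distortion characteristics).
layer_tables = [
    {(64, 16): 10.0, (48, 32): 12.0, (64, 32): 8.5},   # base layer
    {(32, 8): 4.0, (24, 16): 5.0, (32, 16): 3.2},      # enhancement layer
]

def best_allocation(tables, budget):
    """Exhaustive search over per-layer (source, channel) rate pairs."""
    best = None
    for combo in itertools.product(*[t.items() for t in tables]):
        total_rate = sum(s + c for (s, c), _ in combo)
        if total_rate > budget:
            continue                      # violates the total bit-rate budget
        dist = sum(d for _, d in combo)   # assume additive layer distortion
        if best is None or dist < best[1]:
            best = ([rates for rates, _ in combo], dist)
    return best

alloc, distortion = best_allocation(layer_tables, budget=128)
# The cheapest-distortion feasible split spends the enhancement layer's extra
# bits on channel protection rather than exceeding the budget.
```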

  16. Novel base-pairing interactions at the tRNA wobble position crucial for accurate reading of the genetic code

    PubMed Central

    Rozov, Alexey; Demeshkina, Natalia; Khusainov, Iskander; Westhof, Eric; Yusupov, Marat; Yusupova, Gulnara

    2016-01-01

    Posttranscriptional modifications at the wobble position of transfer RNAs play a substantial role in deciphering the degenerate genetic code on the ribosome. The number and variety of modifications suggest different mechanisms of action during messenger RNA decoding, of which only a few were described so far. Here, on the basis of several 70S ribosome complex X-ray structures, we demonstrate how Escherichia coli tRNA(Lys)(UUU) with hypermodified 5-methylaminomethyl-2-thiouridine (mnm5s2U) at the wobble position discriminates between cognate codons AAA and AAG, and near-cognate stop codon UAA or isoleucine codon AUA, with which it forms pyrimidine–pyrimidine mismatches. We show that mnm5s2U forms an unusual pair with guanosine at the wobble position that expands general knowledge on the degeneracy of the genetic code and specifies a powerful role of tRNA modifications in translation. Our models consolidate the translational fidelity mechanism proposed previously where the steric complementarity and shape acceptance dominate the decoding mechanism. PMID:26791911

  17. Novel base-pairing interactions at the tRNA wobble position crucial for accurate reading of the genetic code.

    PubMed

    Rozov, Alexey; Demeshkina, Natalia; Khusainov, Iskander; Westhof, Eric; Yusupov, Marat; Yusupova, Gulnara

    2016-01-21

    Posttranscriptional modifications at the wobble position of transfer RNAs play a substantial role in deciphering the degenerate genetic code on the ribosome. The number and variety of modifications suggest different mechanisms of action during messenger RNA decoding, of which only a few were described so far. Here, on the basis of several 70S ribosome complex X-ray structures, we demonstrate how Escherichia coli tRNA(Lys)(UUU) with hypermodified 5-methylaminomethyl-2-thiouridine (mnm(5)s(2)U) at the wobble position discriminates between cognate codons AAA and AAG, and near-cognate stop codon UAA or isoleucine codon AUA, with which it forms pyrimidine-pyrimidine mismatches. We show that mnm(5)s(2)U forms an unusual pair with guanosine at the wobble position that expands general knowledge on the degeneracy of the genetic code and specifies a powerful role of tRNA modifications in translation. Our models consolidate the translational fidelity mechanism proposed previously where the steric complementarity and shape acceptance dominate the decoding mechanism.

  18. Novel base-pairing interactions at the tRNA wobble position crucial for accurate reading of the genetic code

    NASA Astrophysics Data System (ADS)

    Rozov, Alexey; Demeshkina, Natalia; Khusainov, Iskander; Westhof, Eric; Yusupov, Marat; Yusupova, Gulnara

    2016-01-01

    Posttranscriptional modifications at the wobble position of transfer RNAs play a substantial role in deciphering the degenerate genetic code on the ribosome. The number and variety of modifications suggest different mechanisms of action during messenger RNA decoding, of which only a few were described so far. Here, on the basis of several 70S ribosome complex X-ray structures, we demonstrate how Escherichia coli tRNA(Lys)(UUU) with hypermodified 5-methylaminomethyl-2-thiouridine (mnm5s2U) at the wobble position discriminates between cognate codons AAA and AAG, and near-cognate stop codon UAA or isoleucine codon AUA, with which it forms pyrimidine-pyrimidine mismatches. We show that mnm5s2U forms an unusual pair with guanosine at the wobble position that expands general knowledge on the degeneracy of the genetic code and specifies a powerful role of tRNA modifications in translation. Our models consolidate the translational fidelity mechanism proposed previously where the steric complementarity and shape acceptance dominate the decoding mechanism.

  19. Calculation and use of an environment's characteristic software metric set

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    Since both cost/quality and production environments differ, this study presents an approach for customizing a characteristic set of software metrics to an environment. The approach is applied in the Software Engineering Laboratory (SEL), a NASA Goddard production environment, to 49 candidate process and product metrics of 652 modules from six projects (51,000 to 112,000 lines each). For this particular environment, the method yielded the characteristic metric set (source lines, fault correction effort per executable statement, design effort, code effort, number of I/O parameters, number of versions). The uses examined for a characteristic metric set include forecasting the effort for development, modification, and fault correction of modules based on historical data.

  20. Reversible RNA adenosine methylation in biological regulation

    PubMed Central

    Jia, Guifang; Fu, Ye; He, Chuan

    2012-01-01

    N6-methyladenosine (m6A) is a ubiquitous modification in messenger RNA (mRNA) and other RNAs across most eukaryotes. For many years, however, the exact functions of m6A were not clearly understood. The discovery that the fat mass and obesity associated protein (FTO) is an m6A demethylase indicates that this modification is reversible and dynamically regulated, suggesting it has regulatory roles. In addition, it has been shown that m6A affects cell fate decisions in yeast and plant development. Recent affinity-based m6A profiling in mouse and human cells further showed that this modification is a widespread mark in coding and non-coding RNA transcripts and is likely dynamically regulated throughout developmental processes. Therefore, reversible RNA methylation, analogous to reversible DNA and histone modifications, may affect gene expression and cell fate decisions by modulating multiple RNA-related cellular pathways, which potentially provides rapid responses to various cellular and environmental signals, including energy and nutrient availability in mammals. PMID:23218460

  1. The Altered Hepatic Tubulin Code in Alcoholic Liver Disease.

    PubMed

    Groebner, Jennifer L; Tuma, Pamela L

    2015-09-18

    The molecular mechanisms that lead to the progression of alcoholic liver disease have been actively examined for decades. Because the hepatic microtubule cytoskeleton supports innumerable cellular processes, it has been the focus of many such mechanistic studies. It has long been appreciated that α-tubulin is a major target for modification by highly reactive ethanol metabolites and reactive oxygen species. It is also now apparent that alcohol exposure induces post-translational modifications that are part of the natural repertoire, mainly acetylation. In this review, the modifications of the "tubulin code" are described, as well as the adducts formed by ethanol metabolites. The potential cellular consequences of microtubule modification are described with a focus on alcohol-induced defects in protein trafficking and enhanced steatosis. Possible mechanisms that can explain hepatic dysfunction are described, and how this relates to the onset of liver injury is discussed. Finally, we propose that agents that alter the cellular acetylation state may represent a novel therapeutic strategy for treating liver disease.

  2. Dose rate calculations around 192Ir brachytherapy sources using a Sievert integration model

    NASA Astrophysics Data System (ADS)

    Karaiskos, P.; Angelopoulos, A.; Baras, P.; Rozaki-Mavrouli, H.; Sandilos, P.; Vlachos, L.; Sakelliou, L.

    2000-02-01

    The classical Sievert integral method is a valuable tool for dose rate calculations around brachytherapy sources, combining simplicity with reasonable computational times. However, its accuracy in predicting dose rate anisotropy around 192Ir brachytherapy sources has been repeatedly put into question. In this work, we used a primary and scatter separation technique to improve an existing modification of the Sievert integral (Williamson's isotropic scatter model) that determines dose rate anisotropy around commercially available 192Ir brachytherapy sources. The proposed Sievert formalism provides increased accuracy while maintaining the simplicity and computational time efficiency of the Sievert integral method. To describe transmission within the materials encountered, the formalism makes use of narrow-beam attenuation coefficients which can be directly and easily calculated from the initially emitted 192Ir spectrum. The other numerical parameters required for its implementation, once calculated with the aid of our home-made Monte Carlo simulation code, can be used for any 192Ir source design. Calculations of dose rate and anisotropy functions with the proposed Sievert expression, around commonly used 192Ir high dose rate sources and other 192Ir elongated source designs, are in good agreement with corresponding accurate Monte Carlo results which have been reported by our group and other authors.
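
    The classical Sievert integral the authors build on can be sketched numerically as follows. This is the unmodified Sievert method, not the authors' primary/scatter formalism; the geometry is simplified to a point on the transverse bisector of a line source, and all parameter values are illustrative.

```python
import math

def sievert_dose_rate(strength, L, r, t, mu, n=2000):
    """Classical Sievert line-source integral, evaluated at a point on the
    transverse bisector at distance r (cm) from a source of active length
    L (cm). t is the filter thickness (cm) traversed perpendicular to the
    source axis and mu a narrow-beam attenuation coefficient (1/cm).
    """
    theta_max = math.atan((L / 2.0) / r)
    # Trapezoidal quadrature of exp(-mu*t/cos(theta)) over [-theta_max, theta_max].
    total = 0.0
    for i in range(n + 1):
        theta = -theta_max + 2.0 * theta_max * i / n
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * math.exp(-mu * t / math.cos(theta))
    integral = total * (2.0 * theta_max / n)
    return strength * integral / (L * r)

# Illustrative call: with mu = 0 the filter is transparent, so the result
# reduces to the bare line-source geometry factor.
unfiltered = sievert_dose_rate(1.0, 0.35, 1.0, 0.05, 0.0)
filtered = sievert_dose_rate(1.0, 0.35, 1.0, 0.05, 2.0)
```

    The oblique path length t/cos(theta) is what makes the integral capture filtration anisotropy along the source axis.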

  3. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are also attached as appendices to this document.
    Online HTML documentation is available through the GitHub site. There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure, which is also supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding of how to use IMPACT effectively, two multiphysics systems have been developed and are available open source through GitHub. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to combine software packages that are unrelated by either author or architecture into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code, which was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, and licensed for any organization to use as they wish. Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA, and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company.
    These initiatives promise to expand the interest in and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.
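
    The partitioned-coupling pattern this kind of toolkit supports can be illustrated, very schematically, by a two-solver loop that exchanges interface data each step. The classes, field names, and update rules below are invented stand-ins, not IMPACT's API.

```python
# Two toy "uniphysics" solvers exchange interface data each time step, in
# the partitioned-coupling spirit described above. Nothing here reflects
# IMPACT's actual interfaces; it only shows the control flow of coupling.

class FluidSolver:
    def __init__(self):
        self.pressure = 1.0

    def advance(self, dt, wall_position):
        # Toy update: pressure relaxes toward the wall position.
        self.pressure += dt * (wall_position - self.pressure)
        return self.pressure

class StructureSolver:
    def __init__(self):
        self.position = 0.0

    def advance(self, dt, load):
        # Toy update: the wall moves in response to the fluid load.
        self.position += dt * (load - self.position)
        return self.position

def couple(steps=100, dt=0.1):
    fluid, structure = FluidSolver(), StructureSolver()
    for _ in range(steps):
        load = fluid.advance(dt, structure.position)   # fluid sees the wall
        structure.advance(dt, load)                    # structure sees the load
    return fluid.pressure, structure.position

pressure, position = couple()
```

    The coupling infrastructure's job is exactly this orchestration and data exchange, done in parallel and without intruding on either solver's internals.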

  4. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  5. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  6. Clinician's Primer to ICD-10-CM Coding for Cleft Lip/Palate Care.

    PubMed

    Allori, Alexander C; Cragan, Janet D; Della Porta, Gina C; Mulliken, John B; Meara, John G; Bruun, Richard; Shusterman, Stephen; Cassell, Cynthia H; Raynor, Eileen; Santiago, Pedro; Marcus, Jeffrey R

    2017-01-01

    On October 1, 2015, the United States required use of the Clinical Modification of the International Classification of Diseases, 10th Revision (ICD-10-CM) for diagnostic coding. This primer was written to assist the cleft care community with understanding and use of ICD-10-CM for diagnostic coding related to cleft lip and/or palate (CL/P).

  7. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of its new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits, describes the means of and requirements for including codes, and outlines future plans.

  8. Culture and Creativity: World of Warcraft Modding in China and the US

    NASA Astrophysics Data System (ADS)

    Kow, Yong Ming; Nardi, Bonnie

    Modding - end-user modification of commercial hardware and software - can be traced back at least to 1961 when Spacewar! was developed by a group of MIT students on a DEC PDP-1. Spacewar! evolved into arcade games including Space Wars produced in 1977 by Cinematronics (Sotamaa 2003). In 1992, players altering Wolfenstein 3-D (1992), a first person shooter game made by id Software, overwrote the graphics and sounds by editing the game files. Learning from this experience, id Software released Doom in 1993 with isolated media files and open source code for players to develop custom maps, images, sounds, and other utilities. Players were able to pass on their modifications to others. By 1996, with the release of Quake, end-user modifications had come to be known as "mods," and modding was an accepted part of the gaming community (Kucklich 2005; Postigo 2008a, b). Since late-2005, we have been studying World of Warcraft (WoW) in which the use of mods is an important aspect of player practice (Nardi and Harris 2006; Nardi et al. 2007). Technically minded players with an interest in extending the game write mods and make them available to players for free download on distribution sites. Most modders work for free, but the distribution sites are commercial enterprises with advertising.

  9. Extending the maximum operation time of the MNSR reactor.

    PubMed

    Dawahra, S; Khattab, K; Saba, G

    2016-09-01

    An effective modification to extend the maximum operation time of the Miniature Neutron Source Reactor (MNSR), and thereby enhance the utilization of the reactor, has been tested using the MCNP4C code. The modification consists of manually inserting into each of the reactor's inner irradiation tubes a chain of three connected polyethylene containers filled with water. The total height of the chain was 11.5 cm. Replacing the existing cadmium absorber with a B-10 absorber was required as well. The rest of the core structure materials and dimensions remained unchanged. A 3-D neutronic model with the new modifications was developed to compare the neutronic parameters of the old and modified cores. The excess reactivities (ρex) of the old and modified cores were 3.954 and 6.241 mk, the maximum reactor operation times were 428 and 1025 min, and the safety reactivity factors were 1.654 and 1.595, respectively. Thus, a 139% increase in the maximum reactor operation time was observed for the modified core. This increase enhances the utilization of the MNSR reactor for long-duration irradiation of unknown samples using the NAA technique and increases the amount of radioisotope production in the reactor. Copyright © 2016 Elsevier Ltd. All rights reserved.
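
    The quoted 139% figure follows directly from the reported operation times:

```python
# Consistency check of the quoted improvement: relative increase in maximum
# operation time from the old core (428 min) to the modified core (1025 min).
t_old, t_new = 428.0, 1025.0
increase_pct = (t_new - t_old) / t_old * 100.0
print(round(increase_pct))  # 139
```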

  10. HEMCO v1.0: A Versatile, ESMF-Compliant Component for Calculating Emissions in Atmospheric Models

    NASA Technical Reports Server (NTRS)

    Keller, C. A.; Long, M. S.; Yantosca, R. M.; Da Silva, A. M.; Pawson, S.; Jacob, D. J.

    2014-01-01

    We describe the Harvard-NASA Emission Component version 1.0 (HEMCO), a stand-alone software component for computing emissions in global atmospheric models. HEMCO determines emissions from different sources, regions, and species on a user-defined grid and can combine, overlay, and update a set of data inventories and scale factors, as specified by the user through the HEMCO configuration file. New emission inventories at any spatial and temporal resolution are readily added to HEMCO and can be accessed by the user without any preprocessing of the data files or modification of the source code. Emissions that depend on dynamic source types and local environmental variables such as wind speed or surface temperature are calculated in separate HEMCO extensions. HEMCO is fully compliant with the Earth System Modeling Framework (ESMF) environment. It is highly portable and can be deployed in a new model environment with only a few adjustments at the top-level interface. So far, we have implemented HEMCO in the NASA Goddard Earth Observing System (GEOS-5) Earth system model (ESM) and in the GEOS-Chem chemical transport model (CTM). By providing a widely applicable framework for specifying constituent emissions, HEMCO is designed to ease sensitivity studies and model comparisons, as well as inverse modeling in which emissions are adjusted iteratively. The HEMCO code, extensions, and the full set of emissions data files used in GEOS-Chem are available at http://wiki.geos-chem.org/HEMCO.
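
    The combine-and-overlay behaviour described above (gridded base inventories, scale factors, and priority overlays) can be sketched as follows. The arrays, priorities, and the combine helper are illustrative assumptions for the concept only, not HEMCO's actual configuration interface.

```python
import numpy as np

# Toy 3x3 grid: a global base inventory everywhere, and a higher-priority
# regional inventory that covers only the first row (NaN = not covered).
global_inv = np.full((3, 3), 2.0)
regional_inv = np.full((3, 3), np.nan)
regional_inv[0, :] = 5.0

def combine(inventories, scale_factors):
    """Apply each inventory's scale factor, then overlay in priority order
    (later entries win wherever they are defined)."""
    out = np.full(inventories[0].shape, np.nan)
    for inv, s in zip(inventories, scale_factors):
        mask = ~np.isnan(inv)
        out[mask] = inv[mask] * s
    return out

emissions = combine([global_inv, regional_inv], [1.1, 0.5])
```

    In HEMCO the same idea is driven declaratively from the configuration file rather than from code.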

  11. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
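
    One family of methods from this literature compares character n-gram profiles of a disputed program against profiles built from each candidate author's known code. The sketch below is a generic nearest-profile scheme with invented snippets, not any specific method from the review.

```python
from collections import Counter

def profile(code, n=3):
    """Normalized character n-gram frequency profile of a code sample."""
    grams = Counter(code[i:i + n] for i in range(len(code) - n + 1))
    total = sum(grams.values())
    return {g: c / total for g, c in grams.items()}

def similarity(p, q):
    """Histogram intersection of two n-gram profiles (1.0 = identical)."""
    return sum(min(p.get(g, 0.0), q.get(g, 0.0)) for g in set(p) | set(q))

# Invented samples of each candidate author's known style.
known = {
    "alice": "for(int i=0;i<n;i++){sum+=a[i];}",
    "bob":   "while (idx < count)\n{\n    total = total + values[idx];\n    idx += 1;\n}",
}
disputed = "for(int j=0;j<m;j++){acc+=b[j];}"

best = max(known, key=lambda a: similarity(profile(disputed), profile(known[a])))
print(best)  # alice
```

    Real systems add many more stylistic features (layout, identifiers, syntax metrics) and a proper classifier, but the profile-and-compare structure is the same.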

  12. A Nutrient-Driven tRNA Modification Alters Translational Fidelity and Genome-wide Protein Coding across an Animal Genus

    PubMed Central

    Zaborske, John M.; Bauer DuMont, Vanessa L.; Wallace, Edward W. J.; Pan, Tao; Aquadro, Charles F.; Drummond, D. Allan

    2014-01-01

    Natural selection favors efficient expression of encoded proteins, but the causes, mechanisms, and fitness consequences of evolved coding changes remain an area of aggressive inquiry. We report a large-scale reversal in the relative translational accuracy of codons across 12 fly species in the Drosophila/Sophophora genus. Because the reversal involves pairs of codons that are read by the same genomically encoded tRNAs, we hypothesize, and show by direct measurement, that a tRNA anticodon modification from guanosine to queuosine has coevolved with these genomic changes. Queuosine modification is present in most organisms but its function remains unclear. Modification levels vary across developmental stages in D. melanogaster, and, consistent with a causal effect, genes maximally expressed at each stage display selection for codons that are most accurate given stage-specific queuosine modification levels. In a kinetic model, the known increased affinity of queuosine-modified tRNA for ribosomes increases the accuracy of cognate codons while reducing the accuracy of near-cognate codons. Levels of queuosine modification in D. melanogaster reflect bioavailability of the precursor queuine, which eukaryotes scavenge from the tRNAs of bacteria and absorb in the gut. These results reveal a strikingly direct mechanism by which recoding of entire genomes results from changes in utilization of a nutrient. PMID:25489848

  13. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

    For many reasons, the size of wind turbines on the rapidly growing wind energy market is increasing, and the relations between the aeroelastic properties of these new large turbines are changing. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. An automatic code-generating system offers an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for the rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters, and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and interest in design optimization is growing.
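
    The derive-then-generate workflow described above used Mathematica; the same idea can be sketched with SymPy as an open-source analogue, deriving a Lagrange equation of motion for a toy model and emitting Fortran for the result. The pendulum model and symbol names are illustrative, not VIDYN's.

```python
import sympy as sp

# Toy model: a pendulum with Lagrangian L = m*l**2*theta'**2/2 + m*g*l*cos(theta).
t = sp.symbols('t')
m, g, l = sp.symbols('m g l', positive=True)
th = sp.Function('theta')(t)
L = sp.Rational(1, 2) * m * l**2 * sp.diff(th, t)**2 + m * g * l * sp.cos(th)

# Lagrange's equation: d/dt(dL/d(theta')) - dL/d(theta) = 0.
eom = sp.diff(L.diff(sp.diff(th, t)), t) - L.diff(th)

# Solve for the angular acceleration and emit Fortran for the right-hand side.
rhs = sp.solve(sp.Eq(eom, 0), sp.diff(th, t, 2))[0].subs(th, sp.Symbol('theta'))
code_str = sp.fcode(rhs, assign_to='thetadd', source_format='free')
print(code_str)
```

    For a full turbine model the same mechanics produce the subroutines for every degree of freedom automatically, which is what makes the generated code both specific and fast.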

  14. CO-FIRING COAL: FEEDLOT AND LITTER BIOMASS FUELS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Kalyan Annamalai; Dr. John Sweeten; Dr. Sayeed Mukhtar

    2000-10-24

    The following are proposed activities for quarter 1 (6/15/00-9/14/00): (1) Finalize the allocation of funds within TAMU to co-principal investigators and the final task lists; (2) Acquire a 3-D computer code for coal combustion and modify it for cofiring coal:feedlot biomass and coal:litter biomass fuels; (3) Develop a simple one-dimensional model for a fixed-bed gasifier cofired with coal:biomass fuels; and (4) Prepare the boiler burner for reburn tests with feedlot biomass fuels. The following were achieved during the quarter (6/15/00-9/14/00): (1) Funds are being allocated to co-principal investigators; the task list from Prof. Mukhtar has been received (Appendix A); (2) An order has been placed to acquire the Pulverized Coal Gasification and Combustion 3-D (PCGC-3) computer code for coal combustion, to be modified for cofiring coal:feedlot biomass and coal:litter biomass fuels; the reason for selecting this code is the availability of source code for modification to include biomass fuels; (3) A simplified one-dimensional model has been developed; however, convergence has not yet been achieved; and (4) The length of the boiler burner has been increased to increase the residence time, and a premixed propane burner has been installed to simulate coal combustion gases. First, coal as a reburn fuel will be used to generate baseline data, followed by methane, feedlot, and litter biomass fuels.

  15. SHIELDING CONSIDERATIONS FOR THE SMALL ANIMAL RADIATION RESEARCH PLATFORM (SARRP)

    PubMed Central

    Sayler, Elaine; Dolney, Derek; Avery, Stephen; Koch, Cameron

    2014-01-01

    The Small Animal Radiation Research Platform (SARRP) is a commercially available platform designed to deliver conformal, image-guided radiation to small animals using a dual-anode kV x-ray source. At the University of Pennsylvania, a free-standing 2 m³ enclosure was designed to shield the SARRP according to federal code regulating cabinet x-ray systems. The initial design consisted of 4.0-mm-thick lead for all secondary barriers and proved wholly inadequate. Radiation levels outside the enclosure were 15 times higher than expected. Additionally, the leakage appeared to be distributed broadly within the enclosure, so concern arose that a subject might receive significant doses outside the intended treatment field. Thus, a detailed analysis was undertaken to identify and block all sources of leakage. Leakage sources were identified by Kodak X-Omat V (XV) film placed throughout the enclosure. Radiation inside the enclosure was quantified using Gafchromic film. Outside the enclosure, radiation was measured using a survey meter. Sources of leakage included (1) an unnecessarily broad beam exiting the tube, (2) failure of the secondary collimator to confine the primary beam entirely, (3) scatter from the secondary collimator, (4) lack of beam-stop below the treatment volume, and (5) incomplete shielding of the x-ray tube. The exit window was restricted, and a new collimator was designed to address problems (1–3). A beam-stop and additional tube shielding were installed. These modifications reduced internal scatter by more than 100-fold. Radiation outside the enclosure was reduced to levels compliant with federal regulations, provided the SARRP is operated using tube potentials of 175 kV or less. In addition, these simple and relatively inexpensive modifications eliminate the possibility of exposing a larger animal (such as a rat) to significant doses outside the treatment field. PMID:23532076

  16. Coding Strategies and Implementations of Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Tsai, Tsung-Han

    This dissertation studies the coding strategies of computational imaging to overcome the limitations of conventional sensing techniques. The information capacity of conventional sensing is limited by the physical properties of optics, such as aperture size, detector pixels, quantum efficiency, and sampling rate. These parameters determine the spatial, depth, spectral, temporal, and polarization sensitivity of each imager; increasing sensitivity in any one dimension can significantly compromise the others. This research implements various coding strategies for optical multidimensional imaging and acoustic sensing in order to extend their sensing abilities. The proposed coding strategies combine hardware modification and signal processing to exploit bandwidth and sensitivity from conventional sensors. We discuss the hardware architecture, compression strategies, sensing process modeling, and reconstruction algorithm of each sensing system. Optical multidimensional imaging measures three or more dimensions of the optical signal. Traditional multidimensional imagers acquire extra dimensional information at the cost of degraded temporal or spatial resolution. Compressive multidimensional imaging multiplexes the transverse spatial, spectral, temporal, and polarization information on a two-dimensional (2D) detector. The corresponding spectral, temporal, and polarization coding strategies adapt optics, electronic devices, and designed modulation techniques for multiplexed measurement. This computational imaging technique provides multispectral, temporal super-resolution, and polarization imaging abilities with minimal loss in spatial resolution and noise level, while maintaining or gaining temporal resolution. The experimental results show that appropriate coding strategies can increase sensing capacity by a factor of hundreds.
    The human auditory system has an astonishing ability to localize, track, and filter selected sound sources or information in a noisy environment. Accomplishing the same task with engineering usually requires multiple detectors, advanced computational algorithms, or artificial intelligence systems. Compressive acoustic sensing incorporates acoustic metamaterials into compressive sensing theory to emulate the abilities of sound localization and selective attention. This research investigates and optimizes the sensing capacity and spatial sensitivity of the acoustic sensor. The well-modeled acoustic sensor can localize multiple speakers in both stationary and dynamic auditory scenes, and can distinguish mixed conversations from independent sources with a high audio-recognition rate.
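
    The compressive-sensing principle underlying these coding strategies can be sketched end to end: a sparse signal is recovered from fewer coded measurements than unknowns. The sizes, the Gaussian sensing matrix, and the use of orthogonal matching pursuit below are illustrative choices for the principle, not the dissertation's hardware or algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3          # signal length, measurements (m < n), sparsity

# A k-sparse signal and its m coded (random) projections.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # sensing ("coding") matrix
y = A @ x                                      # compressed measurements

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the best-correlated atom
    k times, refitting coefficients on the chosen support by least squares."""
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
```

    The multiplexing hardware plays the role of A: it mixes many signal dimensions onto few detector measurements, and reconstruction exploits sparsity to undo the mixing.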

  17. Differences in reported sepsis incidence according to study design: a literature review.

    PubMed

    Mariansdatter, Saga Elise; Eiset, Andreas Halgreen; Søgaard, Kirstine Kobberøe; Christiansen, Christian Fynbo

    2016-10-12

    Sepsis and severe sepsis are common conditions in hospital settings, and are associated with high rates of morbidity and mortality, but reported incidences vary considerably. In this literature review, we describe the variation in reported population-based incidences of sepsis and severe sepsis. We also examine methodological and demographic differences between studies that may explain this variation. We carried out a literature review searching three major databases and reference lists of relevant articles, to identify all original studies reporting the incidence of sepsis or severe sepsis in the general population. Two authors independently assessed all articles, and the final decision to exclude an article was reached by consensus. We extracted data according to predetermined variables, including study country, sepsis definition, and data source. We then calculated descriptive statistics for the reported incidences of sepsis and severe sepsis. The studies were classified according to the method used to identify cases of sepsis or severe sepsis: chart-based (i.e. review of patient charts) or code-based (i.e. predetermined International Classification of Diseases [ICD] codes). Among 482 articles initially screened, we identified 23 primary publications reporting incidence of sepsis and/or severe sepsis in the general population. The reported incidences ranged from 74 to 1180 per 100,000 person-years and 3 to 1074 per 100,000 person-years for sepsis and severe sepsis, respectively. Most chart-based studies used the Bone criteria (or a modification thereof) and Protein C Worldwide Evaluation in Severe Sepsis (PROWESS) study criteria to identify cases of sepsis and severe sepsis. Most code-based studies used ICD-9 codes, but the number of codes used ranged from 1 to more than 1200. We found that the incidence varied according to how sepsis was identified (chart-based vs. code-based), calendar year, data source, and world region.
The reported incidences of sepsis and severe sepsis in the general population varied greatly between studies. Such differences may be attributable to differences in the methods used to collect the data, the study period, or the world region where the study was undertaken. This finding highlights the importance of standardised definitions and acquisition of data regarding sepsis and severe sepsis.
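
    The code-based identification strategy the review contrasts with chart review can be sketched as a set-membership test over discharge diagnoses. The ICD-9-CM codes below are a tiny illustrative subset and the records are invented; this is not a validated sepsis case definition.

```python
# Flag admissions whose discharge diagnoses include predefined ICD codes.
SEPSIS_CODES = {"038.9", "995.91", "995.92"}   # example ICD-9-CM sepsis codes

admissions = [
    {"id": 1, "dx": ["410.1", "038.9"]},
    {"id": 2, "dx": ["486"]},
    {"id": 3, "dx": ["995.92", "584.9"]},
]

cases = [a["id"] for a in admissions if SEPSIS_CODES.intersection(a["dx"])]
print(cases)  # [1, 3]
```

    The review's point follows directly: the reported incidence depends heavily on which and how many codes go into SEPSIS_CODES.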

  18. Improvements to the fastex flutter analysis computer code

    NASA Technical Reports Server (NTRS)

    Taylor, Ronald F.

    1987-01-01

    Modifications to the FASTEX flutter analysis computer code (UDFASTEX) are described. The objectives were to increase the problem size capacity of FASTEX, to reduce run times by modification of the modal interpolation procedure, and to add new user features. All modifications to the program are operable on the VAX 11/700 series computers under the VAX operating system. Interfaces were provided to aid in the inclusion of alternate aerodynamic and flutter eigenvalue calculations. Plots can be made of the flutter velocity, damping, and frequency data. A preliminary capability was also developed to plot contours of unsteady pressure amplitude and phase. The relevant equations of motion, modal interpolation procedures, and control system considerations are described, and software developments are summarized. Additional information documenting input instructions, procedures, and details of the plate spline algorithm is found in the appendices.

  19. Schroedinger’s code: Source code availability and transparency in astrophysics

    NASA Astrophysics Data System (ADS)

    Ryan, PW; Allen, Alice; Teuben, Peter

    2018-01-01

    Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal’s 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October 2017.
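
    The URL-triage step can be sketched as follows. The "reliable domain" list here is an invented example (not the one used in the study), and the sketch stops before the actual HTTP accessibility test.

```python
from urllib.parse import urlparse

# Drop URLs from long-term, reliable domains; the remainder would then be
# tested for accessibility with HTTP requests (not done in this sketch).
STABLE_DOMAINS = {"doi.org", "arxiv.org", "adsabs.harvard.edu"}

def needs_check(url):
    host = urlparse(url).netloc.lower()
    return not any(host == d or host.endswith("." + d) for d in STABLE_DOMAINS)

urls = [
    "https://doi.org/10.1000/example",
    "http://example.edu/~author/code.tar.gz",
]
to_test = [u for u in urls if needs_check(u)]
```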

  20. A (72, 36; 15) box code

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1993-01-01

    A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.
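
    The building block of this construction is the extended (8,4;4) Hamming code. The sketch below uses one conventional generator matrix (a standard choice, not necessarily the paper's) and verifies the claimed minimum distance d = 4 by enumerating all 16 codewords.

```python
from itertools import product

# One conventional generator matrix for the extended (8,4;4) Hamming code.
G = [
    [1, 0, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
]

def encode(bits):
    """Encode 4 data bits into an 8-bit codeword: c = m * G over GF(2)."""
    return [sum(b * g for b, g in zip(bits, col)) % 2 for col in zip(*G)]

# The code is linear, so its minimum distance equals its minimum nonzero weight.
codewords = [encode(msg) for msg in product([0, 1], repeat=4)]
d_min = min(sum(c) for c in codewords if any(c))
print(d_min)  # 4
```

    The box code then arranges such codewords with parity constraints across the other dimension of the 9 x 8 matrix, which is what makes many multi-error patterns decodable by rows and columns.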

  1. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary Kyle; Denman, Matthew R.

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  2. Genomic analysis of organismal complexity in the multicellular green alga Volvox carteri

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prochnik, Simon E.; Umen, James; Nedelcu, Aurora

    2010-07-01

    Analysis of the Volvox carteri genome reveals that this green alga's increased organismal complexity and multicellularity are associated with modifications in protein families shared with its unicellular ancestor, and not with large-scale innovations in protein coding capacity. The multicellular green alga Volvox carteri and its morphologically diverse close relatives (the volvocine algae) are uniquely suited for investigating the evolution of multicellularity and development. We sequenced the 138 Mb genome of V. carteri and compared its {approx}14,500 predicted proteins to those of its unicellular relative, Chlamydomonas reinhardtii. Despite fundamental differences in organismal complexity and life history, the two species have similar protein-coding potentials, and few species-specific protein-coding gene predictions. Interestingly, volvocine algal-specific proteins are enriched in Volvox, including those associated with an expanded and highly compartmentalized extracellular matrix. Our analysis shows that increases in organismal complexity can be associated with modifications of lineage-specific proteins rather than large-scale invention of protein-coding capacity.

  3. TEMPEST code modifications and testing for erosion-resisting sludge simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Y.; Trent, D.S.

    The TEMPEST computer code has been used to address many waste retrieval operational and safety questions regarding waste mobilization, mixing, and gas retention. Because the amount of sludge retrieved from the tank is directly related to the sludge yield strength and the shear stress acting upon it, it is important to incorporate the sludge yield strength into simulations of erosion-resisting tank waste retrieval operations. This report describes current efforts to modify the TEMPEST code to simulate pump jet mixing of erosion-resisting tank wastes and the models used to test for erosion of waste sludge with yield strength. Test results for solid deposition and diluent/slurry jet injection into sludge layers in simplified tank conditions show that the modified TEMPEST code has a basic ability to simulate both the mobility and immobility of the sludges with yield strength. Further testing, modification, calibration, and verification of the sludge mobilization/immobilization model are planned using erosion data as they apply to waste tank sludges.

  4. Transition to international classification of disease version 10, clinical modification: the impact on internal medicine and internal medicine subspecialties.

    PubMed

    Caskey, Rachel N; Abutahoun, Angelos; Polick, Anne; Barnes, Michelle; Srivastava, Pavan; Boyd, Andrew D

    2018-05-04

    The US health care system uses diagnostic codes for billing and reimbursement as well as for quality assessment and measuring clinical outcomes. The US transitioned to the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) in October 2015. Little is known about the impact of ICD-10-CM on internal medicine and medicine subspecialists. We used a state-wide data set from Illinois Medicaid specified for internal medicine providers and subspecialists. A total of 3191 ICD-9-CM codes were used for 51,078 patient encounters, for a total cost of US $26,022,022 for all internal medicine. We categorized all of the ICD-9-CM codes based on the complexity of mapping to ICD-10-CM, as codes with complex mapping could result in billing or administrative errors during the transition. Codes found to have complex mapping and frequently used codes (n = 295) were analyzed for clinical accuracy of mapping to ICD-10-CM. Each subspecialty was analyzed for the complexity of codes used and the proportion of reimbursement associated with complex codes. Twenty-five percent of internal medicine codes have convoluted mapping to ICD-10-CM; these represent 22% of Illinois Medicaid patients and 30% of reimbursements. Rheumatology and endocrinology had the greatest proportion of visits and reimbursement associated with complex codes. We found that 14.5% of ICD-9-CM codes used by internists, when mapped to ICD-10-CM, resulted in potential clinical inaccuracies. We also identified that 43% of the diagnostic codes evaluated, accounting for 14% of internal medicine reimbursements, are associated with mappings that could result in administrative errors.

  5. Multitasking the INS3D-LU code on the Cray Y-MP

    NASA Technical Reports Server (NTRS)

    Fatoohi, Rod; Yoon, Seokkwan

    1991-01-01

    This paper presents the results of multitasking the INS3D-LU code on eight processors. The code is a full Navier-Stokes solver for incompressible fluid in three dimensional generalized coordinates using a lower-upper symmetric-Gauss-Seidel implicit scheme. This code has been fully vectorized on oblique planes of sweep and parallelized using autotasking with some directives and minor modifications. The timing results for five grid sizes are presented and analyzed. The code has achieved a processing rate of over one Gflops.

  6. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  7. The Nuremberg Code: its history and implications.

    PubMed

    Kious, B M

    2001-01-01

    The Nuremberg Code is a foundational document in the ethics of medical research and human experimentation; the principles its authors espoused in 1946 have provided the framework for modern codes that address the same issues, and have received little challenge and only slight modification in the decades since. By analyzing the Code's tragic genesis and its normative implications, it is possible to understand some of the essence of modern experimental ethics, as well as certain outstanding controversies that still plague medical science.

  8. Operational rate-distortion performance for joint source and channel coding of images.

    PubMed

    Ruf, M J; Modestino, J W

    1999-01-01

    This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance which closely approach these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.

  9. Assessment and modification of an ion source grid design in KSTAR neutral beam system.

    PubMed

    Lee, Dong Won; Shin, Kyu In; Jin, Hyung Gon; Choi, Bo Guen; Kim, Tae-Seong; Jeong, Seung Ho

    2014-02-01

    A new 2 MW NB (Neutral Beam) ion source for supplying 3.5 MW of NB heating for the KSTAR campaign was developed in 2012, and its grid was made from OFHC (Oxygen Free High Conductivity) copper with rectangular cooling channels. However, plastic deformation such as bulging was found in the plasma grid of the ion source during the overhaul period after the 2012 campaign. A thermal-hydraulic and a thermo-mechanical analysis using the conventional code ANSYS were carried out, and the thermal fatigue life was assessed. It was found that the thermal fatigue life of the OFHC copper grid was about 335 cycles in the case of a 0.165 MW/m(2) heat flux, too short a fatigue life for use as a KSTAR NB ion source grid. To overcome the limited fatigue life of the current design, the following methods were proposed in the present study: (1) changing the OFHC copper to CuCrZr, a copper alloy, or (2) adopting a new design with a pure Mo metal grid and CuCrZr tubes. It is confirmed that the proposed methods meet the requirements by performing the same assessment.

  10. Simulation of ICD-9 to ICD-10-CM Transition for Family Medicine: Simple or Convoluted?

    PubMed

    Grief, Samuel N; Patel, Jesal; Kochendorfer, Karl M; Green, Lee A; Lussier, Yves A; Li, Jianrong; Burton, Michael; Boyd, Andrew D

    2016-01-01

    The objective of this study was to examine the impact of the transition from International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM), to International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM), on family medicine and to identify areas where additional training might be required. Family medicine ICD-9-CM codes were obtained from an Illinois Medicaid data set (113,000 patient visits and $5.5 million in claims). Using the science of networks, we evaluated each ICD-9-CM code used by family medicine physicians to determine whether the transition was simple or convoluted. A simple transition is defined as 1 ICD-9-CM code mapping to 1 ICD-10-CM code, or 1 ICD-9-CM code mapping to multiple ICD-10-CM codes. A convoluted transition is one in which the mappings between coding systems are nonreciprocal and complex, with multiple codes whose definitions become intertwined. Three family medicine physicians evaluated the most frequently encountered complex mappings for clinical accuracy. Of the 1635 diagnosis codes used by family medicine physicians, 70% of the codes were categorized as simple, 27% were convoluted, and 3% had no mapping. For the visits, 75%, 24%, and 1% corresponded with simple, convoluted, and no mapping, respectively. Payment for submitted claims was similarly aligned. Of the frequently encountered convoluted codes, 3 diagnosis codes were clinically incorrect, but they represent only <0.1% of the overall diagnosis codes. The transition to ICD-10-CM is simple for 70% or more of diagnosis codes, visits, and reimbursement for a family medicine physician. However, some frequently used codes for disease management are convoluted or incorrect, and additional resources need to be invested in these to ensure a successful transition to ICD-10-CM. © Copyright 2016 by the American Board of Family Medicine.
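    The simple/convoluted distinction defined above can be illustrated with a small sketch. This is a hypothetical one-hop approximation of the authors' network analysis, not their actual method: `classify_icd9_codes` and the toy `gem` mapping are illustrative names, and a code is called convoluted here whenever another ICD-9 code shares one of its ICD-10 targets, so their definitions are intertwined.

    ```python
    from collections import defaultdict

    def classify_icd9_codes(gem):
        """gem: dict mapping each ICD-9 code to a set of ICD-10 codes (a toy
        stand-in for the CMS General Equivalence Mappings). A code is 'simple'
        when no other ICD-9 code shares any of its ICD-10 targets (1-to-1 or
        1-to-many), 'convoluted' when its targets are shared with other ICD-9
        codes, and 'no mapping' when it has no targets at all."""
        owners = defaultdict(set)       # ICD-10 code -> ICD-9 codes mapping to it
        for c9, targets in gem.items():
            for c10 in targets:
                owners[c10].add(c9)
        result = {}
        for c9, targets in gem.items():
            if not targets:
                result[c9] = "no mapping"
            elif all(owners[t] == {c9} for t in targets):
                result[c9] = "simple"
            else:
                result[c9] = "convoluted"
        return result
    ```

    With a toy mapping in which two hypertension codes both map to I10, both would be classified as convoluted, while a code with its own unshared target would be simple.
    
    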

  11. Simulation of ICD-9 to ICD-10-CM transition for family medicine: simple or convoluted?

    PubMed Central

    Grief, Samuel N.; Patel, Jesal; Lussier, Yves A.; Li, Jianrong; Burton, Michael; Boyd, Andrew D.

    2017-01-01

    Objectives The objective of this study was to examine the impact of the transition from International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM), to International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM), on family medicine and identify areas where additional training might be required. Methods Family medicine ICD-9-CM codes were obtained from an Illinois Medicaid data set (113,000 patient visits and $5.5 million in claims). Using the science of networks, we evaluated each ICD-9-CM code used by family medicine physicians to determine if the transition was simple or convoluted. A simple transition is defined as one ICD-9-CM code mapping to one ICD-10-CM code, or one ICD-9-CM code mapping to multiple ICD-10-CM codes. A convoluted transition is one in which the mappings between coding systems are nonreciprocal and complex, with multiple codes whose definitions become intertwined. Three family medicine physicians evaluated the most frequently encountered complex mappings for clinical accuracy. Results Of the 1635 diagnosis codes used by the family medicine physicians, 70% of the codes were categorized as simple, 27% of the diagnosis codes were convoluted, and 3% were found to have no mapping. For the visits, 75%, 24%, and 1% corresponded with simple, convoluted, and no mapping, respectively. Payment for submitted claims was similarly aligned. Of the frequently encountered convoluted codes, 3 diagnosis codes were clinically incorrect, but they represent only < 0.1% of the overall diagnosis codes. Conclusions The transition to ICD-10-CM is simple for 70% or more of diagnosis codes, visits, and reimbursement for a family medicine physician. However, some frequently used codes for disease management are convoluted or incorrect, and additional resources need to be invested in these to ensure a successful transition to ICD-10-CM. PMID:26769875

  12. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

    Research into language concepts for the Mission Control Center is presented, along with the accompanying source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  13. 77 FR 14438 - Petitions for Modification of Application of Existing Mandatory Safety Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-09

    ... the Code of Federal Regulations (30 CFR), part 44 govern the application, processing, and disposition.... Regulation Affected: 30 CFR 75.1200(d) & (i) Mine map). Modification Request: The petitioner requests a... affected area. The petitioner asserts that the proposed alternative method will provide no less than the...

  14. 76 FR 17736 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-30

    ... DEPARTMENT OF TRANSPORTATION Federal Railroad Administration Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System Pursuant to Title 49 Code of Federal Regulations (CFR) Part 235 and 49 U.S.C. 20502(a), the following railroad has petitioned the Federal Railroad...

  15. Analyzing Relational Control in Family Therapy Interviews.

    ERIC Educational Resources Information Center

    Friedlander, Myrna L.; Heatherington, Laurie

    1989-01-01

    Introduces a modification of Ericson and Rogers' (1973) dyadic Relational Communication Control Coding System (RCCCS) for family contexts involving three or more persons. New coding rules were necessary because in families messages are not always reciprocal or direct. An illustrative excerpt shows the kinds of indexes provided by the system. (TE)

  16. AMS 4.0: consensus prediction of post-translational modifications in protein sequences.

    PubMed

    Plewczynski, Dariusz; Basu, Subhadip; Saha, Indrajit

    2012-08-01

    We present here the 2011 update of the AutoMotif Service (AMS 4.0), which predicts a wide selection of 88 different types of single-amino-acid post-translational modifications (PTM) in protein sequences. The selection of experimentally confirmed modifications is acquired from the latest UniProt and Phospho.ELM databases for training. The sequence vicinity of each modified residue is represented using amino acid physico-chemical features encoded with high-quality indices (HQI) obtained by automatic clustering of known indices extracted from the AAindex database. For each type of numerical representation, the method builds an ensemble of Multi-Layer Perceptron (MLP) pattern classifiers, each optimizing a different objective during training (for example the recall, precision, or area under the ROC curve (AUC)). The consensus is built using brainstorming technology, which combines multi-objective instances of a machine learning algorithm with data fusion of different training-object representations, in order to boost the overall prediction accuracy of conserved short sequence motifs. The performance of AMS 4.0 is compared with the accuracy of previous versions, which were constructed using single machine learning methods (artificial neural networks, support vector machines). Our software improves the average AUC score of the earlier version by close to 7% as calculated on the test datasets of all 88 PTM types. Moreover, for the most difficult sequence motif types it improves prediction performance by almost 32% compared with the previously used single machine learning methods. In summary, the brainstorming consensus meta-learning methodology boosts the AUC score to around 89% on average over all 88 PTM types. Detailed results for single machine learning methods and the consensus methodology are also provided, together with a comparison to previously published methods and state-of-the-art software tools. The source code and precompiled binaries of the brainstorming tool are available at http://code.google.com/p/automotifserver/ under Apache 2.0 licensing.
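    The consensus idea described above can be reduced to a minimal sketch: several classifiers, each trained toward a different objective, score the same candidate sites, and their scores are fused into one consensus call. The random score matrix and the simple mean fusion below are illustrative stand-ins, not AMS 4.0's MLP ensemble or its brainstorming algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def consensus(score_matrix):
        """Fuse per-classifier scores (one row per classifier) into one
        consensus score per candidate site by simple averaging."""
        return score_matrix.mean(axis=0)

    scores = rng.random((5, 10))   # 5 classifiers x 10 candidate modification sites
    cons = consensus(scores)       # one fused score per site
    calls = cons > 0.5             # thresholded consensus prediction
    ```

    In practice the fusion rule and threshold would themselves be tuned against the training objective (recall, precision, or AUC) for each PTM type.
    
    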

  17. PCTDSE: A parallel Cartesian-grid-based TDSE solver for modeling laser-atom interactions

    NASA Astrophysics Data System (ADS)

    Fu, Yongsheng; Zeng, Jiaolong; Yuan, Jianmin

    2017-01-01

    We present a parallel Cartesian-grid-based time-dependent Schrödinger equation (TDSE) solver for modeling laser-atom interactions. It can simulate the single-electron dynamics of atoms in arbitrary time-dependent vector potentials. We use a split-operator method combined with fast Fourier transforms (FFT) on a three-dimensional (3D) Cartesian grid. Parallelization is realized using a 2D decomposition strategy based on the Message Passing Interface (MPI) library, which results in good parallel scaling on modern supercomputers. We give simple applications for the hydrogen atom using benchmark problems from the references and obtain repeatable results. Extensions to other laser-atom systems are straightforward with minimal modifications of the source code.
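    The split-operator FFT scheme named above can be illustrated in one dimension. This is a minimal sketch under illustrative assumptions (atomic units with m = 1, a harmonic stand-in potential, a Gaussian initial state), not the 3D MPI-parallel PCTDSE code: each step applies a half step of the potential phase, a full kinetic step in Fourier space, and another half potential step.

    ```python
    import numpy as np

    # Grid and time step (illustrative choices).
    N, L = 256, 40.0
    dx = L / N
    x = np.arange(N) * dx - L / 2
    k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
    dt, steps = 0.01, 1000

    V = 0.5 * x**2                           # harmonic stand-in potential
    psi = np.exp(-x**2 / 2).astype(complex)  # Gaussian initial wave function
    psi /= np.sqrt((np.abs(psi)**2).sum() * dx)

    expV = np.exp(-0.5j * dt * V)            # half step in the potential
    expT = np.exp(-0.5j * dt * k**2)         # full kinetic step in Fourier space

    for _ in range(steps):
        psi = expV * psi                           # V/2
        psi = np.fft.ifft(expT * np.fft.fft(psi))  # T, applied in k-space
        psi = expV * psi                           # V/2

    norm = (np.abs(psi)**2).sum() * dx       # unitary scheme preserves the norm
    ```

    Because every factor is a pure phase and the FFT pair is an exact inverse, the norm is conserved to machine precision, a standard sanity check for split-operator propagators.
    
    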

  18. Measuring diagnoses: ICD code accuracy.

    PubMed

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-10-01

    To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.

  19. Complex Interplay among DNA Modification, Noncoding RNA Expression and Protein-Coding RNA Expression in Salvia miltiorrhiza Chloroplast Genome

    PubMed Central

    Chen, Haimei; Zhang, Jianhui; Yuan, George; Liu, Chang

    2014-01-01

    Salvia miltiorrhiza is one of the most widely used medicinal plants. As a first step to develop a chloroplast-based genetic engineering method for the over-production of active components from S. miltiorrhiza, we have analyzed the genome, transcriptome, and base modifications of the S. miltiorrhiza chloroplast. Total genomic DNA and RNA were extracted from fresh leaves and then subjected to strand-specific RNA-Seq and Single-Molecule Real-Time (SMRT) sequencing analyses. Mapping the RNA-Seq reads to the genome assembly allowed us to determine the relative expression levels of 80 protein-coding genes. In addition, we identified 19 polycistronic transcription units and 136 putative antisense and intergenic noncoding RNA (ncRNA) genes. Comparison of the abundance of protein-coding transcripts (cRNA) with and without overlapping antisense ncRNAs (asRNA) suggests that the presence of asRNA is associated with increased cRNA abundance (p<0.05). Using the SMRT Portal software (v1.3.2), 2687 potential DNA modification sites and two potential DNA modification motifs were predicted. The two motifs include a TATA box–like motif (CPGDMM1, “TATANNNATNA”) and an unknown motif (CPGDMM2, “WNYANTGAW”). Specifically, 35 of the 97 CPGDMM1 motifs (36.1%) and 91 of the 369 CPGDMM2 motifs (24.7%) were found to be significantly modified (p<0.01). Analysis of genes downstream of the CPGDMM1 motif revealed a significantly increased abundance of ncRNA genes that are less than 400 bp away from significantly modified CPGDMM1 motifs (p<0.01). Taken together, the present study reveals a complex interplay among DNA modifications, ncRNA expression, and cRNA expression in the chloroplast genome. PMID:24914614

  20. Complex interplay among DNA modification, noncoding RNA expression and protein-coding RNA expression in Salvia miltiorrhiza chloroplast genome.

    PubMed

    Chen, Haimei; Zhang, Jianhui; Yuan, George; Liu, Chang

    2014-01-01

    Salvia miltiorrhiza is one of the most widely used medicinal plants. As a first step to develop a chloroplast-based genetic engineering method for the over-production of active components from S. miltiorrhiza, we have analyzed the genome, transcriptome, and base modifications of the S. miltiorrhiza chloroplast. Total genomic DNA and RNA were extracted from fresh leaves and then subjected to strand-specific RNA-Seq and Single-Molecule Real-Time (SMRT) sequencing analyses. Mapping the RNA-Seq reads to the genome assembly allowed us to determine the relative expression levels of 80 protein-coding genes. In addition, we identified 19 polycistronic transcription units and 136 putative antisense and intergenic noncoding RNA (ncRNA) genes. Comparison of the abundance of protein-coding transcripts (cRNA) with and without overlapping antisense ncRNAs (asRNA) suggests that the presence of asRNA is associated with increased cRNA abundance (p<0.05). Using the SMRT Portal software (v1.3.2), 2687 potential DNA modification sites and two potential DNA modification motifs were predicted. The two motifs include a TATA box-like motif (CPGDMM1, "TATANNNATNA") and an unknown motif (CPGDMM2, "WNYANTGAW"). Specifically, 35 of the 97 CPGDMM1 motifs (36.1%) and 91 of the 369 CPGDMM2 motifs (24.7%) were found to be significantly modified (p<0.01). Analysis of genes downstream of the CPGDMM1 motif revealed a significantly increased abundance of ncRNA genes that are less than 400 bp away from significantly modified CPGDMM1 motifs (p<0.01). Taken together, the present study reveals a complex interplay among DNA modifications, ncRNA expression, and cRNA expression in the chloroplast genome.

  1. Pseudospectral method for gravitational wave collapse

    NASA Astrophysics Data System (ADS)

    Hilditch, David; Weyhausen, Andreas; Brügmann, Bernd

    2016-03-01

    We present a new pseudospectral code, bamps, for numerical relativity written with the evolution of collapsing gravitational waves in mind. We employ the first-order generalized harmonic gauge formulation. The relevant theory is reviewed, and the numerical method is critically examined and specialized for the task at hand. In particular, we investigate formulation parameters and gauge- and constraint-preserving boundary conditions well suited to nonvanishing gauge source functions. Different types of axisymmetric twist-free moment-of-time-symmetry gravitational wave initial data are discussed. A treatment of the axisymmetric apparent horizon condition is presented with careful attention to regularity on axis. Our apparent horizon finder is then evaluated in a number of test cases. Moving on to evolutions, we investigate modifications to the generalized harmonic gauge constraint damping scheme to improve conservation in the strong-field regime. We demonstrate strong-scaling of our pseudospectral penalty code. We employ the Cartoon method to efficiently evolve axisymmetric data in our 3 +1 -dimensional code. We perform test evolutions of the Schwarzschild spacetime perturbed by gravitational waves and by gauge pulses, both to demonstrate the use of our black-hole excision scheme and for comparison with earlier results. Finally, numerical evolutions of supercritical Brill waves are presented to demonstrate durability of the excision scheme for the dynamical formation of a black hole.

  2. New algorithm for tensor contractions on multi-core CPUs, GPUs, and accelerators enables CCSD and EOM-CCSD calculations with over 1000 basis functions on a single compute node.

    PubMed

    Kaliman, Ilya A; Krylov, Anna I

    2017-04-30

    A new hardware-agnostic contraction algorithm for tensors of arbitrary symmetry and sparsity is presented. The algorithm is implemented as a stand-alone open-source code, libxm. This code is also integrated with the general tensor library libtensor and with the Q-Chem quantum-chemistry package. An overview of the algorithm, its implementation, and benchmarks are presented. Similarly to other tensor software, the algorithm exploits efficient matrix multiplication libraries and assumes that tensors are stored in a block-tensor form. The distinguishing features of the algorithm are: (i) efficient repackaging of the individual blocks into large matrices and back, which affords efficient graphics processing unit (GPU)-enabled calculations without modifications of higher-level codes; (ii) fully asynchronous data transfer between disk storage and fast memory. The algorithm enables canonical all-electron coupled-cluster and equation-of-motion coupled-cluster calculations with single and double substitutions (CCSD and EOM-CCSD) with over 1000 basis functions on a single quad-GPU machine. We show that the algorithm exhibits the predicted theoretical scaling for canonical CCSD calculations, O(N^6), irrespective of the data size on disk. © 2017 Wiley Periodicals, Inc.
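    The block-repackaging idea in point (i) can be sketched in a few lines. This is an illustrative toy, not libxm's implementation: small dense blocks are packed into one large matrix so a single optimized GEMM performs the contraction, and the result agrees with a block-by-block accumulation. The 2x2 block grid and block size are arbitrary choices for the sketch.

    ```python
    import numpy as np

    bs = 4                                   # block edge length (illustrative)
    A_blocks = {(i, j): np.random.rand(bs, bs) for i in range(2) for j in range(2)}
    B_blocks = {(i, j): np.random.rand(bs, bs) for i in range(2) for j in range(2)}

    def pack(blocks, nbi, nbj):
        """Repackage a dict of (i, j) -> block into one large dense matrix."""
        return np.block([[blocks[i, j] for j in range(nbj)] for i in range(nbi)])

    # Pack once, then contract with a single large matrix multiplication.
    C = pack(A_blocks, 2, 2) @ pack(B_blocks, 2, 2)

    # Reference: accumulate block products C[i,j] += A[i,k] @ B[k,j] directly.
    C_ref = np.zeros((2 * bs, 2 * bs))
    for i in range(2):
        for j in range(2):
            for kk in range(2):
                C_ref[i*bs:(i+1)*bs, j*bs:(j+1)*bs] += A_blocks[i, kk] @ B_blocks[kk, j]
    ```

    The payoff of packing is that one large GEMM keeps a GPU or vectorized BLAS busy, whereas many tiny per-block multiplications would be launch- and bandwidth-bound.
    
    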

  3. NIFTY - Numerical Information Field Theory. A versatile PYTHON library for signal inference

    NASA Astrophysics Data System (ADS)

    Selig, M.; Bell, M. R.; Junklewitz, H.; Oppermann, N.; Reinecke, M.; Greiner, M.; Pachajoa, C.; Enßlin, T. A.

    2013-06-01

    NIFTy (Numerical Information Field Theory) is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically without concerning the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTy permits its user to rapidly prototype algorithms in 1D, and then apply the developed code in higher-dimensional settings of real world problems. The set of spaces on which NIFTy operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package is demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined. NIFTy homepage http://www.mpa-garching.mpg.de/ift/nifty/; Excerpts of this paper are part of the NIFTy source code and documentation.

  4. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto-generate C code from a Simulink-based Attitude Determination Control System (ADCS) for use on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used to carry out hardware-in-the-loop testing of satellite components in a convenient manner with easily tunable parameters. Because of the nature of embedded hardware components such as microcontrollers, the simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can introduce new complications into the simulation: the execution order of these models can change based on these modifications. Great care must be taken to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, the process can be considered a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be used effectively on any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  5. Brief surgical procedure code lists for outcomes measurement and quality improvement in resource-limited settings.

    PubMed

    Liu, Charles; Kayima, Peter; Riesel, Johanna; Situma, Martin; Chang, David; Firth, Paul

    2017-11-01

    The lack of a classification system for surgical procedures in resource-limited settings hinders outcomes measurement and reporting. Existing procedure coding systems are prohibitively large and expensive to implement. We describe the creation and prospective validation of 3 brief procedure code lists applicable in low-resource settings, based on analysis of surgical procedures performed at Mbarara Regional Referral Hospital, Uganda's second largest public hospital. We reviewed operating room logbooks to identify all surgical operations performed at Mbarara Regional Referral Hospital during 2014. Based on the documented indication for surgery and procedure(s) performed, we assigned each operation up to 4 procedure codes from the International Classification of Diseases, 9th Revision, Clinical Modification. Coding of procedures was performed by 2 investigators, and a random 20% of procedures were coded by both investigators. These codes were aggregated to generate procedure code lists. During 2014, 6,464 surgical procedures were performed at Mbarara Regional Referral Hospital, to which we assigned 435 unique procedure codes. Substantial inter-rater reliability was achieved (κ = 0.7037). The 111 most common procedure codes accounted for 90% of all codes assigned, 180 accounted for 95%, and 278 accounted for 98%. We considered these sets of codes as 3 procedure code lists. In a prospective validation, we found that these lists described 83.2%, 89.2%, and 92.6% of surgical procedures performed at Mbarara Regional Referral Hospital during August to September of 2015, respectively. Empirically generated brief procedure code lists based on International Classification of Diseases, 9th Revision, Clinical Modification can be used to classify almost all surgical procedures performed at a Ugandan referral hospital. Such a standardized procedure coding system may enable better surgical data collection for administration, research, and quality improvement in resource-limited settings. Copyright © 2017 Elsevier Inc. All rights reserved.
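The list-building step described above, finding how many of the most frequent codes cover 90%, 95%, and 98% of all assignments, is a simple cumulative-frequency computation. The sketch below uses invented code frequencies to show the mechanics; it is not the authors' dataset.

```python
from collections import Counter

# Hypothetical frequency data: how often each procedure code was assigned.
# Method as described: rank codes by frequency, then find how many of the
# most common codes cover each coverage threshold of all code assignments.
def coverage_list_sizes(code_counts, thresholds=(0.90, 0.95, 0.98)):
    total = sum(code_counts.values())
    running, sizes = 0, []
    t_iter = iter(sorted(thresholds))
    target = next(t_iter)
    for rank, (_, n) in enumerate(code_counts.most_common(), start=1):
        running += n
        while running / total >= target:
            sizes.append(rank)              # this many codes reach the target
            target = next(t_iter, None)
            if target is None:
                return sizes
    return sizes

counts = Counter({"74.1": 500, "79.35": 300, "53.0": 150, "86.22": 40, "04.43": 10})
sizes = coverage_list_sizes(counts)  # list lengths for 90%, 95%, 98% coverage
```

With real logbook frequencies, the same computation yields the 111-, 180-, and 278-code lists reported in the abstract.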

  6. Selective inhibitors of trypanosomal uridylyl transferase RET1 establish druggability of RNA post-transcriptional modifications

    PubMed Central

    Cording, Amy; Gormally, Michael; Bond, Peter J.; Carrington, Mark; Balasubramanian, Shankar; Miska, Eric A.; Thomas, Beth

    2017-01-01

    Non-coding RNAs are crucial regulators for a vast array of cellular processes and have been implicated in human disease. These biological processes represent a hitherto untapped resource in our fight against disease. In this work we identify small molecule inhibitors of a non-coding RNA uridylylation pathway. The TUTase family of enzymes is important for modulating non-coding RNA pathways in both human cancer and pathogen systems. We demonstrate that this new class of drug target can be accessed with traditional drug discovery techniques. Using the Trypanosoma brucei TUTase, RET1, we identify TUTase inhibitors and lay the groundwork for the use of this new target class as a therapeutic opportunity for the under-served disease area of African Trypanosomiasis. In a broader sense this work demonstrates the therapeutic potential for targeting RNA post-transcriptional modifications with small molecules in human disease. PMID:26786754

  7. Selective inhibitors of trypanosomal uridylyl transferase RET1 establish druggability of RNA post-transcriptional modifications.

    PubMed

    Cording, Amy; Gormally, Michael; Bond, Peter J; Carrington, Mark; Balasubramanian, Shankar; Miska, Eric A; Thomas, Beth

    2017-05-04

    Non-coding RNAs are crucial regulators for a vast array of cellular processes and have been implicated in human disease. These biological processes represent a hitherto untapped resource in our fight against disease. In this work we identify small molecule inhibitors of a non-coding RNA uridylylation pathway. The TUTase family of enzymes is important for modulating non-coding RNA pathways in both human cancer and pathogen systems. We demonstrate that this new class of drug target can be accessed with traditional drug discovery techniques. Using the Trypanosoma brucei TUTase, RET1, we identify TUTase inhibitors and lay the groundwork for the use of this new target class as a therapeutic opportunity for the under-served disease area of African Trypanosomiasis. In a broader sense this work demonstrates the therapeutic potential for targeting RNA post-transcriptional modifications with small molecules in human disease.

  8. Lunar module voice recorder

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A feasibility unit suitable for use as a voice recorder on the space shuttle was developed. A modification, development, and test program is described. A LM-DSEA recorder was modified to achieve the following goals: (1) redesign the case to allow in-flight cartridge change; (2) change the time code from the LM code to IRIG-B 100 pps code; (3) delete the cold plate requirements (which also requires deletion of long-term thermal vacuum operation at 0.00001 mm Hg); (4) implement track sequence reset during cartridge change; (5) reduce record time per cartridge because of the unavailability of LM thin-base tape; and (6) add an internal VOX key circuit to turn the transport and electronics on/off with the voice data input signal. The recorder was tested at both the LM and shuttle vibration levels. The modified recorder achieved the same level of flutter during vibration as the DSEA recorder prior to modification. Several improvements were made over the specification requirements. The high manufacturing cost is discussed.

  9. Computer code for preliminary sizing analysis of axial-flow turbines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    This mean-diameter flow analysis uses a stage-average velocity diagram as the basis for the computation of efficiency. Input design requirements include power or pressure ratio, flow rate, temperature, pressure, and rotative speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse) or for any specified stage swirl split. Exit turning vanes can be included in the design. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, and last-stage absolute and relative Mach numbers. An analysis is presented along with a description of the computer program input and output with sample cases. The analysis and code presented herein are modifications of those described in NASA-TN-D-6702. These modifications improve modeling rigor and extend code applicability.

  10. Preliminary designs for X-ray source modifications for the Marshall Space Flight Center's X-ray calibration facility

    NASA Technical Reports Server (NTRS)

    Croft, W. L.

    1986-01-01

    The objective of this investigation is to develop preliminary designs for modifications to the X-ray source of the MSFC X-Ray Calibration Facility. Recommendations are made regarding: (1) the production of an unpolarized X-ray beam, (2) modification of the source to provide characteristic X-rays with energies up to 40 keV, and (3) addition of the capability to calibrate instruments in the extreme ultraviolet wavelength region.

  11. Identifying clinically disruptive International Classification of Diseases 10th Revision Clinical Modification conversions to mitigate financial costs using an online tool.

    PubMed

    Venepalli, Neeta K; Qamruzzaman, Yusuf; Li, Jianrong John; Lussier, Yves A; Boyd, Andrew D

    2014-03-01

    To quantify coding ambiguity in International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to ICD-10-CM mappings for hematology-oncology diagnoses within an Illinois Medicaid database and an academic cancer center database (University of Illinois Cancer Center [UICC]), with the goal of anticipating challenges during the ICD-10-CM transition. One data set of ICD-9-CM diagnosis codes came from the 2010 Illinois Department of Medicaid, filtered for diagnoses generated by hematology-oncology providers. The other data set of ICD-9-CM diagnosis codes came from UICC. Using a translational methodology via the Motif Web portal ICD-9-CM conversion tool, ICD-9-CM to ICD-10-CM code conversions were graphically mapped and evaluated for clinical loss of information. The transition to ICD-10-CM led to significant information loss, affecting 8% of total Medicaid codes and 1% of UICC codes; 39 ICD-9-CM codes with information loss accounted for 2.9% of total Medicaid reimbursements and 5.3% of UICC billing charges. Prior work stated hematology-oncology would be the least affected medical specialty. However, information loss affecting 5% of billing costs could evaporate the operating margin of a practice. By identifying codes at risk for complex transitions, the analytic tools described can be replicated for oncology practices to forecast areas requiring additional training and resource allocation. In summary, complex transitions and diagnosis codes associated with information loss within clinical oncology require additional attention during the transition to ICD-10-CM.
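The analysis pattern described, flagging ICD-9-CM codes whose conversion is ambiguous and totaling the reimbursement dollars attached to them, can be sketched in a few lines. The codes, mappings, and dollar amounts below are invented for illustration and are not from the Motif tool or the study data.

```python
# Hedged sketch (not the Motif Web portal itself): flag ICD-9-CM codes whose
# forward mapping is one-to-many ("convoluted"), then compute the share of
# reimbursement attached to those at-risk codes.
def at_risk_share(mapping, reimbursement):
    """mapping: ICD-9 code -> set of ICD-10 targets; reimbursement: code -> $."""
    risky = {c for c, targets in mapping.items() if len(targets) > 1}
    total = sum(reimbursement.values())
    return risky, sum(reimbursement[c] for c in risky) / total

mapping = {
    "205.00": {"C92.00"},              # clean one-to-one conversion
    "238.76": {"D47.4", "D47.1"},      # ambiguous one-to-many conversion
}
reimbursement = {"205.00": 90_000.0, "238.76": 10_000.0}
risky, share = at_risk_share(mapping, reimbursement)  # share of billing at risk
```

Ranking codes by this at-risk share is what lets a practice target training and documentation effort where the financial exposure is largest.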

  12. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal's 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.

  13. Experimental benchmark of kinetic simulations of capacitively coupled plasmas in molecular gases

    NASA Astrophysics Data System (ADS)

    Donkó, Z.; Derzsi, A.; Korolov, I.; Hartmann, P.; Brandt, S.; Schulze, J.; Berger, B.; Koepke, M.; Bruneau, B.; Johnson, E.; Lafleur, T.; Booth, J.-P.; Gibson, A. R.; O'Connell, D.; Gans, T.

    2018-01-01

    We discuss the origin of uncertainties in the results of numerical simulations of low-temperature plasma sources, focusing on capacitively coupled plasmas. These sources can be operated in various gases/gas mixtures, over a wide domain of excitation frequency, voltage, and gas pressure. At low pressures, the non-equilibrium character of the charged particle transport prevails and particle-based simulations become the primary tools for their numerical description. The particle-in-cell method, complemented with a Monte Carlo type description of collision processes, is a well-established approach for this purpose. Codes based on this technique have been developed by several authors/groups, and have been benchmarked with each other in some cases. Such benchmarking demonstrates the correctness of the codes, but the underlying physical model remains unvalidated. This is a key point, as this model should ideally account for all important plasma chemical reactions as well as for the plasma-surface interaction via including specific surface reaction coefficients (electron yields, sticking coefficients, etc.). In order to test the models rigorously, comparison with experimental 'benchmark data' is necessary. Examples are given regarding the studies of electron power absorption modes in O2 and CF4-Ar discharges, as well as on the effect of modifications of the parameters of certain elementary processes on the computed discharge characteristics in O2 capacitively coupled plasmas.

  14. SUPRA: open-source software-defined ultrasound processing for real-time applications : A 2D and 3D pipeline from beamforming to B-mode.

    PubMed

    Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph

    2018-06-01

    Research in ultrasound imaging is limited in reproducibility by two factors: First, many existing ultrasound pipelines are protected by intellectual property, rendering exchange of code difficult. Second, most pipelines are implemented in special hardware, resulting in limited flexibility of implemented processing steps on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and regarding its run time. The pipeline shows image quality comparable to a clinical system and, backed by point spread function measurements, a comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radiofrequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.

  15. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.

  16. Development and Characterization of a Hybrid Atmospheric Pressure Plasma Electrospinning System for Nanofiber Enhancement

    NASA Astrophysics Data System (ADS)

    Nowak, Joshua Michael

    A hybrid atmospheric pressure plasma-electrospinning system was developed for the production of nanofibers and for enhancing their performance in various applications. Electrospun nanofibers are excellent candidates for protective clothing in the field of chemical and biological warfare defense; however, nanofibers are structurally weak and easily abrade and tear. They can be strengthened through the support of a substrate fabric, but they do not adhere well to substrates. Through the use of the developed hybrid system with either pure He or He/O2 (99/1) feed gas, adherence to the substrate along with abrasion and flex resistance were improved. The plasma source was diagnosed electrically, thermally, and optically. An equivalent circuit model was developed for non-thermal, highly collisional plasmas that can solve for average electron temperature and electron number density. The obtained temperatures (~3 eV) correlate very well with the results of a neutral Bremsstrahlung continuum matching technique that was also employed. Using the temperatures and number densities obtained from the circuit model and the optical spectroscopy, a global chemical kinetics code was written in order to solve for radical and ion concentrations. This code shows that there are significant concentrations of oxygen radicals present. The XPS analysis confirmed that surface oxygen increased from 11.1% up to 16.6% for the He/O2 plasma and that C-O bonding, which was not present in the control samples, increased to 45.4%. The adhesive strength to the substrate shows a significant increase of 81% for helium plasma and 144% for He/O2 plasma; however, these values remain below the desired values for protective clothing applications. The hybrid system displayed the ability to oxygenate nanofibers as they are being electrospun and shows the feasibility of making other surface modifications. The developed circuit model and chemical kinetics code both show promise as tools for deterministic atmospheric pressure plasma research in the field of surface modifications.
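A global (zero-dimensional) chemical kinetics code of the kind mentioned balances volume-averaged production and loss rates for each species. The sketch below shows the structure for a single channel, electron-impact dissociation of O2 producing O radicals against an effective loss time; every numerical value is an illustrative placeholder, not a fitted coefficient from the thesis.

```python
# Minimal zero-dimensional ("global") kinetics sketch: one production channel
# balanced against one loss channel. All numbers are assumed placeholders.
k_diss = 1.0e-15   # m^3/s, assumed electron-impact dissociation rate coefficient
n_e    = 1.0e17    # m^-3, electron density (from circuit model / spectroscopy)
n_O2   = 1.0e23    # m^-3, feed-gas O2 density
tau    = 1.0e-3    # s, assumed effective loss time of O radicals

# Integrate dn_O/dt = k_diss * n_e * n_O2 - n_O / tau with forward Euler
n_O, dt = 0.0, 1.0e-5
for _ in range(2000):                 # 20 ms of plasma time, 20 loss times
    n_O += dt * (k_diss * n_e * n_O2 - n_O / tau)

n_O_steady = k_diss * n_e * n_O2 * tau   # analytic steady-state density
```

A real global model couples many such rate equations (one per species) with rate coefficients that depend on the electron temperature supplied by the diagnostics, but each equation has exactly this production-minus-loss form.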

  17. Transmutation Fuel Performance Code Thermal Model Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  18. Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.

    1972-01-01

    A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes in the CDC 6600 computer.

  19. Possibilities for the evolution of the genetic code from a preceding form

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1973-01-01

    Analysis of the interaction between mRNA codons and tRNA anticodons suggests a model for the evolution of the genetic code. Modification of the nucleic acid following the anticodon is at present essential in both eukaryotes and prokaryotes to ensure fidelity of translation of codons starting with A, and the amino acids which could be coded for before the evolution of the modifying enzymes can be deduced.

  20. Non-native (exotic) snake envenomations in the U.S., 2005-2011.

    PubMed

    Warrick, Brandon J; Boyer, Leslie V; Seifert, Steven A

    2014-09-29

    Non-native (exotic) snakes are a problematic source of envenomation worldwide. This manuscript describes the current demographics, outcomes and challenges of non-native snakebites in the United States (U.S.). We performed a retrospective case series of the National Poison Data System (NPDS) database between 2005 and 2011. There were 258 human exposures involving at least 61 unique exotic venomous species (average = 37 per year; range = 33-40). Males comprised 79% and females 21%. The average age was 33 years with 16% less than 20 years old. 70% of bites occurred in a private residence and 86% were treated at a healthcare facility. 35% of cases received antivenom and 10% were given antibiotics. This study is compared to our previous study (1994-2004) in which there was a substantial coding error rate. Software modifications significantly reduced coding errors. Identification and acquisition of appropriate antivenoms pose a number of logistical difficulties in the management of these envenomations. In the U.S., poison centers have valuable systems and clinical roles in the provision of expert consultation and in the management of these cases.

  1. Non-Native (Exotic) Snake Envenomations in the U.S., 2005–2011

    PubMed Central

    Warrick, Brandon J.; Boyer, Leslie V.; Seifert, Steven A.

    2014-01-01

    Non-native (exotic) snakes are a problematic source of envenomation worldwide. This manuscript describes the current demographics, outcomes and challenges of non-native snakebites in the United States (U.S.). We performed a retrospective case series of the National Poison Data System (NPDS) database between 2005 and 2011. There were 258 human exposures involving at least 61 unique exotic venomous species (average = 37 per year; range = 33–40). Males comprised 79% and females 21%. The average age was 33 years with 16% less than 20 years old. 70% of bites occurred in a private residence and 86% were treated at a healthcare facility. 35% of cases received antivenom and 10% were given antibiotics. This study is compared to our previous study (1994–2004) in which there was a substantial coding error rate. Software modifications significantly reduced coding errors. Identification and acquisition of appropriate antivenoms pose a number of logistical difficulties in the management of these envenomations. In the U.S., poison centers have valuable systems and clinical roles in the provision of expert consultation and in the management of these cases. PMID:25268980

  2. You've Written a Cool Astronomy Code! Now What Do You Do with It?

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Accomazzi, A.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P. J.; Wallin, J. F.

    2014-01-01

    Now that you've written a useful astronomy code for your soon-to-be-published research, you have to figure out what you want to do with it. Our suggestion? Share it! This presentation highlights the means and benefits of sharing your code. Make your code citable -- submit it to the Astrophysics Source Code Library and have it indexed by ADS! The Astrophysics Source Code Library (ASCL) is a free online registry of source codes of interest to astronomers and astrophysicists. With over 700 codes, it is continuing its rapid growth, with an average of 17 new codes a month. The editors seek out codes for inclusion; indexing by ADS improves the discoverability of codes and provides a way to cite codes as separate entries, especially codes without papers that describe them.

  3. Calculation Method of Lateral Strengths and Ductility Factors of Constructions with Shear Walls of Different Ductility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamaguchi, Nobuyoshi; Nakao, Masato; Murakami, Masahide

    2008-07-08

    For seismic design, ductility-related force modification factors are named the R factor in the Uniform Building Code of the U.S., the q factor in Eurocode 8, and the Ds (inverse of R) factor in the Japanese Building Code. These ductility-related force modification factors for each type of shear element appear in those codes. Some constructions use various types of shear walls that have different ductility, especially for retrofit or re-strengthening. In these cases, engineers puzzle over the choice of force modification factors for the construction. To solve this problem, a new method to calculate the lateral strengths of stories for simple shear wall systems is proposed and named the 'Stiffness-Potential Energy Addition Method' in this paper. This method uses two design lateral strengths for each type of shear wall, in the damage limit state and the safety limit state. Two lateral strengths of stories in both limit states are calculated from these two design lateral strengths for each type of shear wall in both limit states. The calculated strengths have the same quality as values obtained by the strength addition method using many steps of load-deformation data of shear walls. A new method to calculate ductility factors is also proposed in this paper. This method is based on the new method to calculate lateral strengths of stories, and it can solve the problem of obtaining ductility factors of stories with shear walls of different ductility.
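The conventional strength-addition baseline the authors compare against can be sketched directly: walls in the same story share the same drift, so the story resistance at each drift is the sum of each wall's load-deformation curve, and the governing strength is the peak of that sum. The bilinear wall curves and numbers below are invented for illustration; they are not from the paper.

```python
# Sketch of the conventional strength-addition method (the baseline, not the
# authors' new Stiffness-Potential Energy Addition Method). Wall parameters
# are invented: a ductile wall and a stiffer but brittle wall.
def wall_force(drift, k, yield_drift, ult_drift):
    """Bilinear (elastic-perfectly-plastic) wall; zero beyond ultimate drift."""
    if drift > ult_drift:
        return 0.0
    return k * min(drift, yield_drift)

drifts = [i / 1000 for i in range(1, 61)]  # candidate story drift ratios
ductile = lambda d: wall_force(d, k=400.0, yield_drift=0.010, ult_drift=0.050)
brittle = lambda d: wall_force(d, k=900.0, yield_drift=0.005, ult_drift=0.008)

story = [ductile(d) + brittle(d) for d in drifts]  # walls share the story drift
peak = max(story)                                  # governing lateral strength
```

Note that the peak story strength (7.7 here) is less than the sum of the walls' individual peaks (4.0 + 4.5 = 8.5), because the brittle wall fails before the ductile wall reaches its own peak; this drift-compatibility effect is exactly why mixing walls of different ductility complicates the choice of a single force modification factor.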

  4. GAPP: A Proteogenomic Software for Genome Annotation and Global Profiling of Post-translational Modifications in Prokaryotes.

    PubMed

    Zhang, Jia; Yang, Ming-Kun; Zeng, Honghui; Ge, Feng

    2016-11-01

    Although the number of sequenced prokaryotic genomes is growing rapidly, experimentally verified annotation of prokaryotic genome remains patchy and challenging. To facilitate genome annotation efforts for prokaryotes, we developed an open source software called GAPP for genome annotation and global profiling of post-translational modifications (PTMs) in prokaryotes. With a single command, it provides a standard workflow to validate and refine predicted genetic models and discover diverse PTM events. We demonstrated the utility of GAPP using proteomic data from Helicobacter pylori, one of the major human pathogens that is responsible for many gastric diseases. Our results confirmed 84.9% of the existing predicted H. pylori proteins, identified 20 novel protein coding genes, and corrected four existing gene models with regard to translation initiation sites. In particular, GAPP revealed a large repertoire of PTMs using the same proteomic data and provided a rich resource that can be used to examine the functions of reversible modifications in this human pathogen. This software is a powerful tool for genome annotation and global discovery of PTMs and is applicable to any sequenced prokaryotic organism; we expect that it will become an integral part of ongoing genome annotation efforts for prokaryotes. GAPP is freely available at https://sourceforge.net/projects/gappproteogenomic/. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. 78 FR 44189 - Petition for Modification of Single Car Air Brake Test Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-23

    ...] Petition for Modification of Single Car Air Brake Test Procedures In accordance with Part 232 of Title 49... Administration (FRA) per 49 CFR 232.307 to modify the single car air brake test procedures located in AAR Standard S-486, Code of Air Brake System Tests for Freight Equipment-- Single Car Test, and required...

  6. 75 FR 76069 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... DEPARTMENT OF TRANSPORTATION Federal Railroad Administration Docket Number FRA-2010-0154 Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System Pursuant to Title 49 Code of Federal Regulations (CFR) part 235 and 49 U.S.C. 20502(a), the following railroad has...

  7. Application of Exactly Linearized Error Transport Equations to AIAA CFD Prediction Workshops

    NASA Technical Reports Server (NTRS)

    Derlaga, Joseph M.; Park, Michael A.; Rallabhandi, Sriram

    2017-01-01

    The computational fluid dynamics (CFD) prediction workshops sponsored by the AIAA have created invaluable opportunities in which to discuss the predictive capabilities of CFD in areas in which it has struggled, e.g., cruise drag, high-lift, and sonic boom prediction. While there are many factors that contribute to disagreement between simulated and experimental results, such as modeling or discretization error, quantifying the errors contained in a simulation is important for those who make decisions based on the computational results. The linearized error transport equations (ETE) combined with a truncation error estimate are a method to quantify one source of errors. The ETE are implemented with a complex-step method to provide an exact linearization with minimal source code modifications to CFD and multidisciplinary analysis methods. The equivalency of adjoint and linearized ETE functional error correction is demonstrated. Uniformly refined grids from a series of AIAA prediction workshops demonstrate the utility of ETE for multidisciplinary analysis with a connection between estimated discretization error and (resolved or under-resolved) flow features.
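The complex-step method mentioned above is what makes the linearization exact with minimal code changes: perturb an input along the imaginary axis, and the imaginary part of the output is h times the derivative, with no subtractive cancellation even for extremely small h. A minimal sketch:

```python
import cmath
import math

# Complex-step differentiation: f'(x) = Im(f(x + i*h)) / h.
# Unlike finite differences, there is no subtraction of nearly equal values,
# so h can be made tiny (e.g. 1e-30) and the derivative is exact to
# machine precision. The code under test only needs to accept complex inputs,
# which is why the source-code modifications are minimal.
def complex_step_derivative(f, x, h=1e-30):
    return f(complex(x, h)).imag / h

# Example: d/dx sin(x) at x = 0.7 recovers cos(0.7).
dfdx = complex_step_derivative(cmath.sin, 0.7)
```

In a CFD code, the same trick amounts to promoting real variables to complex ones along the differentiated path, which is a mechanical change compared with hand-deriving and maintaining a separate linearized solver.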

  8. Space shuttle main engine numerical modeling code modifications and analysis

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.

    1988-01-01

    The user of computational fluid dynamics (CFD) codes must be concerned with the accuracy and efficiency of the codes if they are to be used for timely design and analysis of complicated three-dimensional fluid flow configurations. A brief discussion of how accuracy and efficiency affect the CFD solution process is given. A more detailed discussion of how efficiency can be enhanced by using a few Cray Research Inc. utilities to address vectorization is presented, and these utilities are applied to a three-dimensional Navier-Stokes CFD code (INS3D).

  9. Low-rate image coding using vector quantization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makur, A.

    1990-01-01

    This thesis deals with the development and analysis of a computationally simple vector quantization image compression system for coding monochrome images at low bit rate. Vector quantization has been known to be an effective compression scheme when a low bit rate is desirable, but the intensive computation required in a vector quantization encoder has been a handicap in using it for low rate image coding. The present work shows that, without substantially increasing the coder complexity, it is indeed possible to achieve acceptable picture quality while attaining a high compression ratio. Several modifications to the conventional vector quantization coder are proposed in the thesis. These modifications are shown to offer better subjective quality when compared to the basic coder. Distributed blocks are used instead of spatial blocks to construct the input vectors. A class of input-dependent weighted distortion functions is used to incorporate psychovisual characteristics in the distortion measure. Computationally simple filtering techniques are applied to further improve the decoded image quality. Finally, unique designs of the vector quantization coder using electronic neural networks are described, so that the coding delay is reduced considerably.
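    The computational core of any vector quantization coder is a nearest-codeword search under a distortion measure, which is the step whose cost the thesis works to reduce. A minimal sketch of a plain (unweighted squared-error) encoder and decoder, not the thesis's modified coder; the codebook values are illustrative:

```python
def vq_encode(vectors, codebook):
    """Map each input vector to the index of its nearest codeword
    under squared-error distortion (full search: the expensive part)."""
    def sq_dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return [min(range(len(codebook)), key=lambda i: sq_dist(v, codebook[i]))
            for v in vectors]

def vq_decode(indices, codebook):
    """Reconstruction is a table lookup; the rate is log2(len(codebook))
    bits per input vector."""
    return [codebook[i] for i in indices]

codebook = [(0, 0), (10, 10), (0, 10)]
idx = vq_encode([(1, 1), (9, 11)], codebook)
```

The weighted distortion functions described in the abstract would replace `sq_dist` with an input-dependent measure; the structure of the search is unchanged.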

  10. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    NASA Technical Reports Server (NTRS)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the Generalized Fluid System Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a thermodynamic property program that is widely used, and b) modifications of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.

  11. The multidimensional Self-Adaptive Grid code, SAGE, version 2

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1995-01-01

    This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.

  12. The random energy model in a magnetic field and joint source channel coding

    NASA Astrophysics Data System (ADS)

    Merhav, Neri

    2008-09-01

    We demonstrate that there is an intimate relationship between the magnetic properties of Derrida’s random energy model (REM) of spin glasses and the problem of joint source-channel coding in Information Theory. In particular, typical patterns of erroneously decoded messages in the coding problem have “magnetization” properties that are analogous to those of the REM in certain phases, where the non-uniformity of the distribution of the source in the coding problem plays the role of an external magnetic field applied to the REM. We also relate the ensemble performance (random coding exponents) of joint source-channel codes to the free energy of the REM in its different phases.

  13. 40 CFR 52.2592 - Review of new sources and modifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Wisconsin § 52.2592 Review of new sources and modifications. Disapproval—On May 12, 2011, the Wisconsin Department of Natural...

  14. User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E)

    DTIC Science & Technology

    2014-06-01

    User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E), by James P. Larentzos...Laboratory, Aberdeen Proving Ground, MD 21005-5069, ARL-SR-290, June 2014. Dates covered: September 2013–February 2014.

  15. Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold

    1997-01-01

    The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.

  16. Enhanced capabilities and modified users manual for axial-flow compressor conceptual design code CSPAN

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.; Lavelle, Thomas M.

    1995-01-01

    Modifications made to the axial-flow compressor conceptual design code CSPAN are documented in this report. Endwall blockage and stall margin predictions were added. The loss-coefficient model was upgraded. Default correlations for rotor and stator solidity and aspect-ratio inputs and for stator-exit tangential velocity inputs were included in the code along with defaults for aerodynamic design limits. A complete description of input and output along with sample cases are included.

  17. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  18. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist during various times of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and to improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  19. Application Reuse Library for Software, Requirements, and Guidelines

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  20. Dealing with an Unconventional Genetic Code in Mitochondria: The Biogenesis and Pathogenic Defects of the 5-Formylcytosine Modification in Mitochondrial tRNAMet.

    PubMed

    Van Haute, Lindsey; Powell, Christopher A; Minczuk, Michal

    2017-03-02

    Human mitochondria contain their own genome, which uses an unconventional genetic code. In addition to the standard AUG methionine codon, the single mitochondrial tRNA Methionine (mt-tRNAMet) also recognises AUA during translation initiation and elongation. Post-transcriptional modifications of tRNAs are important for structure, stability, correct folding and aminoacylation as well as decoding. The unique 5-formylcytosine (f5C) modification of position 34 in mt-tRNAMet has been long postulated to be crucial for decoding of unconventional methionine codons and efficient mitochondrial translation. However, the enzymes responsible for the formation of mitochondrial f5C have been identified only recently. The first step of the f5C pathway consists of methylation of cytosine by NSUN3. This is followed by further oxidation by ABH1. Here, we review the role of f5C, the latest breakthroughs in our understanding of the biogenesis of this unique mitochondrial tRNA modification and its involvement in human disease.

  1. Learning Disabled Adolescents' Use of Pragmatic Functions and Code-Switching.

    ERIC Educational Resources Information Center

    Biller, Maysoon F.

    The study examined whether a difference existed between 10 learning disabled (LD) and 10 normally achieving (NA) high school students in terms of comprehension and production or use of pragmatic skills. The skills examined were pragmatic function (i.e., an utterance spoken in context with specific intent), and code-switching (i.e., modification of…

  2. General review of the MOSTAS computer code for wind turbines

    NASA Technical Reports Server (NTRS)

    Dugundji, J.; Wendell, J. H.

    1981-01-01

    The MOSTAS computer code for wind turbine analysis is reviewed, and techniques and methods used in its analyses are described. Impressions of its strengths and weaknesses, and recommendations for its application, modification, and further development are made. Basic techniques used in wind turbine stability and response analyses for systems with constant and periodic coefficients are reviewed.

  3. Expanding the genetic code for site-specific labelling of tobacco mosaic virus coat protein and building biotin-functionalized virus-like particles.

    PubMed

    Wu, F C; Zhang, H; Zhou, Q; Wu, M; Ballard, Z; Tian, Y; Wang, J Y; Niu, Z W; Huang, Y

    2014-04-18

    A method for site-specific and high yield modification of tobacco mosaic virus coat protein (TMVCP) utilizing a genetic code expanding technology and copper free cycloaddition reaction has been established, and biotin-functionalized virus-like particles were built by the self-assembly of the protein monomers.

  4. 17 CFR 269.7 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710, Apr... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...

  5. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...

  6. Genetic Code Expansion: A Powerful Tool for Understanding the Physiological Consequences of Oxidative Stress Protein Modifications.

    PubMed

    Porter, Joseph J; Mehl, Ryan A

    2018-01-01

    Posttranslational modifications resulting from oxidation of proteins (Ox-PTMs) are present intracellularly under conditions of oxidative stress as well as basal conditions. In the past, these modifications were thought to be generic protein damage, but it has become increasingly clear that Ox-PTMs can have specific physiological effects. It is an arduous task to distinguish between the two cases, as multiple Ox-PTMs occur simultaneously on the same protein, convoluting analysis. Genetic code expansion (GCE) has emerged as a powerful tool to overcome this challenge as it allows for the site-specific incorporation of an Ox-PTM into translated protein. The resulting homogeneously modified protein products can then be rigorously characterized for the effects of individual Ox-PTMs. We outline the strengths and weaknesses of GCE as they relate to the field of oxidative stress and Ox-PTMs. An overview of the Ox-PTMs that have been genetically encoded and applications of GCE to the study of Ox-PTMs, including antibody validation and therapeutic development, is described.

  7. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  8. Terrestrial Background Reduction in RPM Systems by Direct Internal Shielding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, Sean M.; Ashbaker, Eric D.; Schweppe, John E.

    2008-11-19

    Gamma-ray detection systems that are close to the earth or other sources of background radiation often require shielding, especially when trying to detect a relatively weak source. One particular case of interest that we address in this paper is that encountered by the Radiation Portal Monitor (RPM) systems placed at border-crossing Ports of Entry (POE). These RPM systems are used to screen for illicit radiological materials, and they are often placed in situations where terrestrial background is large. In such environments, it is desirable to consider simple physical modifications that could be implemented to reduce the effects from background radiation without affecting the flow of traffic and the normal operation of the portal. Simple modifications include adding additional shielding to the environment, either inside or outside the apparatus. Previous work [2] has shown the utility of some of these shielding configurations for increasing the Signal to Noise Ratio (SNR) of gross-counting RPMs. Because the total cost for purchasing and installing RPM systems can be quite high, in the range of hundreds of thousands of dollars for each cargo-screening installation, these shielding variations may offer increases in detection capability for relatively small cost. Several modifications are considered here in regard to their real-world applicability, and are meant to give a general idea of the effectiveness of the schemes used to reduce background for both gross-counting and spectroscopic detectors. These scenarios are modeled via the Monte-Carlo N-Particle (MCNP) code package [1] for ease of altering shielding configurations, as well as enacting unusual scenarios prior to prototyping in the field. The objective of this paper is to provide results representative of real modifications that could enhance the sensitivity of this, as well as the next generation of radiation detectors.
The models used in this work were designed to provide the most general results for an RPM. These results are therefore presented as general guidance on what shielding configurations will be the most valuable for a generalized RPM, considered in light of their economic and geometric possibility in the real world.
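    For a gross-counting detector, the benefit of background shielding can be gauged with the usual Poisson figure of merit: net signal counts over the square root of total counts in the measurement window. A toy sketch under the idealizing assumption that internal shielding attenuates only the terrestrial background and passes the source unattenuated; the count values are made up:

```python
import math

def gross_count_snr(source_counts, background_counts):
    """Detection significance for a gross-counting RPM: net signal divided
    by the Poisson noise of the total counts in the measurement window."""
    return source_counts / math.sqrt(source_counts + background_counts)

def shielded_snr(source_counts, background_counts, transmission):
    """Same figure of merit after shielding scales terrestrial background
    by `transmission` (0..1); source attenuation is neglected here."""
    return gross_count_snr(source_counts, background_counts * transmission)

base = gross_count_snr(100, 2400)     # unshielded
half = shielded_snr(100, 2400, 0.5)   # background halved by shielding
```

Because the noise scales as the square root of the background, halving the background improves the significance by less than a factor of two, which is why cost-per-SNR comparisons like those in the paper matter.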

  9. Rule-based interface generation on mobile devices for structured documentation.

    PubMed

    Kock, Ann-Kristin; Andersen, Björn; Handels, Heinz; Ingenerf, Josef

    2014-01-01

    In many software systems to date, interactive graphical user interfaces (GUIs) are represented implicitly in the source code, together with the application logic. Hence, the re-use, development, and modification of these interfaces is often very laborious. Flexible adjustments of GUIs for various platforms and devices as well as individual user preferences are furthermore difficult to realize. These problems motivate a software-based separation of content and GUI models on the one hand, and application logic on the other. In this project, a software solution for structured reporting on mobile devices is developed. Clinical content archetypes developed in a previous project serve as the content model while the Android SDK provides the GUI model. The necessary bindings between the models are specified using the Jess Rule Language.

  10. New tool to assemble repetitive regions using next-generation sequencing data

    NASA Astrophysics Data System (ADS)

    Kuśmirek, Wiktor; Nowak, Robert M.; Neumann, Łukasz

    2017-08-01

    The next-generation sequencing techniques produce a large amount of sequencing data. Some parts of the genome are composed of repetitive DNA sequences, which are very problematic for existing genome assemblers. We propose a modification of the algorithm for DNA assembly, which uses the relative frequency of reads to properly reconstruct repetitive sequences. The new approach was implemented and tested; as a demonstration of the capability of our software, we present some results for model organisms. A three-layer software architecture was selected for the new implementation, keeping the presentation layer, data processing layer, and data storage layer separate. Source code as well as a demo application with web interface and the additional data are available at the project web page: http://dnaasm.sourceforge.net.
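    The relative-frequency idea can be illustrated with a toy calculation: a sequence repeated c times in the genome attracts roughly c times the read coverage of a unique region, so the ratio of observed to single-copy coverage estimates the repeat multiplicity. A hypothetical sketch, not the dnaasm algorithm itself:

```python
def estimated_copy_number(observed_coverage, single_copy_coverage):
    """Estimate how many times a region is repeated in the genome from its
    read coverage relative to a unique (single-copy) region's coverage."""
    return max(1, round(observed_coverage / single_copy_coverage))

# A region covered ~3x more deeply than unique sequence is likely a 3-copy repeat.
copies = estimated_copy_number(91, 30)
```

An assembler can use such multiplicities to decide how many times to traverse a repeat node when reconstructing the genome, instead of collapsing all copies into one.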

  11. Intra-Beam and Touschek Scattering Computations for Beam with Non-Gaussian Longitudinal Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, A.; Borland, M.

    Both intra-beam scattering (IBS) and the Touschek effect become prominent for multi-bend-achromat- (MBA-) based ultra-low-emittance storage rings. To mitigate the transverse emittance degradation and obtain a reasonably long beam lifetime, a higher harmonic rf cavity (HHC) is often proposed to lengthen the bunch. The use of such a cavity results in a non-Gaussian longitudinal distribution. However, common methods for computing IBS and Touschek scattering assume Gaussian distributions. Modifications have been made to several simulation codes that are part of the elegant [1] toolkit to allow these computations for arbitrary longitudinal distributions. After describing these modifications, we review the results of detailed simulations for the proposed hybrid seven-bend-achromat (H7BA) upgrade lattice [2] for the Advanced Photon Source.

  12. Adaptive distributed source coding.

    PubMed

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
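    The syndrome idea behind such coders can be shown with a toy binary example: the encoder transmits only the 3-bit Hamming syndrome of a 7-bit block, and the decoder combines it with correlated side information that differs in at most one position. This sketch uses hard-decision (7,4) Hamming decoding, not the paper's sum-product decoder with doping bits; the bit patterns are illustrative:

```python
# Columns of the (7,4) Hamming parity-check matrix: column j is just the
# binary representation of j, for j = 1..7.
H_COLS = list(range(1, 8))

def syndrome(bits):
    """3-bit syndrome of a 7-bit word, packed into an integer 0..7."""
    s = 0
    for j, b in zip(H_COLS, bits):
        if b:
            s ^= j
    return s

def dsc_decode(side_info, sent_syndrome):
    """Recover x from side information y (Hamming distance <= 1 from x)
    plus the 3-bit syndrome of x: by linearity, syndrome(y) ^ syndrome(x)
    is the syndrome of the error pattern, which names the flipped bit."""
    diff = syndrome(side_info) ^ sent_syndrome
    x = list(side_info)
    if diff:                     # diff is the 1-indexed flipped position
        x[diff - 1] ^= 1
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]        # side info: one bit differs from x
recovered = dsc_decode(y, syndrome(x))
```

Only 3 bits cross the channel instead of 7, the Slepian-Wolf saving; the paper's contribution is handling the case where the statistical dependence between x and y is itself unknown.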

  13. Phase II evaluation of clinical coding schemes: completeness, taxonomy, mapping, definitions, and clarity. CPRI Work Group on Codes and Structures.

    PubMed

    Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J

    1997-01-01

    To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56, UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. 
READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from reduced clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record.

  14. Method and apparatus for controlling carrier envelope phase

    DOEpatents

    Chang, Zenghu [Manhattan, KS; Li, Chengquan [Sunnyvale, CA; Moon, Eric [Manhattan, KS

    2011-12-06

    A chirped pulse amplification laser system. The system generally comprises a laser source, a pulse modification apparatus including first and second pulse modification elements separated by a separation distance, a positioning element, a measurement device, and a feedback controller. The laser source is operable to generate a laser pulse and the pulse modification apparatus operable to modify at least a portion of the laser pulse. The positioning element is operable to reposition at least a portion of the pulse modification apparatus to vary the separation distance. The measurement device is operable to measure the carrier envelope phase of the generated laser pulse and the feedback controller is operable to control the positioning element based on the measured carrier envelope phase to vary the separation distance of the pulse modification elements and control the carrier envelope phase of laser pulses generated by the laser source.

  15. Proton irradiation on materials

    NASA Technical Reports Server (NTRS)

    Chang, C. Ken

    1993-01-01

    A computer code is developed by utilizing a radiation transport code developed at NASA Langley Research Center to study proton radiation effects on materials which have potential application in NASA's future space missions. The code covers proton energies from 0.01 MeV to 100 GeV and is sufficient for energetic protons encountered in both low earth and geosynchronous orbits. With some modification, the code can be extended to particles heavier than the proton as the radiation source. The code is capable of calculating the range, stopping power, exit energy, energy deposition coefficients, dose, and cumulative dose along the path of the proton in a target material. The target material can be any combination of the elements with atomic number ranging from 1 to 92, or any compound with known chemical composition. The generated cross section for a material is stored and is reused in the future to save computer time. This information can be utilized to calculate the proton dose a material would receive in an orbit when the radiation environment is known. It can also be used to determine, in the laboratory, parameters such as the beam current of protons and the irradiation time to attain the desired dosage for accelerated ground testing of any material. It is hoped that the present work will be extended to include polymeric and composite materials, which are prime candidates for use in coatings, electronic components, and structures. It is also desirable to determine, for ground testing these materials, the laboratory parameters needed to simulate the dose they would receive in space environments. A sample print-out for water subject to 1.5 MeV protons is included as a reference.
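    The range calculation described above amounts to integrating the reciprocal of the stopping power over energy (the continuous-slowing-down approximation). A toy sketch with a made-up power-law stopping power standing in for the transport code's actual cross-section data:

```python
def csda_range(e0, stopping_power, n=10000):
    """Continuous-slowing-down range: R = integral from 0 to e0 of dE / S(E),
    evaluated with the composite trapezoidal rule. The E = 0 endpoint is
    skipped in case S(E) diverges there (1/S -> 0 for this toy model)."""
    de = e0 / n
    total = 0.0
    for i in range(n):
        e_lo, e_hi = i * de, (i + 1) * de
        f_lo = 1.0 / stopping_power(e_lo) if e_lo > 0 else 0.0
        f_hi = 1.0 / stopping_power(e_hi)
        total += 0.5 * (f_lo + f_hi) * de
    return total

# Toy Bragg-Kleeman-like stopping power S(E) = k * E**-0.8, so the exact
# answer is R = e0**1.8 / (1.8 * k) -- useful for checking the integrator.
k = 100.0
r = csda_range(10.0, lambda e: k * e ** -0.8)
```

Exit energy and dose along the path come from the same integrand: stop the integration at the target thickness instead of at E = 0.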

  16. Benchmarking kinetic calculations of resistive wall mode stability

    NASA Astrophysics Data System (ADS)

    Berkery, J. W.; Liu, Y. Q.; Wang, Z. R.; Sabbagh, S. A.; Logan, N. C.; Park, J.-K.; Manickam, J.; Betti, R.

    2014-05-01

    Validating the calculations of kinetic resistive wall mode (RWM) stability is important for confidently predicting RWM stable operating regions in ITER and other high performance tokamaks for disruption avoidance. Benchmarking the calculations of the Magnetohydrodynamic Resistive Spectrum-Kinetic (MARS-K) [Y. Liu et al., Phys. Plasmas 15, 112503 (2008)], Modification to Ideal Stability by Kinetic effects (MISK) [B. Hu et al., Phys. Plasmas 12, 057301 (2005)], and Perturbed Equilibrium Nonambipolar Transport (PENT) [N. Logan et al., Phys. Plasmas 20, 122507 (2013)] codes for two Solov'ev analytical equilibria and a projected ITER equilibrium has demonstrated good agreement between the codes. The important particle frequencies, the frequency resonance energy integral in which they are used, the marginally stable eigenfunctions, perturbed Lagrangians, and fluid growth rates are all generally consistent between the codes. The most important kinetic effect at low rotation is the resonance between the mode rotation and the trapped thermal particles' precession drift, and MARS-K, MISK, and PENT show good agreement in this term. The different ways the rational surface contribution was treated historically in the codes is identified as a source of disagreement in the bounce and transit resonance terms at higher plasma rotation. Calculations from all of the codes support the present understanding that RWM stability can be increased by kinetic effects at low rotation through precession drift resonance and at high rotation by bounce and transit resonances, while intermediate rotation can remain susceptible to instability. The applicability of benchmarked kinetic stability calculations to experimental results is demonstrated by the prediction of MISK calculations of near marginal growth rates for experimental marginal stability points from the National Spherical Torus Experiment (NSTX) [M. Ono et al., Nucl. Fusion 40, 557 (2000)].

  17. [Genotoxic modification of nucleic acid bases and biological consequences of it. Review and prospects of experimental and computational investigations]

    NASA Technical Reports Server (NTRS)

    Poltev, V. I.; Bruskov, V. I.; Shuliupina, N. V.; Rein, R.; Shibata, M.; Ornstein, R.; Miller, J.

    1993-01-01

    The review is presented of experimental and computational data on the influence of genotoxic modification of bases (deamination, alkylation, oxidation) on the structure and biological functioning of nucleic acids. Pathways are discussed for the influence of modification on coding properties of bases, on possible errors of nucleic acid biosynthesis, and on configurations of nucleotide mispairs. The atomic structure of nucleic acid fragments with modified bases and the role of base damages in mutagenesis and carcinogenesis are considered.

  18. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open source code accessible to the public through the Google Open Source Projects Hosting service.
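    The two-level structure described in this abstract can be sketched in pure Python. This is a hypothetical illustration only: the class and method names (DriftKernel, Lattice, track_bunch) are invented for the sketch and are not the actual PyORBIT API, and the real lower level is compiled C++ rather than Python.

    ```python
    class DriftKernel:
        """Stand-in for a lower-level tracking kernel (C++ in the real code)."""
        def __init__(self, length):
            self.length = length  # drift length in meters

        def track(self, particle):
            x, xp = particle  # transverse position and angle
            return (x + self.length * xp, xp)

    class Lattice:
        """Upper-level (Python) container that sequences lower-level kernels."""
        def __init__(self, nodes):
            self.nodes = nodes

        def track_bunch(self, bunch):
            for node in self.nodes:
                bunch = [node.track(p) for p in bunch]
            return bunch

    # Python orchestrates; the kernels do the per-particle arithmetic.
    lattice = Lattice([DriftKernel(0.5), DriftKernel(1.5)])
    bunch = [(0.0, 1.0e-3), (1.0e-3, 0.0)]
    print(lattice.track_bunch(bunch))  # total drift of 2.0 m
    ```

    In the real code the scripting layer would additionally distribute the bunch across MPI ranks; here the point is only the division of labor between the two levels.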

  19. The SENSEI Generic In Situ Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayachit, Utkarsh; Whitlock, Brad; Wolf, Matthew

    The SENSEI generic in situ interface is an API that promotes code portability and reusability. From the simulation view, a developer can instrument their code with the SENSEI API and then make use of any number of in situ infrastructures. From the method view, a developer can write an in situ method using the SENSEI API, then expect it to run in any number of in situ infrastructures, or be invoked directly from a simulation code, with little or no modification. This paper presents the design principles underlying the SENSEI generic interface, along with some simplified coding examples.
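    The "write once, run in any infrastructure" idea can be sketched with a minimal adaptor pattern. This is a generic illustration of the concept, not the SENSEI API: the names AnalysisAdaptor, PrintMinMax, and simulate are invented for the sketch.

    ```python
    class AnalysisAdaptor:
        """Generic in situ interface: any backend implements execute()."""
        def execute(self, step, time, mesh):
            raise NotImplementedError

    class PrintMinMax(AnalysisAdaptor):
        """One interchangeable backend: report field extrema per step."""
        def execute(self, step, time, mesh):
            return (step, min(mesh), max(mesh))

    def simulate(n_steps, adaptor):
        """The simulation is instrumented once, against the generic API."""
        results = []
        mesh = [0.0, 1.0, 2.0]
        for step in range(n_steps):
            mesh = [v + 1.0 for v in mesh]  # pretend physics update
            results.append(adaptor.execute(step, step * 0.1, mesh))
        return results

    print(simulate(2, PrintMinMax()))
    ```

    Swapping in a different subclass of the adaptor changes the in situ method without touching the simulation loop, which is the portability claim of the abstract in miniature.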

  20. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  1. The effect of habitat modification on plant-pollinator network

    NASA Astrophysics Data System (ADS)

    Aminatun, Tien; Putra, Nugroho Susetya

    2017-08-01

    The research aimed to determine: (1) the mutualism interaction pattern of plant-pollinator networks under several habitat modifications; and (2) the habitat modification showing the most stable pattern of interaction. The study was conducted in one planting season with 20 plots; each plot was 2x2 m2 with 2 m spacing between plots, and each plot was planted with the same variety of tomato plant, i.e. "intan". Nitrogen manipulation was conducted with four kinds of fertilizers, i.e. NPK (code PU), compost (code PKM), vermicompost (code PC), and manure (code PK). Each treatment had 5 plot replications. We observed the growth of tomato plants and the weed and arthropod populations every two weeks, and pollinator visitation twice a week during tomato flowering, counting the population and visitation frequency of each pollinator on each sample of tomato plants. The nectar of the tomato flowers of each treatment was tested in the laboratory for reducing sugar and sucrose. Organic matter and nitrogen in the soil samples of each treatment were tested in the laboratory at the beginning and the end of the research. We analyzed the plant-pollinator network with the bipartite package in R-statistics, and the abiotic and other biotic factors with descriptive analysis. The results of the research were: (1) the mutualism interaction patterns of the plant-pollinator networks of the four treatments varied, and (2) the plant-pollinator network of the NPK fertilizer treatment showed the most stable interaction based on analysis of interaction evenness, Shannon diversity, and frequency and longevity of pollinator visitation.

  2. Impacts of DNAPL Source Treatment: Experimental and Modeling Assessment of the Benefits of Partial DNAPL Source Removal

    DTIC Science & Technology

    2009-09-01

    nuclear industry for conducting performance assessment calculations. The analytical FORTRAN code for the DNAPL source function, REMChlor, was...project. The first was to apply existing deterministic codes , such as T2VOC and UTCHEM, to the DNAPL source zone to simulate the remediation processes...but describe the spatial variability of source zones unlike one-dimensional flow and transport codes that assume homogeneity. The Lagrangian models

  3. National Combustion Code Parallel Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Benyo, Theresa (Technical Monitor)

    2002-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.

  4. Total x-ray power measurements in the Sandia LIGA program.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malinowski, Michael E.; Ting, Aili

    2005-08-01

    Total X-ray power measurements using aluminum block calorimetry and other techniques were made at LIGA X-ray scanner synchrotron beamlines located at both the Advanced Light Source (ALS) and the Advanced Photon Source (APS). This block calorimetry work was initially performed on the LIGA beamline 3.3.1 of the ALS to provide experimental checks of predictions of the LEX-D (LIGA Exposure-Development) code for LIGA X-ray exposures, version 7.56, the version of the code in use at the time calorimetry was done. These experiments showed that it was necessary to use bend magnet field strengths and electron storage ring energies different from the default values originally in the code in order to obtain good agreement between experiment and theory. The results indicated that agreement between LEX-D predictions and experiment could be as good as 5% only if (1) more accurate values of the ring energies, (2) local values of the magnet field at the beamline source point, and (3) the NIST database for X-ray/materials interactions were used as code inputs. These local magnetic field values and accurate ring energies, together with the NIST database, are now defaults in the newest release of LEX-D, version 7.61. Three-dimensional simulations of the temperature distributions in the aluminum calorimeter block for a typical ALS power measurement were made with the ABAQUS code and found to be in good agreement with the experimental temperature data. As an application of the block calorimetry technique, the X-ray power exiting the mirror in place at a LIGA scanner located at the APS beamline 10 BM was measured with a calorimeter similar to the one used at the ALS. The overall results at the APS demonstrated the utility of calorimetry in helping to characterize the total X-ray power in LIGA beamlines.
In addition to the block calorimetry work at the ALS and APS, a preliminary comparison of the use of heat flux sensors, photodiodes and modified beam calorimeters as total X-ray power monitors was made at the ALS, beamline 3.3.1. This work showed that a modification of a commercially available heat flux sensor could result in a simple, direct-reading beam power meter that could be useful for monitoring total X-ray power in Sandia's LIGA exposure stations at the ALS, APS and Stanford Synchrotron Radiation Laboratory (SSRL).

  5. Developing and Modifying Behavioral Coding Schemes in Pediatric Psychology: A Practical Guide

    PubMed Central

    McMurtry, C. Meghan; Chambers, Christine T.; Bakeman, Roger

    2015-01-01

    Objectives To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. Methods This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. Results A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Conclusions Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. PMID:25416837

  6. RETRAN analysis of multiple steam generator blow down caused by an auxiliary feedwater steam-line break

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.

    1987-01-01

    Analysis results for multiple steam generator blow down caused by an auxiliary feedwater steam-line break performed with the RETRAN-02 MOD 003 computer code are presented to demonstrate the capabilities of the RETRAN code to predict system transient response for verifying changes in operational procedures and supporting plant equipment modifications. A typical four-loop Westinghouse pressurized water reactor was modeled using best-estimate versus worst case licensing assumptions. This paper presents analyses performed to evaluate the necessity of implementing an auxiliary feedwater steam-line isolation modification. RETRAN transient analysis can be used to determine core cooling capability response, departure from nucleate boiling ratio (DNBR) status, and reactor trip signal actuation times.

  7. The development, evolution, and modifications of ICD-10: challenges to the international comparability of morbidity data.

    PubMed

    Jetté, Nathalie; Quan, Hude; Hemmelgarn, Brenda; Drosler, Saskia; Maass, Christina; Moskal, Lori; Paoin, Wansa; Sundararajan, Vijaya; Gao, Song; Jakob, Robert; Ustün, Bedihran; Ghali, William A

    2010-12-01

    The United States is about to make a major nationwide transition from ICD-9-CM coding of hospital discharges to ICD-10-CM, a country-specific modification of the World Health Organization's ICD-10. As this transition occurs, the WHO is already in the midst of developing ICD-11. Given this context, we undertook this review to discuss: (1) the history of the International Classification of Diseases (a core information "building block" for health systems everywhere) from its introduction to the current era of ICD-11 development; (2) differences across country-specific ICD-10 clinical modifications and the challenges that these differences pose to the international comparability of morbidity data; (3) potential strategic approaches to achieving better international ICD-11 comparability. A literature review and stakeholder consultation were carried out. The various ICD-10 clinical modifications (ICD-10-AM [Australia], ICD-10-CA [Canada], ICD-10-GM [Germany], ICD-10-TM [Thailand], ICD-10-CM [United States]) were compared. These ICD-10 modifications differ in their number of codes, chapters, and subcategories. Specific conditions are present in some but not all of the modifications. ICD-11, with a similar structure to ICD-10, will function in an electronic health records environment and also provide disease descriptive characteristics (eg, causal properties, functional impact, and treatment). The threat to the comparability of international clinical morbidity data is growing with the development of many country-specific ICD-10 versions. One solution to this threat is to develop a meta-database including all country-specific modifications to ensure more efficient use of people and resources and decrease omissions and errors, but most importantly to provide a platform for future ICD updates.

  8. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343

  9. Recent optimization of the beam-optical characteristics of the 6 MV van de Graaff accelerator for high brightness beams at the iThemba LABS NMP facility

    NASA Astrophysics Data System (ADS)

    Conradie, J. L.; Eisa, M. E. M.; Celliers, P. J.; Delsink, J. L. G.; Fourie, D. T.; de Villiers, J. G.; Maine, P. M.; Springhorn, K. A.; Pineda-Vargas, C. A.

    2005-04-01

    With the aim of improving the reliability and stability of the beams delivered to the nuclear microprobe at iThemba LABS, as well as optimization of the beam characteristics along the van de Graaff accelerator beamlines in general, relevant modifications were implemented since the beginning of 2003. The design and layout of the beamlines were revised. The beam-optical characteristics through the accelerator, from the ion source up to the analysing magnet directly after the accelerator, were calculated and the design optimised, using the computer codes TRANSPORT, IGUN and TOSCA. The ion source characteristics and optimal operating conditions were determined on an ion source test bench. The measured optimal emittance for 90% of the beam intensity was about 50π mm mrad for an extraction voltage of 6 kV. These changes allow operation of the nuclear microprobe at proton energies in the range 1 MeV-4 MeV with beam intensities of tenths of a pA at the target surface. The capabilities of the nuclear microprobe facility were evaluated in the improved beamline, with particular emphasis on bio-medical samples.

  10. Improvements to Busquet's Non LTE algorithm in NRL's Hydro code

    NASA Astrophysics Data System (ADS)

    Klapisch, M.; Colombant, D.

    1996-11-01

    Implementation of the Non LTE model RADIOM (M. Busquet, Phys. Fluids B 5, 4191 (1993)) in NRL's RAD2D hydro code in conservative form was reported previously (M. Klapisch et al., Bull. Am. Phys. Soc. 40, 1806 (1995)). While the results were satisfactory, the algorithm was slow and did not always converge. We describe here modifications that address these two shortcomings. The method is now quicker and more stable than the original. It also gives information about the validity of the fitting. It turns out that the number and distribution of groups in the multigroup diffusion opacity tables - a basis for the computation of radiation effects on the ionization balance in RADIOM - have a large influence on the robustness of the algorithm. These modifications give insight into the algorithm, and allow checking that the obtained average charge state is the true average. In addition, code optimization greatly reduced the computing time: the ratio of Non LTE to LTE computing time is now between 1.5 and 2.

  11. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.

  12. shiftNMFk 1.1: Robust Nonnegative matrix factorization with kmeans clustering and signal shift, for allocation of unknown physical sources, toy version for open sourcing with publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Boian S.; Lliev, Filip L.; Stanev, Valentin G.

    This code is a toy (short) version of CODE-2016-83. From a general perspective, the code represents an unsupervised adaptive machine learning algorithm that allows efficient and high performance de-mixing and feature extraction of a multitude of non-negative signals mixed and recorded by a network of uncorrelated sensor arrays. The code identifies the number of the mixed original signals and their locations. Further, the code also allows deciphering of signals that have been delayed in regards to the mixing process in each sensor. This code is highly customizable and can be efficiently used for fast macro-analyses of data. The code is applicable to a plethora of distinct problems: chemical decomposition, pressure transient decomposition, unknown sources/signal allocation, EM signal decomposition. An additional procedure for allocation of the unknown sources is incorporated in the code.

  13. BSPS Program (ESI-Mass Spectrometry) Biological Sample Data Analysis; Disruption of Bacteria Spores

    DTIC Science & Technology

    2005-10-01

    the original usage of the translational as a broad description of the entire process by which the polymer of the three-letter code in the mRNA is...translated. There is an extensive review of post-translational modifications of proteins by Finn Wold (1981)24, given as in vivo chemical modifications... thiolation, biotin, bromination, carbamylation, deamidation, methylation, glucosylation, lipoyl, phosphorylation, pyridoxal phosphate

  14. The InSAR Scientific Computing Environment

    NASA Technical Reports Server (NTRS)

    Rosen, Paul A.; Gurrola, Eric; Sacco, Gian Franco; Zebker, Howard

    2012-01-01

    We have developed a flexible and extensible Interferometric SAR (InSAR) Scientific Computing Environment (ISCE) for geodetic image processing. ISCE was designed from the ground up as a geophysics community tool for generating stacks of interferograms that lend themselves to various forms of time-series analysis, with attention paid to accuracy, extensibility, and modularity. The framework is Python-based, with code elements rigorously componentized by separating input/output operations from the processing engines. This allows greater flexibility and extensibility in the data models, and creates algorithmic code that is less susceptible to unnecessary modification when new data types and sensors are available. In addition, the components support provenance and checkpointing to facilitate reprocessing and algorithm exploration. The algorithms, based on legacy processing codes, have been adapted to assume a common reference track approach for all images acquired from nearby orbits, simplifying and systematizing the geometry for time-series analysis. The framework is designed to easily allow user contributions, and is distributed for free use by researchers. ISCE can process data from the ALOS, ERS, EnviSAT, Cosmo-SkyMed, RadarSAT-1, RadarSAT-2, and TerraSAR-X platforms, starting from Level-0 or Level 1 as provided from the data source, and going as far as Level 3 geocoded deformation products. With its flexible design, it can be extended with raw/meta data parsers to enable it to work with radar data from other platforms.

  15. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    NASA Astrophysics Data System (ADS)

    Guillemot, Christine; Siohan, Pierre

    2005-12-01

    Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.
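    The "random segmentation" difficulty this survey addresses can be made concrete with a toy example of my own (not taken from the paper): with a variable-length prefix code, a single bit error can shift every subsequent symbol boundary, which is why soft-input joint decoding is attractive.

    ```python
    # A small prefix-free variable-length code and a greedy decoder.
    code = {"a": "0", "b": "10", "c": "11"}
    decode_table = {v: k for k, v in code.items()}

    def vlc_decode(bits):
        """Greedy decoding; valid because the code is prefix-free."""
        symbols, buf = [], ""
        for bit in bits:
            buf += bit
            if buf in decode_table:
                symbols.append(decode_table[buf])
                buf = ""
        return symbols

    clean = "".join(code[s] for s in "abcab")  # '01011010'
    corrupted = "1" + clean[1:]                # flip the first bit
    # The corrupted stream still parses, but into different symbols:
    print(vlc_decode(clean))      # ['a', 'b', 'c', 'a', 'b']
    print(vlc_decode(corrupted))  # ['c', 'a', 'c', 'a', 'b']
    ```

    Both bitstreams decode without any detectable failure, yet the symbol boundaries after the error have all moved; exploiting soft channel information and source statistics, as the surveyed JSCD techniques do, is one way to recover from this.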

  16. [The new Colombian criminal code and biotechnology].

    PubMed

    González de Cancino, Emilssen

    2002-01-01

    The author describes the process by which new offenses concerning biotechnology have been included in Colombia's Penal Code and discusses some of the more controversial aspects involved. She examines the various stages of the passage of the Bill through Parliament and the modifications undergone. She also provides well-argued criticism of the text, with appropriate reference to Constitutional provisions regarding the rights concerned.

  17. Procedures for the computation of unsteady transonic flows including viscous effects

    NASA Technical Reports Server (NTRS)

    Rizzetta, D. P.

    1982-01-01

    Modifications of the code LTRAN2, developed by Ballhaus and Goorjian, which account for viscous effects in the computation of planar unsteady transonic flows are presented. Two models are considered and their theoretical development and numerical implementation is discussed. Computational examples employing both models are compared with inviscid solutions and with experimental data. Use of the modified code is described.

  18. The MCNP-DSP code for calculations of time and frequency analysis parameters for subcritical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valentine, T.E.; Mihalczo, J.T.

    1995-12-31

    This paper describes a modified version of the MCNP code, the MCNP-DSP. Variance reduction features were disabled to have strictly analog particle tracking in order to follow fluctuating processes more accurately. Some of the neutron and photon physics routines were modified to better represent the production of particles. Other modifications are discussed.

  19. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...

  20. 17 CFR 269.7 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710, Apr... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...

  1. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. 
For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  2. Multidimensional incremental parsing for universal source coding.

    PubMed

    Bae, Soo Hyun; Juang, Biing-Hwang

    2008-10-01

    A multidimensional incremental parsing algorithm (MDIP) for multidimensional discrete sources, a generalization of the Lempel-Ziv coding algorithm, is investigated. It consists of three essential component schemes: maximum decimation matching, a hierarchical structure for multidimensional source coding, and dictionary augmentation. As a counterpart of the longest-match search in the Lempel-Ziv algorithm, two classes of maximum decimation matching are studied. The underlying behavior of the dictionary augmentation scheme for estimating the source statistics is also examined. For an m-dimensional source, m augmentative patches are appended to the dictionary at each coding epoch, which would require transmitting a substantial amount of information to the decoder. The hierarchical structure of the source coding algorithm resolves this issue by successively incorporating lower-dimensional coding procedures in the scheme. With regard to universal lossy source coders, we propose two distortion functions: the local average distortion and the local minimax distortion with a set of threshold levels for each source symbol. For performance evaluation, we implemented three image compression algorithms based on the MDIP, one lossless and two lossy. The lossless image compression algorithm does not outperform Lempel-Ziv-Welch coding, but experimentally shows efficiency in capturing the source structure. The two lossy image compression algorithms are implemented using the two distortion functions, respectively. The algorithm based on the local average distortion is efficient at minimizing signal distortion, while the images produced with the local minimax distortion show good perceptual fidelity relative to other compression algorithms. Our insights inspire future research on feature extraction of multidimensional discrete sources.
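As background for the generalization this record describes, here is a minimal sketch of the one-dimensional Lempel-Ziv (LZ78-style) incremental parse that MDIP extends: the longest-match search and the dictionary-augmentation step are exactly the two operations the paper lifts to m dimensions. This is illustrative code, not the paper's algorithm.

```python
def lz78_parse(s):
    """Incremental (LZ78) parse: emit (phrase_index, next_symbol) pairs."""
    dictionary = {"": 0}          # phrase -> index; starts with the empty phrase
    out, phrase = [], ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch          # keep extending the longest match
        else:
            out.append((dictionary[phrase], ch))       # emit match + innovation
            dictionary[phrase + ch] = len(dictionary)  # augment the dictionary
            phrase = ""
    if phrase:
        out.append((dictionary[phrase], None))         # flush a trailing match
    return out

print(lz78_parse("abababab"))
```

In MDIP the linear phrase becomes a multidimensional patch, the longest-match search becomes maximum decimation matching, and m augmentative patches (rather than one extended phrase) are appended per coding epoch.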

  3. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete, independent and identically distributed source with a dominant symbol. This combined strategy, alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary Huffman coding in certain circumstances.
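One way to picture the alternating structure: runs of the dominant symbol alternate with the remaining symbols, giving two token streams that would each receive its own Huffman code. The sketch below shows only a plausible tokenization step, with the two Huffman stages omitted; it is an illustration of the idea, not the paper's exact scheme.

```python
def runs_of_dominant(seq, dominant):
    """Split a sequence into alternating tokens: run lengths of the dominant
    symbol, and the non-dominant symbols that separate the runs."""
    tokens, run = [], 0
    for s in seq:
        if s == dominant:
            run += 1
        else:
            tokens.append(("run", run))   # run may be 0 (adjacent non-dominant symbols)
            tokens.append(("sym", s))
            run = 0
    tokens.append(("run", run))           # trailing run of the dominant symbol
    return tokens

print(runs_of_dominant("aaabaaac", "a"))
```

When one symbol dominates, the run-length stream concentrates probability mass on long runs, which a Huffman code exploits better than coding the dominant symbol one occurrence at a time.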

  4. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  5. Recent advances in coding theory for near error-free communications

    NASA Technical Reports Server (NTRS)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.

  6. Assessment of polarization effect on aerosol retrievals from MODIS

    NASA Astrophysics Data System (ADS)

    Korkin, S.; Lyapustin, A.

    2010-12-01

    Light polarization affects the total intensity of scattered radiation. In this work, we compare aerosol retrievals performed by the code MAIAC [1] with and without taking polarization into account. The MAIAC retrievals are based on look-up tables (LUTs). For this work, MAIAC was run using two different LUTs, the first generated with the scalar code SHARM [2] and the second with the vector code Modified Vector Discrete Ordinates Method (MVDOM). MVDOM is a new code suitable for computations with highly anisotropic phase functions, including cirrus clouds and snow [3]. To this end, the solution of the vector radiative transfer equation (VRTE) is represented as a sum of anisotropic and regular components. The anisotropic component is evaluated in the Small Angle Modification of the Spherical Harmonics Method (MSH) [4]. The MSH is formulated in the frame of reference of the solar beam, where the z-axis lies along the solar beam direction. In this case, the MSH solution for the anisotropic part is nearly symmetric in azimuth and is computed analytically. In the scalar case, this solution coincides with the Goudsmit-Saunderson small-angle approximation [5]. To correct for the analytical separation of the anisotropic part of the signal, the transfer equation for the regular part contains a correction source function term [6]. Several examples of polarization impact on aerosol retrievals over different surface types will be presented. 1. Lyapustin A., Wang Y., Laszlo I., Kahn R., Korkin S., Remer L., Levy R., and Reid J. S. Multi-Angle Implementation of Atmospheric Correction (MAIAC): Part 2. Aerosol Algorithm. J. Geophys. Res., submitted (2010). 2. Lyapustin A., Muldashev T., Wang Y. Code SHARM: fast and accurate radiative transfer over spatially variable anisotropic surfaces. In: Light Scattering Reviews 5. Chichester: Springer, 205-247 (2010). 3. Budak, V.P., Korkin S.V. 
On the solution of a vectorial radiative transfer equation in an arbitrary three-dimensional turbid medium with anisotropic scattering. JQSRT, 109, 220-234 (2008). 4. Budak V.P., Sarmin S.E. Solution of radiative transfer equation by the method of spherical harmonics in the small angle modification. Atmospheric and Oceanic Optics, 3, 898-903 (1990). 5. Goudsmit S., Saunderson J.L. Multiple scattering of electrons. Phys. Rev., 57, 24-29 (1940). 6. Budak V.P, Klyuykov D.A., Korkin S.V. Convergence acceleration of radiative transfer equation solution at strongly anisotropic scattering. In: Light Scattering Reviews 5. Chichester: Springer, 147 - 204 (2010).

  7. Proposed modifications to the RCRA post-closure permit for the Upper East Fork Poplar Creek Hydrogeologic Regime at the U.S. Department of Energy Y-12 Plant, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-05-01

    This report presents proposed modifications to the Resource Conservation and Recovery Act (RCRA) Post-Closure Permit (PCP) for the Upper East Fork Poplar Creek Hydrogeologic Regime (permit number TNHW-088, EPA ID No. TN3 89 009 0001). The modifications are proposed to: (1) revise the current text for two of the Permit Conditions included in Permit Section II - General Facility Conditions, and (2) update the PCP with revised versions of the Y-12 Plant Groundwater Protection Program (GWPP) technical field procedures included in several of the Permit Attachments. The updated field procedures and editorial revisions are Class 1 permit modifications, as specified in Title 40, Code of Federal Regulations (CFR) §270.42, Appendix I - Classification of Permit Modifications. These modifications are summarized below.

  8. Histone modification: cause or cog?

    PubMed

    Henikoff, Steven; Shilatifard, Ali

    2011-10-01

    Histone modifications are key components of chromatin packaging but whether they constitute a 'code' has been contested. We believe that the central issue is causality: are histone modifications responsible for differences between chromatin states, or are differences in modifications mostly consequences of dynamic processes, such as transcription and nucleosome remodeling? We find that inferences of causality are often based on correlation and that patterns of some key histone modifications are more easily explained as consequences of nucleosome disruption in the presence of histone modifying enzymes. We suggest that the 35-year-old DNA accessibility paradigm provides a mechanistically sound basis for understanding the role of nucleosomes in gene regulation and epigenetic inheritance. Based on this view, histone modifications and variants contribute to diversification of a chromatin landscape shaped by dynamic processes that are driven primarily by transcription and nucleosome remodeling. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Hybrid concatenated codes and iterative decoding

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Pollara, Fabrizio (Inventor)

    2000-01-01

    Several improved turbo code apparatuses and methods are disclosed. The invention encompasses several classes: (1) A data source is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each encoder outputs a code element which may be transmitted or stored. A parallel decoder provides the ability to decode the code elements to derive the original source information d without use of a received data signal corresponding to d. The output may be coupled to a multilevel trellis-coded modulator (TCM). (2) A data source d is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each of the encoders outputs a code element. In addition, the original data source d is output from the encoder. All of the output elements are coupled to a TCM. (3) At least two data sources are applied to two or more encoders with an interleaver between each source and each of the second and subsequent encoders. The output may be coupled to a TCM. (4) At least two data sources are applied to two or more encoders with at least two interleavers between each source and each of the second and subsequent encoders. (5) At least one data source is applied to one or more serially linked encoders through at least one interleaver. The output may be coupled to a TCM. The invention includes a novel way of terminating a turbo coder.
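The first class above, a source feeding one encoder directly and a second encoder through an interleaver, can be sketched with toy constituent encoders. A running-XOR accumulator stands in for the recursive systematic convolutional encoders used in real turbo codes; everything here is illustrative and not taken from the patent.

```python
def accumulator(bits):
    """Toy rate-1 recursive encoder (running XOR), a stand-in for a
    constituent convolutional encoder."""
    acc, out = 0, []
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def parallel_concat_encode(data, perm):
    """Parallel concatenation: the source feeds one encoder directly and a
    second encoder through an interleaver; each outputs a code element."""
    interleaved = [data[i] for i in perm]
    return accumulator(data), accumulator(interleaved)

data = [1, 0, 1, 1]
perm = [2, 0, 3, 1]          # a fixed toy interleaver permutation
c1, c2 = parallel_concat_encode(data, perm)
print(c1, c2)
```

The interleaver is what makes the two code elements nearly independent views of the same data, which is the property the iterative (turbo) decoder exploits.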

  10. Geant4 Modifications for Accurate Fission Simulations

    NASA Astrophysics Data System (ADS)

    Tan, Jiawei; Bendahan, Joseph

    Monte Carlo is one of the methods used to simulate the generation and transport of radiation through matter. The most widely used radiation simulation codes are MCNP and Geant4. The simulation of fission production and transport by MCNP has been thoroughly benchmarked. An increasing number of users prefer Geant4 because of the flexibility of adding features. However, it has been found that Geant4 does not have the proper fission-production cross sections and does not produce the correct fission products. To achieve accurate results for studies in fissionable material applications, Geant4 was modified to correct these inaccuracies and to add new capabilities. The fission model developed by Lawrence Livermore National Laboratory was integrated into the neutron-fission modeling package. The photofission simulation capability was enabled using the same neutron-fission library under the assumption that nuclei fission in the same way, independent of the excitation source. The modified fission code provides the correct multiplicity of prompt neutrons and gamma rays, and produces delayed gamma rays and neutrons with time and energy dependencies that are consistent with ENDF/B-VII. The delayed neutrons are now directly produced by a custom package that bypasses the fragment cascade model. The modifications were made for the U-235, U-238 and Pu-239 isotopes; however, the new framework allows new isotopes to be added easily. The SLAC nuclear data library is used for simulation of isotopes with an atomic number above 92, for which data are not available in Geant4. Results from the modified Geant4.10.1 package for neutron-fission and photofission prompt and delayed radiation are compared with ENDF/B-VII and with results produced with the original package.

  11. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    DOEpatents

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
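The switching mechanism described above can be sketched at the source level. This is a hypothetical guard illustrating the run-aggressive-then-roll-back idea, not the patented compiler machinery, which operates on compiled code versions with predictive failure detection.

```python
def make_guarded(aggressive, conservative):
    """Run the aggressively optimized version; on failure, fall back to
    (re-execute) the conservatively compiled version."""
    def guarded(*args):
        try:
            return aggressive(*args)
        except Exception:
            # the unsafe optimization introduced a new source of exceptions,
            # so roll back to the safe version
            return conservative(*args)
    return guarded

# Toy example: an "optimized" reciprocal that skips the zero check.
fast = lambda x: 1.0 / x                       # may raise ZeroDivisionError
safe = lambda x: float("inf") if x == 0 else 1.0 / x
recip = make_guarded(fast, safe)
print(recip(4), recip(0))
```

The patent's predictive mechanisms would additionally learn when the aggressive version is likely to fail and dispatch straight to the conservative one.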

  12. Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.

    PubMed

    Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger

    2015-01-01

    To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zehtabian, M; Zaker, N; Sina, S

    2015-06-15

    Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP code in the dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters, i.e., dose rate constant, radial dose function, and anisotropy function, of different brachytherapy sources (Pd-103, I-125, Ir-192, and Cs-137) were calculated in a water phantom. The results obtained by three versions of the Monte Carlo code (MCNP4C, MCNPX, MCNP5) were compared for low- and high-energy brachytherapy sources. The cross-section library of the MCNP4C code was then changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the revised MCNP4C code were compared with the other codes. Results: The results of these investigations indicate that for high-energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However, for low-energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at a distance of 6 cm were found to be about 17% and 28% for I-125 and Pd-103, respectively. The results obtained with the revised MCNP4C and MCNPX were similar; however, the maximum difference between the results obtained with the MCNP5 and revised MCNP4C codes was 2% at 6 cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low-energy brachytherapy sources can cause large errors in the results. It is therefore recommended not to use this code for low-energy sources unless its cross-section library is changed. Since the results obtained with the revised MCNP4C and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX lies in their cross-section libraries.

  14. Empirical source strength correlations for rans-based acoustic analogy methods

    NASA Astrophysics Data System (ADS)

    Kube-McDowell, Matthew Tyndall

    JeNo is a jet noise prediction code based on an acoustic analogy method developed by Mani, Gliebe, Balsa, and Khavaran. Using flow predictions from a standard Reynolds-averaged Navier-Stokes computational fluid dynamics solver, JeNo predicts the overall sound pressure level and angular spectra for high-speed hot jets over a range of observer angles, with a processing time suitable for rapid design purposes. JeNo models the noise from hot jets as a combination of two types of noise sources: quadrupole sources dependent on velocity fluctuations, which represent the major noise of turbulent mixing, and dipole sources dependent on enthalpy fluctuations, which represent the effects of thermal variation. These two sources are modeled by JeNo as propagating independently into the far field, with no cross-correlation at the observer location. However, high-fidelity computational fluid dynamics solutions demonstrate that this assumption is false. In this thesis, the theory, assumptions, and limitations of the JeNo code are briefly discussed, and a modification to the acoustic analogy method is proposed in which the cross-correlation of the two primary noise sources is allowed to vary with the speed of the jet and the observer location. As a proof-of-concept implementation, an empirical correlation correction function is derived from comparisons between JeNo's noise predictions and a set of experimental measurements taken for the Air Force Aero-Propulsion Laboratory. The empirical correlation correction is then applied to JeNo's predictions for a separate data set of hot jets tested at NASA's Glenn Research Center. Metrics are derived to measure the qualitative and quantitative performance of JeNo's acoustic predictions, and the empirical correction is shown to provide a quantitative improvement in the noise prediction at low observer angles with no freestream flow, and a qualitative improvement in the presence of freestream flow. 
However, the results also demonstrate that there are underlying flaws in JeNo's ability to predict the behavior of a hot jet's acoustic signature at certain rear observer angles, and that this correlation correction is not able to correct these flaws.

  15. Survey Of Lossless Image Coding Techniques

    NASA Astrophysics Data System (ADS)

    Melnychuck, Paul W.; Rabbani, Majid

    1989-04-01

    Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit plane processing, and lossy plus residual coding. Generally speaking, the compression ratios offered by these techniques are in the range of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure; their higher pel correlation thus allows greater removal of image redundancy.
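A minimal example of the predictive-coding family surveyed above: a previous-pel (DPCM-style) predictor whose residuals are exactly invertible, so the decoded row is numerically identical to the original. The entropy-coding stage (e.g. Markov-modeled arithmetic coding of the residuals) is omitted from this sketch.

```python
def predict_encode(pixels):
    """Previous-pel predictor: transmit residuals, which concentrate near
    zero for correlated images and so entropy-code more cheaply."""
    prev, residuals = 0, []
    for p in pixels:
        residuals.append(p - prev)
        prev = p
    return residuals

def predict_decode(residuals):
    """Exact inverse of predict_encode: the scheme is lossless."""
    prev, pixels = 0, []
    for r in residuals:
        prev += r
        pixels.append(prev)
    return pixels

row = [100, 101, 103, 103, 102]
res = predict_encode(row)
assert predict_decode(res) == row    # numerically identical reconstruction
print(res)
```

The higher the pel correlation, the smaller the residual magnitudes, which is exactly why the 12-bit radiological images in the survey compress at higher ratios.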

  16. 77 FR 50720 - Notice of Permit Modification Received Under the Antarctic Conservation Act of 1978

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ...The National Science Foundation (NSF) is required to publish a notice of requests to modify permits issued to conduct activities regulated under the Antarctic Conservation Act of 1978, Public Law 95-541. NSF has published regulations under the Antarctic Conservation Act at Title 45 Part 670 of the Code of Federal Regulations. This is the required notice of a requested permit modification.

  17. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    ERIC Educational Resources Information Center

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  18. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including the genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for identification of single and combinatorial histone modifications. MS has now surpassed antibody-based strategies due to its automation, high resolution, and accurate quantitation. Moreover, multiple analysis approaches have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the state-of-the-art workflow for histone analysis is highly reliable in terms of identification and quantitation accuracy, and that it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets. © 2017 Elsevier Inc. All rights reserved.

  19. Functional Equivalence Acceptance Testing of FUN3D for Entry Descent and Landing Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Wood, William A.; Kleb, William L.; Alter, Stephen J.; Glass, Christopher E.; Padilla, Jose F.; Hammond, Dana P.; White, Jeffery A.

    2013-01-01

    The functional equivalence of the unstructured grid code FUN3D to the structured grid code LAURA (Langley Aerothermodynamic Upwind Relaxation Algorithm) is documented for applications of interest to the Entry, Descent, and Landing (EDL) community. Examples from an existing suite of regression tests are used to demonstrate the functional equivalence, encompassing various thermochemical models and vehicle configurations. Algorithm modifications required for the node-based unstructured grid code (FUN3D) to reproduce the functionality of the cell-centered structured code (LAURA) are also documented. Challenges associated with computation on tetrahedral grids versus computation on structured-grid-derived hexahedral systems are discussed.

  20. Summary of Documentation for DYNA3D-ParaDyn's Software Quality Assurance Regression Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zywicz, Edward

    The Software Quality Assurance (SQA) regression test suite for DYNA3D (Zywicz and Lin, 2015) and ParaDyn (DeGroot, et al., 2015) currently contains approximately 600 problems divided into 21 suites, and is a required component of ParaDyn's SQA plan (Ferencz and Oliver, 2013). The regression suite allows developers to ensure that software modifications do not unintentionally alter the code response. The entire regression suite is run prior to permanently incorporating any software modification or addition. When code modifications alter test problem results, the specific cause must be determined and fully understood before the software changes and revised test answers can be incorporated. The regression suite is executed on LLNL platforms using a Python script and an associated data file. The user specifies the DYNA3D or ParaDyn executable, the number of processors to use, the test problems to run, and other options to the script. The data file details how each problem and its answer extraction scripts are executed. For each problem in the regression suite there exists an input deck, an eight-processor partition file, an answer file, and various extraction scripts. These scripts assemble a temporary answer file in a specific format from the simulation results. The temporary and stored answer files are compared to a specific level of numerical precision, and when differences are detected the test problem is flagged as failed. Presently, numerical results are stored and compared to 16 digits. At this accuracy level, different processor types, compilers, numbers of partitions, etc. impact the results to various degrees. Thus, for consistency the regression suite is run with ParaDyn using 8 processors on machines with a specific processor type (currently the Intel Xeon E5530 processor). For non-parallel regression problems, i.e., the two XFEM problems, DYNA3D is used instead. 
When environments or platforms change, executables using the current source code and the new resource are created and the regression suite is run. If differences in answers arise, the new answers are retained provided that the differences are inconsequential. This bootstrap approach allows the test suite answers to evolve in a controlled manner with a high level of confidence. Developers also run the entire regression suite with (serial) DYNA3D. While these results normally differ from the stored (parallel) answers, abnormal termination or wildly different values are strong indicators of potential issues.
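The answer-file comparison step described above might look like the following sketch. The actual LLNL scripts' file format and comparison logic are not documented in this summary, so the function name, list-based interface, and digits-of-precision convention here are all assumptions.

```python
def answers_match(stored, computed, digits=16):
    """Compare two lists of answer values to a fixed number of digits by
    formatting both sides identically; any mismatch flags the problem as
    failed (a sketch of the suite's comparison step)."""
    if len(stored) != len(computed):
        return False               # missing or extra values: automatic failure
    fmt = "{:." + str(digits) + "e}"
    return all(fmt.format(a) == fmt.format(b) for a, b in zip(stored, computed))

print(answers_match([1.0, 2.5e-3], [1.0, 2.5e-3]))       # identical answers
print(answers_match([1.0], [1.00001], digits=6))         # differs at stored precision
```

Comparing at a fixed digit count, rather than with exact bitwise equality, is what makes the documented sensitivity to processor type, compiler, and partition count visible in a controlled way.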

  1. Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.

    PubMed

    Padula, William V; McQueen, Robert Brett; Pronovost, Peter J

    2017-11-01

    The Second Panel on Cost-Effectiveness in Health and Medicine's recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses leave a number of questions unanswered with respect to the implementation of a transparent, open source code interface for economic models. Making economic model source code open could be positive and progressive for the field; however, several unintended consequences of this system should first be considered before complete implementation. First, there is the concern regarding the intellectual property rights that modelers have to their analyses. Second, open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system for open source code such that the model originators maintain control of the code's use and grant permissions to other investigators who wish to use it. The field should also be more forthcoming about teaching cost-effectiveness analysis in medical and health services education, so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These unintended consequences need to be fully considered before the field is prepared to move into an era of model transparency with open source code.

  2. Technical Modification Within the Healthcare Industry: Improving Both the Efficacy of the National Drug Code Carrier and the Accessibility of Electronic Health Records to Reduce Adverse Drug Events

    DTIC Science & Technology

    2013-06-01

    [Only front-matter fragments of this report were extracted: table-of-contents and list-of-figures entries for a SWOT analysis of using QR codes with the NDC and with EHRs, and acronym-list entries including OTC (Over-the-Counter), PHI (Personal Health Information), QR (Quick Response), and SWOT (Strengths, Weaknesses, Opportunities, Threats).]

  3. 44 CFR 72.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... or hydraulic characteristics of a flooding source and thus result in the modification of the existing... hydrologic or hydraulic characteristics of a flooding source and thus result in the modification of the... generally based on physical measures that affect the hydrologic or hydraulic characteristics of a flooding...

  4. 44 CFR 72.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... or hydraulic characteristics of a flooding source and thus result in the modification of the existing... hydrologic or hydraulic characteristics of a flooding source and thus result in the modification of the... generally based on physical measures that affect the hydrologic or hydraulic characteristics of a flooding...

  5. 44 CFR 72.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... or hydraulic characteristics of a flooding source and thus result in the modification of the existing... hydrologic or hydraulic characteristics of a flooding source and thus result in the modification of the... generally based on physical measures that affect the hydrologic or hydraulic characteristics of a flooding...

  6. 44 CFR 72.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... or hydraulic characteristics of a flooding source and thus result in the modification of the existing... hydrologic or hydraulic characteristics of a flooding source and thus result in the modification of the... generally based on physical measures that affect the hydrologic or hydraulic characteristics of a flooding...

  7. 44 CFR 72.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... or hydraulic characteristics of a flooding source and thus result in the modification of the existing... hydrologic or hydraulic characteristics of a flooding source and thus result in the modification of the... generally based on physical measures that affect the hydrologic or hydraulic characteristics of a flooding...

  8. The complete mitochondrial genome of the gall-forming fly, Fergusonina taylori Nelson and Yeates (Diptera: Fergusoninidae).

    PubMed

    Nelson, Leigh A; Cameron, Stephen L; Yeates, David K

    2011-10-01

    The monogeneric family Fergusoninidae consists of gall-forming flies that, together with Fergusobia (Tylenchida: Neotylenchidae) nematodes, form the only known mutualistic association between insects and nematodes. In this study, the entire 16,000 bp mitochondrial genome of Fergusonina taylori Nelson and Yeates was sequenced. The circular genome contains one encoding region including 37 genes and one non-coding A+T-rich region. The arrangement of the protein-coding, ribosomal RNA (rRNA) and transfer RNA (tRNA) genes is the same as that found in the ancestral insect. Nucleotide composition is highly A+T biased. All of the protein initiation codons are ATN, except for nad1, which begins with TTT. All 22 tRNA anticodons of F. taylori match those observed in Drosophila yakuba, and all form the typical cloverleaf structure except for tRNA-Ser(AGN), which lacks a dihydrouridine (DHU) arm. Secondary structural features of the rRNA genes of Fergusonina are similar to those proposed for other insects, with minor modifications. The mitochondrial genome of Fergusonina presented here may prove valuable for resolving the sister group to the Fergusoninidae, and expands the available mtDNA data sources for acalyptrates overall.

  9. Web-Based Environment for Maintaining Legacy Software

    NASA Technical Reports Server (NTRS)

    Tigges, Michael; Thompson, Nelson; Orr, Mark; Fox, Richard

    2007-01-01

    Advanced Tool Integration Environment (ATIE) is the name of both a software system and a Web-based environment created by the system for maintaining an archive of legacy software and expertise involved in developing the legacy software. ATIE can also be used in modifying legacy software and developing new software. The information that can be encapsulated in ATIE includes experts' documentation, input and output data of test cases, source code, and compilation scripts. All of this information is available within a common environment and retained in a database for ease of access and recovery by use of powerful search engines. ATIE also accommodates the embedding of supporting software that users require for their work, and even enables access to supporting commercial-off-the-shelf (COTS) software within the flow of the expert's work. The flow of work can be captured by saving the sequence of computer programs that the expert uses. A user gains access to ATIE via a Web browser. A modern Web-based graphical user interface promotes efficiency in the retrieval, execution, and modification of legacy code. Thus, ATIE saves time and money in the support of new and pre-existing programs.

  10. EPA Office of Water (OW): 12-digit Hydrologic Unit Boundaries of the United States

    EPA Pesticide Factsheets

    The Watershed Boundary Dataset (WBD) is a complete digital hydrologic unit national boundary layer that is at the Subwatershed (12-digit) level. It is composed of the watershed boundaries delineated by state agencies at the 1:24,000 scale. Please refer to the individual state metadata as the primary reference source. To access state specific metadata, go to the following link to view documentation created by agencies that performed the watershed delineation. This data set is a complete digital hydrologic unit boundary layer to the Subwatershed (12-digit) 6th level. This data set consists of geo-referenced digital data and associated attributes created in accordance with the FGDC Proposal, Version 1.0 - Federal Standards For Delineation of Hydrologic Unit Boundaries 3/01/02. Polygons are attributed with hydrologic unit codes for 4th level sub-basins, 5th level watersheds, 6th level subwatersheds, name, size, downstream hydrologic unit, type of watershed, non-contributing areas and flow modification. Arcs are attributed with the highest hydrologic unit code for each watershed, linesource and a metadata reference file.Please refer to the Metadata contact if you want access to the WBD national data set.

  11. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude data for point sources. The six codes and their definitions are: (1) Coordinate Data Source Code: The... physical piece of or a closely related set of equipment. The EPA's reporting format for a given inventory...

  12. The Astrophysics Source Code Library by the numbers

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.

  13. Astrophysics Source Code Library: Incite to Cite!

    NASA Astrophysics Data System (ADS)

    DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallin, J. F.

    2014-05-01

    The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement with presentations at various conferences, numerous blog posts and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.

  14. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  15. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  16. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 8: Cooling Flow/heat Transfer Analysis User's Manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.

    1994-01-01

    The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. This user's manual describes how to use the ADPAC code as developed in Task 5, NAS3-25270, including the modifications made to date in Tasks 7 and 8, NAS3-25270.

  17. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    NASA Astrophysics Data System (ADS)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e., the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered: one minimizes the average video distortion of the nodes, and the other minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate, and channel coding rate for the nodes of the visual sensor network.
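The mixed-integer structure described above (continuous powers, discrete coding rates) can be sketched with a minimal particle swarm optimizer that rounds one coordinate onto the discrete rate grid. The distortion surface, rate set, and PSO constants below are illustrative assumptions, not the paper's actual model.

```python
import random

random.seed(0)                                # deterministic toy run
RATES = [0.5, 1.0, 2.0, 4.0]                  # assumed discrete coding rates

def distortion(power, rate_idx):
    """Toy stand-in for received video distortion at a node."""
    return (power - 1.5) ** 2 + (RATES[rate_idx] - 2.0) ** 2

def score(p):
    # round the second coordinate onto the discrete rate grid
    idx = min(max(int(round(p[1])), 0), len(RATES) - 1)
    return distortion(p[0], idx)

def pso(n=20, iters=200):
    parts = [[random.uniform(0, 3), random.uniform(0, 3)] for _ in range(n)]
    vels = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in parts]             # per-particle best positions
    gbest = min(pbest, key=score)[:]          # swarm-wide best position
    for _ in range(iters):
        for i, p in enumerate(parts):
            for d in range(2):                # standard PSO velocity update
                vels[i][d] = (0.7 * vels[i][d]
                              + 1.5 * random.random() * (pbest[i][d] - p[d])
                              + 1.5 * random.random() * (gbest[d] - p[d]))
                p[d] += vels[i][d]
            if score(p) < score(pbest[i]):
                pbest[i] = p[:]
            if score(pbest[i]) < score(gbest):
                gbest = pbest[i][:]
    return gbest, score(gbest)

best, val = pso()
```

On this toy surface the swarm settles near power 1.5 with the rate coordinate rounding onto the 2.0 rate; the real problem replaces `distortion` with the per-node video distortion model and adds one power/rate pair per node.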

  18. GSOSTATS Database: USAF Synchronous Satellite Catalog Data Conversion Software. User's Guide and Software Maintenance Manual, Version 2.1

    NASA Technical Reports Server (NTRS)

    Mallasch, Paul G.; Babic, Slavoljub

    1994-01-01

    The United States Air Force (USAF) provides NASA Lewis Research Center with monthly reports containing the Synchronous Satellite Catalog and the associated Two Line Mean Element Sets. The USAF Synchronous Satellite Catalog supplies satellite orbital parameters collected by an automated monitoring system and provided to Lewis Research Center as text files on magnetic tape. Software was developed to facilitate automated formatting, data normalization, cross-referencing, and error correction of Synchronous Satellite Catalog files before loading into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). This document contains the User's Guide and Software Maintenance Manual with information necessary for installation, initialization, start-up, operation, error recovery, and termination of the software application. It also contains implementation details, modification aids, and software source code adaptations for use in future revisions.

  19. Combinatorial Histone Acetylation Patterns Are Generated by Motif-Specific Reactions.

    PubMed

    Blasi, Thomas; Feller, Christian; Feigelman, Justin; Hasenauer, Jan; Imhof, Axel; Theis, Fabian J; Becker, Peter B; Marr, Carsten

    2016-01-27

    Post-translational modifications (PTMs) are pivotal to cellular information processing, but how combinatorial PTM patterns ("motifs") are set remains elusive. We develop a computational framework, which we provide as open source code, to investigate the design principles generating the combinatorial acetylation patterns on histone H4 in Drosophila melanogaster. We find that models assuming purely unspecific or lysine site-specific acetylation rates were insufficient to explain the experimentally determined motif abundances. Rather, these abundances were best described by an ensemble of models with acetylation rates that were specific to motifs. The model ensemble converged upon four acetylation pathways; we validated three of these using independent data from a systematic enzyme depletion study. Our findings suggest that histone acetylation patterns originate through specific pathways involving motif-specific acetylation activity. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Monte Carlo Simulation Tool Installation and Operation Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.

    2013-09-02

    This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document, and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.

  1. The effect of subject measurement error on joint kinematics in the conventional gait model: Insights from the open-source pyCGM tool using high performance computing methods.

    PubMed

    Schwartz, Mathew; Dixon, Philippe C

    2018-01-01

    The conventional gait model (CGM) is a widely used biomechanical model which has been validated over many years. The CGM relies on retro-reflective markers placed along anatomical landmarks, a static calibration pose, and subject measurements as inputs for joint angle calculations. While past literature has shown the possible errors caused by improper marker placement, studies on the effects of inaccurate subject measurements are lacking. Moreover, as many laboratories rely on the commercial version of the CGM, released as the Plug-in Gait (Vicon Motion Systems Ltd, Oxford, UK), integrating improvements into the CGM code is not easily accomplished. This paper introduces a Python implementation for the CGM, referred to as pyCGM, which is an open-source, easily modifiable, cross platform, and high performance computational implementation. The aims of pyCGM are to (1) reproduce joint kinematic outputs from the Vicon CGM and (2) be implemented in a parallel approach to allow integration on a high performance computer. The aims of this paper are to (1) demonstrate that pyCGM can systematically and efficiently examine the effect of subject measurements on joint angles and (2) be updated to include new calculation methods suggested in the literature. The results show that the calculated joint angles from pyCGM agree with Vicon CGM outputs, with a maximum lower body joint angle difference of less than 10^-5 degrees. Through the hierarchical system, the ankle joint is the most vulnerable to subject measurement error. Leg length has the greatest effect on all joints as a percentage of measurement error. When compared to the errors previously found through inter-laboratory measurements, the impact of subject measurements is minimal, and researchers should rather focus on marker placement. Finally, we showed that code modifications can be performed to include improved hip, knee, and ankle joint centre estimations suggested in the existing literature.
The pyCGM code is provided in open source format and available at https://github.com/cadop/pyCGM.
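A minimal sketch of the sensitivity sweep this abstract describes: perturb one subject measurement (here, leg length) and observe how a derived joint-centre quantity shifts. The linear regression below uses illustrative coefficients, not pyCGM's actual equations.

```python
def hip_offset(leg_length_mm):
    # illustrative linear regression from leg length to a joint-centre offset
    return 0.115 * leg_length_mm - 15.3

true_leg = 900.0
for err_pct in (1, 5, 10):
    measured = true_leg * (1 + err_pct / 100)          # mismeasured input
    shift = hip_offset(measured) - hip_offset(true_leg)
    print(f"{err_pct:2d}% leg-length error -> {shift:5.2f} mm joint-centre shift")
```

Because the mapping is linear here, the offset error grows proportionally with the measurement error; pyCGM automates this kind of sweep across all subject measurements and the full kinematic chain.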

  2. The effect of subject measurement error on joint kinematics in the conventional gait model: Insights from the open-source pyCGM tool using high performance computing methods

    PubMed Central

    Dixon, Philippe C.

    2018-01-01

    The conventional gait model (CGM) is a widely used biomechanical model which has been validated over many years. The CGM relies on retro-reflective markers placed along anatomical landmarks, a static calibration pose, and subject measurements as inputs for joint angle calculations. While past literature has shown the possible errors caused by improper marker placement, studies on the effects of inaccurate subject measurements are lacking. Moreover, as many laboratories rely on the commercial version of the CGM, released as the Plug-in Gait (Vicon Motion Systems Ltd, Oxford, UK), integrating improvements into the CGM code is not easily accomplished. This paper introduces a Python implementation for the CGM, referred to as pyCGM, which is an open-source, easily modifiable, cross platform, and high performance computational implementation. The aims of pyCGM are to (1) reproduce joint kinematic outputs from the Vicon CGM and (2) be implemented in a parallel approach to allow integration on a high performance computer. The aims of this paper are to (1) demonstrate that pyCGM can systematically and efficiently examine the effect of subject measurements on joint angles and (2) be updated to include new calculation methods suggested in the literature. The results show that the calculated joint angles from pyCGM agree with Vicon CGM outputs, with a maximum lower body joint angle difference of less than 10^-5 degrees. Through the hierarchical system, the ankle joint is the most vulnerable to subject measurement error. Leg length has the greatest effect on all joints as a percentage of measurement error. When compared to the errors previously found through inter-laboratory measurements, the impact of subject measurements is minimal, and researchers should rather focus on marker placement. Finally, we showed that code modifications can be performed to include improved hip, knee, and ankle joint centre estimations suggested in the existing literature.
The pyCGM code is provided in open source format and available at https://github.com/cadop/pyCGM. PMID:29293565

  3. Modification and benchmarking of MCNP for low-energy tungsten spectra.

    PubMed

    Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M

    2000-12-01

    The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.

  4. Data compression for satellite images

    NASA Technical Reports Server (NTRS)

    Chen, P. H.; Wintz, P. A.

    1976-01-01

    An efficient data compression system is presented for satellite pictures and two grey level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
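The "double delta" idea can be illustrated with second-order differences on a scanline: smooth image data collapses into mostly small values, which a subsequent entropy code exploits. This is a generic sketch of the principle, not the paper's exact coder (which adds quantization and background skipping).

```python
def delta(seq):
    """First-order differences, keeping the first sample as-is."""
    return [seq[0]] + [b - a for a, b in zip(seq, seq[1:])]

def double_delta(seq):
    """Second-order ('double delta') differences: delta applied twice."""
    return delta(delta(seq))

def undo_delta(d):
    out = [d[0]]
    for x in d[1:]:
        out.append(out[-1] + x)          # cumulative sum restores the signal
    return out

def undo_double_delta(dd):
    return undo_delta(undo_delta(dd))

line = [10, 12, 14, 16, 18, 18, 18, 17]  # smooth scanline of pixel values
dd = double_delta(line)                   # mostly zeros and small residuals
assert undo_double_delta(dd) == line      # lossless round trip
```

On a linearly varying region the second differences are exactly zero, so long runs of flat or smoothly shaded pixels become highly compressible symbol streams.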

  5. Genetic Code Expansion: A Powerful Tool for Understanding the Physiological Consequences of Oxidative Stress Protein Modifications

    PubMed Central

    2018-01-01

    Posttranslational modifications resulting from oxidation of proteins (Ox-PTMs) are present intracellularly under conditions of oxidative stress as well as basal conditions. In the past, these modifications were thought to be generic protein damage, but it has become increasingly clear that Ox-PTMs can have specific physiological effects. It is an arduous task to distinguish between the two cases, as multiple Ox-PTMs occur simultaneously on the same protein, convoluting analysis. Genetic code expansion (GCE) has emerged as a powerful tool to overcome this challenge as it allows for the site-specific incorporation of an Ox-PTM into translated protein. The resulting homogeneously modified protein products can then be rigorously characterized for the effects of individual Ox-PTMs. We outline the strengths and weaknesses of GCE as they relate to the field of oxidative stress and Ox-PTMs. An overview of the Ox-PTMs that have been genetically encoded and applications of GCE to the study of Ox-PTMs, including antibody validation and therapeutic development, is described. PMID:29849913

  6. User Manual for Beta Version of TURBO-GRD: A Software System for Interactive Two-Dimensional Boundary/ Field Grid Generation, Modification, and Refinement

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee

    1998-01-01

    TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. The grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bezier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and exercise grid modification using the solution contour as a guide. This process can be called an interactive solution-adaptive grid generation.

  7. Oilseed crops as renewable sources of industrial chemicals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKeon, T.A.; Lin, Jiann-Tsyh; Goodrich-Tanrikulu, M.

    1995-12-01

    The presence of specific functional groups on a fatty acid confers value for industrial uses. The plant kingdom contains numerous examples of plants that produce seed oils containing fatty acids with epoxy groups, hydroxyl groups, triple bonds or with unusual double bond positions. These fatty acids can be used directly or are readily modified for use in specialty lubricants, plastics and coatings. Many of these plants are not cultivated in the U.S. due to unsuitable climate or growth habit. Such plants provide a source of genes coding for enzymes that will carry out the desired fatty acid modification. Genetic technology allows the transfer of these genes into domestically grown crops such as rapeseed or soybean, with consequent production of the desired fatty acid in the seed oil. One biotechnology company has commercialized a transgenic oilseed crop with an altered fatty acid composition. This talk will review current and projected plans for developing oilseed crops to serve as renewable resources that meet current industrial needs or provide chemical feedstocks for new uses.

  8. Montana SIP: Table c, (viii) Administrative Rules of Montana, Subchapter 10, Preconstruction Permit Requirements for Major Stationary Sources or Major Modifications Locating Within Attainment or Unclassified Areas

    EPA Pesticide Factsheets

    Montana SIP: Table c, (viii) Administrative Rules of Montana, Subchapter 10, Preconstruction Permit Requirements for Major Stationary Sources or Major Modifications Locating Within Attainment or Unclassified Areas

  9. 10 CFR 40.71 - Modification and revocation of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Modification and revocation of licenses. 40.71 Section 40.71 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Modification and Revocation of Licenses § 40.71 Modification and revocation of licenses. (a) The terms and conditions of each...

  10. Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    PubMed Central

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Considering that sensors in wireless sensor networks are energy-limited and channel conditions are variable, there is an urgent need for a low-complexity coding method with a high compression ratio and resistance to noise. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over the independent channels, the multiple access channels and the broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over the independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

  11. State-Chart Autocoder

    NASA Technical Reports Server (NTRS)

    Clark, Kenneth; Watney, Garth; Murray, Alexander; Benowitz, Edward

    2007-01-01

    A computer program translates Unified Modeling Language (UML) representations of state charts into source code in the C, C++, and Python computing languages. ("State charts" signifies graphical descriptions of states and state transitions of a spacecraft or other complex system.) The UML representations constituting the input to this program are generated by using a UML-compliant graphical design program to draw the state charts. The generated source code is consistent with the "quantum programming" approach, which is so named because it involves discrete states and state transitions that have features in common with states and state transitions in quantum mechanics. Quantum programming enables efficient implementation of state charts, suitable for real-time embedded flight software. In addition to source code, the autocoder program generates a graphical-user-interface (GUI) program that, in turn, generates a display of state transitions in response to events triggered by the user. The GUI program is wrapped around, and can be used to exercise the state-chart behavior of, the generated source code. Once the expected state-chart behavior is confirmed, the generated source code can be augmented with a software interface to the rest of the software with which the source code is required to interact.
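The handler-per-state pattern that state-chart code generators commonly emit can be sketched as follows; the class, states, and events are invented for illustration and are not the autocoder's actual output.

```python
class Switch:
    """Tiny hand-written analogue of generated state-chart code."""

    def __init__(self):
        self.state = self.idle            # current state = a handler method

    def dispatch(self, event):
        self.state(event)                 # route the event to the active state

    def idle(self, event):                # state: idle
        if event == "START":
            self.state = self.active      # transition idle -> active

    def active(self, event):              # state: active
        if event == "STOP":
            self.state = self.idle        # transition active -> idle

sm = Switch()
sm.dispatch("START")                      # now in 'active'
sm.dispatch("STOP")                       # back to 'idle'
```

Because each state is just a method and a transition is just a reassignment of `self.state`, dispatch costs one indirect call per event, which is what makes this style suitable for real-time embedded targets.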

  12. Practices in source code sharing in astrophysics

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly

    2013-02-01

    While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.

  13. Machado-Joseph Disease

    MedlinePlus

    ... Machado-Joseph Disease Fact Sheet. What is Machado-Joseph disease? What are the ... the repeat is in a protein-producing or coding region of the gene. Modifications of the mutant ...

  14. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS Screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements to interface with DMS messages and data transfers relating to the BWAS operations.

  15. 40 CFR 52.1225 - Review of new sources and modifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...

  16. 40 CFR 52.1225 - Review of new sources and modifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...

  17. 40 CFR 52.1225 - Review of new sources and modifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...

  18. 40 CFR 52.1225 - Review of new sources and modifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...

  19. 40 CFR 52.1225 - Review of new sources and modifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...

  20. 40 CFR 52.1824 - Review of new sources and modifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Environmental Engineering, stated: To clarify this issue, the State of North Dakota will commit to meeting all... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Review of new sources and modifications. 52.1824 Section 52.1824 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR...

  1. 40 CFR 52.2775 - Review of new sources and modifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 5 2014-07-01 2014-07-01 false Review of new sources and modifications. 52.2775 Section 52.2775 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR..., by prominent advertisement in the local news media, of the opportunity for public comment on the...

  2. 40 CFR 52.2775 - Review of new sources and modifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Review of new sources and modifications. 52.2775 Section 52.2775 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR..., by prominent advertisement in the local news media, of the opportunity for public comment on the...

  3. 40 CFR 52.2775 - Review of new sources and modifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 5 2013-07-01 2013-07-01 false Review of new sources and modifications. 52.2775 Section 52.2775 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR..., by prominent advertisement in the local news media, of the opportunity for public comment on the...

  4. 40 CFR 52.2775 - Review of new sources and modifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 5 2012-07-01 2012-07-01 false Review of new sources and modifications. 52.2775 Section 52.2775 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR..., by prominent advertisement in the local news media, of the opportunity for public comment on the...

  5. 40 CFR 52.2775 - Review of new sources and modifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 4 2011-07-01 2011-07-01 false Review of new sources and modifications. 52.2775 Section 52.2775 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR..., by prominent advertisement in the local news media, of the opportunity for public comment on the...

  6. Authorship attribution of source code by using back propagation neural network based on particle swarm optimization

    PubMed Central

    Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao

    2017-01-01

    Authorship attribution is the task of identifying the most likely author of a given sample among a set of known candidate authors. It can be applied not only to discover the original author of plain text, such as novels, blogs, emails, and posts, but also to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to authorship disputes and software plagiarism detection. This paper proposes a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. The method begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to a neural network for supervised learning, whose weights are output by the hybrid PSO and BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset of 3,022 Java files belonging to 40 authors. Experimental results show that the proposed method achieves 91.060% accuracy, and a comparison with previous work on authorship attribution of Java source code illustrates that the proposed method outperforms the others overall, with an acceptable overhead. PMID:29095934
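    The paper's exact 19 feature metrics are not enumerated in the abstract, but the flavor of lexical and layout metrics can be sketched. The function and metric names below are illustrative assumptions, not the authors' definitions:

```python
import re

def layout_lexical_metrics(source: str) -> dict:
    """Toy feature extractor in the spirit of lexical/layout metrics
    for source-code authorship attribution (metric names are
    illustrative, not the paper's actual 19 dimensions)."""
    lines = source.splitlines()
    n = max(len(lines), 1)
    tokens = re.findall(r"\w+", source)
    return {
        "avg_line_length": sum(len(l) for l in lines) / n,
        "blank_line_ratio": sum(1 for l in lines if not l.strip()) / n,
        "tab_indent_ratio": sum(1 for l in lines if l.startswith("\t")) / n,
        "comment_ratio": sum(1 for l in lines if l.lstrip().startswith("//")) / n,
        "unique_token_ratio": len(set(tokens)) / max(len(tokens), 1),
    }

java = "public class A {\n\t// add two ints\n\tint add(int a, int b) { return a + b; }\n}\n"
feats = layout_lexical_metrics(java)
```

    In the paper's pipeline, a vector like `feats` would be the input to the PSO-trained BP network, with one vector per source file and the author as the supervised label.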

  7. Chromatin replication: TRANSmitting the histone code

    PubMed Central

    Chang, Han-Wen; Studitsky, Vasily M.

    2017-01-01

    Efficient overcoming of the nucleosomal barrier and accurate maintenance of associated histone marks during chromatin replication are essential for normal functioning of the cell. Recent studies revealed new protein factors and histone modifications contributing to overcoming the nucleosomal barrier, and suggested an important role for DNA looping in survival of the original histones during replication. These studies suggest new possible mechanisms for transmitting the histone code to next generations of cells. PMID:28393112

  8. Aeroacoustic Codes for Rotor Harmonic and BVI Noise. CAMRAD.Mod1/HIRES: Methodology and Users' Manual

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.

    1998-01-01

    This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.

  9. Comparison of Einstein-Boltzmann solvers for testing general relativity

    NASA Astrophysics Data System (ADS)

    Bellini, E.; Barreira, A.; Frusciante, N.; Hu, B.; Peirone, S.; Raveri, M.; Zumalacárregui, M.; Avilez-Lopez, A.; Ballardini, M.; Battye, R. A.; Bolliet, B.; Calabrese, E.; Dirian, Y.; Ferreira, P. G.; Finelli, F.; Huang, Z.; Ivanov, M. M.; Lesgourgues, J.; Li, B.; Lima, N. A.; Pace, F.; Paoletti, D.; Sawicki, I.; Silvestri, A.; Skordis, C.; Umiltà, C.; Vernizzi, F.

    2018-01-01

    We compare Einstein-Boltzmann solvers that include modifications to general relativity and find that, for a wide range of models and parameters, they agree to a high level of precision. We look at three general purpose codes that primarily model general scalar-tensor theories, three codes that model Jordan-Brans-Dicke (JBD) gravity, a code that models f (R ) gravity, a code that models covariant Galileons, a code that models Hořava-Lifshitz gravity, and two codes that model nonlocal models of gravity. Comparing predictions of the angular power spectrum of the cosmic microwave background and the power spectrum of dark matter for a suite of different models, we find agreement at the subpercent level. This means that this suite of Einstein-Boltzmann solvers is now sufficiently accurate for precision constraints on cosmological and gravitational parameters.

  10. Spectral fitting, shock layer modeling, and production of nitrogen oxides and excited nitrogen

    NASA Technical Reports Server (NTRS)

    Blackwell, H. E.

    1991-01-01

    An analysis was made of N2 emission from an 8.72 MJ/kg shock layer at the 2.54, 1.91, and 1.27 cm positions, and vibrational state distributions, temperatures, and relative electronic state populations were obtained from the data sets. Other recorded arc jet N2 and air spectral data were reviewed, and NO emission characteristics were studied. A review of the operational procedures of the DSMC code was made. Information on other appropriate codes and modifications, including ionization, was compiled, and the applicability of the reviewed codes to task requirements was determined. A review was also made of the computational procedures used in the CFD codes of Li and in other codes on JSC computers. An analysis was made of problems associated with integrating task-specific chemical kinetics into CFD codes.

  11. The mathematical theory of signal processing and compression-designs

    NASA Astrophysics Data System (ADS)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  12. Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter

    DTIC Science & Technology

    2007-08-31

    ...latitude) for 3 different grid spacings. 8. Low-altitude fields produced by a 10-kHz source computed using the FD and TD codes. The agreement is excellent, validating the new FD code. 9. High-altitude fields produced by a 10-kHz source computed using the FD and TD codes. The agreement is again excellent. 10. Low-altitude fields produced by a 20-kHz source computed using the FD and TD codes. 11. High-altitude fields produced ...

  13. OLTARIS: On-Line Tool for the Assessment of Radiation in Space

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.; Blattnig, Steve R.; Clowdsley, Martha S.; Qualls, Garry D.; Sandridge, Chris A.; Simonsen, Lisa C.; Norbury, John W.; Slaba, Tony C.; Walker, Steve A.; Badavi, Francis F.; hide

    2009-01-01

    The On-Line Tool for the Assessment of Radiation In Space (OLTARIS) is a World Wide Web based tool that assesses the effects of space radiation on humans in items such as spacecraft, habitats, rovers, and spacesuits. This document explains the basis behind the interface and framework used to input the data, perform the assessment, and output the results to the user, as well as the physics, engineering, and computer science used to develop OLTARIS. The physics is based on the HZETRN2005 and NUCFRG2 research codes. The OLTARIS website is the successor to the SIREST website from the early 2000s. Modifications have been made to the code to enable easy maintenance, additions, and configuration management, along with a more modern web interface. Overall, the code has been verified, tested, and modified to enable faster and more accurate assessments. The next major areas of modification are more accurate transport algorithms, better uncertainty estimates, and electronic response functions. Improvements in the existing algorithms and data occur continuously and are logged in the change log section of the website.

  14. Review of particle-in-cell modeling for the extraction region of large negative hydrogen ion sources for fusion

    NASA Astrophysics Data System (ADS)

    Wünderlich, D.; Mochalskyy, S.; Montellano, I. M.; Revel, A.

    2018-05-01

    Particle-in-cell (PIC) codes have been used since the early 1960s for calculating self-consistently the motion of charged particles in plasmas, taking into account external electric and magnetic fields as well as the fields created by the particles themselves. Because of the very small time steps (on the order of the inverse plasma frequency) and mesh sizes used, the computational requirements can be very high, and they increase drastically with increasing plasma density and size of the calculation domain. Thus, usually small computational domains and/or reduced dimensionality are used. In recent years, the available central processing unit (CPU) power has strongly increased. Together with massive parallelization of the codes, it is now possible to describe in 3D the extraction of charged particles from a plasma, using calculation domains with an edge length of several centimeters, consisting of one extraction aperture, the plasma in the direct vicinity of the aperture, and a part of the extraction system. Large negative hydrogen or deuterium ion sources are essential parts of the neutral beam injection (NBI) system in future fusion devices like the international fusion experiment ITER and the demonstration reactor (DEMO). For ITER NBI, RF driven sources with a source area of 0.9 × 1.9 m2 and 1280 extraction apertures will be used. The extraction of negative ions is accompanied by the co-extraction of electrons, which are deflected onto an electron dump. Typically, the maximum extracted negative ion current is limited by the amount and the temporal instability of the co-extracted electrons, especially for operation in deuterium. Different PIC codes are available for the extraction region of large driven negative ion sources for fusion. Additionally, some effort is ongoing in developing codes that describe in a simplified manner (coarser mesh or reduced dimensionality) the plasma of the whole ion source.
The presentation first gives a brief overview of the current status of the ion source development for ITER NBI and of the PIC method. Different PIC codes for the extraction region are introduced, as well as the coupling to codes describing the whole source (PIC codes or fluid codes). Different physical and numerical aspects of applying PIC codes to negative hydrogen ion sources for fusion are presented and discussed, as well as selected code results. The main focus of future calculations will be the meniscus formation and identifying measures for reducing the co-extracted electrons, in particular for deuterium operation. The recent results of the 3D PIC code ONIX (calculation domain: one extraction aperture and its vicinity) for the ITER prototype source (1/8 size of the ITER NBI source) are presented.
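    The basic PIC cycle referred to above (deposit charge on a mesh, solve for the self-consistent field, push the particles) can be illustrated with a toy 1D electrostatic loop in normalized units. Production extraction-region codes such as ONIX are 3D and massively parallel, so everything below is a minimal sketch, not their implementation:

```python
import numpy as np

# Toy 1D electrostatic PIC cycle (periodic box, normalized units:
# plasma frequency = 1, electron charge/mass = -1/1).
ng, n_part, L, dt = 64, 4000, 2 * np.pi, 0.1
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, n_part)           # particle positions
v = rng.normal(0, 1, n_part)            # particle velocities

for step in range(50):
    # 1) deposit charge on the grid (nearest-grid-point weighting),
    #    with a uniform neutralizing ion background of density 1
    idx = (x / dx).astype(int) % ng
    rho = np.bincount(idx, minlength=ng) * (L / n_part) / dx - 1.0
    # 2) solve Poisson's equation d2(phi)/dx2 = -rho via FFT
    k = np.fft.fftfreq(ng, d=dx) * 2 * np.pi
    k[0] = 1.0                           # placeholder; mean mode is dropped
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx
    # 3) push particles: electrons feel acceleration -E
    v -= E[idx] * dt
    x = (x + v * dt) % L
```

    A real extraction-region code replaces each step with far heavier machinery (3D field solves with electrode boundary conditions, magnetic deflection, collisions, surface production of negative ions), but the deposit-solve-push structure is the same.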

  15. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing new physics to be added easily.

  16. TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, G.J.; Pruess

    1992-11-01

    The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications. Good, efficient numerical performance and accurate results were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.

  17. Surface modification of ferritic steels using MEVVA and duoplasmatron ion sources

    NASA Astrophysics Data System (ADS)

    Kulevoy, Timur V.; Chalyhk, Boris B.; Fedin, Petr A.; Sitnikov, Alexey L.; Kozlov, Alexander V.; Kuibeda, Rostislav P.; Andrianov, Stanislav L.; Orlov, Nikolay N.; Kravchuk, Konstantin S.; Rogozhkin, Sergey V.; Useinov, Alexey S.; Oks, Efim M.; Bogachev, Alexey A.; Nikitin, Alexander A.; Iskandarov, Nasib A.; Golubev, Alexander A.

    2016-02-01

    The Metal Vapor Vacuum Arc (MEVVA) ion source (IS) is a unique tool for the production of high intensity metal ion beams that can be used for material surface modification. On the other hand, the duoplasmatron ion source provides high intensity gas ion beams. The MEVVA and duoplasmatron IS developed at the Institute for Theoretical and Experimental Physics were used for the reactor steel surface modification experiments. The response of ferritic-martensitic steel specimens to titanium and nitrogen ion implantation and subsequent vacuum annealing was investigated. An increase in the microhardness of the near-surface region of the irradiated specimens was observed. Local chemical analysis shows atom mixing and redistribution in the implanted layer, followed by the formation of ultrafine precipitates after annealing.

  18. Epigenomics, Pharmacoepigenomics, and Personalized Medicine in Cervical Cancer.

    PubMed

    Kabekkodu, Shama Prasada; Chakrabarty, Sanjiban; Ghosh, Supriti; Brand, Angela; Satyamoorthy, Kapaettu

    2017-01-01

    Epigenomics encompasses the study of genome-wide changes in DNA methylation, histone modifications and noncoding RNAs leading to altered transcription, chromatin structure, and posttranscription RNA processing, respectively, resulting in an altered rate of gene expression. The role of epigenetic modifications facilitating human diseases is well established. Previous studies have identified histone and cytosine code during normal and pathological conditions with special emphasis on how these modifications regulate transcriptional events. Recent studies have also mapped these epigenetic modification and pathways leading to carcinogenesis. Discovery of drugs that target proteins/enzymes in the epigenetic pathways may provide better therapeutic opportunities, and identification of such modulators for DNA methylation, histone modifications, and expression of noncoding RNAs for several cancer types is underway. In this review, we provide a detailed description of recent developments in the field of epigenetics and its impact on personalized medicine to manage cervical cancer. © 2017 S. Karger AG, Basel.

  19. Analysing how negative emotions emerge and are addressed in veterinary consultations, using the Verona Coding Definitions of Emotional Sequences (VR-CoDES).

    PubMed

    Vijfhuizen, Malou; Bok, Harold; Matthew, Susan M; Del Piccolo, Lidia; McArthur, Michelle

    2017-04-01

    To explore the applicability, need for modifications, and reliability of the VR-CoDES in a veterinary setting, while also gaining a deeper understanding of clients' expressions of negative emotion and how they are addressed by veterinarians. The Verona Coding Definitions of Emotional Sequences for client cues and concerns (VR-CoDES-CC) and health provider responses (VR-CoDES-P) were used to analyse 20 audiotaped veterinary consultations. Inter-rater reliability was established. The applicability of the definitions of the VR-CoDES was identified, together with the need for specific modifications to suit veterinary consultations. The VR-CoDES-CC and VR-CoDES-P generally applied to veterinary consultations. Cue and concern reliability was found satisfactory for most types of cues, but not for concerns. Response reliability was satisfactory for explicitness, and for providing and reducing space for further disclosure. Modifications to the original coding system were necessary to accurately reflect the veterinary context and included minor additions to the VR-CoDES-CC. With minor additions, including codes for guilt, reassurance, and cost discussions, the VR-CoDES can be reliably adopted to assess clients' implicit expressions of negative emotion and veterinarians' responses. The modified VR-CoDES could be of great value when combined with existing frameworks used for teaching and researching veterinary communication. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Monte Carlo dose calculations of beta-emitting sources for intravascular brachytherapy: a comparison between EGS4, EGSnrc, and MCNP.

    PubMed

    Wang, R; Li, X A

    2001-02-01

    The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At larger distances from the source, noticeable differences are seen in these parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify as well as to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. Data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, reaching 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in the three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4; the two calculations agree within 5% for radial distances <6 mm.

  1. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks.

    PubMed

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-07-09

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Because WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs enhance the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop, where the packet size can vary according to the link quality of that hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication in an environment where the packet size changes at each hop, with smaller energy consumption.

  2. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    PubMed Central

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Because WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs enhance the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop, where the packet size can vary according to the link quality of that hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication in an environment where the packet size changes at each hop, with smaller energy consumption. PMID:27409616

  3. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small enough, yet powerful enough, to achieve high resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded-mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded-mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps out around 50 µm. To overcome this challenge, the coded-mask and object are magnified by making the distance from the coded-mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of its modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.

  4. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavyweight client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
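    As a rough illustration of the reference-based idea (send only what differs from a shared reference, plus a check value for verification), consider the toy encoder below. The real protocol adaptively switches between syndrome coding and hash coding over subsequences, which this sketch does not attempt:

```python
import hashlib

def encode(seq: str, ref: str) -> dict:
    """Toy reference-based encoder: transmit only the positions and
    symbols where `seq` differs from the shared reference, plus a
    hash so the decoder can verify the reconstruction. Illustrative
    only; not the paper's syndrome/hash coding protocol."""
    assert len(seq) == len(ref)
    diffs = [(i, c) for i, (c, r) in enumerate(zip(seq, ref)) if c != r]
    return {"diffs": diffs, "check": hashlib.sha256(seq.encode()).hexdigest()}

def decode(msg: dict, ref: str) -> str:
    out = list(ref)
    for i, c in msg["diffs"]:
        out[i] = c
    seq = "".join(out)
    # verify against the transmitted hash before accepting
    assert hashlib.sha256(seq.encode()).hexdigest() == msg["check"]
    return seq

ref = "ACGTACGTAC"   # reference sequence shared by encoder and decoder
seq = "ACGTTCGTAA"   # source sequence with two substitutions
msg = encode(seq, ref)
```

    The payload scales with the variation between source and reference rather than with sequence length, which is the property that makes such schemes attractive for low-power clients.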

  5. Modeling unsaturated zone flow and runoff processes by integrating MODFLOW-LGR and VSF, and creating the new CFL package

    USGS Publications Warehouse

    Borsia, I.; Rossetto, R.; Schifani, C.; Hill, Mary C.

    2013-01-01

    In this paper two modifications to the MODFLOW code are presented. One concerns an extension of Local Grid Refinement (LGR) to Variable Saturated Flow process (VSF) capability. This modification allows the user to solve the 3D Richards’ equation only in selected parts of the model domain. The second modification introduces a new package, named CFL (Cascading Flow), which improves the computation of overland flow when ground surface saturation is simulated using either VSF or the Unsaturated Zone Flow (UZF) package. The modeling concepts are presented and demonstrated. Programmer documentation is included in appendices.

  6. Oligo/Polynucleotide-Based Gene Modification: Strategies and Therapeutic Potential

    PubMed Central

    Sargent, R. Geoffrey; Kim, Soya

    2011-01-01

    Oligonucleotide- and polynucleotide-based gene modification strategies were developed as an alternative to transgene-based and classical gene targeting-based gene therapy approaches for treatment of genetic disorders. Unlike the transgene-based strategies, oligo/polynucleotide gene targeting approaches maintain gene integrity and the relationship between the protein coding and gene-specific regulatory sequences. Oligo/polynucleotide-based gene modification also has several advantages over classical vector-based homologous recombination approaches. These include essentially complete homology to the target sequence and the potential to rapidly engineer patient-specific oligo/polynucleotide gene modification reagents. Several oligo/polynucleotide-based approaches have been shown to successfully mediate sequence-specific modification of genomic DNA in mammalian cells. The strategies involve the use of polynucleotide small DNA fragments, triplex-forming oligonucleotides, and single-stranded oligodeoxynucleotides to mediate homologous exchange. The primary focus of this review will be on the mechanistic aspects of the small fragment homologous replacement, triplex-forming oligonucleotide-mediated, and single-stranded oligodeoxynucleotide-mediated gene modification strategies as it relates to their therapeutic potential. PMID:21417933

  7. The modification at CSNS ion source

    NASA Astrophysics Data System (ADS)

    Liu, S.; Ouyang, H.; Huang, T.; Xiao, Y.; Cao, X.; Lv, Y.; Xue, K.; Chen, W.

    2017-08-01

    The commissioning of the CSNS front end has been finished. A beam intensity above 15 mA is obtained at the end of the RFQ. The CSNS ion source is a Penning surface plasma ion source, similar to the ISIS ion source. To improve operation stability and reduce the spark rate, some modifications have been performed, including to the Penning field, the extraction optics, and the post acceleration. PBGUNS is applied to optimize beam extraction. The co-extracted electrons are considered in the PBGUNS simulation, and various extraction structures are simulated with the aim of passing the beam through the extraction electrode without loss. The stability of the ion source is further improved.

  8. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. SAP scans FORTRAN source code and produces reports that present statistics and measures of the statements and structures that make up a module. The processing performed by SAP, and the routines, COMMON blocks, and files used by SAP, are described. The system generation procedure for SAP is also presented.
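    A statement-counting pass of the kind such an analyzer performs can be sketched as follows; the metric names and comment rules are simplified assumptions (fixed-form comment indicator in column 1), not SAP's actual report format:

```python
import re

def sap_like_stats(fortran_src: str) -> dict:
    """Toy statistics pass over fixed-form FORTRAN source, in the
    spirit of a static source code analyzer (simplified rules;
    hypothetical metric names)."""
    stats = {"comments": 0, "common_blocks": 0, "statements": 0}
    for line in fortran_src.splitlines():
        # fixed-form comment: 'C' (or '!') in column 1
        if line[:1].upper() == "C" or line.lstrip().startswith("!"):
            stats["comments"] += 1
        elif line.strip():
            stats["statements"] += 1
            if re.match(r"\s*COMMON\b", line, re.IGNORECASE):
                stats["common_blocks"] += 1
    return stats

src = "C compute mean\n      COMMON /BLK/ X\n      S = 0.0\n"
stats = sap_like_stats(src)
```

    A full analyzer would additionally track routine boundaries, statement types, and file usage to produce the per-module reports the abstract describes.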

  9. 78 FR 1759 - Notice of Approval of Clean Air Act Outer Continental Shelf Minor Source/Title V Minor Permit...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-09

    ... Outer Continental Shelf Minor Source/Title V Minor Permit Modification Issued to Shell Offshore, Inc. for the Kulluk Conical Drilling Unit AGENCY: United States Environmental Protection Agency (EPA... decision granting Shell Offshore Inc.'s (``Shell'') request for minor modifications of Clean Air Act Outer...

  10. Simulation of TunneLadder traveling-wave tube cold-test characteristics: Implementation of the three-dimensional, electromagnetic circuit analysis code micro-SOS

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Wilson, Jeffrey D.

    1993-01-01

    The three-dimensional, electromagnetic circuit analysis code, Micro-SOS, can be used to reduce expensive time-consuming experimental 'cold-testing' of traveling-wave tube (TWT) circuits. The frequency-phase dispersion characteristics and beam interaction impedance of a TunneLadder traveling-wave tube slow-wave structure were simulated using the code. When reasonable dimensional adjustments are made, computer results agree closely with experimental data. Modifications to the circuit geometry that would make the TunneLadder TWT easier to fabricate for higher frequency operation are explored.

  11. Entropy-Based Bounds On Redundancies Of Huffman Codes

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.

    1992-01-01

    Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes often closer to 0 than to 1.
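    The bound discussed above is easy to check numerically. The sketch below (an illustration, not the report's code) builds an optimal Huffman code for a small source and computes its redundancy, the gap between expected codeword length and source entropy:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return optimal prefix-code lengths via Huffman's merging algorithm."""
    # Heap items: (probability, tiebreak id, symbol indices in the subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # every symbol in the merged subtree gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
lengths = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * log2(p) for p in probs)
redundancy = avg_len - entropy
# Redundancy of an optimal prefix code always lies in [0, 1) bits/symbol,
# and is often much closer to 0 than to 1, as the report notes.
print(lengths, round(avg_len, 3), round(entropy, 3), round(redundancy, 3))
# [1, 2, 3, 3] 1.9 1.846 0.054
```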

  12. 40 CFR Appendix A to Subpart A of... - Tables

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... phone number ✓ ✓ (6) FIPS code ✓ ✓ (7) Facility ID codes ✓ ✓ (8) Unit ID code ✓ ✓ (9) Process ID code... for Reporting on Emissions From Nonpoint Sources and Nonroad Mobile Sources, Where Required by 40 CFR... start date ✓ ✓ (3) Inventory end date ✓ ✓ (4) Contact name ✓ ✓ (5) Contact phone number ✓ ✓ (6) FIPS...

  13. SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMillan, D.B.

    1960-06-01

    A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)

  14. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  15. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  16. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  17. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...

  18. A Combinatorial H4 Tail Library to Explore the Histone Code

    PubMed Central

    Garske, Adam L.; Craciun, Gheorghe; Denu, John M.

    2008-01-01

    Histone modifications modulate chromatin structure and function. A posttranslational modification-randomized, combinatorial library based on the first twenty-one residues of histone H4 was designed for systematic examination of proteins that interpret a histone code. The 800-member library represented all permutations of most known modifications within the N-terminal tail of histone H4. To determine its utility in a protein-binding assay, the on-bead library was screened with an antibody directed against phosphoserine 1 of H4. Among the hits, 59/60 sequences were phosphorylated at S1, while 30/30 of those selected from the non-hits were unphosphorylated. A 512-member version of the library was then used to determine the binding specificity of the double tudor domain of hJMJD2A, a histone demethylase involved in transcriptional repression. Global linear least squares fitting of modifications from the identified peptides (40 hits and 34 non-hits) indicated that methylation of K20 was the primary determinant for binding, but that phosphorylation/acetylation on neighboring sites attenuated the interaction. To validate the on-bead screen, isothermal titration calorimetry was performed with thirteen H4 peptides. Dissociation constants ranged from 1 mM to 1 μM and corroborated the screening results. The general approach should be useful for probing the specificity of any histone-binding protein. PMID:18616348
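    The size of such a combinatorial library is the product of per-site modification state counts. The sketch below illustrates that arithmetic; the site/state table is an assumed example chosen to reproduce the 512-member version mentioned above, not the paper's exact design:

```python
from itertools import product

# Illustrative per-site modification states on the H4 N-terminal tail
# (an assumed table, not the published library design).
site_states = {
    "S1":  ["unmod", "phospho"],
    "R3":  ["unmod", "me1", "me2a", "me2s"],
    "K5":  ["unmod", "ac"],
    "K8":  ["unmod", "ac"],
    "K12": ["unmod", "ac"],
    "K16": ["unmod", "ac"],
    "K20": ["unmod", "me1", "me2", "me3"],
}

# Library size = product of per-site state counts: 2*4*2*2*2*2*4 = 512.
members = list(product(*site_states.values()))
print(len(members))  # 512

# A screening-style query: members carrying any K20 methylation state.
k20_index = list(site_states).index("K20")
k20_methylated = [m for m in members if m[k20_index] != "unmod"]
print(len(k20_methylated))  # 384
```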

  19. Swan: A tool for porting CUDA programs to OpenCL

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; De Fabritiis, G.

    2011-04-01

    The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide the best performance.
    Program summary
    Program title: Swan
    Catalogue identifier: AEIH_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Public License version 2
    No. of lines in distributed program, including test data, etc.: 17 736
    No. of bytes in distributed program, including test data, etc.: 131 177
    Distribution format: tar.gz
    Programming language: C
    Computer: PC
    Operating system: Linux
    RAM: 256 Mbytes
    Classification: 6.5
    External routines: NVIDIA CUDA, OpenCL
    Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion.
    Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time.
    Restrictions: No support for CUDA C++ features
    Running time: Nominal
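    The flavor of kernel-level rewriting such a translator performs can be shown with a toy keyword mapping. This is a simplified sketch, not Swan's actual rule set; a real translator must also handle memory-space qualifiers on kernel arguments, grid launch syntax, and host API calls:

```python
import re

# A simplified subset of CUDA-to-OpenCL substitutions (illustrative only;
# Swan's real translation rules are far more extensive).
CUDA_TO_OPENCL = [
    (r"\b__global__\b", "__kernel"),
    (r"\b__shared__\b", "__local"),
    (r"\b__syncthreads\(\)", "barrier(CLK_LOCAL_MEM_FENCE)"),
    (r"\bthreadIdx\.x\b", "get_local_id(0)"),
    (r"\bblockIdx\.x\b", "get_group_id(0)"),
    (r"\bblockDim\.x\b", "get_local_size(0)"),
]

def translate_kernel(cuda_src: str) -> str:
    """Apply token-level CUDA -> OpenCL substitutions to kernel source."""
    out = cuda_src
    for pattern, repl in CUDA_TO_OPENCL:
        out = re.sub(pattern, repl, out)
    return out

cuda = ("__global__ void scale(float *x, float a) {\n"
        "    int i = blockIdx.x * blockDim.x + threadIdx.x;\n"
        "    x[i] *= a;\n}")
print(translate_kernel(cuda))
```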

  20. Binding Sites Analyser (BiSA): Software for Genomic Binding Sites Archiving and Overlap Analysis

    PubMed Central

    Khushi, Matloob; Liddle, Christopher; Clarke, Christine L.; Graham, J. Dinny

    2014-01-01

    Genome-wide mapping of transcription factor binding and histone modification reveals complex patterns of interactions. Identifying overlaps in binding patterns by different factors is a major objective of genomic studies, but existing methods to archive large numbers of datasets in a personalised database lack sophistication and utility. Therefore we have developed transcription factor DNA binding site analyser software (BiSA), for archiving of binding regions and easy identification of overlap with or proximity to other regions of interest. Analysis results can be restricted by chromosome or base pair overlap between regions or maximum distance between binding peaks. BiSA is capable of reporting overlapping regions that share common base pairs; regions that are nearby; regions that are not overlapping; and average region sizes. BiSA can identify genes located near binding regions of interest, genomic features near a gene or locus of interest and statistical significance of overlapping regions can also be reported. Overlapping results can be visualized as Venn diagrams. A major strength of BiSA is that it is supported by a comprehensive database of publicly available transcription factor binding sites and histone modifications, which can be directly compared to user data. The documentation and source code are available on http://bisa.sourceforge.net PMID:24533055
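    The core classification BiSA reports (overlapping, nearby, and non-overlapping regions) reduces to simple interval arithmetic on (chromosome, start, end) triples. The function below is an illustrative sketch of that comparison, not BiSA's source code:

```python
def overlap_report(regions_a, regions_b, max_gap=0):
    """Classify regions in A against B as overlapping (sharing at least one
    base pair), nearby (within max_gap bp), or distinct. Regions are
    (chrom, start, end) tuples. Illustrative sketch, not BiSA's code."""
    overlapping, nearby, distinct = [], [], []
    for chrom_a, sa, ea in regions_a:
        status = "distinct"
        for chrom_b, sb, eb in regions_b:
            if chrom_a != chrom_b:
                continue
            if sa <= eb and sb <= ea:  # intervals share at least one base pair
                status = "overlap"
                break
            if max(sb - ea, sa - eb) <= max_gap:  # gap between the intervals
                status = "nearby"
        {"overlap": overlapping, "nearby": nearby,
         "distinct": distinct}[status].append((chrom_a, sa, ea))
    return overlapping, nearby, distinct

a = [("chr1", 100, 200), ("chr1", 500, 600), ("chr2", 50, 80)]
b = [("chr1", 150, 250), ("chr1", 610, 700)]
ov, near, dist = overlap_report(a, b, max_gap=20)
print(ov, near, dist)
```

    With these inputs the first region overlaps, the second lies 10 bp away (within the 20 bp window), and the chr2 region has no counterpart.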

  1. Distinct 3-O-Sulfated Heparan Sulfate Modification Patterns Are Required for kal-1−Dependent Neurite Branching in a Context-Dependent Manner in Caenorhabditis elegans

    PubMed Central

    Tecle, Eillen; Diaz-Balzac, Carlos A.; Bülow, Hannes E.

    2013-01-01

    Heparan sulfate (HS) is an unbranched glycosaminoglycan exhibiting substantial molecular diversity due to multiple, nonuniformly introduced modifications, including sulfations, epimerization, and acetylation. HS modifications serve specific and instructive roles in neuronal development, leading to the hypothesis of a HS code that regulates nervous system patterning. Although the in vivo roles of many of the HS modifications have been investigated, very little is known about the function of HS 3-O-sulfation in vivo. By examining patterning of the Caenorhabditis elegans nervous system in loss of function mutants of the two 3-O-sulfotransferases, hst-3.1 and hst-3.2, we found HS 3-O-sulfation to be largely dispensable for overall neural development. However, generation of stereotypical neurite branches in hermaphrodite-specific neurons required hst-3.1 and hst-3.2, as well as an extracellular cell adhesion molecule encoded by kal-1, the homolog of Kallmann Syndrome associated gene 1/anosmin-1. In contrast, kal-1−dependent neurite branching in AIY neurons required catalytic activity of hst-3.2 but not hst-3.1. The context-dependent requirement for hst-3.2 and hst-3.1 indicates that both enzymes generate distinct types of HS modification patterns in different cell types, which regulate kal-1 to promote neurite branching. We conclude that HS 3-O-sulfation does not play a general role in establishing the HS code in C. elegans but rather plays a specialized role in a context-dependent manner to establish defined aspects of neuronal circuits. PMID:23451335

  2. LDPC-based iterative joint source-channel decoding for JPEG2000.

    PubMed

    Pu, Lingling; Wu, Zhenyu; Bilgin, Ali; Marcellin, Michael W; Vasic, Bane

    2007-02-01

    A framework is proposed for iterative joint source-channel decoding of JPEG2000 codestreams. At the encoder, JPEG2000 is used to perform source coding with certain error-resilience (ER) modes, and LDPC codes are used to perform channel coding. During decoding, the source decoder uses the ER modes to identify corrupt sections of the codestream and provides this information to the channel decoder. Decoding is carried out jointly in an iterative fashion. Experimental results indicate that the proposed method requires fewer iterations and improves overall system performance.

  3. The Colossus of ubiquitylation – decrypting a cellular code

    PubMed Central

    Williamson, Adam; Werner, Achim; Rape, Michael

    2013-01-01

    Ubiquitylation is an essential posttranslational modification that can regulate the stability, activity, or localization of thousands of proteins. The reversible attachment of ubiquitin as well as interpretation of the ubiquitin signal depend on dynamic protein networks that are challenging to analyze. In this perspective, we discuss tools of the trade that have recently been developed to dissect mechanisms of ubiquitin-dependent signaling, thereby revealing the critical features of an important cellular code. PMID:23438855

  4. Epidemiologic studies of electric and magnetic fields and cancer: strategies for extending knowledge.

    PubMed Central

    Savitz, D A

    1993-01-01

    Epidemiologic research concerning electric and magnetic fields in relation to cancer has focused on the potential etiologic roles of residential exposure in childhood cancer and of occupational exposure in adult leukemia and brain cancer. Future residential studies must concentrate on exposure assessment that is enhanced by developing models of historical exposure, assessment of the relation between magnetic fields and wire codes, and consideration of alternate exposure indices. Study design issues deserving attention include possible biases in random digit dialing control selection, consideration of the temporal course of exposure and disease, and acquisition of the necessary information to assess the potential value of ecologic studies. Highest priorities are comprehensive evaluation of exposure patterns and sources and examination of the sociology and geography of residential wire codes. Future occupational studies should also concentrate on improved exposure assessment with increased attention to nonutility worker populations and development of historical exposure indicators that are superior to job titles alone. Potential carcinogens in the workplace that could act as confounders need to be more carefully examined. The temporal relation between exposure and disease and possible effect modification by other workplace agents should be incorporated into future studies. The most pressing need is for measurement of exposure patterns in a variety of worker populations and performance of traditional epidemiologic evaluations of cancer occurrence. The principal source of bias toward the null is nondifferential misclassification of exposure, with improvements expected to enhance any true etiologic association that is present. Biases away from the null might include biased control selection in residential studies and chemical carcinogens acting as confounders in occupational studies. PMID:8206046

  5. POTAMOS mass spectrometry calculator: computer aided mass spectrometry to the post-translational modifications of proteins. A focus on histones.

    PubMed

    Vlachopanos, A; Soupsana, E; Politou, A S; Papamokos, G V

    2014-12-01

    Mass spectrometry is a widely used technique for protein identification, and it has also become the method of choice for detecting and characterizing the post-translational modifications (PTMs) of proteins. Many software tools have been developed to deal with this complexity. In this paper we introduce a new, free and user-friendly online software tool, named POTAMOS Mass Spectrometry Calculator, which was developed in the open source application framework Ruby on Rails. It can provide calculated mass spectrometry data in a time-saving manner, independently of instrumentation. In this web application we have focused on the well known protein family of histones, whose PTMs are believed to play a crucial role in gene regulation, as suggested by the so-called "histone code" hypothesis. The PTMs implemented in this software are: methylations of arginines and lysines, acetylations of lysines, and phosphorylations of serines and threonines. The application is able to calculate the kind, the number and the combinations of the possible PTMs corresponding to a given peptide sequence and a given mass, along with the full set of the unique primary structures produced by the possible distributions along the amino acid sequence. It can also calculate the masses and charges of a fragmented histone variant, which carries predefined modifications already implemented. Additional functionality is provided by the calculation of the masses of fragments produced upon protein cleavage by the proteolytic enzymes that are most widely used in proteomics studies. Copyright © 2014 Elsevier Ltd. All rights reserved.
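    The central calculation, enumerating which PTM combinations can account for an observed mass shift, can be sketched as follows. The mass values are standard monoisotopic shifts, but the code is an illustration of the approach, not POTAMOS itself. It also shows why mass tolerance matters: trimethylation (3 x 14.01565 = 42.04695 Da) is nearly isobaric with acetylation (42.01057 Da):

```python
from itertools import combinations_with_replacement

# Standard monoisotopic mass shifts (Da) for the PTM classes POTAMOS covers.
PTM_MASS = {"methyl": 14.01565, "acetyl": 42.01057, "phospho": 79.96633}

def ptm_combinations(observed_shift, max_mods=6, tol=0.01):
    """Return PTM multisets whose summed mass matches an observed mass shift."""
    names = list(PTM_MASS)
    hits = []
    for n in range(1, max_mods + 1):
        for combo in combinations_with_replacement(names, n):
            if abs(sum(PTM_MASS[m] for m in combo) - observed_shift) <= tol:
                hits.append(combo)
    return hits

# At tight tolerance only acetylation matches a ~42.011 Da shift; at loose
# tolerance the near-isobaric trimethylation appears as well.
print(ptm_combinations(42.011, tol=0.005))  # [('acetyl',)]
print(ptm_combinations(42.011, tol=0.05))   # [('acetyl',), ('methyl', 'methyl', 'methyl')]
```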

  6. Users Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE)

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.; Berkowitz, Brian M.

    1990-01-01

    LEWICE is an ice accretion prediction code that applies a time-stepping procedure to calculate the shape of an ice accretion. The potential flow field is calculated in LEWICE using the Douglas Hess-Smith 2-D panel code (S24Y). This potential flow field is then used to calculate the trajectories of particles and the impingement points on the body. These calculations are performed to determine the distribution of liquid water impinging on the body, which then serves as input to the icing thermodynamic code. The icing thermodynamic model is based on the work of Messinger, but contains several major modifications and improvements. This model is used to calculate the ice growth rate at each point on the surface of the geometry. By specifying an icing time increment, the ice growth rate can be interpreted as an ice thickness which is added to the body, resulting in the generation of new coordinates. This procedure is repeated, beginning with the potential flow calculations, until the desired icing time is reached. The operation of LEWICE is illustrated through the use of five examples. These examples are representative of the types of applications expected for LEWICE. All input and output is discussed, along with many of the diagnostic messages contained in the code. Several error conditions that may occur in the code for certain icing conditions are identified, and a course of action is recommended. LEWICE has been used to calculate a variety of ice shapes, but should still be considered a research code. The code should be exercised further to identify any shortcomings and inadequacies. Any modifications identified as a result of these cases, or of additional experimental results, should be incorporated into the model. Using it as a test bed for improvements to the ice accretion model is one important application of LEWICE.
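    The time-stepping procedure described above has a simple loop structure: solve the flow, compute impingement, evaluate the thermodynamic growth rate, add ice thickness, and repeat. The sketch below shows that skeleton with placeholder callables; the names and the toy stand-ins are illustrative, not LEWICE's actual FORTRAN routines:

```python
def ice_accretion(geometry, icing_time, dt, flow_solver, trajectory_model,
                  thermo_model):
    """Time-stepping skeleton of an ice accretion calculation.
    The callables are placeholders, not LEWICE's actual routines."""
    t = 0.0
    while t < icing_time:
        flow = flow_solver(geometry)                       # potential flow (panel method)
        impingement = trajectory_model(flow, geometry)     # droplet trajectories -> water flux
        growth_rate = thermo_model(impingement, geometry)  # Messinger-type energy balance
        # Interpret growth rate over the time increment as added ice thickness,
        # generating new surface coordinates for the next pass.
        geometry = [y + r * dt for y, r in zip(geometry, growth_rate)]
        t += dt
    return geometry

# Toy stand-ins: uniform growth rate of 0.1 over a flat five-point surface.
final = ice_accretion([0.0] * 5, icing_time=1.0, dt=0.25,
                      flow_solver=lambda g: None,
                      trajectory_model=lambda f, g: None,
                      thermo_model=lambda i, g: [0.1] * len(g))
print(final)  # each point accretes ~0.1 of ice (rate 0.1 over icing_time 1.0)
```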

  7. Poet Fostoria Approval

    EPA Pesticide Factsheets

    This August 9, 2016 letter from EPA approves, with modifications, the petition from Poet Biorefining-Fostoria, LLC, regarding non-grandfathered ethanol produced through a dry mill process, qualifying under the Clean Air Act for renewable fuel (D-code 6) RINs.

  8. Poet Leipsic Approval

    EPA Pesticide Factsheets

    This August 9, 2016 letter from EPA approves, with modifications, the petition from Poet Biorefining-Leipsic, LLC, regarding non-grandfathered ethanol produced through a dry mill process, qualifying under the Clean Air Act for renewable fuel (D-code 6) RINs.

  9. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...

  10. Calibration and comparison of the NASA Lewis free-piston Stirling engine model predictions with RE-1000 test data

    NASA Technical Reports Server (NTRS)

    Geng, Steven M.

    1987-01-01

    A free-piston Stirling engine performance code is being upgraded and validated at the NASA Lewis Research Center under an interagency agreement between the Department of Energy's Oak Ridge National Laboratory and NASA Lewis. Many modifications were made to the free-piston code in an attempt to decrease the calibration effort. A procedure was developed that made the code calibration process more systematic. Engine-specific calibration parameters are often used to bring predictions and experimental data into better agreement. The code was calibrated to a matrix of six experimental data points. Predictions of the calibrated free-piston code are compared with RE-1000 free-piston Stirling engine sensitivity test data taken at NASA Lewis. Reasonable agreement was obtained between the code prediction and the experimental data over a wide range of engine operating conditions.
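    A calibration procedure of this kind can be made systematic with a simple parameter search against a matrix of experimental points. The sketch below is a toy grid search, not the NASA Lewis code; the engine model, parameter names, and data are invented for illustration:

```python
from itertools import product

def calibrate(model, data, param_grid):
    """Grid search for engine-specific calibration parameters minimizing
    total squared error against experimental points (illustrative sketch)."""
    best_params, best_err = None, float("inf")
    for values in product(*param_grid.values()):
        p = dict(zip(param_grid, values))
        err = sum((model(x, p) - y) ** 2 for x, y in data)
        if err < best_err:
            best_params, best_err = p, err
    return best_params, best_err

# Toy engine model: predicted output = eff * x - loss, with six "experimental"
# points standing in for the matrix of test conditions.
data = [(x, 0.8 * x - 1.0) for x in range(1, 7)]
grid = {"eff": [0.7, 0.8, 0.9], "loss": [0.5, 1.0, 1.5]}
params, err = calibrate(lambda x, p: p["eff"] * x - p["loss"], data, grid)
print(params, err)  # {'eff': 0.8, 'loss': 1.0} 0.0
```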

  11. Calibration and comparison of the NASA Lewis free-piston Stirling engine model predictions with RE-1000 test data

    NASA Technical Reports Server (NTRS)

    Geng, Steven M.

    1987-01-01

    A free-piston Stirling engine performance code is being upgraded and validated at the NASA Lewis Research Center under an interagency agreement between the Department of Energy's Oak Ridge National Laboratory and NASA Lewis. Many modifications were made to the free-piston code in an attempt to decrease the calibration effort. A procedure was developed that made the code calibration process more systematic. Engine-specific calibration parameters are often used to bring predictions and experimental data into better agreement. The code was calibrated to a matrix of six experimental data points. Predictions of the calibrated free-piston code are compared with RE-1000 free-piston Stirling engine sensitivity test data taken at NASA Lewis. Reasonable agreement was obtained between the code predictions and the experimental data over a wide range of engine operating conditions.

  12. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses Monte-Carlo ray-tracing methodology.

  13. A modification of the Regional Nutrient Management model (ReNuMa) to identify long-term changes in riverine nitrogen sources

    NASA Astrophysics Data System (ADS)

    Hu, Minpeng; Liu, Yanmei; Wang, Jiahui; Dahlgren, Randy A.; Chen, Dingjiang

    2018-06-01

    Source apportionment is critical for guiding development of efficient watershed nitrogen (N) pollution control measures. The ReNuMa (Regional Nutrient Management) model, a semi-empirical, semi-process-oriented model with modest data requirements, has been widely used for riverine N source apportionment. However, the ReNuMa model contains limitations for addressing long-term N dynamics by ignoring temporal changes in atmospheric N deposition rates and N-leaching lag effects. This work modified the ReNuMa model by revising the source code to allow yearly changes in atmospheric N deposition and incorporation of N-leaching lag effects into N transport processes. The appropriate N-leaching lag time was determined from cross-correlation analysis between annual watershed individual N source inputs and riverine N export. Accuracy of the modified ReNuMa model was demonstrated through analysis of a 31-year water quality record (1980-2010) from the Yongan watershed in eastern China. The revisions considerably improved the accuracy (Nash-Sutcliff coefficient increased by ∼0.2) of the modified ReNuMa model for predicting riverine N loads. The modified model explicitly identified annual and seasonal changes in contributions of various N sources (i.e., point vs. nonpoint source, surface runoff vs. groundwater) to riverine N loads as well as the fate of watershed anthropogenic N inputs. Model results were consistent with previously modeled or observed lag time length as well as changes in riverine chloride and nitrate concentrations during the low-flow regime and available N levels in agricultural soils of this watershed. The modified ReNuMa model is applicable for addressing long-term changes in riverine N sources, providing decision-makers with critical information for guiding watershed N pollution control strategies.
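    The lag-selection step, cross-correlating annual watershed N inputs with riverine N export at candidate lags, can be sketched as follows. This is an illustration of the idea on synthetic data, not the authors' implementation:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def best_lag(n_inputs, riverine_export, max_lag=10):
    """Choose the N-leaching lag (years) maximizing the correlation between
    annual watershed N inputs and riverine N export (illustrative sketch)."""
    scores = {lag: pearson(n_inputs[:len(n_inputs) - lag] or n_inputs,
                           riverine_export[lag:])
              for lag in range(max_lag + 1)}
    return max(scores, key=scores.get), scores

# Synthetic 20-year record in which export echoes inputs after a 3-year lag.
inputs = [float(i % 7) for i in range(20)]
export = [0.0] * 3 + inputs[:-3]
lag, scores = best_lag(inputs, export, max_lag=6)
print(lag)  # 3
```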

  14. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    NASA Astrophysics Data System (ADS)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

    The neutron fluence rate of a low-intensity Pu-Be source is measured by neutron activation analysis (NAA) of 197Au foils. The neutron fluence rate distribution versus energy is also calculated using the MCNP-4B code based on the ENDF/B-V library. This combined theoretical and experimental work builds local experience with the code and establishes its reliability for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated, and the relative neutron fluence rate versus energy is calculated with MCNP-4B. The fast and thermal neutron fluence rates obtained from the NAA measurements and from the MCNP calculations are compared.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaurov, Alexander A., E-mail: kaurov@uchicago.edu

    We explore a time-dependent energy dissipation of the energetic electrons in the inhomogeneous intergalactic medium (IGM) during the epoch of cosmic reionization. In addition to the atomic processes, we take into account the inverse Compton (IC) scattering of the electrons on the cosmic microwave background photons, which is the dominant channel of energy loss for electrons with energies above a few MeV. We show that: (1) the effect on the IGM has both local (atomic processes) and non-local (IC radiation) components; (2) the energy distribution between hydrogen and helium ionizations depends on the initial energy of an electron; (3) the local baryon overdensity significantly affects the fractions of energy distributed in each channel; and (4) the relativistic effect of the atomic cross-section becomes important during the epoch of cosmic reionization. We release our code as open source for further modification by the community.

  16. Multi Station Frequency Response and Polarization of ELF/VLF Signals Generated via Ionospheric Modification

    NASA Astrophysics Data System (ADS)

    Maxworth, Ashanthi; Golkowski, Mark; University of Colorado Denver Team

    2013-10-01

    ELF/VLF wave generation via HF modulated ionospheric heating has been practiced for many years as a unique way to generate waves in the ELF/VLF band (3 Hz - 30 kHz). This paper presents experimental results and associated theoretical modeling from work performed at the High Frequency Active Auroral Research Program (HAARP) facility in Alaska, USA. An experiment was designed to investigate the modulation frequency dependence of the generated ELF/VLF signal amplitudes and polarization at multiple sites at distances of 37 km, 50 km and 99 km from the facility. While no difference is observed for X mode versus O mode modulation of the heating wave, it is found that ELF/VLF amplitude and polarization as a function of modulated ELF/VLF frequency is different for each site. An ionospheric heating code is used to determine the primary current sources leading to the observations.

  17. UV-POSIT: Web-Based Tools for Rapid and Facile Structural Interpretation of Ultraviolet Photodissociation (UVPD) Mass Spectra

    NASA Astrophysics Data System (ADS)

    Rosenberg, Jake; Parker, W. Ryan; Cammarata, Michael B.; Brodbelt, Jennifer S.

    2018-04-01

    UV-POSIT (Ultraviolet Photodissociation Online Structure Interrogation Tools) is a suite of web-based tools designed to facilitate the rapid interpretation of data from native mass spectrometry experiments making use of 193 nm ultraviolet photodissociation (UVPD). The suite includes four separate utilities which assist in the calculation of fragment ion abundances as a function of backbone cleavage sites and sequence position; the localization of charge sites in intact proteins; the calculation of hydrogen elimination propensity for a-type fragment ions; and mass-offset searching of UVPD spectra to identify unknown modifications and assess false positive fragment identifications. UV-POSIT is implemented as a Python/Flask web application hosted at http://uv-posit.cm.utexas.edu. UV-POSIT is available under the MIT license, and the source code is available at https://github.com/jarosenb/UV_POSIT.
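    The mass-offset search idea, matching each observed fragment mass to a theoretical fragment either directly or shifted by a candidate modification mass, can be sketched as below. The offset table, function name, and example masses are illustrative assumptions, not UV-POSIT's API:

```python
def mass_offset_search(observed, theoretical, tol=0.01):
    """Match observed fragment masses to theoretical fragments, directly or
    shifted by a candidate modification mass; unmatched masses flag possible
    false-positive identifications. Illustrative sketch, not UV-POSIT code."""
    offsets = {"phospho": 79.96633, "oxidation": 15.99491}  # assumed candidates
    assignments = []
    for m in observed:
        match = None
        for t in theoretical:
            if abs(m - t) <= tol:
                match = (t, None)            # direct, unmodified match
            else:
                for name, off in offsets.items():
                    if abs(m - (t + off)) <= tol:
                        match = (t, name)    # match after applying mass offset
                        break
            if match:
                break
        assignments.append((m, match))
    return assignments

theo = [500.25, 720.40]                      # hypothetical theoretical fragments
obs = [500.251, 800.366, 999.0]              # hypothetical observed masses
results = mass_offset_search(obs, theo)
print(results)
```

    Here the first mass matches directly, the second matches a theoretical fragment plus a phospho offset, and the third matches nothing.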

  18. UV-POSIT: Web-Based Tools for Rapid and Facile Structural Interpretation of Ultraviolet Photodissociation (UVPD) Mass Spectra.

    PubMed

    Rosenberg, Jake; Parker, W Ryan; Cammarata, Michael B; Brodbelt, Jennifer S

    2018-06-01

    UV-POSIT (Ultraviolet Photodissociation Online Structure Interrogation Tools) is a suite of web-based tools designed to facilitate the rapid interpretation of data from native mass spectrometry experiments making use of 193 nm ultraviolet photodissociation (UVPD). The suite includes four separate utilities which assist in the calculation of fragment ion abundances as a function of backbone cleavage sites and sequence position; the localization of charge sites in intact proteins; the calculation of hydrogen elimination propensity for a-type fragment ions; and mass-offset searching of UVPD spectra to identify unknown modifications and assess false positive fragment identifications. UV-POSIT is implemented as a Python/Flask web application hosted at http://uv-posit.cm.utexas.edu. UV-POSIT is available under the MIT license, and the source code is available at https://github.com/jarosenb/UV_POSIT.

  19. Two-terminal video coding.

    PubMed

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  20. Global alterations of the transcriptional landscape during yeast growth and development in the absence of Ume6-dependent chromatin modification

    PubMed Central

    Lardenois, Aurélie; Becker, Emmanuelle; Walther, Thomas; Law, Michael J.; Xie, Bingning; Demougin, Philippe; Strich, Randy

    2017-01-01

    Chromatin modification enzymes are important regulators of gene expression and some are evolutionarily conserved from yeast to human. Saccharomyces cerevisiae is a major model organism for genome-wide studies that aim at the identification of target genes under the control of conserved epigenetic regulators. Ume6 interacts with the upstream repressor site 1 (URS1) and represses transcription by recruiting both the conserved histone deacetylase Rpd3 (through the co-repressor Sin3) and the chromatin-remodeling factor Isw2. Cells lacking Ume6 are defective in growth, stress response, and meiotic development. RNA profiling studies and in vivo protein-DNA binding assays identified mRNAs or transcript isoforms that are directly repressed by Ume6 in mitosis. However, a comprehensive understanding of the transcriptional alterations, which underlie the complex ume6Δ mutant phenotype during fermentation, respiration, or sporulation, is lacking. We report the protein-coding transcriptome of diploid MATa/α wild-type and ume6/ume6 mutant strains cultured in rich media with glucose or acetate as a carbon source, or sporulation-inducing medium. We distinguished direct from indirect effects on mRNA levels by combining GeneChip data with URS1 motif predictions and published high-throughput in vivo Ume6-DNA binding data. To gain insight into the molecular interactions between successive waves of Ume6-dependent meiotic genes, we integrated expression data with information on protein networks. Our work identifies novel Ume6 repressed genes during growth and development and reveals a strong effect of the carbon source on the derepression pattern of transcripts in growing and developmentally arrested ume6/ume6 mutant cells. Since yeast is a useful model organism for chromatin-mediated effects on gene expression, our results provide a rich source for further genetic and molecular biological work on the regulation of cell growth and cell differentiation in eukaryotes. PMID:25957495

  1. Global alterations of the transcriptional landscape during yeast growth and development in the absence of Ume6-dependent chromatin modification.

    PubMed

    Lardenois, Aurélie; Becker, Emmanuelle; Walther, Thomas; Law, Michael J; Xie, Bingning; Demougin, Philippe; Strich, Randy; Primig, Michael

    2015-10-01

    Chromatin modification enzymes are important regulators of gene expression and some are evolutionarily conserved from yeast to human. Saccharomyces cerevisiae is a major model organism for genome-wide studies that aim at the identification of target genes under the control of conserved epigenetic regulators. Ume6 interacts with the upstream repressor site 1 (URS1) and represses transcription by recruiting both the conserved histone deacetylase Rpd3 (through the co-repressor Sin3) and the chromatin-remodeling factor Isw2. Cells lacking Ume6 are defective in growth, stress response, and meiotic development. RNA profiling studies and in vivo protein-DNA binding assays identified mRNAs or transcript isoforms that are directly repressed by Ume6 in mitosis. However, a comprehensive understanding of the transcriptional alterations, which underlie the complex ume6Δ mutant phenotype during fermentation, respiration, or sporulation, is lacking. We report the protein-coding transcriptome of diploid MATa/α wild-type and ume6/ume6 mutant strains cultured in rich media with glucose or acetate as a carbon source, or sporulation-inducing medium. We distinguished direct from indirect effects on mRNA levels by combining GeneChip data with URS1 motif predictions and published high-throughput in vivo Ume6-DNA binding data. To gain insight into the molecular interactions between successive waves of Ume6-dependent meiotic genes, we integrated expression data with information on protein networks. Our work identifies novel Ume6 repressed genes during growth and development and reveals a strong effect of the carbon source on the derepression pattern of transcripts in growing and developmentally arrested ume6/ume6 mutant cells. Since yeast is a useful model organism for chromatin-mediated effects on gene expression, our results provide a rich source for further genetic and molecular biological work on the regulation of cell growth and cell differentiation in eukaryotes.

  2. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.
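
    The gL(r) values compared in this record are TG-43 radial dose functions. Under the TG-43 line-source formalism, gL(r) is the transverse-axis dose-rate ratio corrected by the line-source geometry function G_L(r, theta) = beta / (L r sin theta); a minimal sketch with placeholder dose-rate inputs rather than Monte Carlo output:

```python
import math

# TG-43 radial dose function g_L(r) for a line source of active length L
# (cm), evaluated on the transverse axis (theta = pi/2). The dose-rate
# arguments are placeholders, not output of MCNP4C/MCNP5/MCNPX.

def geometry_factor_line(r, L, theta=math.pi / 2):
    """Line-source geometry function G_L(r, theta) = beta / (L r sin(theta)),
    with beta the angle subtended by the source at the point (r, theta)."""
    beta = (math.atan((r * math.cos(theta) + L / 2) / (r * math.sin(theta)))
            - math.atan((r * math.cos(theta) - L / 2) / (r * math.sin(theta))))
    return beta / (L * r * math.sin(theta))

def g_L(r, dose_r, dose_r0, L, r0=1.0):
    """g_L(r) = [D(r)/D(r0)] * [G_L(r0)/G_L(r)], normalized so g_L(r0) = 1."""
    return (dose_r / dose_r0) * (geometry_factor_line(r0, L)
                                 / geometry_factor_line(r, L))
```

    By construction g_L equals 1 at the reference distance r0 = 1 cm, so inter-code discrepancies of the kind reported above show up directly as ratios of g_L values at larger r.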

  3. VINE-A NUMERICAL CODE FOR SIMULATING ASTROPHYSICAL SYSTEMS USING PARTICLES. II. IMPLEMENTATION AND PERFORMANCE CHARACTERISTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Andrew F.; Wetzstein, M.; Naab, T.

    2009-10-01

    We continue our presentation of VINE. In this paper, we begin with a description of relevant architectural properties of the serial and shared memory parallel computers on which VINE is intended to run, and describe their influences on the design of the code itself. We continue with a detailed description of a number of optimizations made to the layout of the particle data in memory and to our implementation of a binary tree used to access that data for use in gravitational force calculations and searches for smoothed particle hydrodynamics (SPH) neighbor particles. We describe the modifications to the code necessary to obtain forces efficiently from special purpose 'GRAPE' hardware, the interfaces required to allow transparent substitution of those forces in the code instead of those obtained from the tree, and the modifications necessary to use both tree and GRAPE together as a fused GRAPE/tree combination. We conclude with an extensive series of performance tests, which demonstrate that the code can be run efficiently and without modification in serial on small workstations or in parallel using the OpenMP compiler directives on large-scale, shared memory parallel machines. We analyze the effects of the code optimizations and estimate that they improve its overall performance by more than an order of magnitude over that obtained by many other tree codes. Scaled parallel performance of the gravity and SPH calculations, together the most costly components of most simulations, is nearly linear up to at least 120 processors on moderate sized test problems using the Origin 3000 architecture, and to the maximum machine sizes available to us on several other architectures. At similar accuracy, performance of VINE in GRAPE-tree mode is approximately a factor of 2 slower than in host-only mode. Further optimizations of the GRAPE/host communications could improve the speed by as much as a factor of 3, but have not yet been implemented in VINE. Finally, we find that although parallel performance on small problems may reach a plateau beyond which more processors bring no additional speedup, performance never decreases, a factor important for running large simulations on many processors with individual time steps, where only a small fraction of the total particles require updates at any given moment.

  4. The histone codes for meiosis.

    PubMed

    Wang, Lina; Xu, Zhiliang; Khawar, Muhammad Babar; Liu, Chao; Li, Wei

    2017-09-01

    Meiosis is a specialized process that produces haploid gametes from diploid cells by a single round of DNA replication followed by two successive cell divisions. It contains many special events, such as programmed DNA double-strand break (DSB) formation, homologous recombination, crossover formation and resolution. These events are associated with dynamically regulated chromosomal structures; the accompanying transcriptional regulation and chromatin remodeling are modulated mainly by histone modifications, termed 'histone codes'. The purpose of this review is to summarize the histone codes that are required for meiosis during spermatogenesis and oogenesis, involving meiosis resumption, meiotic asymmetric division and other cellular processes. We not only systematically review the functional roles of histone codes in meiosis but also discuss future trends and perspectives in this field. © 2017 Society for Reproduction and Fertility.

  5. AQUIFEM-SALT; a finite-element model for aquifers containing a seawater interface

    USGS Publications Warehouse

    Voss, C.I.

    1984-01-01

    Described are modifications to AQUIFEM, a finite element areal ground-water flow model for aquifer evaluation. The modified model, AQUIFEM-SALT, simulates an aquifer containing a freshwater body that freely floats on seawater. Parts of the freshwater lens may be confined above and below by less permeable units. Theory, code modifications, and model verification are discussed. A modified input data list is included. This report is intended as a companion to the original AQUIFEM documentation. (USGS)
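
    To first order, the freely floating freshwater lens that AQUIFEM-SALT simulates follows the classical Ghyben-Herzberg relation, which the abstract does not spell out but which makes the sharp-interface physics concrete: with typical densities the ratio is about 40, so each metre of freshwater head above sea level depresses the interface roughly 40 m below it.

```python
# Ghyben-Herzberg estimate for a freshwater lens floating on seawater.
# This classical sharp-interface approximation is not stated in the
# abstract; it is included only to make the floating-lens physics concrete.

RHO_FRESH = 1000.0   # freshwater density, kg/m^3
RHO_SALT = 1025.0    # seawater density, kg/m^3

def interface_depth(head_above_msl):
    """Depth (m) of the fresh/seawater interface below mean sea level for
    a freshwater head (m) above mean sea level: z = rho_f/(rho_s - rho_f)*h."""
    return RHO_FRESH / (RHO_SALT - RHO_FRESH) * head_above_msl
```

    With these densities the ratio is exactly 40, so a 0.5 m water-table head implies an interface about 20 m below sea level.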

  6. Target signature modeling and bistatic scattering measurement studies

    NASA Technical Reports Server (NTRS)

    Burnside, W. D.; Lee, T. H.; Rojas, R.; Marhefka, R. J.; Bensman, D.

    1989-01-01

    Four areas of study are summarized: bistatic scattering measurements studies for a compact range; target signature modeling for test and evaluation hardware in the loop situation; aircraft code modification study; and SATCOM antenna studies on aircraft.

  7. NASA Lewis Stirling SPRE testing and analysis with reduced number of cooler tubes

    NASA Technical Reports Server (NTRS)

    Wong, Wayne A.; Cairelli, James E.; Swec, Diane M.; Doeberling, Thomas J.; Lakatos, Thomas F.; Madi, Frank J.

    1992-01-01

    Free-piston Stirling power converters are candidates for high capacity space power applications. The Space Power Research Engine (SPRE), a free-piston Stirling engine coupled with a linear alternator, is being tested at the NASA Lewis Research Center in support of the Civil Space Technology Initiative. The SPRE is used as a test bed for evaluating converter modifications which have the potential to improve the converter performance and for validating computer code predictions. Reducing the number of cooler tubes on the SPRE has been identified as a modification with the potential to significantly improve power and efficiency. Experimental tests designed to investigate the effects of reducing the number of cooler tubes on converter power, efficiency and dynamics are described. Presented are test results from the converter operating with a reduced number of cooler tubes and comparisons between this data and both baseline test data and computer code predictions.

  8. Conical Euler solution for a highly-swept delta wing undergoing wing-rock motion

    NASA Technical Reports Server (NTRS)

    Lee, Elizabeth M.; Batina, John T.

    1990-01-01

    Modifications to an unsteady conical Euler code for the free-to-roll analysis of highly-swept delta wings are described. The modifications involve the addition of the rolling rigid-body equation of motion for its simultaneous time-integration with the governing flow equations. The flow solver utilized in the Euler code includes a multistage Runge-Kutta time-stepping scheme which uses a finite-volume spatial discretization on an unstructured mesh made up of triangles. Steady and unsteady results are presented for a 75 deg swept delta wing at a freestream Mach number of 1.2 and an angle of attack of 30 deg. The unsteady results consist of forced harmonic and free-to-roll calculations. The free-to-roll case exhibits a wing rock response produced by unsteady aerodynamics consistent with the aerodynamics of the forced harmonic results. Similarities are shown with a wing-rock time history from a low-speed wind tunnel test.
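
    The added rolling rigid-body equation, I * phi'' = M_aero(phi, phi'), is time-integrated simultaneously with the flow equations. The sketch below couples such an equation to a toy aerodynamic moment (a Van der Pol-like stand-in, not the conical Euler solution) to show the simultaneous-integration structure and a growing but bounded roll oscillation reminiscent of wing rock.

```python
# Simultaneous integration of the roll equation I * phi'' = M(phi, phi')
# with a toy moment model; the real code evaluates M from the unsteady
# conical Euler solution, which is far beyond this sketch.

I_ROLL = 1.0  # roll moment of inertia (arbitrary units)

def toy_moment(phi, phi_dot):
    """Restoring term plus amplitude-dependent damping: a Van der Pol-like
    stand-in that self-excites small motions and bounds large ones."""
    return -4.0 * phi + (0.2 - phi * phi) * phi_dot

def step(phi, phi_dot, dt):
    """One midpoint (RK2) step of the coupled rigid-body dynamics."""
    acc = toy_moment(phi, phi_dot) / I_ROLL
    phi_mid = phi + 0.5 * dt * phi_dot
    dot_mid = phi_dot + 0.5 * dt * acc
    acc_mid = toy_moment(phi_mid, dot_mid) / I_ROLL
    return phi + dt * dot_mid, phi_dot + dt * acc_mid

phi, phi_dot = 0.05, 0.0   # small initial roll disturbance (rad)
for _ in range(2000):      # integrate to t = 20
    phi, phi_dot = step(phi, phi_dot, dt=0.01)
amplitude_sq = 4.0 * phi * phi + phi_dot * phi_dot
```

    The amplitude measure grows from its small initial value toward a limit cycle, the qualitative signature of the free-to-roll wing rock response described in the abstract.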

  9. Using the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.

    2013-01-01

    The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.

  10. McSKY: A hybrid Monte-Carlo line-beam code for shielded gamma skyshine calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shultis, J.K.; Faw, R.E.; Stedry, M.H.

    1994-07-01

    McSKY evaluates skyshine dose from an isotropic, monoenergetic, point photon source collimated into either a vertical cone or a vertical structure with an N-sided polygon cross section. The code assumes an overhead shield of two materials, though the user can specify zero shield thickness for an unshielded calculation. The code uses a Monte-Carlo algorithm to evaluate transport through source shields and an integral line-beam source method to describe photon transport through the atmosphere. The source energy must be between 0.02 and 100 MeV. For heavily shielded sources with energies above 20 MeV, McSKY results must be used cautiously, especially at detector locations near the source.

  11. Source term evaluation for combustion modeling

    NASA Technical Reports Server (NTRS)

    Sussman, Myles A.

    1993-01-01

    A modification is developed for application to the source terms used in combustion modeling. The modification accounts for the error of the finite difference scheme in regions where chain-branching chemical reactions produce exponential growth of species densities. The modification is first applied to a one-dimensional scalar model problem. It is then generalized to multiple chemical species, and used in quasi-one-dimensional computations of shock-induced combustion in a channel. Grid refinement studies demonstrate the improved accuracy of the method using this modification. The algorithm is applied in two spatial dimensions and used in simulations of steady and unsteady shock-induced combustion. Comparisons with ballistic range experiments give confidence in the numerical technique and the 9-species hydrogen-air chemistry model.
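
    As a generic illustration of the kind of discretization error at stake (not the paper's actual formulation): for a linear chain-branching model dy/dt = k*y, a plain explicit source term k*y badly underestimates growth when k*dt is large, while replacing it with the exact per-step increment removes the error for this model entirely.

```python
import math

# Generic illustration (not the paper's actual modification): for linear
# chain-branching growth dy/dt = k*y, an explicit finite-difference source
# term k*y accumulates large error when k*dt is big, while the exact
# per-step increment y*(exp(k*dt) - 1) reproduces the growth exactly.

def step_explicit(y, k, dt):
    return y + dt * (k * y)                   # plain explicit source term

def step_exact_increment(y, k, dt):
    return y + y * (math.exp(k * dt) - 1.0)   # exact for linear growth

k, dt, steps = 50.0, 0.02, 10                 # stiff regime: k*dt = 1
y_eul = y_exp = 1.0
for _ in range(steps):
    y_eul = step_explicit(y_eul, k, dt)
    y_exp = step_exact_increment(y_exp, k, dt)

exact = math.exp(k * dt * steps)              # true solution, e^10
```

    Here the explicit update yields 2^10 (about 1.0e3) against the true e^10 (about 2.2e4), showing why the source term must be corrected where chain branching drives exponential species growth.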

  12. Intricate Effects of α-Amino and Lysine Modifications on Arginine Methylation of the N-Terminal Tail of Histone H4.

    PubMed

    Fulton, Melody D; Zhang, Jing; He, Maomao; Ho, Meng-Chiao; Zheng, Y George

    2017-07-18

    Chemical modifications of the DNA and nucleosomal histones tightly control the gene transcription program in eukaryotic cells. The "histone code" hypothesis proposes that the frequency, combination, and location of post-translational modifications (PTMs) of the core histones compose a complex network of epigenetic regulation. Currently, there are at least 23 different types and >450 histone PTMs that have been discovered, and the PTMs of lysine and arginine residues account for a crucial part of the histone code. Although significant progress has been achieved in recent years, the molecular basis for the histone code is far from being fully understood. In this study, we investigated how naturally occurring N-terminal acetylation and PTMs of histone H4 lysine-5 (H4K5) affect arginine-3 methylation catalyzed by both type I and type II PRMTs at the biochemical level. Our studies found that acylations of H4K5 resulted in decreased levels of arginine methylation by PRMT1, PRMT3, and PRMT8. In contrast, PRMT5 exhibits an increased rate of arginine methylation upon H4K5 acetylation, propionylation, and crotonylation, but not upon H4K5 methylation, butyrylation, or 2-hydroxyisobutyrylation. Methylation of H4K5 did not affect arginine methylation by PRMT1 or PRMT5. There was a small increase in the rate of arginine methylation by PRMT8. Strikingly, a marked increase in the rate of arginine methylation was observed for PRMT3. Finally, N-terminal acetylation reduced the rate of arginine methylation by PRMT3 but had little influence on PRMT1, -5, and -8 activity. These results together highlight the underlying mechanistic differences in substrate recognition among different PRMTs and pave the way for the elucidation of the complex interplay of histone modifications.

  13. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
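
    For QC-LDPC codes built from an exponent matrix E with circulant size L, a length-4 cycle exists exactly when E[a][c] - E[a][d] + E[b][d] - E[b][c] = 0 (mod L) for some row pair (a, b) and column pair (c, d). A direct checker, with toy matrices rather than the paper's jointly designed codes:

```python
from itertools import combinations

# Girth-4 test for a QC-LDPC exponent matrix E with circulant size L:
# a length-4 cycle exists iff E[a][c] - E[a][d] + E[b][d] - E[b][c] == 0
# (mod L) for some rows a < b and columns c < d. The matrices below are
# toy examples, not the jointly designed codes from the paper.

def has_girth4(E, L):
    rows, cols = len(E), len(E[0])
    for a, b in combinations(range(rows), 2):
        for c, d in combinations(range(cols), 2):
            if (E[a][c] - E[a][d] + E[b][d] - E[b][c]) % L == 0:
                return True
    return False

E_bad = [[0, 0], [0, 0]]    # all-zero exponents always close a 4-cycle
E_good = [[0, 0], [0, 1]]   # 0 - 0 + 1 - 0 = 1 != 0 (mod 5): girth > 4
```

    In a joint source-relay design, the same condition is applied to the equivalent parity-check matrix of the combined code, which is why cycles of both types can be cancelled at once.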

  14. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
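
    Purely as a hypothetical illustration of the idea of formally linking a code block to a model term, the Python sketch below uses a decorator to register a symbolic cost for an annotated block and to record measured runtimes next to the predicted cost. Palm's real annotations live in source comments and are handled by its own compiler; nothing here is Palm's actual syntax.

```python
import time
from functools import wraps

# Hypothetical sketch, in the spirit of Palm, of linking a block of code
# to a model term: a decorator registers a symbolic cost for the block
# and records measured runtimes next to the predicted cost. None of this
# is Palm's actual annotation syntax.

MODEL = {}

def model_block(name, cost):
    """Annotate a function-sized block with a cost model of its inputs."""
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            out = fn(*args, **kwargs)
            elapsed = time.perf_counter() - start
            MODEL.setdefault(name, []).append((cost(*args, **kwargs), elapsed))
            return out
        return wrapper
    return deco

@model_block("triad", cost=lambda n: n)   # modeled as O(n) work
def triad(n):
    a = [0.0] * n
    for i in range(n):
        a[i] = 2.0 * i + 1.0
    return a

triad(1000)   # MODEL now holds (predicted cost, measured seconds) pairs
```

    Comparing predicted cost against measurement per block is one way such a link supports the validation step the abstract mentions.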

  15. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

    This note presents the chromaticity calculations and code comparison results for the (x-ray lithography source) XLS (Chasman Green, XUV Cosy lattice) and (2 magnet 4T) SXLS lattices, with the standard beam optic codes, including programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is a part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.

  16. The Astrophysics Source Code Library: Where Do We Go from Here?

    NASA Astrophysics Data System (ADS)

    Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J.

    2014-05-01

    The Astrophysics Source Code Library (ASCL), started in 1999, has in the past three years grown from a repository for 40 codes to a registry of over 700 codes that are now indexed by ADS. What comes next? We examine the future of the ASCL, the challenges facing it, the rationale behind its practices, and the need to balance what we might do with what we have the resources to accomplish.

  17. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from combinations of these words. The queries are then run against all the ICD-10 codes until a query returns the code in question as the match with the highest relevance score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
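
    The evaluation loop can be sketched as follows: enumerate word combinations of increasing size from a code's description and stop at the first combination that makes a scorer rank that code strictly first. The three-entry catalogue and the bag-of-words scorer are stand-ins for the full ICD-10 list and the real full-text engines.

```python
from itertools import combinations

# Sketch of the minimal-query evaluation: grow word combinations from a
# code's description until a bag-of-words scorer ranks that code strictly
# first. The three-entry catalogue and the scorer are stand-ins for the
# full ICD-10 list and the real full-text engines.

CODES = {
    "J45": "asthma",
    "J44": "chronic obstructive pulmonary disease",
    "I27": "pulmonary heart disease",
}

def score(query_words, text):
    """Number of query words present in the description."""
    words = set(text.split())
    return sum(1 for w in query_words if w in words)

def minimal_query(target):
    """Smallest word combination that uniquely top-ranks the target code."""
    description = CODES[target].split()
    for k in range(1, len(description) + 1):
        for combo in combinations(description, k):
            scores = {code: score(combo, text) for code, text in CODES.items()}
            top_score = max(scores.values())
            top = [code for code, s in scores.items() if s == top_score]
            if top == [target]:
                return combo
    return tuple(description)
```

    For "J44" a single distinctive word ("chronic") already suffices, while a shared word such as "pulmonary" ties two codes and forces the search to keep adding words, which is exactly the minimum-word-count behavior the study measures.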

  18. Small nucleolar RNAs that guide modification in trypanosomatids: repertoire, targets, genome organisation, and unique functions.

    PubMed

    Uliel, Shai; Liang, Xue-hai; Unger, Ron; Michaeli, Shulamit

    2004-03-29

    Small nucleolar RNAs constitute a family of newly discovered non-coding small RNAs, most of which function in guiding RNA modifications. Two prevalent types of modifications are 2'-O-methylation and pseudouridylation. The modification is directed by the formation of a canonical small nucleolar RNA-target duplex. Initially, RNA-guided modification was shown to take place on rRNA, but recent studies suggest that small nuclear RNA, mRNA, tRNA, and the trypanosome spliced leader RNA also undergo guided modifications. Trypanosomes contain more modifications and potentially more small nucleolar RNAs than yeast, and the increased number of modifications may help to preserve ribosome function under adverse environmental conditions during the cycling between the insect and mammalian host. The genome organisation in clusters carrying the two types of small nucleolar RNAs, C/D and H/ACA-like RNAs, resembles that in plants. However, the trypanosomatid H/ACA RNAs are similar to those found in Archaea and are composed of a single hairpin that may represent the primordial H/ACA RNA. In this review we summarise this new field of trypanosome small nucleolar RNAs, emphasising the open questions regarding the number of small nucleolar RNAs, the repertoire, genome organisation, and the unique function of guided modifications in these protozoan parasites.

  19. Radiative transfer code SHARM for atmospheric and terrestrial applications

    NASA Astrophysics Data System (ADS)

    Lyapustin, A. I.

    2005-12-01

    An overview of the publicly available radiative transfer Spherical Harmonics code (SHARM) is presented. SHARM is a rigorous code, as accurate as the Discrete Ordinate Radiative Transfer (DISORT) code, yet faster. It performs simultaneous calculations for different solar zenith angles, view zenith angles, and view azimuths and allows the user to make multiwavelength calculations in one run. The Δ-M method is implemented for calculations with highly anisotropic phase functions. Rayleigh scattering is automatically included as a function of wavelength, surface elevation, and the selected vertical profile of one of the standard atmospheric models. The current version of the SHARM code does not explicitly include atmospheric gaseous absorption, which should be provided by the user. The SHARM code has several built-in models of the bidirectional reflectance of land and wind-ruffled water surfaces that are most widely used in research and satellite data processing. A modification of the SHARM code with the built-in Mie algorithm designed for calculations with spherical aerosols is also described.
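
    The wavelength- and elevation-dependent Rayleigh scattering mentioned above is commonly computed from a sea-level fit scaled by surface pressure; the approximation below (Hansen and Travis, 1974) is one standard formula and is not taken from SHARM's source.

```python
# Sea-level Rayleigh optical depth fit (Hansen & Travis 1974) scaled by
# surface pressure; shown because SHARM includes Rayleigh scattering as a
# function of wavelength and surface elevation. Not SHARM's own code.

def rayleigh_optical_depth(wavelength_um, surface_pressure_hpa=1013.25):
    """Approximate Rayleigh optical depth for wavelength in micrometres."""
    inv2 = wavelength_um ** -2
    inv4 = wavelength_um ** -4
    tau_sea = 0.008569 * inv4 * (1.0 + 0.0113 * inv2 + 0.00013 * inv4)
    return tau_sea * surface_pressure_hpa / 1013.25
```

    At 0.55 micrometres this gives roughly 0.1, falling steeply (about as the inverse fourth power of wavelength) toward the red; halving the surface pressure, as for an elevated site, halves the optical depth.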

  20. Radiative transfer code SHARM for atmospheric and terrestrial applications.

    PubMed

    Lyapustin, A I

    2005-12-20

    An overview of the publicly available radiative transfer Spherical Harmonics code (SHARM) is presented. SHARM is a rigorous code, as accurate as the Discrete Ordinate Radiative Transfer (DISORT) code, yet faster. It performs simultaneous calculations for different solar zenith angles, view zenith angles, and view azimuths and allows the user to make multiwavelength calculations in one run. The Delta-M method is implemented for calculations with highly anisotropic phase functions. Rayleigh scattering is automatically included as a function of wavelength, surface elevation, and the selected vertical profile of one of the standard atmospheric models. The current version of the SHARM code does not explicitly include atmospheric gaseous absorption, which should be provided by the user. The SHARM code has several built-in models of the bidirectional reflectance of land and wind-ruffled water surfaces that are most widely used in research and satellite data processing. A modification of the SHARM code with the built-in Mie algorithm designed for calculations with spherical aerosols is also described.

  1. Boundary layer simulator improvement

    NASA Technical Reports Server (NTRS)

    Praharaj, S. C.; Schmitz, C.; Frost, C.; Engel, C. D.; Fuller, C. E.; Bender, R. L.; Pond, J.

    1984-01-01

    High chamber pressure expander cycles proposed for orbit transfer vehicles depend primarily on the heat energy transmitted from the combustion products through the thrust chamber wall. The heat transfer to the nozzle wall is affected by such variables as wall roughness, relaminarization, and the presence of particles in the flow. Motor performance loss for these nozzles with thick boundary layers is predicted inaccurately by the existing procedure, coded in BLIMPJ. Modifications and innovations to the code are examined. Updated routines are listed.

  2. Adapting a Navier-Stokes code to the ICL-DAP

    NASA Technical Reports Server (NTRS)

    Grosch, C. E.

    1985-01-01

    The results of an experiment are reported, i.e., to adapt a Navier-Stokes code, originally developed on a serial computer, to concurrent processing on the ICL Distributed Array Processor (DAP). The algorithm used in solving the Navier-Stokes equations is briefly described. The architecture of the DAP and DAP FORTRAN are also described. The modifications of the algorithm to fit the DAP are given and discussed. Finally, performance results are given and conclusions are drawn.

  3. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  4. GPU Linear Algebra Libraries and GPGPU Programming for Accelerating MOPAC Semiempirical Quantum Chemistry Calculations.

    PubMed

    Maia, Julio Daniel Carvalho; Urquiza Carvalho, Gabriel Aires; Mangueira, Carlos Peixoto; Santana, Sidney Ramos; Cabral, Lucidio Anjos Formiga; Rocha, Gerd B

    2012-09-11

    In this study, we present some modifications in the semiempirical quantum chemistry MOPAC2009 code that accelerate single-point energy calculations (1SCF) of medium-size (up to 2500 atoms) molecular systems using GPU coprocessors and multithreaded shared-memory CPUs. Our modifications consisted of using a combination of highly optimized linear algebra libraries for both CPU (LAPACK and BLAS from Intel MKL) and GPU (MAGMA and CUBLAS) to hasten time-consuming parts of MOPAC such as the pseudodiagonalization, full diagonalization, and density matrix assembling. We have shown that it is possible to obtain large speedups just by using CPU serial linear algebra libraries in the MOPAC code. As a special case, we show a speedup of up to 14 times for a methanol simulation box containing 2400 atoms and 4800 basis functions, with even greater gains in performance when using multithreaded CPUs (2.1 times in relation to the single-threaded CPU code using linear algebra libraries) and GPUs (3.8 times). This degree of acceleration opens new perspectives for modeling larger structures which appear in inorganic chemistry (such as zeolites and MOFs), biochemistry (such as polysaccharides, small proteins, and DNA fragments), and materials science (such as nanotubes and fullerenes). In addition, we believe that this parallel (GPU-GPU) MOPAC code will make it feasible to use semiempirical methods in lengthy molecular simulations using both hybrid QM/MM and QM/QM potentials.

  5. 78 FR 21230 - Airworthiness Directives; Airbus Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ... Failed Post-Modification Operational Test After accomplishment of the modification specified in paragraph... of the GFI or deactivation of the associated fuel pump following failure of any post-modification operational test of the GFI. We are issuing this AD to prevent the potential of ignition sources inside fuel...

  6. Epigenetics of oropharyngeal squamous cell carcinoma: opportunities for novel chemotherapeutic targets.

    PubMed

    Lindsay, Cameron; Seikaly, Hadi; Biron, Vincent L

    2017-01-31

    Epigenetic modifications are heritable changes in gene expression that do not directly alter DNA sequence. These modifications include DNA methylation, histone post-translational modifications, and small and non-coding RNAs. Alterations in epigenetic profiles cause deregulation of fundamental gene expression pathways associated with carcinogenesis. The role of epigenetics in oropharyngeal squamous cell carcinoma (OPSCC) has recently been recognized, with implications for novel biomarkers, molecular diagnostics and chemotherapeutics. In this review, important epigenetic pathways in human papillomavirus (HPV) positive and negative OPSCC are summarized, as well as the potential clinical utility of this knowledge.

  7. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review

    PubMed Central

    Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-01-01

    Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883

  8. Fingerprints of Modified RNA Bases from Deep Sequencing Profiles.

    PubMed

    Kietrys, Anna M; Velema, Willem A; Kool, Eric T

    2017-11-29

    Posttranscriptional modifications of RNA bases are found not only in many noncoding RNAs but also, as recently shown, in coding (messenger) RNAs. They require complex and laborious methods to locate, and many still lack methods for localized detection. Here we test the ability of next-generation sequencing (NGS) to detect and distinguish between ten modified bases in synthetic RNAs. We compare ultradeep sequencing patterns of modified bases, including miscoding, insertions and deletions (indels), and truncations, to unmodified bases in the same contexts. The data show widely varied responses to modification, ranging from no response to high levels of mutations, insertions, deletions, and truncations. The patterns are distinct for several of the modifications and suggest the future use of ultradeep sequencing as a fingerprinting strategy for locating and identifying modifications in cellular RNAs.

  9. High-fidelity real-time maritime scene rendering

    NASA Astrophysics Data System (ADS)

    Shyu, Hawjye; Taczak, Thomas M.; Cox, Kevin; Gover, Robert; Maraviglia, Carlos; Cahill, Colin

    2011-06-01

    The ability to simulate authentic engagements using real-world hardware is an increasingly important tool. For rendering maritime environments, scene generators must be capable of rendering radiometrically accurate scenes with correct temporal and spatial characteristics. When the simulation is used as input to real-world hardware or human observers, the scene generator must operate in real-time. This paper introduces a novel, real-time scene generation capability for rendering radiometrically accurate scenes of backgrounds and targets in maritime environments. The new model is an optimized and parallelized version of the US Navy CRUISE_Missiles rendering engine. It was designed to accept environmental descriptions and engagement geometry data from external sources, render a scene, transform the radiometric scene using the electro-optical response functions of a sensor under test, and output the resulting signal to real-world hardware. This paper reviews components of the scene rendering algorithm, and details the modifications required to run this code in real-time. A description of the simulation architecture and interfaces to external hardware and models is presented. Performance assessments of the frame rate and radiometric accuracy of the new code are summarized. This work was completed in FY10 under Office of Secretary of Defense (OSD) Central Test and Evaluation Investment Program (CTEIP) funding and will undergo a validation process in FY11.

  10. Aquarius Project: Research in the System Architecture of Accelerators for the High Performance Execution of Logic Programs.

    DTIC Science & Technology

    1991-05-31

    Fragmentary front-matter excerpt (OCR): the table of contents lists Appendix G, "Source code of the Aquarius Prolog compiler" (p. 224), followed by Chapter I, "Introduction." Appendix F lists the source code of the C and Prolog benchmarks, and Appendix G lists the source code of the compiler. A compiler-pipeline figure shows standard form Prolog being converted to kernel Prolog through transformations and symbolic execution.

  11. Direct Profiling the Post-Translational Modification Codes of a Single Protein Immobilized on a Surface Using Cu-free Click Chemistry.

    PubMed

    Kim, Kyung Lock; Park, Kyeng Min; Murray, James; Kim, Kimoon; Ryu, Sung Ho

    2018-05-23

    Combinatorial post-translational modifications (PTMs), which can serve as dynamic "molecular barcodes", have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single molecule fluorescence imaging. In particular, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing the novel phospho-codes of ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.

  12. Direct Profiling the Post-Translational Modification Codes of a Single Protein Immobilized on a Surface Using Cu-free Click Chemistry

    PubMed Central

    2018-01-01

    Combinatorial post-translational modifications (PTMs), which can serve as dynamic “molecular barcodes”, have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single molecule fluorescence imaging. In particular, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing the novel phospho-codes of ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.

  13. A validation of LTRAN2 with high frequency extensions by comparisons with experimental measurements of unsteady transonic flows

    NASA Technical Reports Server (NTRS)

    Hessenius, K. A.; Goorjian, P. M.

    1981-01-01

    A high frequency extension of the unsteady, transonic code LTRAN2 was created and is evaluated by comparisons with experimental results. The experimental test case is a NACA 64A010 airfoil in pitching motion at a Mach number of 0.8 over a range of reduced frequencies. Comparisons indicate that the modified code improves on the original LTRAN2 and provides closer agreement with experimental lift and moment coefficients. A discussion of the code modifications, which involve the addition of high frequency terms to the boundary conditions of the numerical algorithm, is included.

  14. Opening up Architectures of Software-Intensive Systems: A Functional Decomposition to Support System Comprehension

    DTIC Science & Technology

    2007-10-01

    Fragmentary list-of-figures excerpt: an architecture figure (p. 14); Figure 2, "Eclipse Java Model" (p. 16); Figure 3, "Eclipse Java Model at the Source Code Level" (p. 24); Figure 9, "Java Source Code."

  15. Scalable Video Transmission Over Multi-Rate Multiple Access Channels

    DTIC Science & Technology

    2007-06-01

    Fragmentary excerpt: the source is encoded using the MPEG-4 video codec, and the encoded bitstream is then channel encoded with Rate-Compatible Punctured Convolutional (RCPC) codes. Cited references include "Rate-compatible punctured convolutional codes (RCPC codes) and their applications" (IEEE) and Cain, Clark, and Geist, "Punctured convolutional codes of rate (n-1)/n and simplified maximum likelihood decoding" (IEEE Transactions).

  16. Comparison of TG‐43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes

    PubMed Central

    Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies in some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources calculated with three different versions of the MCNP code: MCNP4C, MCNP5, and MCNPX. In these simulations, for each source type the source and phantom geometries, as well as the number of photons, were kept identical, thus eliminating these as possible sources of uncertainty. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies of up to 21.7% and 28% are observed between MCNP4C and the other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values between the three codes are less than 1.1% for 192Ir and less than 1.2% for 137Cs. PACS number(s): 87.56.bg PMID:27074460

  17. A Common histone modification code on C4 genes in maize and its conservation in Sorghum and Setaria italica.

    PubMed

    Heimann, Louisa; Horst, Ina; Perduns, Renke; Dreesen, Björn; Offermann, Sascha; Peterhansel, Christoph

    2013-05-01

    C4 photosynthesis evolved more than 60 times independently in different plant lineages. Each time, multiple genes were recruited into C4 metabolism. The corresponding promoters acquired new regulatory features such as high expression, light induction, or cell type-specific expression in mesophyll or bundle sheath cells. We have previously shown that histone modifications contribute to the regulation of the model C4 phosphoenolpyruvate carboxylase (C4-Pepc) promoter in maize (Zea mays). Here we tested the light- and cell type-specific responses of three selected histone acetylations and two histone methylations on five additional C4 genes (C4-Ca, C4-Ppdk, C4-Me, C4-Pepck, and C4-RbcS2) in maize. Histone acetylation and nucleosome occupancy assays indicated extended promoter regions with regulatory upstream regions more than 1,000 bp from the transcription initiation site for most of these genes. Despite the lack of detectable homology among the promoters at the primary sequence level, histone modification patterns were highly coregulated. Specifically, H3K9ac was regulated by illumination, whereas H3K4me3 was regulated in a cell type-specific manner. We further compared histone modifications on the C4-Pepc and C4-Me genes from maize and the homologous genes from sorghum (Sorghum bicolor) and Setaria italica. Whereas sorghum and maize share a common C4 origin, C4 metabolism evolved independently in S. italica. The distribution of histone modifications over the promoters differed between the species, but differential regulation of light-induced histone acetylation and cell type-specific histone methylation was evident in all three species. We propose that a preexisting histone code was recruited into C4 promoter control during the evolution of C4 metabolism.

  18. Coding conventions and principles for a National Land-Change Modeling Framework

    USGS Publications Warehouse

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  19. ROS Hexapod

    NASA Technical Reports Server (NTRS)

    Davis, Kirsch; Bankieris, Derek

    2016-01-01

    As an intern project for NASA Johnson Space Center (JSC), my job was to become familiar with and operate the Robot Operating System (ROS). The project converted existing software assets into ROS nodes, enabling a robotic Hexapod to communicate and be controlled by an existing PlayStation 3 (PS3) controller. When the internship started, the existing control algorithms and libraries in the Hexapod C++ source code had no ROS capabilities; that changed over the course of my internship. Converting the C++ code made it compatible with ROS, and the Hexapod is now controlled using an existing PS3 controller. My job also included designing ROS messages and script programs that enabled assets to participate in the ROS ecosystem by subscribing to and publishing messages. The software source code is written in C++ and organized in directories. Testing included compiling the code within the Linux environment using a terminal, which ran the code from a directory. Several problems occurred while compiling, so the source code was modified until it compiled. Once the code compiled and ran, it was uploaded to the Hexapod, which was then controlled by a PS3 controller. The project outcome is a Hexapod that is fully functional, compatible with ROS, and operated using the PlayStation 3 controller. In addition, an open-source Arduino board, programmed through its IDE, will be integrated into the ecosystem, with circuitry designed on a breadboard to add additional behavior using push buttons, potentiometers, and other simple electrical elements. Other Arduino projects will include a GPS module and a digital clock that uses signals from up to 22 satellites and an internal patch antenna to show accurate real time.
In addition, this internship experience has motivated me to learn to code more efficiently and effectively, and to write, subscribe to, and publish my own source code in different programming languages. Greater familiarity with software programming will enhance my skills in the electrical engineering field. My experience here at JSC with the Simulation and Graphics Branch (ER7) has pushed me to become more proficient in coding, to increase my knowledge of software programming, and to enhance my skills in ROS. This knowledge will be taken back to my university and applied in a school project that will use source code and ROS to work with the PR2 robot, which is controlled by ROS software. The skills learned here will be used to subscribe to and publish ROS messages to the PR2 robot, which will be controlled by an existing PS3 controller by adapting C++ code to subscribe and publish messages to ROS. Overall, the skills obtained here will not be lost, but increased.

  20. Improvements in the EQ-10 electrodeless Z-pinch EUV source for metrology applications

    NASA Astrophysics Data System (ADS)

    Horne, Stephen F.; Gustafson, Deborah; Partlow, Matthew J.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.

    2011-04-01

    Now that EUV lithography systems are beginning to ship to fabs for next-generation chips, it is increasingly critical that EUV infrastructure development keeps pace. Energetiq Technology has been shipping the EQ-10 Electrodeless Z-pinch™ light source since 2005. The source is currently being used for metrology, mask inspection, and resist development. These applications require especially stable performance in both power and source size. Over the last 5 years Energetiq has made many source modifications, including better thermal management and high pulse rate operation. Recently we have further increased the system power handling and electrical pulse reproducibility. The impact of these modifications on source performance will be reported.

  1. Runtime Detection of C-Style Errors in UPC Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirkelbauer, P; Liao, C; Panas, T

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  2. 76 FR 38747 - Review of New Sources and Modifications in Indian Country

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-01

    The EPA is finalizing a Federal Implementation Plan (FIP) under the Clean Air Act (CAA or Act) for Indian country. The FIP includes two New Source Review (NSR) regulations for the protection of air resources in Indian country. The first rule applies to new and modified minor stationary sources (minor sources) and to minor modifications at existing major stationary sources (major sources) throughout Indian country. The second rule (nonattainment major NSR rule) applies to new and modified major sources in areas of Indian country that are designated as not attaining the National Ambient Air Quality Standards (NAAQS). These rules will be implemented by EPA or a delegate Tribal agency assisting EPA with administration of the rules, until replaced by an EPA-approved implementation plan.

  3. Life cycle-dependent cytoskeletal modifications in Plasmodium falciparum infected erythrocytes.

    PubMed

    Shi, Hui; Liu, Zhuo; Li, Ang; Yin, Jing; Chong, Alvin G L; Tan, Kevin S W; Zhang, Yong; Lim, Chwee Teck

    2013-01-01

    Plasmodium falciparum infection of human erythrocytes is known to result in the modification of the host cell cytoskeleton by parasite-coded proteins. However, such modifications and their implications in malaria pathogenesis have not been fully explored. Here, we probed the gradual modification of the infected erythrocyte cytoskeleton with advancing stages of infection using atomic force microscopy (AFM). We report a novel strategy to derive accurate and quantitative information on the knob structures and their connections with the spectrin network by performing AFM-based imaging analysis of the cytoplasmic surface of infected erythrocytes. Significant changes in the red cell cytoskeleton were observed with advancing stages of infection: expansion of the spectrin network mesh size, extension of spectrin tetramers, and a decrease in spectrin abundance. The spectrin network appeared to aggregate around knobs but also appeared sparser in non-knob areas as the parasite matured. This dramatic modification of the erythrocyte skeleton during the advancing stages of malaria infection could contribute to the loss of deformability of the infected erythrocyte.

  4. Effect of the diffusion parameters on the observed γ-ray spectrum of sources and their contribution to the local all-electron spectrum: The EDGE code

    NASA Astrophysics Data System (ADS)

    López-Coto, R.; Hahn, J.; BenZvi, S.; Dingus, B.; Hinton, J.; Nisa, M. U.; Parsons, R. D.; Greus, F. Salesa; Zhang, H.; Zhou, H.

    2018-11-01

    The positron excess measured by PAMELA and AMS can only be explained if one or several sources are injecting positrons. Moreover, at the highest energies, it requires the presence of nearby (∼hundreds of parsecs) and middle-aged (at most ∼hundreds of kyr) sources. Pulsars, as factories of electrons and positrons, are among the proposed candidates to explain the origin of this excess. To calculate the contribution of these sources to the electron and positron flux at the Earth, we developed EDGE (Electron Diffusion and Gamma rays to the Earth), a code that treats the propagation of electrons and computes their diffusion from a central source with a flexible injection spectrum. Using this code, we can derive the source's gamma-ray spectrum and spatial extension, the all-electron density in space, the electron and positron flux reaching the Earth, and the positron fraction measured at the Earth. We present in this paper the foundations of the code and study how different parameters affect the gamma-ray spectrum of a source and the electron flux measured at the Earth. We also study the effect of several approximations usually performed in such studies. This code has been used to derive the results on the positron flux measured at the Earth in [1].
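    For an instantaneous point source and a constant diffusion coefficient, the diffusive propagation that such codes solve has the familiar Gaussian Green's function n(r,t) = N0 exp(-r²/(4Dt)) / (4πDt)^(3/2). The sketch below is a toy illustration only, with illustrative parameter values; EDGE itself additionally handles energy losses and flexible injection spectra:

```python
import numpy as np

def burst_density(r, t, d, n0=1.0):
    """Diffusion Green's function for an instantaneous point source:
    n(r, t) = n0 * exp(-r**2 / (4*d*t)) / (4*pi*d*t)**1.5."""
    four_dt = 4.0 * d * t
    return n0 * np.exp(-r**2 / four_dt) / (np.pi * four_dt) ** 1.5

# Illustrative values only (not EDGE's inputs): a ~100 kyr old source and
# a diffusion coefficient giving a ~200 pc diffusion length sqrt(4*d*t).
d_coef = 100.0                        # pc^2 / kyr
age = 100.0                           # kyr
r = np.linspace(0.0, 2000.0, 20001)   # pc, out to 10 diffusion lengths
n = burst_density(r, age, d_coef)

# Particle number is conserved: integrating n over all space returns n0.
dr = r[1] - r[0]
total = np.sum(4.0 * np.pi * r**2 * n) * dr
print(round(total, 3))  # → 1.0
```

The normalization check (the integral of 4πr²·n dr recovering the injected number) is the kind of sanity test any burst-like diffusion solution should pass.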

  5. Earthquake Early Warning ShakeAlert System: Testing and certification platform

    USGS Publications Warehouse

    Cochran, Elizabeth S.; Kohler, Monica D.; Given, Douglas; Guiwits, Stephen; Andrews, Jennifer; Meier, Men-Andrin; Ahmad, Mohammad; Henson, Ivan; Hartog, Renate; Smith, Deborah

    2017-01-01

    Earthquake early warning systems provide warnings to end users of incoming moderate to strong ground shaking from earthquakes. An earthquake early warning system, ShakeAlert, is providing alerts to beta end users in the western United States, specifically California, Oregon, and Washington. An essential aspect of the earthquake early warning system is the development of a framework to test modifications to code to ensure functionality and assess performance. In 2016, a Testing and Certification Platform (TCP) was included in the development of the Production Prototype version of ShakeAlert. The purpose of the TCP is to evaluate the robustness of candidate code that is proposed for deployment on ShakeAlert Production Prototype servers. TCP consists of two main components: a real‐time in situ test that replicates the real‐time production system and an offline playback system to replay test suites. The real‐time tests of system performance assess code optimization and stability. The offline tests comprise a stress test of candidate code to assess if the code is production ready. The test suite includes over 120 events including local, regional, and teleseismic historic earthquakes, recentering and calibration events, and other anomalous and potentially problematic signals. Two assessments of alert performance are conducted. First, point‐source assessments are undertaken to compare magnitude, epicentral location, and origin time with the Advanced National Seismic System Comprehensive Catalog, as well as to evaluate alert latency. Second, we describe assessment of the quality of ground‐motion predictions at end‐user sites by comparing predicted shaking intensities to ShakeMaps for historic events and implement a threshold‐based approach that assesses how often end users initiate the appropriate action, based on their ground‐shaking threshold. 
TCP has been developed to be a convenient streamlined procedure for objectively testing algorithms, and it has been designed with flexibility to accommodate significant changes in development of new or modified system code. It is expected that the TCP will continue to evolve along with the ShakeAlert system, and the framework we describe here provides one example of how earthquake early warning systems can be evaluated.
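    The threshold-based end-user assessment described above can be sketched schematically; the function name, intensity values, and threshold below are hypothetical stand-ins, not taken from ShakeAlert:

```python
# Schematic threshold-based alert assessment: all names and numbers here
# are hypothetical, not ShakeAlert's actual values.
def classify(predicted, observed, threshold):
    """Classify one event at one site against a user's shaking threshold."""
    alerted = predicted >= threshold
    shaken = observed >= threshold
    if alerted and shaken:
        return "hit"            # user acted, strong shaking occurred
    if alerted and not shaken:
        return "false alarm"    # user acted unnecessarily
    if not alerted and shaken:
        return "miss"           # strong shaking occurred with no alert
    return "correct no-alert"

# Predicted vs. observed intensity at a site whose action threshold is 4.0.
events = [(5.1, 5.4), (4.2, 3.1), (3.0, 4.8), (2.0, 2.5)]
for pred, obs in events:
    print(classify(pred, obs, threshold=4.0))
# → hit, false alarm, miss, correct no-alert
```

Aggregating these outcomes over a test suite of historic events yields the "how often end users initiate the appropriate action" statistic described above.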

  6. High-resolution 3D simulations of NIF ignition targets performed on Sequoia with HYDRA

    NASA Astrophysics Data System (ADS)

    Marinak, M. M.; Clark, D. S.; Jones, O. S.; Kerbel, G. D.; Sepke, S.; Patel, M. V.; Koning, J. M.; Schroeder, C. R.

    2015-11-01

    Developments in the multiphysics ICF code HYDRA enable it to perform large-scale simulations on the Sequoia machine at LLNL. With an aggregate computing power of 20 Petaflops, Sequoia offers an unprecedented capability to resolve the physical processes in NIF ignition targets for a more complete, consistent treatment of the sources of asymmetry. We describe modifications to HYDRA that enable it to scale to over one million processes on Sequoia. These include new options for replicating parts of the mesh over a subset of the processes, to avoid strong scaling limits. We consider results from a 3D full ignition capsule-only simulation performed using over one billion zones run on 262,000 processors which resolves surface perturbations through modes l = 200. We also report progress towards a high-resolution 3D integrated hohlraum simulation performed using 262,000 processors which resolves surface perturbations on the ignition capsule through modes l = 70. These aim for the most complete calculations yet of the interactions and overall impact of the various sources of asymmetry for NIF ignition targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.

  7. F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment

    NASA Technical Reports Server (NTRS)

    Anders, Scott G.; Fischer, Michael C.

    1999-01-01

    The F-16XL-2 Supersonic Laminar Flow Control Flight Test Experiment was part of the NASA High-Speed Research Program. The goal of the experiment was to demonstrate extensive laminar flow, to validate computational fluid dynamics (CFD) codes and design methodology, and to establish laminar flow control design criteria. Topics include the flight test hardware and design, airplane modification, the pressure and suction distributions achieved, the laminar flow achieved, and the data analysis and code correlation.

  8. Recommendations on Implementing the Energy Conservation Building Code in Rajasthan, India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Sha; Makela, Eric J.; Evans, Meredydd

    India launched the Energy Conservation Building Code (ECBC) in 2007 and Indian Bureau of Energy Efficiency (BEE) recently indicated that it would move to mandatory implementation in the 12th Five-Year Plan. The State of Rajasthan adopted ECBC with minor modifications; the new regulation is known as the Energy Conservation Building Directives – Rajasthan 2011 (ECBD-R). It became mandatory in Rajasthan on September 28, 2011. This report provides recommendations on an ECBD-R enforcement roadmap for the State of Rajasthan.

  9. Errors from approximation of ODE systems with reduced order models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevska, Tanya

    2016-12-30

    This is a code to calculate the error from approximation of systems of ordinary differential equations (ODEs) by using Proper Orthogonal Decomposition (POD) Reduced Order Models (ROM) methods and to compare and analyze the errors for two POD ROM variants. The first variant is the standard POD ROM, the second variant is a modification of the method using the values of the time derivatives (a.k.a. time-derivative snapshots). The code compares the errors from the two variants under different conditions.
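    The two POD ROM variants being compared can be illustrated with a minimal NumPy sketch. The toy linear ODE system, basis rank, and dimensions below are arbitrary demo choices, not taken from the report; the second basis simply augments the snapshot matrix with time-derivative snapshots, as the abstract describes.

```python
import numpy as np

# Toy linear ODE system dx/dt = A x, integrated with forward Euler to build snapshots.
rng = np.random.default_rng(0)
n, steps, dt = 50, 200, 1e-3
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
x = rng.standard_normal(n)
snaps = []
for _ in range(steps):
    x = x + dt * (A @ x)
    snaps.append(x.copy())
S = np.array(snaps).T                     # n x steps snapshot matrix

# Variant 1 (standard POD): basis = leading left singular vectors of the snapshots.
U, _, _ = np.linalg.svd(S, full_matrices=False)
r = 5
Phi = U[:, :r]                            # reduced basis of rank r
err = np.linalg.norm(S - Phi @ (Phi.T @ S)) / np.linalg.norm(S)

# Variant 2 (modification): include time-derivative snapshots dS = A S in the basis.
Sd = np.hstack([S, A @ S])
Ud, _, _ = np.linalg.svd(Sd, full_matrices=False)
Phi_d = Ud[:, :r]
err_d = np.linalg.norm(S - Phi_d @ (Phi_d.T @ S)) / np.linalg.norm(S)

print(f"relative projection error, standard POD (r={r}): {err:.3e}")
print(f"relative projection error, derivative-snapshot POD (r={r}): {err_d:.3e}")
```

    Comparing `err` and `err_d` under different conditions (rank, time step, system stiffness) mirrors the kind of error comparison the described code performs.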

  10. Listening to Students: Modification of a Reading Program Based on the Sources of Foreign Language Reading Anxiety

    ERIC Educational Resources Information Center

    Aydin, Belgin; Kuru Gonen, Ipek

    2012-01-01

    This paper is concerned with the modifications implemented in a second year foreign language (FL) reading program with respect to the problems students experience while reading in FL. This research draws on the sources of FL reading anxiety identified in the first year reading program with a motivation to re-design the second year program to help…

  11. An open-source method to analyze optokinetic reflex responses in larval zebrafish.

    PubMed

    Scheetz, Seth D; Shao, Enhua; Zhou, Yangzhong; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2018-01-01

    Optokinetic reflex (OKR) responses provide a convenient means to evaluate oculomotor, integrative and afferent visual function in larval zebrafish models, which are commonly used to elucidate molecular mechanisms underlying development, disease and repair of the vertebrate nervous system. We developed an open-source MATLAB-based solution for automated quantitative analysis of OKR responses in larval zebrafish. The package includes applications to: (i) generate sinusoidally-transformed animated grating patterns suitable for projection onto a cylindrical screen to elicit the OKR; (ii) determine and record the angular orientations of the eyes in each frame of a video recording showing the OKR response; and (iii) analyze angular orientation data from the tracking program to yield a set of parameters that quantify essential elements of the OKR. The method can be employed without modification using the operating manual provided. In addition, annotated source code is included, allowing users to modify or adapt the software for other applications. We validated the algorithms and measured OKR responses in normal larval zebrafish, showing good agreement with published quantitative data, where available. We provide the first open-source method to elicit and analyze the OKR in larval zebrafish. The wide range of parameters that are automatically quantified by our algorithms significantly expands the scope of quantitative analysis previously reported. Our method for quantifying OKR responses will be useful for numerous applications in neuroscience using the genetically- and chemically-tractable zebrafish model. Published by Elsevier B.V.
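    A common way to quantify an OKR trace like the one produced by step (iii) is to compute slow-phase gain, excluding fast resetting saccades with a velocity threshold. The Python sketch below is a generic illustration of that idea, not the paper's MATLAB implementation; the sampling rate, stimulus velocity, and threshold are invented for the demo.

```python
import numpy as np

# Synthetic eye-angle trace: slow pursuit at 8 deg/s with periodic resetting
# saccades, sampled at 25 frames/s (all values illustrative).
fps, stim_vel = 25.0, 10.0                # stimulus grating drifts at 10 deg/s
t = np.arange(0, 10, 1 / fps)
angle = np.zeros_like(t)
for i in range(1, len(t)):
    angle[i] = angle[i - 1] + 8.0 / fps   # slow phase follows the grating
    if angle[i] > 15.0:                   # fast resetting saccade
        angle[i] = -15.0

# Frame-to-frame angular velocity; discard fast phases with a velocity threshold.
vel = np.diff(angle) * fps
slow = np.abs(vel) < 50.0                 # deg/s threshold separating saccades
gain = np.mean(vel[slow]) / stim_vel      # slow-phase eye velocity / stimulus velocity
print(f"slow-phase OKR gain: {gain:.2f}")
```

    With these synthetic values the gain comes out at 0.8, i.e. the eye tracks the grating at 80% of its drift speed.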

  12. Bistatic radar cross section of a perfectly conducting rhombus-shaped flat plate

    NASA Astrophysics Data System (ADS)

    Fenn, Alan J.

    1990-05-01

    The bistatic radar cross section of a perfectly conducting flat plate that has a rhombus shape (equilateral parallelogram) is investigated. The Ohio State University electromagnetic surface patch code (ESP version 4) is used to compute the theoretical bistatic radar cross section of a 35- x 27-in rhombus plate at 1.3 GHz over the bistatic angles 15 deg to 142 deg. The ESP-4 computer code is a method of moments FORTRAN-77 program which can analyze general configurations of plates and wires. This code has been installed and modified at Lincoln Laboratory on a SUN 3 computer network. Details of the code modifications are described. Comparisons of the method of moments simulations and measurements of the rhombus plate are made. It is shown that the ESP-4 computer code provides a high degree of accuracy in the calculation of copolarized and cross-polarized bistatic radar cross section patterns.

  13. [Towards a new Tunisian Medical Code of Deontology].

    PubMed

    Aissaoui, Abir; Haj Salem, Nidhal; Chadly, Ali

    2010-06-01

    The Medical Code of Deontology is a legal text including the physician's duties towards his patients, colleagues, auxiliaries and the community. Considering the scientific, legal and social changes, the deontology code should be revised periodically. The first Tunisian Medical Code of Deontology (TMCD) was promulgated in 1973 and abrogated in 1993 by the new Code. This version has never been reviewed and does not seem to fit the current conditions of medical practice. The TMCD does not contain texts referring to information given to the patient, pain control, palliative care and management of the end of life as well as protection of medical data. Furthermore, the TMCD does not include rules related to tissues and organs transplantation and medical assisted human reproduction in accordance with Tunisian legal texts. We aim in this paper at analyzing the insufficiencies of the TMCD and suggesting modifications in order to update it.

  14. Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Gould, R. K.

    1979-01-01

    Mathematical models and computer codes based on these models, which allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon were developed. The following tasks were accomplished: (1) formulation of a model for silicon vapor separation/collection from the developing turbulent flow stream within reactors of the Westinghouse (2) modification of an available general parabolic code to achieve solutions to the governing partial differential equations (boundary layer type) which describe migration of the vapor to the reactor walls, (3) a parametric study using the boundary layer code to optimize the performance characteristics of the Westinghouse reactor, (4) calculations relating to the collection efficiency of the new AeroChem reactor, and (5) final testing of the modified LAPP code for use as a method of predicting Si(1) droplet sizes in these reactors.

  15. Incorporation of Dynamic SSI Effects in the Design Response Spectra

    NASA Astrophysics Data System (ADS)

    Manjula, N. K.; Pillai, T. M. Madhavan; Nagarajan, Praveen; Reshma, K. K.

    2018-05-01

    Many past studies on dynamic soil-structure interaction have revealed both detrimental and advantageous effects of soil flexibility. Based on such studies, the design response spectra of international seismic codes are being improved worldwide. The improvements required for the short-period range of the design response spectrum in the Indian seismic code (IS 1893:2002) are presented in this paper. As the recent code revision has not incorporated the short-period amplifications, the proposals given in this paper are equally applicable to the latest code (IS 1893:2016). Analyses of single-degree-of-freedom systems are performed to predict the required improvements. The proposed modifications to the constant-acceleration portion of the spectra are evaluated against the current design spectra in Eurocode 8.
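    The single-degree-of-freedom analyses underlying a design spectrum can be sketched as follows. The Newmark average-acceleration integrator is standard, but the ground motion, damping ratio, and period grid here are invented demo values, not the records or code parameters used in the paper.

```python
import numpy as np

def sdof_peak_accel(periods, ag, dt, zeta=0.05):
    """Peak total acceleration of a linear SDOF oscillator for each period,
    driven by ground acceleration ag (Newmark average-acceleration scheme)."""
    peaks = []
    for T in periods:
        wn = 2 * np.pi / T
        m, c, k = 1.0, 2 * zeta * wn, wn ** 2
        u = v = 0.0
        a = -ag[0]                         # initial relative acceleration
        peak = 0.0
        keff = k + 4 * m / dt**2 + 2 * c / dt
        for agi in ag[1:]:
            p = -m * agi + m * (4 * u / dt**2 + 4 * v / dt + a) + c * (2 * u / dt + v)
            unew = p / keff
            vnew = 2 * (unew - u) / dt - v
            anew = 4 * (unew - u) / dt**2 - 4 * v / dt - a
            u, v, a = unew, vnew, anew
            peak = max(peak, abs(a + agi))  # total = relative + ground acceleration
        peaks.append(peak)
    return np.array(peaks)

# Toy ground motion: a 1 s, 2 Hz sine pulse with 0.3 g amplitude.
dt = 0.005
t = np.arange(0, 4, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * (t < 1.0)
periods = np.linspace(0.05, 2.0, 40)
Sa = sdof_peak_accel(periods, ag, dt)
```

    Plotting `Sa` against `periods` gives a response spectrum; oscillators near the 0.5 s pulse period resonate and respond well above the peak ground acceleration, which is the short-period amplification the paper is concerned with.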

  16. Admiralty Inlet Advanced Turbulence Measurements: final data and code archive

    DOE Data Explorer

    Kilcher, Levi (ORCID:0000000183851131); Thomson, Jim (ORCID:0000000289290088); Harding, Samuel

    2011-02-01

    Data and code used in Kilcher, Thomson, Harding, and Nylund (2017) "Turbulence Measurements from Compliant Moorings - Part II: Motion Correction" (doi: 10.1175/JTECH-D-16-0213.1) that are not already in a public location. The links point to Python source code used in the publication. All other files are source data used in the publication.

  17. Numerical Electromagnetic Code (NEC)-Basic Scattering Code. Part 2. Code Manual

    DTIC Science & Technology

    1979-09-01

    This excerpt from the code manual concerns imaging of source axes for a magnetic source: the source vector with components VSOURC(1,1), VSOURC(1,2), and VSOURC(1,3) is mapped to the image vector VIMAG(1,1), VIMAG(1,2), and VIMAG(1,3). VNC holds the x, y, and z components of the end-cap unit normal; the output variable VIMAG holds the x, y, and z components defining the source-image coordinate system axes.

  18. Java Source Code Analysis for API Migration to Embedded Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, Victor; McCoy, James A.; Guerrero, Jonathan

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  19. NESSUS/NASTRAN Interface (Modification of NESSUS to FORTRAN 90 Standard)

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The objective of this work has been to develop a FORTRAN 90 (F90) version of the NESSUS probabilistic analysis software, Version 6.2 with NASTRAN interface. The target platform for the modified NESSUS code is the SGI workstation.

  20. Monte Carlo study on secondary neutrons in passive carbon-ion radiotherapy: identification of the main source and reduction in the secondary neutron dose.

    PubMed

    Yonai, Shunsuke; Matsufuji, Naruhiro; Kanai, Tatsuaki

    2009-10-01

    Recent successful results in passive carbon-ion radiotherapy allow patients to live longer and allow younger patients to receive the radiotherapy. Undesired radiation exposure in normal tissues far from the target volume is considerably lower than that close to the treatment target, but it is considered non-negligible in estimating the secondary cancer risk. It is therefore very important to reduce undesired secondary neutron exposure in passive carbon-ion radiotherapy without influencing the clinical beam. In this study, the source components in which secondary neutrons are produced during passive carbon-ion radiotherapy were identified, and a method to reduce the secondary neutron dose effectively, based on identification of the main sources and without influencing the clinical beam, was investigated. A Monte Carlo study with the PHITS code was performed assuming the beamline at the Heavy-Ion Medical Accelerator in Chiba (HIMAC). First, the authors investigated the main sources of secondary neutrons in passive carbon-ion radiotherapy. Next, they investigated the reduction in the neutron dose with various modifications of the beamline device that is most dominant in neutron production. Finally, they investigated the use of an additional shield for the patient. It was shown that the main source is the secondary neutrons produced in the four-leaf collimator (FLC) used as a precollimator at HIMAC, whose contribution to the total neutron ambient dose equivalent is more than 70%. The investigations showed that modification of the FLC can reduce the neutron dose at positions close to the beam axis by 70%, and that the FLC is useful not only for collimating the primary beam but also for reducing the secondary neutrons. An additional shield for the patient is also very effective in reducing the neutron dose at positions farther than 50 cm from the beam axis. Finally, they showed that the neutron dose can be reduced by approximately 70% at any position without influencing the primary beam used in treatment. This study assumed the HIMAC beamline; however, it provides important information for reoptimizing the arrangement and materials of beamline devices and for designing new facilities for passive carbon-ion radiotherapy, and probably passive proton radiotherapy.

  1. Epigenetic Modifications in Essential Hypertension

    PubMed Central

    Wise, Ingrid A.; Charchar, Fadi J.

    2016-01-01

    Essential hypertension (EH) is a complex, polygenic condition with no single causative agent. Despite advances in our understanding of the pathophysiology of EH, hypertension remains one of the world’s leading public health problems. Furthermore, there is increasing evidence that epigenetic modifications are as important as genetic predisposition in the development of EH. Indeed, a complex and interactive genetic and environmental system exists to determine an individual’s risk of EH. Epigenetics refers to all heritable changes to the regulation of gene expression as well as chromatin remodelling, without involvement of nucleotide sequence changes. Epigenetic modification is recognized as an essential process in biology, but is now being investigated for its role in the development of specific pathologic conditions, including EH. Epigenetic research will provide insights into the pathogenesis of blood pressure regulation that cannot be explained by classic Mendelian inheritance. This review concentrates on epigenetic modifications to DNA structure, including the influence of non-coding RNAs on hypertension development. PMID:27023534

  2. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feed for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  3. 40 CFR 455.47 - Pretreatment standards for new sources (PSNS).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PESTICIDE CHEMICALS Pesticide Chemicals Formulating and... received a modification by Best Engineering Judgement for modifications not listed in Table 8 to this part...

  4. 40 CFR 455.47 - Pretreatment standards for new sources (PSNS).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PESTICIDE CHEMICALS Pesticide Chemicals Formulating and... received a modification by Best Engineering Judgement for modifications not listed in Table 8 to this part...

  5. Challenges in using medicaid claims to ascertain child maltreatment.

    PubMed

    Raghavan, Ramesh; Brown, Derek S; Allaire, Benjamin T; Garfield, Lauren D; Ross, Raven E; Hedeker, Donald

    2015-05-01

    Medicaid data contain International Classification of Diseases, Clinical Modification (ICD-9-CM) codes indicating maltreatment, yet there is little information on how valid these codes are for the purposes of identifying maltreatment from health, as opposed to child welfare, data. This study assessed the validity of Medicaid codes in identifying maltreatment. Participants (n = 2,136) in the first National Survey of Child and Adolescent Well-Being were linked to their Medicaid claims obtained from 36 states. Caseworker determinations of maltreatment were compared with eight sets of ICD-9-CM codes. Of the 1,921 children identified by caseworkers as being maltreated, 15.2% had any relevant ICD-9-CM code in any of their Medicaid files across 4 years of observation. Maltreated boys and those of African American race had lower odds of displaying a maltreatment code. Using only Medicaid claims to identify maltreated children creates validity problems. Medicaid data linkage with other types of administrative data is required to better identify maltreated children. © The Author(s) 2014.
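    The claims-screening comparison described above can be sketched in a few lines of Python. The code set and claims below are illustrative only (995.5x and E967.x are real ICD-9-CM maltreatment-related families, but the study's actual eight code sets are not reproduced here), and sensitivity is computed against caseworker determinations as the gold standard.

```python
# Hypothetical maltreatment-related ICD-9-CM codes (illustrative subset).
MALTREATMENT_CODES = {"995.50", "995.54", "995.55", "995.59", "E967.0"}

# Toy claims histories for three children, all judged maltreated by caseworkers.
claims = {
    "child_a": ["995.54", "382.9"],   # carries a maltreatment code
    "child_b": ["486", "780.60"],     # respiratory / fever codes only
    "child_c": ["E967.0"],            # perpetrator-of-abuse E-code
}
caseworker_positive = {"child_a", "child_b", "child_c"}

# Flag any child whose claims contain at least one code from the set.
flagged = {cid for cid, codes in claims.items()
           if any(c in MALTREATMENT_CODES for c in codes)}

# Sensitivity: fraction of caseworker-identified children captured by claims alone.
sensitivity = len(flagged & caseworker_positive) / len(caseworker_positive)
print(f"sensitivity of claims-based ascertainment: {sensitivity:.1%}")
```

    In this toy example claims capture only two of three maltreated children, illustrating the undercount (15.2% in the actual study) that motivates linking Medicaid data with other administrative sources.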

  6. The Athena Astrophysical MHD Code in Cylindrical Geometry

    NASA Astrophysics Data System (ADS)

    Skinner, M. A.; Ostriker, E. C.

    2011-10-01

    We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained transport updates. Finally, we have developed a test suite of standard and novel problems in one, two, and three dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.

  7. The search for a 100MA RancheroS magnetic flux compression generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watt, Robert Gregory

    2016-09-01

    The Eulerian AMR rad-hydro-MHD code Roxane was used to investigate modifications to existing designs of the new RancheroS class of Magnetic Flux Compression Generators (FCGs) which might allow some members of this FCG family to exceed 100 MA driving a 10 nH static load. This report details the results of that study and proposes a specific generator modification which seems to satisfy both the peak current and desired risetime for the current pulse into the load. The details of the study and necessary modifications are presented. For details of the LA43S RancheroS FCG design and predictions for the first use of the generator, refer to the relevant publications.

  8. Profiling Changes in Histone Post-translational Modifications by Top-Down Mass Spectrometry.

    PubMed

    Zhou, Mowei; Wu, Si; Stenoien, David L; Zhang, Zhaorui; Connolly, Lanelle; Freitag, Michael; Paša-Tolić, Ljiljana

    2017-01-01

    Top-down mass spectrometry is a valuable tool for understanding gene expression through characterization of combinatorial histone post-translational modifications (i.e., the histone code). In this protocol, we describe a top-down workflow that employs liquid chromatography (LC) coupled to mass spectrometry (MS) for fast global profiling of changes in histone proteoforms, and apply the LC-MS top-down approach for comparative analysis of a wild-type and a mutant fungal species. The proteoforms exhibiting differential abundances can be subjected to further targeted studies by other MS or orthogonal (e.g., biochemical) assays. This method can be generally adapted for screening of changes in histone modifications between samples such as wild type vs. mutant or healthy vs. diseased.

  9. Modification of C and N sources for enhanced production of cyclosporin ‘A’ by Aspergillus terreus

    PubMed Central

    Tanseer, Sundas; Anjum, Tehmina

    2011-01-01

    Most studies of cyclosporin ‘A’ production by fungi concentrate on Tolypocladium inflatum, mainly because of the lower reported production of this drug in other fungi. The present study was therefore conducted to explore indigenous isolates of Aspergillus terreus for synthesis of this drug and to define a production medium for obtaining a high yield of cyclosporin ‘A’. For this purpose, carbon and nitrogen sources were optimized for the best-performing strain of A. terreus. Overall, the results showed that the best cyclosporin ‘A’ yield from the selected Aspergillus terreus strain (FCBP58) was obtained using a production medium containing 10% glucose as the carbon source and 0.5% peptone as the nitrogen source. This modification of the production medium significantly enhanced drug synthesis by the selected fungus. When production was compared with fungal biomass, no relationship was found between the two, confirming that the medium modification increased the overall drug-synthesis capacity of the fungus. PMID:24031766

  10. Epidemiologic studies of electric and magnetic fields and cancer: Strategies for extending knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savitz, D.A.

    1993-12-01

    Epidemiologic research concerning electric and magnetic fields in relation to cancer has focused on the potential etiologic roles of residential exposure on childhood cancer and occupational exposure on adult leukemia and brain cancer. Future residential studies must concentrate on exposure assessment that is enhanced by developing models of historical exposure, assessment of the relation between magnetic fields and wire codes, and consideration of alternate exposure indices. Study design issues deserving attention include possible biases in random digit dialing control selection, consideration of the temporal course of exposure and disease, and acquisition of the necessary information to assess the potential value of ecologic studies. Highest priorities are comprehensive evaluation of exposure patterns and sources and examination of the sociology and geography of residential wire codes. Future occupational studies should also concentrate on improved exposure assessment with increased attention to nonutility worker populations and development of historical exposure indicators that are superior to job titles alone. Potential carcinogens in the workplace that could act as confounders need to be more carefully examined. The temporal relation between exposure and disease and possible effect modification by other workplace agents should be incorporated into future studies. The most pressing need is for measurement of exposure patterns in a variety of worker populations and performance of traditional epidemiologic evaluations of cancer occurrence. The principal source of bias toward the null is nondifferential misclassification of exposure with improvements expected to enhance any true etiologic association that is present. Biases away from the null might include biased control selection in residential studies and chemical carcinogens acting as confounders in occupational studies. 51 refs., 1 tab.

  11. Numerical Modeling of Poroelastic-Fluid Systems Using High-Resolution Finite Volume Methods

    NASA Astrophysics Data System (ADS)

    Lemoine, Grady

    Poroelasticity theory models the mechanics of porous, fluid-saturated, deformable solids. It was originally developed by Maurice Biot to model geophysical problems, such as seismic waves in oil reservoirs, but has also been applied to modeling living bone and other porous media. Poroelastic media often interact with fluids, such as in ocean bottom acoustics or propagation of waves from soft tissue into bone. This thesis describes the development and testing of high-resolution finite volume numerical methods, and simulation codes implementing these methods, for modeling systems of poroelastic media and fluids in two and three dimensions. These methods operate on both rectilinear grids and logically rectangular mapped grids. To allow the use of these methods, Biot's equations of poroelasticity are formulated as a first-order hyperbolic system with a source term; this source term is incorporated using operator splitting. Some modifications are required to the classical high-resolution finite volume method. Obtaining correct solutions at interfaces between poroelastic media and fluids requires a novel transverse propagation scheme and the removal of the classical second-order correction term at the interface, and in three dimensions a new wave limiting algorithm is also needed to correctly limit shear waves. The accuracy and convergence rates of the methods of this thesis are examined for a variety of analytical solutions, including simple plane waves, reflection and transmission of waves at an interface between different media, and scattering of acoustic waves by a poroelastic cylinder. Solutions are also computed for a variety of test problems from the computational poroelasticity literature, as well as some original test problems designed to mimic possible applications for the simulation code.
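    The operator-splitting idea described in the thesis, a first-order hyperbolic step alternated with a source-term update, can be illustrated with a minimal one-dimensional sketch: scalar advection with a relaxation source term. The grid, speeds, and decay rate below are arbitrary demo values, not from the thesis, and the first-order upwind step stands in for the full high-resolution wave-propagation update.

```python
import numpy as np

# Solve q_t + u q_x = -lam * q by Godunov splitting: an upwind finite-volume
# step for the hyperbolic part, then an exact update for the source term.
nx, u, lam = 200, 1.0, 0.5
dx = 1.0 / nx
dt = 0.5 * dx / u                      # CFL-limited time step (CFL = 0.5)
x = (np.arange(nx) + 0.5) * dx
q = np.exp(-200 * (x - 0.3) ** 2)      # initial Gaussian pulse
t_final = 0.2
steps = int(round(t_final / dt))
for _ in range(steps):
    # Hyperbolic step: first-order upwind with periodic boundaries.
    q = q - u * dt / dx * (q - np.roll(q, 1))
    # Source step: dq/dt = -lam*q has the exact solution q * exp(-lam*dt).
    q = q * np.exp(-lam * dt)

print(f"max after {steps} steps: {q.max():.4f}")
```

    After t = 0.2 the pulse has advected to the right and decayed by a factor exp(-lam * t) ≈ 0.905 (plus some numerical diffusion from the upwind step); the same alternation generalizes to Biot's equations with their stiffer, matrix-valued source term.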

  12. The incorporation of plotting capability into the Unified Subsonic Supersonic Aerodynamic Analysis program, version B

    NASA Technical Reports Server (NTRS)

    Winter, O. A.

    1980-01-01

    The B01 version of the Unified Subsonic Supersonic Aerodynamic Analysis program is the result of numerous modifications and additions made to the B00 version. These modifications and additions affect the program input, its computational options, the code readability, and the overlay structure. The following are described: (1) the revised input; (2) the plotting overlay programs, which were also modified, and their associated subroutines; (3) the auxiliary files used by the program and the revised output data; and (4) the program overlay structure.

  13. Design and Validation of CRISPR/Cas9 Systems for Targeted Gene Modification in Induced Pluripotent Stem Cells.

    PubMed

    Lee, Ciaran M; Zhu, Haibao; Davis, Timothy H; Deshmukh, Harshahardhan; Bao, Gang

    2017-01-01

    The CRISPR/Cas9 system is a powerful tool for precision genome editing. The ability to accurately modify genomic DNA in situ with single nucleotide precision opens up new possibilities for not only basic research but also biotechnology applications and clinical translation. In this chapter, we outline the procedures for design, screening, and validation of CRISPR/Cas9 systems for targeted modification of coding sequences in the human genome and how to perform genome editing in induced pluripotent stem cells with high efficiency and specificity.

  14. Graphic/symbol segmentation for Group 4 facsimile systems

    NASA Astrophysics Data System (ADS)

    Deutermann, A. R.

    1982-04-01

    The purpose of this study was to examine possible techniques for segmenting graphic and symbol areas and assembling a code that represents the entire document. Parameters considered include compression, commonality with facsimile and TELETEX* transmissions, and complexity of implementation. Six segmentation techniques were selected for analysis. The techniques were designed to differ from each other as much as possible, so as to display a wide variety of characteristics. For each technique, many minor modifications would be possible, but it is not expected that these modifications would alter the conclusions drawn from the study.

  15. Differences in the causes of death of HIV-positive patients in a cohort study by data sources and coding algorithms.

    PubMed

    Hernando, Victoria; Sobrino-Vegas, Paz; Burriel, M Carmen; Berenguer, Juan; Navarro, Gemma; Santos, Ignacio; Reparaz, Jesús; Martínez, M Angeles; Antela, Antonio; Gutiérrez, Félix; del Amo, Julia

    2012-09-10

    To compare causes of death (CoDs) from two independent sources: National Basic Death File (NBDF) and deaths reported to the Spanish HIV Research cohort [Cohort de adultos con infección por VIH de la Red de Investigación en SIDA (CoRIS)] and compare the two coding algorithms: International Classification of Diseases, 10th revision (ICD-10) and revised version of Coding Causes of Death in HIV (revised CoDe). Between 2004 and 2008, CoDs were obtained from the cohort records (free text, multiple causes) and also from NBDF (ICD-10). CoDs from CoRIS were coded according to ICD-10 and revised CoDe by a panel. Deaths were compared by 13 disease groups: HIV/AIDS, liver diseases, malignancies, infections, cardiovascular, blood disorders, pulmonary, central nervous system, drug use, external, suicide, other causes and ill defined. There were 160 deaths. Concordance for the 13 groups was observed in 111 (69%) cases for the two sources and in 115 (72%) cases for the two coding algorithms. According to revised CoDe, the commonest CoDs were HIV/AIDS (53%), non-AIDS malignancies (11%) and liver related (9%); these percentages were similar, 57, 10 and 8%, respectively, for NBDF (coded as ICD-10). When using ICD-10 to code deaths in CoRIS, wherein HIV infection was known in everyone, the proportion of non-AIDS malignancies was 13% and liver-related causes accounted for 3%, while HIV/AIDS reached 70% because liver-related, infection-related and ill-defined causes were coded as HIV/AIDS. There is substantial variation in CoDs in HIV-infected persons according to sources and algorithms. ICD-10 in patients known to be HIV-positive overestimates HIV/AIDS-related deaths at the expense of underestimating liver-related diseases, infections and ill-defined causes. CoDe seems to be the best option for cohort studies.
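    The group-level concordance computation the study reports can be sketched as follows. The records below are toy data; the study's actual 13 disease groups and 160 deaths are not reproduced.

```python
# Each death carries a disease-group assignment from two sources; concordance is
# the fraction of deaths where the two assignments agree (toy data, illustrative).
deaths = [
    ("HIV/AIDS", "HIV/AIDS"),
    ("liver", "HIV/AIDS"),       # e.g. ICD-10 folds a liver death into HIV/AIDS
    ("malignancy", "malignancy"),
    ("infection", "infection"),
]
concordant = sum(1 for a, b in deaths if a == b)
rate = concordant / len(deaths)
print(f"concordance: {concordant}/{len(deaths)} = {rate:.0%}")
```

    The second record illustrates the specific bias the abstract describes: a liver-related death coded as HIV/AIDS under ICD-10 lowers concordance and inflates the HIV/AIDS share.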

  16. Source Methodology for Turbofan Noise Prediction (SOURCE3D Technical Documentation)

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This report provides the analytical documentation for the SOURCE3D Rotor Wake/Stator Interaction Code. It derives the equations for the rotor scattering coefficients and stator source vector and scattering coefficients that are needed for use in the TFANS (Theoretical Fan Noise Design/Prediction System). SOURCE3D treats the rotor and stator as isolated source elements. TFANS uses this information, along with scattering coefficients for inlet and exit elements, and provides complete noise solutions for turbofan engines. SOURCE3D is composed of a collection of FORTRAN programs that have been obtained by extending the approach of the earlier V072 Rotor Wake/Stator Interaction Code. Similar to V072, it treats the rotor and stator as a collection of blades and vanes having zero thickness and camber contained in an infinite, hardwall annular duct. SOURCE3D adds important features to the V072 capability: a rotor element, swirl flow and vorticity waves, actuator disks for flow turning, and combined rotor/actuator disk and stator/actuator disk elements. These items allow reflections from the rotor, frequency scattering, and mode trapping, thus providing more complete noise predictions than previously. The code has been thoroughly verified through comparison with D.B. Hanson's CUP2D two-dimensional code using a narrow annulus test case.

  17. The Cortical Organization of Speech Processing: Feedback Control and Predictive Coding in the Context of a Dual-Stream Model

    ERIC Educational Resources Information Center

    Hickok, Gregory

    2012-01-01

    Speech recognition is an active process that involves some form of predictive coding. This statement is relatively uncontroversial. What is less clear is the source of the prediction. The dual-stream model of speech processing suggests that there are two possible sources of predictive coding in speech perception: the motor speech system and the…

  18. Modification of light sources for appropriate biological action

    NASA Astrophysics Data System (ADS)

    Kozakov, R.; Schöpp, H.; Franke, St.; Stoll, C.; Kunz, D.

    2010-06-01

    The impact of the non-visual action of light on the design of novel light sources is discussed. To this end, possible modifications of lamps involving spectral tailoring, and their action on melatonin suppression in everyday situations, are investigated. Results on melatonin suppression by plasma lamps are presented. It is shown that even short-time exposure at light levels usual in working areas has an influence on melatonin onset.

  19. Mapping of 23S ribosomal RNA modifications in Clostridium sporogenes.

    PubMed

    Kirpekar, Finn; Hansen, Lykke H; Mundus, Julie; Tryggedsson, Stine; Teixeira Dos Santos, Patrícia; Ntokou, Eleni; Vester, Birte

    2018-06-27

    All organisms contain RNA modifications in their ribosomal RNA (rRNA), but the importance, positions and exact functions of these are still not fully elucidated. Various functions, such as stabilising structures, controlling ribosome assembly and facilitating interactions, have been suggested and in some cases substantiated. Bacterial rRNA contains far fewer modifications than eukaryotic rRNA. The rRNA modification patterns of bacteria differ from each other, but too few organisms have been mapped to draw general conclusions. This study maps 23S ribosomal RNA modifications in Clostridium sporogenes, which can be characterised as a non-toxin-producing Clostridium botulinum. Clostridia are able to sporulate and thereby survive harsh conditions, and are in general considered to be resilient to antibiotics. Selected regions of the 23S rRNA were investigated by mass spectrometry and by primer extension analysis to pinpoint modified sites and the nature of the modifications. C. sporogenes 23S rRNA appears to contain few modifications compared with other investigated bacteria. No modifications were identified in domains II and III of the 23S rRNA. Three modifications were identified in domain IV, all of which have also been found in other organisms. Two unusual modifications were identified in domain V, a methylated dihydrouridine at position U2449 and a dihydrouridine at position U2500 (Escherichia coli numbering), in addition to four previously known modified positions. The enzymes responsible for the modifications were searched for in the C. sporogenes genome using BLAST with characterised enzymes as queries. The search identified genes potentially coding for RNA-modifying enzymes responsible for most of the modifications found.

  20. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media, based on the finite difference method at local to regional scales. The code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer absorbing boundary condition. A hybrid-style programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations, such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documentation in a public repository.
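As a much-reduced illustration of the explicit finite difference approach such wave-propagation codes are built on (a 1D scalar wave equation rather than 3D viscoelasticity, and none of OpenSWPC's actual code), a minimal time-stepping scheme looks like:

```python
import numpy as np

# 1D acoustic wave equation u_tt = c^2 u_xx, second-order explicit FD scheme
nx, nt = 200, 300
dx, c = 1.0, 1.0
dt = 0.5 * dx / c                             # CFL-stable time step
x = np.arange(nx) * dx
u_prev = np.exp(-0.01 * (x - 50.0) ** 2)      # Gaussian initial pulse
u = u_prev.copy()                             # zero initial velocity
lap = np.zeros(nx)
for _ in range(nt):
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]        # discrete Laplacian
    u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_next[0] = u_next[-1] = 0.0                      # rigid boundaries
    u_prev, u = u, u_next
```

A production code replaces the rigid boundaries with absorbing layers (OpenSWPC uses a perfectly matched layer) and adds the viscoelastic memory variables of the Zener model.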

  1. Ube2V2 Is a Rosetta Stone Bridging Redox and Ubiquitin Codes, Coordinating DNA Damage Responses.

    PubMed

    Zhao, Yi; Long, Marcus J C; Wang, Yiran; Zhang, Sheng; Aye, Yimon

    2018-02-28

    Posttranslational modifications (PTMs) are the lingua franca of cellular communication. Most PTMs are enzyme-orchestrated. However, the reemergence of electrophilic drugs has ushered in the mining of unconventional, non-enzyme-catalyzed electrophile-signaling pathways. Despite the latest impetus toward harnessing kinetically and functionally privileged cysteines for electrophilic drug design, identifying these sensors remains challenging. Herein, we designed "G-REX", a technique that allows controlled release of reactive electrophiles in vivo. Mitigating the toxicity and off-target effects associated with uncontrolled bolus exposure, G-REX tagged first-responding innate cysteines that bind electrophiles under true k_cat/K_m conditions. G-REX identified two allosteric ubiquitin-conjugating proteins, Ube2V1/Ube2V2, sharing a novel privileged sensor cysteine. This non-enzyme-catalyzed PTM triggered responses specific to each protein. Thus, G-REX is an unbiased method to identify novel functional cysteines. In contrast to conventional active-site/off-active-site cysteine modifications that regulate target activity, modification of Ube2V2 allosterically hyperactivated its enzymatically active binding partner Ube2N, promoting K63-linked client ubiquitination and stimulating the H2AX-dependent DNA damage response. This work establishes Ube2V2 as a Rosetta stone bridging the redox and ubiquitin codes to guard genome integrity.

  2. MR-Tandem: parallel X!Tandem using Hadoop MapReduce on Amazon Web Services.

    PubMed

    Pratt, Brian; Howbert, J Jeffry; Tasman, Natalie I; Nilsson, Erik J

    2012-01-01

    MR-Tandem adapts the popular X!Tandem peptide search engine to work with Hadoop MapReduce for reliable parallel execution of large searches. MR-Tandem runs on any Hadoop cluster but offers special support for Amazon Web Services for creating inexpensive on-demand Hadoop clusters, enabling search volumes that might not otherwise be feasible with the compute resources a researcher has at hand. MR-Tandem is designed to drop in wherever X!Tandem is already in use and requires no modification to existing X!Tandem parameter files, and only minimal modification to X!Tandem-based workflows. MR-Tandem is implemented as a lightly modified X!Tandem C++ executable and a Python script that drives Hadoop clusters including Amazon Web Services (AWS) Elastic Map Reduce (EMR), using the modified X!Tandem program as a Hadoop Streaming mapper and reducer. The modified X!Tandem C++ source code is Artistic licensed, supports pluggable scoring, and is available as part of the Sashimi project at http://sashimi.svn.sourceforge.net/viewvc/sashimi/trunk/trans_proteomic_pipeline/extern/xtandem/. The MR-Tandem Python script is Apache licensed and available as part of the Insilicos Cloud Army project at http://ica.svn.sourceforge.net/viewvc/ica/trunk/mr-tandem/. Full documentation and a windows installer that configures MR-Tandem, Python and all necessary packages are available at this same URL. brian.pratt@insilicos.com
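The Hadoop Streaming pattern MR-Tandem relies on, a mapper and a reducer communicating through sorted key/value text lines on stdin/stdout, can be sketched generically. The peptide-counting logic below is a hypothetical stand-in, not X!Tandem's scoring:

```python
from itertools import groupby

def mapper(lines):
    # map phase: emit "peptide\t1" per (hypothetical) search-result line
    for line in lines:
        peptide = line.strip().split(",")[0]
        yield f"{peptide}\t1"

def reducer(sorted_lines):
    # reduce phase: Hadoop delivers mapper output grouped/sorted by key
    parsed = (line.split("\t") for line in sorted_lines)
    for key, group in groupby(parsed, key=lambda kv: kv[0]):
        yield f"{key}\t{sum(int(v) for _, v in group)}"

# simulate the streaming pipeline: map -> shuffle/sort -> reduce
mapped = sorted(mapper(["pepA,spec1", "pepB,spec2", "pepA,spec3"]))
counts = list(reducer(mapped))
```

In a real Streaming job the same two stages run as separate processes over stdin/stdout, with Hadoop performing the sort between them.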

  3. Hypersonic simulations using open-source CFD and DSMC solvers

    NASA Astrophysics Data System (ADS)

    Casseau, V.; Scanlon, T. J.; John, B.; Emerson, D. R.; Brown, R. E.

    2016-11-01

    Hypersonic hybrid hydrodynamic-molecular gas flow solvers are required to satisfy the two essential requirements of any high-speed reacting code: physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code which will eventually reconcile the direct simulation Monte-Carlo method, making use of the OpenFOAM application called dsmcFoam, and the newly coded open-source two-temperature computational fluid dynamics solver named hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular with the CFD solver, to ensure its efficacy before considering more advanced test cases. The hy2Foam and dsmcFoam codes have been shown to compare reasonably well, thus providing a useful basis for other codes to compare against.

  4. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding the use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models that recognizes waves crossing the interface in both directions has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  5. An integrated, structure- and energy-based view of the genetic code.

    PubMed

    Grosjean, Henri; Westhof, Eric

    2016-09-30

    The principles of mRNA decoding are conserved among all extant life forms. We present an integrative view of all the interaction networks between mRNA, tRNA and rRNA: the intrinsic stability of codon-anticodon duplex, the conformation of the anticodon hairpin, the presence of modified nucleotides, the occurrence of non-Watson-Crick pairs in the codon-anticodon helix and the interactions with bases of rRNA at the A-site decoding site. We derive a more information-rich, alternative representation of the genetic code, that is circular with an unsymmetrical distribution of codons leading to a clear segregation between GC-rich 4-codon boxes and AU-rich 2:2-codon and 3:1-codon boxes. All tRNA sequence variations can be visualized, within an internal structural and energy framework, for each organism, and each anticodon of the sense codons. The multiplicity and complexity of nucleotide modifications at positions 34 and 37 of the anticodon loop segregate meaningfully, and correlate well with the necessity to stabilize AU-rich codon-anticodon pairs and to avoid miscoding in split codon boxes. The evolution and expansion of the genetic code is viewed as being originally based on GC content with progressive introduction of A/U together with tRNA modifications. The representation we present should help the engineering of the genetic code to include non-natural amino acids. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Parallel computing techniques for rotorcraft aerodynamics

    NASA Astrophysics Data System (ADS)

    Ekici, Kivanc

    The modification of unsteady three-dimensional Navier-Stokes codes for application on massively parallel and distributed computing environments is investigated. The Euler/Navier-Stokes code TURNS (Transonic Unsteady Rotor Navier-Stokes) was chosen as a test bed because of its wide use by universities and industry. For the efficient implementation of TURNS on parallel computing systems, two algorithmic changes are developed. First, modifications are made to the implicit operator, the Lower-Upper Symmetric Gauss-Seidel (LU-SGS) scheme originally used in TURNS. Second, an inexact Newton method coupled with a Krylov subspace iterative method (a Newton-Krylov method) is applied. Both techniques had previously been tried for the Euler mode of the code. In this work, we extend the methods to the Navier-Stokes mode. Several new implicit operators were tried because of convergence problems of traditional operators with the high cell aspect ratio (CAR) grids needed for viscous calculations on structured grids. Promising results for both Euler and Navier-Stokes cases are presented for these operators. For the efficient application of Newton-Krylov methods to the Navier-Stokes mode of TURNS, efficient preconditioners must be used. The parallel implicit operators used in the previous step are employed as preconditioners and the results are compared. The Message Passing Interface (MPI) protocol has been used because of its portability to various parallel architectures. It should be noted that the proposed methodology is general and can be applied to several other CFD codes (e.g., OVERFLOW).
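The Newton-Krylov idea described above can be sketched in miniature: the Jacobian is never formed, and each Newton step solves the linearized system approximately with GMRES using finite-difference matrix-vector products. This is a toy 2x2 illustration under those assumptions, not the TURNS implementation:

```python
import numpy as np

def jv(F, x, v, eps=1e-7):
    # Jacobian-free matrix-vector product: J(x) v ~= (F(x + eps*v) - F(x)) / eps
    return (F(x + eps * v) - F(x)) / eps

def gmres(Av, b, m=20, tol=1e-10):
    # bare-bones GMRES: Arnoldi basis plus a small least-squares solve
    m = min(m, b.size)
    beta = np.linalg.norm(b)
    if beta == 0.0:
        return np.zeros_like(b)
    Q = np.zeros((b.size, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = b / beta
    k = m
    for j in range(m):
        w = Av(Q[:, j])
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < tol:                  # lucky breakdown: exact solve
            k = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(k + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:k + 1, :k], e1, rcond=None)
    return Q[:, :k] @ y

def newton_krylov(F, x0, iters=30, tol=1e-8):
    # inexact Newton: each step solves J dx = -F approximately with GMRES
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        x = x + gmres(lambda v: jv(F, x, v), -r)
    return x

# toy system: circle of radius 2 intersected with the line x0 = x1
F = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]])
root = newton_krylov(F, np.array([1.0, 2.0]))
```

In a CFD code the same structure appears at scale, with the implicit operator (e.g. LU-SGS) applied as a preconditioner inside the Krylov solve.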

  7. The tubulin code at a glance.

    PubMed

    Gadadhar, Sudarshan; Bodakuntla, Satish; Natarajan, Kathiresan; Janke, Carsten

    2017-04-15

    Microtubules are key cytoskeletal elements of all eukaryotic cells and are assembled of evolutionarily conserved α-tubulin-β-tubulin heterodimers. Despite their uniform structure, microtubules fulfill a large diversity of functions. A regulatory mechanism to control the specialization of the microtubule cytoskeleton is the 'tubulin code', which is generated by (i) expression of different α- and β-tubulin isotypes, and by (ii) post-translational modifications of tubulin. In this Cell Science at a Glance article and the accompanying poster, we provide a comprehensive overview of the molecular components of the tubulin code, and discuss the mechanisms by which these components contribute to the generation of functionally specialized microtubules. © 2017. Published by The Company of Biologists Ltd.

  8. An update on the BQCD Hybrid Monte Carlo program

    NASA Astrophysics Data System (ADS)

    Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk

    2018-03-01

    We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010 the program has been extended in various ways. New features of the code include: dynamical QED; action modification in order to compute matrix elements by using the Feynman-Hellmann theorem; more trace measurements (like Tr(D^-n) for K, c_SW and chemical potential reweighting); a more flexible integration scheme; polynomial filtering; term-splitting for RHMC; and a portable implementation of performance-critical parts employing SIMD.
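The Hybrid Monte Carlo algorithm at the core of BQCD can be illustrated on a toy one-dimensional target; this sketch has the characteristic leapfrog-plus-Metropolis structure but omits gauge fields, pseudofermions, filtering and RHMC entirely:

```python
import numpy as np

def hmc(logp, grad_logp, x0, n_samples=2000, eps=0.2, n_leap=10, seed=0):
    # toy Hybrid Monte Carlo: leapfrog trajectory + Metropolis accept/reject
    rng = np.random.default_rng(seed)
    x, chain, accepted = float(x0), [], 0
    for _ in range(n_samples):
        p0 = rng.standard_normal()                    # refresh momentum
        x_new, p_new = x, p0
        p_new += 0.5 * eps * grad_logp(x_new)         # half kick
        for _ in range(n_leap - 1):
            x_new += eps * p_new                      # drift
            p_new += eps * grad_logp(x_new)           # full kick
        x_new += eps * p_new
        p_new += 0.5 * eps * grad_logp(x_new)         # final half kick
        # Metropolis test on the change in total energy H = p^2/2 - log p(x)
        dH = (0.5 * p_new ** 2 - logp(x_new)) - (0.5 * p0 ** 2 - logp(x))
        if np.log(rng.uniform()) < -dH:
            x, accepted = x_new, accepted + 1
        chain.append(x)
    return np.array(chain), accepted / n_samples

# standard normal target: log p(x) = -x^2/2 (up to a constant)
chain, acc_rate = hmc(lambda x: -0.5 * x * x, lambda x: -x, 0.0)
```

The reversible, volume-preserving leapfrog integrator is what makes the Metropolis correction exact; production codes tune eps and the trajectory length and split the action across multiple integration scales.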

  9. Beyond Nazi War Crimes Experiments: The Voluntary Consent Requirement of the Nuremberg Code at 70.

    PubMed

    Annas, George J

    2018-01-01

    The year 2017 marks both the 70th anniversary of the Nuremberg Code and the first major revisions of federal research regulations in almost 3 decades. I suggest that the informed consent provisions of the federal research regulations continue to follow the requirements of the Nuremberg Code. However, modifications are needed to the informed consent (and institutional review board) provisions to make the revised federal regulations more effective in promoting a genuine conversation between the researcher and the research subject. This conversation must take seriously both the therapeutic illusion and the desire of both the researcher and the research subject not to engage in sharing uncertainty.

  10. Extensions and improvements on XTRAN3S

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Improvements to the XTRAN3S computer program are summarized. Work on this code, for steady and unsteady aerodynamic and aeroelastic analysis in the transonic flow regime has concentrated on the following areas: (1) Maintenance of the XTRAN3S code, including correction of errors, enhancement of operational capability, and installation on the Cray X-MP system; (2) Extension of the vectorization concepts in XTRAN3S to include additional areas of the code for improved execution speed; (3) Modification of the XTRAN3S algorithm for improved numerical stability for swept, tapered wing cases and improved computational efficiency; and (4) Extension of the wing-only version of XTRAN3S to include pylon and nacelle or external store capability.

  11. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.

    PubMed

    Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-03-31

    As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 31.03.2017.

  12. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  13. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  14. Modeling the Volcanic Source at Long Valley, CA, Using a Genetic Algorithm Technique

    NASA Technical Reports Server (NTRS)

    Tiampo, Kristy F.

    1999-01-01

    In this project, we attempted to model the deformation pattern due to the magmatic source at Long Valley caldera using a real-value coded genetic algorithm (GA) inversion similar to that found in Michalewicz, 1992. The project has been both successful and rewarding. The genetic algorithm, coded in the C programming language, performs stable inversions over repeated trials, with varying initial and boundary conditions. The original model used a GA in which the geophysical information was coded into the fitness function through the computation of surface displacements for a Mogi point source in an elastic half-space. The program was designed to invert for a spherical magmatic source - its depth, horizontal location and volume - using the known surface deformations. It also included the capability of inverting for multiple sources.
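A compact sketch of this approach, assuming the standard Mogi point-source expression for vertical surface displacement and a generic real-valued GA (tournament selection, blend crossover, Gaussian mutation, elitism); the observation geometry and parameter values below are invented for illustration and are not the Long Valley inversion:

```python
import numpy as np

def mogi_uz(r, depth, strength):
    # Mogi point source: vertical surface displacement at radial distance r,
    # with strength standing in for (1 - nu) * dV / pi
    return strength * depth / (r ** 2 + depth ** 2) ** 1.5

rng = np.random.default_rng(1)
r_obs = np.linspace(0.0, 20.0, 30)        # invented radial observation distances
data = mogi_uz(r_obs, 5.0, 100.0)         # synthetic "observed" uplift

def misfit(ind):
    depth, strength = ind
    return float(np.sum((mogi_uz(r_obs, depth, strength) - data) ** 2))

# real-valued GA over (depth, strength)
pop = rng.uniform([1.0, 10.0], [10.0, 200.0], size=(40, 2))
initial_best = min(misfit(ind) for ind in pop)
best = min(pop, key=misfit).copy()
for _ in range(80):
    fit = np.array([misfit(ind) for ind in pop])
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((fit[i] < fit[j])[:, None], pop[i], pop[j])   # tournaments
    alpha = rng.uniform(size=(len(pop), 1))
    children = alpha * parents + (1.0 - alpha) * parents[::-1]       # blend crossover
    children += rng.normal(0.0, [0.05, 1.0], children.shape)         # mutation
    children[0] = best                                               # elitism
    pop = children
    cand = min(pop, key=misfit)
    if misfit(cand) < misfit(best):
        best = cand.copy()
```

The fitness function is where the geophysics enters, exactly as the abstract describes: the forward Mogi model turns candidate source parameters into predicted surface displacements for comparison with the data.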

  15. Scalable video transmission over Rayleigh fading channels using LDPC codes

    NASA Astrophysics Data System (ADS)

    Bansal, Manu; Kondi, Lisimachos P.

    2005-03-01

    In this paper, we investigate the important problem of efficiently utilizing the available resources for video transmission over wireless channels while maintaining good decoded video quality and resilience to channel impairments. Our system consists of a video codec based on the 3-D set partitioning in hierarchical trees (3-D SPIHT) algorithm and employs two different schemes using low-density parity-check (LDPC) codes for channel error protection. The first method uses the serial concatenation of a constant-rate LDPC code and rate-compatible punctured convolutional (RCPC) codes. A cyclic redundancy check (CRC) is used to detect transmission errors. In the other scheme, we use a product code structure consisting of a constant-rate LDPC/CRC code across the rows of the 'blocks' of source data and an erasure-correcting systematic Reed-Solomon (RS) code as the column code. In both schemes, we use fixed-length source packets protected with unequal forward error correction coding, ensuring strictly decreasing protection across the bitstream. A Rayleigh flat-fading channel with additive white Gaussian noise (AWGN) is modeled for the transmission. A rate-distortion optimization algorithm is developed and carried out for the selection of source coding and channel coding rates using Lagrangian optimization. The experimental results demonstrate the effectiveness of this system under different wireless channel conditions; both of the proposed methods (LDPC+RCPC/CRC and RS+LDPC/CRC) outperform more conventional schemes such as those employing RCPC/CRC.
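The Lagrangian selection step can be sketched as follows, with invented per-packet operating points (not the paper's measured rate-distortion data): each packet independently picks the protection option minimizing the cost D + lambda * R, and sweeping lambda traces out the rate-distortion trade-off.

```python
# Invented operating points per packet: (parity-rate overhead R, expected distortion D).
# More parity costs rate but lowers expected distortion after channel losses.
options = {
    "packet0": [(0.1, 6.0), (0.3, 2.5), (0.5, 1.0)],
    "packet1": [(0.1, 3.0), (0.3, 1.2), (0.5, 0.4)],
}

def allocate(options, lam):
    # independent per-packet minimization of the Lagrangian cost D + lam * R
    return {pkt: min(pts, key=lambda rd: rd[1] + lam * rd[0])
            for pkt, pts in options.items()}

weak = allocate(options, 100.0)   # rate is expensive -> minimal protection
strong = allocate(options, 1.0)   # rate is cheap -> maximal protection
```

In the actual system the distortion values come from the embedded 3-D SPIHT bitstream and the channel model, which is what makes earlier (more important) packets receive stronger protection.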

  16. Navier-Stokes analysis of cold scramjet-afterbody flows

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Engelund, Walter C.; Eleshaky, Mohamed E.

    1989-01-01

    The progress of two efforts in coding solutions of Navier-Stokes equations is summarized. The first effort concerns a 3-D space-marching parabolized Navier-Stokes (PNS) code being modified to compute the supersonic mixing flow through an internal/external expansion nozzle with multicomponent gases. The 3-D PNS equations, coupled with a set of species continuity equations, are solved using an implicit finite difference scheme. The completed work is summarized and includes code modifications for four chemical species; computing the flow upstream of the upper cowl for a theoretical air mixture; developing an initial-plane solution for the inner nozzle region; computing the flow inside the nozzle for both a N2/O2 mixture and a Freon-12/Ar mixture; and plotting density-pressure contours for the inner nozzle region. The second effort concerns a full Navier-Stokes code. The species continuity equations account for the diffusion of multiple gases. This 3-D explicit afterbody code has the ability to use high-order numerical integration schemes such as the 4th-order MacCormack and the Gottlieb-MacCormack schemes. Changes to the work are listed and include, but are not limited to: (1) internal/external flow capability; (2) new treatments of the cowl wall boundary conditions and relaxed computations around the cowl region and cowl tip; (3) the entering of the thermodynamic and transport properties of Freon-12, Ar, O, and N; (4) modification of the Baldwin-Lomax turbulence model to account for turbulent eddies generated by cowl walls inside and external to the nozzle; and (5) adopting a relaxation formula to account for the turbulence in the mixing shear layer.

  17. Methodology of decreasing software complexity using ontology

    NASA Astrophysics Data System (ADS)

    Dąbrowska-Kubik, Katarzyna

    2015-09-01

    In this paper a model of a web application's source code, based on the OSD ontology (Ontology for Software Development), is proposed. This model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of that source code, using many different maintenance techniques, such as creating documentation and eliminating dead code, cloned code, or previously known bugs [1][2]. With this approach, savings in the software maintenance costs of web applications will be possible.

  18. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
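The independence approximation at the heart of this scheme can be quantified on a toy distribution: coding each codeword bit with its marginal probability can never beat the joint symbol entropy, and the gap is the method's modeling overhead. The numbers below are illustrative only, not the article's simulation results:

```python
import math

# Quantizer symbols carry fixed-length 2-bit codewords; the distribution is skewed
probs = {"00": 0.7, "01": 0.1, "10": 0.1, "11": 0.1}

# joint entropy: the rate an ideal entropy coder on whole codewords achieves
H_joint = -sum(p * math.log2(p) for p in probs.values())

def h(p):
    # binary entropy function
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# bit-wise model: each codeword bit is coded with its own marginal probability
p_b0 = sum(p for w, p in probs.items() if w[0] == "1")   # P(first bit = 1)
p_b1 = sum(p for w, p in probs.items() if w[1] == "1")   # P(second bit = 1)
H_bitwise = h(p_b0) + h(p_b1)        # rate when bits are treated as independent
overhead = H_bitwise - H_joint       # >= 0 by the independence bound
```

The attraction of the bit-wise model is that it needs only one probability per bit position, which keeps the adaptive overhead small when used block-adaptively.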

  19. Astrophysics Source Code Library -- Now even better!

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Schmidt, Judy; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2015-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. Indexed by ADS, it now contains nearly 1,000 codes and with recent major changes, is better than ever! The resource has a new infrastructure that offers greater flexibility and functionality for users, including an easier submission process, better browsing, one-click author search, and an RSS feeder for news. The new database structure is easier to maintain and offers new possibilities for collaboration. Come see what we've done!

  20. Study of statistical coding for digital TV

    NASA Technical Reports Server (NTRS)

    Gardenhire, L. W.

    1972-01-01

    The results are presented for a detailed study to determine a pseudo-optimum statistical code to be installed in a digital TV demonstration test set. Studies of source encoding were undertaken, using redundancy removal techniques in which the picture is reproduced within a preset tolerance. A method of source encoding, which preliminary studies show to be encouraging, is statistical encoding. A pseudo-optimum code was defined and the associated performance of the code was determined. The format was fixed at 525 lines per frame, 30 frames per second, as per commercial standards.
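A pseudo-optimum statistical code of the kind studied here is typically built from measured symbol statistics in the manner of a Huffman code. A minimal construction, with invented probabilities (e.g. run lengths measured from a scan line) rather than the study's actual statistics:

```python
import heapq
from itertools import count

def huffman(freqs):
    # classic Huffman construction: repeatedly merge the two least-probable nodes
    tick = count()   # tie-breaker so the heap never compares the code dicts
    heap = [(f, next(tick), {sym: ""}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + bits for s, bits in c1.items()}
        merged.update({s: "1" + bits for s, bits in c2.items()})
        heapq.heappush(heap, (f1 + f2, next(tick), merged))
    return heap[0][2]

# invented symbol statistics for illustration
code = huffman({"run1": 0.50, "run2": 0.25, "run3": 0.15, "run4": 0.10})
```

Frequent symbols receive short codewords, so the average code length approaches the source entropy; a hardware test set would fix such a table in advance, as the abstract's "pseudo-optimum" code suggests.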
