Interface requirements for coupling a containment code to reactor system thermal hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baratta, A.J.
1997-07-01
To perform a complete analysis of a reactor transient, not only the primary system response but also the containment response must be accounted for. Transients and accidents such as a loss-of-coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response, and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing were determined from the system code, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long-term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology make an alternative approach available. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.
User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.
1982-01-01
This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculations with experimental flows. The input/output and general use of the code are described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, a list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.
2004-09-14
This document contains detailed user instructions for the suite of utility codes developed for Rev. 1 of the Systems Assessment Capability; the suite of computer codes performs many functions.
Use the Bar Code System to Improve Accuracy of the Patient and Sample Identification.
Chuang, Shu-Hsia; Yeh, Huy-Pzu; Chi, Kun-Hung; Ku, Hsueh-Chen
2018-01-01
Timely and correct sample collection is highly related to patient safety. Between January and April 2016, the sample error rate was 11.1%, caused by mislabeled patient information and wrong sample containers. We developed a barcode-based "Specimen Identification System" through process reengineering of the TRM, using bar code scanners, adding sample container instructions, and providing a mobile app. In conclusion, the bar code system improved patient safety and created a green environment.
The general theory of convolutional codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Stanley, R. P.
1993-01-01
This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.
GTA Welding Research and Development for Plutonium Containment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sessions, C.E.
2002-02-21
This paper discusses the development of two welding systems that are used to contain actinide metals and oxides for long term storage. The systems are termed the bagless transfer system (BTS) and the outer container welder (OCW) system. The BTS is so named because it permits the containment of actinides without a polymeric package (i.e., bag). The development of these two systems was directed by Department of Energy Standard 3013, hereafter referred to as DOE 3013. This document defines the product and container requirements. In addition, it references national codes and standards for leak rates (ANSI N14.5) and for design (the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section VIII, B&PVC).
The Library Systems Act and Rules for Administering the Library Systems Act.
ERIC Educational Resources Information Center
Texas State Library, Austin. Library Development Div.
This document contains the Texas Library Systems Act and rules for administering the Library Systems Act. Specifically, it includes the following documents: Texas Library Systems Act; Summary of Codes; Texas Administrative Code: Service Complaints and Protest Procedure; Criteria For Texas Library System Membership; and Certification Requirements…
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
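For readers unfamiliar with the spreading sequences compared above, the short sketch below (illustrative only, not code from the paper; the function name and sizes are assumptions) generates length-N Walsh-Hadamard sequences by the Sylvester recursion and checks their orthogonality.

import numpy as np

def walsh_hadamard(n):
    """Return an n x n Walsh-Hadamard matrix (n a power of 2); each row is
    one +/-1 spreading sequence of length n chips."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])   # Sylvester recursion
    return h

codes = walsh_hadamard(8)      # 8 orthogonal sequences of length 8
print(codes @ codes.T)         # diagonal matrix: the rows are orthogonal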
Practical guide to bar coding for patient medication safety.
Neuenschwander, Mark; Cohen, Michael R; Vaida, Allen J; Patchett, Jeffrey A; Kelly, Jamie; Trohimovich, Barbara
2003-04-15
Bar coding for the medication administration step of the drug-use process is discussed. FDA will propose a rule in 2003 that would require bar-code labels on all human drugs and biologicals. Even with an FDA mandate, manufacturer procrastination and possible shifts in product availability are likely to slow progress. Such delays should not preclude health systems from adopting bar-code-enabled point-of-care (BPOC) systems to achieve gains in patient safety. Bar-code technology is a replacement for traditional keyboard data entry. The elements of bar coding are content, which determines the meaning; data format, which refers to the embedded data; and symbology, which describes the "font" in which the machine-readable code is written. For a BPOC system to deliver an acceptable level of patient protection, the hospital must first establish reliable processes for a patient identification band, caregiver badge, and medication bar coding. Medications can have either drug-specific or patient-specific bar codes. Both varieties result in the desired code that supports the patient's five rights of drug administration. When medications are not available from the manufacturer in immediate-container bar-coded packaging, other means of applying the bar code must be devised, including the use of repackaging equipment, overwrapping, manual bar coding, and outsourcing. Virtually all medications should be bar coded, the bar code on the label should be easily readable, and appropriate policies, procedures, and checks should be in place. Bar coding has the potential not only to be cost-effective but also to produce a return on investment. By bar coding patient identification tags, caregiver badges, and immediate-container medications, health systems can substantially increase patient safety during medication administration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.
This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, two-phase flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.
New quantum codes derived from a family of antiprimitive BCH codes
NASA Astrophysics Data System (ADS)
Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin
The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q^2-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q ≥ 4 is a power of 2 and m ≥ 2. By a detailed analysis of some useful properties of q^2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q^2-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
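For context, the Hermitian construction invoked above is the standard one from the quantum coding literature (a textbook statement, not a result specific to this paper): if C is an [n, k, d] linear code over GF(q^2) that contains its Hermitian dual, then
\[ \mathcal{C}^{\perp_H} \subseteq \mathcal{C} \;\Longrightarrow\; \exists \text{ a quantum code } [[\,n,\ 2k-n,\ \ge d\,]]_q , \]
so exhibiting antiprimitive BCH codes that satisfy the dual-containing condition immediately yields the new quantum codes.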
Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murata, K.K.; Williams, D.C.; Griffith, R.O.
1997-12-01
The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.
Arabic Natural Language Processing System Code Library
2014-06-01
Contents include: Code Compilation; Training Instructions; Applying the System to New Examples; License; History; Important Note; Papers to... a slightly different English dependency scheme and contained a variety of improvements. However, the PropBank-style SRL module was not maintained... than those in the http://sourceforge.net/projects/miacp/ release.) Important Note: This release contains a variety of bug fixes and other generally...
Edge-diffraction effects in RCS predictions and their importance in systems analysis
NASA Astrophysics Data System (ADS)
Friess, W. F.; Klement, D.; Ruppel, M.; Stein, Volker
1996-06-01
In developing RCS prediction codes, a variety of physical effects such as edge diffraction have to be considered, with the consequence that the computational effort increases considerably. This fact limits the field of application of such codes, especially if the RCS data serve as input parameters for system simulators, which very often need these data for a large number of observation angles and/or frequencies. Conversely, the results of a system analysis can be used to estimate the relevance of physical effects from a system viewpoint and to rank them according to their magnitude. This paper tries to evaluate the importance for systems analysis of RCS predictions that contain an edge-diffracted field. A double dihedral with strong depolarizing behavior and a generic airplane design containing many arbitrarily oriented edges are used as test structures. Data of the scattered field are generated by the RCS computer code SIGMA with and without including edge diffraction effects. These data are submitted to the code DORA to determine radar range and radar detectability and to a SAR simulator code to generate SAR imagery. In both cases special scenarios are assumed. The essential features of the computer codes in their current state are described, and the results are presented and discussed from a systems viewpoint.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code, and a fast probability integrator. The NESSUS software system is described. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are provided.
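To make the kind of quantity a fast probability integrator targets concrete, the sketch below gives a minimal brute-force Monte Carlo estimate of a failure probability for a simple limit state; the limit state, distributions, and numbers are invented for illustration and are not NESSUS models or methods.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical limit state g = R - S: failure when the load S exceeds the
# resistance R (both normally distributed, parameters invented).
R = rng.normal(loc=10.0, scale=1.0, size=n)
S = rng.normal(loc=7.0, scale=1.5, size=n)

pf = np.mean((R - S) < 0.0)     # Monte Carlo estimate of failure probability
print(f"estimated P_f = {pf:.4f}")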
Container-code recognition system based on computer vision and deep neural networks
NASA Astrophysics Data System (ADS)
Liu, Yi; Li, Tianjian; Jiang, Li; Liang, Xiaoyao
2018-04-01
Automatic container-code recognition has become a crucial requirement for the ship transportation industry in recent years. In this paper, an automatic container-code recognition system based on computer vision and deep neural networks is proposed. The system consists of two modules, a detection module and a recognition module. The detection module applies algorithms based on both computer vision and neural networks, and generates a better detection result by combining them so as to avoid the drawbacks of either method alone. The combined detection results are also collected for online training of the neural networks. The recognition module exploits both character segmentation and end-to-end recognition, and outputs the recognition result which passes the verification. When the recognition module generates a false recognition, the result is corrected and collected for online training of the end-to-end recognition sub-module. By combining several algorithms, the system is able to deal with more situations, and the online training mechanism can improve the performance of the neural networks at runtime. The proposed system achieves an overall recognition accuracy of 93%.
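The paper does not spell out its combination rule; the sketch below shows one plausible way a computer-vision detection and a neural-network detection could be merged, with uncertain cases queued as samples for online training. All function names, box formats, and thresholds are assumptions, not the authors' code.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    if a is None or b is None:
        return 0.0
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def combine_detections(cv_box, nn_box, nn_score, threshold=0.8):
    """Prefer the neural-network detection when it is confident and consistent
    with the computer-vision detection; otherwise fall back to the CV box and
    flag the case so it can be collected for online training."""
    if nn_box is not None and nn_score >= threshold and iou(cv_box, nn_box) > 0.5:
        return nn_box, False                      # accepted; no training sample
    return (cv_box if cv_box is not None else nn_box), True   # queue for training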
Analysis of PANDA Passive Containment Cooling Steady-State Tests with the Spectra Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stempniewicz, Marek M
2000-07-15
Results of post-test simulations of the PANDA passive containment cooling (PCC) steady-state tests (S-series tests), performed at the PANDA facility at the Paul Scherrer Institute, Switzerland, are presented. The simulation has been performed using the computer code SPECTRA, a thermal-hydraulic code designed specifically for analyzing containment behavior of nuclear power plants. Results of the present calculations are compared to the measurement data as well as the results obtained earlier with the codes MELCOR, TRAC-BF1, and TRACG. The calculated PCC efficiencies are somewhat lower than the measured values. Similar underestimation of PCC efficiencies had been obtained in the past with the other computer codes. To explain this difference, it is postulated that condensate coming into the tubes forms a stream of liquid in one or two tubes, leaving most of the tubes unaffected. The condensate entering the water box is assumed to fall down in the form of droplets. With these assumptions, the results calculated with SPECTRA are close to the experimental data. It is concluded that the SPECTRA code is a suitable tool for analyzing containments of advanced reactors equipped with passive containment cooling systems.
ABSIM. Simulation of Absorption Systems in Flexible and Modular Form
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grossman, G.
1994-06-01
The computer code has been developed for simulation of absorption systems at steady-state in a flexible and modular form, making it possible to investigate various cycle configurations with different working fluids. The code is based on unit subroutines containing the governing equations for the system's components. When all the equations have been established, a mathematical solver routine is employed to solve them simultaneously. Property subroutines contained in a separate data base serve to provide thermodynamic properties of the working fluids. The code is user-oriented and requires a relatively simple input containing the given operating conditions and the working fluid at each state point. The user conveys to the computer an image of the cycle by specifying the different components and their interconnections. Based on this information, the program calculates the temperature, flowrate, concentration, pressure and vapor fraction at each state point in the system and the heat duty at each unit, from which the coefficient of performance may be determined. A graphical user-interface is provided to facilitate interactive input and study of the output.
Jones, B E; South, B R; Shao, Y; Lu, C C; Leng, J; Sauer, B C; Gundlapalli, A V; Samore, M H; Zeng, Q
2018-01-01
Identifying pneumonia using diagnosis codes alone may be insufficient for research on clinical decision making. Natural language processing (NLP) may enable the inclusion of cases missed by diagnosis codes. This article (1) develops an NLP tool that identifies the clinical assertion of pneumonia from physician emergency department (ED) notes, and (2) compares classification methods using diagnosis codes versus NLP against a gold standard of manual chart review to identify patients initially treated for pneumonia. Among a national population of ED visits occurring between 2006 and 2012 across the Veterans Affairs health system, we extracted 811 physician documents containing search terms for pneumonia for training, and 100 random documents for validation. Two reviewers annotated span- and document-level classifications of the clinical assertion of pneumonia. An NLP tool using a support vector machine was trained on the enriched documents. We extracted diagnosis codes assigned in the ED and upon hospital discharge and calculated performance characteristics for diagnosis codes, NLP, and NLP plus diagnosis codes against manual review in training and validation sets. Among the training documents, 51% contained clinical assertions of pneumonia; in the validation set, 9% were classified with pneumonia, of which 100% contained pneumonia search terms. After enriching with search terms, the NLP system alone demonstrated a recall/sensitivity of 0.72 (training) and 0.55 (validation), and a precision/positive predictive value (PPV) of 0.89 (training) and 0.71 (validation). ED-assigned diagnostic codes demonstrated lower recall/sensitivity (0.48 and 0.44) but higher precision/PPV (0.95 in training, 1.0 in validation); the NLP system identified more "possible-treated" cases than diagnostic coding. An approach combining NLP and ED-assigned diagnostic coding classification achieved the best performance (sensitivity 0.89 and PPV 0.80). System-wide application of NLP to clinical text can increase capture of initial diagnostic hypotheses, an important inclusion when studying diagnosis and clinical decision-making under uncertainty. Schattauer GmbH Stuttgart.
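The best-performing classifier above flags a visit when either the NLP tool or the ED-assigned codes indicate pneumonia; a minimal sketch of that OR-combination and of the reported metrics (sensitivity, PPV) against a chart-review gold standard is shown below. Field and function names are assumptions, not the authors' implementation.

def classify(visit):
    """Flag a visit as pneumonia if either the NLP tool or the ED-assigned
    diagnosis codes assert it (the OR-combination evaluated in the study)."""
    return visit["nlp_asserts_pneumonia"] or visit["ed_code_pneumonia"]

def sensitivity_ppv(visits):
    """Compare predictions with the manual chart-review gold standard."""
    tp = sum(1 for v in visits if classify(v) and v["gold"])
    fp = sum(1 for v in visits if classify(v) and not v["gold"])
    fn = sum(1 for v in visits if not classify(v) and v["gold"])
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    ppv = tp / (tp + fp) if tp + fp else 0.0
    return sensitivity, ppv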
ERIC Educational Resources Information Center
Hounsell, D.; And Others
This guide for teachers to the tape indexing system (TANDEM) in use at the Modern Languages Department at Portsmouth Polytechnic focuses on tape classification, numbering, labeling, and shelving system procedures. The appendixes contain information on: (1) the classification system and related codes, (2) color and letter codes, (3) marking of tape…
The SIFT hardware/software systems. Volume 2: Software listings
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.
1985-01-01
This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in a variant of the Pascal language, Pascal*. Pascal* is a cross-compiler running on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.
The analysis of convolutional codes via the extended Smith algorithm
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Onyszchuk, I.
1993-01-01
Convolutional codes have been the central part of most error-control systems in deep-space communication for many years. Almost all such applications, however, have used the restricted class of (n,1), also known as 'rate 1/n,' convolutional codes. The more general class of (n,k) convolutional codes contains many potentially useful codes, but their algebraic theory is difficult and has proved to be a stumbling block in the evolution of convolutional coding systems. In this article, the situation is improved by describing a set of practical algorithms for computing certain basic things about a convolutional code (among them the degree, the Forney indices, a minimal generator matrix, and a parity-check matrix), which are usually needed before a system using the code can be built. The approach is based on the classic Forney theory for convolutional codes, together with the extended Smith algorithm for polynomial matrices, which is introduced in this article.
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
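The activity-transition graph (ATG) idea can be made concrete with a toy interpreter: activities are functions over the agent's data state, and transitions select the next activity by predicate. This is only an illustrative Python sketch, not the authors' stack-machine platform or its code-morphing mechanism; all names are assumptions.

class ATGAgent:
    """Minimal activity-transition-graph agent: each activity updates the
    agent's data state and transitions pick the next activity by predicate."""
    def __init__(self, activities, transitions, start, data):
        self.activities = activities    # name -> callable(data)
        self.transitions = transitions  # name -> [(predicate, next_name), ...]
        self.current = start
        self.data = data

    def step(self):
        self.activities[self.current](self.data)
        for predicate, nxt in self.transitions.get(self.current, []):
            if predicate(self.data):
                self.current = nxt
                break

# Example: sense a value, raise an event when it exceeds a threshold.
agent = ATGAgent(
    activities={
        "sense":  lambda d: d.update(value=d["read"]()),
        "notify": lambda d: print("event:", d["value"]),
    },
    transitions={
        "sense":  [(lambda d: d["value"] > 10, "notify"),
                   (lambda d: True, "sense")],
        "notify": [(lambda d: True, "sense")],
    },
    start="sense",
    data={"read": lambda: 12, "value": 0},
)
agent.step(); agent.step()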
From Verified Models to Verifiable Code
NASA Technical Reports Server (NTRS)
Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
General RMP Guidance - Appendix B: Selected NAICS Codes
This appendix contains a list of selected 2002 North American Industry Classification System (NAICS) codes used by Federal statistical agencies in designating business types or functions in categories such as farming, manufacturing, and waste management.
Morse Monte Carlo Radiation Transport Code System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emmett, M.B.
1975-02-01
The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
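MORSE itself is far more general, but the core Monte Carlo transport idea, sampling free-flight distances and interaction types particle by particle, can be illustrated with a one-dimensional slab transmission estimate. The cross sections and scattering model below are invented for illustration and are not MORSE physics or input.

import random

def transmit_fraction(thickness, sigma_t, scatter_prob, n=100_000):
    """Estimate the fraction of particles transmitted through a slab by
    sampling exponential free paths and coin-flip 1-D scattering."""
    transmitted = 0
    for _ in range(n):
        x, direction = 0.0, 1.0
        while True:
            x += direction * random.expovariate(sigma_t)   # free flight
            if x >= thickness:
                transmitted += 1
                break
            if x < 0.0:
                break                                       # leaked backward
            if random.random() < scatter_prob:
                direction = random.choice((-1.0, 1.0))      # scattered
            else:
                break                                       # absorbed
    return transmitted / n

print(transmit_fraction(thickness=2.0, sigma_t=1.0, scatter_prob=0.5))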
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
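The sketch below is not VENTURE input; it only illustrates, in Python, the kind of problem a neutron diffusion code solves: a one-group, one-dimensional fixed-source diffusion equation, -D phi'' + Sigma_a phi = S, discretized by finite differences with zero-flux boundaries. All values are illustrative.

import numpy as np

def diffusion_1d(nx=50, length=10.0, D=1.0, sigma_a=0.1, source=1.0):
    """Solve -D phi'' + sigma_a phi = source on [0, L] with phi = 0 at both ends."""
    dx = length / (nx + 1)
    diag = (2 * D / dx**2 + sigma_a) * np.ones(nx)
    off = (-D / dx**2) * np.ones(nx - 1)
    A = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, source * np.ones(nx))

print(diffusion_1d()[:5])   # flux near the left boundary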
Van Yperen-De Deyne, A; Pauwels, E; Van Speybroeck, V; Waroquier, M
2012-08-14
In this paper, an overview is presented of several approximations within Density Functional Theory (DFT) for calculating g-tensors in transition-metal-containing systems, and a new, accurate description of the spin-other-orbit contribution for high-spin systems is suggested. Various implementations in a broad variety of software packages (ORCA, ADF, Gaussian, CP2K, GIPAW and BAND) are critically assessed on various aspects including (i) non-relativistic versus relativistic Hamiltonians, (ii) spin-orbit coupling contributions and (iii) the gauge. Particular attention is given to the level of accuracy that can be achieved for codes that allow g-tensor calculations under periodic boundary conditions, as these are ideally suited to efficiently describe extended condensed-phase systems containing transition metals. In periodic codes like CP2K and GIPAW, the g-tensor calculation schemes currently suffer from an incorrect treatment of the exchange spin-orbit interaction and a deficient description of the spin-other-orbit term. In this paper a protocol is proposed, making the predictions of the exchange contribution to the g-tensor shift more plausible. Focus is also put on the influence of the spin-other-orbit interaction, which becomes of higher importance for high-spin systems. In a revisited derivation of the various terms arising from the two-electron spin-orbit and spin-other-orbit interaction (SOO), new insight has been obtained, revealing, amongst other issues, new terms for the SOO contribution. The periodic CP2K code has been adapted in view of this new development. One of the objectives of this study is indeed a serious enhancement of the performance of periodic codes in predicting g-tensors in transition-metal-containing systems at the same level of accuracy as the most advanced but time-consuming spin-orbit mean-field approach. The methods are first applied to rhodium carbide and afterwards extended to a broad test set of molecules containing transition metals from the fourth, fifth and sixth row of the periodic table. The set contains doublets as well as high-spin molecules.
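As background on the quantity being computed (a standard definition, not something specific to this paper): the g-tensor enters the effective spin Hamiltonian of EPR, and electronic-structure codes usually report the shift relative to the free-electron value,
\[ \hat{H}_{\mathrm{eff}} = \mu_B\, \mathbf{B}\cdot\mathbf{g}\cdot\hat{\mathbf{S}}, \qquad \Delta\mathbf{g} = \mathbf{g} - g_e\,\mathbf{1}, \]
with g_e ≈ 2.002319; the spin-orbit and spin-other-orbit terms discussed above are contributions to this Δg.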
Peregrine System User Basics | High-Performance Computing | NREL
Log in to peregrine.hpc.nrel.gov or to one of the login nodes. Example commands to access Peregrine from a Linux or Mac OS X system are given. Code example: create a file called hello.F90 containing the following code:
program hello
  write(6,*) 'Hello, world'
end program hello
Placeholder information is indicated by enclosing it in brackets < >. For example: $ ssh -Y ...
A Manual for Coding Descriptions, Interpretations, and Evaluations of Visual Art Forms.
ERIC Educational Resources Information Center
Acuff, Bette C.; Sieber-Suppes, Joan
This manual presents a system for categorizing stated esthetic responses to paintings. It is primarily a training manual for coders, but it may also be used for teaching reflective thinking skills and for evaluating programs of art education. The coding system contains 33 subdivisions of esthetic responses under three major categories: Cue…
Ex-Vessel Core Melt Modeling Comparison between MELTSPREAD-CORQUENCH and MELCOR 2.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robb, Kevin R.; Farmer, Mitchell; Francis, Matthew W.
System-level code analyses by both United States and international researchers predict major core melting, bottom head failure, and corium-concrete interaction for Fukushima Daiichi Unit 1 (1F1). Although system codes such as MELCOR and MAAP are capable of capturing a wide range of accident phenomena, they currently do not contain detailed models for evaluating some ex-vessel core melt behavior. However, specialized codes containing more detailed models are available for melt spreading (MELTSPREAD) as well as for long-term molten corium-concrete interaction (MCCI) and debris coolability (CORQUENCH). In a preceding study, Enhanced Ex-Vessel Analysis for Fukushima Daiichi Unit 1: Melt Spreading and Core-Concrete Interaction Analyses with MELTSPREAD and CORQUENCH, the MELTSPREAD-CORQUENCH codes predicted that the 1F1 core melt readily cooled, in contrast to predictions by MELCOR. The user community has taken notice and is in the process of updating their system codes, specifically MAAP and MELCOR, to improve and reduce conservatism in their ex-vessel core melt models. This report investigates why the MELCOR v2.1 code, compared to the MELTSPREAD and CORQUENCH 3.03 codes, yields differing predictions of ex-vessel melt progression. To accomplish this, the differences in the treatment of the ex-vessel melt with respect to melt spreading and long-term coolability are examined. The differences in modeling approaches are summarized, and a comparison of example code predictions is provided.
Avidan, Alexander; Weissman, Charles; Levin, Phillip D
2015-04-01
Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction amongst residents were reassessed at three and six months. Before QR code introduction only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Flight experiment of thermal energy storage
NASA Technical Reports Server (NTRS)
Namkoong, David
1989-01-01
Thermal energy storage (TES) enables a solar dynamic system to deliver constant electric power through periods of sun and shade. Brayton and Stirling power systems under current consideration for missions in the near future require working fluid temperatures in the 1100 to 1300+ K range. TES materials that meet these requirements fall into the fluoride family of salts. These salts store energy as a heat of fusion, thereby transferring heat to the fluid at constant temperature during shade. The principal feature of fluorides that must be taken into account is the change in volume that occurs with melting and freezing. Salts shrink as they solidify, a change reaching 30 percent for some salts. The location of the voids that form as a result of the shrinkage is critical when the solar dynamic system reemerges into the sun. Hot spots can develop in the TES container, or the container can become distorted if the melting salt cannot expand elsewhere. Analysis of the transient, two-phase phenomenon is being incorporated into a three-dimensional computer code. The code is capable of analysis under microgravity as well as 1 g. The objective of the flight program is to verify the predictions of the code, particularly of the void location and its effect on containment temperature. The four experimental packages comprising the program will be the first tests of melting and freezing conducted under microgravity. Each test package will be installed in a Getaway Special container to be carried by the shuttle. The package will be self-contained and independent of shuttle operations other than the initial opening of the container lid and the final closing of the lid. Upon the return of the test package from flight, the TES container will be radiographed and finally partitioned to examine the exact location and shape of the void. Visual inspection of the void and the temperature data during flight will constitute the bases for code verification.
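The 30 percent figure can be tied to the solid and liquid densities of the salt. As a rough illustration (not a formula taken from this report), if a rigid container is filled with liquid salt of density \rho_l that freezes to density \rho_s, the void volume fraction on complete solidification is
\[ f_{\mathrm{void}} = 1 - \frac{\rho_l}{\rho_s}, \]
so a 30 percent void corresponds to a solid roughly 40 percent denser than its melt; where that void ends up under microgravity is what the flight experiment is meant to resolve.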
Correcting quantum errors with entanglement.
Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu
2006-10-20
We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.
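One way to quantify the benefit described above (hedged here, since the precise statement comes from the entanglement-assisted coding literature rather than from this abstract): a classical binary [n, k, d] code with parity-check matrix H, whether or not it is dual-containing, can be imported into an entanglement-assisted quantum code, with the consumed entanglement commonly quoted as
\[ c = \operatorname{rank}\!\left(H H^{T}\right) \]
ebits, giving parameters [[n, 2k - n + c, d; c]]; c = 0 exactly when H H^T = 0, i.e. when the code is dual-containing and no pre-shared entanglement is needed.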
A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems
NASA Astrophysics Data System (ADS)
Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge
Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite its benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have been barely studied in the literature. This paper presents a mechanism that avoids collusion attacks based on code passing. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing the information to extract its own variant to the hosts, and for taking trusted timestamps that will be used to verify time coherence.
Self-assembled software and method of overriding software execution
Bouchard, Ann M.; Osbourn, Gordon C.
2013-01-08
A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.
Flight experiment of thermal energy storage. [for spacecraft power systems
NASA Technical Reports Server (NTRS)
Namkoong, David
1989-01-01
Thermal energy storage (TES) enables a solar dynamic system to deliver constant electric power through periods of sun and shade. Brayton and Stirling power systems under current considerations for missions in the near future require working fluid temperatures in the 1100 to 1300+ K range. TES materials that meet these requirements fall into the fluoride family of salts. Salts shrink as they solidify, a change reaching 30 percent for some salts. Hot spots can develop in the TES container or the container can become distorted if the melting salt cannot expand elsewhere. Analysis of the transient, two-phase phenomenon is being incorporated into a three-dimensional computer code. The objective of the flight program is to verify the predictions of the code, particularly of the void location and its effect on containment temperature. The four experimental packages comprising the program will be the first tests of melting and freezing conducted under microgravity.
Fatigue Behavior of HY-130 Steel Weldments Containing Fabrication Discontinuities.
1985-04-18
discontinuities to solutions for elliptical discontinuities. One such approach has been formalized in the ASME Section XI Boiler and Pressure Vessel Code [1]... Boiler and Pressure Vessel Code, Section XI, "Rules for Inservice Inspection of Nuclear Reactor Coolant Systems," American Society of Mechanical Engineers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.
1979-07-01
User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated.
Factor information retrieval system version 2.0 (FIRE) (for microcomputers). Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
FIRE Version 2.0 contains EPA's unique recommended criteria and toxic air emission estimation factors. FIRE consists of: (1) an EPA internal repository system that contains emission factor data identified and collected, and (2) an external distribution system that contains only EPA's recommended factors. The emission factors, compiled from a review of the literature, are identified by pollutant name, CAS number, process and emission source descriptions, SIC code, SCC, and control status. The factors are rated for quality using AP-42 rating criteria.
Short range spread-spectrum radiolocation system and method
Smith, Stephen F.
2003-04-29
A short range radiolocation system and associated methods that allow the location of an item, such as equipment, containers, pallets, vehicles, or personnel, within a defined area. A small, battery powered, self-contained tag is provided to an item to be located. The tag includes a spread-spectrum transmitter that transmits a spread-spectrum code and identification information. A plurality of receivers positioned about the area receive signals from a transmitting tag. The position of the tag, and hence the item, is located by triangulation. The system employs three different ranging techniques for providing coarse, intermediate, and fine spatial position resolution. Coarse positioning information is provided by use of direct-sequence code phase transmitted as a spread-spectrum signal. Intermediate positioning information is provided by the use of a difference signal transmitted with the direct-sequence spread-spectrum code. Fine positioning information is provided by use of carrier phase measurements. An algorithm is employed to combine the three data sets to provide accurate location measurements.
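The triangulation step can be illustrated with a simple least-squares trilateration from range estimates at three known receiver positions. The geometry and numbers below are invented for illustration; the patented system derives its coarse, intermediate, and fine ranges from code phase, the difference signal, and carrier phase rather than from the known distances used here.

import numpy as np

def locate(receivers, ranges):
    """Least-squares 2-D position from ranges to known receiver positions.
    Linearizes r_i^2 = |p - x_i|^2 by subtracting the first equation,
    giving a linear system solved in the least-squares sense."""
    x0, r0 = receivers[0], ranges[0]
    A, b = [], []
    for xi, ri in zip(receivers[1:], ranges[1:]):
        A.append(2 * (xi - x0))
        b.append(r0**2 - ri**2 + np.dot(xi, xi) - np.dot(x0, x0))
    p, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return p

rx = [np.array([0.0, 0.0]), np.array([100.0, 0.0]), np.array([0.0, 100.0])]
tag = np.array([30.0, 40.0])
dists = [np.linalg.norm(tag - r) for r in rx]   # ranges measured by the system
print(locate(rx, dists))                        # approximately [30, 40]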
Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen
2018-05-25
Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt the long-code spread-spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named the dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than the folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection probability and lower false alarm probability, it has a lower mean acquisition time than traditional XFAST, DF-XFAST and zero-padding.
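A stripped-down, single-channel version of the folding idea (fold the long local code into data-block-length segments, sum them, and correlate with one circular FFT) is sketched below. The DC-XFAST dual-channel verification, zero-padding, and Doppler handling are omitted, and all sizes and the stand-in PRN code are illustrative assumptions.

import numpy as np

def fold_acquire(data, long_code):
    """XFAST-style coarse search: fold (sum) the long local code into blocks
    the length of the incoming data block, then correlate circularly with one
    FFT. The peak index gives the code phase modulo the block length."""
    m = len(data)
    folded = long_code[: (len(long_code) // m) * m].reshape(-1, m).sum(axis=0)
    corr = np.fft.ifft(np.conj(np.fft.fft(data)) * np.fft.fft(folded))
    return np.abs(corr)

rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=8192)          # stand-in long PRN code
offset, m = 3000, 1024
data = code[offset:offset + m] + 0.2 * rng.standard_normal(m)
print(int(np.argmax(fold_acquire(data, code))), offset % m)   # both print 952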
Development of a Spacecraft Materials Selector Expert System
NASA Technical Reports Server (NTRS)
Pippin, G.; Kauffman, W. (Technical Monitor)
2002-01-01
This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.
Boyd, David A.; Thevenot, Tracy; Gumbmann, Markus; Honeyman, Allen L.; Hamilton, Ian R.
2000-01-01
Transposon mutagenesis and marker rescue were used to isolate and identify an 8.5-kb contiguous region containing six open reading frames constituting the operon for the sorbitol P-enolpyruvate phosphotransferase transport system (PTS) of Streptococcus mutans LT11. The first gene, srlD, codes for sorbitol-6-phosphate dehydrogenase, followed downstream by srlR, coding for a transcriptional regulator; srlM, coding for a putative activator; and the srlA, srlE, and srlB genes, coding for the EIIC, EIIBC, and EIIA components of the sorbitol PTS, respectively. Among all sorbitol PTS operons characterized to date, the srlD gene is found after the genes coding for the EII components; thus, the location of the gene in S. mutans is unique. The SrlR protein is similar to several transcriptional regulators found in Bacillus spp. that contain PTS regulator domains (J. Stülke, M. Arnaud, G. Rapoport, and I. Martin-Verstraete, Mol. Microbiol. 28:865–874, 1998), and its gene overlaps the srlM gene by 1 bp. The arrangement of these two regulatory genes is unique, having not been reported for other bacteria. PMID:10639465
Secure ADS-B authentication system and method
NASA Technical Reports Server (NTRS)
Viggiano, Marc J (Inventor); Valovage, Edward M (Inventor); Samuelson, Kenneth B (Inventor); Hall, Dana L (Inventor)
2010-01-01
A secure system for authenticating the identity of ADS-B systems, including: an authenticator, including a unique id generator and a transmitter transmitting the unique id to one or more ADS-B transmitters; one or more ADS-B transmitters, including a receiver receiving the unique id, one or more secure processing stages merging the unique id with the ADS-B transmitter's identification, data and secret key and generating a secure code identification and a transmitter transmitting a response containing the secure code and the ADS-B transmitter's data to the authenticator; the authenticator including means for independently determining each ADS-B transmitter's secret key, a receiver receiving each ADS-B transmitter's response, one or more secure processing stages merging the unique id, ADS-B transmitter's identification and data and generating a secure code, and comparison processing comparing the authenticator-generated secure code and the ADS-B transmitter-generated secure code and providing an authentication signal based on the comparison result.
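The abstract does not name the cryptographic primitive inside the "secure processing stages"; the sketch below uses an HMAC purely as a stand-in to show the challenge-response flow (unique id + transmitter identification + data + secret key producing a secure code that the authenticator recomputes and compares). Keys, identifiers, and message fields are all invented for illustration.

import hmac, hashlib, secrets

def secure_code(unique_id: bytes, ident: bytes, data: bytes, key: bytes) -> bytes:
    """Merge the challenge, transmitter identification and data with the
    secret key; HMAC-SHA256 stands in for the patent's secure processing."""
    return hmac.new(key, unique_id + ident + data, hashlib.sha256).digest()

# Authenticator side: issue a fresh challenge (the unique id).
challenge = secrets.token_bytes(16)

# ADS-B transmitter side: respond with its data plus the secure code.
key = b"shared-or-derived-secret"             # assumption: both sides can derive it
ident, data = b"ABC123", b"lat=40.0,lon=-75.0,alt=35000"
response = (ident, data, secure_code(challenge, ident, data, key))

# Authenticator side: recompute independently and compare.
ident_r, data_r, code_r = response
ok = hmac.compare_digest(code_r, secure_code(challenge, ident_r, data_r, key))
print("authenticated" if ok else "rejected")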
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
...This document contains notices of pendency before the Department of Labor (the Department) of proposed exemptions from certain of the prohibited transaction restrictions of the Employee Retirement Income Security Act of 1974 (ERISA or the Act) and/or the Internal Revenue Code of 1986 (the Code).
User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, S.B.; Rainey, R.H.
1979-05-01
The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report represents a practical user's guide to SEPHIS - Thorex containing a program description, user information, program listing, and sample input and output.
Scientific and Technical Publishing at Goddard Space Flight Center in Fiscal Year 1994
NASA Technical Reports Server (NTRS)
1994-01-01
This publication is a compilation of scientific and technical material that was researched, written, prepared, and disseminated by the Center's scientists and engineers during FY94. It is presented in numerical order of the GSFC author's sponsoring technical directorate; i.e., Code 300 is the Office of Flight Assurance, Code 400 is the Flight Projects Directorate, Code 500 is the Mission Operations and Data Systems Directorate, Code 600 is the Space Sciences Directorate, Code 700 is the Engineering Directorate, Code 800 is the Suborbital Projects and Operations Directorate, and Code 900 is the Earth Sciences Directorate. The publication database contains publication or presentation title, author(s), document type, sponsor, and organizational code. This is the second annual compilation for the Center.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1989-01-01
Two aspects of the work for NASA are examined: the construction of multi-dimensional phase modulation trellis codes and a performance analysis of these codes. A complete list of all the best trellis codes for use with phase modulation is included. LxMPSK signal constellations are included for M = 4, 8, and 16 and L = 1, 2, 3, and 4. Spectral efficiencies range from 1 bit/channel symbol (equivalent to rate 1/2 coded QPSK) to 3.75 bits/channel symbol (equivalent to 15/16 coded 16-PSK). The parity check polynomials, rotational invariance properties, free distance, path multiplicities, and coding gains are given for all codes. These codes are considered to be the best candidates for implementation of a high speed decoder for satellite transmission. The design of a hardware decoder for one of these codes, viz., the 16-state 3x8-PSK code with free distance 4.0 and coding gain 3.75 dB, is discussed. An exhaustive simulation study of the multi-dimensional phase modulation trellis codes is also included. This study was motivated by the fact that the coding gains quoted for almost all codes found in the literature are in fact only asymptotic coding gains, i.e., the coding gain at very high signal to noise ratios (SNRs) or very low BER. These asymptotic coding gains can be obtained directly from a knowledge of the free distance of the code. On the other hand, real coding gains at BERs in the range of 10(exp -2) to 10(exp -6), where these codes are most likely to operate in a concatenated system, must be determined by simulation.
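The relationship between free distance and asymptotic coding gain mentioned in the abstract can be written down directly. The sketch below uses the standard trellis-coded-modulation definition with illustrative distances; the numbers are not taken from the report.

```python
import math

def asymptotic_coding_gain_db(d_free_sq_coded: float, d_min_sq_uncoded: float) -> float:
    """Standard TCM asymptotic coding gain: 10*log10 of the ratio of the coded
    scheme's squared free distance to the squared minimum distance of an
    uncoded reference constellation with the same spectral efficiency
    (unit average symbol energy assumed for both)."""
    return 10.0 * math.log10(d_free_sq_coded / d_min_sq_uncoded)

# Illustrative values only: a coded scheme with d_free^2 = 4.0 measured against
# an uncoded QPSK reference with d_min^2 = 2.0 gives about 3.0 dB asymptotically.
print(round(asymptotic_coding_gain_db(4.0, 2.0), 2))
```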
Users manual for coordinate generation code CRDSRA
NASA Technical Reports Server (NTRS)
Shamroth, S. J.
1985-01-01
Generation of a viable coordinate system represents an important component of an isolated airfoil Navier-Stokes calculation. The manual describes a computer code for generation of such a coordinate system. The coordinate system is a general nonorthogonal one in which high resolution normal to the airfoil is obtained in the vicinity of the airfoil surface, and high resolution along the airfoil surface is obtained in the vicinity of the airfoil leading edge. The method of generation is a constructive technique which leads to a C type coordinate grid. The method of construction as well as input and output definitions are contained herein. The computer code itself as well as a sample output is being submitted to COSMIC.
An integrated radiation physics computer code system.
NASA Technical Reports Server (NTRS)
Steyn, J. J.; Harris, D. W.
1972-01-01
An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.
Jiao, Shuming; Jin, Zhi; Zhou, Changyuan; Zou, Wenbin; Li, Xia
2018-01-01
Quick response (QR) code has been employed as a data carrier for optical cryptosystems in many recent research works, and the error-correction coding mechanism allows the decrypted result to be noise free. However, in this paper, we point out for the first time that the Reed-Solomon coding algorithm in QR code is not a very suitable option for the nonlocally distributed speckle noise in optical cryptosystems from an information coding perspective. The average channel capacity is proposed to measure the data storage capacity and noise-resistant capability of different encoding schemes. We design an alternative 2D barcode scheme based on Bose-Chaudhuri-Hocquenghem (BCH) coding, which demonstrates substantially better average channel capacity than QR code in numerically simulated optical cryptosystems.
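The paper's "average channel capacity" metric is not defined in detail in the abstract. As a rough illustration of the underlying idea, the snippet below computes the classical binary symmetric channel capacity, an upper bound on the useful payload per barcode module at a given flip probability.

```python
import math

def bsc_capacity(p: float) -> float:
    """Capacity (bits per channel use) of a binary symmetric channel with
    crossover probability p: C = 1 - H2(p)."""
    if p in (0.0, 1.0):
        return 1.0
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - h2

# Illustration: a barcode module flipped with 5% probability by speckle noise
# carries at most about 0.71 information bits, which bounds the useful payload
# of any error-correction scheme laid over it.
print(round(bsc_capacity(0.05), 3))
```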
17 CFR 232.106 - Prohibition against electronic submissions containing executable code.
Code of Federal Regulations, 2010 CFR
2010-04-01
... executable code will be suspended, unless the executable code is contained only in one or more PDF documents, in which case the submission will be accepted but the PDF document(s) containing executable code will ...
Proceedings of Conference on Variable-Resolution Modeling, Washington, DC, 5-6 May 1992
1992-05-01
... of powerful new computer architectures for supporting object-oriented computing. Objects, as self-contained data-code packages with orderly ... another entity structure. For example, (copy-entstr e:system 'new-system) creates an entity structure named e:new-system that has the same structure ... Parry, S.-H. (1984): A Self-contained Hierarchical Model Construct. In: Systems Analysis and Modeling in Defense (R.K. Huber, Ed.), New York.
RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphreys, S.L.; Miller, L.A.; Monroe, D.K.
1998-04-01
This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
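As a minimal illustration of the removal phenomena the abstract lists (RADTRAD itself uses tables and detailed numerical models, not this toy expression), several first-order removal mechanisms acting in parallel give an exponential decay of the airborne inventory.

```python
import math

def airborne_fraction(t_hours: float, lambda_spray=2.0, lambda_dep=0.1, lambda_filter=0.5):
    """Fraction of an airborne radionuclide inventory remaining after t hours
    when first-order removal mechanisms (rates in 1/h) act in parallel:
    A(t)/A(0) = exp(-(lambda_spray + lambda_dep + lambda_filter) * t).
    The rates here are placeholders, not RADTRAD defaults."""
    return math.exp(-(lambda_spray + lambda_dep + lambda_filter) * t_hours)

for t in (0.5, 1.0, 2.0):
    print(f"{t:4.1f} h : {airborne_fraction(t):.3f}")
```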
Image Transmission via Spread Spectrum Techniques. Part A
1976-01-01
Dr. Edwin H. Wrench (Code 408) and Harper J. Whitehouse (Code 4002), Naval Undersea Center, San Diego, California. ... The progress report appears in two parts. Part A is a summary of work done in support of this program at the Naval Undersea Center. Part B contains final ... a technical description of the bandwidth compression system developed at the Naval Undersea Center. This paper is an excerpt from the specifications ...
Identification coding schemes for modulated reflectance systems
Coates, Don M [Santa Fe, NM; Briles, Scott D [Los Alamos, NM; Neagley, Daniel L [Albuquerque, NM; Platts, David [Santa Fe, NM; Clark, David D [Santa Fe, NM
2006-08-22
An identifying coding apparatus employing modulated reflectance technology, involving a base station emitting an RF signal and a tag, located remotely from the base station and containing at least one antenna and predetermined other passive circuit components, that receives the RF signal and reflects back to the base station a modulated signal indicative of characteristics related to the tag.
FLUKA: A Multi-Particle Transport Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan
2005-12-14
This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.
Time synchronized video systems
NASA Technical Reports Server (NTRS)
Burnett, Ron
1994-01-01
The idea of synchronizing multiple video recordings to some type of 'range' time has been tried with varying degrees of success in the past. Combining this requirement with existing time code standards (SMPTE) and the new innovations in desktop multimedia, however, has afforded an opportunity to increase the flexibility and usefulness of such efforts without adding costs over the traditional data recording and reduction systems. The concept described can use IRIG, GPS or a battery-backed internal clock as the master time source. By converting that time source to Vertical Interval Time Code or Longitudinal Time Code, both in accordance with the SMPTE standards, the user will obtain a tape that contains machine/computer readable time code suitable for use with editing equipment that is available off-the-shelf. Accuracy on playback is then determined by the playback system chosen by the user. Accuracies of +/- 2 frames are common among inexpensive systems, and complete frame accuracy is more a matter of the user's budget than the capability of the recording system.
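A minimal sketch of the frame-count-to-SMPTE conversion implied above, assuming non-drop-frame 30 fps timecode; real systems may use drop-frame rates and, as noted, IRIG or GPS as the master time source.

```python
def frames_to_smpte(frame_count: int, fps: int = 30) -> str:
    """Convert an absolute frame count to a non-drop-frame SMPTE timecode
    string HH:MM:SS:FF (30 fps assumed for illustration)."""
    ff = frame_count % fps
    total_seconds = frame_count // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = (total_seconds // 3600) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(frames_to_smpte(107_892))   # 00:59:56:12 at 30 fps
```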
Intrasystem Analysis Program (IAP) code summaries
NASA Astrophysics Data System (ADS)
Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.
1983-05-01
This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs deal with the application of multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.
Florida School Laws. Chapters 228-246 Florida Statutes. 1998 Edition.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee.
This volume of Florida School Laws contains chapters 228 through 246 of the Florida Statutes, which comprise "The Florida School Code." The laws contain those statutes specifically applicable to public schools, community colleges, postsecondary institutions, all other institutions and agencies included as a part of the state system of…
Pesticide Product Information System (PPIS)
The Pesticide Product Information System contains information concerning all pesticide products registered in the United States. It includes registrant name and address, chemical ingredients, toxicity category, product names, distributor brand names, site/pest uses, pesticidal type, formulation code, and registration status.
Experimental QR code optical encryption: noise-free data recovering.
Barrera, John Fredy; Mira-Agudelo, Alejandro; Torroba, Roberto
2014-05-15
We report, to our knowledge for the first time, the experimental implementation of a quick response (QR) code as a "container" in an optical encryption system. A joint transform correlator architecture in an interferometric configuration is chosen as the experimental scheme. As the implementation is not possible in a single step, a multiplexing procedure to encrypt the QR code of the original information is applied. Once the QR code is correctly decrypted, the speckle noise present in the recovered QR code is eliminated by a simple digital procedure. Finally, the original information is retrieved completely free of any kind of degradation after reading the QR code. Additionally, we propose and implement a new protocol in which the reception of the encrypted QR code and its decryption, the digital block processing, and the reading of the decrypted QR code are performed employing only one device (smartphone, tablet, or computer). The overall method proves to produce an outcome far more attractive, making the adoption of the technique a plausible option. Experimental results are presented to demonstrate the practicality of the proposed security system.
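The "simple digital procedure" for removing speckle noise is not spelled out in the abstract. The following sketch is an assumed stand-in: it majority-votes each QR module over its pixel block, after which any standard QR reader can be applied.

```python
import numpy as np

def clean_qr(noisy: np.ndarray, modules: int) -> np.ndarray:
    """Illustrative speckle clean-up: split a square grayscale QR image into
    modules x modules blocks and binarize each block by its mean intensity
    (majority vote). This is an assumed stand-in for the paper's unspecified
    digital procedure, not its actual algorithm."""
    n = noisy.shape[0]
    block = n // modules
    out = np.zeros((modules, modules), dtype=np.uint8)
    thresh = noisy.mean()
    for i in range(modules):
        for j in range(modules):
            patch = noisy[i * block:(i + 1) * block, j * block:(j + 1) * block]
            out[i, j] = 0 if patch.mean() < thresh else 255
    return out

# The cleaned module array can then be rescaled and passed to any QR reader
# (e.g. on the smartphone, tablet, or computer mentioned in the abstract).
```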
California State Library: Processing Center Design and Specifications. Volume III, Coding Manual.
ERIC Educational Resources Information Center
Sherman, Don; Shoffner, Ralph M.
As part of the report on the California State Library Processing Center design and specifications, this volume is a coding manual for the conversion of catalog card data to a machine-readable form. The form is compatible with the national MARC system, while at the same time it contains provisions for problems peculiar to the local situation. This…
A Multiple Sphere T-Matrix Fortran Code for Use on Parallel Computer Clusters
NASA Technical Reports Server (NTRS)
Mackowski, D. W.; Mishchenko, M. I.
2011-01-01
A general-purpose Fortran-90 code for calculation of the electromagnetic scattering and absorption properties of multiple sphere clusters is described. The code can calculate the efficiency factors and scattering matrix elements of the cluster for either fixed or random orientation with respect to the incident beam and for plane wave or localized-approximation Gaussian incident fields. In addition, the code can calculate maps of the electric field both interior and exterior to the spheres. The code is written with message passing interface instructions to enable the use on distributed memory compute clusters, and for such platforms the code can make feasible the calculation of absorption, scattering, and general EM characteristics of systems containing several thousand spheres.
NASA Technical Reports Server (NTRS)
Hargrove, William T.
1991-01-01
This methodology is used to determine inspection procedures and intervals for components contained within tank mounted air compressor systems (TMAC) and base mounted air compressor systems (BMAC). These systems are included in the Pressure Vessel and System Recertification inventory at GSFC.
Zhang, Yinsheng; Zhang, Guoming
2018-01-01
A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Though there are many well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantically rich clinical data in several ROP-related applications.
Code of Federal Regulations, 2010 CFR
2010-04-01
... section 1083: Order of the Securities and Exchange Commission; registered holding company; holding company system; associate company; majority-owned subsidiary company; system group; nonexempt property; and stock... defined in the Internal Revenue Code of 1954, shall be given the respective definition contained in such...
TRIQS: A toolbox for research on interacting quantum systems
NASA Astrophysics Data System (ADS)
Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka
2015-11-01
We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.
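A minimal example of the Green's function containers mentioned above, assuming the TRIQS Python API (module path `triqs.gf` in recent releases; the 2015-era releases used `pytriqs`), so exact import paths and signatures may differ by version.

```python
# Minimal sketch assuming the TRIQS Python API; module paths may differ
# between versions (triqs.gf in recent releases, pytriqs.gf in older ones).
from triqs.gf import GfImFreq, iOmega_n, inverse

# Matsubara Green's function container for a single orbital at beta = 50
g = GfImFreq(indices=[0], beta=50.0, n_points=1024)

# Fill it with a non-interacting Green's function G(iw_n) = 1 / (iw_n - eps0)
eps0 = 0.3
g << inverse(iOmega_n - eps0)
```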
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions of what is implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates the MATLAB comments in M-files into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
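The framework's comment translation is done by a Perl filter; the Python snippet below re-implements the same idea (rewriting leading MATLAB `%` markers as C-style `//` so Doxygen can parse them) purely as an illustration, not as the authors' script.

```python
import re
import sys

def matlab_to_doxygen(line: str) -> str:
    """Turn a leading MATLAB comment marker '%' into a C++-style '//' so a
    C-mode Doxygen run can pick the comment up. '%%' cell markers and percent
    signs inside strings are not handled in this sketch."""
    return re.sub(r"^(\s*)%", r"\1//", line)

if __name__ == "__main__":
    for raw in sys.stdin:                 # e.g. python mfilter.py < simulator.m
        sys.stdout.write(matlab_to_doxygen(raw))
```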
MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.
1995-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.
A flexible surface wetness sensor using a RFID technique.
Yang, Cheng-Hao; Chien, Jui-Hung; Wang, Bo-Yan; Chen, Ping-Hei; Lee, Da-Sheng
2008-02-01
This paper presents a flexible wetness sensor whose detection signal, converted to a binary code, is transmitted through radio-frequency (RF) waves from a radio-frequency identification integrated circuit (RFID IC) to a remote reader. The flexible sensor, with a fixed operating frequency of 13.56 MHz, contains an RFID IC and a sensor circuit that is fabricated on a flexible printed circuit board (FPCB) using a Micro-Electro-Mechanical-System (MEMS) process. The sensor circuit contains a comb-shaped sensing area surrounded by an octagonal antenna with a width of 2.7 cm. The binary code transmitted from the RFID IC to the reader changes if the surface condition of the detector changes from dry to wet. This variation in the binary code can be observed on a digital oscilloscope connected to the reader.
Development of a CFD code for casting simulation
NASA Technical Reports Server (NTRS)
Murph, Jesse E.
1992-01-01
The task of developing a computational fluid dynamics (CFD) code to accurately model the mold filling phase of a casting operation was accomplished in a systematic manner. First the state-of-the-art was determined through a literature search, a code search, and participation with casting industry personnel involved in consortium startups. From this material and inputs from industry personnel, an evaluation of the currently available codes was made. It was determined that a few of the codes already contained sophisticated CFD algorithms and further validation of one of these codes could preclude the development of a new CFD code for this purpose. With industry concurrence, ProCAST was chosen for further evaluation. Two benchmark cases were used to evaluate the code's performance using a Silicon Graphics Personal Iris system. The results of these limited evaluations (because of machine and time constraints) are presented along with discussions of possible improvements and recommendations for further evaluation.
Synthetic alienation of microbial organisms by using genetic code engineering: Why and how?
Kubyshkin, Vladimir; Budisa, Nediljko
2017-08-01
The main goal of synthetic biology (SB) is the creation of biodiversity applicable for biotechnological needs, while xenobiology (XB) aims to expand the framework of natural chemistries with non-natural building blocks in living cells to accomplish artificial biodiversity. Protein and proteome engineering, which overcome the limitation of the canonical amino acid repertoire of 20 (+2) prescribed by the genetic code by using non-canonical amino acids (ncAAs), is one of the main focuses of XB research. Ideally, estranging the genetic code from its current form via systematic introduction of ncAAs should enable the development of bio-containment mechanisms in synthetic cells, potentially endowing them with a "genetic firewall", i.e. orthogonality which prevents genetic information transfer to natural systems. Despite rapid progress over the past two decades, it is not yet possible to completely alienate an organism so that it would use and maintain different genetic code associations permanently. In order to engineer robust bio-contained life forms, the chemical logic behind the establishment of the amino acid repertoire should be considered. Starting from the recent proposal of Hartman and Smith about the genetic code establishment in the RNA world, the authors here map possible biotechnological invasion points for engineering of bio-contained synthetic cells equipped with non-canonical functionalities. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burk, K.W.; Andrews, G.L.
1989-02-01
The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in data bases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those data bases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the data base custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.
Coded Modulation in C and MATLAB
NASA Technical Reports Server (NTRS)
Hamkins, Jon; Andrews, Kenneth S.
2011-01-01
This software, written separately in C and MATLAB as stand-alone packages with equivalent functionality, implements encoders and decoders for a set of nine error-correcting codes and modulators and demodulators for five modulation types. The software can be used as a single program to simulate the performance of such coded modulation. The error-correcting codes implemented are the nine accumulate repeat-4 jagged accumulate (AR4JA) low-density parity-check (LDPC) codes, which have been approved for international standardization by the Consultative Committee for Space Data Systems, and which are scheduled to fly on a series of NASA missions in the Constellation Program. The software implements the encoder and decoder functions, and contains compressed versions of generator and parity-check matrices used in these operations.
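The AR4JA generator and parity-check matrices are stored in compressed form inside the software and are not reproduced here. The toy example below only illustrates the parity-check test H·c = 0 over GF(2) that any LDPC encoder output or decoder result must satisfy.

```python
import numpy as np

# Toy parity-check matrix (NOT one of the AR4JA matrices; those are kept in
# compressed form inside the software described above).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

def is_valid_codeword(c: np.ndarray) -> bool:
    """A vector c is a codeword of the LDPC code iff H @ c = 0 over GF(2)."""
    return not np.any((H @ c) % 2)

print(is_valid_codeword(np.array([1, 1, 0, 0, 1, 1], dtype=np.uint8)))   # True
```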
SU-E-T-493: Accelerated Monte Carlo Methods for Photon Dosimetry Using a Dual-GPU System and CUDA.
Liu, T; Ding, A; Xu, X
2012-06-01
To develop a Graphics Processing Unit (GPU) based Monte Carlo (MC) code that accelerates dose calculations on a dual-GPU system. We simulated a clinical case of prostate cancer treatment. A voxelized abdomen phantom derived from 120 CT slices was used, containing 218×126×60 voxels, and a GE LightSpeed 16-MDCT scanner was modeled. A CPU version of the MC code was first developed in C++ and tested on an Intel Xeon X5660 2.8 GHz CPU; it was then translated into a GPU version using CUDA C 4.1 and run on a dual Tesla M2090 GPU system. The code featured automatic assignment of the simulation task to multiple GPUs, as well as accurate calculation of energy- and material-dependent cross-sections. Double-precision floating point format was used for accuracy. Doses to the rectum, prostate, bladder and femoral heads were calculated. When running on a single GPU, the MC GPU code was found to be 19 times faster than the CPU code and 42 times faster than MCNPX. These speedup factors were doubled on the dual-GPU system. The dose results were benchmarked against MCNPX, and a maximum difference of 1% was observed when the relative error was kept below 0.1%. A GPU-based MC code was developed for dose calculations using detailed patient and CT scanner models. Efficiency and accuracy were both guaranteed in this code. Scalability of the code was confirmed on the dual-GPU system. © 2012 American Association of Physicists in Medicine.
Chadwick, Georgina; Varagunam, Mira; Brand, Christian; Riley, Stuart A; Maynard, Nick; Crosby, Tom; Michalowski, Julie; Cromwell, David A
2017-06-09
The International Classification of Diseases 10th Revision (ICD-10) system used in the English hospital administrative database (Hospital Episode Statistics (HES)) does not contain a specific code for oesophageal high-grade dysplasia (HGD). The aim of this paper was to examine how patients with HGD were coded in HES and whether it was done consistently. National population-based cohort study of patients with newly diagnosed with HGD in England. The study used data collected prospectively as part of the National Oesophago-Gastric Cancer Audit (NOGCA). These records were linked to HES to investigate the pattern of ICD-10 codes recorded for these patients at the time of diagnosis. All patients with a new diagnosis of HGD between 1 April 2013 and 31 March 2014 in England, who had data submitted to the NOGCA. The main outcome assessed was the pattern of primary and secondary ICD-10 diagnostic codes recorded in the HES records at endoscopy at the time of diagnosis of HGD. Among 452 patients with a new diagnosis of HGD between 1 April 2013 and 31 March 2014, Barrett's oesophagus was the only condition coded in 200 (44.2%) HES records. Records for 59 patients (13.1%) contained no oesophageal conditions. The remaining 193 patients had various diagnostic codes recorded, 93 included a diagnosis of Barrett's oesophagus and 57 included a diagnosis of oesophageal/gastric cardia cancer. HES is not suitable to support national studies looking at the management of HGD. This is one reason for the UK to adopt an extended ICD system (akin to ICD-10-CM). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Biodegradation of paint stripper solvents in a modified gas lift loop bioreactor.
Vanderberg-Twary, L; Steenhoudt, K; Travis, B J; Hanners, J L; Foreman, T M; Brainard, J R
1997-07-05
Paint stripping wastes generated during the decontamination and decommissioning of former nuclear facilities contain paint stripping organics (dichloromethane, 2-propanol, and methanol) and bulk materials containing paint pigments. It is desirable to degrade the organic residues as part of an integrated chemical-biological treatment system. We have developed a modified gas lift loop bioreactor employing a defined consortium of Rhodococcus rhodochrous strain OFS and Hyphomicrobium sp. DM-2 that degrades paint stripper organics. Mass transfer coefficients and kinetic constants for biodegradation in the system were determined. It was found that transfer of organic substrates from surrogate waste into the air and further into the liquid medium in the bioreactor were rapid processes, occurring within minutes. Monod kinetics was employed to model the biodegradation of paint stripping organics. Analysis of the bioreactor process was accomplished with BIOLAB, a mathematical code that simulates coupled mass transfer and biodegradation processes. This code was used to fit experimental data to Monod kinetics and to determine kinetic parameters. The BIOLAB code was also employed to compare activities in the bioreactor of individual microbial cultures to the activities of combined cultures in the bioreactor. This code is of benefit for further optimization and scale-up of the bioreactor for treatment of paint stripping and other volatile organic wastes in bulk materials.
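A minimal Monod-kinetics sketch of the kind of substrate/biomass model referred to above; the parameter values are illustrative placeholders, not the constants fitted with the BIOLAB code.

```python
import numpy as np
from scipy.integrate import odeint

def monod(y, t, mu_max=0.4, Ks=5.0, Y=0.5):
    """Monod kinetics: biomass X grows at rate mu_max*S/(Ks+S); substrate S is
    consumed at the growth rate divided by the yield Y. Units and parameter
    values are illustrative, not the fitted bioreactor constants."""
    X, S = y
    mu = mu_max * S / (Ks + S)
    return [mu * X, -mu * X / Y]

t = np.linspace(0, 48, 200)                       # hours
X, S = odeint(monod, [0.1, 20.0], t).T            # initial biomass and substrate (g/L)
print(f"substrate remaining after 48 h: {S[-1]:.2f} g/L")
```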
Enhanced absorption cycle computer model
NASA Astrophysics Data System (ADS)
Grossman, G.; Wilk, M.
1993-09-01
Absorption heat pumps have received renewed and increasing attention in the past two decades. The rising cost of electricity has made the particular features of this heat-powered cycle attractive for both residential and industrial applications. Solar-powered absorption chillers, gas-fired domestic heat pumps, and waste-heat-powered industrial temperature boosters are a few of the applications recently subjected to intensive research and development. The absorption heat pump research community has begun to search for both advanced cycles in various multistage configurations and new working fluid combinations with potential for enhanced performance and reliability. The development of working absorption systems has created a need for reliable and effective system simulations. A computer code has been developed for simulation of absorption systems at steady state in a flexible and modular form, making it possible to investigate various cycle configurations with different working fluids. The code is based on unit subroutines containing the governing equations for the system's components and property subroutines containing thermodynamic properties of the working fluids. The user conveys to the computer an image of his cycle by specifying the different subunits and their interconnections. Based on this information, the program calculates the temperature, flow rate, concentration, pressure, and vapor fraction at each state point in the system, and the heat duty at each unit, from which the coefficient of performance (COP) may be determined. This report describes the code and its operation, including improvements introduced into the present version. Simulation results are described for LiBr-H2O triple-effect cycles, LiCl-H2O solar-powered open absorption cycles, and NH3-H2O single-effect and generator-absorber heat exchange cycles. An appendix contains the user's manual.
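The coefficient of performance follows directly from the unit heat duties the code reports; a one-line illustration with made-up duties is shown below.

```python
def absorption_cop(q_evaporator: float, q_generator: float, w_pump: float = 0.0) -> float:
    """Cooling COP of an absorption cycle from component heat duties (kW):
    useful evaporator duty divided by the driving heat input (plus pump work,
    often negligible). The values below are illustrative only."""
    return q_evaporator / (q_generator + w_pump)

print(round(absorption_cop(q_evaporator=10.5, q_generator=14.7, w_pump=0.1), 2))  # ~0.71
```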
NASA Technical Reports Server (NTRS)
Paloski, William H.; Odette, Louis L.; Krever, Alfred J.; West, Allison K.
1987-01-01
A real-time expert system is being developed to serve as the astronaut interface for a series of Spacelab vestibular experiments. This expert system is written in a version of Prolog that is itself written in Forth. The Prolog contains a predicate that can be used to execute Forth definitions; thus, the Forth becomes an embedded real-time operating system within the Prolog programming environment. The expert system consists of a data base containing detailed operational instructions for each experiment, a rule base containing Prolog clauses used to determine the next step in an experiment sequence, and a procedure base containing Prolog goals formed from real-time routines coded in Forth. In this paper, we demonstrate and describe the techniques and considerations used to develop this real-time expert system, and we conclude that Forth-based Prolog provides a viable implementation vehicle for this and similar applications.
Code of Federal Regulations, 2013 CFR
2013-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2012 CFR
2012-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2011 CFR
2011-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2014 CFR
2014-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Code of Federal Regulations, 2010 CFR
2010-01-01
... hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... processing and hermetically sealed containers; closures; code marking; heat processing; incubation. (a... storage and transportation as evidenced by the incubation test. (h) Lots of canned products shall be...
Predictions of Energy Savings in HVAC Systems by Lumped Models (Preprint)
2010-04-14
... various control devices into a simulated HVAC system. Controls contain a setpoint of 26.7 °C. The adjustable damper, variable chiller work input, and variable fan speed contain values of αP of -1.0, 0.1, and 1.0, respectively. ... An approach to optimizing the energy ... suggest an order of magnitude greater energy savings using a variable chiller power control approach compared to control damper and variable-drive ...
The development of fire evaluation system for detention and correctional occupancies
NASA Astrophysics Data System (ADS)
Nelson, H. E.; Shibe, A. J.
1984-12-01
A fire safety evaluation system for detention and correctional occupancies was developed. It can be used for determining if a facility has fire safety equivalent to that obtained by meeting the requirement of a given code. The system was calibrated for use with proposed chapters for detention and correctional occupancies of the Life Safety Code (1985). There are separate sets of requirements for each of four use conditions: one for zoned egress, one for zoned impeded egress, one for impeded egress, and one for contained. Within each set, there are two levels of evaluation: one for partially sprinklered and nonsprinklered buildings, and one for totally sprinklered buildings.
Finite element methods in a simulation code for offshore wind turbines
NASA Astrophysics Data System (ADS)
Kurz, Wolfgang
1994-06-01
Offshore installation of wind turbines will become important for electricity supply in the future. Wind conditions above the sea are more favorable than on land, and appropriate locations on land are limited and restricted. The dynamic behavior of advanced wind turbines is investigated with digital simulations to reduce time and cost in the development and design phase. A wind turbine can be described and simulated as a multi-body system containing rigid and flexible bodies. Simulation of the non-linear motion of such a mechanical system using a multi-body system code is much faster than using a finite element code. However, a modal representation of the deformation field has to be incorporated in the multi-body system approach. The equations of motion of flexible bodies due to deformation are generated by finite element calculations. At Delft University of Technology the simulation code DUWECS has been developed, which simulates the non-linear behavior of wind turbines in the time domain. The wind turbine is divided into subcomponents which are represented by modules (e.g. rotor, tower, etc.).
ICCE/ICCAI 2000 Full & Short Papers (System Design and Development).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on system design and development from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a code restructuring tool to help scaffold novice programmers; a framework for Internet-based…
49 CFR 1152.12 - Filing and publication.
Code of Federal Regulations, 2010 CFR
2010-10-01
... filed. (b) The color-coded system diagram map or narrative, any amendments, and accompanying line... through 3 lines or lines being revised, a notice containing: (i) A black-and-white copy of the system... black-and-white) or narrative; and (4) Notify interested persons of this availability through its...
EPA Office of Water (OW): SDWIS - HUC12 Densities for Public Surface Water and Groundwater Sources
Public Water System location points, based on information from the Safe Drinking Water Act Information System (SDWIS/Federal) for a 2010 third quarter (SDWIS_2010Q3) baseline period, were applied to relate system latitude and longitude coordinates (LatLongs) to Watershed Boundary Dataset subwatershed polygons (HUC12s). This HUC12 table can be mapped through setting up appropriate table relationships on the attribute HUC_12 with the HUC12 GIS layer that is part of EPA's Reach Address Database (RAD) Version 3. At the present time, the RAD Version 3 contains HUC12 polygons for the conterminous United States (CONUS), Hawaii, Puerto Rico, and the U.S. Virgin Islands (materials for Alaska or for other territories and dependencies are not available as of February, 2010). The records in this table are based on a special QUERY created by the EPA Office of Ground Water and Drinking Water (OGWDW) from the primary SDWIS/FED information to provide a robust point representation for a PWS system. PWS points are selected based on the following prioritization: 1. If the system has a treatment plant with LatLongs and MAD codes; 2. If the system has a treatment plant with LatLongs but without MAD codes; 3. If the system has a well with LatLongs and MAD codes; 4. If the system has a well with LatLongs but without MAD codes; 5. If the system has an intake with LatLongs and MAD codes; 6. If the system has an intake with LatLongs but without MAD codes; 7. If the system has any source
System, methods and apparatus for program optimization for multi-threaded processor architectures
Bastoul, Cedric; Lethin, Richard A; Leung, Allen K; Meister, Benoit J; Szilagyi, Peter; Vasilache, Nicolas T; Wohlford, David E
2015-01-06
Methods, apparatus and computer software product for source code optimization are provided. In an exemplary embodiment, a first custom computing apparatus is used to optimize the execution of source code on a second computing apparatus. In this embodiment, the first custom computing apparatus contains a memory, a storage medium and at least one processor with at least one multi-stage execution unit. The second computing apparatus contains at least two multi-stage execution units that allow for parallel execution of tasks. The first custom computing apparatus optimizes the code for parallelism, locality of operations and contiguity of memory accesses on the second computing apparatus. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims.
EDS V25 containment vessel explosive qualification test report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudolphi, John Joseph
2012-04-01
The V25 containment vessel was procured by the Project Manager, Non-Stockpile Chemical Materiel (PMNSCM) as a replacement vessel for use on the P2 Explosive Destruction Systems. It is the first EDS vessel to be fabricated under Code Case 2564 of the ASME Boiler and Pressure Vessel Code, which provides rules for the design of impulsively loaded vessels. The explosive rating for the vessel based on the Code Case is nine (9) pounds TNT-equivalent for up to 637 detonations. This limit is an increase from the 4.8-pound TNT-equivalency rating for previous vessels. This report describes the explosive qualification tests that were performed in the vessel as part of the process for qualifying the vessel for explosive use. The tests consisted of an 11.25-pound TNT-equivalent bare charge detonation followed by a 9-pound TNT-equivalent detonation.
47 CFR 11.51 - EAS code and Attention Signal Transmission requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... transmitting requirements contained in this section for the combined stations or systems with one EAS Encoder... the encoder. (2) Manual interrupt of programming and transmission of EAS messages may be used. EAS...
Tang, Wan; Chen, Min; Ni, Jin; Yang, Ximin
2011-01-01
The traditional Radio Frequency Identification (RFID) system, in which the information maintained in tags is passive and static, has no intelligent decision-making ability to suit application and environment dynamics. The Second-Generation RFID (2G-RFID) system, referred as 2G-RFID-sys, is an evolution of the traditional RFID system to ensure better quality of service in future networks. Due to the openness of the active mobile codes in the 2G-RFID system, the realization of conveying intelligence brings a critical issue: how can we make sure the backend system will interpret and execute mobile codes in the right way without misuse so as to avoid malicious attacks? To address this issue, this paper expands the concept of Role-Based Access Control (RBAC) by introducing context-aware computing, and then designs a secure middleware for backend systems, named Two-Level Security Enhancement Mechanism or 2L-SEM, in order to ensure the usability and validity of the mobile code through contextual authentication and role analysis. According to the given contextual restrictions, 2L-SEM can filtrate the illegal and invalid mobile codes contained in tags. Finally, a reference architecture and its typical application are given to illustrate the implementation of 2L-SEM in a 2G-RFID system, along with the simulation results to evaluate how the proposed mechanism can guarantee secure execution of mobile codes for the system. PMID:22163983
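A schematic of the two-level idea (a contextual check followed by role analysis) is sketched below with hypothetical class and permission names; it is not the 2L-SEM middleware's actual interface.

```python
# Schematic of the two-level filtering idea (contextual check, then role
# analysis); class, field, and permission names are hypothetical.
from dataclasses import dataclass

ROLE_PERMISSIONS = {"reader": {"query"}, "handler": {"query", "update"}}
ALLOWED_LOCATIONS = {"warehouse-3", "dock-1"}

@dataclass
class MobileCode:
    issuer_role: str       # role claimed by the tag's mobile code
    operation: str         # what the code asks the backend to execute
    location: str          # context in which the tag was read

def level1_context_ok(code: MobileCode) -> bool:
    """Level 1: contextual authentication - reject codes read outside the
    contexts in which execution is permitted."""
    return code.location in ALLOWED_LOCATIONS

def level2_role_ok(code: MobileCode) -> bool:
    """Level 2: role analysis - the requested operation must fall within the
    permissions granted to the claimed role."""
    return code.operation in ROLE_PERMISSIONS.get(code.issuer_role, set())

def admit(code: MobileCode) -> bool:
    return level1_context_ok(code) and level2_role_ok(code)

print(admit(MobileCode("reader", "update", "warehouse-3")))   # False: role lacks permission
```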
MCNP calculations for container inspection with tagged neutrons
NASA Astrophysics Data System (ADS)
Boghen, G.; Donzella, A.; Filippini, V.; Fontana, A.; Lunardon, M.; Moretto, S.; Pesente, S.; Zenoni, A.
2005-12-01
We are developing an innovative tagged neutrons inspection system (TNIS) for cargo containers: the system will allow us to assay the chemical composition of suspect objects, previously identified by a standard X-ray radiography. The operation of the system is extensively being simulated by using the MCNP Monte Carlo code to study different inspection geometries, cargo loads and hidden threat materials. Preliminary simulations evaluating the signal and the signal over background ratio expected as a function of the system parameters are presented. The results for a selection of cases are briefly discussed and demonstrate that the system can operate successfully in different filling conditions.
ALPS - A LINEAR PROGRAM SOLVER
NASA Technical Reports Server (NTRS)
Viterna, L. A.
1994-01-01
Linear programming is a widely-used engineering and management tool. Scheduling, resource allocation, and production planning are all well-known applications of linear programs (LP's). Most LP's are too large to be solved by hand, so over the decades many computer codes for solving LP's have been developed. ALPS, A Linear Program Solver, is a full-featured LP analysis program. ALPS can solve plain linear programs as well as more complicated mixed integer and pure integer programs. ALPS also contains an efficient solution technique for pure binary (0-1 integer) programs. One of the many weaknesses of LP solvers is the lack of interaction with the user. ALPS is a menu-driven program with no special commands or keywords to learn. In addition, ALPS contains a full-screen editor to enter and maintain the LP formulation. These formulations can be written to and read from plain ASCII files for portability. For those less experienced in LP formulation, ALPS contains a problem "parser" which checks the formulation for errors. ALPS creates fully formatted, readable reports that can be sent to a printer or output file. ALPS is written entirely in IBM's APL2/PC product, Version 1.01. The APL2 workspace containing all the ALPS code can be run on any APL2/PC system (AT or 386). On a 32-bit system, this configuration can take advantage of all extended memory. The user can also examine and modify the ALPS code. The APL2 workspace has also been "packed" to be run on any DOS system (without APL2) as a stand-alone "EXE" file, but has limited memory capacity on a 640K system. A numeric coprocessor (80X87) is optional but recommended. The standard distribution medium for ALPS is a 5.25 inch 360K MS-DOS format diskette. IBM, IBM PC and IBM APL2 are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
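ALPS itself is a menu-driven APL2 program, so no ALPS syntax is shown here; the snippet below only illustrates, in Python, the kind of small resource-allocation LP such a solver accepts.

```python
from scipy.optimize import linprog

# Maximize profit 3x + 5y subject to machine-hour limits (a toy resource-
# allocation problem of the kind an LP solver accepts). linprog minimizes,
# so the objective coefficients are negated.
c = [-3.0, -5.0]
A_ub = [[1.0, 2.0],     # assembly hours
        [3.0, 2.0]]     # finishing hours
b_ub = [14.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimal production plan and profit
```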
Wavelet-based compression of pathological images for telemedicine applications
NASA Astrophysics Data System (ADS)
Chen, Chang W.; Jiang, Jianfei; Zheng, Zhiyong; Wu, Xue G.; Yu, Lun
2000-05-01
In this paper, we present the performance evaluation of wavelet-based coding techniques as applied to the compression of pathological images for application in an Internet-based telemedicine system. We first study how well suited the wavelet-based coding is as it applies to the compression of pathological images, since these images often contain fine textures that are often critical to the diagnosis of potential diseases. We compare the wavelet-based compression with the DCT-based JPEG compression in the DICOM standard for medical imaging applications. Both objective and subjective measures have been studied in the evaluation of compression performance. These studies are performed in close collaboration with expert pathologists who have conducted the evaluation of the compressed pathological images and communication engineers and information scientists who designed the proposed telemedicine system. These performance evaluations have shown that the wavelet-based coding is suitable for the compression of various pathological images and can be integrated well with the Internet-based telemedicine systems. A prototype of the proposed telemedicine system has been developed in which the wavelet-based coding is adopted for the compression to achieve bandwidth efficient transmission and therefore speed up the communications between the remote terminal and the central server of the telemedicine system.
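The paper's wavelet coder is not specified in the abstract; as a rough stand-in, the sketch below uses PyWavelets to decompose an image, keep only the largest coefficients, and reconstruct.

```python
import numpy as np
import pywt

def wavelet_compress(img: np.ndarray, wavelet: str = "db4", level: int = 3,
                     keep: float = 0.05) -> np.ndarray:
    """Crude wavelet compression sketch: decompose, keep only the largest
    `keep` fraction of coefficients, reconstruct. This stands in for the
    coder evaluated in the paper, whose exact scheme is not given here."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr[np.abs(arr) < thresh] = 0.0
    coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, wavelet)

# Example on a synthetic texture-like image:
img = np.random.default_rng(0).random((256, 256))
rec = wavelet_compress(img)
print(float(np.abs(rec[:256, :256] - img).mean()))   # mean reconstruction error
```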
Establishment and assessment of code scaling capability
NASA Astrophysics Data System (ADS)
Lim, Jaehyok
In this thesis, a method for using RELAP5/MOD3.3 (Patch03) code models is described to establish and assess the code scaling capability and to corroborate the scaling methodology that has been used in the design of the Purdue University Multi-Dimensional Integral Test Assembly for ESBWR applications (PUMA-E) facility. It was sponsored by the United States Nuclear Regulatory Commission (USNRC) under the program "PUMA ESBWR Tests". PUMA-E facility was built for the USNRC to obtain data on the performance of the passive safety systems of the General Electric (GE) Nuclear Energy Economic Simplified Boiling Water Reactor (ESBWR). Similarities between the prototype plant and the scaled-down test facility were investigated for a Gravity-Driven Cooling System (GDCS) Drain Line Break (GDLB). This thesis presents the results of the GDLB test, i.e., the GDLB test with one Isolation Condenser System (ICS) unit disabled. The test is a hypothetical multi-failure small break loss of coolant (SB LOCA) accident scenario in the ESBWR. The test results indicated that the blow-down phase, Automatic Depressurization System (ADS) actuation, and GDCS injection processes occurred as expected. The GDCS as an emergency core cooling system provided adequate supply of water to keep the Reactor Pressure Vessel (RPV) coolant level well above the Top of Active Fuel (TAF) during the entire GDLB transient. The long-term cooling phase, which is governed by the Passive Containment Cooling System (PCCS) condensation, kept the reactor containment system that is composed of Drywell (DW) and Wetwell (WW) below the design pressure of 414 kPa (60 psia). In addition, the ICS continued participating in heat removal during the long-term cooling phase. A general Code Scaling, Applicability, and Uncertainty (CSAU) evaluation approach was discussed in detail relative to safety analyses of Light Water Reactor (LWR). The major components of the CSAU methodology that were highlighted particularly focused on the scaling issues of experiments and models and their applicability to the nuclear power plant transient and accidents. The major thermal-hydraulic phenomena to be analyzed were identified and the predictive models adopted in RELAP5/MOD3.3 (Patch03) code were briefly reviewed.
NASA Technical Reports Server (NTRS)
Chang, Dong Kyung; Metzgar, David; Wills, Christopher; Boland, C. Richard
2003-01-01
All "minor" components of the human DNA mismatch repair (MMR) system-MSH3, MSH6, PMS2, and the recently discovered MLH3-contain mononucleotide microsatellites in their coding sequences. This intriguing finding contrasts with the situation found in the major components of the DNA MMR system-MSH2 and MLH1-and, in fact, most human genes. Although eukaryotic genomes are rich in microsatellites, non-triplet microsatellites are rare in coding regions. The recurring presence of exonal mononucleotide repeat sequences within a single family of human genes would therefore be considered exceptional.
Tobacco imagery in Bollywood films: 2006–2008
Nazar, Gaurang P; Gupta, Vinay K; Millett, Christopher; Arora, Monika
2013-01-01
Objective To estimate exposure to tobacco imagery in youth-rated Bollywood films, and examine the results in light of recent developments in India's film rating system. Methods Content coding of 44 top grossing Bollywood films (including 38 youth-rated films) released during 2006–2008 was undertaken to estimate tobacco occurrences and impressions. Results Out of the 38 youth-rated (U and U/A) films coded, 50% contained tobacco imagery. Mean tobacco occurrences were 1.9, 2.9 and 13.7 per U, U/A and adult (A) rated films, respectively. Top grossing youth-rated films delivered 1.91 billion tobacco impressions to Indian cinema audiences. Conclusions Half the youth-rated Bollywood films contain tobacco imagery resulting in large population level exposure in India, relative to other countries. Measures to reduce youth exposure to tobacco imagery through films, such as restricting access through the rating system, will complement other tobacco control measures. PMID:27326073
Supplement to the December 1974 Space Investigation Documentation System (SIDS) report
NASA Technical Reports Server (NTRS)
1975-01-01
A listing and brief description of spacecraft and experiments designed to update the December 1974 Space Investigations Documentation System (SIDS) report to March 31, 1975 is presented. The information is given in two sections. In the first, spacecraft and experiment descriptions are sorted by spacecraft common name. Within each spacecraft listing, experiments are sorted by the principal investigator's or team leader's last name. Each spacecraft entry heading contains the spacecraft common name, alternate names, NSSDC ID code, last reported state of the spacecraft, actual or planned launch date, weight, launch site and vehicle, sponsor, orbit parameters, and personnel. Each experiment entry heading contains the experiment name, NSSDC ID code, last reported status, the Office of Space Science (OSS) division, the relevant SIDS disciplines, and personnel. In the second section, all spacecraft and experiment names described in the previous section and in the December 1974 report are listed in sorted order.
Numerical, analytical, experimental study of fluid dynamic forces in seals
NASA Technical Reports Server (NTRS)
Shapiro, William; Artiles, Antonio; Aggarwal, Bharat; Walowit, Jed; Athavale, Mahesh M.; Preskwas, Andrzej J.
1992-01-01
NASA/Lewis Research Center is sponsoring a program for providing computer codes for analyzing and designing turbomachinery seals for future aerospace and engine systems. The program is made up of three principal components: (1) the development of advanced three dimensional (3-D) computational fluid dynamics codes, (2) the production of simpler two dimensional (2-D) industrial codes, and (3) the development of a knowledge based system (KBS) that contains an expert system to assist in seal selection and design. The first task has been to concentrate on cylindrical geometries with straight, tapered, and stepped bores. Improvements have been made by adoption of a colocated grid formulation and incorporation of higher-order, time-accurate schemes for transient analysis and high-order discretization schemes for spatial derivatives. This report describes the mathematical formulations and presents a variety of 2-D results, including labyrinth and brush seal flows. Extensions to 3-D are presently in progress.
NASA Technical Reports Server (NTRS)
Spiers, Gary D.
1994-01-01
Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standard. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.
VizieR Online Data Catalog: FARGO_THORIN 1.0 hydrodynamic code (Chrenko+, 2017)
NASA Astrophysics Data System (ADS)
Chrenko, O.; Broz, M.; Lambrechts, M.
2017-07-01
This archive contains the source files, documentation and example simulation setups of the FARGO_THORIN 1.0 hydrodynamic code. The program was introduced, described and used for simulations in the paper. It is built on top of the FARGO code (Masset, 2000A&AS..141..165M; Baruteau & Masset, 2008ApJ...672.1054B) and it is also interfaced with the REBOUND integrator package (Rein & Liu, 2012A&A...537A.128R). THORIN stands for Two-fluid HydrOdynamics, the Rebound integrator Interface and Non-isothermal gas physics. The program is designed for self-consistent investigations of protoplanetary systems consisting of a gas disk, a disk of small solid particles (pebbles) and embedded protoplanets. Code features: I) Non-isothermal gas disk with implicit numerical solution of the energy equation. The implemented energy source terms are: compressional heating, viscous heating, stellar irradiation, vertical escape of radiation, radiative diffusion in the midplane and radiative feedback to accretion heating of protoplanets. II) Planets evolved in 3D, with close encounters allowed. The orbits are integrated using the IAS15 integrator (Rein & Spiegel, 2015MNRAS.446.1424R). The code detects collisions among planets and resolves them as mergers. III) Refined treatment of the planet-disk gravitational interaction. The code uses a vertical averaging of the gravitational potential, as outlined in Muller & Kley (2012A&A...539A..18M). IV) Pebble disk represented by an Eulerian, pressureless and inviscid fluid. The pebble dynamics is affected by the Epstein gas drag and optionally by diffusive effects. We also implemented the drag back-reaction term into the Navier-Stokes equation for the gas. Archive summary:
/in_relax - setup of the first example simulation
/in_wplanet - setup of the second example simulation
/srcmain - source files of FARGO_THORIN
/src_reb - source files of the REBOUND integrator package to be linked with THORIN
GUNGPL3 - GNU General Public License, version 3
LICENSE - license agreement
README - simple user's guide
UserGuide.pdf - extended user's guide
refman.pdf - programmer's guide
(1 data file).
Normative lessons: codes of conduct, self-regulation and the law.
Parker, Malcolm H
2010-06-07
Good medical practice: a code of conduct for doctors in Australia provides uniform standards to be applied in relation to complaints about doctors to the new Medical Board of Australia. The draft Code was criticised for being prescriptive. The final Code employs apparently less authoritative wording than the draft Code, but the implicit obligations it contains are no less prescriptive. Although the draft Code was thought to potentially undermine trust in doctors, and stifle professional judgement in relation to individual patients, its general obligations always allowed for flexibility of application, depending on the circumstances of individual patients. Professional codes may contain some aspirational statements, but they always contain authoritative ones, and they share this feature with legal codes. In successfully diluting the apparent prescriptivity of the draft Code, the profession has lost an opportunity to demonstrate its commitment to the raison d'etre of self-regulation - the protection of patients. Professional codes are not opportunities for reflection, consideration and debate, but are outcomes of these activities.
NASA Technical Reports Server (NTRS)
Muratore, John F.
1991-01-01
Lessons learned from operational real time expert systems are examined. The basic system architecture is discussed. An expert system is any software that performs tasks to a standard that would normally require a human expert. An expert system implies knowledge contained in data rather than code. And an expert system implies the use of heuristics as well as algorithms. The 15 top lessons learned by the operation of a real time data system are presented.
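The point that knowledge lives in data rather than in code can be illustrated with a minimal forward-chaining sketch: the rules are plain data structures interpreted by a small, generic engine, so the knowledge base can change without modifying the program. The telemetry-style facts and rules below are invented for illustration and do not come from the system described.

```python
# Hedged sketch: rules as data, interpreted by a tiny forward-chaining engine.
# Facts and rules are invented examples, not from the referenced system.
facts = {"pump_pressure_low", "flow_rate_dropping"}

# Each rule: (set of conditions, fact to assert when all conditions hold).
rules = [
    ({"pump_pressure_low", "flow_rate_dropping"}, "possible_leak"),
    ({"possible_leak"}, "alert_flight_controller"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are satisfied until quiescent."""
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain(set(facts), rules))
# {'pump_pressure_low', 'flow_rate_dropping', 'possible_leak', 'alert_flight_controller'}
```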
Automated Classification of Power Signals
2008-06-01
determine when a transient occurs. The identification of this signal can then be determined by an expert classifier and a series of these...the manual identification and classification of system events. Once events were located, the characteristics were examined to determine if system... identification code, which varies depending on the system classifier that is specified. Figure 3-7 provides an example of a Linux directory containing
NASA Technical Reports Server (NTRS)
Jefferies, K.
1994-01-01
OFFSET is a ray tracing computer code for optical analysis of a solar collector. The code models the flux distributions within the receiver cavity produced by reflections from the solar collector. It was developed to model the offset solar collector of the solar dynamic electric power system being developed for Space Station Freedom. OFFSET has been used to improve the understanding of the collector-receiver interface and to guide the efforts of NASA contractors also researching the optical components of the power system. The collector for Space Station Freedom consists of 19 hexagonal panels each containing 24 triangular, reflective facets. Current research is geared toward optimizing flux distribution inside the receiver via changes in collector design and receiver orientation. OFFSET offers many options for experimenting with the design of the system. The offset parabolic collector model configuration is determined by an input file of facet corner coordinates. The user may choose other configurations by changing this file, but to simulate collectors that have other than 19 groups of 24 triangular facets would require modification of the FORTRAN code. Each of the roughly 500 facets in the assembled collector may be independently aimed to smooth out, or tailor, the flux distribution on the receiver's wall. OFFSET simulates the effects of design changes such as in receiver aperture location, tilt angle, and collector facet contour. Unique features of OFFSET include: 1) equations developed to pseudo-randomly select ray originating sources on the Sun which appear evenly distributed and include solar limb darkening; 2) Cone-optics technique used to add surface specular error to the ray originating sources to determine the apparent ray sources of the reflected sun; 3) choice of facet reflective surface contour -- spherical, ideal parabolic, or toroidal; 4) Gaussian distributions of radial and tangential components of surface slope error added to the surface normals at the ten nodal points on each facet; and 5) color contour plots of receiver incident flux distribution generated by PATRAN processing of FORTRAN computer code output. OFFSET output includes a file of input data for confirmation, a PATRAN results file containing the values necessary to plot the flux distribution at the receiver surface, a PATRAN results file containing the intensity distribution on a 40 x 40 cm area of the receiver aperture plane, a data file containing calculated information on the system configuration, a file including the X-Y coordinates of the target points of each collector facet on the aperture opening, and twelve P/PLOT input data files to allow X-Y plotting of various results data. OFFSET is written in FORTRAN (70%) for the IBM VM operating system. The code contains PATRAN statements (12%) and P/PLOT statements (18%) for generating plots. Once the program has been run on VM (or an equivalent system), the PATRAN and P/PLOT files may be transferred to a DEC VAX (or equivalent system) with access to PATRAN for PATRAN post processing. OFFSET was written in 1988 and last updated in 1989. PATRAN is a registered trademark of PDA Engineering. IBM is a registered trademark of International Business Machines Corporation. DEC VAX is a registered trademark of Digital Equipment Corporation.
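Feature (1), the pseudo-random selection of ray-originating sources on the Sun with limb darkening, can be illustrated by a simple rejection-sampling sketch. The linear limb-darkening law and the coefficient used below are illustrative assumptions and need not match the equations actually implemented in OFFSET.

```python
# Hedged sketch: sample ray origins on the solar disk weighted by a linear
# limb-darkening profile I(mu) = 1 - u*(1 - mu). The coefficient u = 0.56 is an
# assumed, representative value.
import numpy as np

rng = np.random.default_rng(1)

def sample_disk_sources(n, u=0.56):
    """Rejection-sample (x, y) points on a unit-radius solar disk."""
    pts = []
    while len(pts) < n:
        r = np.sqrt(rng.random())              # uniform-in-area candidate radius
        phi = 2.0 * np.pi * rng.random()
        mu = np.sqrt(1.0 - r * r)              # cosine of the emergent angle
        if rng.random() < (1.0 - u * (1.0 - mu)):   # accept with prob I(mu)/I(0)
            pts.append((r * np.cos(phi), r * np.sin(phi)))
    return np.array(pts)

sources = sample_disk_sources(10_000)
print(sources.mean(axis=0))   # close to (0, 0); the limb is sampled less often
```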
Time-Dependent Simulation of Incompressible Flow in a Turbopump Using Overset Grid Approach
NASA Technical Reports Server (NTRS)
Kiris, Cetin; Kwak, Dochan
2001-01-01
This paper reports the progress being made towards complete unsteady turbopump simulation capability by using overset grid systems. A computational model of a turbo-pump impeller is used as a test case for the performance evaluation of the MPI, hybrid MPI/Open-MP, and MLP versions of the INS3D code. Relative motion of the grid system for rotor-stator interaction was obtained by employing overset grid techniques. Unsteady computations for a turbo-pump, which contains 114 zones with 34.3 Million grid points, are performed on Origin 2000 systems at NASA Ames Research Center. The approach taken for these simulations, and the performance of the parallel versions of the code are presented.
LOFT. Containment and service building (TAN650). Section through east/west axis ...
LOFT. Containment and service building (TAN-650). Section through east/west axis of building as viewed from the south. Shows basement and grade levels of containment building, connection to control room on west side, air filter vaults, and duct enclosure for air exhaust system. Kaiser engineers 6413-11-STEP/LOFT-650-A-4. Date: October 1964. INEEL index code no. 036-650-00-486-122216 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID
Automotive Gas Turbine Power System-Performance Analysis Code
NASA Technical Reports Server (NTRS)
Juhasz, Albert J.
1997-01-01
An open cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e. thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
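A minimal sketch of the kind of cycle state-point calculation such a code performs is shown below, using an ideal, cold-air-standard open Brayton cycle. The pressure ratio, turbine inlet temperature, and gas properties are illustrative assumptions, not values from the NASA code.

```python
# Hedged sketch: ideal Brayton cycle state points, specific work, and thermal
# efficiency. All numbers are assumed, illustrative inputs.
gamma = 1.4          # ratio of specific heats for air
cp = 1.005           # kJ/(kg K)
T1 = 288.0           # compressor inlet temperature, K
T3 = 1300.0          # turbine inlet temperature, K
pr = 8.0             # compressor pressure ratio

T2 = T1 * pr ** ((gamma - 1.0) / gamma)   # isentropic compressor exit temperature
T4 = T3 / pr ** ((gamma - 1.0) / gamma)   # isentropic turbine exit temperature

w_comp = cp * (T2 - T1)      # specific compressor work, kJ/kg
w_turb = cp * (T3 - T4)      # specific turbine work, kJ/kg
q_in = cp * (T3 - T2)        # specific heat added in the combustor, kJ/kg
eta_th = (w_turb - w_comp) / q_in

print(f"net specific work = {w_turb - w_comp:.1f} kJ/kg, thermal efficiency = {eta_th:.3f}")
```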
Documentation of the GLAS fourth order general circulation model. Volume 2: Scalar code
NASA Technical Reports Server (NTRS)
Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.
1983-01-01
Volume 2 of a three-volume technical memorandum contains detailed documentation of the GLAS fourth-order general circulation model. Volume 2 contains the CYBER 205 scalar and vector codes of the model, a list of variables, and cross references. A variable name dictionary for the scalar code and the code listings are also outlined.
Condensation model for the ESBWR passive condensers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Revankar, S. T.; Zhou, W.; Wolf, B.
2012-07-01
In the General Electric Economic Simplified Boiling Water Reactor (GE-ESBWR), the passive containment cooling system (PCCS) plays a major role in containment pressure control in case of a loss of coolant accident. The PCCS condenser must be able to remove sufficient energy from the reactor containment to prevent the containment from exceeding its design pressure following a design basis accident. There are three PCCS condensation modes depending on the containment pressurization due to coolant discharge: complete condensation, cyclic venting, and flow-through mode. The present work reviews the models and presents their predictive capability along with comparisons with existing data from separate effects tests. The condensation models in the thermal hydraulics code RELAP5 are also assessed to examine their application to the various flow modes of condensation. The default model in the code predicts complete condensation well and is basically the Nusselt solution. The UCB model predicts through-flow well. None of the condensation models in RELAP5 predicts complete condensation, cyclic venting, and through-flow condensation consistently. New condensation correlations are given that accurately predict all three modes of PCCS condensation. (authors)
Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.; Wilson, J.H.; Arwood, P.C.
The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.
GridMan: A grid manipulation system
NASA Technical Reports Server (NTRS)
Eiseman, Peter R.; Wang, Zhu
1992-01-01
GridMan is an interactive grid manipulation system. It operates on grids to produce new grids which conform to user demands. The input grids are not constrained to come from any particular source. They may be generated by algebraic methods, elliptic methods, hyperbolic methods, parabolic methods, or some combination of methods. The methods are included in the various available structured grid generation codes. These codes perform the basic assembly function for the various elements of the initial grid. For block structured grids, the assembly can be quite complex due to a large number of block corners, edges, and faces for which various connections and orientations must be properly identified. The grid generation codes are distinguished among themselves by their balance between interactive and automatic actions and by their modest variations in control. The basic form of GridMan provides a much more substantial level of grid control and will take its input from any of the structured grid generation codes. The communication link to the outside codes is a data file which contains the grid or section of grid.
Thrust Chamber Modeling Using Navier-Stokes Equations: Code Documentation and Listings. Volume 2
NASA Technical Reports Server (NTRS)
Daley, P. L.; Owens, S. F.
1988-01-01
A copy of the PHOENICS input files and FORTRAN code developed for the modeling of thrust chambers is given. These copies are contained in the Appendices. The listings are contained in Appendices A through E. Appendix A describes the input statements relevant to thrust chamber modeling as well as the FORTRAN code developed for the Satellite program. Appendix B describes the FORTRAN code developed for the Ground program. Appendices C through E contain copies of the Q1 (input) file, the Satellite program, and the Ground program respectively.
Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8
DOE Office of Scientific and Technical Information (OSTI.GOV)
First, M.W.
1991-02-01
Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)
HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCann, R.A.; Lowery, P.S.; Lessor, D.L.
1987-09-01
HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.
HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCann, R.A.; Lowery, P.S.
1987-10-01
HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
... of the Code of Federal Regulations (10 CFR), Part 50, Appendix R, Section III.O, ``Oil collection... with an oil collection system (OCS) if the containment is not inerted during normal operation and such collection systems shall be capable of collecting lube oil from all potential pressurized and unpressurized...
Program structure-based blocking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolli, Carlo; Eichenberger, Alexandre E.; O'Brien, John K.
2017-09-26
Embodiments relate to program structure-based blocking. An aspect includes receiving source code corresponding to a computer program by a compiler of a computer system. Another aspect includes determining a prefetching section in the source code by a marking module of the compiler. Yet another aspect includes performing, by a blocking module of the compiler, blocking of instructions located in the prefetching section into instruction blocks, such that the instruction blocks of the prefetching section only contain instructions that are located in the prefetching section.
ONR Far East Scientific Bulletin, Volume 7, Number 2, April-June 1982,
1982-01-01
contained source code. - PAL (Program Automation Language): PAL is a system design language that automatically generates an executable program from a... A Glimpse at... tools exist at ECL in prototype forms. Like most major computer manufacturers, they have also extended high level languages such as FORTRAN, COBOL...
Transient Heat Transfer in Coated Superconductors.
1982-10-29
of the use of the SCEPTRE code are contained in the instruction manual and the book on the code [30]. An example of an actual SCEPTRE program is given in... 22. O. Tsukomoto and S. Kobayashi, J. of Appl. Physics, 46, 1359 (1975). 23. Y. Iwasa and B.A. Apgar, Cryogenics 18, 267 (1978). 24. D.E. Baynham, V.W... Computer Program for Circuit and Systems Analysis, Prentice Hall, 1971, and J.C. Bowers et al., Users Manual for Super-Sceptre, Government Document AD/A-OIl
Structural mechanics simulations
NASA Technical Reports Server (NTRS)
Biffle, Johnny H.
1992-01-01
Sandia National Laboratory has a very broad structural capability. Work has been performed in support of reentry vehicles, nuclear reactor safety, weapons systems and components, nuclear waste transport, strategic petroleum reserve, nuclear waste storage, wind and solar energy, drilling technology, and submarine programs. The analysis environment contains both commercial and internally developed software. Included are mesh generation capabilities, structural simulation codes, and visual codes for examining simulation results. To effectively simulate a wide variety of physical phenomena, a large number of constitutive models have been developed.
3D-DIVIMP-HC modeling analysis of methane injection into DIII-D using the DiMES porous plug injector
NASA Astrophysics Data System (ADS)
Mu, Y.; McLean, A. G.; Elder, J. D.; Stangeby, P. C.; Bray, B. D.; Brooks, N. H.; Davis, J. W.; Fenstermacher, M. E.; Groth, M.; Lasnier, C. J.; Rudakov, D. L.; Watkins, J. G.; West, W. P.; Wong, C. P. C.
2009-06-01
A self-contained gas injection system for the Divertor Material Evaluation System (DiMES) on DIII-D, the porous plug injector (PPI), has been employed for in situ study of chemical erosion in the tokamak divertor environment by injection of CH4 [A.G. McLean et al., these Proceedings]. A new interpretive code, 3D-DIVIMP-HC, has been developed and applied to the interpretation of the CH, CI, and CII emissions. Particular emphasis is placed on the interpretation of 2D filtered-camera (TV) pictures in CH, CI and CII light taken from a view essentially straight down on the PPI. The code replicates sufficient measurements to conclude that most of the basic elements of the controlling physics and chemistry have been identified and incorporated in the code-model.
The MINERVA Software Development Process
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.
2017-01-01
This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
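The geometric core of a geo-containment check, testing whether the aircraft position lies inside a predetermined polygon region, can be sketched with a standard ray-casting test. The sketch below is illustrative only; it is not one of the formally verified MINERVA algorithms.

```python
# Hedged sketch: ray-casting point-in-polygon test for a geo-containment check.
# The square example region is an assumed, illustrative geofence.
def point_in_polygon(x, y, polygon):
    """Count crossings of a horizontal ray from (x, y) with the polygon edges."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                      # edge straddles the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

region = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]   # simple square
print(point_in_polygon(5.0, 5.0, region))    # True: position is inside the region
print(point_in_polygon(12.0, 5.0, region))   # False: a recovery maneuver would trigger
```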
NASA Technical Reports Server (NTRS)
Johnson, Sherylene (Compiler); Bertelrud, Arild (Compiler); Anders, J. B. (Technical Monitor)
2002-01-01
This report is part of a series of reports describing a flow physics high-lift experiment conducted in NASA Langley Research Center's Low-Turbulence Pressure Tunnel (LTPT) in 1996. The anemometry system used in the experiment was originally designed for and used in flight tests with NASA's Boeing 737 airplane. Information that may be useful in the evaluation or use of the experimental data has been compiled. The report also contains details regarding record structure, how to read the embedded time code, and the output file formats used by the code that reads the binary data.
The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics
NASA Astrophysics Data System (ADS)
Ganander, Hans
2003-10-01
For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. The relations between the aeroelastic properties of these new large turbines change, and modifications of turbine designs and control concepts are also influenced by the growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models, and this is done relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for the rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and the interest in design optimization is growing.
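The same working method can be sketched with an open-source symbolic system in place of Mathematica: derive the equation of motion of a toy one-degree-of-freedom torsional model from its Lagrangian and emit Fortran code. The model, the parameter names, and the use of SymPy are illustrative assumptions, not the actual VIDYN derivation.

```python
# Hedged sketch: symbolic Lagrangian derivation followed by Fortran code emission,
# using SymPy as a stand-in for Mathematica. Model and symbol names are assumed.
import sympy as sp

t = sp.symbols('t')
J, k = sp.symbols('J k', positive=True)        # inertia and shaft stiffness (assumed names)
theta = sp.Function('theta')(t)                # rotor torsion angle

# Lagrangian of a simple torsional oscillator: L = 1/2*J*theta_dot**2 - 1/2*k*theta**2
L = sp.Rational(1, 2) * J * theta.diff(t)**2 - sp.Rational(1, 2) * k * theta**2

# Lagrange's equation: d/dt(dL/d(theta_dot)) - dL/d(theta) = 0
eom = sp.diff(L.diff(theta.diff(t)), t) - L.diff(theta)
theta_ddot = sp.solve(sp.Eq(eom, 0), theta.diff(t, 2))[0]
print("theta'' =", theta_ddot)                 # -k*theta(t)/J

# Emit the right-hand side as Fortran, mirroring the generate-then-compile workflow.
th = sp.Symbol('th')
print(sp.fcode(theta_ddot.subs(theta, th), assign_to='thddot', standard=95))
```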
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2013 CFR
2013-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2012 CFR
2012-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2014 CFR
2014-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2011 CFR
2011-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2010 CFR
2010-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
Salisbury, Joseph P; Sîrbulescu, Ruxandra F; Moran, Benjamin M; Auclair, Jared R; Zupanc, Günther K H; Agar, Jeffrey N
2015-03-11
The brown ghost knifefish (Apteronotus leptorhynchus) is a weakly electric teleost fish of particular interest as a versatile model system for a variety of research areas in neuroscience and biology. The comprehensive information available on the neurophysiology and neuroanatomy of this organism has enabled significant advances in such areas as the study of the neural basis of behavior, the development of adult-born neurons in the central nervous system and their involvement in the regeneration of nervous tissue, as well as brain aging and senescence. Despite substantial scientific interest in this species, no genomic resources are currently available. Here, we report the de novo assembly and annotation of the A. leptorhynchus transcriptome. After evaluating several trimming and transcript reconstruction strategies, de novo assembly using Trinity uncovered 42,459 unique contigs containing at least a partial protein-coding sequence based on alignment to a reference set of known Actinopterygii sequences. As many as 11,847 of these contigs contained full or near-full length protein sequences, providing broad coverage of the proteome. A variety of non-coding RNA sequences were also identified and annotated, including conserved long intergenic non-coding RNA and other long non-coding RNA observed previously to be expressed in adult zebrafish (Danio rerio) brain, as well as a variety of miRNA, snRNA, and snoRNA. Shotgun proteomics confirmed translation of open reading frames from over 2,000 transcripts, including alternative splice variants. Assignment of tandem mass spectra was greatly improved by use of the assembly compared to databases of sequences from closely related organisms. The assembly and raw reads have been deposited at DDBJ/EMBL/GenBank under the accession number GBKR00000000. Tandem mass spectrometry data is available via ProteomeXchange with identifier PXD001285. Presented here is the first release of an annotated de novo transcriptome assembly from Apteronotus leptorhynchus, providing a broad overview of RNA expressed in central nervous system tissue. The assembly, which includes substantial coverage of a wide variety of both protein coding and non-coding transcripts, will allow the development of better tools to understand the mechanisms underlying unique characteristics of the knifefish model system, such as their tremendous regenerative capacity and negligible brain senescence.
MELCOR/CONTAIN LMR Implementation Report-Progress FY15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L.; Louie, David L.Y.
2016-01-01
This report describes the progress of the CONTAIN-LMR sodium physics and chemistry models to be implemented into MELCOR 2.1. It also describes the progress in implementing these models into CONTAIN 2. In the past two years, the implementation included the addition of sodium equations of state and sodium properties from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid to model nuclear fusion safety research. The second source uses properties generated for the SIMMER code. Testing and results from this implementation of sodium properties are given. In addition, the CONTAIN-LMR code was derived from an early version of the CONTAIN code. Many physical models that were developed since this early version of CONTAIN are not captured by that code version. Therefore, CONTAIN 2 is being updated with the sodium models in CONTAIN-LMR in order to facilitate verification of these models against the MELCOR code. Although CONTAIN 2, which represents the latest development of CONTAIN, now contains many of the sodium-specific models, this work is not complete due to challenges from the lower cell architecture in CONTAIN 2, which is different from CONTAIN-LMR. This implementation should be completed in the coming year, while sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use. In terms of implementing the sodium models into MELCOR, a separate sodium model branch was created for this work. Because of massive development in the mainstream MELCOR 2.1 code and the requirement to merge the latest code version into this branch, the integration of the sodium models was redirected to implement the sodium chemistry models first. This change led to delays in the actual implementation. To aid the future implementation of sodium models, a new sodium chemistry package was created. Reporting on the implementation of the sodium chemistry is thus included in this report.
NASA Astrophysics Data System (ADS)
Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.
1990-10-01
Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are push-broom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.
Unified method of knowledge representation in the evolutionary artificial intelligence systems
NASA Astrophysics Data System (ADS)
Bykov, Nickolay M.; Bykova, Katherina N.
2003-03-01
The evolution of artificial intelligence systems, driven by the growing complexity of their application domains and by scientific progress, has led to a diversification of the methods and algorithms used to represent and apply knowledge in these systems. For this reason it is often difficult to design effective methods of knowledge discovery and manipulation for such systems. In this work the authors propose a method for the unified representation of a system's knowledge about objects of the external world by rank transformation of their descriptions, formed in different feature spaces: deterministic, probabilistic, fuzzy and others. A proof is presented that the information about the rank configuration of the object states in the feature space is sufficient for decision making. It is shown that the geometrical and combinatorial model of the set of rank configurations can be introduced through a group of an incidence system, which allows the information about them to be stored in a compact, convolved form. A method for describing the rank configuration with a DRP code (distance rank preserving code) is proposed. The questions of its completeness, information capacity, noise immunity and privacy are reviewed. It is shown that the capacity of a transmission channel for such a representation of the information is greater than unity, since the code words contain information both about the object states and about the distance ranks between them. An effective data-clustering algorithm for identifying the object states, based on the use of this code, is described. Knowledge representation with the help of rank configurations makes it possible to unify and simplify decision-making algorithms by performing logical operations on the DRP code words. Examples of the operation of the proposed clustering technique on a given sample set, the rank configuration of the resulting clusters and its DRP codes are presented.
Theory of Mind: A Neural Prediction Problem
Koster-Hale, Jorie; Saxe, Rebecca
2014-01-01
Predictive coding posits that neural systems make forward-looking predictions about incoming information. Neural signals contain information not about the currently perceived stimulus, but about the difference between the observed and the predicted stimulus. We propose to extend the predictive coding framework from high-level sensory processing to the more abstract domain of theory of mind; that is, to inferences about others’ goals, thoughts, and personalities. We review evidence that, across brain regions, neural responses to depictions of human behavior, from biological motion to trait descriptions, exhibit a key signature of predictive coding: reduced activity to predictable stimuli. We discuss how future experiments could distinguish predictive coding from alternative explanations of this response profile. This framework may provide an important new window on the neural computations underlying theory of mind. PMID:24012000
From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation
Blazewicz, Marek; Hinder, Ian; Koppelman, David M.; ...
2013-01-01
Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of variables and thousands of terms.
The queueing perspective of asynchronous network coding in two-way relay network
NASA Astrophysics Data System (ADS)
Liang, Yaping; Chang, Qing; Li, Xianxu
2018-04-01
Asynchronous network coding (NC) has the potential to improve wireless network performance compared with routing or synchronous network coding. Recent research concentrates on the trade-off between throughput/energy consumption and delay for a pair of independent input flows. However, the implementation of NC requires a thorough investigation of its impact on the relevant queueing systems, which few works address. Moreover, few works study the probability density function (pdf) of the output in the network coding scenario. In this paper, the scenario with two independent Poisson input flows and one output flow is considered. The asynchronous NC-based strategy is that a new arrival evicts the head packet waiting in its own queue for a packet from the other flow to encode with. The pdf of the output flow, which contains both coded and uncoded packets, is derived. In addition, the statistical characteristics of this strategy are analyzed. These results are verified by numerical simulations.
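A Monte Carlo sketch of the eviction strategy is given below. The one-packet-per-flow buffer, the rule that an evicted head packet departs uncoded, and the arrival rates are assumptions made for illustration; they are not taken from the paper's analytical model.

```python
# Hedged sketch: simulate two Poisson flows meeting at a relay. An arrival either
# encodes with a waiting packet of the other flow, or evicts its own flow's stale
# head packet, which then departs uncoded. Rates and buffer rules are assumed.
import numpy as np

rng = np.random.default_rng(0)

def simulate(lam_a=1.0, lam_b=0.7, n_events=100_000):
    # Generate arrival times for both flows and merge them into one event stream.
    t_a = np.cumsum(rng.exponential(1.0 / lam_a, n_events))
    t_b = np.cumsum(rng.exponential(1.0 / lam_b, n_events))
    events = sorted([(t, 'A') for t in t_a] + [(t, 'B') for t in t_b])

    waiting = {'A': False, 'B': False}   # is a packet of this flow waiting?
    departures = []                      # (time, 'coded' | 'uncoded')
    for t, flow in events:
        other = 'B' if flow == 'A' else 'A'
        if waiting[other]:               # partner found: send one coded packet
            waiting[other] = False
            departures.append((t, 'coded'))
        else:
            if waiting[flow]:            # evict the stale head packet, send it uncoded
                departures.append((t, 'uncoded'))
            waiting[flow] = True         # the new arrival now waits for a partner
    return departures

deps = simulate()
kinds = np.array([k for _, k in deps])
times = np.array([t for t, _ in deps])
print("fraction of coded departures:", np.mean(kinds == 'coded'))
print("mean inter-departure time   :", np.mean(np.diff(times)))
```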
An ultraviolet-visible spectrophotometer automation system. Part 3: Program documentation
NASA Astrophysics Data System (ADS)
Roth, G. S.; Teuschler, J. M.; Budde, W. L.
1982-07-01
The Ultraviolet-Visible Spectrophotometer (UVVIS) automation system accomplishes 'on-line' spectrophotometric quality assurance determinations, report generation, plot generation and data reduction for chlorophyll or color analysis. This system also has the capability to process manually entered data for the analysis of chlorophyll or color. For each program of the UVVIS system, this document contains a program description, flowchart, variable dictionary, code listing, and symbol cross-reference table. Also included are descriptions of file structures and of routines common to all automated analyses. The programs are written in Data General extended BASIC, Revision 4.3, under the RDOS operating system, Revision 6.2. The BASIC code has been enhanced for real-time data acquisition, which is accomplished by CALL statements to assembly language subroutines. Two other related publications are 'An Ultraviolet-Visible Spectrophotometer Automation System - Part I Functional Specifications,' and 'An Ultraviolet-Visible Spectrophotometer Automation System - Part II User's Guide.'
Cantwell, Kate; Morgans, Amee; Smith, Karen; Livingston, Michael; Dietze, Paul
2014-02-01
This paper aims to examine whether an adaptation of the International Classification of Disease (ICD) coding system can be applied retrospectively to final paramedic assessment data in an ambulance dataset with a view to developing more fine-grained, clinically relevant case definitions than are available through point-of-call data. Over 1.2 million case records were extracted from the Ambulance Victoria data warehouse. Data fields included dispatch code, cause (CN) and final primary assessment (FPA). Each FPA was converted to an ICD-10-AM code using word matching or best fit. ICD-10-AM codes were then converted into Major Diagnostic Categories (MDC). CN was aligned with the ICD-10-AM codes for external cause of morbidity and mortality. The most accurate results were obtained when ICD-10-AM codes were assigned using information from both FPA and CN. Comparison of cases coded as unconscious at point-of-call with the associated paramedic assessment highlighted the extra clinical detail obtained when paramedic assessment data are used. Ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Coding of ambulance data using ICD-10-AM allows for comparison of not only ambulance service users but also with other population groups. WHAT IS KNOWN ABOUT THE TOPIC? There is no reliable and standard coding and categorising system for paramedic assessment data contained in ambulance service databases. WHAT DOES THIS PAPER ADD? This study demonstrates that ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Representation of ambulance case types using ICD-10-AM-coded information obtained after paramedic assessment is more fine grained and clinically relevant than point-of-call data, which uses caller information before ambulance attendance. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? This paper describes a model of coding using an internationally recognised standard coding and categorising system to support analysis of paramedic assessment. Ambulance data coded using ICD-10-AM allows for reliable reporting and comparison within the prehospital setting and across the healthcare industry.
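The word-matching step can be sketched as a dictionary lookup with a fuzzy fallback, as below. The three example FPA-to-ICD-10-AM mappings and the similarity cutoff are illustrative assumptions, not the actual lookup table used in the study.

```python
# Hedged sketch: map a free-text final paramedic assessment (FPA) to an ICD-10-AM
# code by exact lookup, with a fuzzy "best fit" fallback. The mappings and cutoff
# below are invented for illustration.
import difflib

FPA_TO_ICD = {
    "chest pain": "R07.4",          # Chest pain, unspecified
    "abdominal pain": "R10.4",      # Other and unspecified abdominal pain
    "fracture of femur": "S72.9",   # Fracture of femur, part unspecified
}

def assign_icd(fpa_text, cutoff=0.6):
    """Return an ICD-10-AM code by exact match, else by best fuzzy match."""
    key = fpa_text.strip().lower()
    if key in FPA_TO_ICD:
        return FPA_TO_ICD[key]
    close = difflib.get_close_matches(key, list(FPA_TO_ICD), n=1, cutoff=cutoff)
    return FPA_TO_ICD[close[0]] if close else None

print(assign_icd("Chest Pain"))        # exact match after normalisation -> R07.4
print(assign_icd("abdomenal pain"))    # misspelling handled by fuzzy match -> R10.4
```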
Patient health record on a smart card.
Naszlady, A; Naszlady, J
1998-02-01
A validated health questionnaire has been used for the documentation of a patient's history (826 items) and of the findings from physical examination (591 items) in our clinical ward for 25 years. This computerized patient record has been completed with EUCLIDES codes (CEN TC/251) for laboratory tests and with an ATC and EAN code listing for the names of the drugs permanently required by the patient. In addition, emergency data were also included on an EEPROM chipcard with a 24 kb capacity. The program is written in the FOX-PRO language. A group of 5000 chronically ill in-patients received these cards, which contain their health data. For security reasons the contents of the smart card are only accessible with a doctor's PIN-coded key card. The personalization of each card was carried out in our health center and the depersonalized alphanumeric data were collected for further statistical evaluation. This information served as a basis for a real need assessment of health care and for the calculation of its cost. Combined with an optical card, a completely paperless electronic patient record system has been developed containing all three information carriers in medicine: texts, curves and pictures.
Cost Estimation of Post Production Software Support in Ground Combat Systems
2007-09-01
request, year of the request, EINOMEN (a word description of the system), and a PRON (a unique identifier containing the year and weapons system... variance. The fiscal year of request, descriptive name (coded as EINOMEN), unique program identifier (PRON), amount funded, and total amount requested... entire data set loses this sophistication. Most of the unique PRONs in the database map to a specific ground combat system, as described in the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werley, Kenneth Alan; Mccown, Andrew William
The EPREP code is designed to evaluate the effects of an Electro-Magnetic Pulse (EMP) on the electric power transmission system. The EPREP code embodies an umbrella framework that allows a user to set up analysis conditions and to examine analysis results. The code links to three major physics/engineering modules. The first module describes the EM wave in space and time. The second module evaluates the damage caused by the wave on specific electric power (EP) transmission system components. The third module evaluates the consequence of the damaged network on its (reduced) ability to provide electric power to meet demand. This third module is the focus of the present paper. The EMPACT code serves as the third module. The EMPACT name denotes EMP effects on Alternating Current Transmission systems. The EMPACT algorithms compute electric power transmission network flow solutions under severely damaged network conditions. Initial solutions are often characterized by unacceptable network conditions including line overloads and bad voltages. The EMPACT code contains algorithms to optimally adjust network parameters to eliminate network problems while minimizing outages. System adjustments include automatically adjusting control equipment (generator V control, variable transformers, and variable shunts), as well as non-automatic control of generator power settings and minimal load shedding. The goal is to evaluate the minimal loss of customer load under equilibrium (steady-state) conditions during peak demand.
Fractional optical cryptographic protocol for data containers in a noise-free multiuser environment
NASA Astrophysics Data System (ADS)
Jaramillo, Alexis; Barrera, John Fredy; Zea, Alejandro Vélez; Torroba, Roberto
2018-03-01
Optical encryption systems have great potential for flexible and high-performance data protection, making them an area of rapid development. However, most approaches present two main issues, namely, the presence of speckle noise, and the degree of security they offer. Here we introduce an experimental implementation of an optical encrypting protocol that tackles these issues by taking advantage of recent developments in the field. These developments include the introduction of information containers for noise free information retrieval, the use of multiplexing to allow for a multiple user environment and an architecture based on the Joint fractional Fourier transform that allows increased degrees of freedom and simplifies the experimental requirements. Thus, data handling via QR code containers involving multiple users processed in a fractional joint transform correlator produce coded information with increased security and ease of use. In this way, we can guarantee that only the user with the correct combination of encryption key and security parameters can achieve noise free information after deciphering. We analyze the performance of the system when the order of the fractional Fourier transform is changed during decryption. We show experimental results that confirm the validity of our proposal.
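The general idea of protecting a binary data container with random phase keys can be sketched as follows. For simplicity the sketch uses classical double random phase encoding in the ordinary Fourier domain rather than the joint fractional Fourier transform correlator of the paper, and a random binary array stands in for an actual QR code.

```python
# Hedged sketch: double random phase encoding of a QR-like binary container in the
# ordinary Fourier domain. This is a simplified stand-in for the fractional joint
# transform correlator scheme, with invented, illustrative inputs.
import numpy as np

rng = np.random.default_rng(2)
container = rng.integers(0, 2, (64, 64)).astype(float)      # stand-in for a QR code

phase1 = np.exp(2j * np.pi * rng.random(container.shape))   # input-plane key
phase2 = np.exp(2j * np.pi * rng.random(container.shape))   # Fourier-plane key

# Encryption: multiply by key 1, go to the Fourier domain, multiply by key 2, return.
cipher = np.fft.ifft2(np.fft.fft2(container * phase1) * phase2)

# Decryption with the correct keys recovers the container (up to numerical error).
decrypted = np.abs(np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phase2)) * np.conj(phase1))
print("max reconstruction error:", np.max(np.abs(decrypted - container)))
```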
A bicistronic transgene system for genetic modification of Parthenium argentatum
USDA-ARS?s Scientific Manuscript database
Parthenium argentatum (guayule) was transformed with a bicistronic transgene containing a viral 2A cleavage sequence. The transgene includes the coding sequences of two key enzymes of the mevalonate pathway, 3-hydroxy-3-methylglutaryl-CoA reductase (HMGR) and farnesyl pyrophosphate synthase (FPPS), ...
C Language Integrated Production System, Ada Version
NASA Technical Reports Server (NTRS)
Culbert, Chris; Riley, Gary; Savely, Robert T.; Melebeck, Clovis J.; White, Wesley A.; Mcgregor, Terry L.; Ferguson, Melisa; Razavipour, Reza
1992-01-01
CLIPS/Ada provides capabilities of CLIPS v4.3 but uses Ada as source language for CLIPS executable code. Implements forward-chaining rule-based language. Program contains inference engine and language syntax providing framework for construction of expert-system program. Also includes features for debugging application program. Based on Rete algorithm which provides efficient method for performing repeated matching of patterns. Written in Ada.
Theta phase precession and phase selectivity: a cognitive device description of neural coding
NASA Astrophysics Data System (ADS)
Zalay, Osbert C.; Bardakjian, Berj L.
2009-06-01
Information in neural systems is carried by way of phase and rate codes. Neuronal signals are processed through transformative biophysical mechanisms at the cellular and network levels. Neural coding transformations can be represented mathematically in a device called the cognitive rhythm generator (CRG). Incoming signals to the CRG are parsed through a bank of neuronal modes that orchestrate proportional, integrative and derivative transformations associated with neural coding. Mode outputs are then mixed through static nonlinearities to encode (spatio) temporal phase relationships. The static nonlinear outputs feed and modulate a ring device (limit cycle) encoding output dynamics. Small coupled CRG networks were created to investigate coding functionality associated with neuronal phase preference and theta precession in the hippocampus. Phase selectivity was found to be dependent on mode shape and polarity, while phase precession was a product of modal mixing (i.e. changes in the relative contribution or amplitude of mode outputs resulted in shifting phase preference). Nonlinear system identification was implemented to help validate the model and explain response characteristics associated with modal mixing; in particular, principal dynamic modes experimentally derived from a hippocampal neuron were inserted into a CRG and the neuron's dynamic response was successfully cloned. From our results, small CRG networks possessing disynaptic feedforward inhibition in combination with feedforward excitation exhibited frequency-dependent inhibitory-to-excitatory and excitatory-to-inhibitory transitions that were similar to transitions seen in a single CRG with quadratic modal mixing. This suggests nonlinear modal mixing to be a coding manifestation of the effect of network connectivity in shaping system dynamic behavior. We hypothesize that circuits containing disynaptic feedforward inhibition in the nervous system may be candidates for interpreting upstream rate codes to guide downstream processes such as phase precession, because of their demonstrated frequency-selective properties.
Methodology, status and plans for development and assessment of Cathare code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bestion, D.; Barre, F.; Faydide, B.
1997-07-01
This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, along with the general strategy used for developing and assessing the code. Analytical experiments with separate effect tests, and component tests, are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.
Unified approach for incompressible flows
NASA Astrophysics Data System (ADS)
Chang, Tyne-Hsien
1993-12-01
A unified approach for solving both compressible and incompressible flows was investigated in this study. The difference in CFD code development between incompressible and compressible flows is due to their mathematical characteristics. However, if one modifies the continuity equation for incompressible flows by introducing pseudocompressibility, the governing equations for incompressible flows have the same mathematical character as those for compressible flows, and applying a compressible flow code to solve incompressible flows becomes feasible. Among the numerical algorithms developed for compressible flows, the Centered Total Variation Diminishing (CTVD) schemes possess better mathematical properties to damp out spurious oscillations while providing high-order accuracy for high-speed flows, which suggests that CTVD schemes can equally well solve incompressible flows. In this study, the governing equations for incompressible flows include the continuity equation and the momentum equations. The continuity equation is modified by adding a time derivative of the pressure term containing the artificial compressibility. The modified continuity equation, together with the unsteady momentum equations, forms a hyperbolic-parabolic type of time-dependent system of equations, so the CTVD schemes can be implemented. In addition, the boundary conditions, including physical and numerical boundary conditions, must be properly specified to obtain an accurate solution. The CFD code for this research is currently in progress. Flow past a circular cylinder will be used for numerical experiments to determine the accuracy and efficiency of the code before applying it to more specific applications.
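For reference, the pseudocompressibility modification described above is conventionally written in the following standard textbook form (the abstract itself does not give the equations; beta is the artificial compressibility parameter, nu the kinematic viscosity, and density is taken as unity):

\frac{1}{\beta}\,\frac{\partial p}{\partial t} + \nabla\cdot\mathbf{u} = 0,
\qquad
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  = -\nabla p + \nu\,\nabla^{2}\mathbf{u}.

The added pressure time derivative turns the otherwise elliptic incompressible system into a hyperbolic-parabolic one, which is what allows the compressible-flow CTVD schemes to be reused.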
SIM_ADJUST -- A computer code that adjusts simulated equivalents for observations or predictions
Poeter, Eileen P.; Hill, Mary C.
2008-01-01
This report documents the SIM_ADJUST computer code. SIM_ADJUST surmounts an obstacle that is sometimes encountered when using universal model analysis computer codes such as UCODE_2005 (Poeter and others, 2005), PEST (Doherty, 2004), and OSTRICH (Matott, 2005; Fredrick and others, 2007). These codes often read simulated equivalents from a list in a file produced by a process model such as MODFLOW that represents a system of interest. At times, values needed by the universal code are missing or assigned default values because the process model could not produce a useful solution. SIM_ADJUST can be used to (1) read a file that lists expected observation or prediction names and possible alternatives for the simulated values; (2) read a file produced by a process model that contains space- or tab-delimited columns, including a column of simulated values and a column of related observation or prediction names; (3) identify observations or predictions that have been omitted or assigned a default value by the process model; and (4) produce an adjusted file that contains a column of simulated values and a column of associated observation or prediction names. The user may provide alternatives that are constant values or alternative simulated values. The user may also provide a sequence of alternatives. For example, the heads from a series of cells may be specified to ensure that a meaningful value is available to compare with an observation located in a cell that may become dry. SIM_ADJUST is constructed using modules from the JUPITER API and is intended for use on any computer operating system. SIM_ADJUST consists of algorithms programmed in Fortran90, which efficiently perform numerical calculations.
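SIM_ADJUST itself is a Fortran 90 code; the short Python sketch below (file names, column layout, and the default sentinel value are assumptions made for illustration) mimics steps (1) through (4): it reads a table of expected names with fallback values, reads a process-model output table, and writes an adjusted table in which missing or defaulted entries are replaced.

# Illustrative sketch of the SIM_ADJUST workflow, not the USGS code itself.
# expected.txt : lines of "name fallback_value"
# model_out.txt: lines of "simulated_value name" (space or tab delimited)
DEFAULT = -999.0   # assumed sentinel written by the process model on failure

def read_expected(path):
    with open(path) as f:
        return {name: float(fallback) for name, fallback in
                (line.split() for line in f if line.strip())}

def read_model_output(path):
    sims = {}
    with open(path) as f:
        for line in f:
            if line.strip():
                value, name = line.split()
                sims[name] = float(value)
    return sims

def adjust(expected_path, model_path, out_path):
    expected = read_expected(expected_path)
    sims = read_model_output(model_path)
    with open(out_path, "w") as out:
        for name, fallback in expected.items():
            value = sims.get(name, DEFAULT)
            if value == DEFAULT:          # omitted or defaulted by the model
                value = fallback
            out.write(f"{value} {name}\n")

adjust("expected.txt", "model_out.txt", "adjusted.txt")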
Public domain optical character recognition
NASA Astrophysics Data System (ADS)
Garris, Michael D.; Blue, James L.; Candela, Gerald T.; Dimmick, Darrin L.; Geist, Jon C.; Grother, Patrick J.; Janet, Stanley A.; Wilson, Charles L.
1995-03-01
A public domain document processing system has been developed by the National Institute of Standards and Technology (NIST). The system is a standard reference form-based handprint recognition system for evaluating optical character recognition (OCR), and it is intended to provide a baseline of performance on an open application. The system's source code, training data, performance assessment tools, and type of forms processed are all publicly available. The system recognizes the handprint entered on handwriting sample forms like the ones distributed with NIST Special Database 1. From these forms, the system reads hand-printed numeric fields, upper and lowercase alphabetic fields, and unconstrained text paragraphs comprised of words from a limited-size dictionary. The modular design of the system makes it useful for component evaluation and comparison, training and testing set validation, and multiple system voting schemes. The system contains a number of significant contributions to OCR technology, including an optimized probabilistic neural network (PNN) classifier that operates a factor of 20 times faster than traditional software implementations of the algorithm. The source code for the recognition system is written in C and is organized into 11 libraries. In all, there are approximately 19,000 lines of code supporting more than 550 subroutines. Source code is provided for form registration, form removal, field isolation, field segmentation, character normalization, feature extraction, character classification, and dictionary-based postprocessing. The recognition system has been successfully compiled and tested on a host of UNIX workstations. This paper gives an overview of the recognition system's software architecture, including descriptions of the various system components along with timing and accuracy statistics.
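As a rough illustration of the probabilistic neural network classifier mentioned above (this is the textbook Parzen-window form, not NIST's optimized C implementation; the feature dimension, smoothing parameter, and synthetic data are arbitrary assumptions):

# Textbook probabilistic neural network (PNN): each class density is a sum of
# Gaussian kernels centred on its training patterns; classify by the largest density.
import numpy as np

def pnn_classify(train_x, train_y, x, sigma=0.5):
    scores = {}
    for label in np.unique(train_y):
        members = train_x[train_y == label]
        d2 = np.sum((members - x) ** 2, axis=1)          # squared distances to x
        scores[label] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)

# Tiny synthetic example with two classes of 2-D "feature vectors".
rng = np.random.default_rng(1)
xs = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
ys = np.array([0] * 20 + [1] * 20)
print(pnn_classify(xs, ys, np.array([0.9, 1.1])))   # expected: 1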
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
... (NAICS code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide... pesticide containing streptomycin sulfate, which is also used in human and animal treatment as an antibiotic... which contains the active ingredient, streptomycin sulfate, also used in humans and animals as an...
Standardized Definitions for Code Verification Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
This document contains standardized definitions for several commonly used code verification test problems. The definitions are intended to contain sufficient information to set up a test problem in a computational physics code and to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.
Automatic mathematical modeling for space application
NASA Technical Reports Server (NTRS)
Wang, Caroline K.
1987-01-01
A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.
NASA directives master list and index
NASA Technical Reports Server (NTRS)
1995-01-01
This handbook sets forth, in two parts (Master List of Management Directives and Index to NASA Management Directives), the following information for the guidance of users of the NASA Management Directives System. Chapter 1 contains introductory material on how to use this handbook. Chapter 2 is a complete master list of agencywide management directives, describing each directive by type, number, effective date, expiration date, title, and organization code of the office responsible for the directive. Chapter 3 includes a consolidated numerical list of all delegations of authority and a breakdown of such delegations by the office or center to which special authority is assigned. Chapter 4 sets forth a consolidated list of all NASA handbooks (NHB's) and important footnotes covering the control and ordering of such documents. Chapter 5 is a consolidated list of NASA management directives applicable to the Jet Propulsion Laboratory. Chapter 6 is a consolidated list of NASA regulations published in the Code of Federal Regulations. Chapter 7 is a consolidated list of NASA regulations published in Title 14 of the Code of Federal Regulations. Complementary manuals to the NASA Management Directives System are described in Chapter 8. The second part contains an in-depth alphabetical index to all NASA management directives other than handbooks, most of which are indexed by titles only.
Diffusive deposition of aerosols in Phebus containment during FPT-2 test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontautas, A.; Urbonavicius, E.
2012-07-01
At present, lumped-parameter codes are the main tool for investigating the complex response of a Nuclear Power Plant containment in case of an accident. Continuous development and validation of the codes is required to perform realistic investigation of the processes that determine the possible source term of radioactive products to the environment. Validation of the codes is based on comparison of calculated results with measurements performed in experimental facilities. The most extensive experimental program to investigate fission product release from molten fuel, transport through the cooling circuit and deposition in the containment is performed in the PHEBUS test facility. Test FPT-2, performed in this facility, is considered for analysis of processes taking place in the containment. Earlier investigations using the COCOSYS code showed that the code could be successfully used for analysis of thermal-hydraulic processes and deposition of aerosols, but it was also noticed that diffusive deposition on the vertical walls did not fit well with the measured results. The CPA module of the ASTEC code implements a different model for diffusive deposition; therefore, the PHEBUS containment model was transferred from COCOSYS to ASTEC-CPA to investigate the influence of diffusive deposition modelling. The analysis was performed using a 16-node PHEBUS containment model. The calculated thermal-hydraulic parameters are in good agreement with measured results, which gives a basis for realistic simulation of aerosol transport and deposition processes. The investigations showed that the diffusive deposition model influences the aerosol deposition distribution on different surfaces in the test facility. (authors)
Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)
NASA Technical Reports Server (NTRS)
Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.
2016-01-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify which datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for NEX; however, if users want to move the code to another system, whether it is their home institution cluster, a laptop or the cloud, they have to find, build and install all the required dependencies needed to run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and to reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on the Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is the semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system to Docker, an open-source Linux container platform. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format. We believe this is an important step towards seamless process deployment in heterogeneous environments that will enhance community access to NASA data and tools in a scalable way, promote software reuse, and improve reproducibility of scientific results.
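The abstract does not show what a packaged container looks like; as a purely hypothetical illustration of wrapping a tracked science code for code-to-data migration, the snippet below shells out to the standard docker CLI (the image tag, script name, and data path are invented placeholders and are not part of NEX or OpenNEX).

# Hypothetical wrapper: build and run a provenance-tracked science script in a
# Docker container so it executes unchanged next to the data (paths are placeholders).
import subprocess

IMAGE = "nex-demo/process-ndvi:0.1"       # invented image tag

def build_image(context_dir="."):
    # Assumes context_dir contains a Dockerfile listing the script's dependencies.
    subprocess.run(["docker", "build", "-t", IMAGE, context_dir], check=True)

def run_against_data(data_dir):
    # Mount the (read-only) data directory into the container and run the pipeline step.
    subprocess.run([
        "docker", "run", "--rm",
        "-v", f"{data_dir}:/data:ro",
        IMAGE, "python", "/app/process_ndvi.py", "/data",
    ], check=True)

if __name__ == "__main__":
    build_image()
    run_against_data("/nex/datasets/modis")   # placeholder data location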
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Standardization of grant and contract awardee names has been an area of concern since the development of the Department's Procurement and Assistance Data System (PADS). A joint effort was begun in 1983 by the Office of Scientific and Technical Information (OSTI) and the Office of Procurement and Assistance Management/Information Systems and Analysis Division to develop a means for providing uniformity of awardee names. As a result of this effort, a method of assigning vendor identification codes to each unique awardee name, division, city, and state combination was developed and is maintained by OSTI. Changes to vendor identification codes or awardee names contained in PADS can be made only by OSTI. Awardee names in the Directory indicate that the awardee has had a prime contract (excluding purchase orders of $10,000 or less) with, or a financial assistance award from, the Department. Award status (active, inactive, or retired) is not shown. The Directory is in alphabetic sequence based on awardee name and shows the OSTI-assigned vendor identification code to the right of the name. A vendor identification code is assigned to each unique combination of awardee name, division, city, and state (for place of performance). The same vendor identification code is used for awards throughout the Department.
FRANOPP: Framework for analysis and optimization problems user's guide
NASA Technical Reports Server (NTRS)
Riley, K. M.
1981-01-01
Framework for analysis and optimization problems (FRANOPP) is a software aid for the study and solution of design (optimization) problems which provides the driving program and plotting capability for a user-generated programming system. In addition to FRANOPP, the programming system also contains the optimization code CONMIN and two user-supplied codes, one for analysis and one for output. With FRANOPP the user is provided with five options for studying a design problem. Three of the options utilize the plot capability and present an in-depth study of the design problem. The study can be focused on a history of the optimization process or on the interaction of variables within the design problem.
Planning and Managing Intermodal Transportation Systems: A Guide to ISTEA Requirements
DOT National Transportation Integrated Search
1995-02-01
The profiles contained in the appendix are all in the St. Albans, Vermont district. They are listed below by border groups as used in the study, with the U.S. Customs port codes indicated. Montreal South Frontier Border Crossings: Derby Line, VT (102...
47 CFR 101.521 - Spectrum utilization.
Code of Federal Regulations, 2010 CFR
2010-10-01
... applicants for DEMS frequencies in the 10.6 GHz band must submit as part of the original application a... contain detailed descriptions of the modulation method, the channel time sharing method, any error detecting and/or correcting codes, any spatial frequency reuse system and the total data throughput capacity...
1983-12-01
while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical... systems. The NRLA program is hosted on the CREATE Operating System and contains approximately 5500 lines of computer code. It consists of a main...associated with C alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support
MatProps: Material Properties Database and Associated Access Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durrenberger, J K; Becker, R C; Goto, D M
2007-08-13
Coefficients for analytic constitutive and equation of state (EOS) models, which are used by many hydro codes at LLNL, are currently stored in a legacy material database (Steinberg, UCRL-MA-106349). Parameters for numerous materials are available through this database, including Steinberg-Guinan and Steinberg-Lund constitutive models for metals, JWL equations of state for high explosives, and Mie-Gruneisen equations of state for metals. These constitutive models are used in most of the simulations done by ASC codes today at Livermore. Analytic EOSs are also still used, but have been superseded in many cases by tabular representations in LEOS (http://leos.llnl.gov). Numerous advanced constitutive models have been developed and implemented into ASC codes over the past 20 years. These newer models have more physics and better representations of material strength properties than their predecessors, and therefore more model coefficients. However, a material database of these coefficients is not readily available, so incorporating these coefficients, together with those of the legacy models, into a portable database that could be shared among codes would be most welcome. The goal of this paper is to describe the MatProp effort at LLNL to create such a database and an associated access library that could be used by codes throughout the DOE complex and beyond. We have written an initial version of the MatProp database and access library, and our DOE/ASC code ALE3D (Nichols et al., UCRL-MA-152204) is able to import information from the database. The database, a link to which exists on the Sourceforge server at LLNL, contains coefficients for many materials and models (see Appendix) and includes material parameters in the following categories: flow stress, shear modulus, strength, damage, and equation of state. Future versions of the MatProp database and access library will include the ability to read and write material descriptions that can be exchanged between codes, as well as the ability to perform unit changes, i.e., have the library return parameters in user-specified unit systems. Additional material categories can also be added (e.g., phase-change kinetics). The MatProp database and access library is part of a larger set of tools used at LLNL for assessing material model behavior. One of these is MSlib, a shared constitutive material model library. Another is the Material Strength Database (MSD), which allows users to compare parameter fits for specific constitutive models to available experimental data. Together with MatProp, these tools create a suite of capabilities that provide state-of-the-art models and parameters for those models to integrated simulation codes. This document is broken into several appendices. Appendix A contains a code example to retrieve several material coefficients. Appendix B contains the API for the MatProp data access library. Appendix C contains a list of the material names and model types currently available in the MatProp database. Appendix D contains a list of the parameter names for the currently recognized model types. Appendix E contains a full XML description of the material tantalum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sartori, E.; Roussin, R.W.
This paper presents a brief review of computer codes concerned with checking, plotting, processing and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD/Nuclear Energy Agency. Emphasis is also placed on codes using covariances for specific applications such as uncertainty analysis, data adjustment and data consistency analysis. Recent evaluations contain neutron cross-section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.
Transformation of two and three-dimensional regions by elliptic systems
NASA Technical Reports Server (NTRS)
Mastin, C. Wayne
1994-01-01
Several reports containing the results of our research at the end of this contract period are attached to this document. Three of the reports deal with our work on generating surface grids. One is a preprint of a paper which will appear in the journal Applied Mathematics and Computation. Another is the abstract from a dissertation which has been prepared by Ahmed Khamayseh, a graduate student who has been supported by this grant for the last two years. The last report on surface grids is the extended abstract of a paper to be presented at the 14th IMACS World Congress in July. This report contains results on conformal mappings of surfaces, which are closely related to elliptic methods for surface grid generation. A preliminary report is included on new methods for dealing with block interfaces in multiblock grid systems. The development work is complete and the methods will eventually be incorporated into the National Grid Project (NGP) grid generation code. Thus, the attached report contains only a simple grid system which was used to test the algorithms to prove that the concepts are sound. These developments will greatly aid grid control when using elliptic systems and prevent unwanted grid movement. The last report is a brief summary of some timings that were obtained when the multiblock grid generation code was run on the Intel IPSC/860 hypercube. Since most of the data in a grid code is local to a particular block, only a small fraction of the total data must be passed between processors. The data is also distributed among the processors so that the total size of the grid can be increased along with the number of processors. This work is only in a preliminary stage. However, one of the ERC graduate students has taken an interest in the project and is presently extending these results as a part of his master's thesis.
Code of Federal Regulations, 2014 CFR
2014-07-01
... maintenance of aerial photographic records? (a) Mark each aerial film container with a unique identification code to facilitate identification and filing. (b) Mark aerial film indexes with the unique aerial film identification codes or container codes for the aerial film that they index. Also, file and mark the aerial...
Code of Federal Regulations, 2013 CFR
2013-07-01
... maintenance of aerial photographic records? (a) Mark each aerial film container with a unique identification code to facilitate identification and filing. (b) Mark aerial film indexes with the unique aerial film identification codes or container codes for the aerial film that they index. Also, file and mark the aerial...
Code of Federal Regulations, 2012 CFR
2012-07-01
... maintenance of aerial photographic records? (a) Mark each aerial film container with a unique identification code to facilitate identification and filing. (b) Mark aerial film indexes with the unique aerial film identification codes or container codes for the aerial film that they index. Also, file and mark the aerial...
Code of Federal Regulations, 2011 CFR
2011-07-01
... maintenance of aerial photographic records? (a) Mark each aerial film container with a unique identification code to facilitate identification and filing. (b) Mark aerial film indexes with the unique aerial film identification codes or container codes for the aerial film that they index. Also, file and mark the aerial...
Code of Federal Regulations, 2010 CFR
2010-07-01
... maintenance of aerial photographic records? (a) Mark each aerial film container with a unique identification code to facilitate identification and filing. (b) Mark aerial film indexes with the unique aerial film identification codes or container codes for the aerial film that they index. Also, file and mark the aerial...
Viewpoint: a comparison of cause-of-injury coding in U.S. military and civilian hospitals.
Amoroso, P J; Bell, N S; Smith, G S; Senier, L; Pickett, D
2000-04-01
Complete and accurate coding of injury causes is essential to the understanding of injury etiology and to the development and evaluation of injury-prevention strategies. While civilian hospitals use ICD-9-CM external cause-of-injury codes, military hospitals use codes derived from the NATO Standardization Agreement (STANAG) 2050. The STANAG uses two separate variables to code injury cause. The Trauma code uses a single digit with 10 possible values to identify the general class of injury as battle injury, intentionally inflicted nonbattle injury, or unintentional injury. The Injury code is used to identify cause or activity at the time of the injury. For a subset of the Injury codes, the last digit is modified to indicate place of occurrence. This simple system contains fewer than 300 basic codes, including many that are specific to battle- and sports-related injuries not coded well by either the ICD-9-CM or the draft ICD-10-CM. However, while falls, poisonings, and injuries due to machinery and tools are common causes of injury hospitalizations in the military, few STANAG codes correspond to these events. Intentional injuries in general and sexual assaults in particular are also not well represented in the STANAG. Because the STANAG does not map directly to the ICD-9-CM system, quantitative comparisons between military and civilian data are difficult. The ICD-10-CM, which will be implemented in the United States sometime after 2001, expands considerably on its predecessor, ICD-9-CM, and provides more specificity and detail than the STANAG. With slight modification, it might become a suitable replacement for the STANAG.
ERIC Educational Resources Information Center
Iowa State Dept. of Public Instruction, Des Moines. Area Schools and Career Education Branch.
A selected summary is provided of: (1) Iowa school laws relating to area schools; and (2) standards for area community colleges and area vocational schools. Part one contains the specific Chapters (280A and 286A, Code of Iowa) which pertain to the area schools. Chapter 280A deals with the organization of the school and county systems; plans for…
Distributed intelligent control and status networking
NASA Technical Reports Server (NTRS)
Fortin, Andre; Patel, Manoj
1993-01-01
Over the past two years, the Network Control Systems Branch (Code 532) has been investigating control and status networking technologies. These emerging technologies use distributed processing over a network to accomplish a particular custom task. These networks consist of small intelligent 'nodes' that perform simple tasks. Containing simple, inexpensive hardware and software, these nodes can be easily developed and maintained. Once networked, the nodes can perform a complex operation without a central host. This type of system provides an alternative to more complex control and status systems which require a central computer. This paper will provide some background and discuss some applications of this technology. It will also demonstrate the suitability of one particular technology for the Space Network (SN) and discuss the prototyping activities of Code 532 utilizing this technology.
MELCOR Analysis of OSU Multi-Application Small Light Water Reactor (MASLWR) Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Dhongik S.; Jo, HangJin; Fu, Wen
A multi-application small light water reactor (MASLWR) conceptual design was developed by Oregon State University (OSU) with emphasis on passive safety systems. The passive containment safety system employs condensation and natural circulation to achieve the necessary heat removal from the containment in case of postulated accidents. Containment condensation experiments at the MASLWR test facility at OSU are modeled and analyzed with MELCOR, a system-level reactor accident analysis computer code. The analysis assesses its ability to predict condensation heat transfer in the presence of noncondensable gas for accidents where high-energy steam is released into the containment. This work demonstrates MELCOR's ability to predict the pressure-temperature response of the scaled containment. Our analysis indicates that the heat removal rates are underestimated in the experiment due to the limited locations of the thermocouples, and applies corrections to these measurements by conducting integral energy analyses along with CFD simulation for confirmation. Furthermore, the corrected heat removal rate measurements and the MELCOR predictions of the heat removal rate from the containment show good agreement with the experimental data.
Digital Equivalent Data System for XRF Labeling of Objects
NASA Technical Reports Server (NTRS)
Schramm, Harry F.; Kaiser, Bruce
2005-01-01
A digital equivalent data system (DEDS) is a system for identifying objects by means of the x-ray fluorescence (XRF) spectra of labeling elements that are encased in or deposited on the objects. As such, a DEDS is a revolutionary new major subsystem of an XRF system. A DEDS embodies the means for converting the spectral data output of an XRF scanner to an ASCII alphanumeric or barcode label that can be used to identify (or verify the assumed or apparent identity of) an XRF-scanned object. A typical XRF spectrum of interest contains peaks at photon energies associated with specific elements of the Periodic Table (see figure). The height of each spectral peak above the local background spectral intensity is proportional to the relative abundance of the corresponding element. Alphanumeric values are assigned to the relative abundances of the elements. Hence, if an object contained labeling elements in suitably chosen proportions, an alphanumeric representation of the object could be extracted from its XRF spectrum. The mixture of labeling elements, and the scheme for reading the XRF spectrum, would be compatible with one of the labeling conventions now used for bar codes and binary matrix patterns (essentially, two-dimensional bar codes that resemble checkerboards). A further benefit of such compatibility is that it would enable the conversion of the XRF spectral output to a bar- or matrix-coded label, if needed. In short, a process previously used only for material composition analysis has been reapplied to the world of identification. This new level of verification is now being used for "authentication."
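The article does not specify the encoding convention; as a purely hypothetical sketch of the idea, the snippet below quantizes the relative abundances obtained from spectral peak heights into the digits of a label (the element list, quantization levels, and peak values are invented for illustration).

# Hypothetical XRF labeling sketch: quantize relative peak heights of chosen
# labeling elements into one digit each, forming an alphanumeric-style label.
LABEL_ELEMENTS = ["Zn", "Cu", "Ni", "Fe"]   # assumed labeling elements
LEVELS = 10                                  # one decimal digit per element

def peaks_to_label(peak_heights):
    # peak_heights: background-subtracted peak height per labeling element
    total = sum(peak_heights[e] for e in LABEL_ELEMENTS)
    digits = []
    for e in LABEL_ELEMENTS:
        rel = peak_heights[e] / total                    # relative abundance
        digits.append(str(min(LEVELS - 1, int(rel * LEVELS))))
    return "".join(digits)

# Example spectrum readout (arbitrary numbers).
spectrum = {"Zn": 120.0, "Cu": 40.0, "Ni": 25.0, "Fe": 15.0}
print(peaks_to_label(spectrum))   # "6210"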
Resources for comparing the speed and performance of medical autocoders.
Berman, Jules J
2004-06-15
Concept indexing is a popular method for characterizing medical text, and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all equivalent representations. A concept search on the term renal cell carcinoma would be expected to find occurrences of hypernephroma, and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources to compare speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain-text into a listing of all 1,2,3, and 4-word strings contained in text, assigning a nomenclature code for text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS). The unencumbered subset of UMLS was reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about a half million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ Megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ Megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per Megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript. A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.
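The released autocoder is a 38-line Perl script; the Python sketch below (with a toy two-entry nomenclature and placeholder concept codes standing in for the UMLS-derived dictionary) illustrates the same lexical strategy of scanning every 1- to 4-word string and reporting those that match coded terms.

# Illustrative lexical autocoder in the spirit described above (not the Perl original):
# scan every 1- to 4-word window and report windows found in the nomenclature.
import re

# Toy term -> code dictionary; the codes here are placeholders, not real UMLS CUIs.
NOMENCLATURE = {
    "renal cell carcinoma": "C0000001",
    "hypernephroma": "C0000001",
}

def autocode(text, max_words=4):
    words = re.findall(r"[a-z0-9]+", text.lower())
    hits = []
    for i in range(len(words)):
        for n in range(1, max_words + 1):
            phrase = " ".join(words[i:i + n])
            if phrase in NOMENCLATURE:
                hits.append((phrase, NOMENCLATURE[phrase]))
    return hits

sample = "Biopsy confirmed renal cell carcinoma; prior note mentioned hypernephroma."
print(autocode(sample))
# [('renal cell carcinoma', 'C0000001'), ('hypernephroma', 'C0000001')]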
Design of an integrated airframe/propulsion control system architecture
NASA Technical Reports Server (NTRS)
Cohen, Gerald C.; Lee, C. William; Strickland, Michael J.
1990-01-01
The design of an integrated airframe/propulsion control system architecture is described. The design is based on a prevalidation methodology that used both reliability and performance tools. An account is given of the motivation for the final design and problems associated with both reliability and performance modeling. The appendices contain a listing of the code for both the reliability and performance model used in the design.
A trend analysis of surgical operations under a global payment system in Tehran, Iran (2005–2015)
Goudari, Faranak Behzadi; Rashidian, Arash; Arab, Mohammad; Mahmoudi, Mahmood
2018-01-01
Background The global payment system is a first example of a per-case payment system in Iran; it contains 60 commonly used surgical operations for which payment is based on the average cost per case. Objective The aim of the study was to determine whether the trends of global operations decreased, increased, or remained unchanged. Methods In this retrospective longitudinal study, data on the 60 primary global surgery codes were gathered from the Tehran Health Insurance Organization for each month within the ten-year period of 2005-2015. Out of the 60 surgery codes, acceptable data were available for only 46 codes, based on the insurance documents sent by medical centers. A quantitative analysis of the time series through a regression analysis model using STATA software v.11 was performed. Results Some global surgery codes had an upward trend and some were downward. Of the N codes, N83, N20, N28, N63, and N93 had an upward trend (p<0.05) and N32, N43, N81 and N90 showed a significant downward trend (p<0.05). Similarly, all H codes except for H18 had a significant upward trend (p<0.000). The K codes K45, K56 and K81 also had an increasing trend. S codes experienced both increasing and decreasing trends. However, none of the O codes changed over time. Other global surgical codes such as C61, E07, M51, L60, J98 (p<0.000), I84 (p<0.031) and I86 (p<0.000) showed upward or downward trends. The total global surgeries trend was significantly upward (B=24.26109, p<0.000). Conclusion The varying trends of global surgeries can partly reflect the behavior of service providers seeking to increase their profits and minimize their costs. PMID:29765576
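As a rough illustration of the kind of monthly-count trend test reported above (this is not the authors' STATA analysis, and the counts below are invented), a simple ordinary least squares fit of counts against the month index yields the slope B and its p-value.

# Illustrative monthly trend test: fit surgery counts against month index with
# ordinary least squares and report slope B and the p-value of the trend.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
months = np.arange(120)                                   # ten years of monthly data
counts = 200 + 1.5 * months + rng.normal(0, 20, 120)      # invented upward-trending counts

fit = linregress(months, counts)
print(f"B = {fit.slope:.2f}, p = {fit.pvalue:.4f}")       # positive slope, small p -> upward trend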
Rapidly deployable emergency communication system
Gladden, Charles A.; Parelman, Martin H.
1979-01-01
A highly versatile, highly portable emergency communication system which permits deployment in a very short time to cover both wide areas and distant isolated areas depending upon mission requirements. The system employs a plurality of lightweight, fully self-contained repeaters which are deployed within the mission area to provide communication between field teams, and between each field team and a mobile communication control center. Each repeater contains a microcomputer controller, the program for which may be changed from the control center by the transmission of digital data within the audible range (300-3,000 Hz). Repeaters are accessed by portable/mobile transceivers, other repeaters, and the control center through the transmission and recognition of digital data code words in the subaudible range.
The British Film Catalogue: 1895-1970.
ERIC Educational Resources Information Center
Gifford, Denis
This reference book catalogues nearly every commercial film produced in Britain for public entertainment from 1895 to 1970. The entries are listed chronologically by year and month. Each entry is limited to a single film and contains a cross index code number, exhibition date, main title, length, color system, production company, distribution…
Metric. Career Education Program.
ERIC Educational Resources Information Center
Salem City Schools, NJ.
This is a compilation of instructional materials to assist teachers and students in learning about the metric system. Contents are organized into four color-coded sections containing the following: (1) background and reference materials for the teacher, including a list of available media and a conversion chart; (2) metric activities for primary…
Real time programming environment for Windows, Appendix A
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-04-05
This appendix contains all source code for the RTProE system. The following file contents are included: pdb.h; hgen.l; hgen.y; igen.l; igen.y; pdm.l; pdm.y; rtdata.l; rtdata.y; framegen.c; librt.c; librt.h; rtsched.c; build.tsh; sde.tcl; rtsched.def.
40 CFR 370.42 - What is Tier II inventory information?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Confidential and Non-Confidential Information Sheets and all attachments. All other pages must also contain..., the city, county, State and zip code. (d) The North American Industry Classification System (NAICS... “confidential.” You must provide the confidential location information on a separate sheet from the other Tier...
40 CFR 370.42 - What is Tier II inventory information?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Confidential and Non-Confidential Information Sheets and all attachments. All other pages must also contain..., the city, county, State and zip code. (d) The North American Industry Classification System (NAICS... “confidential.” You must provide the confidential location information on a separate sheet from the other Tier...
Sollie, Annet; Sijmons, Rolf H; Helsper, Charles; Numans, Mattijs E
2017-03-01
To assess the quality and reusability of coded cancer diagnoses in routine primary care data, and to identify factors that influence data quality and areas for improvement, a dynamic cohort study was performed in a Dutch network database containing 250,000 anonymized electronic medical records (EMRs) from 52 general practices. Coded data from 2000 to 2011 for the three most common cancer types (breast, colon and prostate cancer) were compared to the Netherlands Cancer Registry. Data quality is expressed in Standardized Incidence Ratios (SIRs): the ratio between the number of coded cases observed in the primary care network database and the expected number of cases based on the Netherlands Cancer Registry. Ratios were multiplied by 100% for readability. The overall SIR was 91.5% (95%CI 88.5-94.5) and showed improvement over the years. SIRs differ between cancer types: from 71.5% for colon cancer in males to 103.9% for breast cancer. There are differences in data quality (SIRs 76.2%-99.7%) depending on the EMR system used, with SIRs up to 232.9% for breast cancer. Frequently observed errors in routine healthcare data can be classified as: lack of integrity checks, inaccurate use and/or lack of codes, and lack of EMR system functionality. Re-users of coded routine primary care EMR data should be aware that 30% of cancer cases can be missed, and up to 130% of cancer cases found in the EMR data can be false positive. The type of EMR system and the type of cancer influence the quality of the coded diagnosis registry. While data quality can be improved (e.g. through improving system design and by training EMR system users), re-use should be undertaken only by appropriately trained experts. Copyright © 2016. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-06-01
In this compendium, each profile of a nuclear facility is a capsule summary of pertinent facts regarding that particular installation. The facilities described include the entire fuel cycle in the broadest sense, encompassing resource recovery through waste management. Power plants and all US facilities have been excluded. To facilitate comparison, the profiles have been recorded in a standard format. Because of the breadth of the undertaking, some data fields do not apply to the establishment under discussion and accordingly are blank. The set of nuclear facility profiles occupies four volumes; the profiles are ordered by country name and then by facility code. Each nuclear facility profile volume contains two complete indexes to the information. The first index aggregates the facilities alphabetically by country. It is further organized by category of facility, and then by the four-character facility code. It provides a quick summary of the nuclear energy capability or interest in each country and also an identifier, the facility code, which can be used to access the information contained in the profile.
NASA Technical Reports Server (NTRS)
Chakrabarti, S.; Martin, J. J.; Pearson, J. B.; Lewis, R. A.
2003-01-01
The NASA MSFC Propulsion Research Center (PRC) is conducting a research activity examining the storage of low-energy antiprotons. The High Performance Antiproton Trap (HiPAT) is an electromagnetic system (Penning-Malmberg design) consisting of a 4 Tesla superconductor, a high-voltage confinement electrode system, and an ultra-high-vacuum test section, designed with an ultimate goal of maintaining charged particles with a half-life of 18 days. Currently, this system is being experimentally evaluated using normal-matter ions, which are cheap to produce, relatively easy to handle, and provide a good indication of overall trap behavior, with the exception of assessing annihilation losses. Computational particle-in-cell plasma modeling using the XOOPIC code is supplementing the experiments. Differing electrode voltage configurations are employed to contain charged particles, typically using flat, modified flat and harmonic potential wells. Ion cloud oscillation frequencies are obtained experimentally by amplification of signals induced on the electrodes by the particle motions. XOOPIC simulations show that, for given electrode voltage configurations, the calculated charged-particle oscillation frequencies are close to experimental measurements. As a two-dimensional axisymmetric code, XOOPIC cannot model azimuthal plasma variations, such as those induced by radio-frequency (RF) modulation of the central quadrupole electrode in experiments designed to enhance ion cloud containment. However, XOOPIC can model analytically varying electric potential boundary conditions and particle velocity initial conditions. Application of these conditions produces ion cloud axial and radial oscillation frequency modes of interest in achieving the goal of optimizing HiPAT for reliable containment of antiprotons.
Transformable Rhodobacter strains, method for producing transformable Rhodobacter strains
Laible, Philip D.; Hanson, Deborah K.
2018-05-08
The invention provides an organism for expressing foreign DNA, the organism engineered to accept standard DNA carriers. The genome of the organism codes for intracytoplasmic membranes and features an interruption in at least one of the genes coding for restriction enzymes. Further provided is a system for producing biological materials comprising: selecting a vehicle to carry DNA which codes for the biological materials; determining sites on the vehicle's DNA sequence susceptible to restriction enzyme cleavage; choosing an organism to accept the vehicle based on that organism not acting upon at least one of said vehicle's sites; engineering said vehicle to contain said DNA, thereby creating a synthetic vector; and causing the synthetic vector to enter the organism so as to cause expression of said DNA.
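As a hypothetical illustration of the "determining sites susceptible to restriction enzyme cleavage" step, the sketch below scans a carrier sequence for the recognition sites of a few common restriction enzymes (the recognition sequences shown are the standard ones; the vector fragment itself is invented, and the patent does not prescribe any particular code).

# Hypothetical sketch: find restriction-enzyme recognition sites on a carrier DNA
# sequence so an organism can be chosen that does not cut at least one of them.
RECOGNITION_SITES = {
    "EcoRI": "GAATTC",
    "BamHI": "GGATCC",
    "HindIII": "AAGCTT",
}

def find_sites(sequence):
    sequence = sequence.upper()
    hits = {}
    for enzyme, site in RECOGNITION_SITES.items():
        positions, start = [], sequence.find(site)
        while start != -1:
            positions.append(start)
            start = sequence.find(site, start + 1)
        hits[enzyme] = positions
    return hits

# Invented vector fragment: one EcoRI site, no BamHI or HindIII sites.
vector = "ATGCGAATTCTTAGCCGTACGGATCAAGCTAGC"
print(find_sites(vector))   # {'EcoRI': [4], 'BamHI': [], 'HindIII': []}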
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Johnson, Seth R; Prokopenko, Andrey V
'ForTrilinos' is related to The Trilinos Project, which contains a large and growing collection of solver capabilities that can utilize next-generation platforms, in particular scalable multicore, manycore, accelerator and heterogeneous systems. Trilinos is primarily written in C++, including its user interfaces. While C++ is advantageous for gaining access to the latest programming environments, it limits Trilinos usage via Fortran. Several ad hoc translation interfaces exist to enable Fortran usage of Trilinos, but none of these interfaces is general-purpose or written for reusable and sustainable external use. 'ForTrilinos' provides a seamless pathway for large and complex Fortran-based codes to access Trilinos without C/C++ interface code. This access includes Fortran versions of Kokkos abstractions for code execution and data management.
Automatic Residential/Commercial Classification of Parcels with Solar Panel Detections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; Omitaomu, Olufemi A; Kotikot, Susan
A computational method to automatically detect solar panels on rooftops to aid policy and financial assessment of solar distributed generation. The code automatically classifies parcels containing solar panels in the U.S. as residential or commercial. The code allows the user to specify an input dataset containing parcels and detected solar panels, and then uses information about the parcels and solar panels to automatically classify the rooftops as residential or commercial using machine learning techniques. The zip file containing the code includes sample input and output datasets for the Boston and DC areas.
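The record does not name the specific machine learning technique used; the sketch below (the feature names, training values, and choice of a random forest are assumptions made for illustration) shows one plausible setup for classifying parcels with detected panels as residential or commercial.

# Illustrative parcel classifier (the actual code's features and model are not
# specified in the record; the features and data here are invented placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Invented features per parcel: [parcel_area_m2, building_footprint_m2, n_panels]
X_train = np.array([
    [600,   150,  1], [900,   200,  2], [750,   180,  1],    # residential examples
    [5000, 2000, 40], [8000, 3500, 60], [6500, 2800, 55],    # commercial examples
])
y_train = np.array(["residential"] * 3 + ["commercial"] * 3)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

new_parcels = np.array([[850, 190, 2], [7200, 3100, 50]])
print(model.predict(new_parcels))   # expected: ['residential' 'commercial']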
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sienicki, J.J.
A fast-running and simple computer code has been developed to calculate pressure loadings inside light water reactor containments/confinements under loss-of-coolant accident conditions. PACER was originally developed to calculate containment/confinement pressure and temperature time histories for loss-of-coolant accidents in Soviet-designed VVER reactors and is relevant to the activities of the US International Nuclear Safety Center. The code employs a multicompartment representation of the containment volume and is focused upon application to early-time containment phenomena during and immediately following blowdown. PACER has been developed for FORTRAN 77 and earlier versions of FORTRAN. The code has been successfully compiled and executed on SUN SPARC and Hewlett-Packard HP-735 workstations, provided that appropriate compiler options are specified. The code incorporates both capabilities built around a hardwired default generic VVER-440 Model V230 design and fairly general user-defined input. However, array dimensions are hardwired and must be changed by modifying the source code if the number of compartments/cells differs from the default number of nine. Detailed input instructions are provided as well as a description of outputs. Input files and selected output are presented for two sample problems run on both HP-735 and SUN SPARC workstations.
Lee, Hwan Young; Yoo, Ji-Eun; Park, Myung Jin; Chung, Ukhee; Kim, Chong-Youl; Shin, Kyoung-Jin
2006-11-01
The present study analyzed 21 coding region SNP markers and one deletion motif for the determination of East Asian mitochondrial DNA (mtDNA) haplogroups by designing three multiplex systems which apply single base extension methods. Using two multiplex systems, all 593 Korean mtDNAs were allocated into 15 haplogroups: M, D, D4, D5, G, M7, M8, M9, M10, M11, R, R9, B, A, and N9. As the D4 haplotypes occurred most frequently in Koreans, the third multiplex system was used to further define D4 subhaplogroups: D4a, D4b, D4e, D4g, D4h, and D4j. This method allowed the complementation of coding region information with control region mutation motifs and the resultant findings also suggest reliable control region mutation motifs for the assignment of East Asian mtDNA haplogroups. These three multiplex systems produce good results in degraded samples as they contain small PCR products (101-154 bp) for single base extension reactions. SNP scoring was performed in 101 old skeletal remains using these three systems to prove their utility in degraded samples. The sequence analysis of mtDNA control region with high incidence of haplogroup-specific mutations and the selective scoring of highly informative coding region SNPs using the three multiplex systems are useful tools for most applications involving East Asian mtDNA haplogroup determination and haplogroup-directed stringent quality control.
MELCOR/CONTAIN LMR Implementation Report. FY14 Progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L; Louie, David L.Y.
2014-10-01
This report describes the preliminary implementation of the sodium thermophysical properties and the design documentation for the sodium models of CONTAIN-LMR to be implemented into MELCOR 2.1. In the past year, the implementation included two separate sodium property sets from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid to model nuclear fusion safety research. To minimize the impact on MELCOR, the implementation of the fusion safety database (FSD) was done by utilizing detection of the data input file as a way to invoke the FSD. The FSD methodology has been adapted for the current work, but it may be subject to modification as the project continues. The second source uses properties generated for the SIMMER code. Preliminary testing and results from this implementation of sodium properties are given. This year, the design document for the CONTAIN-LMR sodium models, such as the two-condensable option, sodium spray fire, and sodium pool fire, is being developed. This design document is intended to serve as a guide for the MELCOR implementation. In addition, the CONTAIN-LMR code used was based on an earlier version of the CONTAIN code, and many physical models developed since that version may not be captured by the code. Although CONTAIN 2, which represents the latest development of CONTAIN, contains some sodium-specific models, these models are incomplete; therefore, CONTAIN 2 with all sodium models implemented from CONTAIN-LMR should be used as a comparison code for MELCOR. This implementation should be completed early next year, while sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use.
PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Lin, Lianshan
2013-01-01
To support the ASME Boiler and Pressure Vessel Codes and Standards (BPVC) in the modern information era, development of a web-based materials property database is initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information, and Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Code and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.
Exposure calculation code module for reactor core analysis: BURNER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Cunningham, G.W.
1979-02-01
The code module BURNER for nuclear reactor exposure calculations is presented. The computer requirements are shown, as are the reference data and interface data file requirements, and the programmed equations and procedure of calculation are described. The operating history of a reactor is followed over the period between solutions of the space, energy neutronics problem. The end-of-period nuclide concentrations are determined given the necessary information. A steady-state, continuous-fueling model is treated in addition to the usual fixed-fuel model. The control options provide flexibility to select among an unusually wide variety of programmed procedures. The code also provides user options to make a number of auxiliary calculations and print such information as the local gamma source, cumulative exposure, and a fine-scale power density distribution in a selected zone. The code is used locally in a system for computation which contains the VENTURE diffusion theory neutronics code and other modules.
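The exposure step described above advances nuclide concentrations over the interval between neutronics solutions. As a hedged sketch only (the abstract does not state the equation, and BURNER's actual formulation may differ), the underlying depletion balance for nuclide i under a constant one-group flux can be written in LaTeX form as

\frac{dN_i}{dt} = \sum_{j \ne i} \left( \lambda_{j \to i} + \phi\, \sigma_{j \to i} \right) N_j - \left( \lambda_i + \phi\, \sigma_{a,i} \right) N_i

where N_i is the nuclide concentration, \lambda_{j \to i} are decay branching constants, \sigma_{j \to i} are one-group transmutation cross sections collapsed with the flux \phi, and \sigma_{a,i} is the absorption cross section of nuclide i.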
MELCOR/CONTAIN LMR Implementation Report - FY16 Progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
2016-11-01
This report describes the progress of the CONTAIN-LMR sodium physics and chemistry models to be implemented in MELCOR 2.1. In the past three years, the implementation included the addition of sodium equations of state and sodium properties from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid to model nuclear fusion safety research. The second source uses properties generated for the SIMMER code. The implemented modeling has been tested and results are reported in this document. In addition, the CONTAIN-LMR code was derived from an early version of the CONTAIN code, and many physical models developed since that early version are not available in it. Therefore, CONTAIN 2 has been updated with the sodium models in CONTAIN-LMR as CONTAIN2-LMR, which may be used to provide code-to-code comparison with CONTAIN-LMR and MELCOR once the sodium chemistry models from CONTAIN-LMR have been completed. Both the spray fire and pool fire chemistry routines from CONTAIN-LMR have been integrated into MELCOR 2.1, and debugging and testing are in progress. Because MELCOR only models the equation of state for the liquid and gas phases of the coolant, a modeling gap still exists when dealing with experiments or accident conditions in which the ambient temperature is below the freezing point of sodium. An alternative method is under investigation to overcome this gap. We are no longer working on a branch separate from the main branch of MELCOR 2.1, since the major modeling work in MELCOR 2.1 has been completed. At the current stage, the newly implemented sodium chemistry models will be part of the main MELCOR release version (MELCOR 2.2). This report discusses the accomplishments and issues relating to the implementation, and reports on the planned completion of all remaining tasks in the upcoming FY2017, including the atmospheric chemistry model and sodium-concrete interaction model implementation.
Skoblikow, Nikolai E; Zimin, Andrei A
2016-05-01
The hypothesis of direct coding, assuming the direct contact of pairs of coding molecules with amino acid side chains in hollow unit cells (cellules) of a regular crystal-structure mineral is proposed. The coding nucleobase-containing molecules in each cellule (named "lithocodon") partially shield each other; the remaining free space determines the stereochemical character of the filling side chain. Apatite-group minerals are considered as the most preferable for this type of coding (named "lithocoding"). A scheme of the cellule with certain stereometric parameters, providing for the isomeric selection of contacting molecules is proposed. We modelled the filling of cellules with molecules involved in direct coding, with the possibility of coding by their single combination for a group of stereochemically similar amino acids. The regular ordered arrangement of cellules enables the polymerization of amino acids and nucleobase-containing molecules in the same direction (named "lithotranslation") preventing the shift of coding. A table of the presumed "LithoCode" (possible and optimal lithocodon assignments for abiogenically synthesized α-amino acids involved in lithocoding and lithotranslation) is proposed. The magmatic nature of the mineral, abiogenic synthesis of organic molecules and polymerization events are considered within the framework of the proposed "volcanic scenario".
User's manual: Subsonic/supersonic advanced panel pilot code
NASA Technical Reports Server (NTRS)
Moran, J.; Tinoco, E. N.; Johnson, F. T.
1978-01-01
Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation and it should not be construed to represent a finished production program. The pilot code is based on a higher order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses an overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.
XSECT: A computer code for generating fuselage cross sections - user's manual
NASA Technical Reports Server (NTRS)
Ames, K. R.
1982-01-01
A computer code, XSECT, has been developed to generate fuselage cross sections from a given area distribution and wing definition. The cross sections are generated to match the wing definition while conforming to the area requirement. An iterative procedure is used to generate each cross section. Fuselage area balancing may be included in this procedure if desired. The code is intended as an aid for engineers who must first design a wing under certain aerodynamic constraints and then design a fuselage for the wing such that the constraints remain satisfied. This report contains the information necessary for accessing and executing the code, which is written in FORTRAN to execute on the Cyber 170 series computers (NOS operating system) and produces graphical output for a Tektronix 4014 CRT. The LRC graphics software is used in combination with the interface between this software and the PLOT 10 software.
NASA Astrophysics Data System (ADS)
Ma, Fanghui; Gao, Jian; Fu, Fang-Wei
2018-06-01
Let R={F}_q+v{F}_q+v2{F}_q be a finite non-chain ring, where q is an odd prime power and v^3=v. In this paper, we propose two methods of constructing quantum codes from (α +β v+γ v2)-constacyclic codes over R. The first one is obtained via the Gray map and the Calderbank-Shor-Steane construction from Euclidean dual-containing (α +β v+γ v2)-constacyclic codes over R. The second one is obtained via the Gray map and the Hermitian construction from Hermitian dual-containing (α +β v+γ v2)-constacyclic codes over R. As an application, some new non-binary quantum codes are obtained.
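For orientation, the two constructions named above can be summarized in their generic forms (a hedged sketch; the paper's specific Gray map and ring decomposition are not reproduced here):

Euclidean (CSS) case: C^{\perp} \subseteq C, \; C = [n, k, d]_q \;\Rightarrow\; [[n, 2k - n, \ge d]]_q
Hermitian case: C^{\perp_H} \subseteq C \subseteq F_{q^2}^n, \; C = [n, k, d]_{q^2} \;\Rightarrow\; [[n, 2k - n, \ge d]]_q

The paper applies these constructions to the Gray images of dual-containing constacyclic codes over R.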
Identification of common, unique and polymorphic microsatellites among 73 cyanobacterial genomes.
Kabra, Ritika; Kapil, Aditi; Attarwala, Kherunnisa; Rai, Piyush Kant; Shanker, Asheesh
2016-04-01
Microsatellites, also known as Simple Sequence Repeats, are short tandem repeats of 1-6 nucleotides. These repeats are found in coding as well as non-coding regions of both prokaryotic and eukaryotic genomes and play a significant role in the study of gene regulation, genetic mapping, DNA fingerprinting and evolutionary studies. The availability of 73 complete genome sequences of cyanobacteria enabled us to mine and statistically analyze microsatellites in these genomes. The cyanobacterial microsatellites identified through bioinformatics analysis were stored in a user-friendly database named CyanoSat, which is an efficient data representation and query system designed using ASP.net. The information in CyanoSat comprises perfect, imperfect and compound microsatellites found in coding, non-coding and coding-non-coding regions. Moreover, it contains PCR primers with 200-nucleotide-long flanking regions. The mined cyanobacterial microsatellites can be freely accessed at www.compubio.in/CyanoSat/home.aspx. In addition, 82 polymorphic, 13,866 unique and 2,390 common microsatellites were detected. These microsatellites will be useful in strain identification and genetic diversity studies of cyanobacteria.
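As an illustrative sketch only (not the CyanoSat pipeline, whose production implementation is ASP.net-based, and with assumed minimum repeat thresholds), perfect microsatellites of motif length 1-6 nt can be mined from a sequence with a simple regular-expression scan in Python:

import re

# Assumed minimum tandem-repeat counts per motif length (1-6 nt); CyanoSat's thresholds may differ.
MIN_REPEATS = {1: 12, 2: 6, 3: 4, 4: 3, 5: 3, 6: 3}

def find_perfect_ssrs(seq):
    """Return (start, motif, repeat_count) tuples for perfect microsatellites in seq."""
    seq = seq.upper()
    hits = []
    for motif_len, min_rep in MIN_REPEATS.items():
        # A motif of the given length repeated at least min_rep times in tandem.
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_rep - 1))
        for m in pattern.finditer(seq):
            motif = m.group(2)
            # Skip motifs that are themselves repeats of a shorter unit (e.g. "AGAG").
            if any(motif == motif[:k] * (motif_len // k)
                   for k in range(1, motif_len) if motif_len % k == 0):
                continue
            hits.append((m.start(1), motif, len(m.group(1)) // motif_len))
    return sorted(hits)

# Example: an (AG)7 repeat embedded in flanking sequence.
print(find_perfect_ssrs("GCTTAGAGAGAGAGAGAGTTC"))   # [(4, 'AG', 7)]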
A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0
NASA Technical Reports Server (NTRS)
DeChant, Lawrence J.; Nadell, Shari-Beth
1999-01-01
A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.
PANDA asymmetric-configuration passive decay heat removal test results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, O.; Dreier, J.; Aubert, C.
1997-12-01
PANDA is a large-scale, low-pressure test facility for investigating passive decay heat removal systems for the next generation of LWRs. In the first series of experiments, PANDA was used to examine the long-term LOCA response of the Passive Containment Cooling System (PCCS) for the General Electric (GE) Simplified Boiling Water Reactor (SBWR). The test objectives include concept demonstration and extension of the database available for qualification of containment codes. Also included is the study of the effects of nonuniform distributions of steam and noncondensable gases in the Dry-well (DW) and in the Suppression Chamber (SC). 3 refs., 9 figs.
A real-time simulator of a turbofan engine
NASA Technical Reports Server (NTRS)
Litt, Jonathan S.; Delaat, John C.; Merrill, Walter C.
1989-01-01
A real-time digital simulator of a Pratt and Whitney F100 engine has been developed for real-time code verification and for actuator diagnosis during full-scale engine testing. This self-contained unit can operate in an open-loop stand-alone mode or as part of a closed-loop control system. It can also be used for control system design and development. Tests conducted in conjunction with the NASA Advanced Detection, Isolation, and Accommodation program show that the simulator is a valuable tool for real-time code verification and as a real-time actuator simulator for actuator fault diagnosis. Although currently a small perturbation model, advances in microprocessor hardware should allow the simulator to evolve into a real-time, full-envelope, full engine simulation.
Zhou, Yuefang; Cameron, Elaine; Forbes, Gillian; Humphris, Gerry
2012-08-01
To develop and validate the St Andrews Behavioural Interaction Coding Scheme (SABICS): a tool to record nurse-child interactive behaviours. The SABICS was developed primarily from observation of video-recorded interactions, and refined through an iterative process of applying the scheme to new data sets. Its practical applicability was assessed via implementation of the scheme on specialised behavioural coding software. Reliability was calculated using Cohen's Kappa. Discriminant validity was assessed using logistic regression. The SABICS contains 48 codes. Fifty-five nurse-child interactions were successfully coded through administering the scheme on The Observer XT8.0 system. Two visualization results of interaction patterns demonstrated the scheme's capability of capturing complex interaction processes. Cohen's Kappa was 0.66 (inter-coder) and 0.88 and 0.78 (two intra-coders). The frequency of nurse behaviours, such as "instruction" (OR = 1.32, p = 0.027) and "praise" (OR = 2.04, p = 0.027), predicted whether a child received the intervention. The SABICS is a unique system to record interactions between dental nurses and 3-5-year-old children. It records and displays complex nurse-child interactive behaviours. It is easily administered and demonstrates reasonable psychometric properties. The SABICS has potential for other paediatric settings. Its development procedure may be helpful for other similar coding scheme development. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Davies, Kalina T J; Tsagkogeorga, Georgia; Rossiter, Stephen J
2014-12-19
The majority of DNA contained within vertebrate genomes is non-coding, with a certain proportion of this thought to play regulatory roles during development. Conserved Non-coding Elements (CNEs) are an abundant group of putative regulatory sequences that are highly conserved across divergent groups and thus assumed to be under strong selective constraint. Many CNEs may contain regulatory factor binding sites, and their frequent spatial association with key developmental genes - such as those regulating sensory system development - suggests crucial roles in regulating gene expression and cellular patterning. Yet surprisingly little is known about the molecular evolution of CNEs across diverse mammalian taxa or their role in specific phenotypic adaptations. We examined 3,110 vertebrate-specific and ~82,000 mammalian-specific CNEs across 19 and 9 mammalian orders, respectively, and tested for changes in the rate of evolution of CNEs located in the proximity of genes underlying the development or functioning of auditory systems. As we focused on CNEs putatively associated with genes underlying the development/functioning of auditory systems, we incorporated echolocating taxa in our dataset because of their highly specialised and derived auditory systems. Phylogenetic reconstructions of concatenated CNEs broadly recovered accepted mammal relationships despite high levels of sequence conservation. We found that CNE substitution rates were highest in rodents and lowest in primates, consistent with previous findings. Comparisons of CNE substitution rates from several genomic regions containing genes linked to auditory system development and hearing revealed differences between echolocating and non-echolocating taxa. Wider taxonomic sampling of four CNEs associated with the homeobox genes Hmx2 and Hmx3 - which are required for inner ear development - revealed family-wise variation across diverse bat species. Specifically, within one family of echolocating bats that utilise frequency-modulated echolocation calls varying widely in frequency and intensity, high levels of sequence divergence were found. Levels of selective constraint acting on CNEs differed both across genomic locations and taxa, with observed variation in substitution rates of CNEs among bat species. More work is needed to determine whether this variation can be linked to echolocation, and wider taxonomic sampling is necessary to fully document levels of conservation in CNEs across diverse taxa.
Matney, Susan; Bakken, Suzanne; Huff, Stanley M
2003-01-01
In recent years, the Logical Observation Identifiers, Names, and Codes (LOINC) Database has been expanded to include assessment items of relevance to nursing and in 2002 met the criteria for "recognition" by the American Nurses Association. Assessment measures in LOINC include those related to vital signs, obstetric measurements, clinical assessment scales, assessments from standardized nursing terminologies, and research instruments. In order for LOINC to be of greater use in implementing information systems that support nursing practice, additional content is needed. Moreover, those implementing systems for nursing practice must be aware of the manner in which LOINC codes for assessments can be appropriately linked with other aspects of the nursing process such as diagnoses and interventions. Such linkages are necessary to document nursing contributions to healthcare outcomes within the context of a multidisciplinary care environment and to facilitate building of nursing knowledge from clinical practice. The purposes of this paper are to provide an overview of the LOINC database, to describe examples of assessments of relevance to nursing contained in LOINC, and to illustrate linkages of LOINC assessments with other nursing concepts.
The Ubiquitin Code in the Ubiquitin-Proteasome System and Autophagy.
Kwon, Yong Tae; Ciechanover, Aaron
2017-11-01
The conjugation of the 76 amino acid protein ubiquitin to other proteins can alter the metabolic stability or non-proteolytic functions of the substrate. Once attached to a substrate (monoubiquitination), ubiquitin can itself be ubiquitinated on any of its seven lysine (Lys) residues or its N-terminal methionine (Met1). A single ubiquitin polymer may contain mixed linkages and/or two or more branches. In addition, ubiquitin can be conjugated with ubiquitin-like modifiers such as SUMO or small molecules such as phosphate. The diverse ways to assemble ubiquitin chains provide countless means to modulate biological processes. We overview here the complexity of the ubiquitin code, with an emphasis on the emerging role of linkage-specific degradation signals (degrons) in the ubiquitin-proteasome system (UPS) and the autophagy-lysosome system (hereafter autophagy). Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cobos Arribas, Pedro; Monasterio Huelin Macia, Felix
2003-04-01
An FPGA-based hardware implementation of the Santos-Victor optical flow algorithm, useful in robot guidance applications, is described in this paper. The system used to do this contains an ALTERA FPGA (20K100), an interface with a digital camera, three VRAM memories to contain the input data, and some output memories (a VRAM and an EDO) to contain the results. The system has been used previously to develop and test other vision algorithms, such as image compression and optical flow calculation with differential and correlation methods. The designed system allows the digital camera, or the FPGA output (the results of the algorithms), to be connected to a PC through its Firewire or USB port. The problems encountered on this occasion have motivated the adoption of another hardware structure for certain vision algorithms with special requirements that need very code-intensive processing.
Behavior States: Now You See Them, Now You Don't.
ERIC Educational Resources Information Center
Mudford, Oliver C.; Hogg, James; Roberts, Jessie
1999-01-01
A study attempted to replicate a previous study that presented reliability data from recordings of behavior state using a 13-category coding system. Replication was unsuccessful. Obtained mean percentage agreement on occurrence for individual behavior state and participants (n=34) ranged across observer pairs from 0 to 58 percent. (Contains 13…
Code of Federal Regulations, 2010 CFR
... 49 U.S.C. United States Code, 2009 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...
Code of Federal Regulations, 2010 CFR
... 49 U.S.C. United States Code, 2011 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...
Code of Federal Regulations, 2010 CFR
... 49 U.S.C. United States Code, 2014 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...
Code of Federal Regulations, 2011 CFR
2011-01-01
... in devices used in industrial measuring systems, including x-ray fluorescence analyzers [Program Code... of ores containing source material for extraction of metals other than uranium or thorium, including.... 4 Other facilities include licenses for extraction of metals, heavy metals, and rare earths. 5 There...
NASA directives: Master list and index
NASA Technical Reports Server (NTRS)
1994-01-01
This Handbook sets forth in two parts the following information for the guidance of users of the NASA Management Directives System. Chapter 1 contains introductory information material on how to use this Handbook. Chapter 2 is a complete master list of Agency-wide management directives, describing each directive by type, number, effective date, expiration date, title, and organization code of the office responsible for the directive. Chapter 3 includes a consolidated numerical list of all delegations of authority and a breakdown of such delegations by the Office or Installation to which special authority is assigned. Chapter 4 sets forth a consolidated list of all NASA Handbooks (NHB's) and important footnotes covering the control and ordering of such documents. Chapter 5 is a consolidated list of NASA management directives applicable to the Jet Propulsion Laboratory. Chapter 6 is a consolidated list of NASA management directives published in the Code of Federal Regulations. Complementary manuals to the NASA Management Directives System are described in Chapter 7. Part B contains an in-depth alphabetical index to all NASA management directives other than Handbooks.
Mars Global Reference Atmospheric Model 2010 Version: Users Guide
NASA Technical Reports Server (NTRS)
Justh, H. L.
2014-01-01
This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.
Selective encryption for H.264/AVC video coding
NASA Astrophysics Data System (ADS)
Shi, Tuo; King, Brian; Salama, Paul
2006-02-01
Due to the ease with which digital data can be manipulated and due to the ongoing advancements that have brought us closer to pervasive computing, the secure delivery of video and images has become a challenging problem. Despite the advantages and opportunities that digital video provides, illegal copying and distribution as well as plagiarism of digital audio, images, and video are still ongoing. In this paper we describe two techniques for securing H.264 coded video streams. The first technique, SEH264Algorithm1, groups the data into the following blocks: (1) a block that contains the sequence parameter set and the picture parameter set, (2) a block containing a compressed intra coded frame, (3) a block containing the slice header of a P slice, all the headers of the macroblocks within the same P slice, and all the luma and chroma DC coefficients belonging to all the macroblocks within the same slice, (4) a block containing all the AC coefficients, and (5) a block containing all the motion vectors. The first three are encrypted whereas the last two are not. The second method, SEH264Algorithm2, relies on the use of multiple slices per coded frame. The algorithm searches the compressed video sequence for start codes (0x000001) and then encrypts the next N bits of data.
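A minimal sketch, in the spirit of the SEH264Algorithm2 scan described above: locate NAL-unit start codes (0x000001) in a byte stream and encrypt the data that follows each one. The repeating-key XOR, the key handling, and the span length n are placeholders rather than the scheme's actual parameters; the sketch also works at byte rather than bit granularity and ignores emulation-prevention bytes.

def selectively_encrypt(bitstream: bytes, key: bytes, n: int = 16) -> bytes:
    """Encrypt the n bytes following each 0x000001 start code (illustrative only)."""
    out = bytearray(bitstream)
    i = 0
    while i < len(out) - 2:
        if out[i] == 0x00 and out[i + 1] == 0x00 and out[i + 2] == 0x01:
            start = i + 3                        # first byte after the start code
            end = min(start + n, len(out))
            for j in range(start, end):          # placeholder keystream: repeating-key XOR
                out[j] ^= key[(j - start) % len(key)]
            i = end                              # resume scanning after the protected span
        else:
            i += 1
    return bytes(out)

# Toy example with two NAL units; only the bytes after each start code are altered.
stream = bytes.fromhex("0000016742001e00000168ce3880")
print(selectively_encrypt(stream, key=b"\x5a", n=2).hex())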
Conflict Containment in the Balkans: Testing Extended Deterrence.
1995-03-01
Distribution statement: Approved for public release; distribution is unlimited. This thesis critically analyzes a prominent theoretical... Subject terms: Conflict Containment in the Balkans; Deterrence; Coercive Diplomacy; Balance of Forces. 161 pages.
Recent MELCOR and VICTORIA Fission Product Research at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, N.E.; Cole, R.K.; Gauntt, R.O.
1999-01-21
The MELCOR and VICTORIA severe accident analysis codes, which were developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission, are designed to estimate fission product releases during nuclear reactor accidents in light water reactors. MELCOR is an integrated plant-assessment code that models the key phenomena in adequate detail for risk-assessment purposes. VICTORIA is a more specialized fission-product code that provides detailed modeling of chemical reactions and aerosol processes under the high-temperature conditions encountered in the reactor coolant system during a severe reactor accident. This paper focuses on recent enhancements and assessments of the two codes in the area of fission product chemistry modeling. Recently, a model for iodine chemistry in aqueous pools in the containment building was incorporated into the MELCOR code. The model calculates dissolution of iodine into the pool and releases of organic and inorganic iodine vapors from the pool into the containment atmosphere. The main purpose of this model is to evaluate the effect of long-term revolatilization of dissolved iodine. Inputs to the model include the dose rate in the pool, the amount of chloride-containing polymer, such as Hypalon, and the amount of buffering agents in the containment. Model predictions are compared against the Radioiodine Test Facility (RTF) experiments conducted by Atomic Energy of Canada Limited (AECL), specifically International Standard Problem 41. Improvements to VICTORIA's chemical reaction models were implemented as a result of recommendations from a peer review of VICTORIA that was completed last year. Specifically, an option is now included to model aerosols and deposited fission products as three condensed phases in addition to the original option of a single condensed phase. The three-condensed-phase model results in somewhat higher predicted fission product volatilities than does the single-condensed-phase model. Modeling of UO2 thermochemistry was also improved and results in better prediction of vaporization of uranium from fuel, which can react with released fission products to affect their volatility. This model also improves the prediction of fission product release rates from fuel. Finally, recent comparisons of MELCOR and VICTORIA with International Standard Problem 40 (STORM) data are presented. These comparisons focus on predicted thermophoretic deposition, which is the dominant deposition mechanism. Sensitivity studies were performed with the codes to examine experimental and modeling uncertainties.
The three-dimensional Multi-Block Advanced Grid Generation System (3DMAGGS)
NASA Technical Reports Server (NTRS)
Alter, Stephen J.; Weilmuenster, Kenneth J.
1993-01-01
As the size and complexity of three-dimensional volume grids increase, there is a growing need for fast and efficient 3D volumetric elliptic grid solvers. Present-day solvers are limited by computational speed and do not have all the capabilities such as interior volume grid clustering control, viscous grid clustering at the wall of a configuration, truncation error limiters, and convergence optimization residing in one code. A new volume grid generator, 3DMAGGS (Three-Dimensional Multi-Block Advanced Grid Generation System), which is based on the 3DGRAPE code, has evolved to meet these needs. This is a manual for the usage of 3DMAGGS and contains five sections, including the motivations and usage, a GRIDGEN interface, a grid quality analysis tool, a sample case for verifying correct operation of the code, and a comparison to both 3DGRAPE and GRIDGEN3D. Since it was derived from 3DGRAPE, this technical memorandum should be used in conjunction with the 3DGRAPE manual (NASA TM-102224).
High Performance Radiation Transport Simulations on TITAN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Christopher G; Davidson, Gregory G; Evans, Thomas M
2012-01-01
In this paper we describe the Denovo code system. Denovo solves the six-dimensional, steady-state, linear Boltzmann transport equation, of central importance to nuclear technology applications such as reactor core analysis (neutronics), radiation shielding, nuclear forensics and radiation detection. The code features multiple spatial differencing schemes, state-of-the-art linear solvers, the Koch-Baker-Alcouffe (KBA) parallel-wavefront sweep algorithm for inverting the transport operator, a new multilevel energy decomposition method scaling to hundreds of thousands of processing cores, and a modern, novel code architecture that supports straightforward integration of new features. In this paper we discuss the performance of Denovo on the 10-20 petaflop ORNL GPU-based system, Titan. We describe algorithms and techniques used to exploit the capabilities of Titan's heterogeneous compute node architecture and the challenges of obtaining good parallel performance for this sparse hyperbolic PDE solver containing inherently sequential computations. Numerical results demonstrating Denovo performance on early Titan hardware are presented.
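For reference, the steady-state linear Boltzmann transport equation that Denovo discretizes can be written in a generic multigroup form (a textbook statement given for orientation, not Denovo's specific discretization):

\hat{\Omega} \cdot \nabla \psi_g(\vec{r}, \hat{\Omega}) + \Sigma_{t,g}(\vec{r})\, \psi_g(\vec{r}, \hat{\Omega}) = \sum_{g'} \int_{4\pi} \Sigma_{s, g' \to g}(\vec{r}, \hat{\Omega}' \cdot \hat{\Omega})\, \psi_{g'}(\vec{r}, \hat{\Omega}')\, d\Omega' + q_g(\vec{r}, \hat{\Omega})

where \psi_g is the group angular flux, \Sigma_{t,g} and \Sigma_{s,g' \to g} are the total and scattering cross sections, and q_g collects fission and external sources; the KBA sweep inverts the streaming-plus-collision operator on the left-hand side.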
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Ford, W.E. III; Petrie, L.M.
AMPX-77 is a modular system of computer programs that pertain to nuclear analyses, with a primary emphasis on tasks associated with the production and use of multigroup cross sections. All basic cross-section data are to be input in the formats used by the Evaluated Nuclear Data Files (ENDF/B), and output can be obtained in a variety of formats, including its own internal and very general formats, along with a variety of other useful formats used by major transport, diffusion theory, and Monte Carlo codes. Processing is provided for both neutron and gamma-ray data. The present release contains codes all written in the FORTRAN-77 dialect of FORTRAN and will process ENDF/B-V and earlier evaluations, though major modules are being upgraded in order to process ENDF/B-VI and will be released when a complete collection of usable routines is available.
Holographic Labeling And Reading Machine For Authentication And Security Applications
Weber, David C.; Trolinger, James D.
1999-07-06
A holographic security label and automated reading machine for marking and subsequently authenticating any object such as an identification badge, a pass, a ticket, a manufactured part, or a package is described. The security label is extremely difficult to copy or even to read by unauthorized persons. The system comprises a holographic security label that has been created with a coded reference wave, whose specification can be kept secret. The label contains information that can be extracted only with the coded reference wave, which is derived from a holographic key, which restricts access of the information to only the possessor of the key. A reading machine accesses the information contained in the label and compares it with data stored in the machine through the application of a joint transform correlator, which is also equipped with a reference hologram that adds additional security to the procedure.
Entanglement-assisted quantum quasicyclic low-density parity-check codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Brun, Todd A.; Devetak, Igor
2009-03-01
We investigate the construction of quantum low-density parity-check (LDPC) codes from classical quasicyclic (QC) LDPC codes with girth greater than or equal to 6. We have shown that the classical codes in the generalized Calderbank-Shor-Steane construction do not need to satisfy the dual-containing property as long as preshared entanglement is available to both sender and receiver. We can use this to avoid the many four-cycles which typically arise in dual-containing LDPC codes. The advantage of such quantum codes comes from the use of efficient decoding algorithms such as the sum-product algorithm (SPA). It is well known that in the SPA, cycles of length 4 make successive decoding iterations highly correlated and hence limit the decoding performance. We show the principle of constructing quantum QC-LDPC codes which require only small amounts of initial shared entanglement.
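A hedged summary of the underlying principle (notation is generic; the paper's exact statements are not reproduced here): in the standard CSS recipe, a classical binary code with parity-check matrix H must be dual-containing, i.e. H H^T = 0 over GF(2). With entanglement assistance this constraint is dropped, at the cost of presharing

c = rank_{GF(2)}(H H^T)

maximally entangled pairs (ebits) between sender and receiver. Dropping the H H^T = 0 requirement is what allows girth-6 QC-LDPC parity-check matrices, free of the length-4 cycles that the dual-containing condition tends to force, to be used directly.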
Neutronic calculation of fast reactors by the EUCLID/V1 integrated code
NASA Astrophysics Data System (ADS)
Koltashev, D. A.; Stakhanova, A. A.
2017-01-01
This article considers the neutronic calculation of the fast-neutron lead-cooled reactor BREST-OD-300 with the EUCLID/V1 integrated code. The main goal of the development and application of integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical and thermohydraulic fast reactor calculations under normal and abnormal operating conditions. The EUCLID/V1 code is being developed at the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors. Heat source distributions, control rod movement, reactivity changes and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains some calculations implemented as a part of EUCLID/V1 code validation. A transient simulation of the fast-neutron lead-cooled reactor BREST-OD-300 (fuel assembly floating, decompression of a passive feedback system channel) and cross-validation with MCU-FR code results are presented. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
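The DN3D module solves the neutron transport equation in the multigroup diffusion approximation; a generic steady-state eigenvalue form of that equation (given for orientation rather than as the code's exact formulation) is

-\nabla \cdot D_g(\vec{r}) \nabla \phi_g(\vec{r}) + \Sigma_{r,g}(\vec{r})\, \phi_g(\vec{r}) = \sum_{g' \ne g} \Sigma_{s, g' \to g}(\vec{r})\, \phi_{g'}(\vec{r}) + \frac{\chi_g}{k_{\mathrm{eff}}} \sum_{g'} \nu \Sigma_{f,g'}(\vec{r})\, \phi_{g'}(\vec{r})

where \phi_g is the group scalar flux, D_g the diffusion coefficient, \Sigma_{r,g} the removal cross section, \chi_g the fission spectrum, and k_eff the effective multiplication factor.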
Proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification
NASA Technical Reports Server (NTRS)
Ewen, Denney, W. (Editor); Jensen, Thomas (Editor)
2009-01-01
This NASA conference publication contains the proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification, held as part of LICS in Los Angeles, CA, USA, on August 15, 2009. Software certification demonstrates the reliability, safety, or security of software systems in such a way that it can be checked by an independent authority with minimal trust in the techniques and tools used in the certification process itself. It can build on existing validation and verification (V&V) techniques but introduces the notion of explicit software certificates, which contain all the information necessary for an independent assessment of the demonstrated properties. One such example is proof-carrying code (PCC), which is an important and distinctive approach to enhancing trust in programs. It provides a practical framework for independent assurance of program behavior, especially where source code is not available, or the code author and user are unknown to each other. The workshop will address theoretical foundations of logic-based software certification as well as practical examples and work on alternative application domains. Here "certificate" is construed broadly, to include not just mathematical derivations and proofs but also safety and assurance cases, or any formal evidence that supports the semantic analysis of programs: that is, evidence about an intrinsic property of code and its behaviour that can be independently checked by any user, intermediary, or third party. These guarantees mean that software certificates raise trust in the code itself, distinct from and complementary to any existing trust in the creator of the code, the process used to produce it, or its distributor. In addition to the contributed talks, the workshop featured two invited talks, by Kelly Hayhurst and Andrew Appel. The PCC 2009 website can be found at http://ti.arc.nasa.gov /event/pcc 091.
NASA Technical Reports Server (NTRS)
Rudy, David H.; Kumar, Ajay; Thomas, James L.; Gnoffo, Peter A.; Chakravarthy, Sukumar R.
1988-01-01
A comparative study was made using 4 different computer codes for solving the compressible Navier-Stokes equations. Three different test problems were used, each of which has features typical of high speed internal flow problems of practical importance in the design and analysis of propulsion systems for advanced hypersonic vehicles. These problems are the supersonic flow between two walls, one of which contains a 10 deg compression ramp, the flow through a hypersonic inlet, and the flow in a 3-D corner formed by the intersection of two symmetric wedges. Three of the computer codes use similar recently developed implicit upwind differencing technology, while the fourth uses a well established explicit method. The computed results were compared with experimental data where available.
A Picture is Worth 1,000 Words. The Use of Clinical Images in Electronic Medical Records.
Ai, Angela C; Maloney, Francine L; Hickman, Thu-Trang; Wilcox, Allison R; Ramelson, Harley; Wright, Adam
2017-07-12
To understand how clinicians utilize image uploading tools in a home grown electronic health records (EHR) system. A content analysis of patient notes containing non-radiological images from the EHR was conducted. Images from 4,000 random notes from July 1, 2009 - June 30, 2010 were reviewed and manually coded. Codes were assigned to four properties of the image: (1) image type, (2) role of image uploader (e.g. MD, NP, PA, RN), (3) practice type (e.g. internal medicine, dermatology, ophthalmology), and (4) image subject. 3,815 images from image-containing notes stored in the EHR were reviewed and manually coded. Of those images, 32.8% were clinical and 66.2% were non-clinical. The most common types of the clinical images were photographs (38.0%), diagrams (19.1%), and scanned documents (14.4%). MDs uploaded 67.9% of clinical images, followed by RNs with 10.2%, and genetic counselors with 6.8%. Dermatology (34.9%), ophthalmology (16.1%), and general surgery (10.8%) uploaded the most clinical images. The content of clinical images referencing body parts varied, with 49.8% of those images focusing on the head and neck region, 15.3% focusing on the thorax, and 13.8% focusing on the lower extremities. The diversity of image types, content, and uploaders within a home grown EHR system reflected the versatility and importance of the image uploading tool. Understanding how users utilize image uploading tools in a clinical setting highlights important considerations for designing better EHR tools and the importance of interoperability between EHR systems and other health technology.
Atmospheric and wind modeling for ATC
NASA Technical Reports Server (NTRS)
Slater, Gary L.
1990-01-01
The section on atmospheric modeling covers the following topics: the standard atmosphere, atmospheric variations, atmosphere requirements for ATC, and implementation of a software model for Center/Tracon Advisory System (CTAS). The section on wind modeling covers the following topics: wind data -- NOAA profiler system; wind profile estimation; incorporation of various data types into filtering scheme; spatial and temporal variation; and software implementation into CTAS. The appendices contain Matlab codes for atmospheric routines and for wind estimation.
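The atmospheric routines themselves are given in the appendices as Matlab code and are not reproduced in this summary; as a rough stand-in (assumed ISA constants, no CTAS-specific variations), a standard-atmosphere temperature and pressure profile for the troposphere can be sketched in Python as:

# ISA sea-level constants (assumed; CTAS corrections for non-standard days are not modeled).
T0 = 288.15      # K
P0 = 101325.0    # Pa
L  = 0.0065      # K/m, tropospheric lapse rate
R  = 287.053     # J/(kg K), dry air
G0 = 9.80665     # m/s^2

def isa_troposphere(h_m):
    """Return (temperature K, pressure Pa) at geopotential altitude h_m, 0-11 km only."""
    if not 0.0 <= h_m <= 11000.0:
        raise ValueError("troposphere-only sketch: altitude must be 0-11 km")
    t = T0 - L * h_m
    p = P0 * (t / T0) ** (G0 / (L * R))
    return t, p

print(isa_troposphere(10000.0))   # roughly 223 K and 26.4 kPa at 10 km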
Digital Systems Validation Handbook. Volume 2
1992-07-01
imitate human intelligence functions. ASSURANCE ASSESSMENT. (4) Procedures whose purpose is to ensure that a proposed system functions according to...The spectrum (20 to 20,000 Hz) of human hearing, often defined as extending from approximately 20 Hz to 50 kHz and sometimes to 150 kHz. Audio noise...contained body of code which can be called by other routines to perform a function. SUPER-DIAGNOSTIC FILTER. (7) An algorithm which provides all the
EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.
Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D
2012-01-01
Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (eg, sequence conservation, orthology, synteny …) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.
Posttest analysis of the 1:6-scale reinforced concrete containment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, P.A.; Kennedy, J.M.; Marchertas, A.H.
A prediction of the response of the Sandia National Laboratories 1:6-scale reinforced concrete containment model test was made by Argonne National Laboratory. ANL along with nine other organizations performed a detailed nonlinear response analysis of the 1:6-scale model containment subjected to overpressurization in the fall of 1986. The two-dimensional code TEMP-STRESS and the three-dimensional NEPTUNE code were utilized (1) to predict the global response of the structure, (2) to identify global failure sites and the corresponding failure pressures and (3) to identify some local failure sites and pressure levels. A series of axisymmetric models was studied with the two-dimensional computer program TEMP-STRESS. The comparison of these pretest computations with test data from the containment model has provided a test for the capability of the respective finite element codes to predict global failure modes, and hence serves as a validation of these codes. Only the two-dimensional analyses will be discussed in this paper. 3 refs., 10 figs.
NASA Technical Reports Server (NTRS)
Mckee, James W.
1990-01-01
This volume (2 of 4) contains the specification, structured flow charts, and code listing for the protocol. The purpose of an autonomous power system on a spacecraft is to relieve humans from having to continuously monitor and control the generation, storage, and distribution of power in the craft. This implies that algorithms will have been developed to monitor and control the power system. The power system will contain computers on which the algorithms run. There should be one control computer system that makes the high-level decisions and sends commands to and receives data from the other distributed computers. This will require a communications network and an efficient protocol by which the computers will communicate. One of the major requirements on the protocol is that it be real time because of the need to control the power elements.
The Fermilab lattice supercomputer project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischler, M.; Atac, R.; Cook, A.
1989-02-01
The ACPMAPS system is a highly cost-effective, local memory MIMD computer targeted at algorithm development and production running for gauge theory on the lattice. The machine consists of a compound hypercube of crates, each of which is a full crossbar switch containing several processors. The processing nodes are single-board array processors based on the Weitek XL chip set, each with a peak power of 20 MFLOPS and supported by 8 MBytes of data memory. The system currently being assembled has a peak power of 5 GFLOPS, delivering performance at approximately $250/MFLOP. The system is programmable in C and Fortran. An underpinning of software routines (CANOPY) provides an easy and natural way of coding lattice problems, such that the details of parallelism, communication and system architecture are transparent to the user. CANOPY can easily be ported to any single CPU or MIMD system which supports C, and allows the coding of typical applications with very little effort. 3 refs., 1 fig.
NASA Astrophysics Data System (ADS)
Villiger, Arturo; Schaer, Stefan; Dach, Rolf; Prange, Lars; Jäggi, Adrian
2017-04-01
It is common to handle code biases in the Global Navigation Satellite System (GNSS) data analysis as conventional differential code biases (DCBs): P1-C1, P1-P2, and P2-C2. Due to the increasing number of signals and systems in conjunction with various tracking modes for the different signals (as defined in RINEX3 format), the number of DCBs would increase drastically and the bookkeeping becomes almost unbearable. The Center for Orbit Determination in Europe (CODE) has thus changed its processing scheme to observable-specific signal biases (OSB). This means that for each observation involved all related satellite and receiver biases are considered. The OSB contributions from various ionosphere analyses (geometry-free linear combination) using different observables and frequencies and from clock analyses (ionosphere-free linear combination) are then combined on normal equation level. By this, one consistent set of OSB values per satellite and receiver can be obtained that contains all information needed for GNSS-related processing. This advanced procedure of code bias handling is now also applied to the IGS (International GNSS Service) MGEX (Multi-GNSS Experiment) procedure at CODE. Results for the biases from the legacy IGS solution as well as the CODE MGEX processing (considering GPS, GLONASS, Galileo, BeiDou, and QZSS) are presented. The consistency with the traditional method is confirmed and the new results are discussed regarding the long-term stability. When processing code data, it is essential to know the true observable types in order to correct for the associated biases. CODE has been verifying the receiver tracking technologies for GPS based on estimated DCB multipliers (for the RINEX 2 case). With the change to OSB, the original verification approach was extended to search for the best fitting observable types based on known OSB values. In essence, a multiplier parameter is estimated for each involved GNSS observable type. This implies that we could recover, for receivers tracking a combination of signals, even the factors of these combinations. The verification of the observable types is crucial to identify the correct observable types of RINEX 2 data (which does not contain the signal modulation in comparison to RINEX 3). The correct information of the used observable types is essential for precise point positioning (PPP) applications and GNSS ambiguity resolution. Multi-GNSS OSBs and verified receiver tracking modes are essential to get best possible multi-GNSS solutions for geodynamic purposes and other applications.
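The relation between observable-specific biases and the legacy differential biases can be summarized generically (notation chosen here for illustration, not taken from the CODE product files): each code observation carries one receiver and one satellite OSB,

P_{r,sig}^{s} = \rho_r^{s} + c\,(dt_r - dt^{s}) + T_r^{s} + I_{r,sig}^{s} + b_{r,sig} + b^{s}_{sig} + \varepsilon

and any conventional DCB is simply a difference of two such biases, e.g. DCB^{s}_{P1-C1} = b^{s}_{P1} - b^{s}_{C1}. This is why OSB contributions from the ionosphere (geometry-free) and clock (ionosphere-free) analyses can be combined on normal equation level into one consistent set of values per satellite and receiver.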
DARKDROID: Exposing the Dark Side of Android Marketplaces
2016-06-01
Moreover, our approaches can detect apps containing both intentional and unintentional vulnerabilities, such as unsafe code loading mechanisms and... Subject terms: Security, Static Analysis, Dynamic Analysis, Malware Detection, Vulnerability Scanning. Table-of-contents excerpt: detect malicious Android applications in a DoD context; 1.2.2 Develop sophisticated whole-system static analyses to detect malicious Android applications.
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Paris (France).
The seven levels of education, as classified numerically by International Standard Classification of Education (ISCED), are defined along with courses, programs, and fields of education listed under each level. Also contained is an alphabetical subject index indicating appropriate code numbers. For related documents see TM003535 and TM003536. (RC)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
Rules and Regulations, Federal Register. This section of the FEDERAL REGISTER contains regulatory documents having general applicability and legal effect, most of which are keyed to and codified in the Code of Federal Regulations, which is published under 50 titles pursuant to...
77 FR 4203 - List of Approved Spent Fuel Storage Casks: MAGNASTOR® System, Revision 2
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-27
Rules and Regulations, Federal Register. This section of the FEDERAL REGISTER contains regulatory documents having general applicability and legal effect, most of which are keyed to and codified in the Code of Federal Regulations, which is published under 50 titles pursuant to...
Nehme, A; Zibara, K; Cerutti, C; Bricca, G
2015-06-01
The implication of the renin-angiotensin-aldosterone system (RAAS) in atheroma development is well described. However, a complete view of the local RAAS in atheroma is still missing. In this study we aimed to reveal the organization of RAAS in atheroma at the transcriptomic level and identify the transcriptional regulators behind it. Extended RAAS (extRAAS) was defined as the set of 37 genes coding for classical and novel RAAS participants (Figure 1). Five microarray datasets containing overall 590 samples representing carotid and peripheral atheroma were downloaded from the GEO database. Correlation-based hierarchical clustering (R software) of extRAAS genes within each dataset allowed the identification of modules of co-expressed genes. Reproducible co-expression modules across datasets were then extracted. Transcription factors (TFs) having common binding sites (TFBSs) in the promoters of coordinated genes were identified using the Genomatix database tools and analyzed for their correlation with extRAAS genes in the microarray datasets. Expression data revealed the expressed extRAAS components and their relative abundance displaying the favored pathways in atheroma. Three co-expression modules with more than 80% reproducibility across datasets were extracted. Two of them (M1 and M2) contained genes coding for angiotensin metabolizing enzymes involved in different pathways: M1 included ACE, MME, RNPEP, and DPP3, in addition to 7 other genes; and M2 included CMA1, CTSG, and CPA3. The third module (M3) contained genes coding for receptors known to be implicated in atheroma (AGTR1, MR, GR, LNPEP, EGFR and GPER). M1 and M3 were negatively correlated in 3 of 5 datasets. We identified 19 TFs that have enriched TFBSs in the promoters of genes of M1, and two for M3, but none was found for M2. Among the extracted TFs, ELF1, MAX, and IRF5 showed significant positive correlations with peptidase-coding genes from M1 and negative correlations with receptors-coding genes from M3 (p < 0.05). The identified co-expression modules display the transcriptional organization of local extRAAS in human carotid atheroma. The identification of several TFs potentially associated to extRAAS genes may provide a frame for the discovery of atheroma-specific modulators of extRAAS activity.(Figure is included in full-text article.).
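As a minimal sketch of the correlation-based hierarchical clustering step described above (written in Python rather than the R workflow actually used, with toy random data and an assumed cut threshold):

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy samples-by-genes expression matrix for a few extRAAS genes named in the abstract.
rng = np.random.default_rng(0)
genes = ["ACE", "MME", "RNPEP", "DPP3", "CMA1", "CTSG", "CPA3", "AGTR1", "LNPEP", "EGFR"]
expr = rng.normal(size=(60, len(genes)))

# Use 1 - Pearson correlation as the distance between genes.
corr = np.corrcoef(expr, rowvar=False)
dist = squareform(1.0 - corr, checks=False)

# Average-linkage hierarchical clustering, cut at an assumed distance threshold.
tree = linkage(dist, method="average")
modules = fcluster(tree, t=0.8, criterion="distance")

for gene, module in zip(genes, modules):
    print(f"{gene}: module {module}")

Reproducible modules would then be those whose gene memberships recur across the independent datasets.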
ERIC Educational Resources Information Center
Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark
2012-01-01
A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
NASA Astrophysics Data System (ADS)
Kaveeshwar, Ashok; Rodriguez, Raul
1993-01-01
The STARSYS system is aimed at providing low cost global positioning and messaging by satellite. STARSYS is a data only system using very brief message transmission times. Code Division Multiple Access (CDMA) is used for frequency sharing; it enables transmitted data to be coded for unique user identification and also to employ Doppler and radio ranging to determine the geographical location of a transmitting terminal. The STARSYS system is composed of the field receiver/transmitter, the Low Earth Orbit (LEO) satellite constellation and the ground station elements. Each message transmission is able to contain up to thirty two digital characters. Market applications are numerous: theft control, vehicle and logistic tracking and messaging, personal communications, utility and environmental data acquisition and transfer. Co-primary frequency allocation at WARC-92 accelerated interest for potential customers and investors, although the amount of frequency allocation (less than 1 GHz) is small.
World's first SPB LNG carrier "POLAR EAGLE"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aoki, Eiji; Nakajima, Yoshiyuki; Yamada, Koichiro
1994-12-31
The world's first Self-supporting Prismatic-shape IMO type B (SPB) LNG carrier, named "POLAR EAGLE", was delivered to Phillips 66 Natural Gas Company and Marathon Oil Company in June 1993. The cargo containment system installed onboard the vessel, the SPB cargo containment system, was developed by Ishikawajima-Harima Heavy Industries Co., Ltd. (IHI) and fully complies with the IMO Gas Carrier Code for a type B independent tank. "POLAR EAGLE" was constructed in the authors' Aichi works and delivered 34 months after the contract for the vessel. Its performance was confirmed through various tests and inspections during construction of the vessel. Results of typical tests and inspections are introduced.
NASA Technical Reports Server (NTRS)
2004-01-01
Two-dimensional data matrix symbols, which contain encoded letters and numbers, are permanently etched on items for identification. They can store up to 100 times more information than traditional bar codes. While the symbols provide several advantages over bar codes, once they are covered by paint they can no longer be read by optical scanners. Since most products are painted eventually, this presents a problem for industries relying on the symbols for identification and tracking. In 1987, NASA's Marshall Space Flight Center began studying direct parts marking with matrix symbols in order to track millions of Space Shuttle parts. Advances in the technology proved that by incorporating magnetic properties into the paints, inks, and pastes used to apply the matrix symbols, the codes could be read by a magnetic scanner even after being covered with paint or other coatings. NASA received a patent for such a scanner in 1998, but the system it used for development was not portable and was too costly. A prototype was needed as a lead-in to a production model. In the summer of 2000, NASA began seeking companies to build a hand-held scanner that would detect the Read Through Paint data matrix identification marks containing magnetic materials through coatings.
76 FR 44977 - Shipping Coordinating Committee; Notice of Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-27
... packing of cargo transport units. --Consideration for the efficacy of Container Inspection Programme... Dangerous Goods, Solid Cargoes and Containers (DSC 16) to be held at IMO Headquarters, London, United... Solid Bulk Cargoes Code (IMSBC Code) including evaluation of properties of solid bulk cargos. --Casualty...
NASA Technical Reports Server (NTRS)
Mallasch, Paul G.; Babic, Slavoljub
1994-01-01
The United States Air Force (USAF) provides NASA Lewis Research Center with monthly reports containing the Synchronous Satellite Catalog and the associated Two Line Mean Element Sets. The USAF Synchronous Satellite Catalog supplies satellite orbital parameters collected by an automated monitoring system and provided to Lewis Research Center as text files on magnetic tape. Software was developed to facilitate automated formatting, data normalization, cross-referencing, and error correction of Synchronous Satellite Catalog files before loading into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). This document contains the User's Guide and Software Maintenance Manual with information necessary for installation, initialization, start-up, operation, error recovery, and termination of the software application. It also contains implementation details, modification aids, and software source code adaptations for use in future revisions.
Customized data container for improved performance in optical cryptosystems
NASA Astrophysics Data System (ADS)
Vélez Zea, Alejandro; Fredy Barrera, John; Torroba, Roberto
2016-12-01
Coherent optical encryption procedures introduce speckle noise to the output, limiting many practical applications. Until now, the only method available to avoid this noise has been to codify the information to be processed into a container that is encrypted instead of the original data. Although the decrypted container presents the noise due to the optical processing, its features remain recognizable enough to allow decoding, bringing the original information free of any kind of degradation. The first adopted containers were quick response (QR) codes. However, the limitations of optical encryption procedures and the features of QR codes imply that in practice only simple codes containing small amounts of data can be processed without large experimental requirements. In order to overcome this problem, we introduce the first tailor-made container to be processed in optical cryptosystems, ensuring larger noise tolerance and the ability to process more information with fewer experimental requirements. We present both simulations and experimental results to demonstrate the advantages of our proposal.
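As an illustration of the container idea described in this record, the following minimal Python sketch encodes a payload into a QR container and perturbs it with a crude speckle-like noise; the qrcode and numpy packages, the noise model, and the threshold step are assumptions for illustration, not the authors' actual optical procedure or custom container.

    import numpy as np
    import qrcode

    # Encode the payload into a QR "container" instead of processing the raw text.
    payload = "plaintext to protect"
    qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
    qr.add_data(payload)
    qr.make(fit=True)
    modules = np.array(qr.get_matrix(), dtype=float)   # grid of dark/light modules

    # Crude stand-in for the speckle noise a coherent encryption/decryption
    # cycle adds to the decrypted container (not a physical optics model).
    rng = np.random.default_rng(0)
    noisy = modules * rng.exponential(1.0, modules.shape) + 0.1 * rng.random(modules.shape)

    # The container is then re-binarized; the QR error-correction overhead is
    # what lets a standard reader tolerate residual module errors.
    recovered = (noisy > np.median(noisy)).astype(np.uint8)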
Extension of Generalized Fluid System Simulation Program's Fluid Property Database
NASA Technical Reports Server (NTRS)
Patel, Kishan
2011-01-01
This internship focused on the development of additional capabilities for the General Fluid Systems Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a thermodynamic property program that is widely used, and b) modifications of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
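A minimal sketch of the saturation-region look-up idea described above, assuming an illustrative saturation table and property names (not the actual GFSSP or REFPROP data): interpolate the saturated liquid and vapor enthalpies at the given pressure, compute the quality from the mixture enthalpy, and mix the phase properties.

    import numpy as np

    # Hypothetical saturation table: pressure [Pa], h_f/h_g [J/kg], rho_f/rho_g [kg/m3].
    p_tab    = np.array([1.0e5, 2.0e5, 5.0e5])
    hf_tab   = np.array([4.17e5, 5.05e5, 6.40e5])
    hg_tab   = np.array([2.675e6, 2.706e6, 2.748e6])
    rhof_tab = np.array([958.0, 943.0, 915.0])
    rhog_tab = np.array([0.59, 1.13, 2.67])

    def saturated_state(p, h):
        """Quality and mixture density from pressure and mixture enthalpy."""
        hf = np.interp(p, p_tab, hf_tab)
        hg = np.interp(p, p_tab, hg_tab)
        x = min(max((h - hf) / (hg - hf), 0.0), 1.0)   # thermodynamic quality
        vf = 1.0 / np.interp(p, p_tab, rhof_tab)       # saturated liquid specific volume
        vg = 1.0 / np.interp(p, p_tab, rhog_tab)       # saturated vapor specific volume
        return x, 1.0 / (vf + x * (vg - vf))

    quality, density = saturated_state(1.5e5, 1.5e6)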
Report of AAPM Task Group 162: Software for planar image quality metrology.
Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J
2018-02-01
The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
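A minimal sketch of one component of the methodology, the noise power spectrum estimate from flat-field regions of interest; the ROI size, detrending choice, and pixel pitch below are assumptions for illustration and not the exact TG162 implementation.

    import numpy as np

    def nps_2d(flat_image, roi=128, pixel_pitch_mm=0.1):
        """Average 2D noise power spectrum over non-overlapping ROIs of a flat-field image."""
        spectra = []
        ny, nx = flat_image.shape
        for y in range(0, ny - roi + 1, roi):
            for x in range(0, nx - roi + 1, roi):
                block = flat_image[y:y + roi, x:x + roi].astype(float)
                block -= block.mean()                        # remove the local mean (detrend)
                spectra.append(np.abs(np.fft.fft2(block)) ** 2)
        # NPS = <|DFT of noise|^2> * (pixel area) / (number of pixels in the ROI)
        return np.mean(spectra, axis=0) * pixel_pitch_mm ** 2 / (roi * roi)

    nps = nps_2d(np.random.default_rng(0).normal(100.0, 2.0, (1024, 1024)))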
ATLAS Test Program Generator II (AGEN II). Volume I. Executive Software System.
1980-08-01
features. l-1 C. To provide detailed descriptions of each of the system components and modules and their corresponding flowcharts. D. To describe methods of...contains the FORTRAN source code listings to enable programmer to do the expansions and modifications. The methods and details of adding another...characteristics of the network. The top-down implementa- tion method is therefore suggested. This method starts at the top by designing the IVT modules in
Evaluation of Agency Non-Code Layered Pressure Vessels (LPVs) . Volume 2; Appendices
NASA Technical Reports Server (NTRS)
Prosser, William H.
2014-01-01
In coordination with the Office of Safety and Mission Assurance and the respective Center Pressure System Managers (PSMs), the NASA Engineering and Safety Center (NESC) was requested to formulate a consensus draft proposal for the development of additional testing and analysis methods to establish the technical validity, and any limitation thereof, for the continued safe operation of facility non-code layered pressure vessels. The PSMs from each NASA Center were asked to participate as part of the assessment team by providing, collecting, and reviewing data regarding current operations of these vessels. This document contains the appendices to the main report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, A.W.
1990-04-01
This paper describes an approach to solving air quality problems that frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the expected time frame. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) analysis for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: emission source data files; emission rates contained in spreadsheets; and impact assessment model codes. The spreadsheets contain user-written code (macros) that calculates emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies.
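A minimal sketch of the kind of emission-rate calculation the spreadsheet macros perform, with made-up source data, emission factor, and control efficiency (not values from the paper):

    def emission_rate_g_per_hr(count, horsepower, load_factor, duty_cycle,
                               emission_factor_g_per_hp_hr, control_efficiency=0.0):
        """Spreadsheet-macro style: uncontrolled emissions scaled by a control efficiency."""
        uncontrolled = (count * horsepower * load_factor * duty_cycle
                        * emission_factor_g_per_hp_hr)
        return uncontrolled * (1.0 - control_efficiency)

    # Example: 4 engines, 300 hp, 60% load, running half of each hour,
    # 6 g NOx per hp-hr, no control device (all values illustrative).
    nox_g_per_hr = emission_rate_g_per_hr(4, 300, 0.6, 0.5, 6.0)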
Fission yield and criticality excursion code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanchard, A.
2000-06-30
The ANSI/ANS 8.3 standard allows a maximum yield not to exceed 2 x 10 fissions to be used in calculations that determine whether the alarm system will be effective. It is common practice to use this allowance or to develop some other yield based on past criticality accident history or excursion experiments. The literature on the subject of yields discusses maximum yields larger and somewhat smaller than the ANS 8.3 permissive value. The ability to model criticality excursions and vary the various parameters to determine a credible maximum yield for operation-specific cases has been available for some time but is not in common use by criticality safety specialists. The topic of yields for various solutions, metals, oxide powders, etc. in various geometries and containers has been published by laboratory specialists or university staff and students for many decades but has not been available to practitioners. The need for best-estimate calculations of fission yields with a well-validated criticality excursion code has long been recognized. But no coordinated effort has been made so far to develop a generalized and well-validated excursion code for different types of systems. In this paper, the current practices to estimate fission yields are summarized along with their shortcomings for the 12-Rad zone (at SRS) and Criticality Alarm System (CAS) calculations. Finally, the need for a user-friendly excursion code is reemphasized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, J.; Kucukboyaci, V. N.; Nguyen, L.
2012-07-01
The Westinghouse Small Modular Reactor (SMR) is an 800 MWt (> 225 MWe) integral pressurized water reactor (iPWR) with all primary components, including the steam generator and the pressurizer, located inside the reactor vessel. The reactor core is based on a partial-height 17x17 fuel assembly design used in the AP1000® reactor core. The Westinghouse SMR utilizes passive safety systems and proven components from the AP1000 plant design with a compact containment that houses the integral reactor vessel and the passive safety systems. A preliminary loss of coolant accident (LOCA) analysis of the Westinghouse SMR has been performed using the WCOBRA/TRAC-TF2 code, simulating a transient caused by a double-ended guillotine (DEG) break in the direct vessel injection (DVI) line. WCOBRA/TRAC-TF2 is a new-generation Westinghouse LOCA thermal-hydraulics code evolving from the US NRC licensed WCOBRA/TRAC code. It is designed to simulate PWR LOCA events from the smallest break size to the largest break size (DEG cold leg). A significant number of fluid dynamics models and heat transfer models were developed or improved in WCOBRA/TRAC-TF2. A large number of separate effects and integral effects tests were performed for a rigorous code assessment and validation. WCOBRA/TRAC-TF2 was introduced into the Westinghouse SMR design phase to assist a quick and robust passive cooling system design and to identify thermal-hydraulic phenomena for the development of the SMR Phenomena Identification Ranking Table (PIRT). The LOCA analysis of the Westinghouse SMR demonstrates that the DEG DVI break LOCA is mitigated by the injection and venting from the Westinghouse SMR passive safety systems without core heat-up, achieving long-term core cooling. (authors)
Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel
2015-04-01
Many institutions collect reports in databases to make important lessons-learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using MAXQDA, a software program; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons-learned. The individual codes with the most number of text segments coded included: planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the Department of Defense Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert
2015-05-28
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
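A minimal sketch of emulator-based calibration in the spirit of this work, using scikit-learn's standard Gaussian Process regressor as a stand-in for the FFGP emulator, a toy friction-factor relation as the "system code", a synthetic measurement, and a plain Metropolis sampler; all of these are assumptions for illustration, not the authors' implementation.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def system_code(theta):
        # Stand-in for an expensive system code: Blasius-like friction factor vs. Reynolds number.
        return 0.316 * theta ** (-0.25)

    theta_train = np.linspace(1e4, 1e5, 25).reshape(-1, 1)
    gp = GaussianProcessRegressor(normalize_y=True).fit(theta_train, system_code(theta_train).ravel())

    obs = system_code(5.0e4) + 1e-4           # synthetic "measurement"
    sigma = 5e-4                              # assumed measurement uncertainty

    def log_post(theta):
        if not 1.0e4 <= theta <= 1.0e5:
            return -np.inf                    # flat prior inside the calibration range
        pred = gp.predict(np.array([[theta]]))[0]
        return -0.5 * ((obs - pred) / sigma) ** 2

    rng, theta, chain = np.random.default_rng(1), 3.0e4, []
    for _ in range(5000):                     # simple Metropolis sampler driven by the emulator
        prop = theta + rng.normal(0.0, 2.0e3)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)
    posterior_mean = np.mean(chain[1000:])    # discard burn-in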
Reliable sex and strain discrimination in the mouse vomeronasal organ and accessory olfactory bulb.
Tolokh, Illya I; Fu, Xiaoyan; Holy, Timothy E
2013-08-21
Animals modulate their courtship and territorial behaviors in response to olfactory cues produced by other animals. In rodents, detecting these cues is the primary role of the accessory olfactory system (AOS). We sought to systematically investigate the natural stimulus coding logic and robustness in neurons of the first two stages of accessory olfactory processing, the vomeronasal organ (VNO) and accessory olfactory bulb (AOB). We show that firing rate responses of just a few well-chosen mouse VNO or AOB neurons can be used to reliably encode both sex and strain of other mice from cues contained in urine. Additionally, we show that this population code can generalize to new concentrations of stimuli and appears to represent stimulus identity in terms of diverging paths in coding space. Together, the results indicate that firing rate code on the temporal order of seconds is sufficient for accurate classification of pheromonal patterns at different concentrations and may be used by AOS neural circuitry to discriminate among naturally occurring urine stimuli.
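A minimal sketch of the population-decoding idea, assuming synthetic firing rates and scikit-learn's linear discriminant classifier; this is illustrative only and not the authors' analysis pipeline.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_neurons, n_trials = 8, 40
    # Four stimulus classes (2 sexes x 2 strains), each with its own mean firing-rate vector.
    class_means = rng.uniform(2.0, 20.0, size=(4, n_neurons))
    labels = np.repeat(np.arange(4), n_trials)
    rates = np.vstack([rng.poisson(class_means[c], size=(n_trials, n_neurons))
                       for c in range(4)]).astype(float)

    # Cross-validated accuracy of decoding sex-and-strain identity from firing rates.
    accuracy = cross_val_score(LinearDiscriminantAnalysis(), rates, labels, cv=5).mean()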
Domestic Ice Breaking (DOMICE) Simulation Model User Guide
2013-02-01
Second, add new ice data to the variable “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (D9_historical_ice_d3), which contains the...within that “ NBL ” scheme. The interpretation of the SIGRID ice codes into ice thickness estimates is also contained within the sub- module “District 9...User Guide) “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (see Section 5.1.1.3.2 of this User Guide) “Historical District 1 Weekly Air
NASA Technical Reports Server (NTRS)
Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.
1983-01-01
Volume 3 of a 3-volume technical memorandum containing documentation of the GLAS fourth-order general circulation model is presented. The volume contains the CYBER 205 scalar and vector codes of the model, a list of variables, and cross-references. A dictionary of FORTRAN variables used in the Scalar Version, and listings of the FORTRAN code compiled with the C-option, are included. Cross-reference maps of local variables are included for each subroutine.
TEMPEST code simulations of hydrogen distribution in reactor containment structures. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trent, D.S.; Eyler, L.L.
The mass transport version of the TEMPEST computer code was used to simulate hydrogen distribution in geometric configurations relevant to reactor containment structures. Predicted results of Battelle-Frankfurt hydrogen distribution tests 1 to 6, and 12 are presented. Agreement between predictions and experimental data is good. Best agreement is obtained using the k-epsilon turbulence model in TEMPEST in flow cases where turbulent diffusion and stable stratification are dominant mechanisms affecting transport. The code's general analysis capabilities are summarized.
NASA Technical Reports Server (NTRS)
McCurdy, David R.; Roche, Joseph M.
2004-01-01
In support of NASA's Next Generation Launch Technology (NGLT) program, the Andrews Gryphon booster was studied. The Andrews Gryphon concept is a horizontal lift-off, two-stage-to-orbit, reusable launch vehicle that uses an air collection and enrichment system (ACES). The purpose of the ACES is to collect atmospheric oxygen during a subsonic flight loiter phase and cool it to cryogenic temperature, ultimately resulting in a reduced initial take-off weight. To study the performance and size of an air-collection-based booster, an initial airplane-like shape was established as a baseline and modeled in a vehicle sizing code. The code, SIZER, contains a general series of volume, surface area, and fuel fraction relationships that tie engine and ACES performance with propellant requirements and volumetric constraints in order to establish vehicle closure for the given mission. A key element of system-level weight optimization is the use of the SIZER program, which provides rapid convergence and a great deal of flexibility for different tank architectures and material suites in order to study their impact on gross lift-off weight. This paper discusses important elements of the sizing code architecture, followed by highlights of the baseline booster study.
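A minimal sketch of a sizing-to-closure iteration of the kind SIZER performs; the delta-v, specific impulse, structural fraction, and initial guess below are made-up illustrative values, not Gryphon data.

    import math

    def size_vehicle(payload_kg, delta_v=7800.0, isp_s=450.0, struct_frac=0.12,
                     g0=9.81, tol_kg=1.0):
        """Iterate gross lift-off weight until payload + propellant + structure close."""
        glow = 10.0 * payload_kg                                 # initial guess
        for _ in range(200):
            prop_frac = 1.0 - math.exp(-delta_v / (g0 * isp_s))  # rocket equation
            propellant = prop_frac * glow
            structure = struct_frac * propellant                 # tanks/structure scale with propellant
            new_glow = payload_kg + propellant + structure
            if abs(new_glow - glow) < tol_kg:
                return new_glow
            glow = new_glow
        return glow

    glow_kg = size_vehicle(10000.0)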
Duke, Jon D.; Friedlin, Jeff
2010-01-01
Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADEs), but such systems require a source of semantically coded ADE data. We created a two-component system that addresses this need. First, we created a natural language processing application that extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADEs. Our database currently contains 534,125 ADEs from 5602 product labels. An NLP evaluation of 9529 ADEs showed recall of 93% and precision of 95%. On a trial set of 30 CCDs, the system provided adverse event data for 88% of drugs and returned these results in an average of 620 ms. PMID:21346964
Development of a MELCOR Sodium Chemistry (NAC) Package - FY17 Progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
This report describes the status of the development of the MELCOR Sodium Chemistry (NAC) package. This development is based on the CONTAIN-LMR sodium physics and chemistry models to be implemented in MELCOR. In the past three years, the sodium equation of state as a working fluid, drawn from nuclear fusion safety research and from the SIMMER code, has been implemented in MELCOR. The chemistry models from the CONTAIN-LMR code, such as the spray and pool fire models, have also been implemented in MELCOR. This report describes the implemented models and the issues encountered. Model descriptions and input descriptions are provided. Development testing of the spray and pool fire models is described, including the code-to-code comparison with CONTAIN-LMR. The report ends with an expected timeline for the remaining models to be implemented, such as atmosphere chemistry and sodium-concrete interactions, and for experimental validation tests.
48 CFR 304.7001 - Numbering acquisitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... contracting office identification codes currently in use is contained in the DCIS Users' Manual, available at... than one code may apply in a specific situation, or for additional codes, refer to the DCIS Users' Manual or consult with the cognizant DCIS coordinator/focal point for guidance on which code governs...
NASA Technical Reports Server (NTRS)
Davis, J. E.; Medan, R. T.
1977-01-01
This segment of the POTFAN system is used to generate right hand sides (boundary conditions) of the system of equations associated with the flow field under consideration. These specified flow boundary conditions are encountered in the oblique derivative boundary value problem (boundary value problem of the third kind) and contain the Neumann boundary condition as a special case. Arbitrary angle of attack and/or sideslip and/or rotation rates may be specified, as well as an arbitrary, nonuniform external flow field and the influence of prescribed singularity distributions.
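A minimal sketch of how such right-hand sides can be assembled for a Neumann (flow-tangency) condition: each entry is minus the onset-flow velocity, built from angle of attack, sideslip, and body rotation, dotted with the panel normal. The function and array names are assumptions for illustration, not POTFAN code.

    import numpy as np

    def rhs_neumann(normals, centroids, v_inf=1.0, alpha=0.0, beta=0.0, omega=(0.0, 0.0, 0.0)):
        """normals, centroids: (N, 3) arrays; alpha/beta in radians; omega in rad/s."""
        # Free-stream direction for angle of attack alpha and sideslip beta.
        u_inf = v_inf * np.array([np.cos(alpha) * np.cos(beta),
                                  -np.sin(beta),
                                  np.sin(alpha) * np.cos(beta)])
        # Apparent velocity at each control point due to body rotation: -omega x r.
        onset = u_inf - np.cross(np.asarray(omega), centroids)
        return -np.einsum("ij,ij->i", onset, normals)   # one right-hand-side entry per panel

    normals = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
    centroids = np.zeros((2, 3))
    b = rhs_neumann(normals, centroids, alpha=np.radians(5.0))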
The Armed Forces Casualty Assistance Readiness Enhancement System (CARES): Design for Flexibility
2006-06-01
Special Form SQL Structured Query Language SSA Social Security Administration U USMA United States Military Academy V VB Visual Basic VBA Visual Basic for...of Abbreviations ................................................................... 26 Appendix B: Key VBA Macros and MS Excel Coding...internet portal, CARES Version 1.0 is a MS Excel spreadsheet application that contains a considerable number of Visual Basic for Applications ( VBA
NASA Astrophysics Data System (ADS)
Ivankovic, D.; Dadic, V.
2009-04-01
Some oceanographic parameters have to be inserted into the database manually; others (for example, data from a CTD probe) are inserted from various files. All these parameters require visualization, validation, and manipulation from the research vessel or scientific institution, as well as public presentation. For these purposes a web-based system was developed, containing dynamic SQL procedures and Java applets. The technology background is an Oracle 10g relational database and Oracle Application Server. Web interfaces are developed using PL/SQL stored database procedures (mod PL/SQL). Additional parts for data visualization include Java applets and JavaScript. The mapping tool is the Google Maps API (JavaScript), with a Java applet as an alternative. The graph is realized as a dynamically generated web page containing a Java applet. The mapping tool and graph are georeferenced, which means that a click on some part of the graph automatically initiates a zoom or places a marker at the location where the parameter was measured. This feature is very useful for data validation. The code for data manipulation and visualization is partially realized with dynamic SQL, which allows us to separate data definition from the code for data manipulation. Adding a new parameter to the system requires only its data definition and description, without programming an interface for this kind of data.
17 CFR 232.106 - Prohibition against electronic submissions containing executable code.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false Prohibition against electronic submissions containing executable code. 232.106 Section 232.106 Commodity and Securities Exchanges SECURITIES... Filer Manual section also may be a violation of the Computer Fraud and Abuse Act of 1986, as amended...
17 CFR 232.106 - Prohibition against electronic submissions containing executable code.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false Prohibition against electronic submissions containing executable code. 232.106 Section 232.106 Commodity and Securities Exchanges SECURITIES... Filer Manual section also may be a violation of the Computer Fraud and Abuse Act of 1986, as amended...
17 CFR 232.106 - Prohibition against electronic submissions containing executable code.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 17 Commodity and Securities Exchanges 2 2012-04-01 2012-04-01 false Prohibition against electronic submissions containing executable code. 232.106 Section 232.106 Commodity and Securities Exchanges SECURITIES... Filer Manual section also may be a violation of the Computer Fraud and Abuse Act of 1986, as amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
...) which specify that Illinois' surface coating VOC emission limitations shall not apply to touch-up and... Administrative Code (Ill. Adm. Code) by adding a ``small container exemption'' for pleasure craft surface coating... technology (RACT) policy. DATES: This final rule is effective on May 20, 2013. ADDRESSES: EPA has established...
27 CFR 22.113 - Receipt of tax-free alcohol.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., it shall be placed in the storage facilities prescribed by § 22.91 and kept there under lock until withdrawn for use. Unless required by city or State fire code regulations or authorized by the appropriate... alcohol is transferred to “safety” containers in accordance with fire code regulations, the containers to...
27 CFR 22.113 - Receipt of tax-free alcohol.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., it shall be placed in the storage facilities prescribed by § 22.91 and kept there under lock until withdrawn for use. Unless required by city or State fire code regulations or authorized by the appropriate... alcohol is transferred to “safety” containers in accordance with fire code regulations, the containers to...
27 CFR 22.113 - Receipt of tax-free alcohol.
Code of Federal Regulations, 2014 CFR
2014-04-01
..., it shall be placed in the storage facilities prescribed by § 22.91 and kept there under lock until withdrawn for use. Unless required by city or State fire code regulations or authorized by the appropriate... alcohol is transferred to “safety” containers in accordance with fire code regulations, the containers to...
27 CFR 22.113 - Receipt of tax-free alcohol.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., it shall be placed in the storage facilities prescribed by § 22.91 and kept there under lock until withdrawn for use. Unless required by city or State fire code regulations or authorized by the appropriate... alcohol is transferred to “safety” containers in accordance with fire code regulations, the containers to...
27 CFR 22.113 - Receipt of tax-free alcohol.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., it shall be placed in the storage facilities prescribed by § 22.91 and kept there under lock until withdrawn for use. Unless required by city or State fire code regulations or authorized by the appropriate... alcohol is transferred to “safety” containers in accordance with fire code regulations, the containers to...
DNA as a Binary Code: How the Physical Structure of Nucleotide Bases Carries Information
ERIC Educational Resources Information Center
McCallister, Gary
2005-01-01
The DNA triplet code also functions as a binary code. Because double-ring compounds cannot bind to double-ring compounds in the DNA code, the sequence of bases classified simply as purines or pyrimidines can encode for smaller groups of possible amino acids. This is an intuitive approach to teaching the DNA code. (Contains 6 figures.)
Evaluation of a visual layering methodology for colour coding control room displays.
Van Laar, Darren; Deshe, Ofer
2002-07-01
Eighteen people participated in an experiment in which they were asked to search for targets on control room like displays which had been produced using three different coding methods. The monochrome coding method displayed the information in black and white only, the maximally discriminable method contained colours chosen for their high perceptual discriminability, the visual layers method contained colours developed from psychological and cartographic principles which grouped information into a perceptual hierarchy. The visual layers method produced significantly faster search times than the other two coding methods which did not differ significantly from each other. Search time also differed significantly for presentation order and for the method x order interaction. There was no significant difference between the methods in the number of errors made. Participants clearly preferred the visual layers coding method. Proposals are made for the design of experiments to further test and develop the visual layers colour coding methodology.
GANDALF - Graphical Astrophysics code for N-body Dynamics And Lagrangian Fluids
NASA Astrophysics Data System (ADS)
Hubber, D. A.; Rosotti, G. P.; Booth, R. A.
2018-01-01
GANDALF is a new hydrodynamics and N-body dynamics code designed for investigating planet formation, star formation and star cluster problems. GANDALF is written in C++, parallelized with both OPENMP and MPI and contains a PYTHON library for analysis and visualization. The code has been written with a fully object-oriented approach to easily allow user-defined implementations of physics modules or other algorithms. The code currently contains implementations of smoothed particle hydrodynamics, meshless finite-volume and collisional N-body schemes, but can easily be adapted to include additional particle schemes. We present in this paper the details of its implementation, results from the test suite, serial and parallel performance results and discuss the planned future development. The code is freely available as an open source project on the code-hosting website github at https://github.com/gandalfcode/gandalf and is available under the GPLv2 license.
NASA Astrophysics Data System (ADS)
Tanikawa, Ataru; Yoshikawa, Kohji; Okamoto, Takashi; Nitadori, Keigo
2012-02-01
We present a high-performance N-body code for self-gravitating collisional systems accelerated with the aid of a new SIMD instruction set extension of the x86 architecture: Advanced Vector eXtensions (AVX), an enhanced version of the Streaming SIMD Extensions (SSE). With one core of an Intel Core i7-2600 processor (8 MB cache and 3.40 GHz) based on the Sandy Bridge micro-architecture, we implemented a fourth-order Hermite scheme with individual timesteps (Makino and Aarseth, 1992) and achieved a performance of ~20 giga floating-point operations per second (GFLOPS) for double-precision accuracy, which is two times and five times higher than that of the previously developed code implemented with the SSE instructions (Nitadori et al., 2006b) and that of a code implemented without any explicit use of SIMD instructions on the same processor core, respectively. We have parallelized the code using the so-called NINJA scheme (Nitadori et al., 2006a) and achieved ~90 GFLOPS for a system containing more than N = 8192 particles with 8 MPI processes on four cores. We expect to achieve about 10 tera FLOPS (TFLOPS) for a self-gravitating collisional system with N ~ 10^5 on massively parallel systems with at most 800 cores with the Sandy Bridge micro-architecture. This performance will be comparable to that of Graphics Processing Unit (GPU) cluster systems, such as the one with about 200 Tesla C1070 GPUs (Spurzem et al., 2010). This paper offers an alternative to collisional N-body simulations with GRAPEs and GPUs.
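A minimal sketch of the force evaluation a fourth-order Hermite integrator needs, accumulating acceleration and its time derivative (jerk) by direct summation; plain NumPy here stands in for the hand-vectorized AVX kernels of the paper.

    import numpy as np

    def acc_jerk(pos, vel, mass, eps2=1e-6):
        """Direct-summation acceleration and jerk for all particles (softening eps2)."""
        n = len(mass)
        acc = np.zeros((n, 3))
        jerk = np.zeros((n, 3))
        for i in range(n):
            dr = pos - pos[i]                          # (n, 3) separations
            dv = vel - vel[i]
            r2 = np.einsum("ij,ij->i", dr, dr) + eps2
            r3inv = r2 ** -1.5
            r3inv[i] = 0.0                             # skip self-interaction
            rv = np.einsum("ij,ij->i", dr, dv)
            acc[i] = np.einsum("i,ij->j", mass * r3inv, dr)
            # jerk_i = sum_j m_j [ dv / r^3 - 3 (dr.dv) dr / r^5 ]
            jerk[i] = (np.einsum("i,ij->j", mass * r3inv, dv)
                       - 3.0 * np.einsum("i,ij->j", mass * rv * r2 ** -2.5, dr))
        return acc, jerk

    pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
    vel = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    a, j = acc_jerk(pos, vel, np.array([1.0, 1e-3]))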
Organizing and Typing Persistent Objects Within an Object-Oriented Framework
NASA Technical Reports Server (NTRS)
Madany, Peter W.; Campbell, Roy H.
1991-01-01
Conventional operating systems provide little or no direct support for the services required for an efficient persistent object system implementation. We have built a persistent object scheme using a customization and extension of an object-oriented operating system called Choices. Choices includes a framework for the storage of persistent data that is suited to the construction of both conventional file system and persistent object system. In this paper we describe three areas in which persistent object support differs from file system support: storage organization, storage management, and typing. Persistent object systems must support various sizes of objects efficiently. Customizable containers, which are themselves persistent objects and can be nested, support a wide range of object sizes in Choices. Collections of persistent objects that are accessed as an aggregate and collections of light-weight persistent objects can be clustered in containers that are nested within containers for larger objects. Automated garbage collection schemes are added to storage management and have a major impact on persistent object applications. The Choices persistent object store provides extensible sets of persistent object types. The store contains not only the data for persistent objects but also the names of the classes to which they belong and the code for the operation of the classes. Besides presenting persistent object storage organization, storage management, and typing, this paper discusses how persistent objects are named and used within the Choices persistent data/file system framework.
Praxis language reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, J.H.
1981-01-01
This document is a language reference manual for the programming language Praxis. The document contains the specifications that must be met by any compiler for the language. The Praxis language was designed for systems programming in real-time process applications. Goals for the language and its implementations are: (1) highly efficient code generated by the compiler; (2) program portability; (3) completeness, that is, all programming requirements can be met by the language without needing an assembler; and (4) separate compilation to aid in design and management of large systems. The language does not provide any facilities for input/output, stack and queue handling, string operations, parallel processing, or coroutine processing. These features can be implemented as routines in the language, using machine-dependent code to take advantage of facilities in the control environment on different machines.
The design plan of a VLSI single chip (255, 223) Reed-Solomon decoder
NASA Technical Reports Server (NTRS)
Hsu, I. S.; Shao, H. M.; Deutsch, L. J.
1987-01-01
The very large-scale integration (VLSI) architecture of a single chip (255, 223) Reed-Solomon decoder for decoding both errors and erasures is described. A decoding failure detection capability is also included in this system so that the decoder will recognize a failure to decode instead of introducing additional errors. This could happen whenever the received word contains too many errors and erasures for the code to correct. The number of transistors needed to implement this decoder is estimated at about 75,000 if the delay for received message is not included. This is in contrast to the older transform decoding algorithm which needs about 100,000 transistors. However, the transform decoder is simpler in architecture than the time decoder. It is therefore possible to implement a single chip (255, 223) Reed-Solomon decoder with today's VLSI technology. An implementation strategy for the decoder system is presented. This represents the first step in a plan to take advantage of advanced coding techniques to realize a 2.0 dB coding gain for future space missions.
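A minimal sketch of (255, 223) Reed-Solomon behaviour using the third-party Python reedsolo package (an assumption; the record describes a VLSI hardware decoder, not this library): 32 parity symbols correct up to 16 symbol errors, and the decoder raises an error, rather than returning a wrong message, when that budget is exceeded.

    from reedsolo import RSCodec, ReedSolomonError

    rsc = RSCodec(32)                       # 255-byte codewords carrying 223 data bytes
    codeword = rsc.encode(bytes(range(223)))

    corrupted = bytearray(codeword)
    for i in range(16):                     # 16 byte errors: at the correction limit
        corrupted[i] ^= 0xFF

    try:
        # Recent reedsolo versions return (message, full codeword, errata positions).
        result = rsc.decode(bytes(corrupted))
    except ReedSolomonError:
        # With more errors (or errors plus erasures) than the code can handle, the
        # decoder signals failure instead of silently returning a wrong message,
        # the "decoding failure detection" behaviour the record describes.
        result = None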
Sherrod, David R.; Keith, Mackenzie K.
2018-03-30
A substantial part of the U.S. Pacific Northwest is underlain by Cenozoic volcanic and continental sedimentary rocks and, where widespread, these strata form important aquifers. The legacy geologic mapping presented with this report contains new thematic categorization added to state digital compilations published by the U.S. Geological Survey for Oregon, California, Idaho, Nevada, Utah, and Washington (Ludington and others, 2005). Our additional coding is designed to allow rapid characterization, mainly for hydrogeologic purposes, of similar rocks and deposits within a boundary expanded slightly beyond that of the Pacific Northwest Volcanic Aquifer System study area. To be useful for hydrogeologic analysis and to be more statistically manageable, statewide compilations from Ludington and others (2005) were mosaicked into a regional map and then reinterpreted into four main categories on the basis of (1) age, (2) composition, (3) hydrogeologic grouping, and (4) lithologic pattern. The coding scheme emphasizes Cenozoic volcanic or volcanic-related rocks and deposits, and of primary interest are the codings for composition and age.
Adapting a Clinical Data Repository to ICD-10-CM through the use of a Terminology Repository
Cimino, James J.; Remennick, Lyubov
2014-01-01
Clinical data repositories frequently contain patient diagnoses coded with the International Classification of Diseases, Ninth Revision (ICD-9-CM). These repositories now need to accommodate data coded with the Tenth Revision (ICD-10-CM). Database users wish to retrieve relevant data regardless of the system by which they are coded. We demonstrate how a terminology repository (the Research Entities Dictionary or RED) serves as an ontology relating terms of both ICD versions to each other to support seamless version-independent retrieval from the Biomedical Translational Research Information System (BTRIS) at the National Institutes of Health. We make use of the Center for Medicare and Medicaid Services’ General Equivalence Mappings (GEMs) to reduce the modeling effort required to determine whether ICD-10-CM terms should be added to the RED as new concepts or as synonyms of existing concepts. A divide-and-conquer approach is used to develop integration heuristics that offer a satisfactory interim solution and facilitate additional refinement of the integration as time and resources allow. PMID:25954344
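A minimal sketch of version-independent retrieval through a shared concept layer; the GEM row, concept identifier, and record structure below are illustrative stand-ins, not actual CMS or NIH content.

    GEM_I9_TO_I10 = {"250.00": ["E11.9"]}          # illustrative GEM-style row

    # Version-neutral concepts with ICD-9 and ICD-10 codes attached as synonyms.
    CONCEPTS = {
        "RED:DM_TYPE2": {"icd9": {"250.00"}, "icd10": set()},
    }

    def attach_icd10_synonyms():
        """Use GEM-style rows to add ICD-10-CM codes as synonyms of existing concepts."""
        for icd9, icd10_list in GEM_I9_TO_I10.items():
            for syn in CONCEPTS.values():
                if icd9 in syn["icd9"]:
                    syn["icd10"].update(icd10_list)

    def records_for_concept(records, concept_id):
        """Retrieve records regardless of which ICD version coded the diagnosis."""
        syn = CONCEPTS[concept_id]
        wanted = syn["icd9"] | syn["icd10"]
        return [r for r in records if r["dx_code"] in wanted]

    attach_icd10_synonyms()
    records = [{"patient": 1, "dx_code": "250.00"}, {"patient": 2, "dx_code": "E11.9"}]
    both = records_for_concept(records, "RED:DM_TYPE2")    # returns both patients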
Make Movies out of Your Dynamical Simulations with OGRE!
NASA Astrophysics Data System (ADS)
Tamayo, Daniel; Douglas, R. W.; Ge, H. W.; Burns, J. A.
2013-10-01
We have developed OGRE, the Orbital GRaphics Environment, an open-source project comprising a graphical user interface that allows the user to view the output from several dynamical integrators (e.g., SWIFT) that are commonly used for academic work. One can interactively vary the display speed, rotate the view and zoom the camera. This makes OGRE a great tool for students or the general public to explore accurate orbital histories that may display interesting dynamical features, e.g. the destabilization of Solar System orbits under the Nice model, or interacting pairs of exoplanets. Furthermore, OGRE allows the user to choreograph sequences of transformations as the simulation is played to generate movies for use in public talks or professional presentations. The graphical user interface is coded using Qt to ensure portability across different operating systems. OGRE will run on Linux and Mac OS X. The program is available as a self-contained executable, or as source code that the user can compile. We are continually updating the code, and hope that people who find it useful will contribute to the development of new features.
Evolved Design, Integration, and Test of a Modular, Multi-Link, Spacecraft-Based Robotic Manipulator
2016-06-01
of the MATLAB code, the SPART model [24]. The portions of the SPART model relevant to this thesis are contained in (Appendices E –P). While the SPART...the kinematics and the dynamics of the system must be modeled and simulated numerically to understand how the system will behave for a given number... simulators with multiple-link robotic arms has been ongoing. B . STATE OF THE ART 1. An Overarching Context Space-based manipulators and the experimental
Computer Language For Optimization Of Design
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.; Lucas, Stephen H.
1991-01-01
SOL is computer language geared to solution of design problems. Includes mathematical modeling and logical capabilities of computer language like FORTRAN; also includes additional power of nonlinear mathematical programming methods at language level. SOL compiler takes SOL-language statements and generates equivalent FORTRAN code and system calls. Provides syntactic and semantic checking for recovery from errors and provides detailed reports containing cross-references to show where each variable used. Implemented on VAX/VMS computer systems. Requires VAX FORTRAN compiler to produce executable program.
Automated UMLS-Based Comparison of Medical Forms
Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard
2013-01-01
Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and especially item with these unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical, if item name, concept code and value domain are the same. Two items are called matching, if only concept code and value domain are the same. Two items are called similar, if their concept codes are the same, but the value domains are different. Based on these definitions an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view on clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
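A minimal sketch of the comparison rule stated above (identical, matching, similar); the item structure and concept codes are assumptions for illustration, not the compareODM implementation.

    def compare_items(a, b):
        """Each item: dict with 'name', 'concept' (UMLS code), and 'value_domain'."""
        if a["concept"] != b["concept"]:
            return "different"
        if a["value_domain"] != b["value_domain"]:
            return "similar"      # same concept code, different value domains
        if a["name"] == b["name"]:
            return "identical"    # name, concept code, and value domain all agree
        return "matching"         # concept code and value domain agree, names differ

    item1 = {"name": "Systolic BP", "concept": "CUI_SBP", "value_domain": "integer"}
    item2 = {"name": "SBP",         "concept": "CUI_SBP", "value_domain": "integer"}
    verdict = compare_items(item1, item2)   # -> "matching"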
Brugha, Ruairí; Crowe, Sophie
2015-05-20
The relevance and effectiveness of the World Health Organization's (WHO's) Global Code of Practice on the International Recruitment of Health Personnel is being reviewed in 2015. The Code, which is a set of ethical norms and principles adopted by the World Health Assembly (WHA) in 2010, urges members states to train and retain the health personnel they need, thereby limiting demand for international migration, especially from the under-staffed health systems in low- and middle-income countries. Most countries failed to submit a first report in 2012 on implementation of the Code, including those source countries whose health systems are most under threat from the recruitment of their doctors and nurses, often to work in 4 major destination countries: the United States, United Kingdom, Canada and Australia. Political commitment by source country Ministers of Health needs to have been achieved at the May 2015 WHA to ensure better reporting by these countries on Code implementation for it to be effective. This paper uses ethics and health systems perspectives to analyse some of the drivers of international recruitment. The balance of competing ethics principles, which are contained in the Code's articles, reflects a tension that was evident during the drafting of the Code between 2007 and 2010. In 2007-2008, the right of health personnel to migrate was seen as a preeminent principle by US representatives on the Global Council which co-drafted the Code. Consensus on how to balance competing ethical principles--giving due recognition on the one hand to the obligations of health workers to the countries that trained them and the need for distributive justice given the global inequities of health workforce distribution in relation to need, and the right to migrate on the other hand--was only possible after President Obama took office in January 2009. It is in the interests of all countries to implement the Global Code and not just those that are losing their health personnel through international recruitment, given that it calls on all member states "to educate, retain and sustain a health workforce that is appropriate for their (need) ..." (Article 5.4), to ensure health systems' sustainability. However, in some wealthy destination countries, this means tackling national inequities and poorly designed health workforce strategies that result in foreign-trained doctors being recruited to work among disadvantaged populations and in primary care settings, allowing domestically trained doctors work in more attractive hospital settings. © 2015 by Kerman University of Medical Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William
ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hongbin; Zhao, Haihua; Gleicher, Frederick Nathan
RELAP-7 is a nuclear systems safety analysis code being developed at the Idaho National Laboratory, and is the next-generation tool in the RELAP reactor safety/systems analysis application series. RELAP-7 development began in 2011 to support the Risk Informed Safety Margins Characterization (RISMC) Pathway of the Light Water Reactor Sustainability (LWRS) program. The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical methods, and physical models in order to provide capabilities needed for the RISMC methodology and to support nuclear power safety analysis. The code is being developed based on Idaho National Laboratory's modern scientific software development framework, MOOSE (the Multi-Physics Object-Oriented Simulation Environment). The initial development goal of the RELAP-7 approach focused primarily on the development of an implicit algorithm capable of strong (nonlinear) coupling of the dependent hydrodynamic variables contained in the 1-D/2-D flow models with the various 0-D system reactor components that compose various boiling water reactor (BWR) and pressurized water reactor nuclear power plants (NPPs). During Fiscal Year (FY) 2015, the RELAP-7 code was further improved with expanded capability to support analysis of BWR and pressurized water reactor NPPs. The accumulator model has been developed. The code has also been coupled with other MOOSE-based applications, such as the neutronics code RattleSnake and the fuel performance code BISON, to perform multiphysics analysis. A major design requirement for the implicit algorithm in RELAP-7 is that it is capable of second-order discretization accuracy in both space and time, which eliminates the traditional first-order approximation errors. Second-order temporal accuracy is achieved by a second-order backward temporal difference, and the one-dimensional second-order accurate spatial discretization is achieved with the Galerkin approximation of Lagrange finite elements. During FY 2015, numerical verification work was performed to confirm that the RELAP-7 code indeed achieves second-order accuracy in both time and space for single-phase models at the system level.
New PANDA Tests to Investigate Effects of Light Gases on Passive Safety Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paladino, D.; Auban, O.; Candreia, P.
The large-scale thermal-hydraulic PANDA facility (located at PSI in Switzerland) has been used over the last few years for investigating different passive decay-heat removal systems and containment phenomena for the next generation of light water reactors (Simplified Boiling Water Reactor: SBWR; European Simplified Boiling Water Reactor: ESBWR; Siedewasserreaktor: SWR-1000). Currently, as part of the European Commission's 5th EURATOM Framework Programme project 'Testing and Enhanced Modelling of Passive Evolutionary Systems Technology for Containment Cooling' (TEMPEST), a new series of tests is being planned in the PANDA facility to experimentally investigate the distribution of non-condensable gases inside the containment and their effect on the performance of the 'Passive Containment Cooling System' (PCCS). Hydrogen release caused by the metal-water reaction in the case of a postulated severe accident will be simulated in PANDA by injecting helium into the reactor pressure vessel. In order to provide suitable data for Computational Fluid Dynamics (CFD) code assessment and improvement, the instrumentation in PANDA has been upgraded for the new tests. In the present paper, a detailed discussion is given of the new PANDA tests to be performed to investigate the effects of light gases on passive safety systems. The tests are scheduled for the first half of the year 2002. (authors)
Idaho National Engineering Laboratory code assessment of the Rocky Flats transuranic waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-07-01
This report is an assessment of the content codes associated with transuranic waste shipped from the Rocky Flats Plant in Golden, Colorado, to INEL. The primary objective of this document is to characterize and describe the transuranic wastes shipped to INEL from Rocky Flats by item description code (IDC). This information will aid INEL in determining if the waste meets the waste acceptance criteria (WAC) of the Waste Isolation Pilot Plant (WIPP). The waste covered by this content code assessment was shipped from Rocky Flats between 1985 and 1989. These years coincide with the dates for information available in the Rocky Flats Solid Waste Information Management System (SWIMS). The majority of waste shipped during this time was certified to the existing WIPP WAC. This waste is referred to as precertified waste. Reassessment of these precertified waste containers is necessary because of changes in the WIPP WAC. To accomplish this assessment, the analytical and process knowledge available on the various IDCs used at Rocky Flats was evaluated. Rocky Flats sources for this information include employee interviews, SWIMS, the Transuranic Waste Certification Program, the Transuranic Waste Inspection Procedure, Backlog Waste Baseline Books, the WIPP Experimental Waste Characterization Program (headspace analysis), and other related documents, procedures, and programs. Summaries are provided of: (a) certification information, (b) waste description, (c) generation source, (d) recovery method, (e) waste packaging and handling information, (f) container preparation information, (g) assay information, (h) inspection information, (i) analytical data, and (j) RCRA characterization.
Optical encryption and QR codes: secure and noise-free information retrieval.
Barrera, John Fredy; Mira, Alejandro; Torroba, Roberto
2013-03-11
We introduce for the first time the concept of an information "container" used before a standard optical encrypting procedure. The "container" selected is a QR code, which offers the main advantage of being tolerant to pollutant speckle noise. The QR code can also be read by smartphones, a massively used device. Additionally, the QR code adds another security step to the benefits the optical methods provide. The QR code is generated by means of freely available software. The concept development proves that the speckle noise polluting the outcomes of normal optical encrypting procedures can be avoided, making the adoption of these techniques more attractive. Actual smartphone-collected results are shown to validate our proposal.
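As a rough illustration of the "container" step only, the sketch below packs an example message into a QR code image using the third-party qrcode package (with Pillow for image output); the payload string and file name are invented, and the optical encryption stage that would follow is not shown.

```python
# Minimal sketch of the "information container" step only: the message is packed
# into a QR code image that would then feed the optical encryption stage (not
# shown). Requires the third-party packages qrcode and Pillow
# (pip install qrcode[pil]); payload and file name are example values.
import qrcode

message = "PATIENT-RECORD-0042"            # example payload, not from the paper
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
qr.add_data(message)
qr.make(fit=True)
img = qr.make_image(fill_color="black", back_color="white")
img.save("container_qr.png")               # this image becomes the encryption input
```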
Wilkinson, Karl A; Hine, Nicholas D M; Skylaris, Chris-Kriton
2014-11-11
We present a hybrid MPI-OpenMP implementation of Linear-Scaling Density Functional Theory within the ONETEP code. We illustrate its performance on a range of high performance computing (HPC) platforms comprising shared-memory nodes with fast interconnect. Our work has focused on applying OpenMP parallelism to the routines which dominate the computational load, attempting where possible to parallelize different loops from those already parallelized within MPI. This includes 3D FFT box operations, sparse matrix algebra operations, calculation of integrals, and Ewald summation. While the underlying numerical methods are unchanged, these developments represent significant changes to the algorithms used within ONETEP to distribute the workload across CPU cores. The new hybrid code exhibits much-improved strong scaling relative to the MPI-only code and permits calculations with a much higher ratio of cores to atoms. These developments result in a significantly shorter time to solution than was possible using MPI alone and facilitate the application of the ONETEP code to systems larger than previously feasible. We illustrate this with benchmark calculations from an amyloid fibril trimer containing 41,907 atoms. We use the code to study the mechanism of delamination of cellulose nanofibrils when undergoing sonification, a process which is controlled by a large number of interactions that collectively determine the structural properties of the fibrils. Many energy evaluations were needed for these simulations, and as these systems comprise up to 21,276 atoms this would not have been feasible without the developments described here.
General Model for Multicomponent Ablation Thermochemistry
NASA Technical Reports Server (NTRS)
Milos, Frank S.; Marschall, Jochen; Rasky, Daniel J. (Technical Monitor)
1994-01-01
A previous paper (AIAA 94-2042) presented equations and numerical procedures for modeling the thermochemical ablation and pyrolysis of thermal protection materials which contain multiple surface species. This work describes modifications and enhancements to the Multicomponent Ablation Thermochemistry (MAT) theory and code for application to the general case which includes surface area constraints, rate limited surface reactions, and non-thermochemical mass loss (failure). Detailed results and comparisons with data are presented for the Shuttle Orbiter reinforced carbon-carbon oxidation protection system which contains a mixture of sodium silicate (Na2SiO3), silica (SiO2), silicon carbide (SiC), and carbon (C).
J Genes for Heavy Chain Immunoglobulins of Mouse
NASA Astrophysics Data System (ADS)
Newell, Nanette; Richards, Julia E.; Tucker, Philip W.; Blattner, Frederick R.
1980-09-01
A 15.8-kilobase pair fragment of BALB/c mouse liver DNA, cloned in the Charon 4Aλ phage vector system, was shown to contain the μ heavy chain constant region (CHμ ) gene for the mouse immunoglobulin M. In addition, this fragment of DNA contains at least two J genes, used to code for the carboxyl terminal portion of heavy chain variable regions. These genes are located in genomic DNA about eight kilobase pairs to the 5' side of the CHμ gene. The complete nucleotide sequence of a 1120-base pair stretch of DNA that includes the two J genes has been determined.
Validation Results for LEWICE 2.0. [Supplement
NASA Technical Reports Server (NTRS)
Wright, William B.; Rutkowski, Adam
1999-01-01
Two CD-ROMs contain experimental ice shapes and code prediction used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes for both experiment and for LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.
Evaluation of Agency Non-Code Layered Pressure Vessels (LPVs)
NASA Technical Reports Server (NTRS)
Prosser, William H.
2014-01-01
In coordination with the Office of Safety and Mission Assurance and the respective Center Pressure System Managers (PSMs), the NASA Engineering and Safety Center (NESC) was requested to formulate a consensus draft proposal for the development of additional testing and analysis methods to establish the technical validity, and any limitation thereof, for the continued safe operation of facility non-code layered pressure vessels. The PSMs from each NASA Center were asked to participate as part of the assessment team by providing, collecting, and reviewing data regarding current operations of these vessels. This report contains the outcome of the assessment and the findings, observations, and NESC recommendations to the Agency and individual NASA Centers.
Evaluation of Agency Non-Code Layered Pressure Vessels (LPVs). Corrected Copy, Aug. 25, 2014
NASA Technical Reports Server (NTRS)
Prosser, William H.
2014-01-01
In coordination with the Office of Safety and Mission Assurance and the respective Center Pressure System Managers (PSMs), the NASA Engineering and Safety Center (NESC) was requested to formulate a consensus draft proposal for the development of additional testing and analysis methods to establish the technical validity, and any limitation thereof, for the continued safe operation of facility non-code layered pressure vessels. The PSMs from each NASA Center were asked to participate as part of the assessment team by providing, collecting, and reviewing data regarding current operations of these vessels. This report contains the outcome of the assessment and the findings, observations, and NESC recommendations to the Agency and individual NASA Centers.
PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; Suhs, Norman; Dietz, William; Rogers, Stuart; Nash, Steve; Chan, William; Tramel, Robert; Onufer, Jeff
2006-01-01
This viewgraph presentation reviews the use and requirements of PEGASUS 5. PEGASUS 5 is a code which performs a pre-processing step for the overset CFD method. The code prepares the overset volume grids for the flow solver by computing the domain connectivity database and blanking out grid points which are contained inside a solid body. PEGASUS 5 successfully automates most of the overset process. It leads to a dramatic reduction in user input over previous generations of overset software. It can also lead to an order of magnitude reduction in both turn-around time and user expertise requirements. It is not, however, a "black-box" procedure; care must be taken to examine the resulting grid system.
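The hole-blanking step mentioned above can be illustrated with a much-simplified stand-in: mark grid points falling inside a solid body so the flow solver skips them. The sketch below uses a circle of assumed radius as the "body"; PEGASUS 5's actual hole-cutting and connectivity logic is far more general.

```python
# Illustrative sketch of the "blanking" idea only: flag grid points inside a
# solid body so the flow solver can ignore them. Not PEGASUS 5 code; the body
# here is just a circle with an assumed radius.
import numpy as np

def blank_inside_circle(x, y, xc=0.0, yc=0.0, radius=0.5):
    """Return an iblank array: 1 = active field point, 0 = blanked (inside body)."""
    inside = (x - xc) ** 2 + (y - yc) ** 2 < radius ** 2
    return np.where(inside, 0, 1)

# Example: a small Cartesian overset grid patch.
x, y = np.meshgrid(np.linspace(-1, 1, 11), np.linspace(-1, 1, 11))
iblank = blank_inside_circle(x, y)
print("blanked points:", int((iblank == 0).sum()), "of", iblank.size)
```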
Implicit and semi-implicit schemes in the Versatile Advection Code: numerical tests
NASA Astrophysics Data System (ADS)
Toth, G.; Keppens, R.; Botchev, M. A.
1998-04-01
We describe and evaluate various implicit and semi-implicit time integration schemes applied to the numerical simulation of hydrodynamical and magnetohydrodynamical problems. The schemes were implemented recently in the software package Versatile Advection Code, which uses modern shock capturing methods to solve systems of conservation laws with optional source terms. The main advantage of implicit solution strategies over explicit time integration is that the restrictive constraint on the allowed time step can be (partially) eliminated, thus the computational cost is reduced. The test problems cover one and two dimensional, steady state and time accurate computations, and the solutions contain discontinuities. For each test, we confront explicit with implicit solution strategies.
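The stability argument can be made concrete with a toy 1-D linear advection test: an explicit forward Euler upwind step run at four times the CFL limit grows without bound, while the corresponding backward Euler (implicit) step stays bounded at the same time step. The sketch below is didactic only and is not Versatile Advection Code source; grid size, speed, and step count are arbitrary.

```python
# Toy comparison of explicit vs implicit upwind advection, illustrating why
# implicit time integration removes the explicit CFL time-step restriction.
import numpy as np

n, a = 50, 1.0                         # grid points, advection speed (arbitrary)
dx = 1.0 / n
x = np.linspace(0.0, 1.0, n, endpoint=False)
u0 = np.exp(-200.0 * (x - 0.5) ** 2)   # smooth initial pulse

dt = 4.0 * dx / a                      # CFL number 4: forbidden for the explicit scheme
S = np.roll(np.eye(n), -1, axis=1)     # periodic shift: (S u)_i = u_{i-1}
D = (np.eye(n) - S) / dx               # first-order upwind difference d/dx

u_exp, u_imp = u0.copy(), u0.copy()
A = np.eye(n) + a * dt * D             # backward Euler (implicit) operator
for _ in range(25):
    u_exp = u_exp - a * dt * (D @ u_exp)     # explicit step: amplifies short waves
    u_imp = np.linalg.solve(A, u_imp)        # implicit step: unconditionally stable
print(f"explicit max |u| after 25 steps: {np.abs(u_exp).max():.3e}")
print(f"implicit max |u| after 25 steps: {np.abs(u_imp).max():.3e}")
```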
Digital 8-DPSK Modem For Trellis-Coded Communication
NASA Technical Reports Server (NTRS)
Jedrey, T. C.; Lay, N. E.; Rafferty, W.
1989-01-01
Digital real-time modem processes octuple differential-phase-shift-keyed trellis-coded modulation. Intended for use in communicating data at rate up to 4.8 kb/s in land-mobile satellite channel (Rician fading) of 5-kHz bandwidth at carrier frequency of 1 to 2 GHz. Modulator and demodulator contain digital signal processors performing modem functions. Design flexible in that functions altered via software. Modem successfully tested and evaluated in both laboratory and field experiments, including recent full-scale satellite experiment. In all cases, modem performed within 1 dB of theory. Other communication systems benefitting from this type of modem include land mobile (without satellites), paging, digitized voice, and frequency-modulation subcarrier data broadcasting.
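For readers unfamiliar with the modulation, the sketch below shows plain differential 8-PSK symbol mapping and recovery (no trellis coding, pulse shaping, or fading channel), with an invented symbol sequence; it illustrates only the differential-phase idea, not the modem's actual signal processing.

```python
# Didactic sketch of differential 8-PSK only (no trellis coding, no channel):
# 3-bit symbols select a phase increment, the transmitted phase is the running
# sum, and the receiver recovers symbols from successive phase differences.
import numpy as np

symbols = np.array([0, 3, 7, 1, 5, 2])            # example 3-bit symbols (0..7)
phase = np.cumsum(symbols * (2 * np.pi / 8))      # differential encoding
tx = np.exp(1j * phase)                           # unit-amplitude 8-DPSK samples

# Non-coherent detection: the phase jump between successive samples carries the data.
diffs = np.angle(tx[1:] * np.conj(tx[:-1]))
rx = np.round(np.mod(diffs, 2 * np.pi) / (2 * np.pi / 8)).astype(int) % 8
print("sent:", symbols[1:].tolist(), "received:", rx.tolist())
```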
Shapiro, James A
2016-06-08
The 21st century genomics-based analysis of evolutionary variation reveals a number of novel features impossible to predict when Dobzhansky and other evolutionary biologists formulated the neo-Darwinian Modern Synthesis in the middle of the last century. These include three distinct realms of cell evolution; symbiogenetic fusions forming eukaryotic cells with multiple genome compartments; horizontal organelle, virus and DNA transfers; functional organization of proteins as systems of interacting domains subject to rapid evolution by exon shuffling and exonization; distributed genome networks integrated by mobile repetitive regulatory signals; and regulation of multicellular development by non-coding lncRNAs containing repetitive sequence components. Rather than single gene traits, all phenotypes involve coordinated activity by multiple interacting cell molecules. Genomes contain abundant and functional repetitive components in addition to the unique coding sequences envisaged in the early days of molecular biology. Combinatorial coding, plus the biochemical abilities cells possess to rearrange DNA molecules, constitute a powerful toolbox for adaptive genome rewriting. That is, cells possess "Read-Write Genomes" they alter by numerous biochemical processes capable of rapidly restructuring cellular DNA molecules. Rather than viewing genome evolution as a series of accidental modifications, we can now study it as a complex biological process of active self-modification.
Shapiro, James A.
2016-01-01
The 21st century genomics-based analysis of evolutionary variation reveals a number of novel features impossible to predict when Dobzhansky and other evolutionary biologists formulated the neo-Darwinian Modern Synthesis in the middle of the last century. These include three distinct realms of cell evolution; symbiogenetic fusions forming eukaryotic cells with multiple genome compartments; horizontal organelle, virus and DNA transfers; functional organization of proteins as systems of interacting domains subject to rapid evolution by exon shuffling and exonization; distributed genome networks integrated by mobile repetitive regulatory signals; and regulation of multicellular development by non-coding lncRNAs containing repetitive sequence components. Rather than single gene traits, all phenotypes involve coordinated activity by multiple interacting cell molecules. Genomes contain abundant and functional repetitive components in addition to the unique coding sequences envisaged in the early days of molecular biology. Combinatorial coding, plus the biochemical abilities cells possess to rearrange DNA molecules, constitute a powerful toolbox for adaptive genome rewriting. That is, cells possess “Read–Write Genomes” they alter by numerous biochemical processes capable of rapidly restructuring cellular DNA molecules. Rather than viewing genome evolution as a series of accidental modifications, we can now study it as a complex biological process of active self-modification. PMID:27338490
ERIC Educational Resources Information Center
US Department of Commerce, 2004
2004-01-01
A census of governments is taken at 5-year intervals as required by law under Title 13, United States Code, Section 161. This 2002 census, similar to those taken since 1957, covers three major subject fields: government organization; public employment; and government finances. This document contains six parts that cover the entire range of state…
The Top 100. The Fastest Growing Careers for the 21st Century. Revised Edition.
ERIC Educational Resources Information Center
1998
This publication presents 100 careers the U.S. Department of Labor and other sources project as the fastest growing through the year 2006. A shaded bar on the bottom of the title page of each article contains a listing of codes for three commonly used government classification systems. Shaded bars at the bottom of other pages provide quick facts.…
Cryptanalysis of the Sodark Family of Cipher Algorithms
2017-09-01
...software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements... The ... second- and third-generation automatic link establishment (ALE) systems for high frequency radios. Radios utilizing ALE technology are in use by a...
Yerrapragada, Shaila; Shukla, Animesh; Hallsworth-Pepin, Kymberlie; Choi, Kwangmin; Wollam, Aye; Clifton, Sandra; Qin, Xiang; Muzny, Donna; Raghuraman, Sriram; Ashki, Haleh; Uzman, Akif; Highlander, Sarah K.; Fryszczyn, Bartlomiej G.; Fox, George E.; Tirumalai, Madhan R.; Liu, Yamei; Kim, Sun
2015-01-01
Tolypothrix sp. PCC 7601 is a freshwater filamentous cyanobacterium with complex responses to environmental conditions. Here, we present its 9.96-Mbp draft genome sequence, containing 10,065 putative protein-coding sequences, including 305 predicted two-component system proteins and 27 putative phytochrome-class photoreceptors, the most such proteins in any sequenced genome. PMID:25953173
Clayman, Marla L.; Makoul, Gregory; Harper, Maya M.; Koby, Danielle G.; Williams, Adam R.
2012-01-01
Objectives Describe the development and refinement of a scheme, Detail of Essential Elements and Participants in Shared Decision Making (DEEP-SDM), for coding Shared Decision Making (SDM) while reporting on the characteristics of decisions in a sample of patients with metastatic breast cancer. Methods The Evidence-Based Patient Choice instrument was modified to reflect Makoul and Clayman’s Integrative Model of SDM. Coding was conducted on video recordings of 20 women at the first visit with their medical oncologists after suspicion of disease progression. Noldus Observer XT v.8, a video coding software platform, was used for coding. Results The sample contained 80 decisions (range: 1-11), divided into 150 decision making segments. Most decisions were physician-led, although patients and physicians initiated similar numbers of decision-making conversations. Conclusion DEEP-SDM facilitates content analysis of encounters between women with metastatic breast cancer and their medical oncologists. Despite the fractured nature of decision making, it is possible to identify decision points and to code each of the Essential Elements of Shared Decision Making. Further work should include application of DEEP-SDM to non-cancer encounters. Practice Implications: A better understanding of how decisions unfold in the medical encounter can help inform the relationship of SDM to patient-reported outcomes. PMID:22784391
ERIC Educational Resources Information Center
Gordon, Wanda; Sork, Thomas J.
2001-01-01
Replicating an Indiana study, 261 responses from British Columbia adult educators revealed a high degree of support for codes of ethics and identified ethical dilemmas in practice. Half currently operated under a code. Responses to whether codes should have a regulatory function were mixed. (Contains 44 references.) (SK)
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2012 CFR
2012-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2014 CFR
2014-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2013 CFR
2013-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2011 CFR
2011-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2010 CFR
2010-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
[Representation of knowledge in respiratory medicine: ontology should help the coding process].
Blanc, F-X; Baneyx, A; Charlet, J; Housset, B
2010-09-01
Access to medical knowledge is a major issue for health professionals and requires the development of terminologies. The objective of the reported work was to construct an ontology of respiratory medicine, i.e., an organized and formalized terminology composed of specific knowledge. The purpose is to help the medico-economic coding process and to represent the relevant knowledge about the patient. Our research covers the whole life cycle of an ontology, from the development of a methodology, to building the ontology from texts, to its use in an operational system. A computerized tool based on the ontology supports both medico-economic coding and a graphical medical coding; the latter will be used to index hospital reports. Our ontology contains 1913 concepts and covers all the knowledge included in the PMSI part of the SPLF thesaurus. Our tool has been evaluated and showed a recall of 80% and an accuracy of 85% for the medico-economic coding. The work presented in this paper justifies the approach that has been used. It must be continued on a large scale to validate our coding principles and the possibility of querying patient reports for clinical research purposes. Copyright © 2010. Published by Elsevier Masson SAS.
Decoding DNA labels by melting curve analysis using real-time PCR.
Balog, József A; Fehér, Liliána Z; Puskás, László G
2017-12-01
Synthetic DNA has been used as an authentication code for a diverse number of applications. However, existing decoding approaches are based on either DNA sequencing or the determination of DNA length variations. Here, we present a simple alternative protocol for labeling different objects using a small number of short DNA sequences that differ in their melting points. Code amplification and decoding can be done in two steps using quantitative PCR (qPCR). To obtain a DNA barcode with high complexity, we defined 8 template groups, each having 4 different DNA templates, yielding 15^8 (>2.5 billion) combinations of different individual melting temperature (Tm) values and corresponding ID codes. The reproducibility and specificity of the decoding were confirmed by using the most complex template mixture, which had 32 different products in 8 groups with different Tm values. The industrial applicability of our protocol was also demonstrated by labeling a drone with an oil-based paint containing a predefined DNA code, which was then successfully decoded. The method presented here consists of a simple code system based on a small number of synthetic DNA sequences and a cost-effective, rapid decoding protocol using a few qPCR reactions, enabling a wide range of authentication applications.
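The combinatorics and the Tm-based decoding step can be sketched as follows; the reference melting temperatures below are invented placeholders rather than the paper's template set, and real decoding works on measured melting-curve peaks from the qPCR instrument.

```python
# Sketch of the combinatorial bookkeeping and decoding only (the Tm values below
# are invented, not the paper's templates): 8 groups x 4 templates, each template
# present or absent with at least one per group, gives (2**4 - 1)**8 = 15**8 codes.
print("code space size:", (2 ** 4 - 1) ** 8)      # 2 562 890 625 (> 2.5 billion)

# Hypothetical reference table: group -> {template id: melting temperature (C)}.
reference = {g: {t: 72.0 + 6 * g + 1.5 * t for t in range(4)} for g in range(8)}

def decode(observed_tms, tolerance=0.5):
    """Map observed melting peaks to (group, template) IDs within a Tm tolerance."""
    hits = []
    for tm in observed_tms:
        for g, templates in reference.items():
            for t, ref_tm in templates.items():
                if abs(tm - ref_tm) <= tolerance:
                    hits.append((g, t))
    return sorted(hits)

print(decode([72.0, 79.5, 60.0]))    # third peak matches no reference template
```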
Telepharmacy and bar-code technology in an i.v. chemotherapy admixture area.
O'Neal, Brian C; Worden, John C; Couldry, Rick J
2009-07-01
A program using telepharmacy and bar-code technology to increase the presence of the pharmacist at a critical risk point during chemotherapy preparation is described. Telepharmacy hardware and software were acquired, and an inspection camera was placed in a biological safety cabinet to allow the pharmacy technician to take digital photographs at various stages of the chemotherapy preparation process. Once the pharmacist checks the medication vials' agreement with the work label, the technician takes the product into the biological safety cabinet, where the appropriate patient is selected from the pending work list, a queue of patient orders sent from the pharmacy information system. The technician then scans the bar code on the vial. Assuming the bar code matches, the technician photographs the work label, vials, diluents and fluids to be used, and the syringe (before injecting the contents into the bag) along with the vial. The pharmacist views all images as a part of the final product-checking process. This process allows the pharmacist to verify that the correct quantity of medication was transferred from the primary source to a secondary container without being physically present at the time of transfer. Telepharmacy and bar coding provide a means to improve the accuracy of chemotherapy preparation by decreasing the likelihood of using the incorrect product or quantity of drug. The system facilitates the reading of small product labels and removes the need for a pharmacist to handle contaminated syringes and vials when checking the final product.
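The bar-code check at the heart of the workflow reduces to a simple membership test: the scanned product identifier must appear on the selected patient's pending order before preparation continues. The sketch below uses invented NDC values and field names purely for illustration; it is not the pharmacy information system's actual interface.

```python
# Illustrative sketch of the bar-code verification step only (NDC values and
# field names are invented): the scanned vial must match a product on the
# selected patient's order before preparation proceeds.
pending_order = {
    "patient_id": "12345",
    "allowed_ndcs": {"00004-1100-20", "00069-0196-01"},
}

def verify_scan(scanned_ndc: str, order: dict) -> bool:
    """Return True if the scanned product belongs on this patient's order."""
    return scanned_ndc in order["allowed_ndcs"]

print(verify_scan("00069-0196-01", pending_order))   # True  -> proceed to photograph
print(verify_scan("99999-0000-00", pending_order))   # False -> stop and alert pharmacist
```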
The Role of Ontologies in Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.
2004-01-01
Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A back-end then compiles this further down into a concrete target programming language of choice. A core engine applies schemas on the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem, this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. They are large (in total around 100 kLoC of Prolog), knowledge-intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here, we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked.
Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.
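A deliberately tiny sketch of the schema idea is given below: a schema pairs an applicability condition with a refinement that rewrites a specification toward code, and synthesis tries schemas in turn (a real system such as AUTOBAYES recurses over sub-specifications and is implemented in Prolog, not Python; the schema names and spec fields here are invented).

```python
# A deliberately tiny sketch of the schema idea (not the AUTOBAYES/AUTOFILTER
# implementation): a schema pairs an applicability test with a refinement that
# rewrites a specification toward code.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Schema:
    name: str
    applies: Callable[[dict], bool]          # applicability condition on the spec
    refine: Callable[[dict], dict]           # rewrites the spec / emits code

schemas = [
    Schema("triangular-solve",
           lambda s: s.get("task") == "solve" and s.get("matrix") == "triangular",
           lambda s: {**s, "code": "back_substitution(A, b)"}),
    Schema("general-solve",
           lambda s: s.get("task") == "solve",
           lambda s: {**s, "code": "lu_factor_then_solve(A, b)"}),
]

def synthesize(spec: dict) -> dict:
    """Apply the first applicable schema; a real system recurses on sub-specs."""
    for schema in schemas:
        if schema.applies(spec):
            return schema.refine(spec)
    raise ValueError("no applicable schema")

print(synthesize({"task": "solve", "matrix": "triangular"})["code"])
```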
A portable platform for accelerated PIC codes and its application to GPUs using OpenACC
NASA Astrophysics Data System (ADS)
Hariri, F.; Tran, T. M.; Jocksch, A.; Lanti, E.; Progsch, J.; Messmer, P.; Brunner, S.; Gheller, C.; Villard, L.
2016-10-01
We present a portable platform, called PIC_ENGINE, for accelerating Particle-In-Cell (PIC) codes on heterogeneous many-core architectures such as Graphics Processing Units (GPUs). The aim of this development is efficient simulations on future exascale systems by allowing different parallelization strategies depending on the application problem and the specific architecture. To this end, this platform contains the basic steps of the PIC algorithm and has been designed as a test bed for different algorithmic options and data structures. Among the architectures that this engine can explore, particular attention is given here to systems equipped with GPUs. The study demonstrates that our portable PIC implementation based on the OpenACC programming model can achieve performance closely matching theoretical predictions. Using the Cray XC30 system, Piz Daint, at the Swiss National Supercomputing Centre (CSCS), we show that PIC_ENGINE running on an NVIDIA Kepler K20X GPU can outperform the one on an Intel Sandy Bridge 8-core CPU by a factor of 3.4.
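The "basic steps of the PIC algorithm" referred to above (charge deposition, field solve, field gather, particle push) can be illustrated with a compact 1-D electrostatic example. The sketch below is not PIC_ENGINE code and targets no accelerator; parameters and units are arbitrary.

```python
# Compact, illustrative 1-D electrostatic PIC step (deposit, field solve, gather,
# push) using NumPy. Not PIC_ENGINE code; parameters and units are arbitrary.
import numpy as np

ng, np_part, L, dt = 64, 10_000, 1.0, 0.05
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, np_part)                 # particle positions
v = rng.normal(0, 0.1, np_part)                # particle velocities
q_over_m, weight = -1.0, L / np_part           # electrons plus a fixed ion background

def pic_step(x, v):
    # 1) Charge deposition (nearest-grid-point weighting).
    cells = (x / dx).astype(int) % ng
    n = np.bincount(cells, minlength=ng) * weight / dx      # particle density, mean ~1
    rho = 1.0 - n                                            # net charge with background
    # 2) Field solve: Poisson equation d^2 phi / dx^2 = -rho via FFT.
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E = np.real(np.fft.ifft(-1j * k * phi_k))                # E = -dphi/dx
    # 3) Gather field at particle cells and 4) push particles.
    v_new = v + q_over_m * E[cells] * dt
    x_new = (x + v_new * dt) % L
    return x_new, v_new

x, v = pic_step(x, v)
print("mean kinetic energy:", 0.5 * np.mean(v ** 2))
```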
Uranus: a rapid prototyping tool for FPGA embedded computer vision
NASA Astrophysics Data System (ADS)
Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.
2007-01-01
The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate, and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded software operators and provides the necessary support to read and display image sequences as well as video files. The user can employ the previously compiled operators in a high-level processing chain and code new operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.
Fast neutron counting in a mobile, trailer-based search platform
NASA Astrophysics Data System (ADS)
Hayward, Jason P.; Sparger, John; Fabris, Lorenzo; Newby, Robert J.
2017-12-01
Trailer-based search platforms for detection of radiological and nuclear threats are often based upon coded aperture gamma-ray imaging, because this method can be rendered insensitive to local variations in gamma background while still localizing the source well. Since gamma source emissions are rather easily shielded, in this work we consider the addition of fast neutron counting to a mobile platform for detection of sources containing Pu. A proof-of-concept system capable of combined gamma and neutron coded-aperture imaging was built inside of a trailer and used to detect a 252Cf source while driving along a roadway. Neutron detector types employed included EJ-309 in a detector plane and EJ-299-33 in a front mask plane. While the 252Cf gamma emissions were not readily detectable while driving by at 16.9 m standoff, the neutron emissions can be detected while moving. Mobile detection performance for this system and a scaled-up system design are presented, along with implications for threat sensing.
NASA Technical Reports Server (NTRS)
Myers, David E.; Martin, Carl J.; Blosser, Max L.
2000-01-01
A parametric weight assessment of advanced metallic panel, ceramic blanket, and ceramic tile thermal protection systems (TPS) was conducted using an implicit, one-dimensional (1-D) finite element sizing code. This sizing code contained models to account for coatings, fasteners, adhesives, and strain isolation pads. Atmospheric entry heating profiles for two vehicles, the Access to Space (ATS) vehicle and a proposed Reusable Launch Vehicle (RLV), were used to ensure that the trends were not unique to a certain trajectory. Ten TPS concepts were compared for a range of applied heat loads and substructural heat capacities to identify general trends. This study found that the blanket TPS concepts have the lightest weights over the majority of their applicable ranges, and that current technology ceramic tiles and metallic TPS concepts have similar weights. A proposed, state-of-the-art metallic system, which uses a higher temperature alloy and efficient multilayer insulation, was predicted to be significantly lighter than the ceramic tile systems and to approach blanket TPS weights for higher integrated heat loads.
General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets
NASA Technical Reports Server (NTRS)
Marchen, Luis F.
2011-01-01
The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
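The roll-up performed in the spreadsheet stage can be illustrated schematically: per-optic contrast contributions are formed from sensitivity coefficients and assumed motions, then summed. All numbers, optic names, and the simple quadratic combination rule below are invented for illustration and are not the CPEB tool's data or its exact bookkeeping.

```python
# Schematic error-budget roll-up (invented numbers and optic names; not the
# CPEB tool's data or its exact combination rules).
import numpy as np

optics = ["primary", "secondary", "fold-mirror"]
# Sensitivity: contrast contribution per nm^2 of motion variance (assumed values).
sensitivity = np.array([4.0e-12, 1.5e-12, 0.8e-12])
# Allocated RMS motions in nanometers for a thermal + jitter case (assumed values).
motions_nm = np.array([0.50, 0.80, 1.20])

contributions = sensitivity * motions_nm ** 2     # incoherent, quadratic terms
total = contributions.sum()
for name, c in zip(optics, contributions):
    print(f"{name:12s} contrast contribution: {c:.2e}")
print(f"total scattered-light contrast: {total:.2e}")
```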
Application of Gaussian Process Modeling to Analysis of Functional Unreliability
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Youngblood
2014-06-01
This paper applies Gaussian Process (GP) modeling to analysis of the functional unreliability of a “passive system.” GPs have been used widely in many ways [1]. The present application uses a GP for emulation of a system simulation code. Such an emulator can be applied in several distinct ways, discussed below. All applications illustrated in this paper have precedents in the literature; the present paper is an application of GP technology to a problem that was originally analyzed [2] using neural networks (NN), and later [3, 4] by a method called “Alternating Conditional Expectations” (ACE). This exercise enables a multifaceted comparison of both the processes and the results. Given knowledge of the range of possible values of key system variables, one could, in principle, quantify functional unreliability by sampling from their joint probability distribution, and performing a system simulation for each sample to determine whether the function succeeded for that particular setting of the variables. Using previously available system simulation codes, such an approach is generally impractical for a plant-scale problem. It has long been recognized, however, that a well-trained code emulator or surrogate could be used in a sampling process to quantify certain performance metrics, even for plant-scale problems. “Response surfaces” were used for this many years ago. But response surfaces are at their best for smoothly varying functions; in regions of parameter space where key system performance metrics may behave in complex ways, or even exhibit discontinuities, response surfaces are not the best available tool. This consideration was one of several that drove the work in [2]. In the present paper, (1) the original quantification of functional unreliability using NN [2], and later ACE [3], is reprised using GP; (2) additional information provided by the GP about uncertainty in the limit surface, generally unavailable in other representations, is discussed; (3) a simple forensic exercise is performed, analogous to the inverse problem of code calibration, but with an accident management spin: given an observation about containment pressure, what can we say about the system variables? References: 1. For an introduction to GPs, see (for example) Gaussian Processes for Machine Learning, C. E. Rasmussen and C. K. I. Williams (MIT, 2006). 2. Reliability Quantification of Advanced Reactor Passive Safety Systems, J. J. Vandenkieboom, PhD Thesis (University of Michigan, 1996). 3. Z. Cui, J. C. Lee, J. J. Vandenkieboom, and R. W. Youngblood, “Unreliability Quantification of a Containment Cooling System through ACE and ANN Algorithms,” Trans. Am. Nucl. Soc. 85, 178 (2001). 4. Risk and Safety Analysis of Nuclear Systems, J. C. Lee and N. J. McCormick (Wiley, 2011). See especially §11.2.4.
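As a rough sketch of the emulator-plus-sampling idea (not the referenced analyses), the example below trains a Gaussian Process on a handful of runs of a toy stand-in for the system code and then samples the emulator to estimate a failure probability; scikit-learn is assumed available, and the response function, inputs, and threshold are invented.

```python
# Sketch of the GP-emulator use described above, with an invented stand-in for
# the "expensive" system code and an invented failure threshold; scikit-learn
# is assumed available. Not the analysis from the referenced work.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulator(x):
    """Toy surrogate for a system code: peak containment pressure vs. two inputs."""
    return 300 + 80 * x[:, 0] ** 2 + 40 * np.sin(3 * x[:, 1])

rng = np.random.default_rng(42)
X_train = rng.uniform(0, 1, size=(30, 2))          # a few "code runs"
y_train = expensive_simulator(X_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X_train, y_train)

X_mc = rng.uniform(0, 1, size=(100_000, 2))        # cheap sampling via the emulator
y_mc, y_std = gp.predict(X_mc, return_std=True)
failure_limit = 390.0                               # assumed functional-failure threshold
print("estimated failure probability:", np.mean(y_mc > failure_limit))
print("mean emulator predictive std:", y_std.mean())   # uncertainty the GP provides
```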
DOE Office of Scientific and Technical Information (OSTI.GOV)
Müller, C.; Hughes, E. D.; Niederauer, G. F.
1998-10-01
Los Alamos National Laboratory (LANL) and Forschungszentrum Karlsruhe (FzK) are developing GASFLOW, a three-dimensional (3D) fluid dynamics field code, as a best-estimate tool to characterize local phenomena within a flow field. Examples of 3D phenomena include circulation patterns; flow stratification; hydrogen distribution mixing and stratification; combustion and flame propagation; effects of noncondensable gas distribution on local condensation and evaporation; and aerosol entrainment, transport, and deposition. An analysis with GASFLOW will result in a prediction of the gas composition and discrete particle distribution in space and time throughout the facility and the resulting pressure and temperature loadings on the walls and internal structures with or without combustion. A major application of GASFLOW is for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containment and other facilities. It has been applied to situations involving transporting and distributing combustible gas mixtures. It has been used to study gas dynamic behavior in low-speed, buoyancy-driven flows, as well as sonic flows or diffusion-dominated flows, and during chemically reacting flows, including deflagrations. The effects of controlling such mixtures by safety systems can be analyzed. The code version described in this manual is designated GASFLOW 2.1, which combines previous versions of the United States Nuclear Regulatory Commission code HMS (for Hydrogen Mixing Studies) and the Department of Energy and FzK versions of GASFLOW. The code was written in standard Fortran 90. This manual comprises three volumes. Volume I describes the governing physical equations and computational model. Volume II describes how to use the code to set up a model geometry, specify gas species and material properties, define initial and boundary conditions, and specify different outputs, especially graphical displays. Sample problems are included. Volume III contains some of the assessments performed by LANL and FzK.
The kinetics of aerosol particle formation and removal in NPP severe accidents
NASA Astrophysics Data System (ADS)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.
2016-06-01
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
The kinetics of aerosol particle formation and removal in NPP severe accidents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.
2016-06-08
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shumaker, Dana E.; Steefel, Carl I.
The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general-purpose reactive transport code developed by Carl Steefel and Yabusaki (Steefel and Yabusaki, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database is used containing thermodynamic and kinetic data. The code includes advective, dispersive, and diffusive transport.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domingues, L.; Dantas, M.M.; Lima, N.
1999-09-20
Alcohol fermentation of lactose was investigated using a recombinant flocculating Saccharomyces cerevisiae expressing the LAC4 (coding for β-galactosidase) and LAC12 (coding for lactose permease) genes of Kluyveromyces marxianus. Data on yeast fermentation and growth on a medium containing lactose as the sole carbon source are presented. In the range of studied lactose concentrations, total lactose consumption was observed with a conversion yield of ethanol close to the expected theoretical value. For the continuously operating bioreactor, an ethanol productivity of 11 g L⁻¹ h⁻¹ (corresponding to a feed lactose concentration of 50 g L⁻¹ and a dilution rate of 0.55 h⁻¹) was obtained, which is seven times larger than that of conventional continuous systems. The system stability was confirmed by keeping it in operation for 6 months.
Parallel Unsteady Turbopump Simulations for Liquid Rocket Engines
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Kwak, Dochan; Chan, William
2000-01-01
This paper reports the progress being made toward a complete turbo-pump simulation capability for liquid rocket engines. The Space Shuttle Main Engine (SSME) turbo-pump impeller is used as a test case for the performance evaluation of the MPI and hybrid MPI/OpenMP versions of the INS3D code. A computational model of a turbo-pump has then been developed for the shuttle upgrade program. Relative motion of the grid system for rotor-stator interaction was obtained by employing overset grid techniques. Time-accuracy of the scheme has been evaluated by using simple test cases. Unsteady computations for the SSME turbo-pump, which contains 136 zones with 35 million grid points, are currently underway on Origin 2000 systems at NASA Ames Research Center. Results from time-accurate simulations with moving boundary capability, and the performance of the parallel versions of the code, will be presented in the final paper.
IGB grid: User's manual (A turbomachinery grid generation code)
NASA Technical Reports Server (NTRS)
Beach, T. A.; Hoffman, G.
1992-01-01
A grid generation code called IGB is presented for use in computational investigations of turbomachinery flowfields. It contains a combination of algebraic and elliptic techniques coded for use on an interactive graphics workstation. The instructions for use and a test case are included.
Changing Patient Classification System for Hospital Reimbursement in Romania
Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian
2010-01-01
Aim To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Methods Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). Results The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians’ knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Conclusion Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care. PMID:20564769
Changing patient classification system for hospital reimbursement in Romania.
Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian
2010-06-01
To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians' knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case-mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case-mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care.
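The case-mix index arithmetic behind the reported 25% increase is simple to illustrate: the index is the case-weighted average of DRG cost weights, so shifting diagnoses into higher-weighted groups raises it even when the treated population is unchanged. The weights and case counts below are invented for illustration.

```python
# Illustrative arithmetic only (invented DRG weights and case counts): the
# case-mix index is the case-weighted average of DRG cost weights, so shifting
# cases into higher-weighted DRGs raises the index even if care is unchanged.
drg_weights = {"A": 0.8, "B": 1.2, "C": 2.5}       # hypothetical relative weights

def case_mix_index(cases):
    total_cases = sum(cases.values())
    return sum(drg_weights[d] * n for d, n in cases.items()) / total_cases

before = {"A": 600, "B": 300, "C": 100}            # earlier coding pattern (example)
after = {"A": 450, "B": 350, "C": 200}             # more cases coded into DRG C
print(f"CMI before: {case_mix_index(before):.3f}")
print(f"CMI after:  {case_mix_index(after):.3f}")
```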
Automatic mathematical modeling for real time simulation system
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1988-01-01
A methodology for automatic mathematical modeling and generation of simulation models is described. The models will be verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their models and also to automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp Machine. The program provides a very friendly and well-organized environment for engineers to build a knowledge base of base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. A future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.
μπ: A Scalable and Transparent System for Simulating MPI Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S
2010-01-01
μπ is a scalable, transparent system for experimenting with the execution of parallel programs on simulated computing platforms. The level of simulated detail can be varied for application behavior as well as for machine characteristics. Unique features of μπ are repeatability of execution, scalability to millions of simulated (virtual) MPI ranks, scalability to hundreds of thousands of host (real) MPI ranks, portability of the system to a variety of host supercomputing platforms, and the ability to experiment with scientific applications whose source code is available. The set of source-code interfaces supported by μπ is being expanded to support a wider set of applications, and MPI-based scientific computing benchmarks are being ported. In proof-of-concept experiments, μπ has been successfully exercised to spawn and sustain very large-scale executions of an MPI test program given in source code form. Low slowdowns are observed, due to its use of a purely discrete event style of execution, and due to the scalability and efficiency of the underlying parallel discrete event simulation engine, μsik. In the largest runs, μπ has been executed on up to 216,000 cores of a Cray XT5 supercomputer, successfully simulating over 27 million virtual MPI ranks, each virtual rank containing its own thread context, and all ranks fully synchronized by virtual time.
Pianowski, Giselle; Meyer, Gregory J; Villemor-Amaral, Anna Elisa de
2016-01-01
Exner (1989) and Weiner (2003) identified 3 types of Rorschach codes that are most likely to contain personally relevant projective material: Distortions, Movement, and Embellishments. We examine how often these types of codes occur in normative data and whether their frequency changes for the 1st, 2nd, 3rd, 4th, or last response to a card. We also examine the impact on these variables of the Rorschach Performance Assessment System's (R-PAS) statistical modeling procedures that convert the distribution of responses (R) from Comprehensive System (CS) administered protocols to match the distribution of R found in protocols obtained using R-optimized administration guidelines. In 2 normative reference databases, the results indicated that about 40% of responses (M = 39.25) have 1 type of code, 15% have 2 types, and 1.5% have all 3 types, with frequencies not changing by response number. In addition, there were no mean differences in the original CS and R-optimized modeled records (M Cohen's d = -0.04 in both databases). When considered alongside findings showing minimal differences between the protocols of people randomly assigned to CS or R-optimized administration, the data suggest R-optimized administration should not alter the extent to which potential projective material is present in a Rorschach protocol.
Terminal Area Simulation System User's Guide - Version 10.0
NASA Technical Reports Server (NTRS)
Switzer, George F.; Proctor, Fred H.
2014-01-01
The Terminal Area Simulation System (TASS) is a three-dimensional, time-dependent, large eddy simulation model that has been developed for studies of wake vortex and weather hazards to aviation, along with other atmospheric turbulence, and cloud-scale weather phenomenology. This document describes the source code for TASS version 10.0 and provides users with needed documentation to run the model. The source code is programmed in the Fortran language and is formulated to take advantage of vectorization and efficient multi-processor scaling for execution on massively-parallel supercomputer clusters. The code contains different initialization modules allowing the study of aircraft wake vortex interaction with the atmosphere and ground, atmospheric turbulence, atmospheric boundary layers, precipitating convective clouds, hail storms, gust fronts, microburst windshear, supercell and mesoscale convective systems, tornadic storms, and ring vortices. The model is able to operate in either two or three dimensions with equations numerically formulated on a Cartesian grid. The primary output from TASS is time-dependent domain fields generated by the prognostic equations and diagnosed variables. This document will enable a user to understand the general logic of TASS, and will show how to configure and initialize the model domain. Also described are the formats of the input and output files, as well as the parameters that control the input and output.
Medical decision making: guide to improved CPT coding.
Holt, Jim; Warsy, Ambreen; Wright, Paula
2010-04-01
The Current Procedural Terminology (CPT) coding system for office visits, which has been in use since 1995, has not been well studied, but it is generally agreed that the system contains much room for error. In fact, the available literature suggests that only slightly more than half of physicians will agree on the same CPT code for a given visit, and only 60% of professional coders will agree on the same code for a particular visit. In addition, the criteria used to assign a code are often related to the amount of written documentation. The goal of this study was to evaluate two novel methods to assess whether the most appropriate CPT code is used: the level of medical decision making, or the sum of all problems mentioned by the patient during the visit. The authors (a professional coder, a residency faculty member, and a PGY-3 family medicine resident) reviewed 351 randomly selected visit notes from two residency programs in the Northeast Tennessee region for the level of documentation, the level of medical decision making, and the total number of problems addressed. The authors assigned appropriate CPT codes at each of those three levels. Substantial undercoding occurred at each of the three levels. Approximately 33% of visits were undercoded based on the written documentation. Approximately 50% of the visits were undercoded based on the level of documented medical decision making. Approximately 80% of the visits were undercoded based on the total number of problems which the patient presented during the visit. Interrater agreement was fair, and similar to that noted in other coding studies. Undercoding is not only common in a family medicine residency program but also occurs at levels that would not be evident from a simple audit of the documentation on the visit note. Undercoding also occurs from not exploring problems mentioned by the patient and not documenting additional work that was performed. Family physicians may benefit from minor alterations in their documentation of office visit notes.
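A minimal sketch of the kind of comparison the authors describe: for each visit, the billed CPT level is checked against the level implied by the written documentation, the documented medical decision making, and the total number of problems addressed. The records and levels below are invented for illustration and are not the study data.

# Hedged sketch of the comparison described above: for each visit note, compare
# the billed CPT level with the level implied by three audit criteria.
records = [
    # (billed level, documentation level, decision-making level, problem-count level)
    (2, 3, 3, 4),
    (3, 3, 4, 4),
    (3, 4, 4, 5),
    (2, 2, 3, 3),
]

criteria = ["documentation", "medical decision making", "problems addressed"]
for idx, name in enumerate(criteria, start=1):
    undercoded = sum(1 for rec in records if rec[0] < rec[idx])
    print(f"undercoded by {name}: {undercoded / len(records):.0%}")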
Turbomachinery Forced Response Prediction System (FREPS): User's Manual
NASA Technical Reports Server (NTRS)
Morel, M. R.; Murthy, D. V.
1994-01-01
The turbomachinery forced response prediction system (FREPS), version 1.2, is capable of predicting the aeroelastic behavior of axial-flow turbomachinery blades. This document is meant to serve as a guide in the use of the FREPS code with specific emphasis on its use at NASA Lewis Research Center (LeRC). A detailed explanation of the aeroelastic analysis and its development is beyond the scope of this document, and may be found in the references. FREPS has been developed by the NASA LeRC Structural Dynamics Branch. The manual is divided into three major parts: an introduction, the preparation of input, and the procedure to execute FREPS. Part 1 includes a brief background on the necessity of FREPS, a description of the FREPS system, the steps needed to be taken before FREPS is executed, an example input file with instructions, presentation of the geometric conventions used, and the input/output files employed and produced by FREPS. Part 2 contains a detailed description of the command names needed to create the primary input file that is required to execute the FREPS code. Also, Part 2 has an example data file to aid the user in creating their own input files. Part 3 explains the procedures required to execute the FREPS code on the Cray Y-MP, a computer system available at the NASA LeRC.
Implicit time-integration method for simultaneous solution of a coupled non-linear system
NASA Astrophysics Data System (ADS)
Watson, Justin Kyle
Historically, large physical problems have been divided into smaller problems based on the physics involved. This is no different in reactor safety analysis. The problem of analyzing a nuclear reactor for design basis accidents is performed by a handful of computer codes, each solving a portion of the problem. The reactor thermal hydraulic response to an event is determined using a system code like the TRAC/RELAP Advanced Computational Engine (TRACE). The core power response to the same accident scenario is determined using a core physics code like the Purdue Advanced Reactor Core Simulator (PARCS). Containment response to the reactor depressurization in a Loss Of Coolant Accident (LOCA) type event is calculated by a separate code. Sub-channel analysis is performed with yet another computer code. This is just a sample of the computer codes used to solve the overall problems of nuclear reactor design basis accidents. Traditionally, each of these codes operates independently of the others, using only the global results from one calculation as boundary conditions to another. Industry's drive to uprate power for reactors has motivated analysts to move from a conservative approach to design basis accidents toward a best estimate method. To achieve a best estimate calculation, efforts have been aimed at coupling the individual physics models to improve the accuracy of the analysis and reduce margins. The current coupling techniques are sequential in nature. During a calculation time-step, data are passed between the codes. The individual codes solve their portion of the calculation and converge to a solution before the calculation is allowed to proceed to the next time-step. This thesis presents a fully implicit method of simultaneously solving the neutron balance equations, heat conduction equations, and the constitutive fluid dynamics equations. It discusses the problems involved in coupling different physics phenomena within multi-physics codes and presents a solution to these problems. The thesis also outlines the basic concepts behind the nodal balance equations, heat transfer equations, and the thermal hydraulic equations, which will be coupled to form a fully implicit nonlinear system of equations. The coupling of separate physics models to solve a larger problem and improve accuracy and efficiency of a calculation is not a new idea; however, implementing the models in an implicit manner and solving the system simultaneously is. Also, the application to reactor safety codes is new and has not been done with thermal hydraulics and neutronics codes on realistic applications in the past. The coupling technique described in this thesis is applicable to other similar coupled thermal hydraulic and core physics reactor safety codes. This technique is demonstrated using coupled input decks to show that the system is solved correctly and then verified by using two derivative test problems based on international benchmark problems: the OECD/NRC Three Mile Island (TMI) Main Steam Line Break (MSLB) problem (representative of pressurized water reactor analysis) and the OECD/NRC Peach Bottom (PB) Turbine Trip (TT) benchmark (representative of boiling water reactor analysis).
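The simultaneous-solution idea can be illustrated with a toy coupled system: collect the residuals of the individual physics models into one vector and drive them to zero together with a Newton iteration. The two scalar equations below (a stand-in "heat balance" and a stand-in "neutron flux with temperature feedback") are invented for illustration and are not the TRACE/PARCS equations.

# Minimal sketch of the simultaneous (fully implicit) coupling idea: assemble the
# residuals of two physics models into one vector and apply Newton's method to
# the combined nonlinear system. The two equations are toy stand-ins.
import numpy as np

def residual(x):
    T, phi = x  # toy "fuel temperature" and "neutron flux"
    r1 = T - 900.0 - 0.05 * phi                      # toy heat balance: T depends on power
    r2 = phi - 1000.0 * (1.0 - 1e-4 * (T - 900.0))   # toy feedback: flux depends on T
    return np.array([r1, r2])

def jacobian(x, eps=1e-6):
    # Finite-difference Jacobian of the coupled residual vector.
    n = len(x)
    J = np.zeros((n, n))
    r0 = residual(x)
    for j in range(n):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (residual(xp) - r0) / eps
    return J

x = np.array([900.0, 1000.0])
for _ in range(20):
    r = residual(x)
    if np.linalg.norm(r) < 1e-10:
        break
    x = x - np.linalg.solve(jacobian(x), r)
print("converged state (T, phi):", x)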
A generic archive protocol and an implementation
NASA Technical Reports Server (NTRS)
Jordan, J. M.; Jennings, D. G.; Mcglynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.
1992-01-01
Archiving vast amounts of data has become a major part of every scientific space mission today. The Generic Archive/Retrieval Services Protocol (GRASP) addresses the question of how to archive the data collected in an environment where the underlying hardware archives may be rapidly changing. GRASP is a device independent specification defining a set of functions for storing and retrieving data from an archive, as well as other support functions. GRASP is divided into two levels: the Transfer Interface and the Action Interface. The Transfer Interface is computer/archive independent code while the Action Interface contains code which is dedicated to each archive/computer addressed. Implementations of the GRASP specification are currently available for DECstations running Ultrix, Sparcstations running SunOS, and microVAX/VAXstation 3100's. The underlying archive is assumed to function as a standard Unix or VMS file system. The code, written in C, is a single suite of files. Preprocessing commands define the machine unique code sections in the device interface. The implementation was written, to the greatest extent possible, using only ANSI standard C functions.
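A sketch of the two-level structure described for GRASP: a device-independent transfer layer that delegates to an archive-specific action layer. GRASP itself is written in C with preprocessor-selected, machine-unique sections; the Python sketch below only illustrates the split, and all class and method names are invented.

# Illustrative split between an archive-independent "transfer" layer and an
# archive/computer-dependent "action" layer, in the spirit of the GRASP design.
import tempfile
from abc import ABC, abstractmethod

class ActionInterface(ABC):
    """Archive/computer-dependent operations (one implementation per backend)."""
    @abstractmethod
    def write_object(self, name: str, data: bytes) -> None: ...
    @abstractmethod
    def read_object(self, name: str) -> bytes: ...

class UnixFileArchive(ActionInterface):
    def __init__(self, root: str) -> None:
        self.root = root
    def write_object(self, name: str, data: bytes) -> None:
        with open(f"{self.root}/{name}", "wb") as fh:
            fh.write(data)
    def read_object(self, name: str) -> bytes:
        with open(f"{self.root}/{name}", "rb") as fh:
            return fh.read()

class TransferInterface:
    """Archive-independent layer seen by applications."""
    def __init__(self, backend: ActionInterface) -> None:
        self.backend = backend
    def store(self, name: str, data: bytes) -> None:
        self.backend.write_object(name, data)
    def retrieve(self, name: str) -> bytes:
        return self.backend.read_object(name)

archive = TransferInterface(UnixFileArchive(tempfile.mkdtemp()))
archive.store("example.dat", b"payload")
print(archive.retrieve("example.dat"))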
Water-use computer programs for Florida
Geiger, L.H.
1984-01-01
Using U.S. Geological Survey computer programs L149-L153, this report shows how to process water-use data for the functional water-use categories: public supply, rural supply, industrial self-supplied, irrigation, and thermo-electric power generation. The programs are used to selectively retrieve entries and list them in a format suitable for publication. Instructions are given for coding cards to produce tables of water-use data for each of the functional use categories. These cards contain entries that identify a particular water-use data-collection site in Florida. Entries on the cards include location information such as county code, water management district code, hydrologic unit code, and, where applicable, a site name and number. Annual and monthly pumpage is included. These entries are shown with several different headings; for example, surface water or ground water, freshwater or saline pumpages, or consumptive use. All the programs use a similar approach; however, the actual programs differ with each functional water-use category and are discussed separately. Data prepared for these programs can also be processed by the National Water-Use Data System. (USGS)
Stationary Liquid Fuel Fast Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Won Sik; Grandy, Andrew; Boroski, Andrew
For effective burning of hazardous transuranic (TRU) elements of used nuclear fuel, a transformational advanced reactor concept named SLFFR (Stationary Liquid Fuel Fast Reactor) was proposed based on stationary molten metallic fuel. The fuel enters the reactor vessel in a solid form, and then it is heated to molten temperature in a small melting heater. The fuel is contained within a closed, thick container with penetrating coolant channels, and thus it is not mixed with the coolant nor does it flow through the primary heat transfer circuit. The makeup fuel is semi-continuously added to the system, and thus a very small excess reactivity is required. Gaseous fission products are also removed continuously, and a fraction of the fuel is periodically drawn off from the fuel container to a processing facility where non-gaseous mixed fission products and other impurities are removed and then the cleaned fuel is recycled into the fuel container. A reference core design and a preliminary plant system design of a 1000 MWt TRU-burning SLFFR concept were developed using TRU-Ce-Co fuel, Ta-10W fuel container, and sodium coolant. Conservative design approaches were adopted to stay within the current material performance database. Detailed neutronics and thermal-fluidic analyses were performed to develop a reference core design. Region-dependent 33-group cross sections were generated based on the ENDF/B-VII.0 data using the MC2-3 code. Core and fuel cycle analyses were performed in theta-r-z geometries using the DIF3D and REBUS-3 codes. Reactivity coefficients and kinetics parameters were calculated using the VARI3D perturbation theory code. Thermo-fluidic analyses were performed using the ANSYS FLUENT computational fluid dynamics (CFD) code. Figure 0.1 shows a schematic radial layout of the reference 1000 MWt SLFFR core, and Table 0.1 summarizes the main design parameters of the SLFFR-1000 loop plant. The fuel container is a 2.5 cm thick cylinder with an inner radius of 87.5 cm. The fuel container is penetrated by twelve hexagonal control assembly (CA) guide tubes, each of which has 3.0 mm thickness and 69.4 mm flat-to-flat outer distance. The distance between two neighboring CA guide tubes is selected to be 26 cm to provide an adequate space for CA driving systems. The fuel container has 18181 penetrating coolant tubes of 6.0 mm inner diameter and 2.0 mm thickness. The coolant tubes are arranged in a triangular lattice with a lattice pitch of 1.21 cm. The fuel, structure, and coolant volume fractions inside the fuel container are 0.386, 0.383, and 0.231, respectively. Separate steel reflectors and B4C shields are used outside of the fuel container. Six gas expansion modules (GEMs) of 5.0 cm thickness are introduced in the radial reflector region. Between the radial reflector and the fuel container is a 2.5 cm sodium gap. The TRU inventory at the beginning of equilibrium cycle (BOEC) is 5081 kg, whereas the TRU inventory at the beginning of life (BOL) was 3541 kg. This is because the equilibrium cycle fuel contains a significantly smaller fissile fraction than the LWR TRU feed. The fuel inventory at BOEC is composed of 34.0 a/o TRU, 41.4 a/o Ce, 23.6 a/o Co, and 1.03 a/o solid fission products. Since uranium-free fuel is used, a theoretical maximum TRU consumption rate of 1.011 kg/day is achieved. The semi-continuous fuel cycle based on the 300-batch, 1-day cycle approximation yields a burnup reactivity loss of 26 pcm/day, and requires a daily reprocessing of 32.5 kg of SLFFR fuel.
This yields a daily TRU charge rate of 17.45 kg, including a makeup TRU feed of 1.011 kg recovered from the LWR used fuel. The charged TRU-Ce-Co fuel is composed of 34.4 a/o TRU, 40.6 a/o Ce, and 25.0 a/o Co.
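The quoted fuel-cycle numbers can be cross-checked with a rough mass balance, assuming (for illustration only) that the 300-batch, 1-day cycle approximation means one three-hundredth of the in-vessel fuel mass is drawn off and reprocessed each day, and ignoring the distinction between atom and mass fractions.

# Rough consistency check of the fuel-cycle numbers quoted above, under the
# stated assumptions; the quantities on the right are taken from the abstract.
daily_reprocessed_kg = 32.5      # quoted daily reprocessing of SLFFR fuel
batches = 300
implied_fuel_inventory_kg = daily_reprocessed_kg * batches
print(f"implied total fuel inventory: {implied_fuel_inventory_kg:.0f} kg")

tru_inventory_boec_kg = 5081.0   # quoted TRU inventory at BOEC
print(f"implied TRU fraction of inventory: {tru_inventory_boec_kg / implied_fuel_inventory_kg:.2f}")

daily_tru_charge_kg = 17.45      # quoted daily TRU charge
print(f"TRU fraction of daily charge: {daily_tru_charge_kg / daily_reprocessed_kg:.2f}")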
Radar Attitude Sensing System (RASS)
NASA Technical Reports Server (NTRS)
1971-01-01
The initial design and fabrication efforts for a radar attitude sensing system (RASS) are covered. The design and fabrication of the RASS system are being undertaken in two phases, 1B1 and 1B2. The RASS system as configured under phase 1B1 contains the solid-state transmitter and local oscillator, the antenna system, the receiving system, and the altitude electronics. RASS employs a pseudo-random coded CW signal and receiver correlation techniques to measure range. The antenna is a planar, phased-array, monopulse type, whose beam is electronically steerable using diode phase shifters. The beam steering computer and attitude sensing circuitry are to be included in Phase 1B2 of the program.
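The pseudo-random coded CW ranging technique mentioned above can be sketched as follows: the receiver correlates the returned signal against the transmitted pseudo-random sequence, and the lag of the correlation peak gives the round-trip delay and hence the range. The chip rate, code length, and noise level below are illustrative values, not RASS hardware parameters.

# Sketch of pseudo-random coded CW ranging by receiver correlation.
import numpy as np

rng = np.random.default_rng(0)
chip_rate = 1.0e6                                 # chips per second (illustrative)
code = rng.choice([-1.0, 1.0], size=1023)         # pseudo-random +/-1 sequence

true_delay_chips = 137
echo = np.roll(code, true_delay_chips) + 0.5 * rng.standard_normal(code.size)

# Circular cross-correlation via FFT, then pick the peak lag.
corr = np.fft.ifft(np.fft.fft(echo) * np.conj(np.fft.fft(code))).real
est_delay_chips = int(np.argmax(corr))

c = 3.0e8
range_m = 0.5 * c * est_delay_chips / chip_rate
print(f"estimated delay: {est_delay_chips} chips -> range ~ {range_m:.0f} m")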
Romero-Fernández, Ma Mar; Royo-Bordonada, Miguel Angel; Rodríguez-Artalejo, Fernando
2010-07-01
To evaluate the level of compliance with the PAOS Code (Publicidad, Actividad, Obesidad y Salud), which establishes standards for the self-regulation of food marketing aimed at minors, in television advertising by food and beverage companies that have agreed to abide by the Code. The study sample consisted of food and beverage advertisements targeting children during 80 h of programming by four Spanish television networks. The level of compliance with each standard of the PAOS Code was classified into three categories: 'compliance', 'non-compliance' and 'uncertain compliance'. Overall, an advertisement was considered compliant with the PAOS Code if it met all the standards; non-compliant if it contravened one or more standards; and uncertain in all other cases. Of a total of 203 television advertisements from companies that agreed to the PAOS Code, the overall prevalence of non-compliance was 49.3% (v. 50.8% among those that did not agree to the code), with 20.7% of advertisements considered of uncertain compliance. Non-compliance was more frequent on Saturdays, in longer advertisements, in advertisements containing promotions or dairy products, and for advertisements from companies of French or US origin. Non-compliance with the PAOS Code was very high and was similar for companies that did and did not agree to the Code, casting doubt on the Code's effectiveness and oversight system. It seems the time has come to commit to statutory regulations that reduce the negative impact of advertising on children's diets, as demanded by public health experts and consumer associations.
California Library Laws. 1977.
ERIC Educational Resources Information Center
Silver, Cy H.
This document contains selections from the California Administrative Code, Education Code, Government Code, and others relating to public libraries, county law libraries and the State Library. The first section presents legal developments in California from 1974 to 1976 which are of interest to librarians. Laws and regulations are presented under…
Computer-assisted coding and clinical documentation: first things first.
Tully, Melinda; Carmichael, Angela
2012-10-01
Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.
Kim, Mi-Kyung; Ha, Heon-Su; Choi, Sun-Uk
2008-04-01
To facilitate molecular genetic studies of Streptomyces ambofaciens that produces spiramycin, a commercially important macrolide antibiotic used in human medicine against Gram-positive pathogenic bacteria, the conditions for the conjugal transfer of DNA from E. coli to S. ambofaciens were established using a bacteriophage phiC31 att/int system. The transconjugation efficiency of S. ambofaciens varied with the medium used; the highest frequency was obtained on AS-1 medium containing 10 mM MgCl(2) without heat treatment of the spores. In addition, by cloning and sequencing the attB site, we identified that S. ambofaciens contains a single attB site within an ORF coding for a pirin homolog, and its attB site sequence shows 100% nt identity to the sequence of S. coelicolor and S. lividans, which have the highest efficiency in transconjugation using the phiC31 att/int system.
78 FR 77327 - Standards for Condition of Food Containers
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... Rules and Regulations. Federal Register. This section of the FEDERAL REGISTER contains regulatory documents having general applicability and legal effect, most of which are keyed to and codified in the Code of Federal Regulations, which is published under 50 titles pursuant to 44 U.S.C. 1510. The Code of Federal Regulations is sold...
Zhang, Yu; Yao, Youlin; Jiang, Siyuan; Lu, Yilu; Liu, Yunqiang; Tao, Dachang; Zhang, Sizhong; Ma, Yongxin
2015-04-01
To identify protein-protein interaction partners of PER1 (period circadian protein homolog 1), a key component of the molecular oscillation system of the circadian rhythm in tumors, using a bacterial two-hybrid system technique. A human cervical carcinoma (HeLa) cell cDNA library was used. The recombinant bait plasmid pBT-PER1 and the pTRG cDNA plasmid library were cotransformed into the two-hybrid system reporter strain and cultured in a selective medium. Target clones were screened. After isolating the positive clones, the target clones were sequenced and analyzed. Fourteen protein-coding genes were identified, 4 of which were found to contain whole coding regions, including optic atrophy 3 protein (OPA3), associated with mitochondrial dynamics, and the Homo sapiens cutA divalent cation tolerance homolog of E. coli (CUTA), associated with copper metabolism. The remaining clones encoded proteins related to cellular events, biochemical reactions, and signal transduction. Identification of proteins that potentially interact with PER1 in tumors may provide new insights into the functions of the circadian clock protein PER1 during tumorigenesis.
Parametric Weight Comparison of Current and Proposed Thermal Protection System (TPS) Concepts
NASA Technical Reports Server (NTRS)
Myers, David E.; Martin, Carl J.; Blosser, Max L.
1999-01-01
A parametric weight assessment of advanced metallic panel, ceramic blanket, and ceramic tile thermal protection systems (TPS) was conducted using an implicit, one-dimensional (1-D) thermal finite element sizing code. This sizing code contained models to account for coatings, fasteners, adhesives, and strain isolation pads. Atmospheric entry heating profiles for two vehicles, the Access to Space (ATS) rocket-powered single-stage-to-orbit (SSTO) vehicle and a proposed Reusable Launch Vehicle (RLV), were used to ensure that the trends were not unique to a particular trajectory. Eight TPS concepts were compared for a range of applied heat loads and substructural heat capacities to identify general trends. This study found the blanket TPS concepts have the lightest weights over the majority of their applicable ranges, and current technology ceramic tiles and metallic TPS concepts have similar weights. A proposed, state-of-the-art metallic system which uses a higher temperature alloy and efficient multilayer insulation was predicted to be significantly lighter than the ceramic tile systems and approaches blanket TPS weights for higher integrated heat loads.
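The core calculation inside such a sizing code is an implicit march of the one-dimensional heat conduction equation through the TPS stack. The sketch below shows a backward-Euler version for a single insulation layer with a prescribed surface heat flux and an adiabatic back face; the material properties, heating, and dimensions are invented, and the coating, fastener, adhesive, and strain-isolation-pad models mentioned above are omitted.

# Minimal sketch of an implicit (backward-Euler) 1-D conduction march of the
# kind a TPS sizing code performs. Properties and heating are illustrative only.
import numpy as np

n, L = 21, 0.05                    # nodes, layer thickness [m]
dx = L / (n - 1)
k, rho, cp = 0.05, 200.0, 1000.0   # conductivity, density, specific heat (invented)
alpha = k / (rho * cp)
dt, steps = 1.0, 600               # time step [s], number of steps
q_surface = 5.0e3                  # applied surface heat flux [W/m^2] (invented)

T = np.full(n, 300.0)              # initial temperature [K]
r = alpha * dt / dx**2

# Assemble the constant implicit (tridiagonal) matrix.
A = np.zeros((n, n))
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1.0 + 2.0 * r, -r
# Surface node: heat-flux boundary condition via a ghost node.
A[0, 0], A[0, 1] = 1.0 + 2.0 * r, -2.0 * r
# Back face: adiabatic (mirror) condition.
A[-1, -1], A[-1, -2] = 1.0 + 2.0 * r, -2.0 * r

for _ in range(steps):
    b = T.copy()
    b[0] += 2.0 * r * dx * q_surface / k   # flux source term at the surface node
    T = np.linalg.solve(A, b)

print(f"surface temperature after {steps * dt:.0f} s: {T[0]:.1f} K")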
Skyshine radiation from a pressurized water reactor containment dome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, W.H.
1986-06-01
The radiation dose rates resulting from airborne activities inside a postaccident pressurized water reactor containment are calculated by a discrete ordinates/Monte Carlo combined method. The calculated total dose rates and the skyshine component are presented as a function of distance from the containment at three different elevations for various gamma-ray source energies. The one-dimensional discrete ordinates code ANISN is used to approximate the skyshine dose rates from the hemispherical dome, and the results compare favorably with more rigorous results calculated by a three-dimensional Monte Carlo code.
Graywater Use by the Army -- Is It Time Yet?
2011-05-11
[Fragmentary briefing-slide text] Codes and regulations: where chlorine is used for disinfection, the non-potable water shall contain not more than 4 mg/L of chloramines or free chlorine. Treatment options range from minimal treatment followed by underground irrigation to commercial package plants producing a filtered, disinfected product. Regulatory topics include disinfection, identification (labeling and dyeing), distribution, and permits to construct. Other concerns include fixture flushing and cooling.
Yerrapragada, Shaila; Shukla, Animesh; Hallsworth-Pepin, Kymberlie; Choi, Kwangmin; Wollam, Aye; Clifton, Sandra; Qin, Xiang; Muzny, Donna; Raghuraman, Sriram; Ashki, Haleh; Uzman, Akif; Highlander, Sarah K; Fryszczyn, Bartlomiej G; Fox, George E; Tirumalai, Madhan R; Liu, Yamei; Kim, Sun; Kehoe, David M; Weinstock, George M
2015-05-07
Tolypothrix sp. PCC 7601 is a freshwater filamentous cyanobacterium with complex responses to environmental conditions. Here, we present its 9.96-Mbp draft genome sequence, containing 10,065 putative protein-coding sequences, including 305 predicted two-component system proteins and 27 putative phytochrome-class photoreceptors, the most such proteins in any sequenced genome. Copyright © 2015 Yerrapragada et al.
Topics in quantum cryptography, quantum error correction, and channel simulation
NASA Astrophysics Data System (ADS)
Luo, Zhicheng
In this thesis, we mainly investigate four different topics: efficiently implementable codes for quantum key expansion [51], quantum error-correcting codes based on privacy amplification [48], private classical capacity of quantum channels [44], and classical channel simulation with quantum side information [49, 50]. For the first topic, we propose an efficiently implementable quantum key expansion protocol, capable of increasing the size of a pre-shared secret key by a constant factor. Previously, the Shor-Preskill proof [64] of the security of the Bennett-Brassard 1984 (BB84) [6] quantum key distribution protocol relied on the theoretical existence of good classical error-correcting codes with the "dual-containing" property. But the explicit and efficiently decodable construction of such codes is unknown. We show that we can lift the dual-containing constraint by employing the non-dual-containing codes with excellent performance and efficient decoding algorithms. For the second topic, we propose a construction of Calderbank-Shor-Steane (CSS) [19, 68] quantum error-correcting codes, which are originally based on pairs of mutually dual-containing classical codes, by combining a classical code with a two-universal hash function. We show, using the results of Renner and Koenig [57], that the communication rates of such codes approach the hashing bound on tensor powers of Pauli channels in the limit of large block-length. For the third topic, we prove a regularized formula for the secret key assisted capacity region of a quantum channel for transmitting private classical information. This result parallels the work of Devetak on entanglement assisted quantum communication capacity. This formula provides a new family protocol, the private father protocol, under the resource inequality framework that includes the private classical communication without the assisted secret keys as a child protocol. For the fourth topic, we study and solve the problem of classical channel simulation with quantum side information at the receiver. Our main theorem has two important corollaries: rate-distortion theory with quantum side information and common randomness distillation. Simple proofs of achievability of classical multi-terminal source coding problems can be made via a unified approach using the channel simulation theorem as building blocks. The fully quantum generalization of the problem is also conjectured with outer and inner bounds on the achievable rate pairs.
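The dual-containing property referred to above is easy to check numerically for a classical binary code: a code C with parity-check matrix H satisfies C-perp being a subset of C exactly when H times its transpose vanishes over GF(2). The sketch below verifies this for the [7,4] Hamming code, the textbook example behind the Steane construction; it illustrates the property that the thesis proposes to lift, not the non-dual-containing construction itself.

# Quick check of the "dual-containing" property discussed above: C-perp is a
# subset of C exactly when H @ H.T = 0 over GF(2). The [7,4] Hamming code,
# used in the Steane construction, passes this test.
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

print("dual-containing:", not np.any(H @ H.T % 2))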
Front air bag nondeployments in frontal crashes fatal to drivers or right-front passengers.
Braver, Elisa R; McCartt, Anne T; Sherwood, Christopher P; Zuby, David S; Blanar, Laura; Scerbo, Marge
2010-04-01
Public concern has arisen about the reliability of front air bags because Fatality Analysis Reporting System (FARS) data indicate many nondeployed air bags in fatal frontal crashes. However, the accuracy of air bag deployment, the variable in question, is uncertain. This study aimed to provide more certain estimates of nondeployment incidence in fatal frontal crashes. Fatally injured passenger vehicle drivers and right-front passengers in frontal crashes were identified in two U.S. databases for calendar years 1998-2006 and model years 1994-2006: FARS, a census of police-reported fatal crashes on public roads, and National Automotive Sampling System/Crashworthiness Data System (NASS/CDS), a probability sample of tow-away crashes. NASS/CDS contains subsets of fatal crashes in FARS and collects detailed data using crash investigators. Front air bag deployment coding for front-seat occupant fatalities was compared in FARS and NASS/CDS, and case reviews were conducted. Among FARS frontal deaths with available deployment status (N = 43,169), front air bags were coded as not deployed for 18 percent of front occupants. In comparison, NASS/CDS (N = 628) reported 9 percent (weighted estimate) nondeployment among front occupants killed. Among crashes common to both databases, NASS/CDS reported deployments for 45 percent of front occupant deaths for which FARS had coded nondeployments. Detailed case reviews of NASS/CDS crashes indicated highly accurate coding for deployment status. Based on this case review, 8 percent (weighted estimate) of front occupant deaths in frontal crashes appeared to involve air bag nondeployments; 1-2 percent of front occupant deaths represented potential system failures where deployments would have been expected. Air bag deployments appeared unwarranted in most nondeployments based on crash characteristics. FARS data overstate the magnitude of the problem of air bag deployment failures; steps should be taken to improve coding. There are inherent uncertainties in judgments about whether or not air bags would be expected to deploy in some crashes. Continued monitoring of air bag performance is warranted.
77 FR 67628 - National Fire Codes: Request for Public Input for Revision of Codes and Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-13
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology National Fire Codes: Request... Technology, Commerce. ACTION: Notice. SUMMARY: This notice contains the list of National Fire Protection... the National Fire Protection Association (NFPA) to announce the NFPA's proposal to revise some of its...
78 FR 24725 - National Fire Codes: Request for Public Input for Revision of Codes and Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology National Fire Codes: Request... Technology, Commerce. ACTION: Notice. SUMMARY: This notice contains the list of National Fire Protection... the National Fire Protection Association (NFPA) to announce the NFPA's proposal to revise some of its...
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.
38 CFR 39.63 - Architectural design standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Association Life Safety Code and Errata (NFPA 101), the 2003 edition of the NFPA 5000, Building Construction... section, all applicable local and State building codes and regulations must be observed. In areas not subject to local or State building codes, the recommendations contained in the 2003 edition of the NFPA...
Rationale for Student Dress Codes: A Review of School Handbooks
ERIC Educational Resources Information Center
Freeburg, Elizabeth W.; Workman, Jane E.; Lentz-Hees, Elizabeth S.
2004-01-01
Through dress codes, schools establish rules governing student appearance. This study examined stated rationales for dress and appearance codes in secondary school handbooks; 182 handbooks were received. Of 150 handbooks containing a rationale, 117 related dress and appearance regulations to students' right to a non-disruptive educational…
76 FR 46805 - Notice of Utah Adoption by Reference of the Pesticide Container Containment Rule
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... ENVIRONMENTAL PROTECTION AGENCY [FRL-9447-8] Notice of Utah Adoption by Reference of the Pesticide Container Containment Rule AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: This... Pesticide Container Containment (PCC) Rule regulations. In accordance with State of Utah Agricultural Code...
Chien, Maw-Sheng; Gilbert , Teresa L.; Huang, Chienjin; Landolt, Marsha L.; O'Hara, Patrick J.; Winton, James R.
1992-01-01
The complete sequence coding for the 57-kDa major soluble antigen of the salmonid fish pathogen, Renibacterium salmoninarum, was determined. The gene contained an open reading frame of 1671 nucleotides coding for a protein of 557 amino acids with a calculated Mr value of 57190. The first 26 amino acids constituted a signal peptide. The deduced sequence for amino acid residues 27–61 was in agreement with the 35 N-terminal amino acid residues determined by microsequencing, suggesting the protein is synthesized as a 557-amino acid precursor and processed to produce a mature protein of Mr 54505. Two regions of the protein contained imperfect direct repeats. The first region contained two copies of an 81-residue repeat, the second contained five copies of an unrelated 25-residue repeat. Also, a perfect inverted repeat (including three in-frame UAA stop codons) was observed at the carboxyl-terminus of the gene.
Study of steam condensation at sub-atmospheric pressure: setting a basic research using MELCOR code
NASA Astrophysics Data System (ADS)
Manfredini, A.; Mazzini, M.
2017-11-01
One of the most serious accidents that can occur in the experimental nuclear fusion reactor ITER is the break of one of the headers of the cooling system of the first wall of the Tokamak. This results in a water-steam mixture discharge into the vacuum vessel (VV), with consequent pressurization of this container. To prevent the pressure in the VV from exceeding 150 kPa absolute, a system discharges the steam into a suppression pool at an absolute pressure of 4.2 kPa. The computer codes used to analyze such an incident (e.g., RELAP5 or MELCOR) are not validated experimentally for such conditions. Therefore, we planned a basic research program in order to obtain experimental data useful for validating the heat transfer correlations used in these codes. After a thorough literature search on this topic, ACTA, in collaboration with the staff of ITER, defined the experimental matrix and performed the design of the experimental apparatus. For the thermal-hydraulic design of the experiments, we executed a series of calculations with MELCOR. This code, however, was used in an unconventional mode, with the development of models suited respectively to low and high steam flow-rate tests. The article concludes with a discussion of the placement of the experimental data within the map characterizing the phenomenon, showing the importance of the new knowledge acquired, particularly in the case of chugging.
Birkett, Charlotte; Arandjelovic, Ognjen; Humphris, Gerald
2017-07-01
While increasingly appreciated for its importance, the interaction between health care professionals (HCP) and patients is notoriously difficult to study, with both methodological and practical challenges. The former has been addressed by the so-called Verona coding definitions of emotional sequences (VR-CoDES), a system for identifying and coding patient emotions and the corresponding HCP responses, shown to be reliable and informative in a number of independent studies in different health care delivery contexts. In the present work we focus on the practical challenge of the scalability of this coding system, namely on making it easily usable more widely and on applying it to larger patient cohorts. In particular, VR-CoDES is inherently complex and training is required to ensure consistent annotation of audio recordings or textual transcripts of consultations. Following up on our previous pilot investigation, in the present paper we describe the first automatic, computer-based algorithm capable of providing coarse-level coding of textual transcripts. We investigate different representations of patient utterances and classification methodologies, and label each utterance as either containing an explicit expression of emotional distress (a 'concern'), an implicit one (a 'cue'), or neither. Using a data corpus comprising 200 consultations between radiotherapists and adult female breast cancer patients we demonstrate excellent labelling performance.
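The coarse-level labelling task described above (concern / cue / neither) can be sketched with a bag-of-words representation and a linear classifier. The utterances and labels below are invented stand-ins, not the study corpus, and the published system's features and classifier may differ.

# Toy sketch of utterance labelling with a bag-of-words model; the training
# examples are invented and far too few for real use.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = [
    "I am really frightened about the results",    # explicit distress
    "I just can't stop crying at night",            # explicit distress
    "I suppose things have been a bit difficult",   # hint of distress
    "It has not been the easiest few weeks",        # hint of distress
    "The appointment is on Tuesday morning",        # neutral
    "I take the tablets with breakfast",            # neutral
]
labels = ["concern", "concern", "cue", "cue", "neither", "neither"]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(utterances, labels)

print(model.predict(["I am really worried it will come back"]))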
Energy Cost Impact of Non-Residential Energy Code Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jian; Hart, Philip R.; Rosenberg, Michael I.
2016-08-22
The 2012 International Energy Conservation Code contains 396 separate requirements applicable to non-residential buildings; however, there is no systematic analysis of the energy cost impact of each requirement. Consequently, limited code department budgets for plan review, inspection, and training cannot be focused on the most impactful items. An inventory and ranking of code requirements based on their potential energy cost impact is under development. The initial phase focuses on office buildings with simple HVAC systems in climate zone 4C. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance. A preliminary estimate of the probability of occurrence of each level of non-compliance was combined with the estimated lost savings for each level to rank the requirements according to expected savings impact. The methodology to develop and refine further energy cost impacts, specific to building type, system type, and climate location is demonstrated. As results are developed, an innovative alternative method for compliance verification can focus efforts so only the most impactful requirements from an energy cost perspective are verified for every building and a subset of the less impactful requirements are verified on a random basis across a building population. The results can be further applied in prioritizing training material development and specific areas of building official training.
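The ranking step described above reduces to a simple expected-value calculation: for each requirement, sum over the non-compliance levels the probability of that level times the estimated lost savings at that level, then sort. The requirement names, probabilities, and dollar values below are invented for illustration.

# Hedged sketch of the expected-impact ranking: expected lost savings per
# requirement = sum over levels of (probability of level) x (lost savings).
requirements = {
    "interior lighting power allowance": [(0.10, 120.0), (0.05, 400.0)],
    "economizer high-limit control":     [(0.20,  60.0), (0.02, 900.0)],
    "roof insulation R-value":           [(0.05,  80.0), (0.01, 300.0)],
}

expected_impact = {
    name: sum(p * lost for p, lost in levels)
    for name, levels in requirements.items()
}

for name, impact in sorted(expected_impact.items(), key=lambda kv: -kv[1]):
    print(f"{name:36s} expected lost savings: ${impact:7.2f} per building-year")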
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Pacific Northwest Laboratory (PNL), operated by Battelle Memorial Institute under contract to the U.S. Department of Energy, operates tank systems for the U.S. Department of Energy, Richland Operations Office (DOE-RL), that contain dangerous waste constituents as defined by Washington State Department of Ecology (WDOE) Dangerous Waste Regulations, Washington Administrative Code (WAC) 173-303-040(18). Chapter 173-303-640(2) of the WAC requires the performance of integrity assessments for each existing tank system that treats or stores dangerous waste, except those operating under interim status with compliant secondary containment. This Integrity Assessment Plan (IAP) identifies all tasks that will be performed during the integrity assessment of the PNL-operated Radioactive Liquid Waste Systems (RLWS) associated with the 324 and 325 Buildings located in the 300 Area of the Hanford Site. It describes the inspections, tests, and analyses required to assess the integrity of the PNL RLWS (tanks, ancillary equipment, and secondary containment) and provides sufficient information for adequate budgeting and control of the assessment program. It also provides necessary information to permit the Independent, Qualified, Registered Professional Engineer (IQRPE) to approve the integrity assessment program.
Coordinated design of coding and modulation systems
NASA Technical Reports Server (NTRS)
Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.
1976-01-01
The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Space Flight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes proposed for use in deep-space systems.
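For context, a rate-1/2 convolutional encoder of the general kind discussed above can be sketched in a few lines. The constraint-length-3 code with octal generators (7, 5) below is a standard textbook example, not necessarily one of the codes identified in the study; a Viterbi decoder would operate on the trellis defined by the same generators.

# Rate-1/2 convolutional encoder, constraint length 3, generators 7 and 5 (octal).
def conv_encode(bits, g1=0b111, g2=0b101):
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111           # shift the new bit into the register
        out.append(bin(state & g1).count("1") % 2)    # parity against generator 1
        out.append(bin(state & g2).count("1") % 2)    # parity against generator 2
    return out

message = [1, 0, 1, 1, 0, 0]   # two trailing zeros flush the encoder
print(conv_encode(message))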
User Instructions for the Policy Analysis Modeling System (PAMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.
PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.
Development and applications of 3D-DIVIMP(HC) Monte Carlo impurity modeling code
NASA Astrophysics Data System (ADS)
Mu, Yarong
A self-contained gas injection system for the Divertor Material Evaluation System (DiMES) on DIII-D, the Porous Plug Injector (PPI), has been employed by A. McLean for in-situ study of chemical erosion in the tokamak divertor environment by injection of CH4. The principal contribution of the present thesis is a new interpretive code, 3D-DIVIMP(HC), which has been developed and successfully applied to the interpretation of the CH, C I, and C II emissions measured during the PPI experiments. The two principal types of experimental data which are compared here with 3D-DIVIMP(HC) code modeling are (a) absolute emissivities measured with a high resolution spectrometer, and (b) 2D filtered camera (TV) pictures taken from a view essentially straight down on the PPI. Incorporating the Janev-Reiter database for the breakup reactions of methane molecules in a plasma, 3D-DIVIMP(HC) is able to replicate these measurements to within the combined experimental and database uncertainties. It is therefore concluded that the basic elements of the physics and chemistry controlling the breakup of methane entering an attached divertor plasma have been identified and are incorporated in 3D-DIVIMP(HC).
Burst Firing is a Neural Code in an Insect Auditory System
Eyherabide, Hugo G.; Rokem, Ariel; Herz, Andreas V. M.; Samengo, Inés
2008-01-01
Various classes of neurons alternate between high-frequency discharges and silent intervals. This phenomenon is called burst firing. To analyze burst activity in an insect system, grasshopper auditory receptor neurons were recorded in vivo for several distinct stimulus types. The experimental data show that both burst probability and burst characteristics are strongly influenced by temporal modulations of the acoustic stimulus. The tendency to burst, hence, is not only determined by cell-intrinsic processes, but also by their interaction with the stimulus time course. We study this interaction quantitatively and observe that bursts containing a certain number of spikes occur shortly after stimulus deflections of specific intensity and duration. Our findings suggest a sparse neural code where information about the stimulus is represented by the number of spikes per burst, irrespective of the detailed interspike-interval structure within a burst. This compact representation cannot be interpreted as a firing-rate code. An information-theoretical analysis reveals that the number of spikes per burst reliably conveys information about the amplitude and duration of sound transients, whereas their time of occurrence is reflected by the burst onset time. The investigated neurons encode almost half of the total transmitted information in burst activity. PMID:18946533
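The information-theoretic step mentioned above can be sketched as a mutual-information estimate from a joint count table of stimulus class versus spikes per burst. The counts below are invented for illustration, not the grasshopper recordings, and no bias correction is applied.

# Mutual information between a discrete stimulus feature and spikes per burst,
# estimated from an (invented) joint count table.
import numpy as np

# rows: stimulus deflection class (e.g. small / medium / large amplitude)
# columns: spikes per burst (1, 2, 3, 4)
counts = np.array([[30,  8,  2,  0],
                   [ 6, 25,  7,  2],
                   [ 1,  5, 20, 14]], dtype=float)

p_joint = counts / counts.sum()
p_stim = p_joint.sum(axis=1, keepdims=True)
p_resp = p_joint.sum(axis=0, keepdims=True)

nonzero = p_joint > 0
mi_bits = np.sum(p_joint[nonzero] * np.log2(p_joint[nonzero]
                                            / (p_stim @ p_resp)[nonzero]))
print(f"mutual information: {mi_bits:.2f} bits")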
Multimodal biometric digital watermarking on immigrant visas for homeland security
NASA Astrophysics Data System (ADS)
Sasi, Sreela; Tamhane, Kirti C.; Rajappa, Mahesh B.
2004-08-01
Passengers with immigrant visas are a major concern to international airports due to the various fraud operations identified. To curb tampering with genuine visas, the visa should contain human identification information. A biometric characteristic is a common and reliable way to authenticate the identity of an individual [1]. A Multimodal Biometric Human Identification System (MBHIS) that integrates iris code, DNA fingerprint, and the passport number on the visa photograph using a digital watermarking scheme is presented. The digital watermarking technique is well suited for any system requiring high security [2]. Ophthalmologists [3], [4], [5] suggested that the iris scan is an accurate and nonintrusive optical fingerprint. A DNA sequence can be used as a genetic barcode [6], [7]. While issuing a visa at the US consulates, the DNA sequence isolated from saliva, the iris code, and the passport number shall be digitally watermarked in the visa photograph. This information is also recorded in the 'immigrant database'. A 'forward watermarking phase' combines a 2-D DWT transformed digital photograph with the personal identification information. A 'detection phase' extracts the watermarked information from this visa photograph at the port of entry, from which the iris code can be used for identification and the DNA biometric for authentication, if an anomaly arises.
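The forward watermarking and detection phases described above can be sketched with a 2-D DWT: embed a +/-1 bit sequence additively into one detail subband of the photograph, invert the transform, and later read the bits back by comparing coefficients against the original. The sketch uses PyWavelets; the subband choice, embedding strength, and non-blind detection rule are illustrative only and are not the scheme of the paper.

# Toy DWT-domain watermark embedding and (non-blind) detection.
import numpy as np
import pywt

rng = np.random.default_rng(1)
photo = rng.uniform(0.0, 255.0, size=(128, 128))    # stand-in for the visa photo

payload_bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # stand-in for iris code / passport data
symbols = 2.0 * payload_bits - 1.0                   # map {0,1} -> {-1,+1}

cA, (cH, cV, cD) = pywt.dwt2(photo, "haar")
alpha = 2.0                                          # embedding strength (illustrative)
flat = cH.flatten()
flat[: symbols.size] += alpha * symbols              # embed into horizontal detail coefficients
cH_marked = flat.reshape(cH.shape)

watermarked = pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")

# Detection: re-transform and read the sign of the coefficient change.
_, (cH2, _, _) = pywt.dwt2(watermarked, "haar")
recovered = (cH2.flatten()[: symbols.size] - cH.flatten()[: symbols.size] > 0).astype(int)
print("recovered bits:", recovered, "match:", np.array_equal(recovered, payload_bits))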
LENSED: a code for the forward reconstruction of lenses and sources from strong lensing observations
NASA Astrophysics Data System (ADS)
Tessore, Nicolas; Bellagamba, Fabio; Metcalf, R. Benton
2016-12-01
Robust modelling of strong lensing systems is fundamental to exploit the information they contain about the distribution of matter in galaxies and clusters. In this work, we present LENSED, a new code which performs forward parametric modelling of strong lenses. LENSED takes advantage of a massively parallel ray-tracing kernel to perform the necessary calculations on a modern graphics processing unit (GPU). This makes the precise rendering of the background lensed sources much faster, and allows the simultaneous optimization of tens of parameters for the selected model. With a single run, the code is able to obtain the full posterior probability distribution for the lens light, the mass distribution and the background source at the same time. LENSED is first tested on mock images which reproduce realistic space-based observations of lensing systems. In this way, we show that it is able to recover unbiased estimates of the lens parameters, even when the sources do not follow exactly the assumed model. Then, we apply it to a subsample of the Sloan Lens ACS Survey lenses, in order to demonstrate its use on real data. The results generally agree with the literature, and highlight the flexibility and robustness of the algorithm.
Kato, Hirotomo; Jochim, Ryan C.; Gomez, Eduardo A.; Sakoda, Ryo; Iwata, Hiroyuki; Valenzuela, Jesus G.; Hashiguchi, Yoshihisa
2010-01-01
Triatoma (T.) dimidiata is a hematophagous Hemiptera and a main vector of Chagas disease. The saliva of this and other blood-sucking insects contains potent pharmacologically active components that assist them in counteracting the host hemostatic and inflammatory systems during blood feeding. To describe the repertoire of potential bioactive salivary molecules from this insect, a number of randomly selected transcripts from the salivary gland cDNA library of T. dimidiata were sequenced and analyzed. This analysis showed that 77.5% of the isolated transcripts coded for putative secreted proteins, and 89.9% of these coded for variants of the lipocalin family proteins. The most abundant transcript was a homologue of procalin, the major allergen of T. protracta saliva, and contributed more than 50% of the transcripts coding for putative secreted proteins, suggesting that it may play an important role in the blood-feeding process. Other salivary transcripts encoding lipocalin family proteins had homology to triabin (a thrombin inhibitor), triafestin (an inhibitor of kallikrein–kinin system), pallidipin (an inhibitor of collagen-induced platelet aggregation) and others with unknown function. PMID:19900580
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-19
...This document contains a notice of pendency before the Department of Labor (the Department) of a proposed individual exemption from certain prohibited transaction restrictions of the Employee Retirement Income Security Act of 1974, as amended (ERISA), the Federal Employees' Retirement System Act of 1986, as amended (FERSA), and the Internal Revenue Code of 1986, as amended (the Code). The proposed transactions involve BlackRock, Inc. and its investment advisory, investment management and broker-dealer affiliates and their successors. The proposed exemption, if granted, would affect plans for which BlackRock, Inc. and its investment advisory, investment management and broker-dealer affiliates and their successors serve as fiduciaries, and the participants and beneficiaries of such plans.
Characteristics of health interventions: a systematic analysis of the Austrian Procedure Catalogue.
Neururer, Sabrina B; Pfeiffer, Karl-Peter
2012-01-01
The Austrian Procedure Catalogue contains 1,500 codes for health interventions used for performance-oriented hospital financing in Austria. It offers a multiaxial taxonomy. The aim of this study is to identify characteristics of medical procedures. Therefore a definition analysis followed by a typological analysis was conducted. Search strings were generated out of code descriptions regarding the heart, large vessels and cardiovascular system. Their definitions were looked up in the Pschyrembel Clinical Dictionary and documented. Out of these definitions, types which represent characteristics of health interventions were abstracted. The three axes of the Austrian Procedure Catalogue were approved as well as new, relevant information identified. The results are the foundation of a further enhancement of the Austrian Procedure Catalogue.
NASA Technical Reports Server (NTRS)
Mckee, James W.
1990-01-01
This volume (4 of 4) contains the description, structured flow charts, prints of the graphical displays, and source code to generate the displays for the AMPS graphical status system. The function of these displays is to present to the manager of the AMPS system a graphical status display with hot boxes that allow the manager to get more detailed status on selected portions of the AMPS system. The development of the graphical displays is divided into two processes: the creation of the screen images and their storage in files on the computer, and the running of the status program that uses the screen images.
Cave, John W; Xia, Li; Caudy, Michael
2011-01-01
In Drosophila melanogaster, achaete (ac) and m8 are model basic helix-loop-helix activator (bHLH A) and repressor genes, respectively, that have the opposite cell expression pattern in proneural clusters during Notch signaling. Previous studies have shown that activation of m8 transcription in specific cells within proneural clusters by Notch signaling is programmed by a "combinatorial" and "architectural" DNA transcription code containing binding sites for the Su(H) and proneural bHLH A proteins. Here we show the novel result that the ac promoter contains a similar combinatorial code of Su(H) and bHLH A binding sites but contains a different Su(H) site architectural code that does not mediate activation during Notch signaling, thus programming a cell expression pattern opposite that of m8 in proneural clusters.
Production of recombinant adenovirus containing human interleukin-4 gene.
Mojarrad, Majid; Abdolazimi, Yassan; Hajati, Jamshid; Modarressi, Mohammad Hossein
2011-11-01
Recombinant adenoviruses are currently used for a variety of purposes, including in vitro gene transfer, in vivo vaccination, and gene therapy. The ability to infect many cell types, high efficiency in gene transfer, entry into both dividing and non-dividing cells, and growth to high titers make this virus a good choice for use in various experiments. In the present experiment, a recombinant adenovirus containing the human IL-4 coding sequence was made. IL-4 has several characteristics that made it a good choice for use in cancer gene therapy, controlling inflammatory diseases, and studies on autoimmune diseases. In brief, the IL-4 coding sequence was amplified and cloned into pAd-Track-CMV. Then, by means of homologous recombination between the recombinant pAd-Track-CMV and the AdEasy-1 plasmid in bacteria, the complete recombinant adenovirus genome was made and the IL-4-containing shuttle vector was incorporated into the viral backbone. After linearization, the viral genome was transfected into the HEK-293 cell line for virus packaging. Viral production was conveniently followed with the aid of green fluorescent protein. The recombinant adenovirus produced here was capable of infecting cell lines and expressing interleukin-4 in cells. This system can be used as a powerful, easy, and cost-effective tool in various studies on cancer gene therapy and also studies on immunogenetics.
Numerical analysis of projectile impact in woven textile structures
NASA Technical Reports Server (NTRS)
Roylance, D.
1977-01-01
Computer codes were developed for simulating the dynamic fracture and viscoelastic constitutive response due to stress wave interaction and reflections caused by ballistic impact on woven textiles. The method, which was developed for use in the design and analysis of protection devices for personnel armor, has potential for use in studies of rotor blade burst containment at high velocity. Alterations in coding required for burst containment problems are discussed.
Lu, Jiamiao; Williams, James A.; Luke, Jeremy; Zhang, Feijie; Chu, Kirk; Kay, Mark A.
2017-01-01
We previously developed a mini-intronic plasmid (MIP) expression system in which the essential bacterial elements for plasmid replication and selection are placed within an engineered intron contained within a universal 5′ UTR noncoding exon. Like minicircle DNA plasmids (devoid of bacterial backbone sequences), MIP plasmids overcome transcriptional silencing of the transgene. However, in addition MIP plasmids increase transgene expression by 2 and often >10 times higher than minicircle vectors in vivo and in vitro. Based on these findings, we examined the effects of the MIP intronic sequences in a recombinant adeno-associated virus (AAV) vector system. Recombinant AAV vectors containing an intron with a bacterial replication origin and bacterial selectable marker increased transgene expression by 40 to 100 times in vivo when compared with conventional AAV vectors. Therefore, inclusion of this noncoding exon/intron sequence upstream of the coding region can substantially enhance AAV-mediated gene expression in vivo. PMID:27903072
ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William
2008-04-01
ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear, time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.
Sakura, Midori; Lambrinos, Dimitrios; Labhart, Thomas
2008-02-01
Many insects exploit skylight polarization for visual compass orientation or course control. As found in crickets, the peripheral visual system (optic lobe) contains three types of polarization-sensitive neurons (POL neurons), which are tuned to different (approximately 60 degrees diverging) e-vector orientations. Thus each e-vector orientation elicits a specific combination of activities among the POL neurons, coding any e-vector orientation by just three neural signals. In this study, we hypothesize that in the presumed orientation center of the brain (central complex) e-vector orientation is population-coded by a set of "compass neurons." Using computer modeling, we present a neural network model transforming the signal triplet provided by the POL neurons into compass neuron activities coding e-vector orientation by a population code. Using intracellular electrophysiology and cell marking, we present evidence that neurons with the response profile of the presumed compass neurons do indeed exist in the insect brain: each of these compass neuron-like (CNL) cells is activated by a specific e-vector orientation only and otherwise remains silent. Morphologically, CNL cells are tangential neurons extending from the lateral accessory lobe to the lower division of the central body. Surpassing the modeled compass neurons in performance, CNL cells are insensitive to the degree of polarization of the stimulus from 99% down to at least 18% polarization and thus largely disregard variations of skylight polarization due to changing solar elevations or atmospheric conditions. This suggests that the polarization vision system includes a gain control circuit keeping the output activity at a constant level.
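As a minimal numerical sketch of the population-coding idea described above, the following Python fragment assumes the POL neuron responses follow a cosine tuning in twice the e-vector angle (e-vectors are axial, with a 180-degree period) with preferred orientations 60 degrees apart, and decodes orientation with a bank of template-matching "compass" units; all function names and parameter values are illustrative and are not taken from the published model.

import numpy as np

def pol_responses(evector_deg, prefs_deg=(0.0, 60.0, 120.0)):
    """Responses of three POL neurons to an e-vector orientation.
    E-vectors are axial (period 180 deg), so tuning uses twice the angle."""
    phi = np.deg2rad(evector_deg)
    prefs = np.deg2rad(np.asarray(prefs_deg))
    return np.cos(2.0 * (phi - prefs))            # the signal triplet

def compass_population(triplet, n_compass=36, prefs_deg=(0.0, 60.0, 120.0)):
    """Map the POL triplet onto a bank of compass units, each tuned to one
    e-vector orientation; the most active unit codes the stimulus orientation."""
    units_deg = np.arange(n_compass) * 180.0 / n_compass
    templates = np.array([pol_responses(u, prefs_deg) for u in units_deg])
    activity = templates @ np.asarray(triplet)    # template matching
    activity = np.maximum(activity, 0.0)          # non-preferred units stay silent
    return units_deg, activity

if __name__ == "__main__":
    triplet = pol_responses(75.0)                 # stimulate with a 75-degree e-vector
    units, act = compass_population(triplet)
    print("decoded e-vector: %.1f deg" % units[np.argmax(act)])

With three equally spaced preferred orientations, the template-matched activity peaks exactly at the stimulating e-vector angle, which is the essential property the modeled compass neurons are required to have.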
Computational models for the viscous/inviscid analysis of jet aircraft exhaust plumes
NASA Astrophysics Data System (ADS)
Dash, S. M.; Pergament, H. S.; Thorpe, R. D.
1980-05-01
Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code which is a compact version of a generalized shock capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen chemistry version of BOAT containing the same thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure with interactive effects accounted for by a displacement thickness type correction to the inviscid plume interface.
Continuation of research into language concepts for the mission support environment: Source code
NASA Technical Reports Server (NTRS)
Barton, Timothy J.; Ratner, Jeremiah M.
1991-01-01
Research into language concepts for the Mission Control Center is presented. The source code is presented. The file contains the routines which allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory. The build process places as much of the code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.
Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base
NASA Astrophysics Data System (ADS)
Savage, B.; Snoke, J. A.
2017-12-01
The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size, from 16 to 32 in the 1980s and from 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion. For the past thirty-plus years, SAC files have contained a fixed-length header. Time and distance-related values are stored in single precision, which has become a problem with the increase in desired precision for data compared to thirty years ago. A future goal is to address this precision problem, but in a backward compatible manner. We would also like to transition SAC to a more open source license.
Enhancements to the SSME transfer function modeling code
NASA Technical Reports Server (NTRS)
Irwin, R. Dennis; Mitchell, Jerrel R.; Bartholomew, David L.; Glenn, Russell D.
1995-01-01
This report details the results of a one-year effort by Ohio University to apply the transfer function modeling and analysis tools developed under NASA Grant NAG8-167 (Irwin, 1992), (Bartholomew, 1992) to attempt the generation of Space Shuttle Main Engine High Pressure Turbopump transfer functions from time domain data. In addition, new enhancements to the transfer function modeling codes which extend the codes' functionality are presented, along with some ideas for improved modeling methods and future work. Section 2 contains a review of the analytical background used to generate transfer functions with the SSME transfer function modeling software. Section 2.1 presents the 'ratio method' developed for obtaining models of systems that are subject to single unmeasured excitation sources and have two or more measured output signals. Since most of the models developed during the investigation use the Eigensystem Realization Algorithm (ERA) for model generation, Section 2.2 presents an introduction to ERA, and Section 2.3 describes how it can be used to model spectral quantities. Section 2.4 details the Residue Identification Algorithm (RID) including the use of Constrained Least Squares (CLS) and Total Least Squares (TLS). Most of this information can be found in the report (and is repeated for convenience). Section 3 chronicles the effort of applying the SSME transfer function modeling codes to the a51p394.dat and a51p1294.dat time data files to generate transfer functions from the unmeasured input to the 129.4 degree sensor output. Included are transfer function modeling attempts using five methods. The first method is a direct application of the SSME codes to the data files and the second method uses the underlying trends in the spectral density estimates to form transfer function models with less clustering of poles and zeros than the models obtained by the direct method. In the third approach, the time data is low-pass filtered prior to the modeling process in an effort to filter out high-frequency characteristics. The fourth method removes the presumed system excitation and its harmonics in order to investigate the effects of the excitation on the modeling process. The fifth method is an attempt to apply constrained RID to obtain better transfer functions through more accurate modeling over certain frequency ranges. Section 4 presents some new C main files which were created to round out the functionality of the existing SSME transfer function modeling code. It is now possible to go from time data to transfer function models using only the C codes; it is not necessary to rely on external software. The new C main files and instructions for their use are included. Section 5 presents current and future enhancements to the XPLOT graphics program which was delivered with the initial software. Several new features which have been added to the program are detailed in the first part of this section. The remainder of Section 5 then lists some possible features which may be added in the future. Section 6 contains the conclusions of this report. Section 6.1 is an overview of the work including a summary and observations relating to finding transfer functions with the SSME code. Section 6.2 contains information relating to future work on the project.
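Since most of the models in the report are generated with the Eigensystem Realization Algorithm, a minimal sketch of a SISO ERA identification step may help orient the reader. This is the generic textbook formulation written in Python, not the SSME code itself; the Hankel block sizes, model order, and the illustrative second-order system are assumptions made for the example only.

import numpy as np

def era(markov, order, rows=20, cols=20):
    """Eigensystem Realization Algorithm (SISO): identify a discrete-time
    state-space model (A, B, C) from impulse-response Markov parameters,
    markov[k] being the output k samples after the unit pulse."""
    H0 = np.array([[markov[i + j]     for j in range(cols)] for i in range(rows)])
    H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :order], s[:order], Vt[:order, :]
    S_half = np.diag(np.sqrt(s))
    S_half_inv = np.diag(1.0 / np.sqrt(s))
    A = S_half_inv @ U.T @ H1 @ Vt.T @ S_half_inv
    B = (S_half @ Vt)[:, :1]      # first column of the controllability factor
    C = (U @ S_half)[:1, :]       # first row of the observability factor
    return A, B, C

if __name__ == "__main__":
    # Illustrative data: impulse response of a lightly damped 2nd-order system.
    a_true = np.array([[1.6, -0.81], [1.0, 0.0]])
    b_true = np.array([[1.0], [0.0]])
    c_true = np.array([[0.5, 0.3]])
    y, x = [], b_true.copy()
    for _ in range(60):
        y.append(float(c_true @ x))
        x = a_true @ x
    A, B, C = era(y, order=2)
    print("identified poles:", np.linalg.eigvals(A))   # should match eig(a_true)

The identified poles reproduce those of the simulated system; transfer function magnitude and phase then follow from evaluating C(zI - A)^-1 B around the unit circle.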
Quick Response (QR) Codes for Audio Support in Foreign Language Learning
ERIC Educational Resources Information Center
Vigil, Kathleen Murray
2017-01-01
This study explored the potential benefits and barriers of using quick response (QR) codes as a means by which to provide audio materials to middle-school students learning Spanish as a foreign language. Eleven teachers of Spanish to middle-school students created transmedia materials containing QR codes linking to audio resources. Students…
77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... the process. The Code Revision Process contains four basic steps that are followed for developing new documents as well as revising existing documents. Step 1: Public Input Stage, which results in the First Draft Report (formerly ROP); Step 2: Comment Stage, which results in the Second Draft Report (formerly...
Building codes : obstacle or opportunity?
Alberto Goetzl; David B. McKeever
1999-01-01
Building codes are critically important in the use of wood products for construction. The codes contain regulations that are prescriptive or performance related for various kinds of buildings and construction types. A prescriptive standard might dictate that a particular type of material be used in a given application. A performance standard requires that a particular...
Computer-Based Learning of Spelling Skills in Children with and without Dyslexia
ERIC Educational Resources Information Center
Kast, Monika; Baschera, Gian-Marco; Gross, Markus; Jancke, Lutz; Meyer, Martin
2011-01-01
Our spelling training software recodes words into multisensory representations comprising visual and auditory codes. These codes represent information about letters and syllables of a word. An enhanced version, developed for this study, contains an additional phonological code and an improved word selection controller relying on a phoneme-based…
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.P. Cramer & Associates, Inc.
2002-05-31
We recently received data on the decoded coded wire tags (CWTs) recovered from spring chinook snouts we collected during spawning surveys in the Clearwater Basin last fall (2001). We were curious about what could be learned from the tags recovered (even though our project is over), so we did some cursory analyses and have described our findings in the attached memo. Snouts were processed and codes determined by Idaho Department of Fish and Game. Most snouts did not contain CWTs, because most ad-clipped fish were not given a CWT. Further, because adults were outplanted live, we do not know what codes they contained. Each of the hatcheries from which outplanted adults were obtained had several CWT code groups returning. That means that the best we can do with the codes recovered is compare the hatchery of origin for the tag with the hatchery from which outplants were taken. The results are interesting and not exactly as we would have predicted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamberg, L.D.
1998-02-23
This document serves as a notice of construction (NOC), pursuant to the requirements of Washington Administrative Code (WAC) 246-247-060, and as a request for approval to construct, pursuant to 40 Code of Federal Regulations (CFR) 61.07, for the Integrated Water Treatment System (IWTS) Filter Vessel Sparging Vent at 105-KW Basin. Additionally, the following description, and references are provided as the notices of startup, pursuant to 40 CFR 61.09(a)(1) and (2) in accordance with Title 40 Code of Federal Regulations, Part 61, National Emission Standards for Hazardous Air Pollutants. The 105-K West Reactor and its associated spent nuclear fuel (SNF) storage basin were constructed in the early 1950s and are located on the Hanford Site in the 100-K Area about 1,400 feet from the Columbia River. The 105-KW Basin contains 964 Metric Tons of SNF stored under water in approximately 3,800 closed canisters. This SNF has been stored for varying periods of time ranging from 8 to 17 years. The 105-KW Basin is constructed of concrete with an epoxy coating and contains approximately 1.3 million gallons of water with an asphaltic membrane beneath the pool. The IWTS, which has been described in the Radioactive Air Emissions NOC for Fuel Removal for 105-KW Basin (DOE/RL-97-28 and page changes per US Department of Energy, Richland Operations Office letter 97-EAP-814) will be used to remove radionuclides from the basin water during fuel removal operations. The purpose of the modification described herein is to provide operational flexibility for the IWTS at the 105-KW basin. The proposed modification is scheduled to begin in calendar year 1998.
Luque-Almagro, V M; Escribano, M P; Manso, I; Sáez, L P; Cabello, P; Moreno-Vivián, C; Roldán, M D
2015-11-20
Pseudomonas pseudoalcaligenes CECT5344 is an alkaliphilic bacterium that can use cyanide as a nitrogen source for growth, making it a suitable candidate for application in the biological treatment of cyanide-containing wastewaters. The assessment of the whole genome sequence of the strain CECT5344 has allowed the generation of DNA microarrays to analyze the response to different nitrogen sources. The mRNA of P. pseudoalcaligenes CECT5344 cells grown under nitrogen-limiting conditions showed considerable changes when compared against the transcripts from cells grown with ammonium; up-regulated genes were, among others, the glnK gene encoding the nitrogen regulatory protein PII, the two-component ntrBC system involved in global nitrogen regulation, and the ammonium transporter-encoding amtB gene. The protein coding transcripts of P. pseudoalcaligenes CECT5344 cells grown with sodium cyanide or an industrial jewelry wastewater that contains high concentrations of cyanide and metals like iron, copper and zinc, were also compared against the transcripts of cells grown with ammonium as nitrogen source. This analysis revealed the induction by cyanide and the cyanide-rich wastewater of four nitrilase-encoding genes, including the nitC gene that is essential for cyanide assimilation, the cyanase cynS gene involved in cyanate assimilation, the cioAB genes required for cyanide-insensitive respiration, and the ahpC gene coding for an alkyl-hydroperoxide reductase that could be related to iron homeostasis and oxidative stress. The nitC and cynS genes were also induced in cells grown under nitrogen starvation conditions. In cells grown with the jewelry wastewater, a malate:quinone oxidoreductase (mqoB) gene and several genes coding for metal extrusion systems were specifically induced. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Empirical transfer functions for stations in the Central California seismological network
Bakun, W.H.; Dratler, Jay
1976-01-01
A sequence of calibration signals composed of a station identification code, a transient from the release of the seismometer mass at rest from a known displacement from the equilibrium position, and a transient from a known step in voltage to the amplifier input is generated by the automatic daily calibration system (ADCS) now operational in the U.S. Geological Survey central California seismographic network. Documentation of a sequence of interactive programs to compute, from the calibration data, the complex transfer functions for the seismographic system (ground motion through digitizer), the electronics (amplifier through digitizer), and the seismometer alone is presented. The analysis utilizes the Fourier transform technique originally suggested by Espinosa et al. (1962). Section I is a general description of seismographic calibration. Section II contrasts the 'Fourier transform' and the 'least-squares' techniques for analyzing transient calibration signals. Theoretical considerations for the Fourier transform technique used here are described in Section III. Section IV is a detailed description of the sequence of calibration signals generated by the ADCS. Section V is a brief 'cookbook description' of the calibration programs; Section VI contains a detailed sample program execution. Section VII suggests uses of the resultant empirical transfer functions. Supplemental interactive programs that produce smooth response functions, suitable for reducing seismic data to ground motion, are also documented in Section VII. Appendices A and B contain complete listings of the Fortran source codes, while Appendix C is an update containing preliminary results obtained from an analysis of some of the calibration signals from stations in the seismographic network near Oroville, California.
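The core of the Fourier transform technique is estimating the empirical transfer function as the ratio of the spectra of a recorded calibration transient and the known input signal. The sketch below illustrates that one step in Python with numpy; the variable names, time constant, and step input are assumptions for illustration and are not taken from the Fortran codes documented here.

import numpy as np

def empirical_transfer_function(input_signal, output_signal, dt):
    """Estimate H(f) = FFT(output) / FFT(input) from a calibration transient.
    Both signals are sampled at interval dt; a small floor avoids division
    by near-zero spectral values of the input."""
    n = len(input_signal)
    freqs = np.fft.rfftfreq(n, d=dt)
    X = np.fft.rfft(input_signal)
    Y = np.fft.rfft(output_signal)
    eps = 1e-12 * np.max(np.abs(X))
    H = Y / np.where(np.abs(X) < eps, eps, X)
    return freqs, H

if __name__ == "__main__":
    # Illustrative calibration: a known voltage step into a single-pole electronics stage.
    dt, n = 0.01, 2048
    t = np.arange(n) * dt
    step = np.ones(n)                        # known step input to the amplifier
    tau = 0.5                                # assumed electronics time constant
    response = 1.0 - np.exp(-t / tau)        # recorded transient
    f, H = empirical_transfer_function(step, response, dt)
    print("gain at %.2f Hz: %.3f" % (f[10], abs(H[10])))

In practice the recorded and theoretical transients would be windowed and the resulting transfer function smoothed before being used to reduce seismic data to ground motion, as the documented programs do.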
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacKinnon, R.J.; Sullivan, T.M.; Kinsey, R.R.
1997-05-01
The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.
Laser identification system based on acousto-optical barcode scanner principles
NASA Astrophysics Data System (ADS)
Khansuvarov, Ruslan A.; Korol, Georgy I.; Preslenev, Leonid N.; Bestugin, Aleksandr R.; Paraskun, Arthur S.
2016-09-01
The main purpose of the bar code in the modern world is the unique identification of a product, service, or any of their features, which is why personal and stationary barcode scanners are so widely used. Important parameters of bar code scanners include reliability, barcode recognition accuracy, response time, and performance. Nowadays, the most popular personal barcode scanners contain a mechanical part, which significantly impairs their reliability. A group of SUAI engineers has proposed a bar code scanner based on the acousto-optic deflection of a laser beam in crystals [RU patent No. 156009, issued 4/16/2015]. Through the use of an acousto-optic deflector element, the scanner described by the SUAI engineers can be implemented in both handheld and stationary form factors. Being a wave electronic device, the acousto-optic element establishes a clear mathematical link between the encoded bar code function and the intensity function received at the photodetector, which makes highly reliable bar code recognition possible. This paper describes the issued patent, the principles of operation based on mathematical analysis, and the layout of the implemented scanner.
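To make the link between the bar code's encoded function and the photodetector intensity function concrete, the following Python sketch thresholds a sampled intensity trace and run-length-encodes it into bar/space widths. The sampling, threshold, and trace are illustrative assumptions; this is not the SUAI scanner's decoding algorithm.

import numpy as np

def widths_from_intensity(intensity, threshold=0.5):
    """Convert a sampled photodetector intensity trace into alternating
    bar/space widths (in samples): dark bars reflect little light."""
    dark = (np.asarray(intensity) < threshold).astype(int)
    edges = np.flatnonzero(np.diff(dark)) + 1
    segments = np.split(dark, edges)
    return [("bar" if seg[0] else "space", len(seg)) for seg in segments]

if __name__ == "__main__":
    # Illustrative trace: bright background with two bars of different widths.
    trace = [1.0] * 5 + [0.1] * 3 + [0.9] * 4 + [0.1] * 6 + [1.0] * 5
    print(widths_from_intensity(trace))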
HEVC for high dynamic range services
NASA Astrophysics Data System (ADS)
Kim, Seung-Hwan; Zhao, Jie; Misra, Kiran; Segall, Andrew
2015-09-01
Displays capable of showing a greater range of luminance values can render content containing high dynamic range information in a way that gives viewers a more immersive experience. This paper introduces the design aspects of a high dynamic range (HDR) system and examines the performance of the HDR processing chain in terms of compression efficiency. Specifically, it examines the relation between the recently introduced Society of Motion Picture and Television Engineers (SMPTE) ST 2084 transfer function and the High Efficiency Video Coding (HEVC) standard. SMPTE ST 2084 is designed to cover the full range of an HDR signal from 0 to 10,000 nits; however, in many situations the valid signal range of actual video may be smaller than the range supported by SMPTE ST 2084. This restricted signal range results in a restricted range of code values for the input video data and adversely impacts compression efficiency. In this paper, we propose a code value remapping method that extends the restricted-range code values into full-range code values so that existing standards such as HEVC can better compress the video content. The paper also identifies related non-normative, encoder-only changes that are required by the remapping method for a fair comparison with the anchor. Results are presented comparing the efficiency of the current approach versus the proposed remapping method for HM-16.2.
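A minimal sketch of the remapping idea, assuming 10-bit code values and a linear stretch of a known valid range onto the full code-value range; the exact remapping and signalling used in the paper are not reproduced here, and the range values in the example are illustrative only.

import numpy as np

def remap_to_full_range(code, v_min, v_max, bit_depth=10):
    """Linearly stretch restricted-range code values [v_min, v_max] onto the
    full code-value range before encoding, so the encoder's quantizer sees
    finer steps; (v_min, v_max) must be signalled so the decoder can invert
    the mapping after reconstruction."""
    full_max = (1 << bit_depth) - 1
    code = np.clip(code, v_min, v_max).astype(np.float64)
    return np.round((code - v_min) / (v_max - v_min) * full_max).astype(np.int32)

def inverse_remap(code, v_min, v_max, bit_depth=10):
    """Decoder-side inverse: map full-range code values back to the original range."""
    full_max = (1 << bit_depth) - 1
    return np.round(code.astype(np.float64) / full_max * (v_max - v_min) + v_min).astype(np.int32)

if __name__ == "__main__":
    # Illustrative: content graded well below 10,000 nits occupies only a
    # restricted span of SMPTE ST 2084 code values.
    restricted = np.array([64, 300, 520, 723], dtype=np.int32)
    stretched = remap_to_full_range(restricted, v_min=64, v_max=723)
    print(stretched)
    print(inverse_remap(stretched, 64, 723))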
The Deterministic Mine Burial Prediction System
2009-01-12
...or below the water-line, initial linear and angular velocities, and fall angle relative to the mine's axis of symmetry. Other input data needed... c. Run_DMBP.m: start-up MATLAB script for the program. 2. C:\DMBP\DMBP_src: This directory contains source code, geotechnical databases, and... approved for public release). b. \Impact_35: The IMPACT35 model. c. \MakeTPARfiles: scripts for creating wave height and wave period input data from...
1988-05-12
the "load IC" menu option. A prompt will appear in the typescript window requesting the name of the knowledge base to be loaded. Enter...highlighted and then a prompt will appear in the typescript window. The prompt will be requesting the name of the file containing the message to be read in...the file name, the system will begin reading in the message. The listified message is echoed back in the typescript window. After that, the screen
NWRA AVOSS Wake Vortex Prediction Algorithm. 3.1.1
NASA Technical Reports Server (NTRS)
Robins, R. E.; Delisi, D. P.; Hinton, David (Technical Monitor)
2002-01-01
This report provides a detailed description of the wake vortex prediction algorithm used in the Demonstration Version of NASA's Aircraft Vortex Spacing System (AVOSS). The report includes all equations used in the algorithm, an explanation of how to run the algorithm, and a discussion of how the source code for the algorithm is organized. Several appendices contain important supplementary information, including suggestions for enhancing the algorithm and results from test cases.
Software Management Environment (SME) installation guide
NASA Technical Reports Server (NTRS)
Kistler, David; Jeletic, Kellyann
1992-01-01
This document contains installation information for the Software Management Environment (SME), developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides a list of hardware and software requirements as well as detailed installation instructions and trouble-shooting information.
Symbolic Execution Enhanced System Testing
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Pasareanu, Corina S.; Raman, Vishwanath
2012-01-01
We describe a testing technique that uses information computed by symbolic execution of a program unit to guide the generation of inputs to the system containing the unit, in such a way that the unit's, and hence the system's, coverage is increased. The symbolic execution computes unit constraints at run-time, along program paths obtained by system simulations. We use machine learning techniques, treatment learning and function fitting, to approximate the system input constraints that will lead to the satisfaction of the unit constraints. Execution of system input predictions either uncovers new code regions in the unit under analysis or provides information that can be used to improve the approximation. We have implemented the technique and we have demonstrated its effectiveness on several examples, including one from the aerospace domain.
Documents Pertaining to Resource Conservation and Recovery Act Corrective Action Event Codes
Document containing RCRA Corrective Action event codes and definitions, including national requirements, initiating sources, dates, and guidance, from the first facility assessment until the Corrective Action is terminated.
Second NASA Workshop on Wiring for Space Applications
NASA Technical Reports Server (NTRS)
1994-01-01
This document contains the proceedings of the Second NASA Workshop on Wiring for Space Applications held at NASA LeRC in Cleveland, OH, 6-7 Oct. 1993. The workshop was sponsored by NASA Headquarters Code QW Office of Safety and Mission Quality, Technical Standards Division and hosted by NASA LeRC, Power Technology Division, Electrical Components and Systems Branch. The workshop addressed key technology issues in the field of electrical power wiring for space applications. Speakers from government, industry, and academia presented and discussed topics on arc tracking phenomena, wiring system design, insulation constructions, and system protection. Presentation materials provided by the various speakers are included in this document.
The complete mitochondrial genome of Hydra vulgaris (Hydroida: Hydridae).
Pan, Hong-Chun; Fang, Hong-Yan; Li, Shi-Wei; Liu, Jun-Hong; Wang, Ying; Wang, An-Tai
2014-12-01
The complete mitochondrial genome of Hydra vulgaris (Hydroida: Hydridae) is composed of two linear DNA molecules. The mitochondrial DNA (mtDNA) molecule 1 is 8010 bp long and contains six protein-coding genes, the large subunit rRNA, methionine and tryptophan tRNAs, two pseudogenes each consisting of a partial copy of COI, and terminal sequences at the two ends of the linear mtDNA, while mtDNA molecule 2 is 7576 bp long and contains seven protein-coding genes, the small subunit rRNA, a methionine tRNA, a pseudogene consisting of a partial copy of COI, and terminal sequences at the two ends of the linear mtDNA. The COI gene begins with GTG as its start codon, whereas the other 12 protein-coding genes start with a typical ATG initiation codon. In addition, all protein-coding genes terminate with TAA as the stop codon.
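A minimal sketch of the codon bookkeeping behind statements like these, using plain Python string handling; the example sequence is a hypothetical toy coding sequence, not an actual H. vulgaris gene.

def codon_summary(cds):
    """Report start codon, stop codon, and length (in codons) of a coding sequence."""
    cds = cds.upper().replace("U", "T")
    start, stop = cds[:3], cds[-3:]
    return {
        "start_codon": start,
        "typical_ATG_start": start == "ATG",
        "alternative_GTG_start": start == "GTG",   # as reported here for COI
        "stop_codon": stop,
        "TAA_stop": stop == "TAA",
        "codons": len(cds) // 3,
    }

if __name__ == "__main__":
    # Hypothetical toy CDS, not taken from the H. vulgaris mtDNA.
    print(codon_summary("GTGGCTAAAGGTTAA"))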
Fukushima Daiichi Unit 1 Ex-Vessel Prediction: Core Concrete Interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robb, Kevin R; Farmer, Mitchell; Francis, Matthew W
Lower head failure and corium-concrete interaction were predicted to occur at Fukushima Daiichi Unit 1 (1F1) by several different system-level code analyses, including MELCOR v2.1 and MAAP5. Although these codes capture a wide range of accident phenomena, they do not contain detailed models for ex-vessel core melt behavior. However, specialized codes exist for analysis of ex-vessel melt spreading (e.g., MELTSPREAD) and long-term debris coolability (e.g., CORQUENCH). On this basis, an analysis was carried out to further evaluate ex-vessel behavior for 1F1 using MELTSPREAD and CORQUENCH. Best-estimate melt pour conditions predicted by MELCOR v2.1 and MAAP5 were used as input. MELTSPREAD was then used to predict the spatially dependent melt conditions and extent of spreading during relocation from the vessel. The results of the MELTSPREAD analysis are reported in a companion paper. This information was used as input for the long-term debris coolability analysis with CORQUENCH.
Fukushima Daiichi Unit 1 ex-vessel prediction: Core melt spreading
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, M. T.; Robb, K. R.; Francis, M. W.
Lower head failure and corium-concrete interaction were predicted to occur at Fukushima Daiichi Unit 1 (1F1) by several different system-level code analyses, including MELCOR v2.1 and MAAP5. Although these codes capture a wide range of accident phenomena, they do not contain detailed models for ex-vessel core melt behavior. However, specialized codes exist for analysis of ex-vessel melt spreading (e.g., MELTSPREAD) and long-term debris coolability (e.g., CORQUENCH). On this basis, an analysis has been carried out to further evaluate ex-vessel behavior for 1F1 using MELTSPREAD and CORQUENCH. Best-estimate melt pour conditions predicted by MELCOR v2.1 and MAAP5 were used as input. MELTSPREAD was then used to predict the spatially-dependent melt conditions and extent of spreading during relocation from the vessel. Lastly, this information was then used as input for the long-term debris coolability analysis with CORQUENCH that is reported in a companion paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stofleth, Jerome H.; Tribble, Megan Kimberly; Crocker, Robert W.
2017-05-01
The V27 containment vessel was procured by the US Army Recovered Chemical Material Directorate (RCMD) as a replacement vessel for use on the P2 Explosive Destruction Systems. It is the third EDS vessel to be fabricated under Code Case 2564 of the ASME Boiler and Pressure Vessel Code, which provides rules for the design of impulsively loaded vessels. The explosive rating for the vessel, based on the Code Case, is nine (9) pounds TNT-equivalent for up to 637 detonations. This report documents the results of explosive tests that were done on the vessel at Sandia National Laboratories in Albuquerque, New Mexico to qualify the vessel for explosive use. The primary qualification test consisted of six 1.5-pound charges of Composition C-4 (equivalent to 11.25 pounds TNT) distributed around the vessel in accordance with the User Design Specification. Four subsequent tests using less explosive evaluated the effects of slight variations in orientation of the charges. All vessel acceptance criteria were met.
Neural networks for data compression and invariant image recognition
NASA Technical Reports Server (NTRS)
Gardner, Sheldon
1989-01-01
An approach to invariant image recognition (I2R), based upon a model of biological vision in the mammalian visual system (MVS), is described. The complete I2R model incorporates several biologically inspired features: exponential mapping of retinal images, Gabor spatial filtering, and a neural network associative memory. In the I2R model, exponentially mapped retinal images are filtered by a hierarchical set of Gabor spatial filters (GSF) which provide compression of the information contained within a pixel-based image. A neural network associative memory (AM) is used to process the GSF coded images. We describe a 1-D shape function method for coding of scale and rotationally invariant shape information. This method reduces image shape information to a periodic waveform suitable for coding as an input vector to a neural network AM. The shape function method is suitable for near term applications on conventional computing architectures equipped with VLSI FFT chips to provide a rapid image search capability.
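As a minimal sketch of one of the biologically inspired pieces mentioned above, the following Python fragment builds a real-valued 2-D Gabor spatial filter kernel and applies it to a toy image by FFT convolution; parameter values are illustrative, and the hierarchical GSF bank, exponential retinal mapping, and associative memory of the I2R model are not reproduced.

import numpy as np

def gabor_kernel(size=31, wavelength=8.0, theta=0.0, sigma=4.0, phase=0.0):
    """Real 2-D Gabor kernel: a sinusoidal carrier at orientation theta
    modulated by a Gaussian envelope, as used for compressive spatial filtering."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_rot = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * x_rot / wavelength + phase)
    return envelope * carrier

if __name__ == "__main__":
    # Filter a toy "retinal" image with one Gabor channel via FFT convolution.
    rng = np.random.default_rng(0)
    image = rng.random((128, 128))
    k = gabor_kernel(theta=np.pi / 4)
    pad = np.zeros_like(image)
    pad[:k.shape[0], :k.shape[1]] = k
    filtered = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(pad)))
    print(filtered.shape)

A bank of such kernels at several orientations and wavelengths gives the kind of compressed, orientation-selective representation that the associative memory stage then operates on.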
Portal of medical data models: information infrastructure for medical research and healthcare.
Dugas, Martin; Neuhaus, Philipp; Meidt, Alexandra; Doods, Justin; Storck, Michael; Bruland, Philipp; Varghese, Julian
2016-01-01
Information systems are a key success factor for medical research and healthcare. Currently, most of these systems apply heterogeneous and proprietary data models, which impede data exchange and integrated data analysis for scientific purposes. Due to the complexity of medical terminology, the overall number of medical data models is very high. At present, the vast majority of these models are not available to the scientific community. The objective of the Portal of Medical Data Models (MDM, https://medical-data-models.org) is to foster sharing of medical data models. MDM is a registered European information infrastructure. It provides a multilingual platform for exchange and discussion of data models in medicine, both for medical research and healthcare. The system is developed in collaboration with the University Library of Münster to ensure sustainability. A web front-end enables users to search, view, download and discuss data models. Eleven different export formats are available (ODM, PDF, CDA, CSV, MACRO-XML, REDCap, SQL, SPSS, ADL, R, XLSX). MDM contents were analysed with descriptive statistics. MDM contains 4387 current versions of data models (in total 10,963 versions). 2475 of these models belong to oncology trials. The most common keyword (n = 3826) is 'Clinical Trial'; the most frequent diseases are breast cancer, leukemia, lung and colorectal neoplasms. The most common languages of data elements are English (n = 328,557) and German (n = 68,738). Semantic annotations (UMLS codes) are available for 108,412 data items, 2453 item groups and 35,361 code list items. Overall, 335,087 UMLS code assignments are made, involving 21,847 unique codes. A few UMLS codes are used several thousand times, but there is a long tail of rarely used codes in the frequency distribution. Expected benefits of the MDM portal are improved and accelerated design of medical data models by sharing best practice, more standardised data models with semantic annotation and better information exchange between information systems, in particular Electronic Data Capture (EDC) and Electronic Health Records (EHR) systems. Contents of the MDM portal need to be further expanded to reach broad coverage of all relevant medical domains. Database URL: https://medical-data-models.org. © The Author(s) 2016. Published by Oxford University Press.
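As a minimal sketch of how the long-tailed UMLS code usage distribution mentioned above could be tabulated from an export of annotations, the Python fragment below counts code assignments with a standard Counter; the input list is hypothetical and the MDM export format is not modelled here.

from collections import Counter

def usage_distribution(umls_codes):
    """Count how often each UMLS code is assigned and summarise the tail."""
    counts = Counter(umls_codes)
    ranked = counts.most_common()
    singletons = sum(1 for _, n in ranked if n == 1)
    return {
        "unique_codes": len(ranked),
        "total_assignments": sum(counts.values()),
        "top_codes": ranked[:3],
        "codes_used_once": singletons,   # the long tail
    }

if __name__ == "__main__":
    # Hypothetical annotation export: a few very common codes, many rare ones.
    codes = ["C0006826"] * 5000 + ["C0023418"] * 2000 + [f"C{n:07d}" for n in range(1500)]
    print(usage_distribution(codes))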
NASA Technical Reports Server (NTRS)
Stagliano, T. R.; Witmer, E. A.; Rodal, J. J. A.
1979-01-01
Finite element modeling alternatives as well as the utility and limitations of the two dimensional structural response computer code CIVM-JET 4B for predicting the transient, large deflection, elastic plastic, structural responses of two dimensional beam and/or ring structures which are subjected to rigid fragment impact were investigated. The applicability of the CIVM-JET 4B analysis and code for the prediction of steel containment ring response to impact by complex deformable fragments from a trihub burst of a T58 turbine rotor was studied. Dimensional analysis considerations were used in a parametric examination of data from engine rotor burst containment experiments and data from sphere beam impact experiments. The use of the CIVM-JET 4B computer code for making parametric structural response studies on both fragment-containment structure and fragment-deflector structure was illustrated. Modifications to the analysis/computation procedure were developed to alleviate restrictions.
Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.
Uzun, Vassilya; Bilgin, Sami
2016-01-01
For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
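A minimal sketch of generating a patient's QR Code Identity Tag as described above, using the third-party Python "qrcode" package as an assumed implementation detail (the paper does not specify how the tags were produced); the URL and patient identifier are hypothetical.

import qrcode  # pip install qrcode[pil]; an assumed implementation choice

def make_identity_tag(patient_id, base_url="https://example-qr-identity.example/patients/"):
    """Encode the per-patient URL of the QR Code Identity website into a QR image.
    Detailed medical data stays on the server behind authorization, not in the code itself."""
    img = qrcode.make(base_url + patient_id)
    filename = f"tag_{patient_id}.png"
    img.save(filename)
    return filename

if __name__ == "__main__":
    print(make_identity_tag("P-000174"))   # hypothetical patient identifier

Encoding only a link, rather than the medical record itself, matches the access model described in the study: a casual smartphone scan reveals less than an authorized emergency-service login.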
Proceedings of the Mobile Satellite System Architectures and Multiple Access Techniques Workshop
NASA Technical Reports Server (NTRS)
Dessouky, Khaled
1989-01-01
The Mobile Satellite System Architectures and Multiple Access Techniques Workshop served as a forum for the debate of system and network architecture issues. Particular emphasis was on those issues relating to the choice of multiple access technique(s) for the Mobile Satellite Service (MSS). These proceedings contain articles that expand upon the 12 presentations given in the workshop. Contrasting views on Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), and Time Division Multiple Access (TDMA)-based architectures are presented, and system issues relating to signaling, spacecraft design, and network management constraints are addressed. An overview article that summarizes the issues raised in the numerous discussion periods of the workshop is also included.
Microcontroller-based underwater acoustic ECG telemetry system.
Istepanian, R S; Woodward, B
1997-06-01
This paper presents a microcontroller-based underwater acoustic telemetry system for digital transmission of the electrocardiogram (ECG). The system is designed for the real-time, through-water transmission of data representing any parameter, and it was used initially for transmitting in multiplexed format the heart rate, breathing rate and depth of a diver using self-contained underwater breathing apparatus (SCUBA). Here, it is used to monitor cardiovascular reflexes during diving and swimming. The programmable capability of the system provides an effective solution to the problem of transmitting data in the presence of multipath interference. An important feature of the paper is a comparative performance analysis of two encoding methods, Pulse Code Modulation (PCM) and Pulse Position Modulation (PPM).
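A minimal sketch contrasting the two encoding methods compared in the paper, PCM and PPM, for a single normalized ECG sample; the bit depth, slot count, and sample value are illustrative assumptions and are unrelated to the actual telemetry hardware.

import numpy as np

def pcm_encode(sample, bits=8, full_scale=1.0):
    """Pulse Code Modulation: quantize the sample to a fixed-length binary word."""
    levels = (1 << bits) - 1
    code = int(round(np.clip(sample, 0.0, full_scale) / full_scale * levels))
    return format(code, f"0{bits}b")

def ppm_encode(sample, slots=64, full_scale=1.0):
    """Pulse Position Modulation: the sample value selects which of `slots`
    time slots in the frame carries the single transmitted pulse."""
    position = int(round(np.clip(sample, 0.0, full_scale) / full_scale * (slots - 1)))
    frame = ["0"] * slots
    frame[position] = "1"
    return "".join(frame)

if __name__ == "__main__":
    ecg_sample = 0.42                      # normalized ECG amplitude
    print("PCM word :", pcm_encode(ecg_sample))
    print("PPM frame:", ppm_encode(ecg_sample))

The design trade-off the paper examines follows directly: PCM packs more information per symbol, while PPM concentrates energy into a single pulse per frame, which can be easier to detect against multipath interference.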
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Bhat, R. B.
1979-01-01
A finite element program is linked with a general purpose optimization program in a 'programming system' which includes user supplied codes that contain problem dependent formulations of the design variables, objective function and constraints. The result is a system adaptable to a wide spectrum of structural optimization problems. In a sample of numerical examples, the design variables are the cross-sectional dimensions and the parameters of overall shape geometry, constraints are applied to stresses, displacements, buckling and vibration characteristics, and structural mass is the objective function. Thin-walled, built-up structures and frameworks are included in the sample. Details of the system organization and characteristics of the component programs are given.
System Design Description for the TMAD Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finfrock, S.H.
This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.
IAC-1.5 - INTEGRATED ANALYSIS CAPABILITY
NASA Technical Reports Server (NTRS)
Vos, R. G.
1994-01-01
The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and a database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a database, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automating data transfer among analysis programs. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation modules are supplied for building and viewing models. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. 3) System dynamics - A DISCOS interface allows full use of this simulation program for either nonlinear time domain analysis or linear frequency domain analysis. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 5) Graphics - The graphics packages PLOT and MOSAIC are included in IAC. PLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc., while MOSAIC generates color raster displays of either tabular of array type data. Either DI3000 or PLOT-10 graphics software is required for full graphics capability. IAC is available by license for a period of 10 years to approved licensees. The licensed program product includes one complete set of supporting documentation. Additional copies of the documentation may be purchased separately. 
IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. The basic central memory requirement is approximately 750KB. IAC includes the executive system, graphics modules, a database, general utilities, and the interfaces to all analysis and controls programs described above. Source code is provided for the control programs ORACLS, SAMSAN, NBOD2, and DISCOS. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA. IAC was developed in 1985.
Simulation studies using multibody dynamics code DART
NASA Technical Reports Server (NTRS)
Keat, James E.
1989-01-01
DART is a multibody dynamics code developed by Photon Research Associates for the Air Force Astronautics Laboratory (AFAL). The code is intended primarily to simulate the dynamics of large space structures, particularly during the deployment phase of their missions. DART integrates nonlinear equations of motion numerically. The number of bodies in the system being simulated is arbitrary. The bodies' interconnection joints can have an arbitrary number of degrees of freedom between 0 and 6. Motions across the joints can be large. Provision for simulating on-board control systems is provided. Conservation of energy and momentum, when applicable, is used to evaluate DART's performance. After a brief description of DART, studies made to test the program prior to its delivery to AFAL are described. The first is a large-angle reorientation of a flexible spacecraft consisting of a rigid central hub and four flexible booms. Reorientation was accomplished by a single-cycle sine-wave torque input. In the second study, an appendage mounted on a spacecraft was slewed through a large angle. Four closed-loop control systems provided control of this appendage and of the spacecraft's attitude. The third study simulated the deployment of the rim of a bicycle-wheel-configuration large space structure. This system contained 18 bodies. An interesting and unexpected feature of the dynamics was a pulsing phenomenon experienced by the stays whose payout was used to control the deployment. A short description of the current status of DART is given.
Investigation of Containment Flooding Strategy for Mark-III Nuclear Power Plant with MAAP4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Su Weinian; Wang, S.-J.; Chiang, S.-C
2005-06-15
Containment flooding is an important strategy for severe accident management of a conventional boiling water reactor (BWR) system. The purpose of this work is to investigate the containment flooding strategy of the Mark-III system after a reactor pressure vessel (RPV) breach. The Kuosheng Power Plant is a typical BWR-6 nuclear power plant (NPP) with Mark-III containment. The Severe Accident Management Guideline (SAMG) of the Kuosheng NPP has been developed based on the BWR Owners Group (BWROG) Emergency Procedure and Severe Accident Guidelines, Rev. 2. Therefore, the Kuosheng NPP is selected as the plant for study, and the MAAP4 code is chosen as the tool for analysis. A postulated specific station blackout sequence for the Kuosheng NPP is cited as a reference case for this analysis. Because of the design features of Mark-III containment, the debris in the reactor cavity may not be submerged after an RPV breach when one follows the containment flooding strategy as suggested in the BWROG generic guideline, and the containment integrity could be challenged eventually. A more specific containment flooding strategy with drywell venting after an RPV breach is investigated, and a more stable plant condition is achieved with this strategy. Accordingly, the containment flooding strategy after an RPV breach will be modified for the Kuosheng SAMG, and these results are applicable to typical Mark-III plants with drywell vent path.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1993-01-01
The results included in the Ph.D. dissertation of Dr. Fu Quan Wang, who was supported by the grant as a Research Assistant from January 1989 through December 1992, are discussed. The sections contain a brief summary of the important aspects of this dissertation, which include: (1) erasure-free sequential decoding of trellis codes; (2) probabilistic construction of trellis codes; (3) construction of robustly good trellis codes; and (4) the separability of shaping and coding.
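As background for the trellis-code work summarized above, here is a minimal sketch of a rate-1/2 convolutional encoder (constraint length 3, generators 7 and 5 octal) whose output sequence traces a path through a trellis; this is a generic textbook example written in Python, not one of the codes constructed in the dissertation.

def convolutional_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2, constraint-length-3 convolutional encoder: each input bit and
    the two previous bits are combined through the generator taps g1, g2
    (7, 5 octal) to produce two output bits per input bit."""
    state = 0                                # two memory bits
    out = []
    for b in bits:
        reg = (b << 2) | state               # [current, previous, previous-previous]
        out.append(bin(reg & g1).count("1") % 2)
        out.append(bin(reg & g2).count("1") % 2)
        state = (reg >> 1) & 0b11            # shift: current bit enters the memory
    return out

if __name__ == "__main__":
    msg = [1, 0, 1, 1, 0, 0]                 # two trailing zeros flush the trellis
    print(convolutional_encode(msg))

A sequential (or Viterbi) decoder then searches this trellis for the transmitted path, which is the setting in which the erasure-free sequential decoding results apply.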
Oil and gas field code master list, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This document contains data collected through October 1993 and provides standardized field name spellings and codes for all identified oil and/or gas fields in the United States. Other Federal and State government agencies, as well as industry, use the EIA Oil and Gas Field Code Master List as the standard for field identification. A machine-readable version of the Oil and Gas Field Code Master List is available from the National Technical Information Service.
Recent plant studies using Victoria 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
BIXLER,NATHAN E.; GASSER,RONALD D.
2000-03-08
VICTORIA 2.0 is a mechanistic computer code designed to analyze fission product behavior within the reactor coolant system (RCS) during a severe nuclear reactor accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and transport and deposition of these materials within the RCS and secondary circuits. These predictions account for the chemical and aerosol processes that affect radionuclide behavior. VICTORIA 2.0 was released in early 1999; a new version, VICTORIA 2.1, is now under development. The largest improvements in VICTORIA 2.1 are connected with the thermochemical database, which is being revised and expanded following the recommendations of a peer review. Three risk-significant severe accident sequences have recently been investigated using the VICTORIA 2.0 code. The focus here is on how various chemistry options affect the predictions. Additionally, the VICTORIA predictions are compared with ones made using the MELCOR code. The three sequences are a station blackout in a GE BWR and steam generator tube rupture (SGTR) and pump-seal LOCA sequences in a 3-loop Westinghouse PWR. These sequences cover a range of system pressures, from fully depressurized to full system pressure. The chief results of this study are the fission product fractions that are retained in the core, RCS, secondary, and containment and the fractions that are released into the environment.
Validation and verification of the laser range safety tool (LRST)
NASA Astrophysics Data System (ADS)
Kennedy, Paul K.; Keppler, Kenneth S.; Thomas, Robert J.; Polhamus, Garrett D.; Smith, Peter A.; Trevino, Javier O.; Seaman, Daniel V.; Gallaway, Robert A.; Crockett, Gregg A.
2003-06-01
The U.S. Dept. of Defense (DOD) is currently developing and testing a number of High Energy Laser (HEL) weapons systems. DOD range safety officers now face the challenge of designing safe methods of testing HELs on DOD ranges. In particular, safety officers need to ensure that diffuse and specular reflections from HEL system targets, as well as direct beam paths, are contained within DOD boundaries. If both the laser source and the target are moving, as they are for the Airborne Laser (ABL), a complex series of calculations is required and manual calculations are impractical. Over the past 5 years, the Optical Radiation Branch of the Air Force Research Laboratory (AFRL/HEDO), the ABL System Program Office, Logicon-RDA, and Northrop Grumman have worked together to develop a computer model called the Laser Range Safety Tool (LRST), specifically designed for HEL reflection hazard analyses. The code, which is still under development, is currently tailored to support the ABL program. AFRL/HEDO has led an LRST Validation and Verification (V&V) effort since 1998, in order to determine if code predictions are accurate. This paper summarizes LRST V&V efforts to date including: i) comparison of code results with laboratory measurements of reflected laser energy and with reflection measurements made during actual HEL field tests, and ii) validation of LRST's hazard zone computations.
Error-correction coding for digital communications
NASA Astrophysics Data System (ADS)
Clark, G. C., Jr.; Cain, J. B.
This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2016-01-01
Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-16
... Interest and Penalty Suspension Provisions Under Section 6404(g) of the Internal Revenue Code AGENCY.... SUMMARY: This document contains final regulations under section 6404(g)(2)(E) of the Internal Revenue Code... Procedure and Administration Regulations (26 CFR part 301) by adding rules under section 6404(g) relating to...
ERIC Educational Resources Information Center
Yip, Tor; Melling, Louise; Shaw, Kirsty J.
2016-01-01
An online instructional database containing information on commonly used pieces of laboratory equipment was created. In order to make the database highly accessible and to promote its use, QR codes were utilized. The instructional materials were available anytime and accessed using QR codes located on the equipment itself and within undergraduate…
NASA Technical Reports Server (NTRS)
Lee, L.-N.
1977-01-01
Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
NASA Technical Reports Server (NTRS)
Lee, L. N.
1976-01-01
Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively small coding complexity, it is proposed to concatenate a byte oriented unit memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real time minimal byte error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
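To illustrate the structure described in the two abstracts above (an outer code wrapped around an inner code, with inner decoding preceding the outer check), the following Python sketch concatenates two deliberately simple toy codes: a single parity-check byte as the outer code and a 3x repetition code as the inner code. These stand in for the Reed-Solomon and unit-memory convolutional codes of the actual system, and the decoder feedback loop is not modeled; all parameters are illustrative.

```python
# Toy concatenated coding sketch: an outer single parity-check byte and an
# inner 3x repetition code stand in for the Reed-Solomon outer code and the
# unit-memory convolutional inner code described above. Decoding feedback from
# the outer decoder to the inner decoder is not modeled.
import random
from functools import reduce

def outer_encode(data_bytes):
    parity = reduce(lambda a, b: a ^ b, data_bytes, 0)   # XOR checksum byte
    return data_bytes + [parity]

def bytes_to_bits(byte_list):
    return [(b >> i) & 1 for b in byte_list for i in range(8)]

def bits_to_bytes(bits):
    return [sum(bits[j + i] << i for i in range(8)) for j in range(0, len(bits), 8)]

def inner_encode(bits, r=3):
    return [bit for b in bits for bit in [b] * r]         # repeat each bit r times

def inner_decode(chan_bits, r=3):
    return [1 if sum(chan_bits[i:i + r]) * 2 > r else 0   # majority vote per group
            for i in range(0, len(chan_bits), r)]

data = [0x48, 0x49, 0x2A]                                 # message bytes
coded = inner_encode(bytes_to_bits(outer_encode(data)))
noisy = [b ^ (1 if random.random() < 0.05 else 0) for b in coded]   # BSC(0.05)
decoded = bits_to_bytes(inner_decode(noisy))
parity_ok = reduce(lambda a, b: a ^ b, decoded, 0) == 0   # outer code as error detector
print("decoded message bytes:", decoded[:-1], "| outer parity check passed:", parity_ok)
```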
Laser electro-optic system for rapid three-dimensional /3-D/ topographic mapping of surfaces
NASA Technical Reports Server (NTRS)
Altschuler, M. D.; Altschuler, B. R.; Taboada, J.
1981-01-01
It is pointed out that the generic utility of a robot in a factory/assembly environment could be substantially enhanced by providing a vision capability to the robot. A standard videocamera for robot vision provides a two-dimensional image which contains insufficient information for a detailed three-dimensional reconstruction of an object. Approaches which supply the additional information needed for the three-dimensional mapping of objects with complex surface shapes are briefly considered and a description is presented of a laser-based system which can provide three-dimensional vision to a robot. The system consists of a laser beam array generator, an optical image recorder, and software for controlling the required operations. The projection of a laser beam array onto a surface produces a dot pattern image which is viewed from one or more suitable perspectives. Attention is given to the mathematical method employed, the space coding technique, the approaches used for obtaining the transformation parameters, the optics for laser beam array generation, the hardware for beam array coding, and aspects of image acquisition.
Trajectories for High Specific Impulse High Specific Power Deep Space Exploration
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)
2002-01-01
Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate. Numerical integration is required for these continuous thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated. The accuracy of this method is discussed in the paper.
Simulation of Trajectories for High Specific Impulse Deep Space Exploration
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)
2002-01-01
Difficulties in approximating flight times and deliverable masses for continuous thrust propulsion systems have complicated comparison and evaluation of proposed propulsion concepts. These continuous thrust propulsion systems are of interest to many groups, not the least of which are the electric propulsion and fusion communities. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. The analytical method derived in the companion paper was also used to simulate the trajectory. The accuracy of this method is discussed in the paper.
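As a rough illustration of the kind of full (Cowell-type) integration of the equations of motion mentioned above, the sketch below propagates a planar two-body orbit with a constant tangential thrust acceleration using a fixed-step RK4 integrator in canonical units. It is not VARITOP, IPOST, or the companion paper's analytical method; the thrust level, step size, and integration span are arbitrary assumptions.

```python
# Minimal Cowell-style integration of a planar low-thrust spiral (illustrative
# only). State y = [x, y, vx, vy]; a constant tangential thrust acceleration
# A_T is assumed, with canonical units mu = 1 and initial circular orbit r = 1.
import math

MU = 1.0           # gravitational parameter (canonical units)
A_T = 0.01         # thrust acceleration, assumed constant and tangential

def deriv(y):
    x, yy, vx, vy = y
    r = math.hypot(x, yy)
    v = math.hypot(vx, vy)
    ax = -MU * x / r**3 + A_T * vx / v    # point-mass gravity + tangential thrust
    ay = -MU * yy / r**3 + A_T * vy / v
    return [vx, vy, ax, ay]

def rk4_step(y, h):
    k1 = deriv(y)
    k2 = deriv([yi + 0.5 * h * ki for yi, ki in zip(y, k1)])
    k3 = deriv([yi + 0.5 * h * ki for yi, ki in zip(y, k2)])
    k4 = deriv([yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6.0 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

state = [1.0, 0.0, 0.0, 1.0]              # circular orbit at r = 1
t, h = 0.0, 0.01
while t < 20.0:                           # spiral outward for a fixed span
    state = rk4_step(state, h)
    t += h
print("final radius:", math.hypot(state[0], state[1]))
```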
2013 R&D 100 Award: "Miniapps" Bolster High Performance Computing
Belak, Jim; Richards, David
2018-06-12
Two Livermore computer scientists served on a Sandia National Laboratories-led team that developed Mantevo Suite 1.0, the first integrated suite of small software programs, also called "miniapps," to be made available to the high performance computing (HPC) community. These miniapps facilitate the development of new HPC systems and the applications that run on them. Miniapps (miniature applications) serve as stripped-down surrogates for complex, full-scale applications that can require a great deal of time and effort to port to a new HPC system because they often consist of hundreds of thousands of lines of code. Each miniapp is a prototype that contains some or all of the essentials of the real application but with many fewer lines of code, making it more versatile for experimentation. This allows researchers to more rapidly explore options and optimize system design, greatly improving the chances the full-scale application will perform successfully. These miniapps have become essential tools for exploring complex design spaces because they can reliably predict the performance of full applications.
A NEW HYBRID N-BODY-COAGULATION CODE FOR THE FORMATION OF GAS GIANT PLANETS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bromley, Benjamin C.; Kenyon, Scott J., E-mail: bromley@physics.utah.edu, E-mail: skenyon@cfa.harvard.edu
2011-04-20
We describe an updated version of our hybrid N-body-coagulation code for planet formation. In addition to the features of our 2006-2008 code, our treatment now includes algorithms for the one-dimensional evolution of the viscous disk, the accretion of small particles in planetary atmospheres, gas accretion onto massive cores, and the response of N-bodies to the gravitational potential of the gaseous disk and the swarm of planetesimals. To validate the N-body portion of the algorithm, we use a battery of tests in planetary dynamics. As a first application of the complete code, we consider the evolution of Pluto-mass planetesimals in a swarm of 0.1-1 cm pebbles. In a typical evolution time of 1-3 Myr, our calculations transform 0.01-0.1 M_sun disks of gas and dust into planetary systems containing super-Earths, Saturns, and Jupiters. Low-mass planets form more often than massive planets; disks with smaller α form more massive planets than disks with larger α. For Jupiter-mass planets, masses of solid cores are 10-100 M_⊕.
2012 financial outlook: physicians and podiatrists.
Schaum, Kathleen D
2012-04-01
Although the nationally unadjusted average Medicare allowable rates have not increased or decreased significantly, the new codes, the new coding regulations, the NCCI edits, and the Medicare contractors' local coverage determinations (LCDs) will greatly impact physicians' and podiatrists' revenue in 2012. Therefore, every wound care physician and podiatrist should take the time to update their charge sheets and their data entry systems with correct codes, units, and appropriate charges (that account for all the resources needed to perform each service or procedure). They should carefully read the LCDs that are pertinent to the work they perform. If the LCDs contain language that is unclear or incorrect, physicians and podiatrists should contact the Medicare contractor medical director and request a revision through the LCD Reconsideration Process. Medicare has stabilized the MPFS allowable rates for 2012; now physicians and podiatrists must do their part to implement the new coding, payment, and coverage regulations. To be sure that the entire revenue process is working properly, physicians and podiatrists should conduct quarterly, if not monthly, audits of their revenue cycle. Healthcare providers will maintain a healthy revenue cycle by conducting internal audits before outside auditors conduct audits that result in repayments that could have been prevented.
Special issue on network coding
NASA Astrophysics Data System (ADS)
Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly
2017-12-01
Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as multi-view video streaming or underwater sensor networks. One can find papers that show how NC makes data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.
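As a minimal, self-contained illustration of the core idea surveyed in this issue (intermediate nodes forward random linear combinations of packets rather than the packets themselves), the sketch below performs random linear network coding over GF(2) and decodes by Gaussian elimination. It is a generic textbook example, not taken from any paper in the special issue.

```python
# Random linear network coding over GF(2): the source mixes k packets into
# coded packets tagged with their coefficient vectors; a receiver recovers the
# originals by Gaussian elimination once it has k independent combinations.
import random

def encode(packets):
    k = len(packets)
    coeffs = [random.randint(0, 1) for _ in range(k)]
    while not any(coeffs):                             # avoid the all-zero combination
        coeffs = [random.randint(0, 1) for _ in range(k)]
    payload = [0] * len(packets[0])
    for c, p in zip(coeffs, packets):
        if c:
            payload = [a ^ b for a, b in zip(payload, p)]
    return coeffs, payload

def decode(coded, k):
    """Gaussian elimination over GF(2); returns packets, or None if rank < k."""
    rows = [c + list(p) for c, p in coded]
    for col in range(k):
        pivot = next((r for r in range(col, len(rows)) if rows[r][col]), None)
        if pivot is None:
            return None                                # not yet full rank
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[col])]
    return [row[k:] for row in rows[:k]]

k = 3
packets = [[random.randint(0, 1) for _ in range(8)] for _ in range(k)]
received, recovered = [], None
while recovered is None:                               # collect coded packets until decodable
    received.append(encode(packets))
    recovered = decode(received, k)
print("all packets recovered:", recovered == packets)
```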
LCMV beamforming for a novel wireless local positioning system: a stationarity analysis
NASA Astrophysics Data System (ADS)
Tong, Hui; Zekavat, Seyed A.
2005-05-01
In this paper, we discuss the implementation of Linear Constrained Minimum Variance (LCMV) beamforming (BF) for a novel Wireless Local Positioning System (WLPS). The WLPS main components are: (a) a dynamic base station (DBS), and (b) a transponder (TRX), both mounted on mobiles. WLPS might be considered as a node in a Mobile Ad hoc NETwork (MANET). Each TRX is assigned an identification (ID) code. The DBS transmits periodic short bursts of energy which contain an ID request (IDR) signal. The TRX transmits back its ID code (a signal with a limited duration) to the DBS as soon as it detects the IDR signal. Hence, the DBS receives non-continuous signals transmitted by the TRX. In this work, we assume asynchronous Direct-Sequence Code Division Multiple Access (DS-CDMA) transmission from the TRX with an antenna array/LCMV BF mounted at the DBS, and we discuss the implementation of the observed signal covariance matrix for LCMV BF. In LCMV BF, the observed covariance matrix must be estimated. Usually, the sample covariance matrix (SCM) is used to estimate this covariance matrix, assuming a stationary model for the observed data, which is the case in many communication systems. However, due to the non-stationary behavior of the received signal in WLPS systems, the SCM does not lead to high WLPS performance, even compared to a conventional beamformer. A modified covariance matrix estimation method which utilizes the cyclostationarity property of the WLPS system is introduced as a solution to this problem. It is shown that this method leads to a significant improvement in the WLPS performance.
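For reference, a generic narrowband LCMV weight computation is sketched below with numpy: the weights are w = R^-1 C (C^H R^-1 C)^-1 f, with R estimated by the ordinary sample covariance matrix. The cyclostationarity-aware covariance estimator proposed in the paper is not reproduced here; the array geometry, source angles, and noise levels are illustrative assumptions.

```python
# Generic narrowband LCMV beamformer sketch: w = R^{-1} C (C^H R^{-1} C)^{-1} f,
# with R taken as the ordinary sample covariance matrix (SCM). Geometry and
# signal parameters below are illustrative, not those of the WLPS testbed.
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 500                             # sensors, snapshots
d, theta_s, theta_i = 0.5, 20.0, -40.0    # element spacing (wavelengths), DOAs (deg)

def steering(theta_deg):
    k = np.arange(M)
    return np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

a_s, a_i = steering(theta_s), steering(theta_i)
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)        # desired
i = 3.0 * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)  # interferer
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = np.outer(a_s, s) + np.outer(a_i, i) + noise       # received snapshots

R = X @ X.conj().T / N                                # sample covariance matrix
C = a_s[:, None]                                      # single distortionless constraint
f = np.array([1.0])
Rinv_C = np.linalg.solve(R, C)
w = Rinv_C @ np.linalg.solve(C.conj().T @ Rinv_C, f)  # LCMV weights

print("gain toward desired DOA :", abs(w.conj() @ a_s))
print("gain toward interferer  :", abs(w.conj() @ a_i))
```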
Evaluation of computational endomicroscopy architectures for minimally-invasive optical biopsy
NASA Astrophysics Data System (ADS)
Dumas, John P.; Lodhi, Muhammad A.; Bajwa, Waheed U.; Pierce, Mark C.
2017-02-01
We are investigating compressive sensing architectures for applications in endomicroscopy, where the narrow diameter probes required for tissue access can limit the achievable spatial resolution. We hypothesize that the compressive sensing framework can be used to overcome the fundamental pixel number limitation in fiber-bundle based endomicroscopy by reconstructing images with more resolvable points than fibers in the bundle. An experimental test platform was assembled to evaluate and compare two candidate architectures, based on introducing a coded amplitude mask at either a conjugate image or Fourier plane within the optical system. The benchtop platform consists of a common illumination and object path followed by separate imaging arms for each compressive architecture. The imaging arms contain a digital micromirror device (DMD) as a reprogrammable mask, with a CCD camera for image acquisition. One arm has the DMD positioned at a conjugate image plane ("IP arm"), while the other arm has the DMD positioned at a Fourier plane ("FP arm"). Lenses were selected and positioned within each arm to achieve an element-to-pixel ratio of 16 (230,400 mask elements mapped onto 14,400 camera pixels). We discuss our mathematical model for each system arm and outline the importance of accounting for system non-idealities. Reconstruction of a 1951 USAF resolution target using optimization-based compressive sensing algorithms produced images with higher spatial resolution than bicubic interpolation for both system arms when system non-idealities are included in the model. Furthermore, images generated with image plane coding appear to exhibit higher spatial resolution, but more noise, than images acquired through Fourier plane coding.
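The reconstruction step can be illustrated with a generic sparse-recovery sketch: iterative soft thresholding (ISTA) applied to y = Ax with fewer measurements than unknowns, loosely mirroring the fewer-fibers-than-resolvable-points setting. This is a textbook stand-in rather than the authors' calibrated forward model or algorithm; the random sensing matrix and problem sizes are assumptions.

```python
# Generic sparse-recovery sketch (ISTA) for y = A x with m < n. The random
# +/-1 sensing matrix stands in for the calibrated coded-mask model; sizes,
# sparsity, and the regularization weight are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 256, 96, 8                       # unknowns, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y = A @ x_true                             # noiseless measurements

def ista(A, y, lam=1e-3, iters=2000):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - y)              # gradient of 0.5*||Ax - y||^2
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```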
Generic Kalman Filter Software
NASA Technical Reports Server (NTRS)
Lisano, Michael E., II; Crues, Edwin Z.
2005-01-01
The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
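For readers unfamiliar with the predict/update cycle that the GKF generalizes, a compact linear Kalman filter in Python is sketched below. It is an illustration only, not the ANSI C GKF library; the constant-velocity model and the noise covariances are assumed values.

```python
# Compact linear Kalman filter illustrating the state- and covariance-
# propagation and -update steps that the GKF generalizes. The constant-velocity
# model and noise levels are assumptions made purely for illustration.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity dynamics
H = np.array([[1.0, 0.0]])                 # position-only measurements
Q = 1e-3 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

rng = np.random.default_rng(2)
x, P = np.array([0.0, 1.0]), np.eye(2)     # initial state estimate and covariance
truth = np.array([0.0, 1.0])
for _ in range(50):
    truth = F @ truth
    z = H @ truth + rng.normal(0.0, 0.5, size=1)
    x, P = predict(x, P)
    x, P = update(x, P, z)
print("estimated state:", x, "| true state:", truth)
```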
Analysis of 16S-23S rRNA intergenic spacer regions of Vibrio cholerae and Vibrio mimicus.
Chun, J; Huq, A; Colwell, R R
1999-05-01
Vibrio cholerae identification based on molecular sequence data has been hampered by a lack of sequence variation from the closely related Vibrio mimicus. The two species share many genes coding for proteins, such as ctxAB, and show almost identical 16S DNA coding for rRNA (rDNA) sequences. Primers targeting conserved sequences flanking the 3' end of the 16S and the 5' end of the 23S rDNAs were used to amplify the 16S-23S rRNA intergenic spacer regions of V. cholerae and V. mimicus. Two major (ca. 580 and 500 bp) and one minor (ca. 750 bp) amplicons were consistently generated for both species, and their sequences were determined. The largest fragment contains three tRNA genes (tDNAs) coding for tRNAGlu, tRNALys, and tRNAVal, which has not previously been found in bacteria examined to date. The 580-bp amplicon contained tDNAIle and tDNAAla, whereas the 500-bp fragment had single tDNA coding either tRNAGlu or tRNAAla. Little variation, i.e., 0 to 0.4%, was found among V. cholerae O1 classical, O1 El Tor, and O139 epidemic strains. Slightly more variation was found against the non-O1/non-O139 serotypes (ca. 1% difference) and V. mimicus (2 to 3% difference). A pair of oligonucleotide primers were designed, based on the region differentiating all of V. cholerae strains from V. mimicus. The PCR system developed was subsequently evaluated by using representatives of V. cholerae from environmental and clinical sources, and of other taxa, including V. mimicus. This study provides the first molecular tool for identifying the species V. cholerae.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antonelli, Perry Edward
A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), and comparing it with our own molecular dynamics code.
Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Curlett, Brian P.
1994-01-01
XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.
A method for the design of transonic flexible wings
NASA Technical Reports Server (NTRS)
Smith, Leigh Ann; Campbell, Richard L.
1990-01-01
A methodology was developed for designing airfoils and wings at transonic speeds that includes a technique to account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small-perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing, which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
SIRU development. Volume 3: Software description and program documentation
NASA Technical Reports Server (NTRS)
Oehrle, J.
1973-01-01
The development and initial evaluation of a strapdown inertial reference unit (SIRU) system are discussed. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault tolerant operational capabilities. The SIRU redundant hardware design is formulated about a six gyro and six accelerometer instrument module package. The six axes array provides redundant independent sensing and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. The basic SIRU software coding system used in the DDP-516 computer is documented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, P.J.
1996-07-01
A new reactive flow model for highly non-ideal explosives and propellants is presented. These compositions, which contain large amounts of metal, exhibit reaction kinetics upon explosion that are characteristic of both fast detonation and slow metal combustion chemistry. A reaction model for these systems was incorporated into the two-dimensional, finite element, Lagrangian hydrodynamic code, DYNA2D. A description of how to determine the model parameters is given. The model and its variations are applied to AP, Al, and nitramine underwater explosive and propellant systems.
Patient-specific dosimetry based on quantitative SPECT imaging and 3D-DFT convolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akabani, G.; Hawkins, W.G.; Eckblade, M.B.
1999-01-01
The objective of this study was to validate the use of a 3-D discrete Fourier Transform (3D-DFT) convolution method to carry out the dosimetry for I-131 for soft tissues in radioimmunotherapy procedures. To validate this convolution method, mathematical and physical phantoms were used as a basis of comparison with Monte Carlo transport (MCT) calculations which were carried out using the EGS4 system code. The mathematical phantom consisted of a sphere containing uniform and nonuniform activity distributions. The physical phantom consisted of a cylinder containing uniform and nonuniform activity distributions. Quantitative SPECT reconstruction was carried out using the Circular Harmonic Transform (CHT) algorithm.
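The underlying idea, convolving an activity distribution with a dose point kernel via the discrete Fourier transform, can be sketched as follows in two dimensions for brevity. The kernel used here is an invented 1/r^2 placeholder, not the I-131 kernel or the validated 3-D implementation from the study.

```python
# DFT-based convolution of an activity map with a dose point kernel, shown in
# 2-D for brevity. The 1/r^2 kernel is a placeholder invented for illustration;
# the study's method is 3-D and uses physically derived kernels for I-131.
import numpy as np

n = 64
activity = np.zeros((n, n))
activity[24:40, 24:40] = 1.0                      # uniform square "source" region

yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r = np.hypot(xx, yy) + 0.5                        # offset avoids division by zero
kernel = 1.0 / r**2                               # toy point kernel
kernel /= kernel.sum()
kernel = np.fft.ifftshift(kernel)                 # put kernel origin at index [0, 0]

dose = np.real(np.fft.ifft2(np.fft.fft2(activity) * np.fft.fft2(kernel)))
print("relative 'dose' at central voxel:", dose[n // 2, n // 2])
```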
Coding for Communication Channels with Dead-Time Constraints
NASA Technical Reports Server (NTRS)
Moision, Bruce; Hamkins, Jon
2004-01-01
Coding schemes have been designed and investigated specifically for optical and electronic data-communication channels in which information is conveyed via pulse-position modulation (PPM) subject to dead-time constraints. These schemes involve the use of error-correcting codes concatenated with codes denoted constrained codes. These codes are decoded using an iterative method. In pulse-position modulation, time is partitioned into frames of M slots of equal duration. Each frame contains one pulsed slot (all others are non-pulsed). For a given channel, the dead-time constraints are defined as a maximum and a minimum on the allowable time between pulses. For example, if a Q-switched laser is used to transmit the pulses, then the minimum allowable dead time is the time needed to recharge the laser for the next pulse. In the case of bits recorded on a magnetic medium, the minimum allowable time between pulses depends on the recording/playback speed and the minimum distance between pulses needed to prevent interference between adjacent bits during readout. The maximum allowable dead time for a given channel is the maximum time for which it is possible to satisfy the requirement to synchronize slots. In mathematical shorthand, the dead-time constraints for a given channel are represented by the pair of integers (d,k), where d is the minimum allowable number of zeroes between ones and k is the maximum allowable number of zeroes between ones. A system of the type to which the present schemes apply is represented by a binary-input, real-valued-output channel model illustrated in the figure. At the transmitting end, information bits are first encoded by use of an error-correcting code, then further encoded by use of a constrained code. Several constrained codes for channels subject to constraints of (d,infinity) have been investigated theoretically and computationally. The baseline codes chosen for purposes of comparison were simple PPM codes characterized by M-slot PPM frames separated by d-slot dead times.
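A small sketch of the slot-level picture follows: PPM symbols are mapped to M-slot frames and the resulting slot sequence is checked against the (d,k) dead-time constraint, i.e., that the number of empty slots between consecutive pulses stays within [d,k]. The symbol values and the choices of M, d, and k are illustrative; the error-correcting and constrained codes themselves are not implemented here.

```python
# Map PPM symbols to M-slot frames (one pulsed slot per frame) and check that
# the run of empty slots between consecutive pulses obeys the (d, k) dead-time
# constraint described above. Symbols and (M, d, k) values are illustrative.
def ppm_frames(symbols, M):
    slots = []
    for s in symbols:
        frame = [0] * M
        frame[s] = 1          # the symbol selects the pulsed slot position
        slots.extend(frame)
    return slots

def satisfies_dk(slots, d, k):
    gaps, last_pulse = [], None
    for i, s in enumerate(slots):
        if s == 1:
            if last_pulse is not None:
                gaps.append(i - last_pulse - 1)   # zero slots since last pulse
            last_pulse = i
    return all(d <= g <= k for g in gaps)

M, d, k = 8, 2, 20
symbols = [3, 0, 7, 5]
slots = ppm_frames(symbols, M)
print(slots)
print("meets (d, k) =", (d, k), ":", satisfies_dk(slots, d, k))
```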
Inkjet-compatible single-component polydiacetylene precursors for thermochromic paper sensors.
Yoon, Bora; Shin, Hyora; Kang, Eun-Mi; Cho, Dae Won; Shin, Kayeong; Chung, Hoeil; Lee, Chan Woo; Kim, Jong-Man
2013-06-12
Inkjet-printable diacetylene (DA) supramolecules, which can be dispersed in water without using additional surfactants, have been developed. The supramolecules are generated from DA monomers that contain bisurea groups, which are capable of forming hydrogen-bonding networks, and hydrophilic oligoethylene oxide moieties. Because of suitable size distribution and stability characteristics, the single DA component ink can be readily transferred to paper substrates by utilizing a common office inkjet printer. UV irradiation of the DA-printed paper results in generation of blue-colored polydiacetylene (PDA) images, which show reversible thermochromic transitions in specific temperature ranges. Inkjet-printed PDAs, in the format of a two-dimensional (2D) quick response (QR) code on a real parking ticket, serve as a dual anticounterfeiting system that combines easy decoding of the QR code and colorimetric PDA reversibility for validating the authenticity of the tickets. This single-component ink system has great potential for use in paper-based devices, temperature sensors, and anticounterfeiting barcodes.
Reference Materials and Subject Matter Knowledge Codes for Airman Knowledge Testing
DOT National Transportation Integrated Search
2004-06-08
The listings of reference materials and subject matter knowledge codes have been prepared by the Federal Aviation Administration (FAA) to establish specific references for all knowledge standards. The listings contain reference materials to be ...
Hematopoietic Project - SEER Registrars
Use this manual and corresponding database for coding cases diagnosed January 1, 2010 and forward. The changes do not require recoding of old cases. Contains data collection rules for hematopoietic and lymphoid neoplasms (2010+). Access a database and coding manual.
An expanded genetic code in mammalian cells with a functional quadruplet codon.
Niu, Wei; Schultz, Peter G; Guo, Jiantao
2013-07-19
We have utilized in vitro evolution to identify tRNA variants with significantly enhanced activity for the incorporation of unnatural amino acids into proteins in response to a quadruplet codon in both bacterial and mammalian cells. This approach will facilitate the creation of an optimized and standardized system for the genetic incorporation of unnatural amino acids using quadruplet codons, which will allow the biosynthesis of biopolymers that contain multiple unnatural building blocks.
NASA Technical Reports Server (NTRS)
Dahl, Milo D.
2010-01-01
Codes for predicting supersonic jet mixing and broadband shock-associated noise were assessed using a database containing noise measurements of a jet issuing from a convergent nozzle. Two types of codes were used to make predictions. Fast running codes containing empirical models were used to compute both the mixing noise component and the shock-associated noise component of the jet noise spectrum. One Reynolds-averaged, Navier-Stokes-based code was used to compute only the shock-associated noise. To enable the comparisons of the predicted component spectra with data, the measured total jet noise spectra were separated into mixing noise and shock-associated noise components. Comparisons were made for 1/3-octave spectra and some power spectral densities using data from jets operating at 24 conditions covering essentially 6 fully expanded Mach numbers with 4 total temperature ratios.
Lefkoff, L.J.; Gorelick, S.M.
1987-01-01
A FORTRAN-77 computer program is presented that helps solve a variety of aquifer management problems involving the control of groundwater hydraulics. It is intended for use with any standard mathematical programming package that uses Mathematical Programming System input format. The computer program creates the input files to be used by the optimization program. These files contain all the hydrologic information and management objectives needed to solve the management problem. Used in conjunction with a mathematical programming code, the computer program identifies the pumping or recharge strategy that achieves a user's management objective while maintaining groundwater hydraulic conditions within desired limits. The objective may be linear or quadratic, and may involve the minimization of pumping and recharge rates or of variable pumping costs. The problem may contain constraints on groundwater heads, gradients, and velocities for a complex, transient hydrologic system. Linear superposition of solutions to the transient, two-dimensional groundwater flow equation is used by the computer program in conjunction with the response matrix optimization method. A unit stress is applied at each decision well and transient responses at all control locations are computed using a modified version of the U.S. Geological Survey two-dimensional aquifer simulation model. The program also computes discounted cost coefficients for the objective function and accounts for transient aquifer conditions. (Author's abstract)
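The response-matrix idea can be sketched in a few lines: drawdowns at control points are linear superpositions of unit-stress responses, and a linear program selects pumping rates that minimize cost subject to drawdown limits and a demand requirement. The sketch below uses scipy.optimize.linprog with invented response coefficients; in the program described above those coefficients come from transient groundwater flow simulations.

```python
# Response-matrix sketch: drawdowns are linear superpositions of unit-stress
# responses, and an LP chooses pumping rates that minimize cost subject to
# drawdown limits and a total-demand requirement. All numbers are invented.
import numpy as np
from scipy.optimize import linprog

# response[i, j] = drawdown at control location i per unit pumping rate at well j
response = np.array([[0.8, 0.3, 0.1],
                     [0.2, 0.9, 0.4],
                     [0.1, 0.2, 0.7]])
max_drawdown = np.array([2.0, 2.5, 1.5])    # allowable head decline at controls
demand = 4.0                                 # total pumping that must be supplied
cost = np.array([1.0, 1.5, 1.2])             # unit pumping cost at each well

A_ub = np.vstack([response, -np.ones((1, 3))])     # drawdown rows, then demand row
b_ub = np.concatenate([max_drawdown, [-demand]])   # -sum(q) <= -demand
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("optimal pumping rates:", res.x, "| total cost:", res.fun)
```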
Luo, Yong; Wu, Wenqi; Babu, Ravindra; Tang, Kanghua; Luo, Bing
2012-01-01
COMPASS is an indigenously developed Chinese global navigation satellite system and will share many features in common with GPS (Global Positioning System). Since ultra-tight GPS/INS (Inertial Navigation System) integration shows its advantage over independent GPS receivers in many scenarios, the federated ultra-tight COMPASS/INS integration has been investigated in this paper, particularly by proposing a simplified prefilter model. Compared with a traditional prefilter model, the state space of this simplified system contains only carrier phase, carrier frequency, and carrier frequency rate tracking errors. A two-quadrant arctangent discriminator output is used as a measurement. Since the code tracking error related parameters were excluded from the state space of traditional prefilter models, the code/carrier divergence would destroy the carrier tracking process, and therefore an adaptive Kalman filter algorithm that tunes the process noise covariance matrix based on the state correction sequence was incorporated to compensate for the divergence. The federated ultra-tight COMPASS/INS integration was implemented with a hardware signal sampling system for the COMPASS intermediate frequency (IF) and the INS accelerometers and gyroscopes. Field and simulation test results showed almost similar tracking and navigation performances for both the traditional prefilter model and the proposed system; however, the latter largely decreased the computational load. PMID:23012564
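In the spirit of the adaptive tuning outlined above, the sketch below adapts a scalar process-noise variance Q from a sliding window of recent state corrections (Kalman gain times innovation) in a toy random-walk tracking problem. The model, window length, and adaptation rule are simplifying assumptions, not the paper's prefilter design.

```python
# Toy adaptive Kalman filter: Q is re-estimated from the sample variance of a
# sliding window of state corrections (K * innovation). The scalar random-walk
# model, window length, and adaptation rule are assumptions for illustration.
import numpy as np
from collections import deque

rng = np.random.default_rng(3)
F, H = 1.0, 1.0
Q, R = 1e-4, 0.5 ** 2
x, P = 0.0, 1.0
corrections = deque(maxlen=30)                 # recent state corrections

truth = 0.0
for step in range(300):
    drift = 0.01 if step < 150 else 0.2        # dynamics intensity changes mid-run
    truth += rng.normal(0.0, drift)
    z = truth + rng.normal(0.0, 0.5)

    # standard scalar predict/update
    x, P = F * x, F * P * F + Q
    S = H * P * H + R
    K = P * H / S
    dx = K * (z - H * x)                       # state correction
    x, P = x + dx, (1 - K * H) * P

    # adapt Q from the windowed correction sequence (one simple rule)
    corrections.append(dx)
    if len(corrections) == corrections.maxlen:
        Q = float(np.var(np.array(corrections)))

print("final adapted Q:", Q, "| estimate vs truth:", x, truth)
```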
New quantum codes constructed from quaternary BCH codes
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena
2016-10-01
In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Second, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
This document outlines the development of a high fidelity, best estimate nuclear power plant severe transient simulation capability that will complement or enhance the integral system codes historically used for licensing and analysis of severe accidents. As with other tools in the Risk Informed Safety Margin Characterization (RISMC) Toolkit, the ultimate user of Enhanced Severe Transient Analysis and Prevention (ESTAP) capability is the plant decision-maker; the deliverable to that customer is a modern, simulation-based safety analysis capability, applicable to a much broader class of safety issues than is traditional Light Water Reactor (LWR) licensing analysis. Currently, the RISMC pathway’s major emphasis is placed on developing RELAP-7, a next-generation safety analysis code, and on showing how to use RELAP-7 to analyze margin from a modern point of view: that is, by characterizing margin in terms of the probabilistic spectra of the “loads” applied to systems, structures, and components (SSCs), and the “capacity” of those SSCs to resist those loads without failing. The first objective of the ESTAP task, and the focus of one task of this effort, is to augment RELAP-7 analyses with user-selected multi-dimensional, multi-phase models of specific plant components to simulate complex phenomena that may lead to, or exacerbate, severe transients and core damage. Such phenomena include: coolant crossflow between PWR assemblies during a severe reactivity transient, stratified single or two-phase coolant flow in primary coolant piping, inhomogeneous mixing of emergency coolant water or boric acid with hot primary coolant, and water hammer. These are well-documented phenomena associated with plant transients that are generally not captured in system codes. They are, however, generally limited to specific components, structures, and operating conditions. The second ESTAP task is to similarly augment a severe (post-core damage) accident integral analyses code with high fidelity simulations that would allow investigation of multi-dimensional, multi-phase containment phenomena that are only treated approximately in established codes.
Bradshaw, Debbie; Groenewald, Pamela; Bourne, David E.; Mahomed, Hassan; Nojilana, Beatrice; Daniels, Johan; Nixon, Jo
2006-01-01
OBJECTIVE: To review the quality of the coding of the cause of death (COD) statistics and assess the mortality information needs of the City of Cape Town. METHODS: Using an action research approach, a study was set up to investigate the quality of COD information, the accuracy of COD coding and consistency of coding practices in the larger health subdistricts. Mortality information needs and the best way of presenting the statistics to assist health managers were explored. FINDINGS: Useful information was contained in 75% of death certificates, but nearly 60% had only a single cause certified; 55% of forms were coded accurately. Disagreement was mainly because routine coders coded the immediate instead of the underlying COD. An abridged classification of COD, based on causes of public health importance, prevalent causes and selected combinations of diseases was implemented with training on underlying cause. Analysis of the 2001 data identified the leading causes of death and premature mortality and illustrated striking differences in the disease burden and profile between health subdistricts. CONCLUSION: Action research is particularly useful for improving information systems and revealed the need to standardize the coding practice to identify underlying cause. The specificity of the full ICD classification is beyond the level of detail on the death certificates currently available. An abridged classification for coding provides a practical tool appropriate for local level public health surveillance. Attention to the presentation of COD statistics is important to enable the data to inform decision-makers. PMID:16583080
Bradshaw, Debbie; Groenewald, Pamela; Bourne, David E; Mahomed, Hassan; Nojilana, Beatrice; Daniels, Johan; Nixon, Jo
2006-03-01
To review the quality of the coding of the cause of death (COD) statistics and assess the mortality information needs of the City of Cape Town. Using an action research approach, a study was set up to investigate the quality of COD information, the accuracy of COD coding and consistency of coding practices in the larger health subdistricts. Mortality information needs and the best way of presenting the statistics to assist health managers were explored. Useful information was contained in 75% of death certificates, but nearly 60% had only a single cause certified; 55% of forms were coded accurately. Disagreement was mainly because routine coders coded the immediate instead of the underlying COD. An abridged classification of COD, based on causes of public health importance, prevalent causes and selected combinations of diseases was implemented with training on underlying cause. Analysis of the 2001 data identified the leading causes of death and premature mortality and illustrated striking differences in the disease burden and profile between health subdistricts. Action research is particularly useful for improving information systems and revealed the need to standardize the coding practice to identify underlying cause. The specificity of the full ICD classification is beyond the level of detail on the death certificates currently available. An abridged classification for coding provides a practical tool appropriate for local level public health surveillance. Attention to the presentation of COD statistics is important to enable the data to inform decision-makers.
New Mandates and Imperatives in the Revised "ACA Code of Ethics"
ERIC Educational Resources Information Center
Kaplan, David M.; Kocet, Michael M.; Cottone, R. Rocco; Glosoff, Harriet L.; Miranti, Judith G.; Moll, E. Christine; Bloom, John W.; Bringaze, Tammy B.; Herlihy, Barbara; Lee, Courtland C.; Tarvydas, Vilia M.
2009-01-01
The first major revision of the "ACA Code of Ethics" in a decade occurred in late 2005, with the updated edition containing important new mandates and imperatives. This article provides interviews with members of the Ethics Revision Task Force that flesh out seminal changes in the revised "ACA Code of Ethics" in the areas of confidentiality,…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-20
... Functions and Authority Under Sections 315 and 325 of Title 32, United States Code Memorandum for the... United States of America, including section 301 of title 3, United States Code, I hereby delegate to you: (a) the functions and authority of the President contained in section 315 of title 32, United States...
Boyd, Andrew D; ‘John’ Li, Jianrong; Kenost, Colleen; Joese, Binoy; Min Yang, Young; Kalagidis, Olympia A; Zenku, Ilir; Saner, Donald; Bahroos, Neil; Lussier, Yves A
2015-01-01
In the United States, International Classification of Disease Clinical Modification (ICD-9-CM, the ninth revision) diagnosis codes are commonly used to identify patient cohorts and to conduct financial analyses related to disease. In October 2015, the healthcare system of the United States will transition to ICD-10-CM (the tenth revision) diagnosis codes. One challenge posed to clinical researchers and other analysts is conducting diagnosis-related queries across datasets containing both coding schemes. Further, healthcare administrators will manage growth, trends, and strategic planning with these dually-coded datasets. The majority of the ICD-9-CM to ICD-10-CM translations are complex and nonreciprocal, creating convoluted representations and meanings. Similarly, mapping back from ICD-10-CM to ICD-9-CM is equally complex, yet different from mapping forward, as relationships are likewise nonreciprocal. Indeed, 10 of the 21 top clinical categories are complex as 78% of their diagnosis codes are labeled as “convoluted” by our analyses. Analysis and research related to external causes of morbidity, injury, and poisoning will face the greatest challenges due to 41 745 (90%) convolutions and a decrease in the number of codes. We created a web portal tool and translation tables to list all ICD-9-CM diagnosis codes related to the specific input of ICD-10-CM diagnosis codes and their level of complexity: “identity” (reciprocal), “class-to-subclass,” “subclass-to-class,” “convoluted,” or “no mapping.” These tools provide guidance on ambiguous and complex translations to reveal where reports or analyses may be challenging to impossible. Web portal: http://www.lussierlab.org/transition-to-ICD9CM/ Tables annotated with levels of translation complexity: http://www.lussierlab.org/publications/ICD10to9 PMID:25681260
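A sketch of how such translations might be labeled by motif is given below, using simplified rules suggested by the categories named in the abstract (identity, class-to-subclass, subclass-to-class, convoluted, no mapping). Both the classification rules and the toy forward/backward mapping tables are illustrative; they are not the authors' algorithm or the official GEM files.

```python
# Label ICD-9-CM -> ICD-10-CM translations by motif, using simplified rules
# inspired by the categories named above. The rules and the toy mapping tables
# are illustrative only, not the authors' algorithm or the official GEM data.
def classify(code9, fwd, bwd):
    targets = fwd.get(code9, set())
    if not targets:
        return "no mapping"
    sources = set().union(*(bwd.get(t, set()) for t in targets))
    if len(targets) == 1 and sources == {code9}:
        return "identity"                  # reciprocal one-to-one
    if len(targets) > 1 and sources == {code9}:
        return "class-to-subclass"         # one ICD-9 splits into several ICD-10
    if len(targets) == 1 and len(sources) > 1:
        return "subclass-to-class"         # several ICD-9 collapse into one ICD-10
    return "convoluted"                    # many-to-many entanglement

fwd = {"250.00": {"E11.9"},
       "414.01": {"I25.10", "I25.810"},
       "V58.61": {"Z79.01"},
       "E849.0": set()}
bwd = {"E11.9": {"250.00"},
       "I25.10": {"414.01"},
       "I25.810": {"414.01", "414.00"},
       "Z79.01": {"V58.61"}}

for c in fwd:
    print(c, "->", classify(c, fwd, bwd))
```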
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, M. T.
MELTSPREAD3 is a transient one-dimensional computer code that has been developed to predict the gravity-driven flow and freezing behavior of molten reactor core materials (corium) in containment geometries. Predictions can be made for corium flowing across surfaces under either dry or wet cavity conditions. The spreading surfaces that can be selected are steel, concrete, a user-specified material (e.g., a ceramic), or an arbitrary combination thereof. The corium can have a wide range of compositions of reactor core materials that includes distinct oxide phases (predominantly Zr and steel oxides) plus metallic phases (predominantly Zr and steel). The code requires input that describes the containment geometry, melt “pour” conditions, and cavity atmospheric conditions (i.e., pressure, temperature, and cavity flooding information). For cases in which the cavity contains a preexisting water layer at the time of RPV failure, melt jet breakup and particle bed formation can be calculated mechanistically given the time-dependent melt pour conditions (input data) as well as the heatup and boiloff of water in the melt impingement zone (calculated). For core debris impacting either the containment floor or previously spread material, the code calculates the transient hydrodynamics and heat transfer which determine the spreading and freezing behavior of the melt. The code predicts conditions at the end of the spreading stage, including melt relocation distance, depth and material composition profiles, substrate ablation profile, and wall heatup. Code output can be used as input to other models such as CORQUENCH that evaluate long term core-concrete interaction behavior following the transient spreading stage. MELTSPREAD3 was originally developed to investigate BWR Mark I liner vulnerability, but has been substantially upgraded and applied to other reactor designs (e.g., the EPR), and more recently to the plant accidents at Fukushima Daiichi. The most recent round of improvements, documented in this report, has been specifically implemented to support industry in developing Severe Accident Water Management (SAWM) strategies for Boiling Water Reactors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MORIDIS, GEORGE
2016-05-02
MeshMaker v1.5 is a code that describes the system geometry and discretizes the domain in problems of flow and transport through porous and fractured media that are simulated using the TOUGH+ [Moridis and Pruess, 2014] or TOUGH2 [Pruess et al., 1999; 2012] families of codes. It is a significantly modified and drastically enhanced version of an earlier simpler facility that was embedded in the TOUGH2 codes [Pruess et al., 1999; 2012], from which it could not be separated. The code (MeshMaker.f90) is a stand-alone product written in FORTRAN 95/2003, is written according to the tenets of Object-Oriented Programming, has a modular structure and can perform a number of mesh generation and processing operations. It can generate two-dimensional radially symmetric (r,z) meshes, and one-, two-, and three-dimensional rectilinear (Cartesian) grids in (x,y,z). The code generates the file MESH, which includes all the elements and connections that describe the discretized simulation domain and conform to the requirements of the TOUGH+ and TOUGH2 codes. Multiple-porosity processing for simulation of flow in naturally fractured reservoirs can be invoked by means of a keyword MINC, which stands for Multiple INteracting Continua. The MINC process operates on the data of the primary (porous medium) mesh as provided on disk file MESH, and generates a secondary mesh containing fracture and matrix elements with identical data formats on file MINC.
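To make the discretization output concrete, the sketch below generates the element volumes and the connection list (interface area and center-to-center distance) for a small rectilinear (x,y,z) grid. It only illustrates the kind of data a MESH file encodes; it does not reproduce the TOUGH-family MESH file format or MeshMaker's MINC processing.

```python
# Generate elements (volumes) and connections (interface area, center-to-center
# distance) for a rectilinear (x, y, z) grid. Illustrative only; this does not
# emit the actual TOUGH-family MESH file format.
import itertools

def cartesian_mesh(dx, dy, dz):
    nx, ny, nz = len(dx), len(dy), len(dz)
    idx = lambda i, j, k: (i * ny + j) * nz + k          # linear element index
    elements = {}
    for i, j, k in itertools.product(range(nx), range(ny), range(nz)):
        elements[idx(i, j, k)] = {"volume": dx[i] * dy[j] * dz[k]}
    connections = []
    for i, j, k in itertools.product(range(nx), range(ny), range(nz)):
        if i + 1 < nx:   # x-direction neighbor: (element A, element B, area, distance)
            connections.append((idx(i, j, k), idx(i + 1, j, k),
                                dy[j] * dz[k], 0.5 * (dx[i] + dx[i + 1])))
        if j + 1 < ny:   # y-direction neighbor
            connections.append((idx(i, j, k), idx(i, j + 1, k),
                                dx[i] * dz[k], 0.5 * (dy[j] + dy[j + 1])))
        if k + 1 < nz:   # z-direction neighbor
            connections.append((idx(i, j, k), idx(i, j, k + 1),
                                dx[i] * dy[j], 0.5 * (dz[k] + dz[k + 1])))
    return elements, connections

elems, conns = cartesian_mesh([1.0, 2.0], [1.0], [0.5, 0.5, 0.5])
print(len(elems), "elements,", len(conns), "connections")
```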
Chen, Wen; Zhang, Xuan; Li, Jing; Huang, Shulan; Xiang, Shuanglin; Hu, Xiang; Liu, Changning
2018-05-09
Zebrafish is a well-developed model system for studying developmental processes and human disease. Recent deep sequencing studies have discovered a large number of long non-coding RNAs (lncRNAs) in zebrafish, but only a few of them have been functionally characterized. How to take advantage of the mature zebrafish system to investigate lncRNA function and conservation in depth is therefore an intriguing question. We systematically collected and analyzed a series of zebrafish RNA-seq datasets and combined them with resources from known databases and the literature. As a result, we obtained by far the most complete dataset of zebrafish lncRNAs, containing 13,604 lncRNA genes (21,128 transcripts) in total. Based on that, a co-expression network of zebrafish coding and lncRNA genes was constructed, analyzed, and used to predict the Gene Ontology (GO) and KEGG annotations of lncRNAs. Meanwhile, we performed a conservation analysis of zebrafish lncRNAs, identifying 1828 conserved zebrafish lncRNA genes (1890 transcripts) that have putative mammalian orthologs. We also found that zebrafish lncRNAs play important roles in regulating the development and function of the nervous system; these conserved lncRNAs show significant sequence and functional conservation with their mammalian counterparts. By integrative data analysis and construction of a coding-lncRNA gene co-expression network, we obtained the most comprehensive dataset of zebrafish lncRNAs to date, together with systematic annotations and comprehensive analyses of their function and conservation. Our study provides a reliable zebrafish-based platform to explore lncRNA function and mechanism in depth, as well as the lncRNA commonality between zebrafish and human.
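The co-expression-based annotation step described above follows a guilt-by-association logic: an lncRNA inherits candidate annotations from the protein-coding genes whose expression profiles it tracks most closely. The Python sketch below illustrates that idea with made-up gene names and expression values; it is a generic illustration of the approach, not the authors' pipeline.

```python
import numpy as np

def predict_annotations(lnc_expr, coding_expr, coding_terms, top_k=3):
    """Guilt-by-association annotation transfer.

    lnc_expr     : dict lncRNA -> expression vector across samples
    coding_expr  : dict coding gene -> expression vector (same samples)
    coding_terms : dict coding gene -> set of annotation terms (e.g., GO)
    Returns a dict lncRNA -> union of terms from its top_k co-expressed genes.
    """
    coding_names = list(coding_expr)
    coding_matrix = np.array([coding_expr[g] for g in coding_names], dtype=float)
    predictions = {}
    for lnc, profile in lnc_expr.items():
        # Pearson correlation of the lncRNA against every coding gene
        corr = np.array([np.corrcoef(profile, row)[0, 1] for row in coding_matrix])
        best = np.argsort(corr)[::-1][:top_k]
        terms = set()
        for idx in best:
            terms |= coding_terms.get(coding_names[idx], set())
        predictions[lnc] = terms
    return predictions

# Toy data: gene names, terms, and expression values are made up for illustration
coding_expr = {"geneA": [1, 5, 9, 2], "geneB": [8, 2, 1, 7], "geneC": [2, 6, 8, 3]}
coding_terms = {"geneA": {"nervous system development"},
                "geneB": {"muscle differentiation"},
                "geneC": {"glial cell differentiation"}}
lnc_expr = {"lnc-001": [1, 6, 9, 2]}
print(predict_annotations(lnc_expr, coding_expr, coding_terms, top_k=2))
```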
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2014 CFR
2014-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2013 CFR
2013-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2010 CFR
2010-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2011 CFR
2011-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2012 CFR
2012-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
Subband/transform functions for image processing
NASA Technical Reports Server (NTRS)
Glover, Daniel
1993-01-01
Functions for image data processing written for use with the MATLAB(TM) software package are presented. These functions provide the capability to transform image data with block transformations (such as the Walsh Hadamard) and to produce spatial frequency subbands of the transformed data. Block transforms are equivalent to simple subband systems. The transform coefficients are reordered using a simple permutation to give subbands. The low frequency subband is a low resolution version of the original image, while the higher frequency subbands contain edge information. The transform functions can be cascaded to provide further decomposition into more subbands. If the cascade is applied to all four of the first stage subbands (in the case of a four band decomposition), then a uniform structure of sixteen bands is obtained. If the cascade is applied only to the low frequency subband, an octave structure of seven bands results. Functions for the inverse transforms are also given. These functions can be used for image data compression systems. The transforms do not in themselves produce data compression, but prepare the data for quantization and compression. Sample quantization functions for subbands are also given. A typical compression approach is to subband the image data, quantize it, then use statistical coding (e.g., run-length coding followed by Huffman coding) for compression. Contour plots of image data and subbanded data are shown.
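As a rough illustration of the block-transform and subband regrouping described above, the sketch below applies a 2x2 Haar/Walsh-Hadamard transform to an image and regroups the coefficients into four quarter-size subbands (one low-resolution band plus three edge-detail bands), together with its exact inverse. It is written in Python/NumPy as an assumed stand-in, not the MATLAB functions themselves.

```python
import numpy as np

def subband_2x2(image):
    """Split an image (even dimensions) into 4 subbands using a 2x2
    Haar/Walsh-Hadamard block transform, regrouped so that each subband
    is a quarter-size image: LL (coarse), LH, HL, HH (detail)."""
    a = image[0::2, 0::2].astype(float)
    b = image[0::2, 1::2].astype(float)
    c = image[1::2, 0::2].astype(float)
    d = image[1::2, 1::2].astype(float)
    ll = (a + b + c + d) / 2.0   # low-frequency (low-resolution) band
    lh = (a - b + c - d) / 2.0   # horizontal detail
    hl = (a + b - c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def inverse_subband_2x2(ll, lh, hl, hh):
    """Exact inverse of subband_2x2."""
    a = (ll + lh + hl + hh) / 2.0
    b = (ll - lh + hl - hh) / 2.0
    c = (ll + lh - hl - hh) / 2.0
    d = (ll - lh - hl + hh) / 2.0
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2] = a
    out[0::2, 1::2] = b
    out[1::2, 0::2] = c
    out[1::2, 1::2] = d
    return out

img = np.random.randint(0, 256, (8, 8))
bands = subband_2x2(img)
assert np.allclose(inverse_subband_2x2(*bands), img)
```

Re-applying subband_2x2 to the low-frequency band only would reproduce the octave (seven-band) structure mentioned above, while applying it to all four bands would give the uniform sixteen-band structure.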
NASA Technical Reports Server (NTRS)
1997-01-01
The NASA Safety Standard, which establishes a uniform process for hydrogen system design, materials selection, operation, storage, and transportation, is presented. The guidelines include suggestions for safely storing, handling, and using hydrogen in gaseous (GH2), liquid (LH2), or slush (SLH2) form whether used as a propellant or non-propellant. The handbook contains 9 chapters detailing properties and hazards, facility design, design of components, materials compatibility, detection, and transportation. Chapter 10 serves as a reference, and the appendices contained therein include: assessment examples; scaling laws, explosions, blast effects, and fragmentation; codes, standards, and NASA directives; and relief devices, along with a list of tables and figures, abbreviations, a glossary, and an index for ease of use. The intent of the handbook is to provide enough information that it can be used on its own, while also referencing data sources that can provide much more detail if required.
RMP Guidance for Warehouses - Appendix A/B: 40 CFR part 68/Selected NAICS Codes
These appendices contain the full text of 40 Code of Federal Regulations Part 68, Chemical Accident Prevention Provisions; which includes hazard assessment, emergency response, substance thresholds, reporting requirements, and the Risk Management Plan.
2013-12-01
Programming code in the Python language used in AIS data preprocessing is contained in Appendix A. The MATLAB programming code used to apply the Hough...described in Chapter III is applied to archived AIS data in this chapter. The implementation of the method, including programming techniques used, is...is contained in the second. To provide a proof of concept for the algorithm described in Chapter III, the PYTHON programming language was used for
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trent, D.S.; Eyler, L.L.
In this study several aspects of simulating hydrogen distribution in geometric configurations relevant to reactor containment structures were investigated using the TEMPEST computer code. Of particular interest was the performance of the TEMPEST turbulence model in a density-stratified environment. Computed results illustrated that the TEMPEST numerical procedures predicted the measured phenomena with good accuracy under a variety of conditions and that the turbulence model used is a viable approach in complex turbulent flow simulation.
Reeve, Wayne; van Berkum, Peter; Ardley, Julie; ...
2017-03-04
Bradyrhizobium elkanii USDA 76 T (INSDC = ARAG00000000), the type strain for Bradyrhizobium elkanii, is an aerobic, motile, Gram-negative, non-spore-forming rod that was isolated from an effective nitrogen-fixing root nodule of Glycine max (L.) Merr. grown in the USA. Because of its significance as a microsymbiont of this economically important legume, B. elkanii USDA 76 T was selected as part of the DOE Joint Genome Institute 2010 Genomic Encyclopedia for Bacteria and Archaea-Root Nodule Bacteria sequencing project. Here the symbiotic abilities of B. elkanii USDA 76 T are described, together with its genome sequence information and annotation. The 9,484,767 bp high-quality draft genome is arranged in 2 scaffolds of 25 contigs, containing 9060 protein-coding genes and 91 RNA-only encoding genes. The B. elkanii USDA 76 T genome contains a low GC content region with symbiotic nod and fix genes, indicating the presence of a symbiotic island integration. A comparison of five B. elkanii genomes that formed a clique revealed that 356 of the 9060 protein coding genes of USDA 76 T were unique, including 22 genes of an intact resident prophage. A conserved set of 7556 genes were also identified for this species, including genes encoding a general secretion pathway as well as type II, III, IV and VI secretion system proteins. The type III secretion system has previously been characterized as a host determinant for Rj and/or rj soybean cultivars. Here we show that the USDA 76 T genome contains genes encoding all the type III secretion system components, including a translocon complex protein NopX required for the introduction of effector proteins into host cells. While many bradyrhizobial strains are unable to nodulate the soybean cultivar Clark (rj1), USDA 76 T was able to elicit nodules on Clark (rj1), although in reduced numbers, when plants were grown in Leonard jars containing sand or vermiculite. In these conditions, we postulate that the presence of NopX allows USDA 76 T to introduce various effector molecules into this host to enable nodulation.
Channel coding in the space station data system network
NASA Technical Reports Server (NTRS)
Healy, T.
1982-01-01
A detailed discussion of the use of channel coding for error correction, privacy/secrecy, channel separation, and synchronization is presented. Channel coding, in one form or another, is an established and common element in data systems. No analysis and design of a major new system would fail to consider ways in which channel coding could make the system more effective. The presence of channel coding on TDRS, Shuttle, the Advanced Communication Technology Satellite Program system, the JSC-proposed Space Operations Center, and the proposed 30/20 GHz Satellite Communication System strongly supports the case for using coding on the communications channel. The designers of the space station data system therefore have to consider the use of channel coding.
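To make "channel coding for error correction" concrete, the sketch below encodes 4-bit messages with the textbook (7,4) Hamming code, which corrects any single bit error in a 7-bit codeword. It is a generic illustration, not a code drawn from any of the systems named above.

```python
import numpy as np

# Generator (G) and parity-check (H) matrices for the (7,4) Hamming code over GF(2).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg4):
    """Map a 4-bit message to a 7-bit codeword."""
    return (np.array(msg4) @ G) % 2

def correct(word7):
    """Correct a single-bit error using the syndrome."""
    syndrome = (H @ word7) % 2
    if syndrome.any():
        # A single-bit error produces a syndrome equal to that bit's column of H.
        for pos, col in enumerate(H.T):
            if np.array_equal(col, syndrome):
                word7 = word7.copy()
                word7[pos] ^= 1
                break
    return word7

msg = [1, 0, 1, 1]
cw = encode(msg)
cw_err = cw.copy()
cw_err[2] ^= 1               # flip one bit "in transit"
assert np.array_equal(correct(cw_err), cw)
```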
Production of Recombinant Adenovirus Containing Human Interleukin-4 Gene
Mojarrad, Majid; Abdolazimi, Yassan; Hajati, Jamshid; Modarressi, Mohammad Hossein
2011-01-01
Objective(s) Recombinant adenoviruses are currently used for a variety of purposes, including in vitro gene transfer, in vivo vaccination, and gene therapy. The ability to infect many cell types, high efficiency in gene transfer, entry into both dividing and non-dividing cells, and growth to high titers make this virus a good choice for use in various experiments. In the present experiment, a recombinant adenovirus containing the human IL-4 coding sequence was made. IL-4 has several characteristics that make it a good choice for use in cancer gene therapy, controlling inflammatory diseases, and studies on autoimmune diseases. Materials and Methods In brief, the IL-4 coding sequence was amplified and cloned into pAd-Track-CMV. Then, by means of homologous recombination between the recombinant pAd-Track-CMV and the Adeasy-1 plasmid in bacteria, the complete recombinant adenovirus genome was made and the IL-4-containing shuttle vector was incorporated into the viral backbone. After linearization, the viral genome was transfected into the HEK-293 cell line for virus packaging. Viral production was conveniently followed with the aid of green fluorescent protein. Results The recombinant adenovirus produced here was capable of infecting cell lines and expressing interleukin-4 in cells. Conclusion This system can be used as a powerful, easy, and cost-effective tool in studies on cancer gene therapy as well as in studies on immunogenetics. PMID:23493491
CSlib, a library to couple codes via Client/Server messaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve
The CSlib is a small, portable library which enables two (or more) independent simulation codes to be coupled by exchanging messages with each other. Both codes link to the library when they are built, and can then communicate with each other as they run. The messages contain data or instructions that the two codes send back and forth to each other. The messaging can take place via files, sockets, or MPI; the latter is a standard distributed-memory message-passing library.
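The coupling pattern that CSlib implements (two codes exchanging small packets of data every time step) can be sketched generically with plain sockets. The snippet below is only an illustration of the client/server idea, with assumed names and a toy "model" on each side; it does not reproduce the CSlib API.

```python
import socket
import struct
import threading
import time

HOST, PORT = "127.0.0.1", 50007   # assumed local endpoint for this sketch
NSTEPS = 5

def server():
    """Toy 'code A': receives one value per time step, sends back an update."""
    with socket.socket() as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            for _ in range(NSTEPS):
                (value,) = struct.unpack("d", conn.recv(8))  # one float64
                reply = 0.5 * value                          # stand-in for a real model
                conn.sendall(struct.pack("d", reply))

def client():
    """Toy 'code B': sends its state each step and folds in the reply."""
    with socket.socket() as cli:
        cli.connect((HOST, PORT))
        state = 100.0
        for step in range(NSTEPS):
            cli.sendall(struct.pack("d", state))
            (feedback,) = struct.unpack("d", cli.recv(8))
            state -= feedback                                # couple the two models
            print(f"step {step}: state = {state:.3f}")

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)   # give the server a moment to start listening
client()
```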