Extensions under development for the HEVC standard
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.
2013-09-01
This paper discusses standardization activities for extending the capabilities of the High Efficiency Video Coding (HEVC) standard - the first edition of which was completed in early 2013. These near-term extensions are focused on three areas: range extensions (such as enhanced chroma formats, monochrome video, and increased bit depth), bitstream scalability extensions for spatial and fidelity scalability, and 3D video extensions (including stereoscopic/multi-view coding, and probably also depth map coding and combinations thereof). The standardization of extensions on each of these topics is planned for completion by mid-2014, and further work beyond that timeframe is also discussed.
Seeling, Patrick; Reisslein, Martin
2014-01-01
Video encoding for multimedia services over communication networks has significantly advanced in recent years with the development of the highly efficient and flexible H.264/AVC video coding standard and its SVC extension. The emerging H.265/HEVC video coding standard as well as 3D video coding further advance video coding for multimedia communications. This paper first gives an overview of these new video coding standards and then examines their implications for multimedia communications by studying the traffic characteristics of long videos encoded with the new coding standards. We review video coding advances from MPEG-2 and MPEG-4 Part 2 to H.264/AVC and its SVC and MVC extensions as well as H.265/HEVC. For single-layer (nonscalable) video, we compare H.265/HEVC and H.264/AVC in terms of video traffic and statistical multiplexing characteristics. Our study is the first to examine the H.265/HEVC traffic variability for long videos. We also illustrate the video traffic characteristics and statistical multiplexing of scalable video encoded with the SVC extension of H.264/AVC as well as 3D video encoded with the MVC extension of H.264/AVC.
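The traffic metrics examined in such studies can be reproduced in miniature. The sketch below (with synthetic, made-up frame sizes, not the authors' traces or toolchain) computes the coefficient of variation of a frame-size trace and illustrates the smoothing effect of statistical multiplexing:

```python
import numpy as np

# Hypothetical per-frame sizes (bits) for an encoded video trace.
rng = np.random.default_rng(0)
frame_sizes = rng.lognormal(mean=10.0, sigma=0.8, size=5000)

# Coefficient of variation (CoV), a standard traffic-variability measure.
cov = frame_sizes.std() / frame_sizes.mean()

# Crude statistical-multiplexing illustration: aggregating n independent
# streams reduces the CoV of the aggregate roughly by a factor sqrt(n).
n_streams = 16
aggregate = sum(rng.permutation(frame_sizes) for _ in range(n_streams))
print(f"single-stream CoV: {cov:.2f}, {n_streams}-stream CoV: "
      f"{aggregate.std() / aggregate.mean():.2f}")
```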
Minerva: Cylindrical coordinate extension for Athena
NASA Astrophysics Data System (ADS)
Skinner, M. Aaron; Ostriker, Eve C.
2013-02-01
Minerva is a cylindrical coordinate extension of the Athena astrophysical MHD code of Stone, Gardiner, Teuben, and Hawley. The extension follows the approach of Athena's original developers and has been designed to alter the existing Cartesian-coordinates code as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport (CT), a central feature of the Athena algorithm, while making use of previously implemented code modules such as the Riemann solvers. Angular momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully.
Additional extensions to the NASCAP computer code, volume 1
NASA Technical Reports Server (NTRS)
Mandell, M. J.; Katz, I.; Stannard, P. R.
1981-01-01
Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three-dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability and the ability to model anisotropic and time-dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. NASCAP/LEO, a three-dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.; Topiwala, Pankaj N.; Luthra, Ajay
2004-11-01
H.264/MPEG-4 AVC is the latest international video coding standard. It was jointly developed by the Video Coding Experts Group (VCEG) of the ITU-T and the Moving Picture Experts Group (MPEG) of ISO/IEC. It uses state-of-the-art coding tools and provides enhanced coding efficiency for a wide range of applications, including video telephony, video conferencing, TV, storage (DVD and/or hard disk based, especially high-definition DVD), streaming video, digital video authoring, digital cinema, and many others. The work on a new set of extensions to this standard has recently been completed. These extensions, known as the Fidelity Range Extensions (FRExt), provide a number of enhanced capabilities relative to the base specification as approved in the Spring of 2003. In this paper, an overview of this standard is provided, including the highlights of the capabilities of the new FRExt features. Some comparisons with the existing MPEG-2 and MPEG-4 Part 2 standards are also provided.
Development and application of structural dynamics analysis capabilities
NASA Technical Reports Server (NTRS)
Heinemann, Klaus W.; Hozaki, Shig
1994-01-01
Extensive research activities were performed in the area of multidisciplinary modeling and simulation of aerospace vehicles relevant to the NASA Dryden Flight Research Facility. The efforts involved theoretical development, computer coding, and debugging of the STARS code. New solution procedures were developed in such areas as structures, CFD, and graphics, among others. Furthermore, systems-oriented codes were developed to render the code truly multidisciplinary and largely automated. Work was also performed on pre- and post-processing of engineering analysis data.
Implementation issues in source coding
NASA Technical Reports Server (NTRS)
Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.
1989-01-01
An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
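As a minimal illustration of the DPCM idea underlying such a scheme, the sketch below codes each sample as the difference from its predecessor; the actual algorithm uses a more elaborate edge-preserving predictor, so this shows only the base pattern:

```python
import numpy as np

def dpcm_encode(x):
    """Lossless DPCM: predict each sample by its predecessor, code the residual."""
    x = np.asarray(x, dtype=np.int64)
    residuals = np.empty_like(x)
    residuals[0] = x[0]             # first sample sent verbatim
    residuals[1:] = x[1:] - x[:-1]  # prediction errors
    return residuals

def dpcm_decode(residuals):
    return np.cumsum(residuals)     # invert the prediction loop

signal = np.array([100, 102, 105, 105, 104, 110])
assert np.array_equal(dpcm_decode(dpcm_encode(signal)), signal)
```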
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
A rapid mission analysis code based on the use of approximate flight path equations of motion is presented. Equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed characteristics were specified in tabular form. The code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses were performed. At high speeds, centrifugal lift effects were accounted for. Extensive turbojet and ramjet engine scaling procedures were incorporated in the code.
Entropy-Based Bounds On Redundancies Of Huffman Codes
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.
1992-01-01
Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes are often closer to 0 than to 1.
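For context, the redundancy in question and one classical bound of the kind this line of work extends can be written as follows (the bound is the Gallager-type result from the literature, quoted here for orientation rather than from the report itself):

```latex
% Redundancy of a prefix code with lengths \ell_i on source probabilities p_i:
R \;=\; \sum_i p_i \ell_i - H(p), \qquad 0 \le R < 1 .
% A classical bound of this type (Gallager, 1978): if the largest symbol
% probability satisfies p_1 < 1/2, then for the Huffman code
R \;\le\; p_1 + 1 - \log_2 e + \log_2 \log_2 e \;\approx\; p_1 + 0.0861 .
```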
Research on Automatic Programming
1975-12-31
Sequential processes, deadlocks, and semaphore primitives, Ph.D. Thesis, Harvard University, November 1974; Center for Research in Computing... verified. Code generated to effect the synchronization makes use of the ECL control extension facility (Prenner's CI, see [Prenner]). The... semaphore operations [Dijkstra] is being developed. Initial results for this code generator are very encouraging; in many cases generated code is
The Particle Accelerator Simulation Code PyORBIT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M
2015-01-01
The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two-level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower-level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open-source code accessible to the public through the Google Open Source Projects Hosting service.
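The two-level structure can be pictured with a short sketch: a Python driver steers a compiled kernel. All names here (fast_kernels, track_bunch) are invented for illustration and are not PyORBIT's actual API:

```python
# Hypothetical two-level pattern: Python orchestrates, compiled code computes.
import numpy as np

try:
    import fast_kernels              # imagined C++ extension module
except ImportError:
    class fast_kernels:              # pure-Python stand-in for the sketch
        @staticmethod
        def track_bunch(coords, length):
            coords[:, 0] += length * coords[:, 1]   # drift: x += L * x'
            return coords

bunch = np.zeros((1000, 2))
bunch[:, 1] = np.random.default_rng(1).normal(0, 1e-3, 1000)  # x' spread
for element_length in [0.5, 1.0, 0.5]:   # a toy lattice of drift elements
    bunch = fast_kernels.track_bunch(bunch, element_length)
print("rms x after tracking:", bunch[:, 0].std())
```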
Algorithm and code development for unsteady three-dimensional Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Obayashi, Shigeru
1994-01-01
Aeroelastic tests involve extensive cost and risk. An aeroelastic wind-tunnel experiment is an order of magnitude more expensive than a parallel experiment involving only aerodynamics. By complementing the wind-tunnel experiments with numerical simulations, the overall cost of aircraft development can be considerably reduced. In order to accurately compute aeroelastic phenomena it is necessary to solve the unsteady Euler/Navier-Stokes equations simultaneously with the structural equations of motion. These equations accurately describe the flow phenomena for aeroelastic applications. At ARC a code, ENSAERO, is being developed for computing the unsteady aerodynamics and aeroelasticity of aircraft; it solves the Euler/Navier-Stokes equations. The purpose of this cooperative agreement was to enhance ENSAERO in both algorithm and geometric capabilities. During the last five years, the algorithms of the code have been enhanced extensively by using high-resolution upwind algorithms and efficient implicit solvers. The zonal capability of the code has been extended from a one-to-one grid interface to a mismatching unsteady zonal interface. The geometric capability of the code has been extended from a single oscillating wing case to a full-span wing-body configuration with oscillating control surfaces. Each time a new capability was added, a proper validation case was simulated, and the capability of the code was demonstrated.
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Java Source Code Analysis for API Migration to Embedded Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, Victor; McCoy, James A.; Guerrero, Jonathan
Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java's APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.
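A toy version of the resolution problem: given an intentionally incomplete symbol table, flag the references that cannot be resolved and would need placeholder declarations before analysis. All names are invented; the article's analysis operates on real Java ASTs:

```python
# Sketch: resolving element references against an incomplete symbol table
# and flagging the unresolved ones for stub generation.
known_types = {"java.lang.String", "java.util.ArrayList", "sensor.Reading"}

references = ["java.lang.String", "javax.swing.JFrame", "sensor.Reading"]

unresolved = [r for r in references if r not in known_types]
print("needs placeholder declarations before migration analysis:", unresolved)
```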
Software Model Checking of ARINC-653 Flight Code with MCP
NASA Technical Reports Server (NTRS)
Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud
2010-01-01
The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.
Development of an Aeroelastic Code Based on an Euler/Navier-Stokes Aerodynamic Solver
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.; Srivastava, Rakesh; Keith, Theo G., Jr.; Stefko, George L.; Janus, Mark J.
1996-01-01
This paper describes the development of an aeroelastic code (TURBO-AE) based on an Euler/Navier-Stokes unsteady aerodynamic analysis. A brief review of the relevant research in the area of propulsion aeroelasticity is presented. The paper briefly describes the original Euler/Navier-Stokes code (TURBO) and then details the development of the aeroelastic extensions. The aeroelastic formulation is described. The modeling of the dynamics of the blade using a modal approach is detailed, along with the grid deformation approach used to model the elastic deformation of the blade. The work-per-cycle approach used to evaluate aeroelastic stability is described. Representative results used to verify the code are presented. The paper concludes with an evaluation of the development thus far, and some plans for further development and validation of the TURBO-AE code.
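The work-per-cycle criterion lends itself to a tiny numerical sketch: integrate the aerodynamic force against the blade velocity over one vibration cycle and read stability off the sign. The force and motion histories below are assumed single-mode toys, not TURBO-AE output:

```python
import numpy as np

# Work-per-cycle sketch: positive net work fed into the blade => instability.
t = np.linspace(0.0, 2 * np.pi, 1000)   # one cycle, nondimensional time
velocity = np.cos(t)                     # blade modal velocity
force = 0.1 * np.sin(t - 0.2)            # unsteady force, lagging the motion

work_per_cycle = np.trapz(force * velocity, t)
print("aeroelastically", "unstable" if work_per_cycle > 0 else "stable",
      f"(W = {work_per_cycle:+.4f})")
```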
Coding conventions and principles for a National Land-Change Modeling Framework
Donato, David I.
2017-07-14
This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
NASA Technical Reports Server (NTRS)
Mcgaw, Michael A.; Saltsman, James F.
1993-01-01
A recently developed high-temperature fatigue life prediction computer code is presented and an example of its usage given. The code discussed is based on the Total Strain version of Strainrange Partitioning (TS-SRP). Included in this code are procedures for characterizing the creep-fatigue durability behavior of an alloy according to TS-SRP guidelines and predicting cyclic life for complex cycle types for both isothermal and thermomechanical conditions. A reasonably extensive materials properties database is included with the code.
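For orientation, strain-based life methods of this family build on the standard total strain-life relation; TS-SRP generalizes forms like the following, with constants that depend on the partitioned cycle type (the equation is quoted as the generic textbook form, not taken from the code):

```latex
% Total strain--life relation (elastic Basquin term + plastic Coffin-Manson
% term); \sigma_f', b, \varepsilon_f', c are material constants:
\frac{\Delta\varepsilon}{2}
  \;=\; \frac{\sigma_f'}{E}\,(2N_f)^{b} \;+\; \varepsilon_f'\,(2N_f)^{c}
```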
Development of V/STOL methodology based on a higher order panel method
NASA Technical Reports Server (NTRS)
Bhateley, I. C.; Howell, G. A.; Mann, H. W.
1983-01-01
The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential flow code. The PAN AIR aerodynamic code, which is a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive support for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, thereby requiring no changes to the basic code and allowing easy replacement of updated modules.
Nonambipolar Transport and Torque in Perturbed Equilibria
NASA Astrophysics Data System (ADS)
Logan, N. C.; Park, J.-K.; Wang, Z. R.; Berkery, J. W.; Kim, K.; Menard, J. E.
2013-10-01
A new Perturbed Equilibrium Nonambipolar Transport (PENT) code has been developed to calculate the neoclassical toroidal torque from radial current composed of both passing and trapped particles in perturbed equilibria. This presentation outlines the physics approach used in the development of the PENT code, with emphasis on the effects of retaining general aspect-ratio geometric effects. First, nonambipolar transport coefficients and corresponding neoclassical toroidal viscous (NTV) torque in perturbed equilibria are re-derived from the first-order gyro-drift-kinetic equation in the "combined-NTV" PENT formalism. The equivalence of NTV torque and change in potential energy due to kinetic effects [J-K. Park, Phys. Plas., 2011] is then used to showcase computational challenges shared between PENT and the stability codes MISK and MARS-K. Extensive comparisons to a reduced model, which makes numerous large-aspect-ratio approximations, are used throughout to emphasize geometry-dependent physics such as pitch angle resonances. These applications make extensive use of the PENT code's native interfacing with the Ideal Perturbed Equilibrium Code (IPEC), and the combination of these codes is a key step towards an iterative solver for self-consistent perturbed equilibrium torque. Supported by US DOE contract #DE-AC02-09CH11466 and the DOE Office of Science Graduate Fellowship administered by the Oak Ridge Institute for Science & Education under contract #DE-AC05-06OR23100.
Extensive Air Showers in the Classroom
ERIC Educational Resources Information Center
Badala, A.; Blanco, F.; La Rocca, P.; Pappalardo, G. S.; Pulvirenti, A.; Riggi, F.
2007-01-01
The basic properties of extensive air showers of particles produced in the interaction of a high-energy primary cosmic ray in the Earth's atmosphere are discussed in the context of educational cosmic ray projects involving undergraduate students and high-school teams. Simulation results produced by an air shower development code were made…
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)
NASA Technical Reports Server (NTRS)
Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.
2003-01-01
A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.
A Deterministic Transport Code for Space Environment Electrons
NASA Technical Reports Server (NTRS)
Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.
2010-01-01
A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.
Compression performance of HEVC and its format range and screen content coding extensions
NASA Astrophysics Data System (ADS)
Li, Bin; Xu, Jizheng; Sullivan, Gary J.
2015-09-01
This paper presents a comparison-based test of the objective compression performance of the High Efficiency Video Coding (HEVC) standard, its format range extensions (RExt), and its draft screen content coding extensions (SCC). The current dominant standard, H.264/MPEG-4 AVC, is used as an anchor reference in the comparison. The conditions used for the comparison tests were designed to reflect relevant application scenarios and to enable a fair comparison to the maximum extent feasible - i.e., using comparable quantization settings, reference frame buffering, intra refresh periods, rate-distortion optimization decision processing, etc. It is noted that such PSNR-based objective comparisons generally provide more conservative estimates of HEVC benefit than are found in subjective studies. The experimental results show that, when compared with H.264/MPEG-4 AVC, HEVC version 1 provides a bit rate savings for equal PSNR of about 23% for all-intra coding, 34% for random access coding, and 38% for low-delay coding. This is consistent with prior studies and the general characterization that HEVC can provide a bit rate savings of about 50% for equal subjective quality for most applications. The HEVC format range extensions provide a similar bit rate savings of about 13-25% for all-intra coding, 28-33% for random access coding, and 32-38% for low-delay coding at different bit rate ranges. For lossy coding of screen content, the HEVC screen content coding extensions achieve a bit rate savings of about 66%, 63%, and 61% for all-intra coding, random access coding, and low-delay coding, respectively. For lossless coding, the corresponding bit rate savings are about 40%, 33%, and 32%, respectively.
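A simplified way to reproduce this kind of equal-PSNR comparison is to interpolate log-rate against PSNR for the two codecs and average the gap, a crude stand-in for the Bjontegaard delta-rate metric typically used in such studies; the rate-distortion points below are invented:

```python
import numpy as np

# Made-up rate-distortion points for an anchor codec and a test codec.
psnr_avc  = np.array([34.0, 36.5, 38.8, 41.0])   # dB
rate_avc  = np.array([1200, 2400, 4800, 9600])   # kbps
psnr_hevc = np.array([34.2, 36.9, 39.1, 41.5])
rate_hevc = np.array([ 750, 1500, 3000, 6000])

# Interpolate log-rate vs. PSNR over the overlapping PSNR range.
lo = max(psnr_avc.min(), psnr_hevc.min())
hi = min(psnr_avc.max(), psnr_hevc.max())
grid = np.linspace(lo, hi, 100)
log_r1 = np.interp(grid, psnr_avc,  np.log(rate_avc))
log_r2 = np.interp(grid, psnr_hevc, np.log(rate_hevc))

savings = 1.0 - np.exp(np.mean(log_r2 - log_r1))
print(f"average bit-rate savings at equal PSNR: {savings:.1%}")
```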
Applying a Service-Oriented Architecture to Operational Flight Program Development
2007-09-01
using two Java 2 Enterprise Edition (J2EE) Web servers. The weapon models were accessed using a SUN Microsystems Java Web Services Development Pack..., and Spring/Hibernate to provide the data access... since a major coding effort was avoided. The majority of the effort was tweaking pre-existing Java source code and editing of eXtensible Markup
Van Laere, Sven; Nyssen, Marc; Verbeke, Frank
2017-01-01
Clinical coding is a requirement to provide valuable data for billing, epidemiology and health care resource allocation. In sub-Saharan Africa, we observe a growing awareness of the need for coding of clinical data, not only in health insurances, but also in governments and hospitals. Presently, coding systems in sub-Saharan Africa are often used for billing purposes. In this paper, we consider how a nomenclature can also have a clinical impact. Coding systems are often assumed to be too complex and extensive to be used in daily practice. Here, we present a method for constructing a new nomenclature based on existing coding systems by considering a minimal subset for the sub-Saharan region. Evaluation of completeness will be done nationally using the requirements of national registries. The nomenclature requires an extension character for dealing with codes that have to be used for multiple registries. Hospitals will benefit most by using this extension character.
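The extension-character idea can be sketched in a few lines: a base code resolves clinically, and a suffix character selects a registry-specific variant. The codes and registry names below are invented for illustration:

```python
# Sketch of an extension character selecting registry-specific variants.
nomenclature = {
    "A100":  {"label": "malaria, uncomplicated"},
    "A100B": {"label": "malaria, uncomplicated", "registry": "billing"},
    "A100E": {"label": "malaria, uncomplicated", "registry": "epidemiology"},
}

def resolve(code):
    entry = nomenclature[code]
    return entry["label"], entry.get("registry", "clinical")

print(resolve("A100E"))   # ('malaria, uncomplicated', 'epidemiology')
```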
The Clawpack Community of Codes
NASA Astrophysics Data System (ADS)
Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.
2014-12-01
Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws, but over the years it has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely by the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since been both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.
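A minimal PyClaw driver, written in the style of the 1D acoustics example shipped with Clawpack; exact module paths and problem-data keys may vary between releases, so treat this as a sketch rather than a pinned API reference:

```python
from clawpack import pyclaw, riemann

# 1D acoustics with extrapolation boundaries.
solver = pyclaw.ClawSolver1D(riemann.acoustics_1D)
solver.bc_lower[0] = pyclaw.BC.extrap
solver.bc_upper[0] = pyclaw.BC.extrap

domain = pyclaw.Domain([-1.0], [1.0], [200])
state = pyclaw.State(domain, solver.num_eqn)
state.problem_data['rho'] = 1.0    # density
state.problem_data['bulk'] = 4.0   # bulk modulus
state.problem_data['zz'] = 2.0     # impedance, sqrt(rho*bulk)
state.problem_data['cc'] = 2.0     # sound speed, sqrt(bulk/rho)

x = state.grid.x.centers
state.q[0, :] = (abs(x) < 0.2) * 1.0   # initial pressure pulse
state.q[1, :] = 0.0                    # zero initial velocity

claw = pyclaw.Controller()
claw.solution = pyclaw.Solution(state, domain)
claw.solver = solver
claw.tfinal = 0.5
claw.run()
```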
Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim
2005-01-01
With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the extensible Markup Language (XML) that facilitates the integration in current information systems or coding software taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
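A document-oriented encoding of a classification fragment might look like the following; the tag vocabulary is invented for this sketch and is not the authors' schema:

```python
# Illustrative XML encoding of a tiny ICD-10-like hierarchy.
import xml.etree.ElementTree as ET

xml_fragment = """
<chapter code="IX" title="Diseases of the circulatory system">
  <block code="I10-I15" title="Hypertensive diseases">
    <category code="I10" title="Essential (primary) hypertension"/>
  </block>
</chapter>
"""

root = ET.fromstring(xml_fragment)
for category in root.iter("category"):
    print(category.get("code"), "-", category.get("title"))
```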
The Athena Astrophysical MHD Code in Cylindrical Geometry
NASA Astrophysics Data System (ADS)
Skinner, M. A.; Ostriker, E. C.
2011-10-01
We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained transport updates. Finally, we have developed a test suite of standard and novel problems in one, two, and three dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.
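The geometric bookkeeping such an extension must carry (face areas and cell volumes weighted by radius) shows up even in a toy finite-volume update for radial advection; this sketch is illustrative only and is unrelated to Athena's actual implementation:

```python
import numpy as np

# Toy finite-volume update for dq/dt + (1/r) d(r F)/dr = 0 with F = u*q,
# first-order upwind, u > 0 assumed, inner boundary at r = 0 (zero flux).
nr, dr, dt, u = 64, 0.05, 0.01, 1.0
r_face = dr * np.arange(nr + 1)               # face radii
r_cell = 0.5 * (r_face[:-1] + r_face[1:])     # cell-center radii
vol = 0.5 * (r_face[1:]**2 - r_face[:-1]**2)  # per-radian cell "volumes"

q = np.exp(-((r_cell - 1.0) / 0.2) ** 2)      # initial profile
for _ in range(50):
    flux = u * np.concatenate(([0.0], q))     # upwind face fluxes
    q -= dt * (r_face[1:] * flux[1:] - r_face[:-1] * flux[:-1]) / vol
print("total (r-weighted) mass:", (q * vol).sum())
```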
The Development of the World Anti-Doping Code.
Young, Richard
2017-01-01
This chapter addresses both the development and substance of the World Anti-Doping Code, which was adopted in 2003, as well as the subsequent Code amendments that came into effect in 2009 and 2015. Through an extensive process of stakeholder input and collaboration, the World Anti-Doping Code has transformed the hodgepodge of inconsistent and competing pre-2003 anti-doping rules into a harmonized and effective approach to anti-doping. The Code, as amended, is now recognized worldwide as the gold standard in anti-doping. The World Anti-Doping Code originally went into effect on January 1, 2004. The first amendments to the Code went into effect on January 1, 2009, and the second on January 1, 2015. The Code and the related international standards are the product of a long and collaborative process designed to make the fight against doping more effective through the adoption and implementation of worldwide harmonized rules and best practices. © 2017 S. Karger AG, Basel.
Fully-Implicit Navier-Stokes (FIN-S)
NASA Technical Reports Server (NTRS)
Kirk, Benjamin S.
2010-01-01
FIN-S is a SUPG finite element code for flow problems under active development at NASA Lyndon B. Johnson Space Center and within PECOS: a) the code is built on top of the libMesh parallel, adaptive finite element library; b) the initial implementation of the code targeted supersonic/hypersonic laminar calorically perfect gas flows and conjugate heat transfer; c) the code was initially extended to thermochemical nonequilibrium about 9 months ago; d) the technologies in FIN-S have been enhanced through a strongly collaborative research effort with Sandia National Labs.
Parallel CARLOS-3D code development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Putnam, J.M.; Kotulski, J.D.
1996-02-01
CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method-of-moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.
On the validation of a code and a turbulence model appropriate to circulation control airfoils
NASA Technical Reports Server (NTRS)
Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.
1988-01-01
A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Here, preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Traleika Glacier X-Stack Extension Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fryman, Joshua
The XStack Extension Project continued along the direction of the XStack program in exploring the software tools and frameworks needed to support a task-based community runtime toward the goal of Exascale programming. The momentum built in the XStack project, with the development of the task-based Open Community Runtime (OCR) and related tools, was carried through the XStack Extension with focus areas of easing application development, improving performance, and supporting more features. The infrastructure set up for community-driven open-source development continued to be used toward these areas, with continued co-development of the runtime and applications. A variety of OCR programming environments were studied, as described in the sections on Revolutionary Programming Environments and Applications, to assist with application development on OCR, and we developed the OCR Translator, a ROSE-based source-to-source compiler that parses high-level annotations in an MPI program to generate equivalent OCR code. Figure 2 compares the number of OCR objects needed to generate the 2D stencil workload using the translator against manual approaches based on an SPMD library or native coding. The rate of increase with the translator, as the number of ranks grows, is consistent with the other approaches. This is explored further in the section on the OCR Translator.
Extension of applicable neutron energy of DARWIN up to 1 GeV.
Satoh, D; Sato, T; Endo, A; Matsufuji, N; Takada, M
2007-01-01
The radiation-dose monitor, DARWIN, needs a set of response functions of the liquid organic scintillator to assess a neutron dose. SCINFUL-QMD is a Monte Carlo based computer code to evaluate the response functions. In order to improve the accuracy of the code, a new light-output function based on experimental data was developed for the production and transport of protons, deuterons, tritons, (3)He nuclei and alpha particles, and incorporated into the code. The applicable energy of DARWIN was extended to 1 GeV using the response functions calculated by the modified SCINFUL-QMD code.
A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.
1994-01-01
Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven-year effort was established in 1990 by NASA's Office of Aeronautics, Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. aerospace industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high-order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, a numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.
New developments of the CARTE thermochemical code: A two-phase equation of state for nanocarbons
NASA Astrophysics Data System (ADS)
Dubois, Vincent; Pineau, Nicolas
2016-01-01
We developed a new equation of state (EOS) for nanocarbons in the thermodynamic range of high explosives detonation products (up to 50 GPa and 4000 K). This EOS was fitted to an extensive database of thermodynamic properties computed by molecular dynamics simulations of nanodiamonds and nano-onions with the LCBOPII potential. We reproduced the detonation properties of a variety of high explosives with the CARTE thermochemical code, including carbon-poor and carbon-rich explosives, with excellent accuracy.
NASA Technical Reports Server (NTRS)
Butler, Madeline J.; Sonneborn, George; Perkins, Dorothy C.
1994-01-01
The Mission Operations and Data Systems Directorate (MO&DSD, Code 500), the Space Sciences Directorate (Code 600), and the Flight Projects Directorate (Code 400) have developed a new approach to combine the science and mission operations for the FUSE mission. FUSE, the last of the Delta-class Explorer missions, will obtain high resolution far-ultraviolet spectra (910 - 1220 A) of stellar and extragalactic sources to study the evolution of galaxies and conditions in the early universe. FUSE will be launched in 2000 into a 24-hour highly eccentric orbit. Science operations will be conducted in real time for 16-18 hours per day, in a manner similar to the operations performed today for the International Ultraviolet Explorer. In a radical departure from previous missions, the operations concept combines spacecraft and science operations and data processing functions in a single facility to be housed in the Laboratory for Astronomy and Solar Physics (Code 680). A small mission operations team will provide the spacecraft control, telescope operations and data handling functions in a facility designated as the Science and Mission Operations Center (SMOC). This approach will utilize the Transportable Payload Operations Control Center (TPOCC) architecture for both spacecraft and instrument commanding. Other concepts of integrated operations being developed by the Code 500 Renaissance Project will also be employed for the FUSE SMOC. The primary objective of this approach is to reduce development and mission operations costs: the operations concept, the integration of mission and science operations, and the extensive use of existing hardware and software tools are all expected to decrease both development and operations costs substantially. This paper describes the FUSE operations concept, the systems engineering approach used for its development, and the software, hardware and management tools that will make its implementation feasible.
Breaking the Code: The Creative Use of QR Codes to Market Extension Events
ERIC Educational Resources Information Center
Hill, Paul; Mills, Rebecca; Peterson, GaeLynn; Smith, Janet
2013-01-01
The use of smartphones has drastically increased in recent years, heralding an explosion in the use of QR codes. The black-and-white square barcodes that link the physical and digital worlds are everywhere. These simple codes can provide many opportunities to connect people in the physical world with many of Extension's online resources. The…
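Generating such a code takes only a couple of lines with the widely used Python qrcode package; the URL below is a placeholder, not a real Extension resource:

```python
# pip install qrcode[pil]
import qrcode

# Encode an event URL and save the barcode image for print materials.
img = qrcode.make("https://extension.example.edu/events/canning-workshop")
img.save("event_qr.png")
```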
MIADS2 ... an alphanumeric map information assembly and display system for a large computer
Elliot L. Amidon
1966-01-01
A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...
NASA Astrophysics Data System (ADS)
Tanikawa, Ataru; Yoshikawa, Kohji; Okamoto, Takashi; Nitadori, Keigo
2012-02-01
We present a high-performance N-body code for self-gravitating collisional systems accelerated with the aid of a new SIMD instruction set extension of the x86 architecture: Advanced Vector eXtensions (AVX), an enhanced version of the Streaming SIMD Extensions (SSE). With one processor core of an Intel Core i7-2600 processor (8 MB cache and 3.40 GHz) based on the Sandy Bridge micro-architecture, we implemented a fourth-order Hermite scheme with an individual timestep scheme (Makino and Aarseth, 1992), and achieved a performance of ~20 giga floating point operations per second (GFLOPS) for double-precision accuracy, which is two times and five times higher than that of the previously developed code implemented with the SSE instructions (Nitadori et al., 2006b), and that of a code implemented without any explicit use of SIMD instructions on the same processor core, respectively. We have parallelized the code by using the so-called NINJA scheme (Nitadori et al., 2006a), and achieved ~90 GFLOPS for a system containing more than N = 8192 particles with 8 MPI processes on four cores. We expect to achieve about 10 tera FLOPS (TFLOPS) for a self-gravitating collisional system with N ~ 10^5 on massively parallel systems with at most 800 cores with the Sandy Bridge micro-architecture. This performance will be comparable to that of Graphics Processing Unit (GPU) cluster systems, such as one with about 200 Tesla C1070 GPUs (Spurzem et al., 2010). This paper offers an alternative to collisional N-body simulations with GRAPEs and GPUs.
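The arithmetic kernel being accelerated is plain O(N^2) direct summation; a NumPy rendering of the same pairwise-force computation (softened here to sidestep self-interaction) conveys the data flow that the AVX intrinsics hand-optimize:

```python
import numpy as np

# Vectorized direct-summation gravity kernel (G = 1, Plummer softening eps).
def accelerations(pos, mass, eps=1e-4):
    dx = pos[None, :, :] - pos[:, None, :]          # pairwise separations
    r2 = (dx ** 2).sum(-1) + eps ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                   # no self-interaction
    return (dx * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

rng = np.random.default_rng(2)
pos, mass = rng.normal(size=(512, 3)), np.full(512, 1.0 / 512)
print("acceleration of particle 0:", accelerations(pos, mass)[0])
```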
Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1989-01-01
An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.
Infrastructure for Rapid Development of Java GUI Programs
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip
2006-01-01
The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application-programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
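The XML-described-GUI idea can be sketched with tkinter: widgets come from a declarative description, behavior from a registry of named callbacks. The XML vocabulary here is invented for the sketch and is not JAS's actual schema:

```python
# Minimal declarative-GUI sketch: XML spec + callback registry.
import tkinter as tk
import xml.etree.ElementTree as ET

SPEC = '<window title="Demo"><button label="Run" action="run"/></window>'
ACTIONS = {"run": lambda: print("plug-in code invoked")}

root_el = ET.fromstring(SPEC)
win = tk.Tk()
win.title(root_el.get("title"))
for el in root_el.findall("button"):
    tk.Button(win, text=el.get("label"),
              command=ACTIONS[el.get("action")]).pack()
win.mainloop()
```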
A generalized one-dimensional computer code for turbomachinery cooling passage flow calculations
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Roelke, Richard J.; Meitner, Peter L.
1989-01-01
A generalized one-dimensional computer code for analyzing the flow and heat transfer in turbomachinery cooling passages was developed. This code is capable of handling rotating cooling passages with turbulators, 180-degree turns, pin fins, finned passages, bypass flows, tip cap impingement flows, and flow branching. The code is an extension of a one-dimensional code developed by P. Meitner. In the subject code, correlations for both heat transfer coefficient and pressure loss computations were developed to model each of the above-mentioned types of coolant passages. The code has the capability of independently computing the friction factor and heat transfer coefficient on each side of a rectangular passage. Either the mass flow at the inlet to the channel or the exit plane pressure can be specified. For a specified inlet total temperature, inlet total pressure, and exit static pressure, the code computes the flow rates through the main branch and the subbranches, and the flow through the tip cap for impingement cooling, in addition to computing the coolant pressure, temperature, and heat transfer coefficient distribution in each coolant flow branch. Predictions from the subject code for both nonrotating and rotating passages agree well with experimental data. The code was used to analyze the cooling passage of a research cooled radial rotor.
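The correlations such a code evaluates at each station resemble the following single-station sketch, using the standard Blasius friction factor and Dittus-Boelter Nusselt number for smooth turbulent passage flow; property values are illustrative, and the code's actual correlations for turbulated, rotating passages are more elaborate:

```python
# Single-station smooth-passage correlations (turbulent flow).
Re = 5.0e4      # Reynolds number
Pr = 0.7        # Prandtl number
k = 0.05        # coolant thermal conductivity, W/(m*K)
D_h = 0.004     # hydraulic diameter, m

f = 0.316 * Re ** -0.25             # Blasius friction factor
Nu = 0.023 * Re ** 0.8 * Pr ** 0.4  # Dittus-Boelter (heating)
h = Nu * k / D_h                    # heat transfer coefficient, W/(m^2*K)
print(f"f = {f:.4f}, Nu = {Nu:.1f}, h = {h:.0f} W/m^2K")
```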
Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim
2003-01-01
With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.
2012-01-01
An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
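The class of algorithm involved, Gauss-Levenberg-Marquardt estimation of model parameters from residuals, can be shown in miniature with SciPy; the toy exponential below stands in for an external simulator and says nothing about PEST++ internals:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "observations" from a 2-parameter exponential model plus noise.
t_obs = np.linspace(0, 10, 20)
y_obs = 3.0 * np.exp(-0.4 * t_obs) + np.random.default_rng(3).normal(0, 0.02, 20)

def residuals(p):
    amplitude, decay = p
    return amplitude * np.exp(-decay * t_obs) - y_obs

# Levenberg-Marquardt minimization of the sum of squared residuals.
fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
print("estimated parameters:", fit.x)   # expect roughly [3.0, 0.4]
```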
The Limits of Coding with Joint Constraints on Detected and Undetected Error Rates
NASA Technical Reports Server (NTRS)
Dolinar, Sam; Andrews, Kenneth; Pollara, Fabrizio; Divsalar, Dariush
2008-01-01
We develop a remarkably tight upper bound on the performance of a parameterized family of bounded angle maximum-likelihood (BA-ML) incomplete decoders. The new bound for this class of incomplete decoders is calculated from the code's weight enumerator, and is an extension of Poltyrev-type bounds developed for complete ML decoders. This bound can also be applied to bound the average performance of random code ensembles in terms of an ensemble average weight enumerator. We also formulate conditions defining a parameterized family of optimal incomplete decoders, defined to minimize both the total codeword error probability and the undetected error probability for any fixed capability of the decoder to detect errors. We illustrate the gap between optimal and BA-ML incomplete decoding via simulation of a small code.
An object oriented extension to CLIPS
NASA Technical Reports Server (NTRS)
Sobkowicz, Clifford
1990-01-01
A presentation of a software subsystem developed to augment the C Language Integrated Production System (CLIPS) with facilities for object-oriented knowledge representation. Functions are provided to define classes, instantiate objects, access attributes, and assert object-related facts. This extension is implemented via the CLIPS user-function interface and does not require modification of any CLIPS code. It does rely on internal CLIPS functions for memory management and symbol representation.
Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder
NASA Technical Reports Server (NTRS)
Staats, Matt
2009-01-01
We present work on a prototype tool based on the Java Pathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation to be performed, such as test suite reduction and execution of generated tests.
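For orientation, MC/DC requires showing that each condition independently affects its decision's outcome. The sketch below brute-forces such "independence pairs" for a toy decision; it only illustrates what the coverage obligations assert, whereas JPF derives tests symbolically through model checking rather than by enumeration.

```python
# Brute-force MC/DC independence pairs: pairs of inputs differing in exactly
# one condition whose decision outcomes differ. Illustration only.
from itertools import product

def mcdc_pairs(decision, n_conds):
    pairs = {i: [] for i in range(n_conds)}
    for v in product([False, True], repeat=n_conds):
        for i in range(n_conds):
            w = list(v)
            w[i] = not w[i]                     # flip exactly one condition
            if decision(*v) != decision(*w):    # decision outcome must change
                pairs[i].append((v, tuple(w)))
    return pairs

# Toy decision: (a and b) or c -- every condition has at least one pair.
for cond, plist in mcdc_pairs(lambda a, b, c: (a and b) or c, 3).items():
    print(f"condition {cond}: e.g. {plist[0]}")
```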
An overview of new video coding tools under consideration for VP10: the successor to VP9
NASA Astrophysics Data System (ADS)
Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu
2015-09-01
Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM Project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM Project has already embarked on an ambitious effort to develop a next-edition codec, VP10, that achieves at least a generational bitrate reduction over the current-generation codec VP9. Although the project is still in its early stages, a set of new experimental coding tools has already been added to baseline VP9 to achieve modest coding gains over a large test set. This paper provides a technical overview of these coding tools.
Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials
NASA Technical Reports Server (NTRS)
Luebbers, Raymond J.; Beggs, John H.
1991-01-01
Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three-dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. The extension of FDTD to more complicated materials was made. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far-zone time domain results in two dimensions. Also, the capability to model nonlinear materials was incorporated into FDTD and validated.
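The core of any FDTD code is a leapfrog exchange of curl updates between the electric and magnetic fields. The fragment below is a minimal 1D vacuum sketch for orientation only; the dispersive and nonlinear material treatments described in the report add further terms to these updates. Grid size, source shape, and normalized units are arbitrary choices here.

```python
# Minimal 1D vacuum FDTD (Yee) leapfrog sketch in normalized units (dt = dx/c).
import numpy as np

nx, nt = 200, 400
ez = np.zeros(nx)        # E-field at integer grid points
hy = np.zeros(nx - 1)    # H-field at half-integer points

for n in range(nt):
    hy += np.diff(ez)                                # H update from curl of E
    ez[1:-1] += np.diff(hy)                          # E update from curl of H
    ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source

print("peak |Ez| after propagation:", np.abs(ez).max())
```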
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individuals or small groups of researchers to develop scientifically sound software are impossible to sustain, duplicate effort, and make it difficult for scientists to adopt state-of-the-art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices, formally through webinar series, workshops, and tutorials, and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers, and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.
LeRC-HT: NASA Lewis Research Center General Multiblock Navier-Stokes Heat Transfer Code Developed
NASA Technical Reports Server (NTRS)
Heidmann, James D.; Gaugler, Raymond E.
1999-01-01
For the last several years, LeRC-HT, a three-dimensional computational fluid dynamics (CFD) computer code for analyzing gas turbine flow and convective heat transfer, has been evolving at the NASA Lewis Research Center. The code is unique in its ability to give a highly detailed representation of the flow field very close to solid surfaces. This is necessary for an accurate representation of fluid heat transfer and viscous shear stresses. The code has been used extensively for both internal cooling passage flows and hot gas path flows--including detailed film cooling calculations, complex tip-clearance gap flows, and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool (at least 35 technical papers have been published relative to the code and its application), but it should be useful for detailed design analysis. We now plan to make this code available to selected users for further evaluation.
NASA Astrophysics Data System (ADS)
Dattoli, G.; Migliorati, M.; Schiavi, A.
2007-05-01
The coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are rarely achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient in treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treating CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contributions due to wake-field effects. The proposed solution method exploits an algebraic technique that uses exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed.
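As a rough, generic illustration of integration by exponential operators (not the authors' CSR wake-field model), the sketch below evolves a 1D1V Vlasov distribution with symmetric Strang splitting, applying each advection exponential exactly: in x via a spectral phase shift, and in v via a simple grid shift. All parameters are invented.

```python
# Toy split-operator solution of df/dt = -v df/dx - F(x) df/dv, applying the
# exponential of each advection operator exactly. Illustrative parameters only.
import numpy as np

nx = nv = 64
x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
v = np.linspace(-4, 4, nv, endpoint=False)
kx = 2 * np.pi * np.fft.fftfreq(nx, d=x[1] - x[0])
f = np.exp(-0.5 * v[None, :] ** 2) * (1 + 0.1 * np.cos(x))[:, None]
dt, force, dv = 0.05, -np.sin(x), v[1] - v[0]   # pendulum force stands in for wakes

def shift_x(f, tau):
    """Apply exp(-tau * v d/dx) exactly as a Fourier phase shift in x."""
    phase = np.exp(-1j * kx[:, None] * v[None, :] * tau)
    return np.real(np.fft.ifft(np.fft.fft(f, axis=0) * phase, axis=0))

for _ in range(100):
    f = shift_x(f, dt / 2)                          # half-step streaming in x
    steps = np.rint(force * dt / dv).astype(int)    # velocity kick as a grid roll
    for i in range(nx):
        f[i] = np.roll(f[i], steps[i])
    f = shift_x(f, dt / 2)                          # second half-step in x

print("total density (conserved by the shifts):", f.sum())
```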
Progressive Dictionary Learning with Hierarchical Predictive Structure for Scalable Video Coding.
Dai, Wenrui; Shen, Yangmei; Xiong, Hongkai; Jiang, Xiaoqian; Zou, Junni; Taubman, David
2017-04-12
Dictionary learning has emerged as a promising alternative to the conventional hybrid coding framework. However, the rigid structure of sequential training and prediction degrades its performance in scalable video coding. This paper proposes a progressive dictionary learning framework with a hierarchical predictive structure for scalable video coding, especially in the low-bitrate region. For pyramidal layers, sparse representation based on a spatio-temporal dictionary is adopted to improve the coding efficiency of enhancement layers (ELs) with a guarantee of reconstruction performance. The overcomplete dictionary is trained to adaptively capture local structures along motion trajectories as well as exploit the correlations between neighboring layers of resolutions. Furthermore, progressive dictionary learning is developed to enable scalability in the temporal domain and restrict error propagation in a closed-loop predictor. Under the hierarchical predictive structure, online learning is leveraged to guarantee the training and prediction performance with an improved convergence rate. For compatibility with the state-of-the-art scalable extension of H.264/AVC and the latest HEVC, standardized codec cores are utilized to encode the base and enhancement layers. Experimental results show that the proposed method outperforms the latest SHVC and HEVC simulcast over extensive test sequences with various resolutions.
Summary of papers on current and anticipated uses of thermal-hydraulic codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines, or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; and build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).
The electromagnetic modeling of thin apertures using the finite-difference time-domain technique
NASA Technical Reports Server (NTRS)
Demarest, Kenneth R.
1987-01-01
A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.
Borcherdt, Roger D.
2012-01-01
VS30, defined as the average seismic shear-wave velocity from the surface to a depth of 30 meters, has found widespread use as a parameter to characterize site response for simplified earthquake-resistant design as implemented in building codes worldwide. VS30, as initially introduced by the author for the US 1994 NEHRP Building Code, provides unambiguous definitions of site classes and site coefficients for site-dependent response spectra based on correlations derived from extensive borehole logging and comparative ground-motion measurement programs in California. Subsequent use of VS30 for development of strong ground motion prediction equations (GMPEs) and measurement of extensive sets of VS borehole data have confirmed the previous empirical correlations and established correlations of VS30 with VSz at other depths. These correlations provide closed-form expressions to predict VS30 at a large number of additional sites and further justify VS30 as a parameter to characterize site response for simplified building codes, GMPEs, ShakeMap, and seismic hazard mapping.
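VS30 itself is a harmonic (travel-time) average over the top 30 meters: 30 m divided by the total shear-wave travel time through the upper 30 m. A short sketch, with a purely hypothetical layered profile:

```python
# VS30 = 30 / sum(h_i / v_i) over the layers within the top 30 m.
# The example profile is hypothetical.
def vs30(thicknesses_m, velocities_mps):
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        use = min(h, 30.0 - depth)      # only the part of the layer above 30 m
        travel_time += use / v
        depth += use
        if depth >= 30.0:
            break
    return 30.0 / travel_time

# Example: 5 m of 200 m/s soil, 10 m of 350 m/s, then stiffer material below.
print(f"VS30 = {vs30([5, 10, 40], [200, 350, 760]):.0f} m/s")
```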
Coupled-cluster based R-matrix codes (CCRM): Recent developments
NASA Astrophysics Data System (ADS)
Sur, Chiranjib; Pradhan, Anil K.
2008-05-01
We report the ongoing development of the new coupled-cluster R-matrix codes (CCRM) for treating electron-ion scattering and radiative processes within the framework of the relativistic coupled-cluster method (RCC), interfaced with the standard R-matrix methodology. The RCC method is size consistent and in principle equivalent to an all-order many-body perturbation theory. The RCC method is one of the most accurate many-body theories, and has been applied to several systems. This project should enable the study of electron interactions with heavy atoms/ions, utilizing not only high-speed computing platforms but also an improved theoretical description of the relativistic and correlation effects for the target atoms/ions as treated extensively within the RCC method. Here we present a comprehensive outline of the newly developed theoretical method and a schematic representation of the new suite of CCRM codes. We begin with the flowchart and description of various stages involved in this development. We retain the notations and nomenclature of different stages as analogous to the standard R-matrix codes.
Message Into Medium: An Extension of the Dual Coding Hypothesis.
ERIC Educational Resources Information Center
Simpson, Timothy J.
This paper examines the dual coding hypothesis, a model of the coding of visual and textual information, from the perspective of a mass media professional, such as a teacher, interested in accurately presenting both visual and textual material to a mass audience (i.e., students). It offers an extension to the theory, based upon the various skill…
The chemical basis for the origin of the genetic code and the process of protein synthesis
NASA Technical Reports Server (NTRS)
1981-01-01
The principles upon which the process of protein synthesis and the genetic code were established are elucidated. Extensive work on nuclear magnetic resonance studies of both monomer-monomer and mono-amino acid-polynucleotide interactions is included. A new method of general utility for studying any amino acid interacting with any polynucleotide was developed. This system involves the use of methyl esters of amino acids interacting with polynucleotides.
An extensible circuit QED architecture for quantum computation
NASA Astrophysics Data System (ADS)
Dicarlo, Leo
Realizing a logical qubit robust to single errors in its constituent physical elements is an immediate challenge for quantum information processing platforms. A longer-term challenge will be achieving quantum fault tolerance, i.e., improving logical qubit resilience by increasing redundancy in the underlying quantum error correction code (QEC). In QuTech, we target these challenges in collaboration with industrial and academic partners. I will present the circuit QED quantum hardware, room-temperature control electronics, and software components of the complete architecture. I will show the extensibility of each component to the Surface-17 and -49 circuits needed to reach the objectives with surface-code QEC, and provide an overview of latest developments. Research funded by IARPA and Intel Corporation.
Verification and Validation Strategy for LWRS Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carl M. Stoots; Richard R. Schultz; Hans D. Gougar
2012-09-01
One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203, the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high fidelity models / codes, and scaling of aging effects.
A Comprehensive Validation Approach Using The RAVEN Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J
2015-06-01
The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. The state-of-the-art validation approaches use neither exploration of the input space through sampling strategies, nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these gaps. In the following sections, the employed methodology, and its application to the newly developed thermal-hydraulic code RELAP-7, is reported. The validation approach has been applied to an integral effect experiment, representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
NASA Technical Reports Server (NTRS)
Clark, R. T.; Mccallister, R. D.
1982-01-01
The particular coding option identified as providing the best level of coding gain performance in an LSI-efficient implementation was the optimal constraint length five, rate one-half convolutional code. To determine the specific set of design parameters which optimally matches this decoder to the LSI constraints, a breadboard MCD (maximum-likelihood convolutional decoder) was fabricated and used to generate detailed performance trade-off data. The extensive performance testing data gathered during this design tradeoff study are summarized, and the functional and physical MCD chip characteristics are presented.
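For orientation, a rate one-half, constraint-length-five convolutional encoder is just a five-bit shift register tapped by two generator polynomials; the maximum-likelihood decoder in the MCD inverts this by a Viterbi search over the resulting 16-state trellis. The octal generator pair (23, 35) below is a common textbook choice for K = 5; the abstract does not state the chip's actual polynomials, so treat them as an assumption.

```python
# Sketch of a constraint-length-5, rate-1/2 convolutional encoder.
# Generators (23, 35) octal are an assumed, commonly cited K = 5 pair.
G1, G2 = 0o23, 0o35  # 5-bit generator polynomials

def encode(bits):
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b11111      # shift register, K = 5
        out += [bin(state & G1).count("1") & 1,   # parity of tapped bits,
                bin(state & G2).count("1") & 1]   # one per output stream
    return out

print(encode([1, 0, 1, 1, 0, 0, 0, 0]))  # two coded bits per input bit
```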
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.
1998-01-01
This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High-resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.
Combustion chamber analysis code
NASA Technical Reports Server (NTRS)
Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.
1993-01-01
A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.
Extension, validation and application of the NASCAP code
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.
1979-01-01
Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.
Nodal network generator for CAVE3
NASA Technical Reports Server (NTRS)
Palmieri, J. V.; Rathjen, K. A.
1982-01-01
A new extension of the CAVE3 code was developed that automates the creation of a finite difference math model in digital form ready for input to the CAVE3 code. The new software, the Nodal Network Generator, is broken into two segments. One segment generates the model geometry using a Tektronix tablet digitizer, and the other generates the actual finite difference model and allows for graphic verification using a Tektronix 4014 graphics scope. Use of the Nodal Network Generator is described.
CESAR: A Code for Nuclear Fuel and Waste Characterisation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vidal, J.M.; Grouiller, J.P.; Launay, A.
2006-07-01
CESAR (Simplified Evolution Code Applied to Reprocessing) is a depletion code developed through a joint program between CEA and COGEMA. In the late 1980s, the first use of this code dealt with nuclear measurement at the laboratories of the La Hague reprocessing plant. The use of CESAR was then extended to characterization of all entrance materials and to characterization, via tracers, of all produced waste. The code can distinguish more than 100 heavy nuclides, 200 fission products and 100 activation products, and it can characterize both the fuel and the structural material of the fuel. CESAR can also make depletion calculations from 3 months to 1 million years of cooling time. Between 2003 and 2005, the fifth version of the code was developed. The modifications were related to the harmonization of the code's nuclear data with the JEF2.2 nuclear data file. This paper describes the code and explains the extensive use of this code at the La Hague reprocessing plant and also for prospective studies. The second part focuses on the modifications of the latest version, and describes the application field and the qualification of the code. Many companies and the IAEA use CESAR today. CESAR offers a Graphical User Interface, which is very user-friendly. (authors)
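The core computation behind such depletion calculations is the solution of the decay/transmutation balance dN/dt = AN over a cooling interval, for instance via a matrix exponential. Below is a toy sketch for a hypothetical three-nuclide chain; the rates are invented and are not CESAR's nuclide set or data.

```python
# Toy depletion/cooling step: N(t) = expm(A t) N(0) for a hypothetical chain
# 1 -> 2 -> 3 (stable). Decay constants are invented for illustration.
import numpy as np
from scipy.linalg import expm

lam1, lam2 = 0.05, 0.01                  # decay constants in 1/yr, hypothetical
A = np.array([[-lam1,  0.0,  0.0],
              [ lam1, -lam2, 0.0],       # nuclide 2 fed by decay of nuclide 1
              [ 0.0,   lam2, 0.0]])      # nuclide 3 accumulates
N0 = np.array([1.0, 0.0, 0.0])

for t in (0.25, 10.0, 1.0e6):            # cooling times from months to 1 Myr
    print(f"t = {t:>9} yr:", np.round(expm(A * t) @ N0, 6))
```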
DNA rearrangements directed by non-coding RNAs in ciliates
Mochizuki, Kazufumi
2013-01-01
Extensive programmed rearrangement of DNA, including DNA elimination, chromosome fragmentation, and DNA descrambling, takes place in the newly developed macronucleus during the sexual reproduction of ciliated protozoa. Recent studies have revealed that two distant classes of ciliates use distinct types of non-coding RNAs to regulate such DNA rearrangement events. DNA elimination in Tetrahymena is regulated by small non-coding RNAs that are produced and utilized in an RNAi-related process. It has been proposed that the small RNAs produced from the micronuclear genome are used to identify eliminated DNA sequences by whole-genome comparison between the parental macronucleus and the micronucleus. In contrast, DNA descrambling in Oxytricha is guided by long non-coding RNAs that are produced from the parental macronuclear genome. These long RNAs are proposed to act as templates for the direct descrambling events that occur in the developing macronucleus. Both cases provide useful examples to study epigenetic chromatin regulation by non-coding RNAs. PMID:21956937
NASA Astrophysics Data System (ADS)
Clark, Stephen; Winske, Dan; Schaeffer, Derek; Everson, Erik; Bondarenko, Anton; Constantin, Carmen; Niemann, Christoph
2014-10-01
We present 3D hybrid simulations of laser-produced expanding debris clouds propagating through a magnetized ambient plasma in the context of magnetized collisionless shocks. New results from the 3D code are compared to previously obtained simulation results using a 2D hybrid code. The 3D code is an extension of a 2D code previously developed at Los Alamos National Laboratory. It has been parallelized and ported to execute in a cluster environment. The new simulations are used to verify scaling relationships, such as shock onset time and coupling parameter (Rm/ρd), developed via 2D simulations. Previous 2D results focus primarily on laboratory shock formation relevant to experiments being performed on the Large Plasma Device, where the shock propagates across the magnetic field. The new 3D simulations show wave structure and dynamics oblique to the magnetic field that introduce new physics to be considered in future experiments.
Verification testing of the compression performance of the HEVC screen content coding extensions
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng
2017-09-01
This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of the HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provide consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60-90% relative to the JM and 40-80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
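The Bjøntegaard-delta bit-rate calculation used for the objective comparison can be sketched as follows: fit log-rate as a cubic polynomial in quality (PSNR) for each codec, integrate the gap over the shared quality range, and convert back to a percentage. The rate/PSNR points below are invented for illustration.

```python
# Sketch of the BD-rate metric: average log-rate difference at equal PSNR.
# The four rate/PSNR operating points per codec are made-up example data.
import numpy as np

def bd_rate(rates_a, psnr_a, rates_b, psnr_b):
    pa = np.polyfit(psnr_a, np.log10(rates_a), 3)   # anchor: log-rate vs PSNR
    pb = np.polyfit(psnr_b, np.log10(rates_b), 3)   # test:   log-rate vs PSNR
    lo = max(min(psnr_a), min(psnr_b))              # overlapping PSNR range
    hi = min(max(psnr_a), max(psnr_b))
    ia = np.polyval(np.polyint(pa), hi) - np.polyval(np.polyint(pa), lo)
    ib = np.polyval(np.polyint(pb), hi) - np.polyval(np.polyint(pb), lo)
    return (10 ** ((ib - ia) / (hi - lo)) - 1) * 100  # % rate change, equal PSNR

anchor = ([1000, 1800, 3200, 6000], [33.0, 35.5, 38.0, 40.5])
test = ([400, 700, 1300, 2500], [33.2, 35.6, 38.1, 40.4])
print(f"BD-rate: {bd_rate(*anchor, *test):+.1f}%")  # negative = bit-rate savings
```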
MKHITAR GOSH'S MEDIEVAL LAW CODE AND ITS IMPLICATIONS FOR ARMENIAN COMMUNITIES ABROAD.
Davtyan, Susanna; Khachatryan, Mikayel; Johrian, Ara; Ghazaryan, Karen
2014-07-01
The Law Code is an exceptional work of medieval Armenian legal and economic thought that encompasses valuable information about the Armenian nation's domestic life. Mkhitar Gosh is considered one of the most outstanding figures and lawmakers of all time. The Armenian Law Code of Mkhitar Gosh was written in the 12th century. One of the primary sources for the law code was Armenian customary law. Because of its high moral spirit, reflecting the Armenian mentality, this Code became a guiding moral code for Armenians all over the world. This article presents a brief history of the extension of the legal rules set out in the Law Code. The Law Code was established and widely used not only in Armenia but also in a number of Armenian communities abroad (Russia, Poland, Georgia, Latvia, India, etc.). The Law Code was accepted by all Armenians. Moreover, it served as a basis for the development of legislation in a number of civilized European and Asian countries.
Analysis of view synthesis prediction architectures in modern coding standards
NASA Astrophysics Data System (ADS)
Tian, Dong; Zou, Feng; Lee, Chris; Vetro, Anthony; Sun, Huifang
2013-09-01
Depth-based 3D formats are currently being developed as extensions to both the AVC and HEVC standards. The availability of depth information facilitates the generation of intermediate views for advanced 3D applications and displays, and also enables more efficient coding of the multiview input data through view synthesis prediction techniques. This paper outlines several approaches that have been explored to realize view synthesis prediction in modern video coding standards such as AVC and HEVC. The benefits and drawbacks of various architectures are analyzed in terms of performance, complexity, and other design considerations. It is hence concluded that block-based view synthesis prediction for multiview video signals provides attractive coding gains with complexity comparable to that of traditional motion/disparity compensation.
Practical moral codes in the transgenic organism debate.
Cooley, D R; Goreham, Gary; Youngs, George A
2004-01-01
In one study funded by the United States Department of Agriculture, people from North Dakota were interviewed to discover which moral principles they use in evaluating the morality of transgenic organisms and their introduction into markets. It was found that although the moral codes the human subjects employed were very similar, their views on transgenics were vastly different. In this paper, the codes that were used by the respondents are developed, compared to that of the academically composed Belmont Report, and then modified to create the more practical Common Moral Code. At the end, it is shown that the Common Moral Code has inherent inconsistency flaws that might be resolvable, but would require extensive work on the definition of terms and principles. However, the effort is worthwhile, especially if it results in a common moral code that all those involved in the debate are willing to use in negotiating a resolution to their differences.
NASA Astrophysics Data System (ADS)
Bertelli, N.; Valeo, E. J.; Phillips, C. K.
2015-11-01
A non-Maxwellian extension of the full-wave TORIC v.5 code in the mid/high harmonic and minority heating regimes has been revisited. In both regimes the treatment of non-Maxwellian ions is needed in order to improve the analysis of combined fast wave (FW) and neutral beam injection (NBI) heated discharges in current fusion devices. Additionally, this extension is also needed in time-dependent analyses, where combined heating experiments are generally considered. Initial numerical cases with thermal ions and with non-Maxwellian ions are presented for both regimes. The simulations are then compared with results from the AORSA code, which has already been extended to include non-Maxwellian ions. First attempts to apply this extension in a self-consistent way with the NUBEAM module, which is included in the TRANSP code, are also discussed. Work supported by US DOE Contracts # DE-FC02-01ER54648 and DE-AC02-09CH11466.
Additional application of the NASCAP code. Volume 1: NASCAP extension
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J.; Mandell, M. J.; Parks, D. E.; Schnuelle, G. W.; Stannard, P. R.; Steen, P. G.
1981-01-01
The NASCAP computer program comprehensively analyzes problems of spacecraft charging. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Several changes were made to NASCAP, and a new code, NASCAP/LEO, was developed. In addition, detailed studies of several spacecraft-environmental interactions and of the SCATHA spacecraft were performed. The NASCAP/LEO program handles situations of relatively short Debye length encountered by large space structures or by any satellite in low earth orbit (LEO).
Recent Improvements in the FDNS CFD Code and its Associated Process
NASA Technical Reports Server (NTRS)
West, Jeff S.; Dorney, Suzanne M.; Turner, Jim (Technical Monitor)
2002-01-01
This viewgraph presentation gives an overview of recent improvements in the Finite Difference Navier Stokes (FDNS) computational fluid dynamics (CFD) code and its associated process. The development of a utility, PreViewer, has essentially eliminated the creeping of simple human error into the FDNS solution process. Extension of PreViewer to encapsulate the domain decomposition process has made practical the routine use of parallel processing. The combination of CVS source control and ATS consistency validation significantly increases the efficiency of the CFD process.
Space Radiation Transport Code Development: 3DHZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension, allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z < 2), for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. Calculations of neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum show that the neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency. A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and PHITS are provided for a variety of boundary conditions and geometries. Improvements provided by the 3D corrections are made clear in the comparisons. Developments needed to connect 3DHZETRN to vehicle design and optimization studies will be discussed. Future theoretical development will relax the forward plus isotropic interaction assumption to more general angular dependence.
Deep Hashing for Scalable Image Search.
Lu, Jiwen; Liong, Venice Erin; Zhou, Jie
2017-05-01
In this paper, we propose a new deep hashing (DH) approach to learn compact binary codes for scalable image search. Unlike most existing binary code learning methods, which usually seek a single linear projection to map each sample into a binary feature vector, we develop a deep neural network to seek multiple hierarchical non-linear transformations to learn these binary codes, so that the non-linear relationship of samples can be well exploited. Our model is learned under three constraints at the top layer of the developed deep network: 1) the loss between the compact real-valued code and the learned binary vector is minimized, 2) the binary codes distribute evenly on each bit, and 3) different bits are as independent as possible. To further improve the discriminative power of the learned binary codes, we extend DH into supervised DH (SDH) and multi-label SDH by including a discriminative term in the objective function of DH, which simultaneously maximizes the inter-class variations and minimizes the intra-class variations of the learned binary codes under the single-label and multi-label settings, respectively. Extensive experimental results on eight widely used image search data sets show that our proposed methods achieve very competitive results with the state of the art.
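The three top-layer constraints can be written as simple penalty terms on a batch of real-valued codes, as in the numpy sketch below; the toy batch, code length, and exact penalty forms are illustrative assumptions rather than the paper's training objective.

```python
# Sketch of the three DH constraints as penalties on real-valued codes H (n x bits).
import numpy as np

def dh_losses(H):
    B = np.sign(H)                                      # binarized codes
    quant = np.mean((H - B) ** 2)                       # 1) quantization loss
    balance = np.mean(np.mean(H, axis=0) ** 2)          # 2) each bit ~50/50 balanced
    I = np.eye(H.shape[1])
    decorr = np.linalg.norm(H.T @ H / len(H) - I) ** 2  # 3) bits near-independent
    return quant, balance, decorr

H = np.tanh(np.random.randn(128, 48))                   # stand-in for network outputs
print(["%.3f" % v for v in dh_losses(H)])
```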
Long non-coding RNAs and mRNAs profiling during spleen development in pig.
Che, Tiandong; Li, Diyan; Jin, Long; Fu, Yuhua; Liu, Yingkai; Liu, Pengliang; Wang, Yixin; Tang, Qianzi; Ma, Jideng; Wang, Xun; Jiang, Anan; Li, Xuewei; Li, Mingzhou
2018-01-01
Genome-wide transcriptomic studies in humans and mice have become extensive and mature. However, a comprehensive and systematic understanding of protein-coding genes and long non-coding RNAs (lncRNAs) expressed during pig spleen development has not been achieved. LncRNAs are known to participate in regulatory networks for an array of biological processes. Here, we constructed 18 RNA libraries from developing fetal pig spleen (55 days before birth), postnatal pig spleens (0, 30, 180 days and 2 years after birth), and the samples from the 2-year-old Wild Boar. A total of 15,040 lncRNA transcripts were identified among these samples. We found that the temporal expression pattern of lncRNAs was more restricted than observed for protein-coding genes. Time-series analysis showed two large modules for protein-coding genes and lncRNAs. The up-regulated module was enriched for genes related to immune and inflammatory function, while the down-regulated module was enriched for cell proliferation processes such as cell division and DNA replication. Co-expression networks indicated the functional relatedness between protein-coding genes and lncRNAs, which were enriched for similar functions over the series of time points examined. We identified numerous differentially expressed protein-coding genes and lncRNAs in all five developmental stages. Notably, ceruloplasmin precursor (CP), a protein-coding gene participating in antioxidant and iron transport processes, was differentially expressed in all stages. This study provides the first catalog of the developing pig spleen, and contributes to a fuller understanding of the molecular mechanisms underpinning mammalian spleen development.
NASA Technical Reports Server (NTRS)
Coakley, T. J.; Hsieh, T.
1985-01-01
Numerical simulations of steady and unsteady transonic diffuser flows using two different computer codes are discussed and compared with experimental data. The codes solve the Reynolds-averaged, compressible, Navier-Stokes equations using various turbulence models. One of the codes has been applied extensively to diffuser flows and uses the hybrid method of MacCormack. This code is relatively inefficient numerically. The second code, which was developed more recently, is fully implicit and is relatively efficient numerically. Simulations of steady flows using the implicit code are shown to be in good agreement with simulations using the hybrid code. Both simulations are in good agreement with experimental results. Simulations of unsteady flows using the two codes are in good qualitative agreement with each other, although the quantitative agreement is not as good as in the steady flow cases. The implicit code is shown to be eight times faster than the hybrid code for unsteady flow calculations and up to 32 times faster for steady flow calculations. Results of calculations using alternative turbulence models are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, A. J.; Fanning, T. H.
The United States has extensive experience with the design, construction, and operation of sodium-cooled fast reactors (SFRs) over the last six decades. Despite the closure of various facilities, the U.S. continues to dedicate research and development (R&D) efforts to the design of innovative experimental, prototype, and commercial facilities. Accordingly, in support of this rich operating history and ongoing design efforts, the U.S. has been developing and maintaining a series of tools with capabilities that envelop all facets of SFR design and safety analyses. This paper provides an overview of the current U.S. SFR analysis toolset, including codes such as SAS4A/SASSYS-1, MC2-3, SE2-ANL, PERSENT, NUBOW-3D, and LIFE-METAL, as well as the higher-fidelity tools (e.g. PROTEUS) being integrated into the toolset. Current capabilities of the codes are described and key ongoing development efforts are highlighted for some codes.
An Object-Oriented Computer Code for Aircraft Engine Weight Estimation
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Naylor, Bret A.
2009-01-01
Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and for weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It would also facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case.
NASA Technical Reports Server (NTRS)
Beutner, Thomas John
1993-01-01
Porous wall wind tunnels have been used for several decades and have proven effective in reducing wall interference effects in both low speed and transonic testing. They allow for testing through Mach 1, reduce blockage effects, and reduce shock wave reflections in the test section. Their usefulness in developing computational fluid dynamics (CFD) codes has been limited, however, by the difficulties associated with modelling the effect of a porous wall in CFD codes. Previous approaches to modelling porous wall effects have depended either upon a simplified linear boundary condition, which has proven inadequate, or upon detailed measurements of the normal velocity near the wall, which require extensive wind tunnel time. The current work was initiated in an effort to find a simple, accurate method of modelling a porous wall boundary condition in CFD codes. The development of such a method would allow data from porous wall wind tunnels to be used more readily in validating CFD codes. This would be beneficial when transonic validations are desired, or when large models are used to achieve high Reynolds numbers in testing. A computational and experimental study was undertaken to investigate a new method of modelling solid and porous wall boundary conditions in CFD codes. The method utilized experimental measurements at the walls to develop a flow field solution based on the method of singularities. This flow field solution was then imposed as a pressure boundary condition in a CFD simulation of the internal flow field. The effectiveness of this method in describing the effect of porosity changes on the wall was investigated, as was its effectiveness when only sparse experimental measurements were available. The current work demonstrated this approach for low speed flows and compared the results with experimental data obtained from a heavily instrumented variable porosity test section. The approach developed was simple, computationally inexpensive, and did not require extensive or intrusive measurements of the boundary conditions during the wind tunnel test. It may be applied to both solid and porous wall wind tunnel tests.
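A minimal sketch of the singularity-based idea under stated assumptions: place a line of point sources outside the test section and least-squares fit their strengths so the induced wall-normal velocity matches sparse wall measurements; the fitted analytic field can then be evaluated as the boundary condition for the internal CFD solution. The geometry, source model, and "measurements" below are entirely hypothetical.

```python
# Hypothetical singularity fit: match sparse wall-normal velocity data with a
# row of 2D point sources, then reuse the fitted field as a CFD boundary input.
import numpy as np

xw = np.linspace(0.0, 4.0, 9)                  # sparse wall stations (wall at y = 1)
vn_meas = 0.05 * np.exp(-((xw - 2.0) ** 2))    # made-up porous-wall measurements

xs, ys = np.linspace(-1.0, 5.0, 12), 1.5       # source line outside the test section

def vn_unit_source(x, sx):
    """Wall-normal velocity at (x, 1) induced by a unit 2D source at (sx, ys)."""
    r2 = (x - sx) ** 2 + (1.0 - ys) ** 2
    return (1.0 - ys) / (2.0 * np.pi * r2)

A = np.array([[vn_unit_source(x, sx) for sx in xs] for x in xw])
strengths, *_ = np.linalg.lstsq(A, vn_meas, rcond=None)   # least-squares fit
print("fitted source strengths:", np.round(strengths, 3))
```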
NASA Technical Reports Server (NTRS)
Rost, Martin C.; Sayood, Khalid
1991-01-01
A method for efficiently coding natural images using a vector-quantized, variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of coders for any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
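A sketch in the spirit of MBC's threshold-driven selection, with a crude scalar quantizer standing in for the paper's vector-quantized DCT coders: try the coarse coder at the current block size, and recurse into quadrants wherever the distortion threshold is not met. Quantizer step, threshold, and block sizes are illustrative choices.

```python
# Threshold-driven block-size selection over DCT coders (illustrative only).
import numpy as np
from scipy.fft import dctn, idctn

def code_block(block, q=24.0):
    coeff = np.round(dctn(block, norm="ortho") / q) * q   # crude DCT quantizer
    return idctn(coeff, norm="ortho")

def mbc(block, thresh=40.0, min_size=4):
    rec = code_block(block)
    if np.mean((rec - block) ** 2) <= thresh or block.shape[0] <= min_size:
        return rec                                 # this coder meets the threshold
    h = block.shape[0] // 2                        # else split into quadrants
    out = np.empty_like(rec)
    for i in (0, h):
        for j in (0, h):
            out[i:i+h, j:j+h] = mbc(block[i:i+h, j:j+h], thresh, min_size)
    return out

img = np.random.rand(16, 16) * 255
print("MSE:", np.mean((mbc(img) - img) ** 2))
```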
QGene 4.0, an extensible Java QTL-analysis platform.
Joehanes, Roby; Nelson, James C
2008-12-01
Of the many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.
EVA - A Textual Data Processing Tool.
ERIC Educational Resources Information Center
Jakopin, Primoz
EVA, a text processing tool designed to be self-contained and useful for a variety of languages, is described briefly, and its extensive coded character set is illustrated. Features, specifications, and database functions are noted. Its application in development of a Slovenian literary dictionary is also described. (MSE)
NASA Technical Reports Server (NTRS)
Bittker, David A.
1996-01-01
A generalized version of the NASA Lewis general kinetics code, LSENS, is described. The new code allows the use of global reactions as well as molecular processes in a chemical mechanism. The code also incorporates the capability of performing sensitivity analysis calculations for a perfectly stirred reactor rapidly and conveniently at the same time that the main kinetics calculations are being done. The GLSENS code has been extensively tested and has been found to be accurate and efficient. Nine example problems are presented and complete user instructions are given for the new capabilities. This report is to be used in conjunction with the documentation for the original LSENS code.
Internet Protocol Over Telemetry Testing for Earth Science Capability Demo Summary
NASA Technical Reports Server (NTRS)
Franz, Russ; Pestana, Mark; Bessent, Shedrick; Hang, Richard; Ng, Howard
2006-01-01
The development and flight tests described here focused on utilizing existing pulse code modulation (PCM) telemetry equipment to enable on-vehicle networks of instruments and computers to be a simple extension of the ground station network. This capability is envisioned as a necessary component of a global range that supports test and development of manned and unmanned airborne vehicles.
Extensions and improvements on XTRAN3S
NASA Technical Reports Server (NTRS)
Borland, C. J.
1989-01-01
Improvements to the XTRAN3S computer program are summarized. Work on this code, for steady and unsteady aerodynamic and aeroelastic analysis in the transonic flow regime has concentrated on the following areas: (1) Maintenance of the XTRAN3S code, including correction of errors, enhancement of operational capability, and installation on the Cray X-MP system; (2) Extension of the vectorization concepts in XTRAN3S to include additional areas of the code for improved execution speed; (3) Modification of the XTRAN3S algorithm for improved numerical stability for swept, tapered wing cases and improved computational efficiency; and (4) Extension of the wing-only version of XTRAN3S to include pylon and nacelle or external store capability.
Numerical Propulsion System Simulation Architecture
NASA Technical Reports Server (NTRS)
Naiman, Cynthia G.
2004-01-01
The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.
Development of 3D electromagnetic modeling tools for airborne vehicles
NASA Technical Reports Server (NTRS)
Volakis, John L.
1992-01-01
The main goal of this project is to develop methodologies for scattering by airborne composite vehicles. Although our primary focus continues to be the development of a general purpose code for analyzing the entire structure as a single unit, a number of other tasks are also pursued in parallel with this effort. These tasks are important in testing the overall approach and in developing suitable models for materials coatings, junctions and, more generally, in assessing the effectiveness of the various parts comprising the final code. Here, we briefly discuss our progress on the five different tasks which were pursued during this period. Our progress on each of these tasks is described in the detailed reports (listed at the end of this report) and the memoranda included. The first task described below is, of course, the core of this project and deals with the development of the overall code. Undoubtedly, it is the outcome of the research which was funded by NASA-Ames and the Navy over the past three years. During this year we developed the first finite element code for scattering by structures of arbitrary shape and composition. The code employs a new absorbing boundary condition which allows termination of the finite element mesh only 0.3 lambda from the outer surface of the target. This leads to a remarkable reduction of the mesh size and is a unique feature of the code. Other unique features of this code include capabilities to model resistive sheets, impedance sheets and anisotropic materials. This last capability is the latest feature of the code and is still under development. The code has been extensively validated for a number of composite geometries and some examples are given. The validation of the code is still in progress for anisotropic and larger non-metallic geometries and cavities. The developed finite element code is based on a Galerkin's formulation and employs edge-based tetrahedral elements for discretizing the dielectric sections and the region between the target and the outer mesh termination boundary (ATB). This boundary is placed in conformity with the target's outer surface, thus resulting in additional reduction of the unknown count.
Full-f version of GENE for turbulence in open-field-line systems
NASA Astrophysics Data System (ADS)
Pan, Q.; Told, D.; Shi, E. L.; Hammett, G. W.; Jenko, F.
2018-06-01
Unique properties of plasmas in the tokamak edge, such as large amplitude fluctuations and plasma-wall interactions in the open-field-line regions, require major modifications of existing gyrokinetic codes originally designed for simulating core turbulence. To this end, the global version of the 3D2V gyrokinetic code GENE, so far employing a δf-splitting technique, is extended to simulate electrostatic turbulence in straight open-field-line systems. The major extensions are the inclusion of the velocity-space nonlinearity, the development of a conducting-sheath boundary, and the implementation of the Lenard-Bernstein collision operator. With these developments, the code can be run as a full-f code and can handle particle loss to and reflection from the wall. The extended code is applied to modeling turbulence in the Large Plasma Device (LAPD), with a reduced mass ratio and a much lower collisionality. Similar to turbulence in a tokamak scrape-off layer, LAPD turbulence involves collisions, parallel streaming, cross-field turbulent transport with steep profiles, and particle loss at the parallel boundary.
5D Tempest simulations of kinetic edge turbulence
NASA Astrophysics Data System (ADS)
Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M. R.; Hittinger, J. A.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.; Umansky, M. V.; Qin, H.
2006-10-01
Results are presented from the development and application of TEMPEST, a nonlinear five-dimensional (3D2V) gyrokinetic continuum code. The simulation results and theoretical analysis include studies of H-mode edge plasma neoclassical transport and turbulence in real divertor geometry and its relationship to plasma flow generation with zero external momentum input, including the important orbit-squeezing effect due to the large electric-field flow shear in the edge. In order to extend the code to 5D, we have formulated a set of fully nonlinear electrostatic gyrokinetic equations and a fully nonlinear gyrokinetic Poisson equation which is valid for both neoclassical and turbulence simulations. Our 5D gyrokinetic code is built on the 4D version of the TEMPEST neoclassical code, extended to a fifth dimension in the binormal direction. The code is able to simulate either a full torus or a toroidal segment. Progress on performing 5D turbulence simulations will be reported.
Overview of the relevant CFD work at Thiokol Corporation
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Loh, Hai-Tien
1992-01-01
An in-house-developed proprietary advanced computational fluid dynamics code called SHARP (Trademark) is a primary tool for many flow simulations and design analyses. The SHARP code is a time-dependent, two-dimensional (2-D) axisymmetric numerical solution technique for the compressible Navier-Stokes equations. The solution technique in SHARP uses a vectorizable, implicit, second-order-accurate (in time and space) finite volume scheme based on an upwind flux-difference splitting of a Roe-type approximate Riemann solver, Van Leer's flux vector splitting, and a fourth-order artificial dissipation scheme with preconditioning to accelerate the flow solution. Turbulence is simulated by an algebraic model, and ultimately by the kappa-epsilon model. Other capabilities of the code include 2-D two-phase Lagrangian particle tracking and cell blockages. Extensive development and testing have been conducted on the 3-D version of the code with flow, combustion, and turbulence interactions. The emphasis here is on the specific applications of SHARP in solid rocket motor design. Information is given in viewgraph form.
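Van Leer's flux vector splitting, one ingredient of the SHARP scheme described above, splits the 1D Euler flux into forward- and backward-moving parts that vary smoothly through Mach 1. Since SHARP is proprietary, the sketch below follows the textbook form of the splitting rather than SHARP's implementation:

```python
import numpy as np

GAMMA = 1.4  # ratio of specific heats

def van_leer_split(rho, u, p):
    """Return (F_plus, F_minus) with F_plus + F_minus equal to the
    1D Euler flux [rho*u, rho*u**2 + p, u*(E + p)]."""
    a = np.sqrt(GAMMA * p / rho)                  # speed of sound
    M = u / a                                     # Mach number
    E = p / (GAMMA - 1.0) + 0.5 * rho * u**2      # total energy
    F = np.array([rho * u, rho * u**2 + p, u * (E + p)])
    if M >= 1.0:                                  # fully supersonic, right-moving
        return F, np.zeros(3)
    if M <= -1.0:                                 # fully supersonic, left-moving
        return np.zeros(3), F
    def sub(fm, sign):                            # subsonic split fluxes
        u_term = ((GAMMA - 1.0) * u + sign * 2.0 * a) / GAMMA
        en = u_term**2 * GAMMA**2 / (2.0 * (GAMMA**2 - 1.0))
        return np.array([fm, fm * u_term, fm * en])
    f_plus = 0.25 * rho * a * (M + 1.0)**2
    f_minus = -0.25 * rho * a * (M - 1.0)**2
    return sub(f_plus, +1.0), sub(f_minus, -1.0)
```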
Study of fusion product effects in field-reversed mirrors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driemeyer, D.E.
1980-01-01
The effect of fusion products (fps) on Field-Reversed Mirror (FRM) reactor concepts has been evaluated through the development of two new computer models. The first code (MCFRM) treats fps as test particles in a fixed background plasma, which is represented as a fluid. MCFRM includes a Monte Carlo treatment of Coulomb scattering and thus provides an accurate treatment of fp behavior even at lower energies where pitch-angle scattering becomes important. The second code (FRMOD) is a steady-state, globally averaged, two-fluid (ion and electron), point model of the FRM plasma that incorporates fp heating and ash buildup values which are consistent with the MCFRM calculations. These codes have been used extensively in the development of an advanced-fuel FRM reactor design (SAFFIRE). A Catalyzed-D version of the plant is also discussed along with an investigation of the steady-state energy distribution of fps in the FRM. User guides for the two computer codes are also included.
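A Monte Carlo treatment of pitch-angle scattering, of the kind MCFRM applies to fusion products, typically perturbs the pitch-angle cosine by a small random deflection each time step. A hedged sketch of one such step follows; the deflection-frequency argument and the update form are generic textbook choices, not MCFRM's actual numerics:

```python
import numpy as np

def pitch_angle_scatter(mu, nu_d, dt, rng=None):
    """Advance the pitch-angle cosine mu of a test particle by one
    small-angle Coulomb deflection: a random polar kick chi with
    <chi**2> = 2*nu_d*dt applied at a uniform azimuthal phase."""
    rng = rng or np.random.default_rng()
    chi = np.sqrt(2.0 * nu_d * dt) * rng.standard_normal()
    phi = 2.0 * np.pi * rng.uniform()
    sin_theta = np.sqrt(max(0.0, 1.0 - mu**2))
    mu_new = mu * np.cos(chi) + sin_theta * np.sin(chi) * np.cos(phi)
    return float(np.clip(mu_new, -1.0, 1.0))
```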
Sonea, Laura; Buse, Mihail; Gulei, Diana; Onaciu, Anca; Simon, Ioan; Braicu, Cornelia; Berindan-Neagoe, Ioana
2018-05-01
Lung cancer remains the leading cause of cancer-related mortality worldwide and needs to be further investigated to improve these dramatic, unfavorable statistics. Non-coding RNAs (ncRNAs) have been shown to be important cellular regulatory factors, and alterations in their expression levels have been correlated with an extensive number of pathologies. Specifically, their expression profiles are correlated with the development and progression of lung cancer, generating great interest for further investigation. This review focuses on the complex role of non-coding RNAs, namely miRNAs, piwi-interacting RNAs, small nucleolar RNAs, long non-coding RNAs and circular RNAs, in the process of developing novel biomarkers for diagnostic and prognostic factors that can then be utilized for personalized therapies toward this devastating disease. To support the concept of personalized medicine, we focus on the roles of miRNAs in lung cancer tumorigenesis, their use as diagnostic and prognostic biomarkers, and their application for patient therapy.
Development of an hp-version finite element method for computational optimal control
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Warner, Michael S.
1993-01-01
The purpose of this research effort is to develop a means to use, and ultimately implement, hp-version finite elements in the numerical solution of optimal control problems. The hybrid MACSYMA/FORTRAN code GENCODE was developed, which utilized h-version finite elements to successfully approximate solutions to a wide class of optimal control problems. In that code, solutions were improved by refining the time-discretization mesh. With the extension to hp-version finite elements, the degrees of freedom include both nodal values and extra interior values associated with the unknown states, co-states, and controls, the number of which depends on the order of the shape functions in each element.
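The interior degrees of freedom of an hp element are often built from integrated Legendre (Lobatto) polynomials, which vanish at both element ends and therefore enrich the interior without disturbing nodal continuity. A sketch of such a hierarchical basis on the reference interval [-1, 1] follows, assuming this standard construction; the report's actual shape functions may differ:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

def hp_shape_functions(xi, p):
    """Two linear nodal modes plus p-1 interior 'bubble' modes,
    phi_k = sqrt((2k-1)/2) * (P_k - P_{k-2}) / (2k-1), which are the
    integrals of Legendre polynomials and vanish at xi = -1, +1."""
    N = [0.5 * (1.0 - xi), 0.5 * (1.0 + xi)]   # h-version vertex modes
    for k in range(2, p + 1):
        Pk = Legendre.basis(k)(xi)
        Pkm2 = Legendre.basis(k - 2)(xi)
        ck = np.sqrt((2.0 * k - 1.0) / 2.0)
        N.append(ck * (Pk - Pkm2) / (2.0 * k - 1.0))
    return np.array(N)
```

Raising p then simply appends higher modes within an element, while h-refinement splits elements, which is exactly the two-knob refinement the hp approach trades on.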
CFD propels NASP propulsion progress
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.; Dwoyer, Douglas L.; Green, Michael J.
1990-01-01
The most complex aerothermodynamics encountered in the National Aerospace Plane (NASP) propulsion system are associated with the fuel-mixing and combustion-reaction flows of its combustor section; adequate CFD tools must be developed to model shock-wave systems, turbulent hydrogen/air mixing, flow separation, and combustion. Improvements to existing CFD codes have involved extension from two dimensions to three, as well as the addition of finite-rate hydrogen-air chemistry. A novel CFD code for the treatment of reacting flows throughout the NASP, designated GASP, uses the most advanced upwind-differencing technology.
P.L. 110-173: Provisions in the Medicare, Medicaid, and SCHIP Extension Act of 2007
2008-02-07
option available to the Secretary. Section 113. Payment rate for certain diagnostic laboratory tests. Glycosylated hemoglobin (HbA1c) is used to...monitor how well blood glucose levels are controlled in diabetes patients. The current Medicare payment rate for HbA1c is tied to two HCPCS codes: 83036...and 83037. HCPCS code 83037 was developed in 2006 to cover the testing for HbA1c by a device approved by the Food and Drug Administration (FDA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Binotti, M.; Zhu, G.; Gray, A.
An analytical approach, extending the newly developed First-principle OPTical Intercept Calculation (FirstOPTIC) method, is proposed to treat the geometrical impact of three-dimensional (3-D) effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the suite of FirstOPTIC code. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical approach to treating 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.
The InSAR Scientific Computing Environment
NASA Technical Reports Server (NTRS)
Rosen, Paul A.; Gurrola, Eric; Sacco, Gian Franco; Zebker, Howard
2012-01-01
We have developed a flexible and extensible Interferometric SAR (InSAR) Scientific Computing Environment (ISCE) for geodetic image processing. ISCE was designed from the ground up as a geophysics community tool for generating stacks of interferograms that lend themselves to various forms of time-series analysis, with attention paid to accuracy, extensibility, and modularity. The framework is Python-based, with code elements rigorously componentized by separating input/output operations from the processing engines. This allows greater flexibility and extensibility in the data models, and creates algorithmic code that is less susceptible to unnecessary modification when new data types and sensors are available. In addition, the components support provenance and checkpointing to facilitate reprocessing and algorithm exploration. The algorithms, based on legacy processing codes, have been adapted to assume a common reference track approach for all images acquired from nearby orbits, simplifying and systematizing the geometry for time-series analysis. The framework is designed to easily allow user contributions, and is distributed for free use by researchers. ISCE can process data from the ALOS, ERS, EnviSAT, Cosmo-SkyMed, RadarSAT-1, RadarSAT-2, and TerraSAR-X platforms, starting from Level-0 or Level-1 as provided from the data source, and going as far as Level-3 geocoded deformation products. With its flexible design, it can be extended with raw/metadata parsers to enable it to work with radar data from other platforms.
ISBT 128 Standard for Coding Medical Products of Human Origin
Ashford, Paul; Delgado, Matthew
2017-01-01
Background: ISBT 128 is an international standard for the terminology, coding, labeling, and identification of medical products of human origin (MPHO). Full implementation of ISBT 128 improves traceability, transparency, vigilance and surveillance, and interoperability. Methods: ICCBBA maintains the ISBT 128 standard through the activities of a network of expert volunteers, including representatives from professional scientific societies, governments, and users, to standardize and maintain MPHO identification. These individuals are organized into Technical Advisory Groups and work within a structured framework as part of a quality-controlled standards development process. Results: The extensive involvement of international scientific and professional societies in the development of the standard has ensured that ISBT 128 has gained widespread recognition. The user community has developed confidence in the ability of the standard to adapt to new developments in their fields of interest. The standard is fully compatible with Single European Code requirements for tissues and cells and is utilized by many European tissue establishments. ISBT 128's flexibility and robustness have allowed for expansions into subject areas such as cellular therapy, regenerative medicine, and tissue banking. Conclusion: ISBT 128 is the internationally recognized standard for coding MPHO and has gained widespread use globally throughout the past two decades. PMID:29344013
An Object-oriented Computer Code for Aircraft Engine Weight Estimation
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Naylor, Bret A.
2008-01-01
Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept among several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and for weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It would also facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case. Keywords: NASA, aircraft engine, weight, object-oriented
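The object-oriented structure described above, in which each component sizes itself from cycle data and the engine aggregates component weights, can be illustrated with a toy class hierarchy. All class names and the scaling law here are invented for illustration and are not WATE++'s actual API:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Component:
    """Base class: a component reports a weight from its sizing data."""
    name: str
    def weight_kg(self) -> float:
        raise NotImplementedError

@dataclass
class Fan(Component):
    diameter_m: float
    def weight_kg(self) -> float:
        return 135.0 * self.diameter_m**2   # notional scaling law

@dataclass
class Engine:
    components: List[Component] = field(default_factory=list)
    def total_weight_kg(self) -> float:
        return sum(c.weight_kg() for c in self.components)
```

New component types are added by subclassing rather than by editing a monolithic routine, which is the maintainability argument the paper makes for the conversion.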
Additional extensions to the NASCAP computer code, volume 3
NASA Technical Reports Server (NTRS)
Mandell, M. J.; Cooke, D. L.
1981-01-01
The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
78 FR 5244 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-24
...: REG-106511-00. Abstract: Section 6075(a) of the Internal Revenue Code (the Code) requires the executor...; however, except in the case of executors who are abroad, no such extension may be for more than 6 months. Executors currently request an extension of time to file Form 706 by filing Form 4768, ``Application for...
Nassi-Schneiderman Diagram in HTML Based on AML
ERIC Educational Resources Information Center
Menyhárt, László
2013-01-01
In an earlier work I defined an extension of XML called Algorithm Markup Language (AML) for easy and understandable coding in an IDE which supports XML editing (e.g. NetBeans). The AML extension contains annotations and native language (English or Hungarian) tag names used when coding our algorithm. This paper presents a drawing tool with which…
Extension of the SIESTA MHD equilibrium code to free-plasma-boundary problems
Peraza-Rodriguez, Hugo; Reynolds-Barredo, J. M.; Sanchez, Raul; ...
2017-08-28
Here, SIESTA is a recently developed MHD equilibrium code designed to perform fast and accurate calculations of ideal MHD equilibria for three-dimensional magnetic configurations. Since SIESTA does not assume closed magnetic surfaces, the solution can exhibit magnetic islands and stochastic regions. In its original implementation SIESTA addressed only fixed-boundary problems. That is, the shape of the plasma edge, assumed to be a magnetic surface, was kept fixed as the solution iteratively converges to equilibrium. This condition somewhat restricts the possible applications of SIESTA. In this paper we discuss an extension that will enable SIESTA to address free-plasma-boundary problems, opening up the possibility of investigating problems in which the plasma boundary is perturbed either externally or internally. As an illustration, SIESTA is applied to a configuration of the W7-X stellarator.
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Charley; Kamboj, Sunita; Wang, Cheng
2015-09-01
This handbook is an update of the 1993 version of the Data Collection Handbook and the Radionuclide Transfer Factors Report to support modeling the impact of radioactive material in soil. Many new parameters have been added to the RESRAD Family of Codes, and new measurement methodologies are available. A detailed review of available parameter databases was conducted in preparation of this new handbook. This handbook is a companion document to the user manuals when using the RESRAD (onsite) and RESRAD-OFFSITE codes. It can also be used for the RESRAD-BUILD code because some of the building-related parameters are included in this handbook. RESRAD (onsite) has been developed for implementing U.S. Department of Energy Residual Radioactive Material Guidelines. Hydrogeological, meteorological, geochemical, geometrical (size, area, depth), crops and livestock, human intake, source characteristic, and building characteristic parameters are used in the RESRAD (onsite) code. The RESRAD-OFFSITE code is an extension of the RESRAD (onsite) code and can also model the transport of radionuclides to locations outside the footprint of the primary contamination. This handbook discusses parameter definitions, typical ranges, variations, and measurement methodologies. It also provides references for sources of additional information. Although this handbook was developed primarily to support the application of the RESRAD Family of Codes, the discussions and values are valid for use with other pathway analysis models and codes.
CLIPS: A tool for the development and delivery of expert systems
NASA Technical Reports Server (NTRS)
Riley, Gary
1991-01-01
The C Language Integrated Production System (CLIPS) is a forward-chaining rule-based language developed by the Software Technology Branch at the Johnson Space Center. CLIPS provides a complete environment for the construction of rule-based expert systems. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. Other key features of CLIPS include a powerful rule syntax, an interactive development environment, high performance, extensibility, a verification/validation tool, extensive documentation, and source code availability. The current release of CLIPS, version 4.3, is being used by over 2,500 users throughout the public and private community, including: all NASA sites and branches of the military, numerous Federal bureaus, government contractors, 140 universities, and many companies.
Solving free-plasma-boundary problems with the SIESTA MHD code
NASA Astrophysics Data System (ADS)
Sanchez, R.; Peraza-Rodriguez, H.; Reynolds-Barredo, J. M.; Tribaldos, V.; Geiger, J.; Hirshman, S. P.; Cianciosa, M.
2017-10-01
SIESTA is a recently developed MHD equilibrium code designed to perform fast and accurate calculations of ideal MHD equilibria for 3D magnetic configurations. It is an iterative code that uses the solution obtained by the VMEC code to provide a background coordinate system and an initial guess of the solution. The final solution that SIESTA finds can exhibit magnetic islands and stochastic regions. In its original implementation, SIESTA addressed only fixed-boundary problems. This fixed boundary condition somewhat restricts its possible applications. In this contribution we describe a recent extension of SIESTA that enables it to address free-plasma-boundary situations, opening up the possibility of investigating problems with SIESTA in which the plasma boundary is perturbed either externally or internally. As an illustration, the extended version of SIESTA is applied to a configuration of the W7-X stellarator.
NASA Technical Reports Server (NTRS)
Sanchez, Jose Enrique; Auge, Estanislau; Santalo, Josep; Blanes, Ian; Serra-Sagrista, Joan; Kiely, Aaron
2011-01-01
A new standard for image coding is being developed by the MHDC working group of the CCSDS, targeting onboard compression of multi- and hyperspectral imagery captured by aircraft and satellites. The proposed standard is based on the 'Fast Lossless' adaptive linear predictive compressor, and is adapted to better address the constraints of onboard scenarios. In this paper, we present a review of the state of the art in this field, and provide an experimental comparison of the coding performance of the emerging standard in relation to other state-of-the-art coding techniques. Our own independent implementation of the MHDC Recommended Standard, as well as of some of the other techniques, has been used to provide extensive results over the vast corpus of test images from the CCSDS-MHDC.
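The 'Fast Lossless' compressor at the heart of the proposed standard is built around adaptive linear prediction: samples are predicted from causal neighbors with weights updated on the fly, and only the residuals are entropy coded. A toy 1D version with a single adaptive weight is sketched below; the actual CCSDS predictor operates in three dimensions across spectral bands, so this shows only the flavor of the idea:

```python
import numpy as np

def adaptive_predict_residuals(x, mu=1e-3):
    """Predict each sample from its predecessor with a single weight
    adapted by a sign-LMS rule, returning the integer residuals that a
    lossless entropy coder would encode. A decoder can mirror the
    same updates from decoded samples, so the scheme is lossless."""
    w = 1.0
    residuals = np.zeros(len(x), dtype=np.int64)
    for n in range(1, len(x)):
        pred = int(round(w * x[n - 1]))
        e = int(x[n]) - pred
        residuals[n] = e
        w += mu * np.sign(e) * x[n - 1]   # sign-LMS weight update
    return residuals
```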
Identification of coding and non-coding mutational hotspots in cancer genomes.
Piraino, Scott W; Furney, Simon J
2017-01-05
The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions), and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from likely passenger regions susceptible to somatic mutation.
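The recurrence component of such a hotspot search can be phrased as an over-representation test: given a background mutation rate, how improbable is the observed mutation count in a region? A sketch under a uniform-rate Poisson assumption follows; the published method also incorporates evolutionary conservation, which is omitted here:

```python
from scipy.stats import poisson

def hotspot_pvalue(observed, region_len_bp, n_genomes, bg_rate_per_bp):
    """P-value for seeing at least `observed` mutations in a region,
    assuming mutations land uniformly at bg_rate_per_bp per genome.
    Illustrative only; real background rates vary along the genome."""
    expected = bg_rate_per_bp * region_len_bp * n_genomes
    return poisson.sf(observed - 1, expected)   # P(X >= observed)
```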
The ePLAS code for Ignition Studies
NASA Astrophysics Data System (ADS)
Faehl, R. J.; Mason, R. J.; Kirkpatrick, R. C.
2012-10-01
The ePLAS code is a multi-fluid/PIC hybrid that develops self-consistent E- and B-fields by the Implicit Moment Method, enabling stable calculations of high-density plasma problems with voids on the electron Courant time scale. See: http://www.researchapplicationscorp.com. Here, we outline typical applications to: (1) short-pulse-driven electron transport along void (or high-Z) insulated wires, and (2) the 2D development of shock ignition pressure peaks with B-fields. We outline the code's recent inclusion of SESAME EOS data, a DT/DD burn capability, and a new option for K-alpha imaging of modeling output, and demonstrate a foil expansion tracked with either fluid or particle ions. Also, we describe a new super-hybrid extension of our implicit solver that permits full target dynamics studies on the ion Courant scale. Finally, we touch on the very recent application of ePLAS to possible non-local/kinetic hydro effects in NIF capsules.
Numerical, analytical, experimental study of fluid dynamic forces in seals
NASA Technical Reports Server (NTRS)
Shapiro, William; Artiles, Antonio; Aggarwal, Bharat; Walowit, Jed; Athavale, Mahesh M.; Preskwas, Andrzej J.
1992-01-01
NASA/Lewis Research Center is sponsoring a program for providing computer codes for analyzing and designing turbomachinery seals for future aerospace and engine systems. The program is made up of three principal components: (1) the development of advanced three-dimensional (3-D) computational fluid dynamics codes, (2) the production of simpler two-dimensional (2-D) industrial codes, and (3) the development of a knowledge-based system (KBS) that contains an expert system to assist in seal selection and design. The first task has been to concentrate on cylindrical geometries with straight, tapered, and stepped bores. Improvements have been made by adoption of a colocated grid formulation, incorporation of higher-order, time-accurate schemes for transient analysis, and high-order discretization schemes for spatial derivatives. This report describes the mathematical formulations and presents a variety of 2-D results, including labyrinth and brush seal flows. Extension to 3-D is presently in progress.
NASA Technical Reports Server (NTRS)
Sayood, K.; Chen, Y. C.; Wang, X.
1992-01-01
During this reporting period we have worked on three somewhat different problems: modeling of video traffic in packet networks, low-rate video compression, and the development of a lossy + lossless image compression algorithm, which might have some application in browsing algorithms. The lossy + lossless scheme is an extension of work previously done under this grant. It provides a simple technique for incorporating browsing capability. The low-rate coding scheme is also a simple variation on the standard discrete cosine transform (DCT) coding approach. In spite of its simplicity, the approach provides surprisingly high-quality reconstructions. The modeling approach is borrowed from the speech recognition literature, and seems promising in that it provides a simple way of characterizing the second-order behavior of a particular coding scheme. Details of each are presented.
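The standard DCT coding approach referred to above reduces to transforming an image block, quantizing its coefficients, and inverting the transform. A minimal sketch of that core step follows; the block size and uniform quantizer are illustrative, and the report's low-rate variation and lossy + lossless layering are not reproduced:

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    """2D type-II DCT with orthonormal scaling."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(coeffs):
    """Inverse 2D DCT with orthonormal scaling."""
    return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

def code_block(block, q=16.0):
    """Quantize a block's DCT coefficients with step q and reconstruct;
    larger q discards more high-frequency detail for a lower rate."""
    quantized = np.round(dct2(block) / q)
    return idct2(quantized * q)
```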
Game On, Science - How Video Game Technology May Help Biologists Tackle Visualization Challenges
Da Silva, Franck; Empereur-mot, Charly; Chavent, Matthieu; Baaden, Marc
2013-01-01
The video games industry develops ever more advanced technologies to improve rendering, image quality, ergonomics and user experience of their creations providing very simple to use tools to design new games. In the molecular sciences, only a small number of experts with specialized know-how are able to design interactive visualization applications, typically static computer programs that cannot easily be modified. Are there lessons to be learned from video games? Could their technology help us explore new molecular graphics ideas and render graphics developments accessible to non-specialists? This approach points to an extension of open computer programs, not only providing access to the source code, but also delivering an easily modifiable and extensible scientific research tool. In this work, we will explore these questions using the Unity3D game engine to develop and prototype a biological network and molecular visualization application for subsequent use in research or education. We have compared several routines to represent spheres and links between them, using either built-in Unity3D features or our own implementation. These developments resulted in a stand-alone viewer capable of displaying molecular structures, surfaces, animated electrostatic field lines and biological networks with powerful, artistic and illustrative rendering methods. We consider this work as a proof of principle demonstrating that the functionalities of classical viewers and more advanced novel features could be implemented in substantially less time and with less development effort. Our prototype is easily modifiable and extensible and may serve others as starting point and platform for their developments. A webserver example, standalone versions for MacOS X, Linux and Windows, source code, screen shots, videos and documentation are available at the address: http://unitymol.sourceforge.net/. PMID:23483961
Improved Algorithm For Finite-Field Normal-Basis Multipliers
NASA Technical Reports Server (NTRS)
Wang, C. C.
1989-01-01
Improved algorithm reduces complexity of calculations that must precede design of Massey-Omura finite-field normal-basis multipliers, used in error-correcting-code equipment and cryptographic devices. Algorithm represents an extension of development reported in "Algorithm To Design Finite-Field Normal-Basis Multipliers" (NPO-17109), NASA Tech Briefs, Vol. 12, No. 5, page 82.
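The appeal of a normal basis, which the Massey-Omura multiplier exploits, is that squaring in GF(2^m) is just a cyclic shift of the coordinate vector. The one-liner below shows only that shift; the multiplier's logic function, whose design complexity the cited algorithm reduces, is not reproduced:

```python
def nb_square(bits):
    """Square a GF(2^m) element given as coordinates over the normal
    basis {b, b^2, b^4, ..., b^(2^(m-1))}: since x -> x^2 permutes the
    basis cyclically, squaring is a cyclic shift of the coordinates."""
    return bits[-1:] + bits[:-1]

# e.g. nb_square([1, 0, 1, 1]) == [1, 1, 0, 1] in GF(2^4)
```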
Proficiency Verification Systems (PVS): Skills Indices for Language Arts. Technical Note.
ERIC Educational Resources Information Center
Humes, Ann
The procedures undertaken in developing and organizing skills indexes for use in coding elementary school language arts textbooks to determine what is actually taught are presented in this paper. The outlined procedures included performing a preliminary analysis on four language arts textbooks to compile an extensive list of skills and performance…
LSPRAY-IV: A Lagrangian Spray Module
NASA Technical Reports Server (NTRS)
Raju, M. S.
2012-01-01
LSPRAY-IV is a Lagrangian spray solver developed for application with parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and/or Monte Carlo Probability Density Function (PDF) solvers. The solver accommodates the use of an unstructured mesh with mixed elements of either triangular, quadrilateral, and/or tetrahedral type for the gas flow grid representation. It is mainly designed to predict the flow, thermal and transport properties of a rapidly vaporizing spray. Some important research areas covered as a part of the code development are: (1) the extension of combined CFD/scalar-Monte- Carlo-PDF method to spray modeling, (2) the multi-component liquid spray modeling, and (3) the assessment of various atomization models used in spray calculations. The current version contains the extension to the modeling of superheated sprays. The manual provides the user with an understanding of various models involved in the spray formulation, its code structure and solution algorithm, and various other issues related to parallelization and its coupling with other solvers.
CARES/LIFE Software Commercialization
NASA Technical Reports Server (NTRS)
1995-01-01
The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.; Lee, Chi-Miag (Technical Monitor)
2001-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this paper, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery for space launch vehicle propulsion systems.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.
2002-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this presentation, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery.
Focus Group Research on the Implications of Adopting the Unified English Braille Code
ERIC Educational Resources Information Center
Wetzel, Robin; Knowlton, Marie
2006-01-01
Five focus groups explored concerns about adopting the Unified English Braille Code. The consensus was that while the proposed changes to the literary braille code would be minor, those to the mathematics braille code would be much more extensive. The participants emphasized that "any code that reduces the number of individuals who can access…
Prediction of unsteady transonic flow around missile configurations
NASA Technical Reports Server (NTRS)
Nixon, D.; Reisenthel, P. H.; Torres, T. O.; Klopfer, G. H.
1990-01-01
This paper describes the preliminary development of a method for predicting the unsteady transonic flow around missiles at transonic and supersonic speeds, with the final goal of developing a computer code for use in aeroelastic calculations or during maneuvers. The basic equations derived for this method are an extension of those derived by Klopfer and Nixon (1989) for steady flow and are a subset of the Euler equations. In this approach, the five Euler equations are reduced to an equation similar to the three-dimensional unsteady potential equation, and a two-dimensional Poisson equation. In addition, one of the equations in this method is almost identical to the potential equation for which there are well tested computer codes, allowing the development of a prediction method based in part on proved technology.
Improvements to the nuclear model code GNASH for cross section calculations at higher energies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, P.G.; Chadwick, M.B.
1994-05-01
The nuclear model code GNASH, which in the past has been used predominantly for incident particle energies below 20 MeV, has been modified extensively for calculations at higher energies. The model extensions and improvements are described in this paper, and their significance is illustrated by comparing calculations with experimental data for incident energies up to 160 MeV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casarini, L.; Bonometto, S.A.; Tessarotto, E.
2016-08-01
We discuss an extension of the Coyote emulator to predict non-linear matter power spectra of dark energy (DE) models with a scale-factor-dependent equation of state of the form w = w_0 + (1 - a) w_a. The extension is based on the mapping rule between non-linear spectra of DE models with constant equation of state and those with a time-varying one originally introduced in ref. [40]. Using a series of N-body simulations we show that the spectral equivalence is accurate to sub-percent level across the same range of modes and redshift covered by the Coyote suite. Thus, the extended emulator provides a very efficient and accurate tool to predict non-linear power spectra for DE models with the w_0-w_a parametrization. According to the same criteria we have developed a numerical code that we have implemented in a dedicated module for the CAMB code, that can be used in combination with the Coyote Emulator in likelihood analyses of non-linear matter power spectrum measurements. All codes can be found at https://github.com/luciano-casarini/pkequal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Portmann, Greg; /LBL, Berkeley; Safranek, James
The LOCO algorithm has been used by many accelerators around the world. Although the uses for LOCO vary, the most common use has been to find calibration errors and correct the optics functions. The light source community in particular has made extensive use of the LOCO algorithms to tightly control the beta function and coupling. Maintaining high-quality beam parameters requires constant attention, so a relatively large effort was put into software development for the LOCO application. The LOCO code was originally written in FORTRAN. This code worked fine, but it was somewhat awkward to use. For instance, the FORTRAN code itself did not calculate the model response matrix; it required a separate modeling code such as MAD to calculate the model matrix, which was then loaded manually into the LOCO code. As the number of people interested in LOCO grew, the code had to be made easier to use. The decision to port LOCO to Matlab was relatively easy: a matrix programming language with good graphics capability is well suited to the problem; Matlab was also being used for high-level machine control; and the accelerator modeling code AT [5] was already developed for Matlab. Since LOCO requires collecting and processing a relatively large amount of data, it is very helpful to have the LOCO code compatible with the high-level machine control [3]. A number of new features were added while porting the code from FORTRAN, and new methods continue to evolve [7][9]. Although Matlab LOCO was written with AT as the underlying tracking code, a mechanism to connect to other modeling codes has been provided.
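At its core, LOCO adjusts model parameters (quadrupole gradients, BPM and corrector gains, and so on) until the model's orbit response matrix matches the measured one in the least-squares sense. A skeleton of that fit is sketched below; model_response stands in for an evaluation of the AT (or MAD) machine model and is a placeholder, not the actual LOCO interface:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_loco(M_meas, model_response, p0):
    """Fit parameters p so that model_response(p), the model orbit
    response matrix, matches the measured matrix M_meas."""
    def residuals(p):
        return (model_response(p) - M_meas).ravel()
    return least_squares(residuals, p0).x
```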
NASA Technical Reports Server (NTRS)
Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.
2012-01-01
This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded using the FORTRAN 77 (The Portland Group, Lake Oswego, OR) programming language to run in a command shell similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.
Faunus: An object oriented framework for molecular simulation
Lund, Mikael; Trulsson, Martin; Persson, Björn
2008-01-01
Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331
PIES free boundary stellarator equilibria with improved initial conditions
NASA Astrophysics Data System (ADS)
Drevlak, M.; Monticello, D.; Reiman, A.
2005-07-01
The MFBE procedure developed by Strumberger (1997 Nucl. Fusion 37 19) is used to provide an improved starting point for free boundary equilibrium computations in the case of W7-X (Nührenberg and Zille 1986 Phys. Lett. A 114 129) using the Princeton iterative equilibrium solver (PIES) code (Reiman and Greenside 1986 Comput. Phys. Commun. 43 157). Transferring the consistent field found by the variational moments equilibrium code (VMEC) (Hirshmann and Whitson 1983 Phys. Fluids 26 3553) to an extended coordinate system using the VMORPH code, a safe margin between plasma boundary and PIES domain is established. The new EXTENDER_P code implements a generalization of the virtual casing principle, which allows field extension both for VMEC and PIES equilibria. This facilitates analysis of the 5/5 islands of the W7-X standard case without including them in the original PIES computation.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
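The quantity a fast probability integrator delivers can be pinned down with a brute-force reference: the probability that a limit-state function g goes non-positive over the random inputs. The plain Monte Carlo sketch below defines that target; FPI reaches the same number far more cheaply, for example through FORM/SORM-style approximations, and the function names here are illustrative:

```python
import numpy as np

def failure_probability(limit_state, sample_inputs, n=100_000):
    """Estimate P[g(X) <= 0] by sampling: sample_inputs(n) returns an
    (n, d) array of random inputs and limit_state maps one input row
    to a scalar margin (negative means failure)."""
    X = sample_inputs(n)
    g = np.apply_along_axis(limit_state, 1, X)
    return float(np.mean(g <= 0.0))
```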
Implementation of algebraic stress models in a general 3-D Navier-Stokes method (PAB3D)
NASA Technical Reports Server (NTRS)
Abdol-Hamid, Khaled S.
1995-01-01
A three-dimensional multiblock Navier-Stokes code, PAB3D, which was developed for propulsion integration and general aerodynamic analysis, has been used extensively by NASA Langley and other organizations to perform both internal (exhaust) and external flow analysis of complex aircraft configurations. This code was designed to solve the simplified Reynolds-averaged Navier-Stokes equations. A two-equation k-epsilon turbulence model has been used with considerable success, especially for attached flows. Accurately predicting transonic shock-wave location and pressure recovery in separated flow regions has been more difficult. Two algebraic Reynolds stress models (ASMs) have recently been implemented in the code that greatly improve the code's ability to predict these difficult flow conditions. Good agreement with Direct Numerical Simulation (DNS) for a subsonic flat plate was achieved with ASMs developed by Shih, Zhu, and Lumley and by Gatski and Speziale. Good predictions were also achieved at subsonic and transonic Mach numbers for shock location and trailing-edge boattail pressure recovery on a single-engine afterbody/nozzle model.
Status and plans for the ANOPP/HSR prediction system
NASA Technical Reports Server (NTRS)
Nolan, Sandra K.
1992-01-01
ANOPP is a comprehensive prediction system which was developed and validated by NASA. Because ANOPP is a system prediction program, it allows aerospace industry researchers to perform trade-off studies on a variety of aircraft noise problems. The extensive validation of ANOPP allows the program results to be used as a benchmark for testing other prediction codes.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
A rapid mission analysis code based on the use of approximate flight path equations of motion is described. The equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. Approximate takeoff and landing analyses can be performed. At high speeds, centrifugal lift effects are taken into account. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
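The approximate flight-path equations referred to are typically of the point-mass form; one common longitudinal set (shown as background, not the code's exact equations) is

    m\frac{dV}{dt} = T\cos\alpha - D - mg\sin\gamma, \qquad
    mV\frac{d\gamma}{dt} = L + T\sin\alpha - mg\cos\gamma\left(1-\frac{V^2}{gR_E}\right), \qquad
    \frac{dh}{dt} = V\sin\gamma

where the V^2/(gR_E) term is the centrifugal lift correction that matters at high speed.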
Balintová, Jana; Plucnara, Medard; Vidláková, Pavlína; Pohl, Radek; Havran, Luděk; Fojta, Miroslav; Hocek, Michal
2013-09-16
Benzofurazane has been attached to nucleosides and dNTPs, either directly or through an acetylene linker, as a new redox label for electrochemical analysis of nucleotide sequences. Primer extension incorporation of the benzofurazane-modified dNTPs by polymerases has been developed for the construction of labeled oligonucleotide probes. In combination with nitrophenyl and aminophenyl labels, we have successfully developed a three-potential coding of DNA bases and have explored the relevant electrochemical potentials. The combination of benzofurazane and nitrophenyl reducible labels has proved to be excellent for ratiometric analysis of nucleotide sequences and is suitable for bioanalytical applications.
QuakeML - An XML Schema for Seismology
NASA Astrophysics Data System (ADS)
Wyss, A.; Schorlemmer, D.; Maraini, S.; Baer, M.; Wiemer, S.
2004-12-01
We propose an extensible format definition for seismic data (QuakeML). Sharing data and seismic information efficiently is one of the most important issues for research and observational seismology in the future. The eXtensible Markup Language (XML) is playing an increasingly important role in the exchange of a variety of data. Due to its extensible definition capabilities, its wide acceptance, and the existing large number of utilities and libraries for XML, a structured representation of various types of seismological data should, in our opinion, be developed by defining a 'QuakeML' standard. Here we present the QuakeML definitions for parameter databases and further efforts, e.g. a central QuakeML catalog database and a web portal for exchanging codes and stylesheets.
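To illustrate the general idea of an XML representation of event parameters, the Python sketch below assembles a minimal event record; the tag names are invented for illustration and do not reproduce the actual QuakeML schema.

    import xml.etree.ElementTree as ET

    # Build a minimal event entry; tag names here are illustrative only
    # and do not follow the actual QuakeML schema definitions.
    quakeml = ET.Element("quakeml")
    event = ET.SubElement(quakeml, "event", id="example2004001")
    origin = ET.SubElement(event, "origin")
    ET.SubElement(origin, "time").text = "2004-12-01T00:00:00Z"
    ET.SubElement(origin, "latitude").text = "46.8"
    ET.SubElement(origin, "longitude").text = "8.2"
    ET.SubElement(origin, "depthKm").text = "10.0"
    ET.SubElement(event, "magnitude", type="ML").text = "3.4"

    print(ET.tostring(quakeml, encoding="unicode"))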
Lower-extremity isokinetic strength profiling in professional rugby league and rugby union.
Brown, Scott R; Brughelli, Matt; Griffiths, Peter C; Cronin, John B
2014-03-01
While several studies have documented isokinetic knee strength in junior and senior rugby league players, investigations of isokinetic knee and hip strength in professional rugby union players are limited. The purpose of this study was to provide lower-extremity strength profiles and compare isokinetic knee and hip strength of professional rugby league and rugby union players. Participants were 32 professional rugby league and 25 professional rugby union players, studied in a cross-sectional analysis. Isokinetic dynamometry was used to evaluate peak torque and strength ratios of the dominant and nondominant legs during seated knee-extension/flexion and supine hip-extension/flexion actions at 60°/s. Forwards from both codes were taller and heavier and had a higher body-mass index than the backs of each code. Rugby union forwards produced significantly (P < .05) greater peak torque during knee flexion in the dominant and nondominant legs (ES = 1.81 and 2.02) compared with rugby league forwards. Rugby league backs produced significantly greater hip-extension peak torque in the dominant and nondominant legs (ES = 0.83 and 0.77) compared with rugby union backs. There were no significant differences in hamstring-to-quadriceps ratios between code, position, or leg. Rugby union forwards and backs produced significantly greater knee-flexion-to-hip-extension ratios in the dominant and nondominant legs (ES = 1.49-2.26) than their rugby league counterparts. It seems that the joint torque profiles of players from the rugby league and union codes differ, which may be attributed to the different demands of each code.
The JPEG XT suite of standards: status and future plans
NASA Astrophysics Data System (ADS)
Richter, Thomas; Bruylants, Tim; Schelkens, Peter; Ebrahimi, Touradj
2015-09-01
The JPEG standard has seen enormous market adoption. Daily, billions of pictures are created, stored and exchanged in this format. The JPEG committee acknowledges this success and continues to invest effort in maintaining and expanding the standard specifications. JPEG XT is a standardization effort targeting the extension of the JPEG features by enabling support for high dynamic range imaging, lossless and near-lossless coding, and alpha channel coding, while also guaranteeing backward and forward compatibility with the JPEG legacy format. This paper gives an overview of the current status of the JPEG XT standards suite. It discusses the JPEG legacy specification and details how higher dynamic range support is facilitated both for integer and floating-point color representations. The paper shows how JPEG XT's support for lossless and near-lossless coding of low and high dynamic range images is achieved in combination with backward compatibility to JPEG legacy. In addition, the extensible box-based JPEG XT file format, on which all following and future extensions of JPEG will be based, is introduced. This paper also details how lossy and lossless representations of alpha channels are supported to allow coding of transparency information and arbitrarily shaped images. Finally, we conclude by giving prospects on the upcoming JPEG standardization initiative JPEG Privacy & Security, and a number of other possible extensions in JPEG XT.
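The backward-compatible layering can be illustrated schematically: a tone-mapped 8-bit base layer serves legacy decoders, while a residual extension layer restores the HDR signal. The Python sketch below shows this generic split/reconstruct idea only; it does not implement the normative JPEG XT profiles, and the tone map is an arbitrary placeholder.

    import numpy as np

    def split_layers(hdr, eps=1e-6):
        """Split an HDR image into an 8-bit tone-mapped base layer plus a
        multiplicative residual; a legacy decoder uses only the base layer.
        The tone map hdr/(1+hdr) is a placeholder, not a JPEG XT profile."""
        base = np.clip(hdr / (1.0 + hdr) * 255.0, 0, 255).round()
        ldr = base / 255.0
        residual = hdr / np.maximum(ldr / (1.0 - ldr + eps), eps)
        return base.astype(np.uint8), residual

    def reconstruct(base, residual, eps=1e-6):
        """Invert the placeholder tone map and reapply the residual."""
        ldr = base.astype(np.float64) / 255.0
        return (ldr / np.maximum(1.0 - ldr, eps)) * residual

    hdr = np.random.rand(4, 4) * 10.0
    b, r = split_layers(hdr)
    assert np.allclose(reconstruct(b, r), hdr, rtol=1e-3)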
Zhu, Yan; Chen, Longxian; Zhang, Chengjun; Hao, Pei; Jing, Xinyun; Li, Xuan
2017-01-25
Selaginella moellendorffii, a lycophyte, is a model plant for studying the early evolution and development of vascular plants. As the first and only lycophyte sequenced to date, the genome of S. moellendorffii revealed many conserved genes and pathways, as well as specialized genes that differ from flowering plants. Despite the progress made, little is known about long noncoding RNAs (lncRNAs) and the alternative splicing (AS) of coding genes in S. moellendorffii. Its coding gene models have not been fully validated with transcriptome data. Furthermore, it remains important to understand whether regulatory mechanisms similar to those of flowering plants are used, and how they operate in a non-seed primitive vascular plant. RNA-sequencing (RNA-seq) was performed for three S. moellendorffii tissues, root, stem, and leaf, by constructing strand-specific RNA-seq libraries from RNA purified using a RiboMinus isolation protocol. A total of 176 million reads (44 Gbp) were obtained from the three tissue types and mapped to the S. moellendorffii genome. By comparison with the 22,285 existing gene models of S. moellendorffii, we identified 7930 high-confidence novel coding genes (a 35.6% increase) and, for the first time, reported 4422 lncRNAs in a lycophyte. Further, we refined 2461 (11.0%) of the existing gene models and identified 11,030 AS events (for 5957 coding genes), revealed for the first time for lycophytes. Tissue-specific gene expression with functional implications was analyzed, and 1031, 554, and 269 coding genes, and 174, 39, and 17 lncRNAs were identified in root, stem, and leaf tissues, respectively. The expression of critical genes for vascular development stages, i.e. formation of provascular cells, xylem specification and differentiation, and phloem specification and differentiation, was compared among S. moellendorffii tissues, indicating a less complex regulatory mechanism in lycophytes than in flowering plants. The results were further strengthened by the evolutionary trend of seven transcription factor families related to vascular development, observed among four representative species of seed and non-seed vascular plants, and nonvascular land and aquatic plants. The deep RNA-seq study of S. moellendorffii discovered extensive new gene content, including novel coding genes, lncRNAs, AS events, and refined gene models. Compared to flowering vascular plants, S. moellendorffii displayed less complexity in gene structure, alternative splicing, and regulatory elements of vascular development. The study offers important insight into the evolution of vascular plants and the regulatory mechanism of vascular development in a non-seed plant.
Boundary-Layer Stability Analysis of the Mean Flows Obtained Using Unstructured Grids
NASA Technical Reports Server (NTRS)
Liao, Wei; Malik, Mujeeb R.; Lee-Rausch, Elizabeth M.; Li, Fei; Nielsen, Eric J.; Buning, Pieter G.; Chang, Chau-Lyan; Choudhari, Meelan M.
2012-01-01
Boundary-layer stability analyses of mean flows extracted from unstructured-grid Navier-Stokes solutions have been performed. A procedure has been developed to extract mean flow profiles from FUN3D unstructured-grid solutions. Extensive code-to-code validations have been performed by comparing the extracted mean flows as well as the corresponding stability characteristics to predictions based on structured-grid solutions. Comparisons are made on a range of problems from a simple flat plate to a full aircraft configuration, a modified Gulfstream-III with a natural laminar flow glove. The future aim of the project is to extend the adjoint-based design capability in FUN3D to include natural laminar flow and laminar flow control by integrating it with boundary-layer stability analysis codes, such as LASTRAC.
Standard interface files and procedures for reactor physics codes, version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, B.M.
Standards and procedures for promoting the exchange of reactor physics codes are updated to Version-III status. Standards covering program structure, interface files, file handling subroutines, and card input format are included. The implementation status of the standards in codes and the extension of the standards to new code areas are summarized.
Improvement of COBRA-TF for modeling of PWR cold- and hot-legs during reactor transients
NASA Astrophysics Data System (ADS)
Salko, Robert K.
COBRA-TF is a two-phase, three-field (liquid, vapor, droplets) thermal-hydraulic modeling tool developed by the Pacific Northwest Laboratory under sponsorship of the NRC. The code was developed for Light Water Reactor analysis starting in the 1980s; however, its development has continued to the present time. COBRA-TF still finds widespread use throughout the nuclear engineering field, including nuclear-power vendors, academia, and research institutions. It has been proposed that extension of the COBRA-TF code-modeling region from vessel-only components to Pressurized Water Reactor (PWR) coolant-line regions can lead to improved Loss-of-Coolant Accident (LOCA) analysis. Improved modeling is anticipated due to COBRA-TF's capability to independently model the entrained-droplet flow-field behavior, which has been observed to impact delivery to the core region [1]. Because COBRA-TF was originally developed for vertically-dominated, in-vessel, sub-channel flow, extension of the COBRA-TF modeling region to the horizontal-pipe geometries of the coolant-lines required several code modifications, including:
• inclusion of the stratified flow regime into the COBRA-TF flow regime map, along with associated interfacial drag, wall drag and interfacial heat transfer correlations,
• inclusion of a horizontal-stratification force between adjacent mesh cells having unequal levels of stratified flow, and
• generation of a new code-input interface for the modeling of coolant-lines.
The sheer number of COBRA-TF modifications required turned this project into a code-development project as much as a study of thermal-hydraulics in reactor coolant-lines. The means for achieving these tasks shifted along the way, ultimately leading to the development of a separate, nearly completely independent one-dimensional, two-phase-flow modeling code geared toward reactor coolant-line analysis. This developed code has been named CLAP, for Coolant-Line-Analysis Package. Versions were created that were both coupled to COBRA-TF and standalone, with the most recent version being a standalone code. This code performs a separate, simplified, 1-D solution of the conservation equations while making special considerations for coolant-line geometry and flow phenomena. The end of this project saw a functional code package that demonstrates a stable numerical solution and that has gone through a series of Validation and Verification tests using the Two-Phase Testing Facility (TPTF) experimental data [2]. The results indicate that CLAP under-performs RELAP5-MOD3 in predicting the experimental void of the TPTF facility in some cases. There is no apparent pattern, however, pointing to a consistent type of case that the code fails to predict properly (e.g., low-flow, high-flow, discharging to a full vessel, or discharging to an empty vessel). Pressure-profile predictions are sometimes unrealistic, which indicates that there may be a problem with test-case boundary conditions or with the coupling of the continuity and momentum equations in the solution algorithm. The code does predict the flow regime correctly for all cases with the stratification-force model off. Turning the stratification model on can cause the low-flow case void profiles to over-react to the force and the flow regime to transition out of stratified flow. The code would benefit from an increased amount of Validation and Verification testing.
The development of CLAP was significant, as it is a cleanly written, logical representation of the reactor coolant-line geometry. It is stable and capable of modeling basic flow physics in the reactor coolant-line. Code development and debugging required the temporary removal of the energy equation and mass-transfer terms in the governing equations. The reintroduction of these terms will allow future coupling to RELAP and re-coupling with COBRA-TF. Adding more applicable entrainment and de-entrainment models would allow the capture of more advanced physics in the coolant-line that can be expected during a Loss-of-Coolant Accident. One of the package's benefits is its ability to be used as a platform for future coolant-line model development and implementation, including capturing of the important de-entrainment behavior in reactor hot-legs (steam-binding effect) and flow convection in the upper-plenum region of the vessel.
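For orientation, the 1-D conservation equations referred to have, for each phase k, the generic two-fluid mass-balance form

    \frac{\partial}{\partial t}\left(\alpha_k \rho_k\right) + \frac{\partial}{\partial x}\left(\alpha_k \rho_k u_k\right) = \Gamma_k

where \alpha_k is the phase volume fraction and \Gamma_k the interphase mass-transfer term, the term noted above as temporarily removed during debugging.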
Xia, Yidong; Lou, Jialin; Luo, Hong; ...
2015-02-09
Here, an OpenACC directive-based graphics processing unit (GPU) parallel scheme is presented for solving the compressible Navier–Stokes equations on 3D hybrid unstructured grids with a third-order reconstructed discontinuous Galerkin method. The developed scheme requires the minimum code intrusion and algorithm alteration for upgrading a legacy solver with the GPU computing capability at very little extra effort in programming, which leads to a unified and portable code development strategy. A face coloring algorithm is adopted to eliminate the memory contention because of the threading of internal and boundary face integrals. A number of flow problems are presented to verify the implementation of the developed scheme. Timing measurements were obtained by running the resulting GPU code on one Nvidia Tesla K20c GPU card (Nvidia Corporation, Santa Clara, CA, USA) and compared with those obtained by running the equivalent Message Passing Interface (MPI) parallel CPU code on a compute node (consisting of two AMD Opteron 6128 eight-core CPUs (Advanced Micro Devices, Inc., Sunnyvale, CA, USA)). Speedup factors of up to 24× and 1.6× for the GPU code were achieved with respect to one and 16 CPU cores, respectively. The numerical results indicate that this OpenACC-based parallel scheme is an effective and extensible approach to port unstructured high-order CFD solvers to GPU computing.
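The face-coloring idea can be sketched compactly: faces that touch a common cell must receive different colors, so that all faces of one color can scatter their contributions to cell arrays concurrently without write conflicts. A minimal greedy version in Python (illustrative only, not the authors' implementation):

    def color_faces(faces, n_cells):
        """Greedy face coloring: `faces` is a list of (left_cell, right_cell)
        pairs; faces touching a common cell receive different colors, so all
        faces of one color can accumulate into cell arrays concurrently."""
        cell_colors = [set() for _ in range(n_cells)]  # colors used at each cell
        colors = []
        for lc, rc in faces:
            used = cell_colors[lc] | cell_colors[rc]
            c = 0
            while c in used:          # smallest color free at both cells
                c += 1
            colors.append(c)
            cell_colors[lc].add(c)
            cell_colors[rc].add(c)
        return colors

    # Faces of a tiny 1D mesh of 4 cells: alternating colors emerge.
    print(color_faces([(0, 1), (1, 2), (2, 3)], 4))   # [0, 1, 0]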
User's Manual for RESRAD-OFFSITE Version 2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; Gnanapragasam, E.; Biwer, B. M.
2007-09-05
The RESRAD-OFFSITE code is an extension of the RESRAD (onsite) code, which has been widely used for calculating doses and risks from exposure to radioactively contaminated soils. The development of RESRAD-OFFSITE started more than 10 years ago, but new models and methodologies have been developed, tested, and incorporated since then. Some of the new models have been benchmarked against other independently developed (international) models. The databases used have also expanded to include all the radionuclides (more than 830) contained in the International Commission on Radiological Protection (ICRP) 38 database. This manual provides detailed information on the design and application of the RESRAD-OFFSITE code. It describes in detail the new models used in the code, such as the three-dimensional dispersion groundwater flow and radionuclide transport model, the Gaussian plume model for atmospheric dispersion, and the deposition model used to estimate the accumulation of radionuclides in offsite locations and in foods. Potential exposure pathways and exposure scenarios that can be modeled by the RESRAD-OFFSITE code are also discussed. A user's guide is included in Appendix A of this manual. The default parameter values and parameter distributions are presented in Appendix B, along with a discussion on the statistical distributions for probabilistic analysis. A detailed discussion on how to reduce run time, especially when conducting probabilistic (uncertainty) analysis, is presented in Appendix C of this manual.
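As background, the Gaussian plume model mentioned is conventionally written (textbook form; the exact implementation details in RESRAD-OFFSITE are not given here) as

    C(x,y,z) = \frac{Q}{2\pi u \sigma_y \sigma_z}\exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)\left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)+\exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right]

with Q the release rate, u the wind speed, H the effective release height, and \sigma_y(x), \sigma_z(x) the lateral and vertical dispersion coefficients.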
Development of the First Guideline for Professional Conduct in Medical Practice in Iran.
Saeedi Tehrani, Saeedeh; Nayeri, Fatemeh; Parsapoor, Alireza; Jafarian, Ali; Labaf, Ali; Mirzazadeh, Azim; Emadi Kouchak, Hamid; Shahi, Farhad; Ghasemzadeh, Nazafarin; Asghari, Fariba
2017-01-01
Professional behavior is first learned at the university. One of the necessary considerations in maintaining the professional environment of the university is establishing a set of codes for the behavior of physicians and medical students. This paper describes the process of developing the professional code of conduct at Tehran University of Medical Sciences, Tehran, Iran. A review of Iranian and international literature was performed to develop the first draft of the guideline. In group discussion sessions, the authors evaluated the articles of the draft for relevancy, clarity, and lack of repetition. The draft was sent four times for evaluation to all participants, including medical faculty members, residents, and medical students, and necessary corrections were made according to the comments received. The final guideline included 76 behavior codes in 6 categories: altruism, honor and integrity, responsibility, respect, justice, and excellence. The codes of the guideline cover the physicians' commitments in the physician-patient, physician-colleague, and instructor-student relationships in order to improve the quality of services. Islamic and Iranian culture was taken into consideration in developing the guideline. Accordance with the administrative and educational conditions of the universities was ensured in developing the guideline, and its acceptance was ensured through extensive surveys. Thus, it is expected that this guideline will be very effective in enhancing professional commitment in medical universities.
Generalizing the extensibility of a dynamic geometry software
NASA Astrophysics Data System (ADS)
Herceg, Đorđe; Radaković, Davorka; Herceg, Dejana
2012-09-01
Plug-and-play visual components in a Dynamic Geometry Software (DGS) enable development of visually attractive, rich and highly interactive dynamic drawings. We are developing SLGeometry, a DGS that contains a custom programming language, a computer algebra system (CAS engine) and a graphics subsystem. The basic extensibility framework of SLGeometry supports dynamic addition of new functions from attribute-annotated classes that implement runtime metadata registration in code. We present a general plug-in framework for dynamically importing arbitrary Silverlight user interface (UI) controls into SLGeometry at runtime. The CAS engine maintains a metadata store that describes each imported visual component and enables two-way communication between the expressions stored in the engine and the UI controls on the screen.
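The attribute-annotated registration pattern can be illustrated in Python with a decorator that records function metadata in a registry the engine consults at runtime; this is an analogy to the C#/Silverlight mechanism described, not its actual API:

    FUNCTION_REGISTRY = {}

    def cas_function(name, arity):
        """Register a callable with metadata, loosely analogous to the
        attribute-annotated classes described for SLGeometry."""
        def decorate(fn):
            FUNCTION_REGISTRY[name] = {"arity": arity, "impl": fn}
            return fn
        return decorate

    @cas_function(name="Midpoint", arity=2)
    def midpoint(a, b):
        return tuple((ai + bi) / 2.0 for ai, bi in zip(a, b))

    # The engine can now look functions up by name at runtime.
    entry = FUNCTION_REGISTRY["Midpoint"]
    print(entry["impl"]((0.0, 0.0), (2.0, 4.0)))   # (1.0, 2.0)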
SDA 7: A modular and parallel implementation of the simulation of diffusional association software
Martinez, Michael; Romanowska, Julia; Kokh, Daria B.; Ozboyaci, Musa; Yu, Xiaofeng; Öztürk, Mehmet Ali; Richter, Stefan
2015-01-01
The simulation of diffusional association (SDA) Brownian dynamics software package has been widely used in the study of biomacromolecular association. Initially developed to calculate bimolecular protein-protein association rate constants, it has since been extended to study electron transfer rates, to predict the structures of biomacromolecular complexes, to investigate the adsorption of proteins to inorganic surfaces, and to simulate the dynamics of large systems containing many biomacromolecular solutes, allowing the study of concentration-dependent effects. These extensions have led to a number of divergent versions of the software. In this article, we report the development of the latest version of the software (SDA 7). This release was developed to consolidate the existing codes into a single framework, while improving the parallelization of the code to better exploit modern multicore shared memory computer architectures. It is built using a modular object-oriented programming scheme, to allow for easy maintenance and extension of the software, and includes new features, such as adding flexible solute representations. We discuss a number of application examples, which describe some of the methods available in the release, and provide benchmarking data to demonstrate the parallel performance. PMID:26123630
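The elementary propagation step underlying such Brownian dynamics packages is the Ermak-McCammon update, sketched below in Python for a single particle with isotropic diffusion; this is textbook background, not SDA 7's actual implementation:

    import numpy as np

    def bd_step(x, force, D, dt, kT, rng):
        """One Ermak-McCammon Brownian dynamics step: deterministic drift
        down the force plus a Gaussian random displacement with variance
        2*D*dt per dimension."""
        drift = (D / kT) * force(x) * dt
        noise = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=x.shape)
        return x + drift + noise

    rng = np.random.default_rng(1)
    x = np.zeros(3)
    for _ in range(1000):   # free diffusion: zero force
        x = bd_step(x, lambda r: np.zeros_like(r), D=1.0, dt=1e-3, kT=1.0, rng=rng)
    print(x)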
Solubility modeling of refrigerant/lubricant mixtures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michels, H.H.; Sienel, T.H.
1996-12-31
A general model for predicting the solubility properties of refrigerant/lubricant mixtures has been developed based on applicable theory for the excess Gibbs energy of non-ideal solutions. In our approach, flexible thermodynamic forms are chosen to describe the properties of both the gas and liquid phases of refrigerant/lubricant mixtures. After an extensive study of models for describing non-ideal liquid effects, the Wohl-suffix equations, which have been extensively utilized in the analysis of hydrocarbon mixtures, have been developed into a general form applicable to mixtures where one component is a POE lubricant. In the present study we have analyzed several POEs where structural and thermophysical property data were available. Data were also collected from several sources on the solubility of refrigerant/lubricant binary pairs. We have developed a computer code (NISC), based on the Wohl model, that predicts dew point or bubble point conditions over a wide range of composition and temperature. Our present analysis covers mixtures containing up to three refrigerant molecules and one lubricant. The present code can be used to analyze the properties of R-410a and R-407c in mixtures with a POE lubricant. Comparisons with other models, such as the Wilson or modified Wilson equations, indicate that the Wohl-suffix equations yield more reliable predictions for HFC/POE mixtures.
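A generic bubble-point calculation of the kind such a code performs can be sketched with modified Raoult's law, P = Σ_i x_i γ_i P_i^sat(T); in the Python sketch below, a one-constant Margules activity model and made-up vapor-pressure correlations stand in for the Wohl-suffix equations and property data of the actual NISC code:

    import numpy as np

    def bubble_point(x, T, psat, margules_A):
        """Bubble-point pressure and vapor composition for a binary mixture
        via modified Raoult's law. The one-constant Margules model used for
        the activity coefficients is a placeholder, not the Wohl form."""
        g1 = np.exp(margules_A * x[1] ** 2)   # ln gamma_1 = A * x2^2
        g2 = np.exp(margules_A * x[0] ** 2)   # ln gamma_2 = A * x1^2
        p_partial = np.array([x[0] * g1 * psat[0](T), x[1] * g2 * psat[1](T)])
        P = p_partial.sum()
        return P, p_partial / P

    # Hypothetical Antoine-style vapor pressures (kPa), illustration only.
    psat = (lambda T: np.exp(14.0 - 2900.0 / T), lambda T: np.exp(13.0 - 3100.0 / T))
    P, y = bubble_point(np.array([0.3, 0.7]), 300.0, psat, margules_A=0.8)
    print(P, y)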
NASA Astrophysics Data System (ADS)
Ames, D. P.
2013-12-01
As has been seen in other informatics fields, well-documented and appropriately licensed open source software tools have the potential to significantly increase both opportunities and motivation for inter-institutional science and technology collaboration. The CUAHSI HIS (and related HydroShare) projects have aimed to foster such activities in hydrology, resulting in the development of many useful community software components, including the HydroDesktop software application. HydroDesktop is an open source, GIS-based, scriptable software application for discovering data on the CUAHSI Hydrologic Information System and related resources. It includes a well-defined plugin architecture and interface that allow third-party developers to create extensions and add new functionality without recompiling the full source code. HydroDesktop is built in the C# programming language and uses the open source DotSpatial GIS engine for spatial data management. Capabilities include data search, discovery, download, visualization, and export. An extension that integrates the R programming language with HydroDesktop provides scripting and data automation capabilities, and an OpenMI plugin provides the ability to link models. Current revisions and updates to HydroDesktop include migration of core business logic to cross-platform, scriptable Python code modules that can be executed in any operating system or linked into other software front-end applications.
DataRocket: Interactive Visualisation of Data Structures
NASA Astrophysics Data System (ADS)
Parkes, Steve; Ramsay, Craig
2010-08-01
CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation of, and reasoning about, program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse-engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.
Neighboring block based disparity vector derivation for multiview compatible 3D-AVC
NASA Astrophysics Data System (ADS)
Kang, Jewon; Chen, Ying; Zhang, Li; Zhao, Xin; Karczewicz, Marta
2013-09-01
3D-AVC, being developed under the Joint Collaborative Team on 3D Video Coding (JCT-3V), significantly outperforms Multiview Video Coding plus Depth (MVC+D), which simultaneously encodes texture views and depth views with the multiview extension of H.264/AVC (MVC). However, when 3D-AVC is configured to support multiview compatibility, in which texture views are decoded without depth information, the coding performance degrades significantly. The reason is that the advanced coding tools incorporated into 3D-AVC do not perform well due to the lack of a disparity vector converted from the depth information. In this paper, we propose a disparity vector derivation method utilizing only the information of texture views. Motion information of neighboring blocks is used to determine a disparity vector for a macroblock, so that the derived disparity vector can be used efficiently by the coding tools in 3D-AVC. The proposed method significantly improves the coding gain of 3D-AVC in the multiview-compatible mode, yielding about 20% BD-rate savings in the coded views and 26% BD-rate savings in the synthesized views on average.
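The derivation can be sketched as a scan of already-coded neighboring blocks for an inter-view (disparity) motion vector, reusing the first one found. The Python sketch below shows that idea only; the scan order, data layout, and fallback are illustrative, not the normative 3D-AVC procedure:

    def derive_disparity_vector(neighbors):
        """Return a disparity vector for the current macroblock from its
        already-coded neighbors. Each neighbor is a dict; blocks predicted
        from an inter-view reference carry a 'disparity_mv' entry."""
        for nb in neighbors:                 # e.g. left, above, above-right
            if nb is not None and "disparity_mv" in nb:
                return nb["disparity_mv"]
        return (0, 0)                        # zero-disparity fallback

    left = {"disparity_mv": (-14, 0)}
    above = None
    above_right = {"mv": (3, 1)}             # temporal MV only: skipped
    print(derive_disparity_vector([left, above, above_right]))   # (-14, 0)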
The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions
NASA Astrophysics Data System (ADS)
Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.
2016-01-01
A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates various nuclear reaction models needed to describe reactions induced by nucleons, light charged nuclei up to alpha particles, and photons. The code is written in the C++ programming language using an object-oriented technology. At first, it was applied to neutron-induced reaction data on actinides, which were compiled into the JENDL Actinide File 2008 and JENDL-4.0. It has been extensively used in various nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded for nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions. It was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system, indicating the concept and design of the code and its inputs. Details of the formulation for modeling the direct, pre-equilibrium and compound reactions are presented. Applications to nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reactions and β-delayed neutron emission are mentioned.
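For orientation, the compound-nucleus component of such model codes is conventionally described by a Hauser-Feshbach expression of the schematic form (width-fluctuation corrections omitted; not necessarily CCONE's exact formulation)

    \sigma_{ab} = \frac{\pi}{k_a^2}\sum_{J\pi} g_J\,\frac{T_a^{J\pi}\,T_b^{J\pi}}{\sum_c T_c^{J\pi}}

where T_c are transmission coefficients for the open channels and g_J is the spin statistical factor.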
Song, Jiang-Hua; Cao, Jia-Shu; Wang, Cheng-Gang
2013-01-01
KEY MESSAGE: BcMF11, a non-coding RNA gene, has an essential role in pollen development and might be useful for regulating the pollen fertility of crops by antisense RNA technology. We previously identified an 828-bp full-length cDNA of BcMF11, a novel pollen-specific non-coding mRNA-like gene from Chinese cabbage (Brassica campestris L. ssp. chinensis Makino). However, little is known about the function of BcMF11 in pollen development. To investigate its exact biological roles in pollen development, the BcMF11 cDNA was antisense-inhibited in transgenic Chinese cabbage under the control of a tapetum-specific promoter, BcA9, and a constitutive promoter, CaMV 35S. Antisense RNA transgenic plants displayed decreased expression of BcMF11 and showed distinct morphological defects. Pollen germination tests in vitro and in vivo of the transgenic plants suggested that inhibition of BcMF11 decreased pollen germination efficiency and delayed the extension of pollen tubes in the style. Under scanning electron microscopy, many shrunken and collapsed pollen grains were detected in the antisense BcMF11 transgenic Chinese cabbage. Further cytological observation revealed an abnormal pollen development process in transgenic plants, including delayed degradation of the tapetum, asynchronous separation of microspores, and aborted development of pollen grains. These results suggest that BcMF11, as a non-coding RNA, plays an essential role in pollen development and male fertility.
Modified linear predictive coding approach for moving target tracking by Doppler radar
NASA Astrophysics Data System (ADS)
Ding, Yipeng; Lin, Xiaoyi; Sun, Ke-Hui; Xu, Xue-Mei; Liu, Xi-Yao
2016-07-01
Doppler radar is a cost-effective tool for moving target tracking, which can support a large range of civilian and military applications. A modified linear predictive coding (LPC) approach is proposed to increase the target localization accuracy of Doppler radar. Based on time-frequency analysis of the received echo, the proposed approach first estimates the noise statistics in real time and constructs an adaptive filter to suppress the noise interference. Then, a linear predictive model is applied to extend the available data, which helps improve the resolution of the target localization result. In contrast with the traditional LPC method, which decides the extension data length empirically, the proposed approach develops an error array to evaluate the prediction accuracy and thus adjusts the optimum extension data length adaptively. Finally, the prediction error array is superimposed on the predictor output to correct the prediction error. A series of experiments is conducted to illustrate the validity and performance of the proposed techniques.
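The core LPC step, fitting autoregressive coefficients to the available samples and extrapolating, can be sketched in Python as below; the adaptive noise filter and error-array correction of the proposed approach are not reproduced:

    import numpy as np

    def lpc_extend(signal, order, n_extend):
        """Fit an order-p linear predictor x[n] ~ sum_k a_k x[n-k] by least
        squares, then recursively predict n_extend new samples."""
        n = len(signal)
        # Column k holds the samples at lag k+1 for each predicted sample.
        A = np.column_stack([signal[order - k - 1 : n - k - 1] for k in range(order)])
        b = signal[order:]
        a, *_ = np.linalg.lstsq(A, b, rcond=None)
        out = list(signal)
        for _ in range(n_extend):
            out.append(np.dot(a, out[-1 : -order - 1 : -1]))
        return np.asarray(out)

    t = np.arange(200) * 0.05
    extended = lpc_extend(np.sin(2 * np.pi * 0.7 * t), order=12, n_extend=50)
    print(extended[-5:])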
Efficient 3D kinetic Monte Carlo method for modeling of molecular structure and dynamics.
Panshenskov, Mikhail; Solov'yov, Ilia A; Solov'yov, Andrey V
2014-06-30
Self-assembly of molecular systems is an important and general problem that intertwines physics, chemistry, biology, and materials science. Through understanding of the physical principles of self-organization, it often becomes feasible to control the process and to obtain complex structures with tailored properties, for example, bacterial cell colonies or nanodevices with desired properties. Theoretical studies and simulations provide an important tool for unraveling the principles of self-organization and, therefore, have recently gained increasing interest. The present article features an extension of the popular code MBN EXPLORER (MesoBioNano Explorer) aiming to provide a universal approach to study self-assembly phenomena in biology and nanoscience. In particular, this extension involves a highly parallelized module of MBN EXPLORER that allows simulating stochastic processes using the kinetic Monte Carlo approach in three-dimensional space. We describe the computational side of the developed code, discuss its efficiency, and apply it to an exemplary system.
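The elementary kinetic Monte Carlo step used in such stochastic simulations is the Gillespie-type selection: choose an event with probability proportional to its rate and advance time by an exponentially distributed increment. A minimal Python sketch (illustrative, not the MBN EXPLORER module):

    import numpy as np

    def kmc_step(rates, rng):
        """One kinetic Monte Carlo step: returns (event_index, dt). Events
        are chosen proportionally to their rates; the waiting time is
        exponential with the total rate."""
        total = np.sum(rates)
        r = rng.random() * total
        event = int(np.searchsorted(np.cumsum(rates), r))
        dt = rng.exponential(1.0 / total)
        return event, dt

    rng = np.random.default_rng(7)
    rates = np.array([1.0, 0.5, 0.1])   # e.g. hop, attach, detach rates
    t, counts = 0.0, [0, 0, 0]
    for _ in range(10000):
        e, dt = kmc_step(rates, rng)
        counts[e] += 1
        t += dt
    print(counts, t)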
New features in the design code Tlie
NASA Astrophysics Data System (ADS)
van Zeijts, Johannes
1993-12-01
We present features recently installed in the arbitrary-order accelerator design code Tlie. The code uses the MAD input language and implements programmable extensions modeled after the C language that make it a powerful tool in a wide range of applications: from basic beamline design to high-precision, high-order design and even control room applications. The basic quantities important in accelerator design are easily accessible from inside the control language. Entities like parameters in elements (strength, current), transfer maps (either in Taylor series or in Lie algebraic form), lines, and beams (either as sets of particles or as distributions) are among the types of variables available. These variables can be set, used as arguments in subroutines, or simply printed out. The code is easily extensible with new datatypes.
ERIC Educational Resources Information Center
Harley, Jason M.; Taub, Michelle; Azevedo, Roger; Bouchet, Francois
2018-01-01
Research on collaborative learning between humans and virtual pedagogical agents represents a necessary extension to recent research on the conceptual, theoretical, methodological, analytical, and educational issues behind co- and socially-shared regulated learning between humans. This study presents a novel coding framework that was developed and…
Code Validation Studies of High-Enthalpy Flows
2006-12-01
... stage of future hypersonic vehicles. The development and design of such vehicles is aided by the use of experimentation and numerical simulation ... numerical predictions and experimental measurements. 3. Summary of Previous Work: We have studied extensively hypersonic double-cone flows ... the experimental measurements and the numerical predictions. When we accounted for that effect in numerical simulations, and also augmented the ...
NASA Technical Reports Server (NTRS)
Hessenius, K. A.; Goorjian, P. M.
1981-01-01
A high-frequency extension of the unsteady, transonic code LTRAN2 was created and is evaluated by comparisons with experimental results. The experimental test case is a NACA 64A010 airfoil in pitching motion at a Mach number of 0.8 over a range of reduced frequencies. Comparisons indicate that the modified code is an improvement on the original LTRAN2 and provides closer agreement with experimental lift and moment coefficients. A discussion of the code modifications, which involve the addition of high-frequency terms to the boundary conditions of the numerical algorithm, is included.
1985-09-30
El recombination inversion, is much more effective. Furthermore, we have studied extensively a more advanced geometry which we predict theoretically ... to be even more effective: that of laser-imploded thin cylindrical shells. We report here on theoretical and experimental progress ... theoretical analysis, as well as the actual demonstration on OMEGA of the compression of cylindrical shell targets, were described in a paper entitled ...
Recent developments in multidimensional transport methods for the APOLLO 2 lattice code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zmijarevic, I.; Sanchez, R.
1995-12-31
A usual method of preparing homogenized cross sections for reactor coarse-mesh calculations is based on two-dimensional multigroup transport treatment of an assembly together with an appropriate leakage model and a reaction-rate-preserving homogenization technique. The current generation of assembly spectrum codes based on collision probability methods is capable of treating complex geometries (i.e., irregular meshes of arbitrary shape), thus avoiding the modeling error that was introduced in codes with traditional tracking routines. The power and architecture of current computers allow the treatment of spatial domains comprising several mutually interacting assemblies using a fine multigroup structure and retaining all geometric details of interest. Increasing safety requirements demand detailed two- and three-dimensional calculations for very heterogeneous problems such as control rod positioning, broken Pyrex rods, irregular compacting of mixed-oxide (MOX) pellets at an MOX-UO2 interface, and many others. An effort has been made to include accurate multidimensional transport methods in the APOLLO 2 lattice code. These include extension to three-dimensional axially symmetric geometries of the general-geometry collision probability module TDT and the development of new two- and three-dimensional characteristics methods for regular Cartesian meshes. In this paper we discuss the main features of recently developed multidimensional methods that are currently being tested.
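As background, characteristics methods of the kind mentioned sweep the transport equation along straight tracks; over a segment of length s with constant total cross section \Sigma_t and a flat source q, the angular flux obeys the standard relation

    \psi(s) = \psi(0)\,e^{-\Sigma_t s} + \frac{q}{\Sigma_t}\left(1-e^{-\Sigma_t s}\right)

which is applied segment by segment along each track.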
Synchrotron characterization of nanograined UO2 grain growth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mo, Kun; Miao, Yinbin; Yun, Di
2015-09-30
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize our preliminary synchrotron radiation experiments at APS to determine the grain size of nanograin UO2. The methodology and experimental setup developed in this experiment can directly apply to the proposed in-situ grain growth measurements. The investigation of the grain growth kinetics was conducted based on isothermal annealing and grain growth characterization as functions of duration and temperature. The kinetic parameters such as the activation energy for grain growth for UO2 with different stoichiometry are obtained and compared with molecular dynamics (MD) simulations.
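Isothermal grain-growth data of this kind are conventionally fit with the standard kinetic law (shown for orientation; the report's exact fitting form is not specified here)

    d^{\,n} - d_0^{\,n} = k_0 \exp\!\left(-\frac{Q}{RT}\right) t

with d the grain size after annealing time t, d_0 the initial size, n the growth exponent, and Q the activation energy of the kind compared against the MD simulations.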
Supplying materials needed for grain growth characterizations of nano-grained UO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mo, Kun; Miao, Yinbin; Yun, Di
2015-09-30
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the Moose-Bison-Marmot (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize our preliminary synchrotron radiation experiments at APS to determine the grain size of nanograin UO2. The methodology and experimental setup developed in this experiment can directly apply to the proposed in-situ grain growth measurements. The investigation of the grain growth kinetics was conducted based on isothermal annealing and grain growth characterization as functions of duration and temperature. The kinetic parameters such as the activation energy for grain growth for UO2 with different stoichiometry are obtained and compared with molecular dynamics (MD) simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vine, E.
1990-11-01
As part of Lawrence Berkeley Laboratory's (LBL) technical assistance to the Sustainable City Project, compliance and enforcement activities related to local and state building codes for existing and new construction were evaluated in two case studies. The analysis of the City of San Francisco's Residential Energy Conservation Ordinance (RECO) showed that a limited, prescriptive energy conservation ordinance for existing residential construction can be enforced relatively easily with little administrative costs, and that compliance with such ordinances can be quite high. Compliance with the code was facilitated by extensive publicity, an informed public concerned with the cost of energy and knowledgeable about energy efficiency, the threat of punishment (Order of Abatement), the use of private inspectors, and training workshops for City and private inspectors. The analysis of California's Title 24 Standards for new residential and commercial construction showed that enforcement of this type of code for many climate zones is more complex and requires extensive administrative support for education and training of inspectors, architects, engineers, and builders. Under this code, prescriptive and performance approaches for compliance are permitted, resulting in the demand for alternative methods of enforcement: technical assistance, plan review, field inspection, and computer analysis. In contrast to existing construction, building design and new materials and construction practices are of critical importance in new construction, creating a need for extensive technical assistance and extensive interaction between enforcement personnel and the building community. Compliance problems associated with building design and installation did occur in both residential and nonresidential buildings. Because statewide codes are enforced by local officials, these problems may increase over time as energy standards change and become more complex and as other standards (e.g., health and safety codes) remain a higher priority. The California Energy Commission realizes that code enforcement by itself is insufficient and expects that additional educational and technical assistance efforts (e.g., manuals, training programs, and toll-free telephone lines) will ameliorate these problems.
Data compression for satellite images
NASA Technical Reports Server (NTRS)
Chen, P. H.; Wintz, P. A.
1976-01-01
An efficient data compression system is presented for satellite pictures and two grey level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
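Interpreting "double delta coding" as differencing applied twice, a smooth scan line collapses to near-zero second differences that suit short variable-length codes; the Python sketch below shows that interpretation only, without the background-skipping refinement or the paper's exact scheme:

    import numpy as np

    def double_delta_encode(line):
        """Apply first differences twice along a scan line; smooth rows
        become runs of tiny values suited to short variable-length codes.
        The first samples carry the absolute anchor values."""
        d1 = np.diff(line, prepend=0)     # first differences
        return np.diff(d1, prepend=0)     # differences of differences

    def double_delta_decode(d2):
        """Invert by integrating twice."""
        return np.cumsum(np.cumsum(d2))

    line = np.array([100, 101, 103, 106, 110, 115], dtype=np.int64)
    enc = double_delta_encode(line)
    assert np.array_equal(double_delta_decode(enc), line)
    print(enc)   # second differences of a smooth ramp are small/constant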
Friendly Extensible Transfer Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.
2016-04-15
Often, data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without the need for source code changes) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).
Progress Toward Efficient Laminar Flow Analysis and Design
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas
2011-01-01
A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.
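The loose coupling described has the shape of a scripted driver loop; the Python sketch below conveys that structure only, with hypothetical executable names standing in for the actual solver, transition, and design modules:

    import subprocess

    # Hypothetical executable and file names, for illustration only.
    def design_iteration(max_iters=10):
        for it in range(max_iters):
            subprocess.run(["flow_solver", "case.inp"], check=True)            # flow solution
            subprocess.run(["transition_predict", "profiles.dat"], check=True) # n-factor growth
            done = subprocess.run(["design_module", "target_cp.dat"]).returncode
            if done == 0:   # assume the design module signals convergence via exit code
                break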
NASA Astrophysics Data System (ADS)
Williams, C. A.; Dicaprio, C.; Simons, M.
2003-12-01
With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.
Long Non-coding RNAs and Their Biological Roles in Plants
Liu, Xue; Hao, Lili; Li, Dayong; Zhu, Lihuang; Hu, Songnian
2015-01-01
With the development of genomics and bioinformatics, especially the extensive applications of high-throughput sequencing technology, more transcriptional units with little or no protein-coding potential have been discovered. Such RNA molecules are called non-protein-coding RNAs (npcRNAs or ncRNAs). Among them, long npcRNAs or ncRNAs (lnpcRNAs or lncRNAs) represent diverse classes of transcripts longer than 200 nucleotides. In recent years, lncRNAs have been considered important regulators in many essential biological processes. In plants, although a large number of lncRNA transcripts have been predicted and identified in a few species, our current knowledge of their biological functions is still limited. Here, we summarize recent studies on their identification, characteristics, classification, bioinformatics, resources, and current exploration of their biological functions in plants. PMID:25936895
NASA Technical Reports Server (NTRS)
Majumdar, A. K.; Hedayat, A.
2015-01-01
This paper describes the experience of the authors in using the Generalized Fluid System Simulation Program (GFSSP) to teach the Design of Thermal Systems class at the University of Alabama in Huntsville. GFSSP is a finite-volume-based thermo-fluid system network analysis code, developed at NASA/Marshall Space Flight Center, and is extensively used in NASA, the Department of Defense, and the aerospace industry for propulsion system design, analysis, and performance evaluation. The educational version of GFSSP is freely available to all US higher education institutions. The main purpose of the paper is to illustrate the utilization of this user-friendly code for thermal systems design and fluid engineering courses and to encourage instructors to utilize the code for class assignments as well as senior design projects.
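The flavor of a thermo-fluid network solve can be conveyed with a toy example: nodes carry pressures, branches carry flows through a resistance law, and the internal node pressure is driven to mass balance. The Python sketch below is a generic illustration, not GFSSP's formulation:

    import numpy as np

    def branch_flow(p_up, p_down, K):
        """Flow through a branch of resistance K: m = sign(dp)*sqrt(|dp|/K)."""
        dp = p_up - p_down
        return np.sign(dp) * np.sqrt(abs(dp) / K)

    def solve_internal_pressure(p0, p2, K01, K12):
        """Bisect on the internal node pressure until the node mass balance
        (inflow minus outflow) vanishes."""
        lo, hi = min(p0, p2), max(p0, p2)
        for _ in range(60):
            p1 = 0.5 * (lo + hi)
            imbalance = branch_flow(p0, p1, K01) - branch_flow(p1, p2, K12)
            if imbalance > 0:   # too much inflow: raise p1
                lo = p1
            else:
                hi = p1
        return p1

    p1 = solve_internal_pressure(p0=200.0, p2=100.0, K01=2.0, K12=1.0)
    print(p1, branch_flow(200.0, p1, 2.0))   # p1 ≈ 133.3 for these values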
Algorithm and code development for unsteady three-dimensional Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Obayashi, Shigeru
1993-01-01
In the last two decades, there have been extensive developments in computational aerodynamics, which constitutes a major part of the general area of computational fluid dynamics. Such developments are essential to advance the understanding of the physics of complex flows, to complement expensive wind-tunnel tests, and to reduce the overall design cost of an aircraft, particularly in the area of aeroelasticity. Aeroelasticity plays an important role in the design and development of aircraft, particularly modern aircraft, which tend to be more flexible. Several phenomena that can be dangerous and limit the performance of an aircraft occur because of the interaction of the flow with flexible components. For example, an aircraft with highly swept wings may experience vortex-induced aeroelastic oscillations. Also, undesirable aeroelastic phenomena due to the presence and movement of shock waves occur in the transonic range. Aeroelastically critical phenomena, such as a low transonic flutter speed, have been known to occur through limited wind-tunnel tests and flight tests. Aeroelastic tests entail extensive cost and risk. An aeroelastic wind-tunnel experiment is an order of magnitude more expensive than a parallel experiment involving only aerodynamics. By complementing wind-tunnel experiments with numerical simulations, the overall cost of aircraft development can be considerably reduced. In order to compute aeroelastic phenomena accurately, it is necessary to solve the unsteady Euler/Navier-Stokes equations simultaneously with the structural equations of motion. These equations accurately describe the flow phenomena for aeroelastic applications. At Ames, the code ENSAERO, which solves the Euler/Navier-Stokes equations, is being developed for computing the unsteady aerodynamics and aeroelasticity of aircraft. The purpose of this contract is to continue the algorithm enhancements of ENSAERO and to apply the code to complicated geometries. During the last year, the geometric capability of the code was extended to simulate transonic flows over a wing with an oscillating control surface. Single-grid and zonal approaches were tested. For the zonal approach, a new interpolation technique was introduced. The key development of the algorithm was an interface treatment between moving zones for a control surface using the virtual-zone concept. The work performed during the period 1 Apr. 1992 through 31 Mar. 1993 is summarized. Additional details on the various aspects of the study are given in the appendices.
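Schematically, the coupled problem pairs the flow solver with the structural modal equations

    M\ddot{q} + C\dot{q} + Kq = F_{\mathrm{aero}}(q,\dot{q},t)

where q collects the generalized (modal) displacements and the aerodynamic force vector is recomputed each time step from the unsteady Euler/Navier-Stokes solution (generic form, not ENSAERO's specific discretization).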
A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics
NASA Astrophysics Data System (ADS)
Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger
2017-09-01
Radiation protection and shielding studies are often based on the extensive use of 3D Monte-Carlo neutron and photon transport simulations. The ITER organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes to develop an alternative model, equivalent to the 'C-lite', but for the Monte-Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis concerns the assessment of radiation levels and Shutdown Dose Rates (SDDR) behind the Equatorial Port Plugs (EPP), the benchmark compares the neutron flux through the EPP. This problem is quite challenging with regard to the complex geometry and the strong neutron flux attenuation, ranging from 10^14 down to 10^8 n·cm^-2·s^-1. Such a code-to-code comparison provides independent validation of the Monte-Carlo simulations, improving confidence in the neutronic results.
An inverse method for the aerodynamic design of three-dimensional aircraft engine nacelles
NASA Technical Reports Server (NTRS)
Bell, R. A.; Cedar, R. D.
1991-01-01
A fast, efficient, and user-friendly inverse design system for 3-D nacelles was developed. The system combines a 2-D inverse design method originally developed at NASA Langley with the CFL3D analysis code, which was also developed at NASA Langley and modified for nacelle analysis. The design system uses a predictor/corrector approach in which an analysis code is used to calculate the flow field for an initial geometry; the geometry is then modified based on the difference between the calculated and target pressures. A detailed discussion of the design method, the process of linking it to the modified CFL3D solver, and its extension to 3-D is presented. This is followed by a number of examples of the use of the design system for the design of both axisymmetric and 3-D nacelles.
A crystallographic model for nickel base single crystal alloys
NASA Technical Reports Server (NTRS)
Dame, L. T.; Stouffer, D. C.
1988-01-01
The purpose of this research is to develop a tool for the mechanical analysis of nickel-base single-crystal superalloys, specifically Rene N4, used in gas turbine engine components. This objective is achieved by developing a rate-dependent anisotropic constitutive model and implementing it in a nonlinear three-dimensional finite-element code. The constitutive model is developed from metallurgical concepts utilizing a crystallographic approach. An extension of Schmid's law is combined with the Bodner-Partom equations to model the inelastic tension/compression asymmetry and orientation-dependence in octahedral slip. Schmid's law is used to approximate the inelastic response of the material in cube slip. The constitutive equations model the tensile behavior, creep response and strain-rate sensitivity of the single-crystal superalloys. Methods for deriving the material constants from standard tests are also discussed. The model is implemented in a finite-element code, and the computed and experimental results are compared for several orientations and loading conditions.
DNA: Polymer and molecular code
NASA Astrophysics Data System (ADS)
Shivashankar, G. V.
1999-10-01
The thesis work focuses on two aspects of DNA: the polymer and the molecular code. Our approach was to bring single-molecule micromanipulation methods to the study of DNA. It included a home-built optical microscope combined with an atomic force microscope and an optical tweezer. This combined approach led to a novel method to graft a single DNA molecule onto a force cantilever using the optical tweezer and local heating. With this method, a force-versus-extension assay of double-stranded DNA was realized. The resolution was about 10 pN. To improve on this force-measurement resolution, a simple light-backscattering technique was developed and used to probe the DNA polymer's flexibility and its fluctuations. It combined the optical tweezer, to trap a DNA-tethered bead, and laser backscattering, to detect the bead's Brownian fluctuations. With this technique the resolution was about 0.1 pN with a millisecond access time, and the whole entropic part of the DNA force-extension curve was measured. With this experimental strategy, we measured the polymerization of the protein RecA on an isolated double-stranded DNA. We observed the progressive decoration of RecA on the λ DNA molecule, which results in the extension of λ due to unwinding of the double helix. The dynamics of polymerization, the resulting change in the DNA entropic elasticity, and the role of ATP hydrolysis were the main parts of the study. A simple model for RecA assembly on DNA was proposed. This work presents a first step in the study of genetic recombination. Recently we have started a study of equilibrium binding which utilizes fluorescence polarization methods to probe the polymerization of RecA on single-stranded DNA. In addition to the study of the material properties of DNA and DNA-RecA, we have developed experiments for which the code of the DNA is central. We studied one aspect of DNA as a molecular code, using different techniques. In particular, the programmatic use of template specificity makes gene expression a prime example of a biological code. We developed a novel method of making DNA microarrays, the so-called DNA chip. Using the optical tweezer concept, we were able to pattern biomolecules on a solid substrate, developing a new type of sub-micron laser lithography. A laser beam is focused onto a thin gold film on a glass substrate. Laser ablation of the gold results in local aggregation of nanometer-scale beads conjugated with small DNA oligonucleotides, with sub-micron resolution. This leads to specific detection of cDNA and RNA molecules. We built a simple microarray fabrication and detection setup in the laboratory, based on this method, to probe addressable pools (genes, proteins, or antibodies). We have lately used molecular beacons (single-stranded DNA with a stem-loop structure containing a fluorophore and quencher) for the direct detection of unlabelled mRNA. As a first step towards a study of the dynamics of the biological code, we have begun to examine the patterns of gene expression during virus (T7 phage) infection of E. coli bacteria.
Code for Multiblock CFD and Heat-Transfer Computations
NASA Technical Reports Server (NTRS)
Fabian, John C.; Heidmann, James D.; Lucci, Barbara L.; Ameri, Ali A.; Rigby, David L.; Steinthorsson, Erlendur
2006-01-01
The NASA Glenn Research Center General Multi-Block Navier-Stokes Convective Heat Transfer Code, Glenn-HT, has been used extensively to predict heat transfer and fluid flow for a variety of steady gas turbine engine problems. Recently, the Glenn-HT code has been completely rewritten in Fortran 90/95, a more object-oriented language that allows programmers to create code that is more modular and makes more efficient use of data structures. The new implementation takes full advantage of the capabilities of the Fortran 90/95 programming language. As a result, the Glenn-HT code now provides dynamic memory allocation, modular design, and unsteady flow capability. This allows for the heat-transfer analysis of a full turbine stage. The code has been demonstrated for an unsteady inflow condition, and gridding efforts have been initiated for a full turbine stage unsteady calculation. This analysis will be the first to simultaneously include the effects of rotation, blade interaction, film cooling, and tip clearance with a recessed tip on turbine heat transfer and cooling performance. Future plans call for the application of the new Glenn-HT code to a range of gas turbine engine problems of current interest to the heat-transfer community. The new unsteady flow capability will allow researchers to predict the effect of unsteady flow phenomena upon the convective heat transfer of turbine blades and vanes. Work will also continue on the development of conjugate heat-transfer capability in the code, where simultaneous solution of convective and conductive heat-transfer domains is accomplished. Finally, advanced turbulence and fluid flow models and automatic gridding techniques are being developed that will be applied to the Glenn-HT code and solution process.
NASA's mobile satellite development program
NASA Technical Reports Server (NTRS)
Rafferty, William; Dessouky, Khaled; Sue, Miles
1988-01-01
A Mobile Satellite System (MSS) will provide data and voice communications over a vast geographical area to a large population of mobile users. A technical overview is given of the extensive research and development studies performed under NASA's mobile satellite program (MSAT-X) in support of the introduction of a U.S. MSS. The critical technologies necessary to enable such a system are emphasized: vehicle antennas, modulation and coding, speech coders, networking, and propagation characterization. Also proposed are first- and future-generation MSS architectures based upon realized ground-segment equipment and advanced space-segment studies.
Flexible Generation of Kalman Filter Code
NASA Technical Reports Server (NTRS)
Richardson, Julian; Wilson, Edward
2006-01-01
Domain-specific program synthesis can automatically generate high-quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either (1) extension of the code generator, which is usually very expensive, or (2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs that can be generated. Users augment the input specification with a specification of code fragments and of how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce estimates comparable to those produced by a hand-coded estimator.
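The abstract describes the filters AUTOFILTER generates but gives no code. As a point of reference, a minimal discrete-time linear Kalman filter predict/update step of the general kind such a system synthesizes might look like the following sketch (Python, with hypothetical names; this is not AUTOFILTER's output format):

import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle: state x, covariance P, measurement z."""
    # Predict: propagate state and covariance through the dynamics model F.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fold in measurement z via the Kalman gain K.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: 1D constant-velocity model with position-only measurements.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[0.01]])
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, np.array([0.05]), F, H, Q, R)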
1989-06-23
Comparison between MACH and POLAR ... Flow Chart for VSTS Algorithm ... The most recent changes are: (a) development of the VSTS (velocity space topology search) algorithm for calculating particle densities; (b) extension ... with simple analytic models. The largest modification of the MACH code was the implementation of the VSTS procedure, which constituted a complete
On-Chip Transport of Biological Fluids in MEMS Devices
1999-02-01
This model has been extended to multi-dimensional geometries to simulate electroosmotic flow in microdevices. An electrophoresis model in CFD-ACE+ will ... integrated with CFD-ACE+. ... An electroosmosis model has been developed and successfully integrated with the CFD-ACE+ code. (ii) Extension of the above-mentioned model to simulate
Approaches and possible improvements in the area of multibody dynamics modeling
NASA Technical Reports Server (NTRS)
Lips, K. W.; Singh, R.
1987-01-01
A wide-ranging look is taken at issues involved in the dynamic modeling of complex, multibodied orbiting space systems. Capabilities and limitations of two major codes (DISCOS, TREETOPS) are assessed and possible extensions to the CONTOPS software are outlined. In addition, recommendations are made concerning the direction future development should take in order to achieve higher fidelity, more computationally efficient multibody software solutions.
Multiple Neuropeptide-Coding Genes Involved in Planarian Pharynx Extension.
Shimoyama, Seira; Inoue, Takeshi; Kashima, Makoto; Agata, Kiyokazu
2016-06-01
Planarian feeding behavior involves three steps: moving toward food, extending the pharynx from the planarian's ventral side after arriving at the food, and ingesting the food through the pharynx. Although pharynx extension is a remarkable behavior, it remains unknown what neuronal cell types are involved in its regulation. To identify neurons involved in regulating pharynx extension, we quantitatively analyzed pharynx extension and sought to identify these neurons by RNA interference (RNAi) and in situ hybridization. This assay, when performed using planarians with amputation of various body parts, clearly showed that the head portion is indispensable for inducing pharynx extension. We thus tested the effects of knockdown of brain neurons such as serotonergic, GABAergic, and dopaminergic neurons by RNAi, but did not observe any effects on pharynx extension behavior. However, animals with RNAi of the Prohormone Convertase 2 (PC2, a neuropeptide processing enzyme) gene did not perform the pharynx extension behavior, suggesting the possible involvement of neuropeptide(s) in the regulation of pharynx extension. We screened 24 neuropeptide-coding genes, analyzed their functions by RNAi using the pharynx extension assay system, and identified at least five neuropeptide genes involved in pharynx extension. These were expressed in different cells or neurons, and some of them were expressed in the brain, suggesting complex regulation of planarian feeding behavior by the nervous system.
NASA Astrophysics Data System (ADS)
Scholten, O.; Trinh, T. N. G.; de Vries, K. D.; Hare, B. M.
2018-01-01
The radio intensity and polarization footprint of a cosmic-ray induced extensive air shower is determined by the time-dependent structure of the current distribution residing in the plasma cloud at the shower front. In turn, the time dependence of the integrated charge-current distribution in the plasma cloud, the longitudinal shower structure, is determined by interesting physics which one would like to extract, such as the location and multiplicity of the primary cosmic-ray collision or the values of electric fields in the atmosphere during thunderstorms. To extract the structure of a shower from its footprint requires solving a complicated inverse problem. For this purpose we have developed a code that semianalytically calculates the radio footprint of an extensive air shower given an arbitrary longitudinal structure. This code can be used in an optimization procedure to extract the optimal longitudinal shower structure given a radio footprint. On the basis of air-shower universality we propose a simple parametrization of the structure of the plasma cloud. This parametrization is based on the results of Monte Carlo shower simulations. Deriving the parametrization also teaches which aspects of the plasma cloud are important for understanding the features seen in the radio-emission footprint. The calculated radio footprints are compared with microscopic CoREAS simulations.
78 FR 59265 - FD&C Yellow No. 5; Exemption From the Requirement of a Tolerance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-26
... (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). ... FD&C Yellow No. 5 is an FDA permanently listed color additive used in food, drugs, and cosmetics, including drugs and cosmetics for the eye area. FDA's color additive evaluation included the consideration of an extensive set ...
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; Fales, Carl L.
1990-01-01
Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
The systems resilience research community has developed methods to manually insert additional source-program-level assertions to trap errors, and has also devised tools to conduct fault injection studies for scalar program codes. In this work, we contribute the first vector-oriented LLVM-level fault injector, VULFI, to help study the effects of faults in vector architectures that are of growing importance, especially for vectorizing loops. Using VULFI, we conduct a resiliency study of nine real-world vector benchmarks using Intel's AVX and SSE extensions as the target vector instruction sets, and offer the first reported understanding of how faults affect vector instruction sets. We take this work further toward automating the insertion of resilience assertions during compilation. This is based on our observation that during intermediate (e.g., LLVM-level) code generation to handle full and partial vectorization, modern compilers exploit (and explicate in their code documentation) critical invariants. These invariants are turned into error-checking code. We confirm the efficacy of these automatically inserted low-overhead error detectors for vectorized for-loops.
Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry
NASA Technical Reports Server (NTRS)
Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.
Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A
2016-06-01
A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
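Neither abstract includes code, but the convergence idea — systematically increasing the number of discrete rays until a shielding quantity stabilizes — can be illustrated with a toy sketch (Python; the smooth "shield thickness" function below is a hypothetical stand-in, not the 3DHZETRN vehicle geometry):

import numpy as np

def shield_thickness(direction):
    # Hypothetical shield: slightly thicker toward +z (illustration only).
    return 10.0 + 5.0 * direction[2]          # areal density along this ray, g/cm^2

def mean_thickness(n_rays, rng):
    # Sample ray directions uniformly over the unit sphere.
    v = rng.normal(size=(n_rays, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return np.mean([shield_thickness(d) for d in v])

rng = np.random.default_rng(0)
for n in (10, 100, 1000, 10000):
    # The estimate converges toward the true mean (10 g/cm^2) as rays increase.
    print(n, mean_thickness(n, rng))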
BBC users manual. [In LRLTRAN for CDC 7600 and STAR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ltterst, R. F.; Sutcliffe, W. G.; Warshaw, S. I.
1977-11-01
BBC is a two-dimensional, multifluid Eulerian hydro-radiation code based on KRAKEN and some subsequent ideas. It was developed in the explosion group in T-Division as a basic two-dimensional code to which various types of physics can be added. For this reason BBC is a FORTRAN (LRLTRAN) code. In order to gain the 2-to-1 to 4-to-1 speed advantage of the STACKLIB software on the 7600's and to be able to execute at high speed on the STAR, the vector extensions of LRLTRAN (STARTRAN) are used throughout the code. Either cylindrical- or slab-type problems can be run on BBC. The grid is bounded by a rectangular band of boundary zones. The interfaces between the regular and boundary zones can be selected to be either rigid or nonrigid. The setup for BBC problems is described in the KEG Manual and LEG Manual. The difference equations are described in BBC Hydrodynamics. Basic input and output for BBC are described.
NASA Astrophysics Data System (ADS)
Bruno, A.; Michalak, D. J.; Poletto, S.; Clarke, J. S.; Dicarlo, L.
Large-scale quantum computation hinges on the ability to preserve and process quantum information with higher fidelity by increasing redundancy in a quantum error correction code. We present the realization of a scalable footprint for a superconducting surface code based on planar circuit QED. We developed a tileable unit cell for the surface code with all I/O routed vertically by means of superconducting through-silicon vias (TSVs). We address some of the challenges encountered during the fabrication and assembly of these chips, such as the quality of the TSV etch, the uniformity of the ALD TiN coating conformal to the TSV, and the reliability of the superconducting indium contacts between the chips and the PCB. We compare measured performance to a detailed list of specifications required for the realization of quantum fault tolerance. Our demonstration using centimeter-scale chips can accommodate the 50 qubits needed to target the experimental demonstration of small-distance logical qubits. Research funded by Intel Corporation and IARPA.
New Web Server - the Java Version of Tempest - Produced
NASA Technical Reports Server (NTRS)
York, David W.; Ponyik, Joseph G.
2000-01-01
A new software design and development effort has produced a Java (Sun Microsystems, Inc.) version of the award-winning Tempest software (refs. 1 and 2). In 1999, the Embedded Web Technology (EWT) team received a prestigious R&D 100 Award for Tempest, Java Version. In this article, "Tempest" will refer to the Java version of Tempest, a World Wide Web server for desktop or embedded systems. Tempest was designed at the NASA Glenn Research Center at Lewis Field to run on any platform for which a Java Virtual Machine (JVM, Sun Microsystems, Inc.) exists. The JVM acts as a translator between the native code of the platform and the byte code of Tempest, which is compiled in Java. These byte code files are Java executables with a ".class" extension. Multiple byte code files can be zipped together as a "*.jar" file for more efficient transmission over the Internet. Today's popular browsers, such as Netscape (Netscape Communications Corporation) and Internet Explorer (Microsoft Corporation) have built-in Virtual Machines to display Java applets.
Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST
NASA Astrophysics Data System (ADS)
Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan
2018-04-01
We describe the algorithm for employing multi-GPU power on the basis of Message Passing Interface (MPI) domain decomposition in a molecular dynamics code, GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version is developed from our previous single-GPU version. In multi-GPU runs, each GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design that enlarges the maximum system size attainable on the same hardware. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on the simulation of Lennard-Jones liquid, dissipative particle dynamics liquid, polymer and nanoparticle composite, and two-patch particles on a workstation. Good scaling over many cluster nodes is presented for two-patch particles.
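As a rough illustration of the MPI domain-decomposition communication pattern described above, here is a minimal 1D periodic halo exchange using mpi4py (the particle arrays are fake stand-ins; GALAMOST's actual data structures and GPU kernels are not shown). Run with, e.g., mpiexec -n 4 python halo.py:

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size   # periodic neighbours

# Each rank owns some boundary-particle data near its domain edges (fake data).
send_right = np.full(8, float(rank))
send_left = np.full(8, float(rank))
recv_left = np.empty(8)
recv_right = np.empty(8)

# Exchange ghost/halo particles with both neighbours in one step each.
comm.Sendrecv(send_right, dest=right, recvbuf=recv_left, source=left)
comm.Sendrecv(send_left, dest=left, recvbuf=recv_right, source=right)

print(f"rank {rank}: ghosts from {left} and {right}:", recv_left[0], recv_right[0])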
Context-aware and locality-constrained coding for image categorization.
Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun
2014-01-01
Improving the coding strategy for BOF (Bag-of-Features) based feature design has drawn increasing attention in recent image categorization work. However, ambiguity in the coding procedure still impedes further development. In this paper, we introduce a Context-Aware and Locality-Constrained Coding (CALC) approach that uses context information for describing objects in a discriminative way. This is achieved by learning a word-to-word co-occurrence prior and imposing this context information on locality-constrained coding. First, the local context of each category is evaluated by learning a word-to-word co-occurrence matrix representing the spatial distribution of local features in a neighboring region. Then, the learned co-occurrence matrix is used for measuring the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and in context space, while introducing feature weights, is proposed. This novel coding strategy not only preserves semantic information in coding, but is also able to alleviate the noise distortion of each class. Extensive experiments on several available datasets (Scene-15, Caltech101, and Caltech256) are conducted to validate the superiority of our algorithm by comparing it with baselines and recently published methods. Experimental results show that our method significantly improves on the performance of the baselines and achieves comparable and even better performance than the state of the art.
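For readers unfamiliar with the locality-constrained coding core that CALC builds on, a minimal sketch in the spirit of LLC (Wang et al., CVPR 2010) follows; the context-aware co-occurrence weighting that distinguishes CALC is deliberately omitted, and the codebook here is a random stand-in:

import numpy as np

def lcc_encode(x, B, K=5, lam=1e-4):
    """x: (d,) feature; B: (M, d) codebook; returns a sparse code of length M."""
    # Restrict coding to the K nearest code words (locality constraint).
    idx = np.argsort(np.linalg.norm(B - x, axis=1))[:K]
    z = B[idx] - x                       # shifted local basis
    C = z @ z.T + lam * np.eye(K)        # regularized local covariance
    c = np.linalg.solve(C, np.ones(K))
    c /= c.sum()                         # enforce the sum-to-one constraint
    code = np.zeros(B.shape[0])
    code[idx] = c
    return code

rng = np.random.default_rng(0)
B = rng.normal(size=(64, 16))            # hypothetical 64-word codebook
x = rng.normal(size=16)
print(lcc_encode(x, B)[:10])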
NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities
NASA Technical Reports Server (NTRS)
Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.
2015-01-01
Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists, with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in it is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework, which provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.
Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators
NASA Astrophysics Data System (ADS)
Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.
2018-03-01
We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ(2)-interaction based quantum computation in multimode Fock bases: the χ(2) parity-check code, the χ(2) embedded error-correcting code, and the χ(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N^2) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ(2) parity-check code and the χ(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations, and their universal encoded-basis gate sets.
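The parity-based photon-loss detection common to these codes can be checked numerically in a truncated Fock space. The following sketch (assuming a binomial-type code word with even photon-number support; the χ(2) recovery operations themselves are not modeled) verifies that a single photon loss flips the parity syndrome:

import numpy as np

N = 8                                          # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, N)), k=1)     # annihilation operator: a|n> = sqrt(n)|n-1>
parity = np.diag((-1.0) ** np.arange(N))       # photon-number parity operator

def fock(n):
    v = np.zeros(N); v[n] = 1.0
    return v

# Binomial-code-like logical word supported on even photon numbers.
logical0 = (fock(0) + fock(4)) / np.sqrt(2)

before = logical0 @ parity @ logical0          # +1: even parity, no error
lost = a @ logical0
lost /= np.linalg.norm(lost)                   # state after one photon loss
after = lost @ parity @ lost                   # -1: odd parity flags the loss
print(before, after)                           # -> 1.0 -1.0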
Collisional tests and an extension of the TEMPEST continuum gyrokinetic code
NASA Astrophysics Data System (ADS)
Cohen, R. H.; Dorr, M.; Hittinger, J.; Kerbel, G.; Nevins, W. M.; Rognlien, T.; Xiong, Z.; Xu, X. Q.
2006-04-01
An important requirement of a kinetic code for edge plasmas is the ability to accurately treat the effect of collisions over a broad range of collisionalities. To test the interaction of collisions and parallel streaming, TEMPEST has been compared with published analytic and numerical (Monte Carlo, bounce-averaged Fokker-Planck) results for endloss of particles confined by combined electrostatic and magnetic wells. Good agreement is found over a wide range of collisionality, confining potential, and mirror ratio, and the required velocity-space resolution is modest. We also describe progress toward extension of the (4-dimensional) TEMPEST into a "kinetic edge transport code" (a kinetic counterpart of UEDGE). The extension includes averaging of the gyrokinetic equations over fast timescales and approximating the averaged quadratic terms by diffusion terms which respect the boundaries of inaccessible regions in phase space. F. Najmabadi, R.W. Conn and R.H. Cohen, Nucl. Fusion 24, 75 (1984); T.D. Rognlien and T.A. Cutler, Nucl. Fusion 20, 1003 (1980).
Inverse modeling of InSAR and ground leveling data for 3D volumetric strain distribution
NASA Astrophysics Data System (ADS)
Gallardo, L. A.; Glowacka, E.; Sarychikhina, O.
2015-12-01
The wide availability of modern Interferometric Synthetic Aperture Radar (InSAR) data has made possible the extensive observation of differential surface displacements, and InSAR is becoming an efficient tool for the detailed monitoring of terrain subsidence associated with reservoir dynamics, volcanic deformation, and active tectonism. Unfortunately, this increasing popularity has not been matched by the availability of automated codes to estimate underground deformation, since many of them still rely on trial-and-error subsurface model-building strategies. We posit that an efficient algorithm for the volumetric modeling of differential surface displacements should match the availability of current leveling and InSAR data, and we have developed an algorithm for the joint inversion of ground leveling and dInSAR data in 3D. We assume the ground displacements originate from a stress-free volume strain distribution in a homogeneous elastic medium and determine the displacement field associated with an ensemble of rectangular prisms. This formulation is then used to develop a 3D conjugate gradient inversion code that searches for the three-dimensional distribution of volumetric strains that predicts the InSAR and leveling surface displacements simultaneously. The algorithm is regularized by applying discontinuous first- and zero-order Tikhonov constraints. For efficiency, the resulting computational code takes advantage of the convolution integral associated with the deformation field and basic multithreading parallelization. We extensively test our algorithm on test and field leveling and InSAR data from northwestern Mexico and compare the results to feasible geological scenarios of underground deformation.
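At its core, such an inversion amounts to a conjugate-gradient solution of Tikhonov-regularized normal equations. A minimal sketch follows, with a random matrix standing in for the elastic Green's functions of the prism ensemble (not the authors' kernels), and only the zero-order regularization term:

import numpy as np

def cg_solve(apply_A, b, n_iter=200, tol=1e-8):
    # Plain conjugate gradient for the symmetric positive-definite system A m = b.
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(1)
n_obs, n_prisms = 200, 80
G = rng.normal(size=(n_obs, n_prisms))          # hypothetical Green's-function matrix
m_true = np.zeros(n_prisms); m_true[30:40] = -1e-4
d = G @ m_true + 1e-6 * rng.normal(size=n_obs)  # noisy "InSAR + leveling" data

beta = 1.0                                      # zero-order Tikhonov weight
apply_A = lambda m: G.T @ (G @ m) + beta * m    # regularized normal operator
m_est = cg_solve(apply_A, G.T @ d)
print(np.linalg.norm(m_est - m_true))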
MPI-Defrost: Extension of Defrost to MPI-based Cluster Environment
NASA Astrophysics Data System (ADS)
Amin, Mustafa A.; Easther, Richard; Finkel, Hal
2011-06-01
MPI-Defrost extends Frolov's Defrost to an MPI-based cluster environment. This version has been restricted to a single field. Restoring two-field support should be straightforward, but will require some code changes. Some output options may also not be fully supported under MPI. This code was produced to support our own work, and has been made available for the benefit of anyone interested in either oscillon simulations or an MPI-capable version of Defrost, and it is provided on an "as-is" basis. Andrei Frolov is the primary developer of Defrost and we thank him for placing his work under the GPL (GNU Public License), and thus allowing us to distribute this modified version.
New Directions in Giant Planet Formation
NASA Astrophysics Data System (ADS)
Youdin, Andrew
The proposed research will explore the limits of the core accretion mechanism for forming giant planets, both in terms of timescale and orbital distance. This theoretical research will be useful in interpreting the results of ongoing exoplanet searches. The effects of radiogenic heating and aerodynamic accretion of pebbles and boulders will be included in time-dependent models of atmospheric structure and growth. To investigate these issues, we will develop and publicly share a protoplanet atmospheric evolution code as an extension of the MESA stellar evolution code. By focusing on relevant processes in the early stages of giant planet formation, we can refine model predictions for exoplanet searches at a wide range of stellar ages and distances from the host star.
Integrated modelling framework for short pulse high energy density physics experiments
NASA Astrophysics Data System (ADS)
Sircombe, N. J.; Hughes, S. J.; Ramsay, M. G.
2016-03-01
Modelling experimental campaigns on the Orion laser at AWE, and developing a viable point-design for fast ignition (FI), call for a multi-scale approach; a complete description of the problem would require an extensive range of physics which cannot realistically be included in a single code. For modelling the laser-plasma interaction (LPI) we need a fine mesh which can capture the dispersion of electromagnetic waves, and a kinetic model for each plasma species. In the dense material of the bulk target, away from the LPI region, collisional physics dominates. The transport of hot particles generated by the action of the laser depends on their slowing and stopping in the dense material and on their need to draw a return current. These effects will heat the target, which in turn influences transport. On longer timescales, the hydrodynamic response of the target will begin to play a role as the pressure generated by isochoric heating begins to take effect. Recent effort at AWE [1] has focussed on the development of an integrated code suite based on: the particle-in-cell code EPOCH, to model LPI; the Monte-Carlo electron transport code THOR, to model the onward transport of hot electrons; and the radiation hydrodynamics code CORVUS, to model the hydrodynamic response of the target. We outline the methodology adopted, elucidate the advantages of a robustly integrated code suite compared to a single-code approach, demonstrate the integrated code suite's application to modelling the heating of buried layers on Orion, and assess the potential of such experiments for the validation of modelling capability in advance of more ambitious HEDP experiments, as a step towards a predictive modelling capability for FI.
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.
Xu, Xing; Shen, Fumin; Yang, Yang; Shen, Heng Tao; Li, Xuelong
2017-05-01
Hashing-based methods have attracted considerable attention for efficient cross-modal retrieval on large-scale multimedia data. The core problem of cross-modal hashing is how to learn compact binary codes that capture the underlying correlations between heterogeneous features from different modalities. A majority of recent approaches aim at learning hash functions that preserve the pairwise similarities defined by given class labels. However, these methods fail to explicitly explore the discriminative property of class labels during hash function learning. In addition, they usually discard the discrete constraints imposed on the to-be-learned binary codes, and compromise by solving a relaxed problem with quantization to obtain an approximate binary solution. Therefore, the binary codes generated by these methods are suboptimal and less discriminative between classes. To overcome these drawbacks, we propose a novel cross-modal hashing method, termed discrete cross-modal hashing (DCH), which directly learns discriminative binary codes while retaining the discrete constraints. Specifically, DCH learns modality-specific hash functions for generating unified binary codes, and these binary codes are viewed as representative features for discriminative classification with class labels. An effective discrete optimization algorithm is developed for DCH to jointly learn the modality-specific hash functions and the unified binary codes. Extensive experiments on three benchmark data sets highlight the superiority of DCH under various cross-modal scenarios and show its state-of-the-art performance.
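To make the retrieval setting concrete, here is a minimal sketch of cross-modal retrieval in a shared Hamming space; the modality-specific projections are random stand-ins, whereas DCH learns them discretely under class supervision:

import numpy as np

rng = np.random.default_rng(0)
d_img, d_txt, nbits = 512, 300, 32
W_img = rng.normal(size=(d_img, nbits))   # hypothetical learned projection (image side)
W_txt = rng.normal(size=(d_txt, nbits))   # hypothetical learned projection (text side)

def hash_codes(X, W):
    return (X @ W > 0).astype(np.uint8)   # sign quantization -> {0,1} bits

def hamming(q, B):
    return (q ^ B).sum(axis=1)            # Hamming distance to every database code

imgs = rng.normal(size=(1000, d_img))     # database: image modality
query = rng.normal(size=d_txt)            # query: text modality
db = hash_codes(imgs, W_img)
q = hash_codes(query[None, :], W_txt)[0]
top10 = np.argsort(hamming(q, db))[:10]   # cross-modal nearest neighbours
print(top10)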
Sparse representation-based image restoration via nonlocal supervised coding
NASA Astrophysics Data System (ADS)
Li, Ao; Chen, Deyun; Sun, Guanglu; Lin, Kezheng
2016-10-01
Sparse representation (SR) and nonlocal techniques (NLT) have shown great potential in low-level image processing. However, due to the degradation of the observed image, SR and NLT may not be accurate enough to obtain faithful restoration results when used independently. To improve performance, this paper proposes a nonlocal supervised coding strategy for NLT-based image restoration. The novel method has three main contributions. First, to exploit the useful nonlocal patches, a nonnegative sparse representation is introduced, whose coefficients can be utilized as supervised weights among patches. Second, a novel objective function is proposed, which integrates supervised weight learning and nonlocal sparse coding to guarantee a more promising solution. Finally, to make the minimization tractable and convergent, a numerical scheme based on iterative shrinkage thresholding is developed to solve the resulting underdetermined inverse problem. Extensive experiments validate the effectiveness of the proposed method.
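The iterative shrinkage-thresholding scheme the authors build on can be illustrated in a few lines. The following minimal ISTA sketch solves the plain l1 sparse-coding subproblem min_c 0.5*||x - Dc||^2 + lam*||c||_1 with a random dictionary, without the paper's nonlocal supervised weights:

import numpy as np

def ista(x, D, lam=0.1, n_iter=200):
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ c - x)                # gradient of the data-fidelity term
        u = c - g / L                        # gradient step
        c = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)  # soft threshold
    return c

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)               # unit-norm dictionary atoms
c_true = np.zeros(256); c_true[[3, 50, 200]] = (1.0, -0.5, 0.8)
x = D @ c_true + 0.01 * rng.normal(size=64)
c = ista(x, D)
print(np.nonzero(np.abs(c) > 0.05)[0])       # approximately recovers the sparse support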
A numerical algorithm for MHD of free surface flows at low magnetic Reynolds numbers
NASA Astrophysics Data System (ADS)
Samulyak, Roman; Du, Jian; Glimm, James; Xu, Zhiliang
2007-10-01
We have developed a numerical algorithm and computational software for the study of magnetohydrodynamics (MHD) of free surface flows at low magnetic Reynolds numbers. The governing system of equations is a coupled hyperbolic-elliptic system in moving and geometrically complex domains. The numerical algorithm employs the method of front tracking and the Riemann problem for material interfaces, second order Godunov-type hyperbolic solvers, and the embedded boundary method for the elliptic problem in complex domains. The numerical algorithm has been implemented as an MHD extension of FronTier, a hydrodynamic code with free interface support. The code is applicable for numerical simulations of free surface flows of conductive liquids or weakly ionized plasmas. The code has been validated through the comparison of numerical simulations of a liquid metal jet in a non-uniform magnetic field with experiments and theory. Simulations of the Muon Collider/Neutrino Factory target have also been discussed.
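As a minimal illustration of the Godunov-type flux-differencing structure mentioned above (FronTier's actual solvers are second order and include front tracking, which is not shown), consider first-order upwind advection of a square pulse in 1D with periodic boundaries:

import numpy as np

def godunov_advection(u, a, dx, dt, steps):
    # First-order Godunov scheme for u_t + a u_x = 0 with a > 0:
    # the interface Riemann problem reduces to the upwind flux F = a*u_left.
    for _ in range(steps):
        flux = a * u
        u = u - (dt / dx) * (flux - np.roll(flux, 1))
    return u

nx = 200
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u0 = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # square pulse initial data
# CFL number a*dt/dx = 0.4 < 1, so the scheme is stable.
u = godunov_advection(u0.copy(), a=1.0, dx=1.0 / nx, dt=0.4 / nx, steps=250)
print(u.max(), u.argmax())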
Atomic Physics Effects on Convergent, Child-Langmuir Ion Flow between Nearly Transparent Electrodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santarius, John F.; Emmert, Gilbert A.
Research during this project at the University of Wisconsin Fusion Technology Institute (UW FTI) on ion and neutral flow through an arbitrary, monotonic potential difference created by nearly transparent electrodes accomplished the following: (1) developed and implemented an integral equation approach for atomic physics effects in helium plasmas; (2) extended the analysis to coupled integral equations that treat atomic and molecular deuterium ions and neutrals; (3) implemented the key deuterium and helium atomic and molecular cross sections; (4) added negative ion production and related cross sections; and (5) benchmarked the code against experimental results. The analysis and codes treat the species D0, D20, D+, D2+, D3+, D and, separately at present, He0 and He+. Extensions enhanced the analysis and related computer codes to include He++ ions plus planar and cylindrical geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bose, Benjamin; Koyama, Kazuya, E-mail: benjamin.bose@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk
We develop a code to produce the power spectrum in redshift space based on standard perturbation theory (SPT) at 1-loop order. The code can be applied to a wide range of modified gravity and dark energy models using a recently proposed numerical method by A. Taruya to find the SPT kernels. This includes Horndeski's theory with a general potential, which accommodates both chameleon and Vainshtein screening mechanisms and provides a non-linear extension of the effective theory of dark energy up to the third order. Focus is on a recent non-linear model of the redshift space power spectrum which has been shown to model the anisotropy very well at relevant scales for the SPT framework, as well as capturing relevant non-linear effects typical of modified gravity theories. We provide consistency checks of the code against established results and elucidate its application within the light of upcoming high precision RSD data.
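On large scales, any such code must reduce to the linear (Kaiser) limit. A minimal sketch of the Kaiser redshift-space monopole, P0(k) = (1 + 2*beta/3 + beta^2/5) * b^2 * P_lin(k), with a toy power-law linear spectrum (not a real cosmology, and not the paper's 1-loop SPT calculation):

import numpy as np

def kaiser_monopole(k, P_lin, b=2.0, f=0.5):
    # Tree-level redshift-space monopole for linear bias b and growth rate f.
    beta = f / b
    return (1.0 + 2.0 * beta / 3.0 + beta**2 / 5.0) * b**2 * P_lin(k)

P_lin = lambda k: 1e4 * (k / 0.05) ** (-1.5)   # toy linear power spectrum
k = np.logspace(-2, 0, 5)
print(kaiser_monopole(k, P_lin))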
Long non-coding RNAs and their biological roles in plants.
Liu, Xue; Hao, Lili; Li, Dayong; Zhu, Lihuang; Hu, Songnian
2015-06-01
With the development of genomics and bioinformatics, especially the extensive applications of high-throughput sequencing technology, more transcriptional units with little or no protein-coding potential have been discovered. Such RNA molecules are called non-protein-coding RNAs (npcRNAs or ncRNAs). Among them, long npcRNAs or ncRNAs (lnpcRNAs or lncRNAs) represent diverse classes of transcripts longer than 200 nucleotides. In recent years, lncRNAs have been considered as important regulators in many essential biological processes. In plants, although a large number of lncRNA transcripts have been predicted and identified in a few species, our current knowledge of their biological functions is still limited. Here, we have summarized recent studies on their identification, characteristics, classification, bioinformatics, resources, and current exploration of their biological functions in plants.
Kraus, Wayne A; Wagner, Albert F
1986-04-01
A triatomic classical trajectory code has been modified by extensive vectorization of the algorithms to achieve much improved performance on an FPS 164 attached processor. Extensive timings on both the FPS 164 and a VAX 11/780 with floating point accelerator are presented as a function of the number of trajectories simultaneously run. The timing tests involve a potential energy surface of the LEPS variety and trajectories with 1000 time steps. The results indicate that vectorization results in timing improvements on both the VAX and the FPS. For larger numbers of trajectories run simultaneously, up to a factor of 25 improvement in speed occurs between VAX and FPS vectorized code.
Support of Multidimensional Parallelism in the OpenMP Programming Model
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Jost, Gabriele
2003-01-01
OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test with benchmark codes and a cloud modeling code.
A Network Coding Based Hybrid ARQ Protocol for Underwater Acoustic Sensor Networks
Wang, Hao; Wang, Shilian; Zhang, Eryang; Zou, Jianbin
2016-01-01
Underwater Acoustic Sensor Networks (UASNs) have attracted increasing interest in recent years due to their extensive commercial and military applications. However, the harsh underwater channel poses many challenges for the design of a reliable underwater data transport protocol. In this paper, we propose an energy-efficient data transport protocol based on network coding and hybrid automatic repeat request (NCHARQ) to ensure reliability, efficiency, and availability in UASNs. Moreover, an adaptive window-length estimation algorithm is designed to optimize the throughput and energy consumption tradeoff. The algorithm adaptively changes the code rate and is insensitive to environmental change. Extensive simulations and analysis show that NCHARQ significantly reduces energy consumption with short end-to-end delay. PMID:27618044
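The network-coding idea at the heart of an NCHARQ-style protocol can be shown in a few lines: one XOR-coded retransmission repairs different losses at different receivers. Window management and the acoustic channel model are omitted; the packets are fake data:

import numpy as np

rng = np.random.default_rng(0)
p1 = rng.integers(0, 256, 32, dtype=np.uint8)   # packet 1
p2 = rng.integers(0, 256, 32, dtype=np.uint8)   # packet 2
coded = p1 ^ p2                                  # single coded retransmission

# Receiver A got p2 but lost p1; receiver B got p1 but lost p2.
recovered_p1_at_A = coded ^ p2
recovered_p2_at_B = coded ^ p1
assert np.array_equal(recovered_p1_at_A, p1)
assert np.array_equal(recovered_p2_at_B, p2)
print("both receivers recovered their missing packet from one retransmission")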
NASA Technical Reports Server (NTRS)
Hague, D. S.; Rozendaal, H. L.
1977-01-01
Program NSEG is a rapid mission analysis code based on the use of approximate flight path equations of motion. Equation form varies with the segment type, for example, accelerations, climbs, cruises, descents, and decelerations. Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. For example, rate-of-climb, turn rates, and energy maneuverability parameter values may be mapped in the Mach-altitude plane. Approximate takeoff and landing analyses are also performed. At high speeds, centrifugal lift effects are accounted for. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
Cloud based, Open Source Software Application for Mitigating Herbicide Drift
NASA Astrophysics Data System (ADS)
Saraswat, D.; Scott, B.
2014-12-01
The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced or disappearing due to mischief or severe windstorms/thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flacco, A.; Fairchild, M.; Reiche, S.
2004-12-07
The coherent radiation emitted by electrons in high-brightness beam-based experiments is important from the viewpoints of both radiation source development and the understanding and diagnosing of the basic physical processes important in beam manipulations at high intensity. While much theoretical work has been developed to aid in calculating aspects of this class of radiation, these methods do not often produce accurate information concerning the experimentally relevant aspects of the radiation. At UCLA, we are particularly interested in coherent synchrotron radiation and the related phenomenon of coherent edge radiation, in the context of a fs-beam chicane compression experiment at the BNL ATF. To analyze this and related problems, we have developed a program that acts as an extension to the Lienard-Wiechert-based 3D simulation code TREDI, termed FieldEye. This program allows the evaluation of electromagnetic fields in the time and frequency domain in an arbitrary 2D planar detector area. We discuss here the implementation of the FieldEye code, and give examples of results relevant to the case of the ATF chicane compressor experiment.
A resistive magnetohydrodynamics solver using modern C++ and the Boost library
NASA Astrophysics Data System (ADS)
Einkemmer, Lukas
2016-09-01
In this paper we describe the implementation of our C++ resistive magnetohydrodynamics solver. The framework developed facilitates the separation of the code implementing the specific numerical method and the physical model from the handling of boundary conditions and the management of the computational domain. In particular, this allows us to use finite difference stencils which are only defined in the interior of the domain (the boundary conditions are handled automatically). We discuss this and other design considerations and their impact on performance in some detail. In addition, we provide documentation of the code developed and demonstrate that performance comparable to Fortran can be achieved, while still maintaining a maximum of code readability and extensibility.
Catalogue identifier: AFAH_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFAH_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 592774
No. of bytes in distributed program, including test data, etc.: 43771395
Distribution format: tar.gz
Programming language: C++03
Computer: PC, HPC systems
Operating system: POSIX compatible (extensively tested on various Linux systems). In fact only the timing class requires POSIX routines; all other parts of the program can be run on any system where a C++ compiler, Boost, CVODE, and an implementation of BLAS are available.
RAM: hundreds of kilobytes to gigabytes (depending on the problem size)
Classification: 19.10, 4.3
External routines: Boost, CVODE, either a BLAS library or Intel MKL
Nature of problem: An approximate solution to the equations of resistive magnetohydrodynamics for a given initial value and given boundary conditions is computed.
Solution method: The discretization is performed using a finite difference approximation in space and the CVODE library in time (which employs a scheme based on the backward differentiation formulas).
Restrictions: We consider the 2.5-dimensional case; that is, the magnetic field and the velocity field are three dimensional but all quantities depend only on x and y (but not z).
Unusual features: We provide an implementation in C++ using the Boost library that combines high-level techniques (which greatly increase code maintainability and extensibility) with performance that is comparable to Fortran implementations.
Running time: from seconds to weeks (depending on the problem size)
MPD thruster research issues, activities, strategies
NASA Technical Reports Server (NTRS)
1991-01-01
The following activities and plans in MPD thruster development are summarized: (1) experimental and theoretical research (magnetic nozzles at present and high power levels, MPD thrusters with applied fields extending into the thrust chamber, and improved electrode performance); and (2) tools (the MACH2 code for MPD and nozzle flow calculation, laser diagnostics and spectroscopy for non-intrusive measurements of flow conditions, and extension to higher power). National strategies are also outlined.
Facilities and the Air Force Systems Acquisition Process.
1985-05-01
... The Air Force is in the midst of its most extensive peacetime force modernization programs ... will answer the following questions: a. Are facility requirements anticipated and adequately scoped during the early phase of program development so
Superconducting Cavity Development for Free Electron Lasers.
1986-06-30
effects have been modeled extensively using the code PARMELA, including finite space charge. The conflict is resolved through the use of harmonically...depends on the specifics of how the whole accelerator is run, i.e., bunch length, interpulse spacing, macrobunch length, charge per bunch, external...this indicates that the bunch length should be as long as possible. 2.4 OPTIMUM BUNCH LENGTH Although wakefield, HOM excitation and space charge
NASA Astrophysics Data System (ADS)
Beauchamp, James W.
2002-11-01
Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones, or successions of them, and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder ("pvan") and a frequency-tracking analyzer ("mqan"), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint "tracks." However, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. "monan" is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments and serving as a basis for developing new synthesis algorithms.
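The following minimal sketch (illustrative Python, not part of sndan) shows the kind of frame-by-frame harmonic amplitude estimate a fixed-filter-bank analyzer like pvan produces; the test signal, window, and hop size are assumptions:

```python
# Illustrative sketch (not sndan itself): fixed-filter-bank style analysis of
# one harmonic of a tone with known fundamental f0, in the spirit of 'pvan'.
# Frame-by-frame, the amplitude of harmonic k is estimated from the windowed
# DFT evaluated at the harmonic frequency.
import numpy as np

fs, f0, k = 44100, 440.0, 2          # sample rate, fundamental, harmonic index
t = np.arange(fs) / fs
signal = 0.5 * np.sin(2 * np.pi * f0 * t) + 0.2 * np.sin(2 * np.pi * 2 * f0 * t)

hop, win = 512, np.hanning(2048)
amps = []
for start in range(0, len(signal) - len(win), hop):
    frame = signal[start:start + len(win)] * win
    n = np.arange(len(frame))
    # single-bin DFT at the k-th harmonic (a one-filter "filter bank")
    bin_kf0 = np.sum(frame * np.exp(-2j * np.pi * k * f0 * n / fs))
    amps.append(2 * np.abs(bin_kf0) / win.sum())
print(np.mean(amps))                  # ~0.2, the 2nd-harmonic amplitude
```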
Development of tools and techniques for momentum compression of fast rare isotopes
DOE Office of Scientific and Technical Information (OSTI.GOV)
David J. Morrissey; Bradley M. Sherrill; Oleg Tarasov
2010-11-21
As part of our past research and development work, we have created and developed the LISE++ simulation code [Tar04, Tar08]. The LISE++ package was significantly extended with the addition of a Monte Carlo option that includes an option for calculating ion trajectories using a Taylor-series expansion up to fifth order, and with the implementation of the MOTER Monte Carlo code [Kow87] for ray tracing of the ions into the suite of LISE++ codes. The MOTER code was rewritten from FORTRAN into C++ and ported to the MS-Windows operating system. Extensive work went into the creation of a user-friendly interface for the code. An example of the graphical user interface created for the MOTER code is shown in the left panel of Figure 1, and the results of a typical calculation for the trajectories of particles that pass through the A1900 fragment separator are shown in the right panel. The MOTER code is presently included as part of the LISE++ package for downloading without restriction by the worldwide community. LISE++ was extensively developed and generalized to apply to any projectile fragment separator during the early phase of this grant. In addition to the inclusion of the MOTER code, other important additions to the LISE++ code made during FY08/FY09 are listed. LISE++ is distributed over the web (http://groups.nscl.msu.edu/lise) and is available without charge to anyone by anonymous download; thus, the number of individual users is not recorded. The number of 'hits' on the servers that provide the LISE++ code is shown in Figure 3 for the last eight calendar years (left panel), along with the countries of the originating IP addresses (right panel). The data show an increase in web activity with the release of the new version of the program during the grant period, and a worldwide impact. An important part of the proposed work, carried out during FY07, FY08, and FY09 by a graduate student in the MSU Physics program, was to benchmark the codes by comparison of detailed measurements to the LISE++ predictions. A large data set was obtained for fission fragments from the reaction of 238U ions at 81 MeV/u in a 92 mg/cm2 beryllium target with the A1900 projectile fragment separator. The data were analyzed and form the bulk of a Ph.D. dissertation that is nearing completion. The rich data set provides a number of benchmarks for the improved LISE++ code, and only a few examples can be shown here. The primary information obtained from the measurements is the yield of the products as a function of mass, charge, and momentum. Examples of the momentum distributions of individually identified fragments can be seen in Figures 2 and 4, along with comparisons to the predicted distributions. The agreement is remarkably good and indicates the general validity of the model of the nuclear reactions producing these fragments and of the higher-order transmission calculations in the LISE++ code. The momentum distributions were integrated to provide the cross sections for the individual isotopes. As shown in Figure 5, there is good agreement with the model predictions, although the observed cross sections are a factor of five or so higher in this case. Other comparisons of measured production cross sections from abrasion-fission reactions have been published by our group working at the NSCL during this period [Fol09] and through our collaboration with Japanese researchers working at RIKEN with the BigRIPS separator [Ohn08, Ohn10].
The agreement of the model predictions with the data obtained with two different fragment separators is very good and indicates the usefulness of the new LISE++ code.
Scheduling observational and physical practice: influence on the coding of simple motor sequences.
Ellenbuerger, Thomas; Boutin, Arnaud; Blandin, Yannick; Shea, Charles H; Panzer, Stefan
2012-01-01
The main purpose of the present experiment was to determine the coordinate system used in the development of movement codes when observational and physical practice are scheduled across practice sessions. The task was to reproduce a 1,300-ms spatial-temporal pattern of elbow flexions and extensions. An intermanual transfer paradigm with a retention test and two effector (contralateral limb) transfer tests was used. The mirror effector transfer test required the same pattern of homologous muscle activation and sequence of limb joint angles as that performed or observed during practice, and the non-mirror effector transfer test required the same spatial pattern of movements as that performed or observed. The test results following the first acquisition session replicated the findings of Gruetzmacher, Panzer, Blandin, and Shea (2011). The results following the second acquisition session indicated a strong advantage for participants who received physical practice in both practice sessions or received observational practice followed by physical practice. This advantage was found on both the retention and the mirror transfer tests compared to the non-mirror transfer test. These results demonstrate that codes based in motor coordinates can be developed relatively quickly and effectively for a simple spatial-temporal movement sequence when participants are provided with physical practice or observation followed by physical practice, but physical practice followed by observational practice or observational practice alone limits the development of codes based in motor coordinates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miao, Yinbin; Mo, Kun; Jamison, Laura M.
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL) and aims at providing experimental data for the validation of the mesoscale simulation code MARMOT. MARMOT is a mesoscale multiphysics code that predicts the coevolution of microstructure and properties within reactor fuel during its lifetime in the reactor. It is an important component of the MOOSE-BISON-MARMOT (MBM) code suite that has been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. In order to ensure the accuracy of the microstructure-based materials models being developed within the MARMOT code, extensive validation efforts must be carried out. In this report, we summarize the experimental efforts in FY16, including the following important experiments: (1) in-situ grain growth measurement of nano-grained UO2; (2) investigation of surface morphology in micro-grained UO2; and (3) nano-indentation experiments on nano- and micro-grained UO2. The highlight of this year is that we have successfully demonstrated our capability to measure grain size development in situ while maintaining the stoichiometry of nano-grained UO2 materials; the experiment is, for the first time, using synchrotron X-ray diffraction to measure the grain growth behavior of UO2 in situ.
Algorithm and code development for unsteady three-dimensional Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Obayashi, Shigeru
1991-01-01
A streamwise upwind algorithm for solving the unsteady 3-D Navier-Stokes equations was extended to handle the moving grid system. It is noted that the finite volume concept is essential to extending the algorithm; the resulting algorithm is conservative for any motion of the coordinate system. Two implicit extensions were considered, and the one that makes the algorithm computationally efficient was implemented into Ames' aeroelasticity code, ENSAERO. The new flow solver has been validated through the solution of test problems, including three-dimensional problems with fixed and moving grids. The first test case is an unsteady viscous flow over an F-5 wing, while the second considers the motion of the leading-edge vortex as well as the motion of the shock wave for a clipped delta wing. The upwind version yields higher accuracy in both steady and unsteady computations than the previously used central-difference method, while the increase in computational time is small.
NPTFit: A Code Package for Non-Poissonian Template Fitting
NASA Astrophysics Data System (ADS)
Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R.
2017-06-01
We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ∼GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow searches for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, and thus likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit, and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
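A minimal usage sketch, adapted from the NPTFit online documentation, is shown below; the mock data, template, and prior ranges are illustrative placeholders, and a working installation of the package (with its MultiNest backend) is assumed:

```python
# Sketch of an NPTFit run, adapted from the package's documentation
# (http://nptfit.readthedocs.io); the data arrays and priors here are
# illustrative placeholders, not a physical analysis.
import numpy as np
from NPTFit import nptfit

npix = 1000
counts = np.random.poisson(1.0, npix).astype(np.int32)   # mock data
exposure = np.ones(npix)
iso = np.ones(npix)                                       # isotropic template

n = nptfit.NPTF(tag='minimal_example')
n.load_data(counts, exposure)
n.add_template(iso, 'iso')

# Poissonian (smooth) and non-Poissonian (unresolved point source) models
n.add_poiss_model('iso', '$A_\\mathrm{iso}$', [0, 2], False)
n.add_non_poiss_model('iso',
                      ['$A^\\mathrm{ps}$', '$n_1$', '$n_2$', '$S_b$'],
                      [[-6, 1], [2.05, 30], [-2, 1.95], [0.05, 40]],
                      [True, False, False, False])

n.configure_for_scan()
n.perform_scan(nlive=100)                                 # MultiNest scan
```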
Space station integrated wall design and penetration damage control
NASA Technical Reports Server (NTRS)
Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.
1987-01-01
The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system and is written in FORTRAN 77 with no VAX extensions. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines for constructing a FEM of the space station under consideration are described, as are the methods used to analyze the sensitivity of PNP to variations in design. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, and the data flow within the analysis is illustrated.
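The probabilistic core of such a calculation is a Poisson impact model; a minimal sketch follows, assuming a given penetrating flux per surface element (BUMPER's actual treatment integrates direction-dependent fluxes and shielding over the finite element model, which is omitted here):

```python
# Minimal sketch of the Poisson model underlying a probability-of-no-
# penetration calculation; BUMPER evaluates this over a finite element
# model with direction- and shielding-dependent fluxes, omitted here.
import math

def pnp(flux_per_m2_yr, area_m2, years):
    """Probability of no penetration for one surface element,
    assuming penetrating impacts arrive as a Poisson process."""
    expected_hits = flux_per_m2_yr * area_m2 * years
    return math.exp(-expected_hits)

# total PNP over several elements multiplies the element probabilities
elements = [(2.0e-5, 12.0), (5.0e-6, 40.0)]   # (penetrating flux, area)
total = math.prod(pnp(f, a, 10.0) for f, a in elements)
print(total)
```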
Recent progress in the analysis of iced airfoils and wings
NASA Technical Reports Server (NTRS)
Cebeci, Tuncer; Chen, Hsun H.; Kaups, Kalle; Schimke, Sue
1992-01-01
Recent work on the analysis of iced airfoils and wings is described. Ice shapes for multielement airfoils and wings are computed using an extension of the LEWICE code that was developed for single airfoils. The aerodynamic properties of the iced wing are determined with an interactive scheme in which the solutions of the inviscid flow equations are obtained from a panel method and the solutions of the viscous flow equations are obtained from an inverse three-dimensional finite-difference boundary-layer method. A new interaction law is used to couple the inviscid and viscous flow solutions. The newly developed LEWICE multielement code is applied to a high-lift configuration to calculate the ice shapes on the slat and on the main airfoil, and to a four-element airfoil. The application of the LEWICE wing code to the calculation of ice shapes on an MS-317 swept wing shows good agreement with measurements. The interactive boundary-layer method is applied to a tapered iced wing in order to study the effect of icing on the aerodynamic properties of the wing at several angles of attack.
FY2017 Updates to the SAS4A/SASSYS-1 Safety Analysis Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fanning, T. H.
The SAS4A/SASSYS-1 safety analysis software is used to perform deterministic analysis of anticipated events as well as design-basis and beyond-design-basis accidents for advanced fast reactors. It plays a central role in the analysis of U.S. DOE conceptual designs, proposed test and demonstration reactors, and in domestic and international collaborations. This report summarizes the code development activities that have taken place during FY2017. Extensions to the void and cladding reactivity feedback models have been implemented, and Control System capabilities have been improved through a new virtual data acquisition system for plant state variables and an additional Block Signal for a variable lag compensator to represent reactivity feedback for novel shutdown devices. Current code development and maintenance needs are also summarized in three key areas: software quality assurance, modeling improvements, and maintenance of related tools. With ongoing support, SAS4A/SASSYS-1 can continue to fulfill its growing role in fast reactor safety analysis and help solidify DOE's leadership role in fast reactor safety both domestically and in international collaborations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Robert Cameron; Steiner, Don
2004-06-15
The generation of runaway electrons during a thermal plasma disruption is a concern for the safe and economical operation of a tokamak power system. Runaway electrons have high energy, 10 to 300 MeV, and may potentially cause extensive damage to plasma-facing components (PFCs) through large temperature increases, melting of metallic components, surface erosion, and possible burnout of coolant tubes. The EPQ code system was developed to simulate the thermal response of PFCs to a runaway electron impact. The EPQ code system consists of several parts: UNIX scripts that control the operation of an electron-photon Monte Carlo code to calculate the interaction of the runaway electrons with the plasma-facing materials; a finite difference code to calculate the thermal response, melting, and surface erosion of the materials; a code to process, scale, transform, and convert the electron Monte Carlo data to volumetric heating rates for use in the thermal code; and several minor and auxiliary codes for the manipulation and postprocessing of the data. The electron-photon Monte Carlo code used was Electron-Gamma-Shower (EGS), developed and maintained by the National Research Council of Canada. The Quick-Therm-Two-Dimensional-Nonlinear (QTTN) thermal code solves the two-dimensional cylindrical modified heat conduction equation using the QUICKEST third-order accurate and stable explicit finite difference method and is capable of tracking melting or surface erosion. The EPQ code system is validated using a series of analytical solutions and simulations of experiments. The verification of the QTTN thermal code against analytical solutions shows that the code with the QUICKEST method is better than 99.9% accurate. The benchmarking of the EPQ code system and QTTN against experiments showed that QTTN's erosion-tracking method is accurate within 30% and that EPQ is able to predict the occurrence of melting within the proper time constraints. QTTN and EPQ are thus verified and validated as able to calculate the temperature distribution, phase change, and surface erosion successfully.
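For illustration of the explicit finite-difference structure of such a thermal solve, the following simplified one-dimensional sketch uses first-order FTCS differencing with a volumetric source; QTTN itself uses the third-order QUICKEST scheme on a two-dimensional cylindrical equation, and all values here are assumptions:

```python
# Simplified stand-in for an explicit finite-difference thermal solve:
# first-order FTCS on the 1-D heat equation with a volumetric source,
# illustrating the structure only (QTTN itself uses the third-order
# QUICKEST scheme on a 2-D cylindrical modified heat conduction equation).
import numpy as np

alpha, dx, dt = 1.0e-5, 1.0e-3, 0.02      # diffusivity, grid, time step
assert alpha * dt / dx**2 <= 0.5          # explicit stability limit

T = np.full(200, 300.0)                   # initial temperature [K]
q = np.zeros_like(T)
q[90:110] = 50.0                          # heating rate / (rho*cp), [K/s]

for _ in range(1000):
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    lap[0] = lap[-1] = 0.0                # hold boundary temperatures fixed
    T = T + dt * (alpha * lap + q)
print(T.max())
```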
IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean
2014-01-01
The RTCA standard (DO-178C) for developing avionic software and getting certification credit includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps in developing static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.
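A toy sketch of the abstract interpretation idea, with the abstract domain kept separate from the analyzed program, is given below; this is illustrative Python, not the IKOS API:

```python
# Toy interval-domain abstract interpretation sketch: the abstract domain
# (interval arithmetic plus join) is kept separate from the "program" being
# analyzed, echoing IKOS's separation of domain management from parsing and
# analysis strategy. Illustrative only, not the IKOS API.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def join(self, other):                 # least upper bound of two branches
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# analyze: x in [0, 10]; if (...) y = x + 1 else y = x + 5
x = Interval(0, 10)
y = (x + Interval(1, 1)).join(x + Interval(5, 5))
print(y)        # [1, 15] -- a sound over-approximation of y
# a bounds check such as "y < 16" can now be proven safe without running
# the program
```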
Transonic blade-vortex interactions - The far field
NASA Astrophysics Data System (ADS)
Lyrintzis, A. S.; George, A. R.
Numerical techniques are developed to predict midfield and far-field helicopter noise due to main-rotor blade-vortex interaction (BVI). The extension of the two-dimensional small-disturbance transonic flow code VTRAN2 (George and Chang, 1983) to the three-dimensional far field (via the Green-function approach of Kirchhoff) is described, and the treatment of oblique BVIs is discussed. Numerical results for a NACA 64A006 airfoil at Mach 0.82 are presented in extensive graphs and characterized in detail. The far-field BVI signature is shown to begin with a strongly forward-directed primary wave (from the original BVI), with an additional downward-directed wave in the case of type C shock motion on the blade.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Morgan C.
2000-07-01
The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy, with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class "u" A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from the literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to calculate radiation dose due to the neutron environment around a MEA is shown. An uncertainty of a factor of three in the MEA calculations is shown to be due to uncertainties in the geometry modeling. It is believed that the methodology is sound and that good agreement between simulation and experiment has been demonstrated.
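The sampling step described above can be sketched as follows, assuming a small illustrative table of partial photonuclear cross sections (the real ACE-format data and MCNP sampling are far more detailed):

```python
# Illustrative sketch of sampling a photonuclear reaction from tabulated
# data: interpolate each partial cross section at the photon energy, then
# pick a reaction channel with probability proportional to its partial
# cross section. The energies and cross sections below are invented.
import numpy as np

rng = np.random.default_rng(0)
E_grid = np.array([8.0, 10.0, 12.0, 14.0])            # MeV (illustrative)
partial = {                                           # barns vs E_grid
    '(g,n)':  np.array([0.00, 0.08, 0.15, 0.10]),
    '(g,2n)': np.array([0.00, 0.00, 0.02, 0.06]),
}

def sample_channel(E):
    sigmas = {r: np.interp(E, E_grid, s) for r, s in partial.items()}
    total = sum(sigmas.values())
    u = rng.uniform(0.0, total)
    for reaction, sigma in sigmas.items():
        u -= sigma
        if u <= 0.0:
            return reaction
    return reaction

print(sample_channel(12.5))
```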
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul
2002-01-01
This paper presents an extension of a numerical algorithm for a network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
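The mixing-length closure mentioned above can be sketched as follows; the velocity profile, wall distances, and constants are illustrative assumptions, not values from the network flow code:

```python
# Sketch of the Prandtl mixing-length closure: the turbulent (eddy)
# viscosity is nu_t = l^2 * |du/dy| with l = kappa * y near a wall, added
# to the molecular viscosity in the shear-stress term. Illustrative values.
import numpy as np

kappa, nu = 0.41, 1.5e-5          # von Karman constant, molecular viscosity
y = np.linspace(1e-4, 0.01, 50)   # wall distance [m]
u = 2.5 * np.log(y / 1e-5)        # an assumed log-law velocity profile [m/s]

dudy = np.gradient(u, y)
l = kappa * y                     # mixing length
nu_t = l**2 * np.abs(dudy)        # eddy viscosity
tau_over_rho = (nu + nu_t) * dudy # kinematic shear stress
print(nu_t[-1], tau_over_rho[-1])
```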
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)
2001-01-01
This paper presents an extension of a numerical algorithm for a network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
Runtime Detection of C-Style Errors in UPC Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pirkelbauer, P; Liao, C; Panas, T
2011-09-29
Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions for each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, the ROSE Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.
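A conceptual Python analogue of the kind of check such instrumentation inserts is sketched below; the real tool rewrites UPC source and monitors shared-memory operations across threads, which is not modeled here:

```python
# Conceptual analogue of the runtime checks such a tool inserts: a monitored
# array tracks its valid bounds and initialization state and reports C-style
# errors (out-of-bounds access, read-before-write) at access time. This
# illustrates the instrumentation idea only, not ROSE-CIRM itself.
class MonitoredArray:
    def __init__(self, size):
        self._data = [None] * size
    def write(self, i, value):
        if not 0 <= i < len(self._data):
            raise IndexError(f"out-of-bounds write at index {i}")
        self._data[i] = value
    def read(self, i):
        if not 0 <= i < len(self._data):
            raise IndexError(f"out-of-bounds read at index {i}")
        if self._data[i] is None:
            raise RuntimeError(f"read of uninitialized element {i}")
        return self._data[i]

a = MonitoredArray(4)
a.write(0, 42)
print(a.read(0))     # ok
# a.read(1)          # would report a read-before-write defect
# a.write(7, 0)      # would report an out-of-bounds write
```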
Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community
NASA Astrophysics Data System (ADS)
Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.
2016-12-01
The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less to building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is regular multi-day hackathons. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions among scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection that began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons over the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community: over 20 scientists add >6,000 lines of code during each of these week-plus events. Participants grow comfortable contributing to the repository, and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate that each will bring its own unique challenges.
Fracture mechanics based design for radioactive material transport packagings -- Historical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, J.A.; Salzbrenner, D.; Sorenson, K.
1998-04-01
The use of a fracture mechanics based design for radioactive material transport (RAM) packagings has been the subject of extensive research for more than a decade. Sandia National Laboratories (SNL) has played an important role in the research and development of the application of this technology. Ductile iron has been internationally accepted as an exemplary material for the demonstration of a fracture mechanics based method of RAM packaging design and therefore is the subject of a large portion of the research discussed in this report. SNL's extensive research and development program, funded primarily by the U.S. Department of Energy's Office of Transportation, Energy Management and Analytical Services (EM-76) and, in an auxiliary capacity, the Office of Civilian Radioactive Waste Management, is summarized in this document along with a summary of the research conducted at other institutions throughout the world. In addition to the research and development work, code and standards development and regulatory positions are also discussed.
Particle kinetic simulation of high altitude hypervelocity flight
NASA Technical Reports Server (NTRS)
Boyd, Iain; Haas, Brian L.
1994-01-01
Rarefied flows about hypersonic vehicles entering the upper atmosphere, or through nozzles expanding into a near vacuum, may only be simulated accurately with a direct simulation Monte Carlo (DSMC) method. Under this grant, researchers enhanced the models employed in the DSMC method and performed simulations in support of existing NASA projects or missions. DSMC models were developed and validated for simulating rotational, vibrational, and chemical relaxation in high-temperature flows, including effects of quantized anharmonic oscillators and temperature-dependent relaxation rates. State-of-the-art advancements were made in simulating coupled vibration-dissociation-recombination for post-shock flows. Models were also developed to compute vehicle surface temperatures directly in the code rather than requiring isothermal estimates. These codes were instrumental in simulating aerobraking of NASA's Magellan spacecraft during orbital maneuvers to assess heat transfer and aerodynamic properties of the delicate satellite. NASA also depended upon simulations of entry of the Galileo probe into the atmosphere of Jupiter to provide drag and flow field information essential for accurate interpretation of an onboard experiment. Finally, the codes have been used extensively to simulate expanding nozzle flows in low-power thrusters in support of propulsion activities at NASA-Lewis. Detailed comparisons between continuum calculations and DSMC results helped to quantify the limitations of continuum CFD codes in rarefied applications.
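As one concrete example of the DSMC machinery involved, the following sketch implements Bird's no-time-counter selection of candidate collision pairs for a single cell; the particle counts, weights, and hard-sphere diameter are illustrative assumptions:

```python
# Sketch of the no-time-counter (NTC) collision-pair selection used in DSMC
# (after Bird): per cell and time step, choose a candidate pair count from
# (sigma*c_r)_max, then accept each pair with probability
# sigma*c_r / (sigma*c_r)_max. Hard-sphere molecules, illustrative numbers.
import numpy as np

rng = np.random.default_rng(1)
N, F_N, V_c, dt = 200, 1.0e12, 1.0e-9, 1.0e-6  # simulators, weight, cell vol, step
d = 4.0e-10                                    # molecular diameter [m]
sigma = np.pi * d**2
v = rng.normal(0.0, 300.0, (N, 3))             # particle velocities [m/s]

sigma_cr_max = sigma * 2000.0                  # running estimate of (sigma*c_r)_max
n_cand = int(0.5 * N * (N - 1) * F_N * sigma_cr_max * dt / V_c)

accepted = 0
for _ in range(n_cand):
    i, j = rng.choice(N, size=2, replace=False)
    cr = np.linalg.norm(v[i] - v[j])
    if rng.random() < sigma * cr / sigma_cr_max:
        accepted += 1                          # here one would do the collision
print(n_cand, accepted)
```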
Jamoulle, Marc; Resnick, Melissa; Grosjean, Julien; Ittoo, Ashwin; Cardillo, Elena; Vander Stichele, Robert; Darmoni, Stefan; Vanmeerbeek, Marc
2018-12-01
While documentation of the clinical aspects of General Practice/Family Medicine (GP/FM) is assured by the International Classification of Primary Care (ICPC), there is no taxonomy for the professional aspects (context and management) of GP/FM. To present the development, dissemination, applications, and resulting face validity of the Q-Codes taxonomy specifically designed to describe contextual features of GP/FM, proposed as an extension to the ICPC. The Q-Codes taxonomy was developed from Lamberts' seminal idea for indexing contextual content (1987) by a multi-disciplinary team of knowledge engineers, linguists, and general practitioners through a qualitative and iterative analysis of 1702 abstracts from six GP/FM conferences using the Atlas.ti software. A total of 182 concepts, called Q-Codes, representing professional aspects of GP/FM were identified and organized in a taxonomy. Dissemination: The taxonomy is published as an online terminological resource, using semantic web techniques and the web ontology language (OWL) (http://www.hetop.eu/Q). Each Q-Code is identified with a unique resource identifier (URI) and provided with preferred terms and scope notes in ten languages (Portuguese, Spanish, English, French, Dutch, Korean, Vietnamese, Turkish, Georgian, German), as well as search filters for MEDLINE and web searches. This taxonomy has already been used to support queries in bibliographic databases (e.g., MEDLINE), to facilitate indexing of grey literature in GP/FM such as congress abstracts, master theses, and websites, and as an educational tool in vocational teaching. Conclusions: The rapidly growing list of practical applications provides face validity for the usefulness of this freely available new terminological resource.
Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network
Lin, Kai; Wang, Di; Hu, Long
2016-01-01
With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to the high degree of complexity and bandwidth bottlenecks, the millimeter-wave sensor network still faces numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which combines data fusion, multi-channel transmission, and network coding to improve data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on the Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. Using the result of the classification, the CMNC algorithm also provides the channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared with other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and has better performance than the compared methods. PMID:27376302
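The Dempster-Shafer combination step underlying such a fusion-driven classifier can be sketched as follows (the masses and the two-element frame are illustrative):

```python
# Sketch of Dempster's rule of combination, the building block of a
# D-S evidence-theory classifier: two bodies of evidence over the frame
# {a, b} are combined, normalizing out the conflict mass K.
def combine(m1, m2):
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B, AB = frozenset('a'), frozenset('b'), frozenset('ab')
m1 = {A: 0.6, B: 0.1, AB: 0.3}     # evidence from sensor 1
m2 = {A: 0.5, B: 0.2, AB: 0.3}     # evidence from sensor 2
print(combine(m1, m2))             # mass concentrates on {a}
```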
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in the modern experimental high energy physics field and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
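The simulate-then-port workflow rests on modeling pipeline registers explicitly so software and firmware can be compared cycle by cycle; a minimal sketch of that idea follows (in Python for brevity, with an assumed three-stage multiply-accumulate; the paper's framework is C++ and generates the HDL automatically):

```python
# Sketch of the bit-accurate pipeline simulation idea: a 3-stage pipelined
# multiply-accumulate modeled as registers that shift once per clock, so a
# software result can be compared stage-for-stage with HDL simulation output.
def pipelined_mac(pairs, acc_init=0):
    stage1 = stage2 = None          # pipeline registers
    acc = acc_init
    results = []
    for a, b in pairs + [(None, None)] * 2:   # flush with bubbles
        # clock edge: stages advance in reverse order
        if stage2 is not None:
            acc += stage2                     # stage 3: accumulate
            results.append(acc)
        stage2 = stage1                       # stage 2: pass product on
        stage1 = (a * b) if a is not None else None   # stage 1: multiply
    return results

print(pipelined_mac([(1, 2), (3, 4), (5, 6)]))   # [2, 14, 44], 2-cycle latency
```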
NASA Astrophysics Data System (ADS)
Bonfiglio, D.; Chacón, L.; Cappello, S.
2010-08-01
With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such an endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonfiglio, Daniele; Chacon, Luis; Cappello, Susanna
2010-01-01
With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such an endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacon, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.
A Data Parallel Multizone Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Jespersen, Dennis C.; Levit, Creon; Kwak, Dochan (Technical Monitor)
1995-01-01
We have developed a data parallel multizone compressible Navier-Stokes code on the Connection Machine CM-5. The code is set up for implicit time-stepping on single or multiple structured grids. For multiple grids and geometrically complex problems, we follow the "chimera" approach, where flow data on one zone is interpolated onto another in the region of overlap. We will describe our design philosophy and give some timing results for the current code. The design choices can be summarized as: 1. finite differences on structured grids; 2. implicit time-stepping with either distributed solves or data motion and local solves; 3. sequential stepping through multiple zones with interzone data transfer via a distributed data structure. We have implemented these ideas on the CM-5 using CMF (Connection Machine Fortran), a data parallel language which combines elements of Fortran 90 and certain extensions, and which bears a strong similarity to High Performance Fortran (HPF). One interesting feature is the issue of turbulence modeling, where the architecture of a parallel machine makes the use of an algebraic turbulence model awkward, whereas models based on transport equations are more natural. We will present some performance figures for the code on the CM-5, and consider the issues involved in transitioning the code to HPF for portability to other parallel platforms.
Pritoni, Marco; Ford, Rebecca; Karlin, Beth; Sanguinetti, Angela
2018-02-01
Policymakers worldwide are currently discussing whether to include home energy management (HEM) products in their portfolio of technologies to reduce carbon emissions and improve grid reliability. However, very little data are available about these products. Here we present the results of an extensive review covering 308 HEM products available on the US market in 2015-2016. We gathered these data from publicly available sources such as vendor websites, online marketplaces, and other vendor documents. A coding guide was developed iteratively during data collection and utilized to classify the devices. Each product was coded on 96 distinct attributes, grouped into 11 categories: Identifying information, Product components, Hardware, Communication, Software, Information - feedback, Information - feedforward, Control, Utility interaction, Additional benefits, and Usability. The codes describe product features and functionalities, user interaction, and interoperability with other devices. A mix of binary attributes and more descriptive codes allows data to be sorted and grouped without losing important qualitative information. The information is stored in a large spreadsheet included with this article, along with an explanatory coding guide. This dataset is analyzed and described in a research article entitled "Categories and functionality of smart home technology for energy management" (Ford et al., 2017) [1].
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-03-09
This work represents a first-of-its-kind successful application of advanced numerical methods to solving realistic two-phase flow problems with the two-fluid six-equation two-phase flow model. These advanced numerical methods include a high-resolution spatial discretization scheme on staggered grids, high-order fully implicit time integration schemes, and the Jacobian-free Newton-Krylov (JFNK) method as the nonlinear solver. The computer code developed in this work has been extensively validated with existing experimental flow boiling data in vertical pipes and rod bundles, which cover wide ranges of experimental conditions, such as pressure, inlet mass flux, wall heat flux, and exit void fraction. An additional code-to-code benchmark with the RELAP5-3D code further verifies the correct code implementation. The combined methods employed in this work exhibit strong robustness in solving two-phase flow problems even when phase appearance (boiling) and realistic discrete flow regimes are considered. Transitional flow regimes used in existing system analysis codes, normally introduced to overcome numerical difficulty, were completely removed in this work. This in turn provides the possibility of utilizing more sophisticated flow regime maps in the future to further improve simulation accuracy.
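The JFNK idea named above can be sketched compactly: the Jacobian is never assembled, and the Krylov solver sees it only through finite-difference directional derivatives. In the sketch below, a toy nonlinear residual stands in for the two-fluid six-equation system:

```python
# Minimal JFNK sketch: the Jacobian is never formed; GMRES sees it only
# through the directional derivative J*v ~ (F(u + eps*v) - F(u)) / eps.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def F(u):                                   # toy nonlinear residual F(u) = 0
    return np.array([u[0]**2 + u[1] - 3.0,
                     u[0] + u[1]**2 - 5.0])

u = np.array([1.0, 1.0])
for _ in range(20):                         # Newton iterations
    r = F(u)
    if np.linalg.norm(r) < 1e-12:
        break
    eps = 1e-7
    Jv = lambda v: (F(u + eps * v) - F(u)) / eps
    J = LinearOperator((2, 2), matvec=Jv)
    du, info = gmres(J, -r)                 # Krylov solve, Jacobian-free
    u = u + du
print(u, F(u))                              # converges to (1, 2)
```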
Formal System Verification - Extension 2
2012-08-08
vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel...together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C...source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales
Riccardi, Demian M.; Parks, Jerry M.; Johs, Alexander; ...
2015-03-20
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. The core is well-tested, well-documented, and easy to install across computational platforms. Our goal for the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales.
Riccardi, Demian; Parks, Jerry M; Johs, Alexander; Smith, Jeremy C
2015-04-27
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. The core is well-tested, well-documented, and easy to install across computational platforms. The goal of the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riccardi, Demian M.; Parks, Jerry M.; Johs, Alexander
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. The core is well-tested, well-documented, and easy to install across computational platforms. Our goal for the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
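The code-reference pattern that HackaMol::X::Calculator abstracts can be sketched in a few lines; the Python rendering below is a language-neutral analogue (HackaMol itself uses Perl code references), with 'wc' standing in for an external computational chemistry program:

```python
# Language-neutral sketch of the abstract-calculator pattern: an external
# program is wrapped by two injected callables, one mapping a molecule into
# an input file and one parsing the output back. Illustrative only.
import subprocess, os, tempfile
from pathlib import Path

class Calculator:
    def __init__(self, exe, map_in, map_out):
        self.exe, self.map_in, self.map_out = exe, map_in, map_out
    def run(self, mol):
        with tempfile.TemporaryDirectory() as scratch:
            inp = os.path.join(scratch, 'calc.in')
            self.map_in(mol, inp)                       # molecule -> input file
            out = subprocess.run([self.exe, inp],
                                 capture_output=True, text=True)
            return self.map_out(out.stdout)             # output -> attributes

# usage: swap map_in/map_out to target a different external code
calc = Calculator('wc',
                  map_in=lambda mol, path: Path(path).write_text(mol),
                  map_out=lambda text: int(text.split()[0]))
print(calc.run("three line\nmolecule\nplaceholder\n"))  # line count as a stand-in
```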
ERIC Educational Resources Information Center
Ghaffarzadegan, Navid; Stewart, Thomas R.
2011-01-01
Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…
Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.
2013-01-01
A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.
MPEG4: coding for content, interactivity, and universal accessibility
NASA Astrophysics Data System (ADS)
Reader, Cliff
1996-01-01
MPEG4 is a natural extension of audiovisual coding, and yet from many perspectives breaks new ground as a standard. New coding techniques are being introduced, of course, but they will work on new data structures. The standard itself has a new architecture, and will use a new operational model when implemented on equipment that is likely to have innovative system architecture. The author introduces the background developments in technology and applications that are driving or enabling the standard, introduces the focus of MPEG4, and enumerates the new functionalities to be supported. Key applications in interactive TV and heterogeneous environments are discussed. The architecture of MPEG4 is described, followed by a discussion of the multiphase MPEG4 communication scenarios, and issues of practical implementation of MPEG4 terminals. The paper concludes with a description of the MPEG4 workplan. In summary, MPEG4 has two fundamental attributes. First, it is the coding of audiovisual objects, which may be natural or synthetic data in two or three dimensions. Second, the heart of MPEG4 is its syntax: the MPEG4 Syntactic Descriptive Language -- MSDL.
NASA Technical Reports Server (NTRS)
Choo, Y. K.; Staiger, P. J.
1982-01-01
The PRESTO code was designed to analyze performance at valves-wide-open design flow. The code can model conventional steam cycles as well as cycles that include such special features as process steam extraction and induction and feedwater heating by external heat sources. Convenience features and extensions to the special features were incorporated into the PRESTO code. The features are described, and detailed examples illustrating the use of both the original and the special features are given.
Neural Decoder for Topological Codes
NASA Astrophysics Data System (ADS)
Torlai, Giacomo; Melko, Roger G.
2017-07-01
We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.
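For reference, the building block named above can be sketched as a restricted Boltzmann machine with one block Gibbs sampling sweep; the layer sizes are illustrative placeholders, and the training procedure and syndrome-to-error decoding strategy of the paper are not reproduced here:

```python
# Sketch of a (restricted) Boltzmann machine with energy
# E(v, h) = -a.v - b.h - v.W.h and one block Gibbs sampling sweep.
# Layer sizes are illustrative; training and decoding are not shown.
import numpy as np

rng = np.random.default_rng(0)
n_v, n_h = 18, 32                      # e.g. syndrome bits -> hidden units
W = 0.01 * rng.normal(size=(n_v, n_h))
a, b = np.zeros(n_v), np.zeros(n_h)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One alternating Gibbs sweep v -> h -> v'."""
    p_h = sigmoid(b + v @ W)
    h = (rng.random(n_h) < p_h).astype(float)
    p_v = sigmoid(a + W @ h)
    return (rng.random(n_v) < p_v).astype(float)

v = rng.integers(0, 2, n_v).astype(float)   # a mock syndrome configuration
print(gibbs_step(v))
```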
Additional extensions to the NASCAP computer code, volume 2
NASA Technical Reports Server (NTRS)
Stannard, P. R.; Katz, I.; Mandell, M. J.
1982-01-01
Particular attention is given to comparison of the actual response of the SCATHA (Spacecraft Charging AT High Altitudes) P78-2 satellite with theoretical (NASCAP) predictions. Extensive comparisons for a variety of environmental conditions confirm the validity of the NASCAP model. A summary of the capabilities and range of validity of NASCAP is presented, with extensive reference to previously published applications. It is shown that NASCAP is capable of providing quantitatively accurate results when the object and environment are adequately represented and fall within the range of conditions for which NASCAP was intended. Three-dimensional electric field effects play an important role in determining the potential of dielectric surfaces and electrically isolated conducting surfaces, particularly in the presence of artificially imposed high voltages. A theory for such phenomena is presented and applied to the active control experiments carried out on SCATHA, as well as other space and laboratory experiments. Finally, some preliminary work toward modeling large spacecraft in polar Earth orbit is presented. An initial physical model is presented, including charge emission. A simple code based upon the model is described along with code test results.
Development of a Multi-Disciplinary Computing Environment (MDICE)
NASA Technical Reports Server (NTRS)
Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.
1999-01-01
The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal, and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment to enable coupling between various flow analysis codes at several levels of fidelity. This has been referred to as the Visual Computing Environment (VCE), and is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations. The chosen initial focus is on aeroelastic aircraft applications. The developed software is referred to as MDICE-AE, an extensible system suitable for integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.
Comparison of UWCC MOX fuel measurements to MCNP-REN calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.; Baker, M.; Jie, R.
1998-12-31
The development of neutron coincidence counting has greatly improved the accuracy and versatility of neutron-based techniques to assay fissile materials. Today, the shift register analyzer connected to either a passive or active neutron detector is widely used by both domestic and international safeguards organizations. The continued development of these techniques and detectors makes extensive use of the predictions of detector response through the use of Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model, as it is currently used, fails to accurately predict detector response in highly multiplying media such as mixed-oxide (MOX) light water reactor fuel assemblies. For this reason, efforts have been made to modify the currently used Monte Carlo codes and to develop new analytical methods so that this model is not required to predict detector response. The authors describe their efforts to modify a widely used Monte Carlo code for this purpose and also compare calculational results with experimental measurements.
Vlaisavljevich, Bess; Shiozaki, Toru
2016-08-09
We report the development of the theory and computer program for analytical nuclear energy gradients for (extended) multistate complete active space second-order perturbation theory (CASPT2) with full internal contraction. The vertical shifts are also considered in this work. This is an extension of the fully internally contracted CASPT2 nuclear gradient program recently developed for a state-specific variant by us [MacLeod and Shiozaki, J. Chem. Phys. 2015, 142, 051103]; in this extension, the so-called λ equation is solved to account for the variation of the multistate CASPT2 energies with respect to the change in the amplitudes obtained in the preceding state-specific CASPT2 calculations, and the Z vector equations are modified accordingly. The program is parallelized using the MPI-3 remote memory access protocol, which allows us to perform efficient one-sided communication. The optimized geometries of the ground and excited states of a copper corrole and benzophenone are presented as numerical examples. The code is publicly available under the GNU General Public License.
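A standard sanity check when developing analytical gradient codes of this kind is to compare the analytic gradient against central finite differences of the energy; the sketch below applies that generic pattern to a toy energy function standing in for the CASPT2 energy (it does not reproduce the λ-equation or Z-vector machinery):

```python
# Generic validation pattern for analytical gradient codes: compare the
# analytic gradient against central finite differences of the energy.
# A toy energy function stands in for E_CASPT2(R).
import numpy as np

def energy(R):                        # toy surrogate for E(R)
    return np.sum(np.sin(R) + 0.1 * R**2)

def analytic_gradient(R):
    return np.cos(R) + 0.2 * R

R = np.array([0.3, -1.2, 2.0])
h = 1e-5
fd = np.array([(energy(R + h * e) - energy(R - h * e)) / (2 * h)
               for e in np.eye(len(R))])
print(np.max(np.abs(fd - analytic_gradient(R))))   # ~1e-10 if consistent
```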
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
... period ends on April 22, 2013. SUPPLEMENTARY INFORMATION: This notice announces an extension of the public comment period until May 7, 2013. Comments may be submitted in writing to Naval Facilities Engineering Command Southwest, Attn: Code EV21.AK...
Lyapustin, Alexei
2002-09-20
Results of an extensive validation study of the new radiative transfer code SHARM-3D are described. The code is designed for modeling of unpolarized monochromatic radiative transfer in the visible and near-IR spectra in the laterally uniform atmosphere over an arbitrarily inhomogeneous anisotropic surface. The surface boundary condition is periodic. The algorithm is based on an exact solution derived with the Green's function method. Several parameterizations were introduced into the algorithm to achieve superior performance. As a result, SHARM-3D is 2-3 orders of magnitude faster than the rigorous code SHDOM. It can model radiances over large surface scenes for a number of incidence-view geometries simultaneously. Extensive comparisons against SHDOM indicate that SHARM-3D has an average accuracy of better than 1%, which along with the high speed of calculations makes it a unique tool for remote-sensing applications in land surface and related atmospheric radiation studies.
NASA Astrophysics Data System (ADS)
Lyapustin, Alexei
2002-09-01
Results of an extensive validation study of the new radiative transfer code SHARM-3D are described. The code is designed for modeling of unpolarized monochromatic radiative transfer in the visible and near-IR spectra in the laterally uniform atmosphere over an arbitrarily inhomogeneous anisotropic surface. The surface boundary condition is periodic. The algorithm is based on an exact solution derived with the Green's function method. Several parameterizations were introduced into the algorithm to achieve superior performance. As a result, SHARM-3D is 2-3 orders of magnitude faster than the rigorous code SHDOM. It can model radiances over large surface scenes for a number of incidence-view geometries simultaneously. Extensive comparisons against SHDOM indicate that SHARM-3D has an average accuracy of better than 1%, which along with the high speed of calculations makes it a unique tool for remote-sensing applications in land surface and related atmospheric radiation studies.
NASA Technical Reports Server (NTRS)
Larson, William E.; Lueck, Dale E.; Parrish, Clyde F.; Sanders, Gerald B.; Trevathan, Joseph R.; Baird, R. Scott; Simon, Tom; Peters, T.; Delgado, H. (Technical Monitor)
2001-01-01
As we look forward into the new millennium, the extension of human presence beyond Low-Earth Orbit (LEO) looms large in the plans of NASA. The Agency's Strategic Plan specifically calls out the need to identify and develop technologies for 100 and 1000-day class missions beyond LEO. To meet the challenge of these extended duration missions, it is important that we learn how to utilize the indigenous resources available to us on extraterrestrial bodies. This concept, known as In-Situ Resource Utilization (ISRU), can greatly reduce the launch mass and cost of human missions while reducing the risk. These technologies may also pave the way for the commercial development of space. While no specific target beyond LEO is identified in NASA's Strategic Plan, mission architecture studies have been ongoing for the Moon, Mars, Near-Earth Asteroids, and Earth/Moon and Earth/Sun Libration Points. As a result of these studies, the NASA Office of Space Flight (Code M), through the Johnson and Kennedy Space Centers, is leading the effort to develop ISRU technologies and systems to meet the current and future needs of human missions beyond LEO and on to Mars. This effort also receives support from the NASA Office of Biological and Physical Research (Code U), the Office of Space Science (Code S), and the Office of Aerospace Technology (Code R). This paper will present unique developments in the areas of fuel and oxidizer production, breathing air production, water production, CO2 collection, separation of atmospheric gases, and gas liquefaction and storage. A technology overview will be provided for each topic, along with the results achieved to date, future development plans, and the mission architectures that these technologies support.
Trace-shortened Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Solomon, G.
1994-01-01
Reed-Solomon (RS) codes have been part of standard NASA telecommunications systems for many years. RS codes are character-oriented error-correcting codes, and their principal use in space applications has been as outer codes in concatenated coding systems. However, for a given character size, say m bits, RS codes are limited to a length of, at most, 2^m. It is known in theory that longer character-oriented codes would be superior to RS codes in concatenation applications, but until recently no practical class of 'long' character-oriented codes had been discovered. In 1992, however, Solomon discovered an extensive class of such codes, which are now called trace-shortened Reed-Solomon (TSRS) codes. In this article, we will continue the study of TSRS codes. Our main result is a formula for the dimension of any TSRS code, as a function of its error-correcting power. Using this formula, we will give several examples of TSRS codes, some of which look very promising as candidate outer codes in high-performance coded telecommunications systems.
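As a quick illustration of the length ceiling mentioned above, the short sketch below tabulates standard Reed-Solomon parameters for a few character sizes; it restates textbook relations (n at most 2^m, and n - k = 2t parity characters for t-error correction) rather than the TSRS dimension formula derived in the article.

    # Standard RS parameters over m-bit characters: length n <= 2^m,
    # and correcting t character errors costs n - k = 2t parity characters.
    def rs_params(m, t):
        n_max = 2 ** m          # the length ceiling noted in the article
        k = n_max - 2 * t       # dimension of a t-error-correcting code
        return n_max, k

    for m in (4, 8, 10):
        for t in (1, 8, 16):
            n, k = rs_params(m, t)
            if k > 0:
                print(f'm={m:2d} t={t:2d} n_max={n:5d} k={k:5d} rate={k/n:.3f}')

The table makes the motivation for longer character-oriented codes visible: at a fixed error-correcting power t, the rate 1 - 2t/n rises as the code length n grows.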
Finite element analysis of inviscid subsonic boattail flow
NASA Technical Reports Server (NTRS)
Chima, R. V.; Gerhart, P. M.
1981-01-01
A finite element code for analysis of inviscid subsonic flows over arbitrary nonlifting planar or axisymmetric bodies is described. The code solves a novel primitive variable formulation of the coupled irrotationality and compressible continuity equations. Results for flow over a cylinder, a sphere, and a NACA 0012 airfoil verify the code. Computed subcritical flows over an axisymmetric boattailed afterbody compare well with finite difference results and experimental data. Iterative coupling with an integral turbulent boundary layer code shows strong viscous effects on the inviscid flow. Improvements in code efficiency and extensions to transonic flows are discussed.
NASA Astrophysics Data System (ADS)
Kong, Gyuyeol; Choi, Sooyong
2017-09-01
An enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding is proposed for four-level holographic data storage systems. While previous four-ary modulation codes focus on preventing maximum two-dimensional intersymbol interference patterns, the proposed four-ary modulation code aims at maximizing the coding gain for better bit error rate performance. To achieve significant coding gains, we design, through extensive simulation, a new 2/3 four-ary modulation code that enlarges the free distance on the trellis. The free distance of the proposed four-ary modulation code is increased from 1.21 (for the conventional four-ary modulation code) to 2.04. Simulation results show that the proposed four-ary modulation code achieves a gain of more than 1 dB over the conventional four-ary modulation code.
Strengthening Software Authentication with the ROSE Software Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2006-06-15
Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
Development of an extensible dual-core wireless sensing node for cyber-physical systems
NASA Astrophysics Data System (ADS)
Kane, Michael; Zhu, Dapeng; Hirose, Mitsuhito; Dong, Xinjun; Winter, Benjamin; Häckell, Moritz; Lynch, Jerome P.; Wang, Yang; Swartz, A.
2014-04-01
The introduction of wireless telemetry into the design of monitoring and control systems has been shown to reduce system costs while simplifying installations. To date, wireless nodes proposed for sensing and actuation in cyber-physical systems have been designed using microcontrollers with one computational pipeline (i.e., single-core microcontrollers). While concurrent code execution can be implemented on single-core microcontrollers, concurrency is emulated by splitting the pipeline's resources to support multiple threads of code execution. For many applications, this approach to multi-threading is acceptable in terms of speed and function. However, some applications such as feedback control demand deterministic timing of code execution and maximum computational throughput. For these applications, the adoption of multi-core processor architectures represents one effective solution. Multi-core microcontrollers have multiple computational pipelines that can execute embedded code in parallel and can be interrupted independently of one another. In this study, a new wireless platform named Martlet is introduced with a dual-core microcontroller adopted in its design. The dual-core microcontroller design allows Martlet to dedicate one core to standard wireless sensor operations while the other core is reserved for embedded data processing and real-time feedback control law execution. Another distinct feature of Martlet is a standardized hardware interface that allows specialized daughter boards (termed wing boards) to be interfaced to the Martlet baseboard. This extensibility opens the opportunity to encapsulate specialized sensing and actuation functions in a wing board without altering the design of Martlet. In addition to describing the design of Martlet, a few example wing boards are detailed, along with experiments showing Martlet's ability to monitor and control physical systems such as wind turbines and buildings.
Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes
NASA Technical Reports Server (NTRS)
Bittker, D. A.
1993-01-01
A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
Extension of Generalized Fluid System Simulation Program's Fluid Property Database
NASA Technical Reports Server (NTRS)
Patel, Kishan
2011-01-01
This internship focused on the development of additional capabilities for the Generalized Fluid System Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves (a) generation of a property table using REFPROP, a widely used thermodynamic property program, and (b) modification of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
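A minimal sketch of the kind of saturation-region lookup described above follows. The table values are placeholders standing in for REFPROP output, and the function name is invented for illustration; GFSSP's actual Fortran implementation differs.

    import numpy as np

    # Hypothetical saturation table versus pressure [kPa]: enthalpy of
    # saturated liquid h_f and vapor h_g [kJ/kg], specific volume [m^3/kg].
    P_tab  = np.array([100.0, 200.0, 500.0, 1000.0])
    hf_tab = np.array([417.5, 504.7, 640.2, 762.8])
    hg_tab = np.array([2675.1, 2706.6, 2748.7, 2778.1])
    vf_tab = np.array([1.043e-3, 1.061e-3, 1.093e-3, 1.127e-3])
    vg_tab = np.array([1.694, 0.8857, 0.3749, 0.1944])

    def sat_property(P, h):
        """Return (quality, specific volume) for a saturated state at (P, h)."""
        hf = np.interp(P, P_tab, hf_tab)
        hg = np.interp(P, P_tab, hg_tab)
        x = (h - hf) / (hg - hf)           # thermodynamic quality
        if not 0.0 <= x <= 1.0:
            raise ValueError('state is outside the saturation dome')
        vf = np.interp(P, P_tab, vf_tab)
        vg = np.interp(P, P_tab, vg_tab)
        return x, (1.0 - x) * vf + x * vg  # lever rule for the mixture

    x, v = sat_property(300.0, 1500.0)
    print(f'quality = {x:.3f}, specific volume = {v:.4f} m^3/kg')

The lever rule in the last line is the essential step: once the quality is known from pressure and enthalpy, every other mixture property follows from the same interpolated liquid and vapor endpoints.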
Add Java extensions to your wiki: Java applets can bring dynamic functionality to your wiki pages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scarberry, Randall E.
Virtually everyone familiar with today's World Wide Web has encountered the free online encyclopedia Wikipedia many times. What you may not know is that Wikipedia is driven by an excellent open-source product called MediaWiki, which is available to anyone for free. This has led to a proliferation of wiki sites devoted to just about any topic one can imagine. Users of a wiki can add content -- all that is required of them is that they type their additions into their web browsers using the simple markup language called wikitext. Even better, the developers of wikitext made it extensible. With a little server-side development of your own, you can add your own custom syntax. Users aware of your extensions can then utilize them on their wiki pages with a few simple keystrokes. These extensions can be custom decorations, formatting, web applications, and even instances of the venerable old Java applet. One example of a Java applet extension is the Jmol extension (REF), used to embed a 3-D molecular viewer. This article will walk you through the deployment of a fairly elaborate applet via a MediaWiki extension. By no means exhaustive -- an entire book would be required for that -- it will demonstrate how to give the applet resize handles using a little Javascript and CSS coding and some popular Javascript libraries. It even describes how a user may customize the extension somewhat using a wiki template. Finally, it explains a rudimentary persistence mechanism which allows applets to save data directly to the wiki pages on which they reside.
A graph-based approach for designing extensible pipelines
2012-01-01
Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development such as workflow management systems focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism on the pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http://code.google.com/p/dynamic-pipeline. The system has been tested on Linux and Windows platforms. Conclusions Our graph-based approach enables the automatic creation of pipelines by compiling a specialised set of tools on demand, depending on the functionality required. It also allows the implementation of extensible and low-maintenance pipelines and contributes towards consolidating openness and collaboration in bioinformatics systems. It is targeted at pipeline developers and is suited for implementing applications with sequential execution steps and combined functionalities. In the format conversion application, the automatic combination of conversion tools increased both the number of possible conversions available to the user and the extensibility of the system to allow for future updates with new file formats. PMID:22788675
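The graph formulation lends itself to a very small sketch: formats as nodes, converter tools as edges, and a pipeline as the shortest path between two formats. The tool and format names below are invented for illustration and do not come from the project's code.

    from collections import deque

    # Each converter tool is an edge: (input format, output format, tool).
    tools = [
        ('vcf', 'ped', 'vcf2ped'),
        ('ped', 'bed', 'ped2bed'),
        ('vcf', 'csv', 'vcf2csv'),
        ('csv', 'bed', 'csv2bed'),
    ]

    graph = {}
    for src, dst, name in tools:
        graph.setdefault(src, []).append((dst, name))

    def compose_pipeline(start, goal):
        """Breadth-first search: return the shortest list of tools that
        converts `start` into `goal`, or None if no path exists."""
        queue, seen = deque([(start, [])]), {start}
        while queue:
            fmt, path = queue.popleft()
            if fmt == goal:
                return path
            for nxt, tool in graph.get(fmt, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [tool]))
        return None

    print(compose_pipeline('vcf', 'bed'))  # ['vcf2ped', 'ped2bed']

Adding a new file format then amounts to registering one or two converter edges; every pipeline that can traverse the enlarged graph becomes available automatically, which is exactly the extensibility argument made above.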
HEPMath 1.4: A Mathematica package for semi-automatic computations in high energy physics
NASA Astrophysics Data System (ADS)
Wiebusch, Martin
2015-10-01
This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.
Genes uniquely expressed in human growth plate chondrocytes uncover a distinct regulatory network.
Li, Bing; Balasubramanian, Karthika; Krakow, Deborah; Cohn, Daniel H
2017-12-20
Chondrogenesis is the earliest stage of skeletal development and is a highly dynamic process, integrating the activities and functions of transcription factors, cell signaling molecules and extracellular matrix proteins. The molecular mechanisms underlying chondrogenesis have been extensively studied and multiple key regulators of this process have been identified. However, a genome-wide overview of the gene regulatory network in chondrogenesis has not been achieved. In this study, employing RNA sequencing, we identified 332 protein coding genes and 34 long non-coding RNA (lncRNA) genes that are highly selectively expressed in human fetal growth plate chondrocytes. Among the protein coding genes, 32 genes were associated with 62 distinct human skeletal disorders and 153 genes were associated with skeletal defects in knockout mice, confirming their essential roles in skeletal formation. These gene products formed a comprehensive physical interaction network and participated in multiple cellular processes regulating skeletal development. The data also revealed 34 transcription factors and 11,334 distal enhancers that were uniquely active in chondrocytes, functioning as transcriptional regulators for the cartilage-selective genes. Our findings revealed a complex gene regulatory network controlling skeletal development whereby transcription factors, enhancers and lncRNAs participate in chondrogenesis by transcriptional regulation of key genes. Additionally, the cartilage-selective genes represent candidate genes for unsolved human skeletal disorders.
77 FR 16022 - Agency Information Collection Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-19
... DEPARTMENT OF ENERGY Office of Energy Efficiency and Renewable Energy Agency Information Collection Extension AGENCY: Office of Energy Efficiency and Renewable Energy, U.S. Department of Energy... Renewable Energy. [FR Doc. 2012-6546 Filed 3-16-12; 8:45 am] BILLING CODE 6450-01-P ...
76 FR 37372 - Notice of Proposed Withdrawal Extension and Notification of a Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-27
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [61510-8451-0000; MTM 80092] Notice of Proposed Withdrawal Extension and Notification of a Public Meeting; Montana AGENCY: Bureau of Land...; 8:45 am] BILLING CODE 4310-55-P ...
Development and application of unified algorithms for problems in computational science
NASA Technical Reports Server (NTRS)
Shankar, Vijaya; Chakravarthy, Sukumar
1987-01-01
A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm will be one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.
Digital Coding and the Self-Proving Message
ERIC Educational Resources Information Center
Dettering, Richard
1971-01-01
The author suggests that "digital communication," which relies on arbitrary coding elements like the phones of speech, overshadows the importance of the analogic symbolism people use more extensively than realized. Non-verbal messages can be more convincing than verbal ones and can be used to predict patterns of future behavior. (Author/PD)
Lightweight computational steering of very large scale molecular dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beazley, D.M.; Lomdahl, P.S.
1996-09-01
We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.
ANTLR Tree Grammar Generator and Extensions
NASA Technical Reports Server (NTRS)
Craymer, Loring
2005-01-01
A computer program implements two extensions of ANTLR (Another Tool for Language Recognition), which is a set of software tools for translating source codes between different computing languages. ANTLR supports predicated-LL(k) lexer and parser grammars, a notation for annotating parser grammars to direct tree construction, and predicated tree grammars. [LL(k) signifies left-right, leftmost derivation with k tokens of look-ahead, referring to certain characteristics of a grammar.] One of the extensions is a syntax for tree transformations. The other extension is the generation of tree grammars from annotated parser or input tree grammars. These extensions can simplify the process of generating source-to-source language translators and they make possible an approach, called "polyphase parsing," to translation between computing languages. The typical approach to translator development is to identify high-level semantic constructs such as "expressions," "declarations," and "definitions" as fundamental building blocks in the grammar specification used for language recognition. The polyphase approach is to lump ambiguous syntactic constructs during parsing and then disambiguate the alternatives in subsequent tree transformation passes. Polyphase parsing is believed to be useful for generating efficient recognizers for C++ and other languages that, like C++, have significant ambiguities.
Keeping abreast with long non-coding RNAs in mammary gland development and breast cancer
Hansji, Herah; Leung, Euphemia Y.; Baguley, Bruce C.; Finlay, Graeme J.; Askarian-Amiri, Marjan E.
2014-01-01
The majority of the human genome is transcribed, even though only 2% of transcripts encode proteins. Non-coding transcripts were originally dismissed as evolutionary junk or transcriptional noise, but with the development of whole genome technologies, these non-coding RNAs (ncRNAs) are emerging as molecules with vital roles in regulating gene expression. While shorter ncRNAs have been extensively studied, the functional roles of long ncRNAs (lncRNAs) are still being elucidated. Studies over the last decade show that lncRNAs are emerging as new players in a number of diseases including cancer. Potential roles in both oncogenic and tumor suppressive pathways in cancer have been elucidated, but the biological functions of the majority of lncRNAs remain to be identified. Accumulated data are identifying the molecular mechanisms by which lncRNA mediates both structural and functional roles. LncRNA can regulate gene expression at both transcriptional and post-transcriptional levels, including splicing and regulating mRNA processing, transport, and translation. Much current research is aimed at elucidating the function of lncRNAs in breast cancer and mammary gland development, and at identifying the cellular processes influenced by lncRNAs. In this paper we review current knowledge of lncRNAs contributing to these processes and present lncRNA as a new paradigm in breast cancer development. PMID:25400658
Real-time control using open source RTOS
NASA Astrophysics Data System (ADS)
Irwin, Philip C.; Johnson, Richard L., Jr.
2002-12-01
Complex telescope systems such as interferometers tend to rely heavily on hard real-time operating systems (RTOS). It has been standard practice at NASA's Jet Propulsion Laboratory (JPL) and many other institutions to use costly commercial RTOSs and hardware. After developing a real-time toolkit for VxWorks on the PowerPC platform (dubbed RTC), the interferometry group at JPL is porting this code to the Real-Time Application Interface (RTAI), an open source RTOS that is essentially an extension to the Linux kernel. This port has the potential to reduce software and hardware costs for future projects, while increasing the level of performance. The goals of this paper are to briefly describe the RTC toolkit, highlight the successes and pitfalls of porting the toolkit from VxWorks to Linux-RTAI, and to discuss future enhancements that will be implemented as a direct result of this port. The first port of any body of code is always the most difficult since it uncovers the OS-specific calls and forces "red flags" into those portions of the code. For this reason, it has also been a huge benefit that the project chose a generic, platform-independent OS extension, ACE, and its CORBA counterpart, TAO. This port of RTC will pave the way for conversions to other environments, the most interesting of which is a non-real-time simulation environment currently being considered by the Space Interferometry Mission (SIM) and the Terrestrial Planet Finder (TPF) Projects.
Linear microbunching analysis for recirculation machines
Tsai, C. -Y.; Douglas, D.; Li, R.; ...
2016-11-28
Microbunching instability (MBI) has been one of the most challenging issues in designs of magnetic chicanes for short-wavelength free-electron lasers or linear colliders, as well as those of transport lines for recirculating or energy-recovery-linac machines. To quantify MBI for a recirculating machine and for more systematic analyses, we have recently developed a linear Vlasov solver and incorporated relevant collective effects into the code, including the longitudinal space charge, coherent synchrotron radiation, and linac geometric impedances, with extension of the existing formulation to include beam acceleration. In our code, we semianalytically solve the linearized Vlasov equation for the microbunching amplification factor for an arbitrary linear lattice. In this study we apply our code to beam line lattices of two comparative isochronous recirculation arcs and one arc lattice preceded by a linac section. The resultant microbunching gain functions and spectral responses are presented, with some results compared to particle tracking simulation by elegant (M. Borland, APS Light Source Note No. LS-287, 2002). These results demonstrate clearly the impact of arc lattice design on the microbunching development. Lastly, the underlying physics with inclusion of those collective effects is elucidated and the limitation of the existing formulation is also discussed.
NPTFit: A Code Package for Non-Poissonian Template Fitting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R., E-mail: smsharma@princeton.edu, E-mail: nrodd@mit.edu, E-mail: bsafdi@mit.edu
We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ∼GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow for the ability to search for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, thus it likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
Improved design of special boundary elements for T-shaped reinforced concrete walls
NASA Astrophysics Data System (ADS)
Ji, Xiaodong; Liu, Dan; Qian, Jiaru
2017-01-01
This study examines the design provisions of the Chinese GB 50011-2010 code for seismic design of buildings for the special boundary elements of T-shaped reinforced concrete walls and proposes an improved design method. Comparison of the design provisions of the GB 50011-2010 code and those of the American code ACI 318-14 indicates a possible deficiency in the T-shaped wall design provisions in GB 50011-2010. A case study of a typical T-shaped wall designed in accordance with GB 50011-2010 also indicates the insufficient extent of the boundary element at the non-flange end and overly conservative design of the flange end boundary element. Improved designs for special boundary elements of T-shaped walls are developed using a displacement-based method. The proposed design formulas produce a longer boundary element at the non-flange end and a shorter boundary element at the flange end, relative to those of the GB 50011-2010 provisions. Extensive numerical analysis indicates that T-shaped walls designed using the proposed formulas develop inelastic drift of 0.01 for both cases of the flange in compression and in tension.
Mad-X a worthy successor for MAD8?
NASA Astrophysics Data System (ADS)
Schmidt, F.
2006-03-01
MAD-X is the successor at CERN to MAD8, a program for accelerator design and simulation with a long history. We had to give up on MAD8 since the code had evolved in such a way that maintenance and upgrades had become increasingly difficult. In particular, the memory management with the Zebra banks seemed outdated. MAD-X was first released in June, 2002. It offers most of the MAD8 functionality, with some additions, corrections, and extensions. The most important of these extensions is the interface to PTC, the Polymorphic Tracking Code by E. Forest. The most relevant new features of MAD-X are: languages: C, Fortran77, and Fortran90; dynamic memory allocation in the core program written in C; strictly modular organization; modified and extended input language; symplectic and arbitrarily exact description of all elements via PTC; Taylor maps and normal form techniques using PTC. It is also important to note that we have adopted a new style for program development and maintenance that relies heavily on active maintenance of modules by the users themselves. Proposals for collaboration, such as those with KEK (Japan) and GSI (Germany), are therefore very welcome.
New estimates of extensive-air-shower energies on the basis of signals in scintillation detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anyutin, N. V.; Dedenko, L. G., E-mail: ddn@dec1.sinp.msu.ru; Roganova, T. M.
New formulas are proposed for estimating the energy of inclined extensive air showers (EASs) on the basis of signals in detectors, using an original method together with detailed tables of the signals induced in scintillation detectors by photons, electrons, positrons, and muons, calculated with the aid of the GEANT4 code package in terms of the QGSJETII-04, EPOS LHC, and GHEISHA models. The parameters appearing in the proposed formulas were calculated by employing the CORSIKA code package. It is shown that, for showers with zenith angles in the range of 20°-45°, the standard constant-intensity-cut method, which is used to interpret data from the Yakutsk EAS array, overestimates the shower energy by a factor of 1.2 to 1.5. It is proposed to employ the calculated VEM (Vertical Equivalent Muon) signal units of 10.8 and 11.4 MeV for, respectively, ground-based and underground scintillation detectors, and to take into account the dependence of signals on the azimuthal angle of the detector position and fluctuations in the development of showers.
Gnuastro: GNU Astronomy Utilities
NASA Astrophysics Data System (ADS)
Akhlaghi, Mohammad
2018-01-01
Gnuastro (GNU Astronomy Utilities) manipulates and analyzes astronomical data. It is an official GNU package comprising a large collection of programs and C/C++ library functions. Command-line programs perform arithmetic operations on images, convert FITS images to common types like JPG or PDF, convolve an image with a given kernel or match kernels, perform cosmological calculations, crop parts of large images (possibly in multiple files), manipulate FITS extensions and keywords, and perform statistical operations. In addition, it contains programs to make catalogs from detection maps, add noise, make mock profiles with a variety of radial functions using Monte Carlo integration for their centers, match catalogs, and detect objects in an image, among many other operations. The command-line programs share the same basic command-line user interface for the comfort of both users and developers. Gnuastro is written to comply fully with the GNU coding standards and integrates well with all Unix-like operating systems. This enables astronomers to expect a fully familiar experience in the source code, building, installing, and command-line user interaction that they have seen in all the other GNU software that they use. Gnuastro's extensive library is included for users who want to build their own unique programs.
A generalized chemistry version of SPARK
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.
1988-01-01
An extension of the reacting H2-air computer code SPARK is presented, which enables the code to be used on any reacting flow problem. Routines are developed that calculate, in a general fashion, the reaction rates and chemical Jacobians of any reacting system. In addition, an equilibrium routine is added so that the code has frozen, finite-rate, and equilibrium capabilities. The reaction rate for each species is determined from the law of mass action using Arrhenius expressions for the rate constants. The Jacobian routines are determined by numerically or analytically differentiating the law of mass action for each species. The equilibrium routine is based on a Gibbs free energy minimization routine. The routines are written in FORTRAN 77, with special consideration given to vectorization. Run times for the generalized routines are generally 20 percent slower than reaction-specific routines. The numerical efficiency of the generalized analytical Jacobian, however, is nearly 300 percent better than that of the reaction-specific numerical Jacobian used in SPARK.
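The heart of such a generalized rate routine is compact. Below is a minimal Python sketch, for an invented two-species reaction, of the law of mass action with a modified Arrhenius rate constant; it illustrates the formulation only and is not the SPARK code, whose FORTRAN 77 routines are vectorized and handle full mechanisms.

    import numpy as np

    R = 8.314  # universal gas constant, J/(mol K)

    def arrhenius(A, beta, Ea, T):
        """Modified Arrhenius rate constant k = A * T**beta * exp(-Ea/(R*T))."""
        return A * T**beta * np.exp(-Ea / (R * T))

    def reaction_rate(k, conc, nu):
        """Law of mass action: rate = k * prod_j [X_j]**nu_j over the
        reactant stoichiometric coefficients nu_j."""
        return k * np.prod(conc ** nu)

    # Invented example: A + 2B -> products at 1500 K.
    T = 1500.0
    k = arrhenius(A=1.0e8, beta=0.0, Ea=1.2e5, T=T)
    conc = np.array([0.5, 0.2])   # [A], [B] in mol/m^3
    nu = np.array([1.0, 2.0])     # reactant orders
    print(f'k = {k:.3e}, rate = {reaction_rate(k, conc, nu):.3e}')

The analytical Jacobian mentioned above follows from differentiating the same expression, d(rate)/d[X_j] = nu_j * rate / [X_j], which is why a generalized analytical Jacobian can be so much cheaper than finite differencing every species.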
Simulations of Coherent Synchrotron Radiation Effects in Electron Machines
NASA Astrophysics Data System (ADS)
Migliorati, M.; Schiavi, A.; Dattoli, G.
2007-09-01
Coherent synchrotron radiation (CSR) generated by high intensity electron beams can be a source of undesirable effects limiting the performance of storage rings. The complexity of the physical mechanisms underlying the interplay between the electron beam and the CSR demands reliable simulation codes. In the past, codes based on Lie algebraic techniques have been very efficient for treating transport problems in accelerators. The extension of these methods to the nonlinear case is ideally suited to treat wakefield-beam interaction. In this paper we report on the development of a numerical code, based on the solution of the Vlasov equation, which includes the nonlinear contribution due to wakefields. The proposed solution method exploits an algebraic technique that uses exponential operators. We show that, in the case of CSR wakefields, the integration procedure is capable of reproducing the onset of an instability which leads to microbunching of the beam, thus increasing the CSR at short wavelengths. In addition, considerations on the threshold of the instability for Gaussian bunches are also reported.
Geometric Nonlinear Computation of Thin Rods and Shells
NASA Astrophysics Data System (ADS)
Grinspun, Eitan
2011-03-01
We develop simple, fast numerical codes for the dynamics of thin elastic rods and shells, by exploiting the connection between physics, geometry, and computation. By building a discrete mechanical picture from the ground up, mimicking the axioms, structures, and symmetries of the smooth setting, we produce numerical codes that not only are consistent in a classical sense, but also reproduce qualitative, characteristic behavior of a physical system (such as exact preservation of conservation laws), even for very coarse discretizations. As two recent examples, we present discrete computational models of elastic rods and shells, with straightforward extensions to the viscous setting. Even at coarse discretizations, the resulting simulations capture characteristic geometric instabilities. The numerical codes we describe are used in experimental mechanics, cinema, and consumer software products. This is joint work with Miklós Bergou, Basile Audoly, Max Wardetzky, and Etienne Vouga. This research is supported in part by the Sloan Foundation, the NSF, Adobe, Autodesk, Intel, the Walt Disney Company, and Weta Digital.
LSSGalPy: Interactive Visualization of the Large-scale Environment Around Galaxies
NASA Astrophysics Data System (ADS)
Argudo-Fernández, M.; Duarte Puertas, S.; Ruiz, J. E.; Sabater, J.; Verley, S.; Bergond, G.
2017-05-01
New tools are needed to handle the growth of data in astrophysics delivered by recent and upcoming surveys. We aim to build open-source, light, flexible, and interactive software designed to visualize extensive three-dimensional (3D) tabular data. Entirely written in the Python language, we have developed interactive tools to browse and visualize the positions of galaxies in the universe and their positions with respect to its large-scale structures (LSS). Motivated by a previous study, we created two codes using Mollweide projection and wedge diagram visualizations, where survey galaxies can be overplotted on the LSS of the universe. These are interactive representations whose visualizations can be controlled by widgets. We have released these open-source codes, which have been designed to be easily re-used and customized by the scientific community to fulfill their needs. The codes are adaptable to other kinds of 3D tabular data and are robust enough to handle several million objects.
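Matplotlib ships a Mollweide projection that is enough to reproduce the flavor of such an all-sky view. The sketch below scatters a randomly generated mock catalogue; it is a generic illustration, not the LSSGalPy code, which layers interactive widgets on top of plots like this one.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(42)

    # Mock catalogue: longitude in [-pi, pi), latitude in [-pi/2, pi/2],
    # drawn uniformly on the sphere, with a mock redshift for coloring.
    lon = rng.uniform(-np.pi, np.pi, 2000)
    lat = np.arcsin(rng.uniform(-1.0, 1.0, 2000))
    z = rng.uniform(0.0, 0.1, 2000)

    fig = plt.figure(figsize=(8, 4))
    ax = fig.add_subplot(111, projection='mollweide')
    sc = ax.scatter(lon, lat, c=z, s=2, cmap='viridis')
    fig.colorbar(sc, label='redshift')
    ax.grid(True)
    plt.show()

Wiring such a plot to sliders (for example, a redshift range selector) is what turns the static projection into the interactive browser described above.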
75 FR 15693 - Extension of Web-Based TRICARE Assistance Program Demonstration Project
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... DEPARTMENT OF DEFENSE Office of the Secretary Extension of Web-Based TRICARE Assistance Program Demonstration Project AGENCY: Department of Defense (DoD). ACTION: Notice of a Web-Based TRICARE Assistance... States Code, Section 1092, entitled Web-Based TRICARE Assistance Program. This demonstration was...
Extensions and Adjuncts to the BRL-COMGEOM Program
1974-08-01
Keywords: MAGIC Code, GIFT Code, Computer Simulation, Target Description, Geometric Modeling Techniques, Vulnerability Analysis. This report describes extensions and adjuncts to the BRL-GIFT code, including BRITL, a geometry preprocessor program for input to the GIFT system, and an addition to the list of available body types (the arbitrary quadric surface).
Mu, Zhiping; Hong, Baoming; Li, Shimin; Liu, Yi-Hwa
2009-01-01
Coded aperture imaging for two-dimensional (2D) planar objects has been investigated extensively in the past, whereas little success has been achieved in imaging 3D objects using this technique. In this article, the authors present a novel method of 3D single photon emission computerized tomography (SPECT) reconstruction for near-field coded aperture imaging. Multiangular coded aperture projections are acquired and a stack of 2D images is reconstructed separately from each of the projections. Secondary projections are subsequently generated from the reconstructed image stacks based on the geometry of parallel-hole collimation and the variable magnification of near-field coded aperture imaging. Sinograms of cross-sectional slices of 3D objects are assembled from the secondary projections, and the ordered subset expectation and maximization algorithm is employed to reconstruct the cross-sectional image slices from the sinograms. Experiments were conducted using a customized capillary tube phantom and a micro hot rod phantom. Imaged at approximately 50 cm from the detector, hot rods in the phantom with diameters as small as 2.4 mm could be discerned in the reconstructed SPECT images. These results have demonstrated the feasibility of the authors’ 3D coded aperture image reconstruction algorithm for SPECT, representing an important step in their effort to develop a high sensitivity and high resolution SPECT imaging system. PMID:19544769
mdFoam+: Advanced molecular dynamics in OpenFOAM
NASA Astrophysics Data System (ADS)
Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.
2018-03-01
This paper introduces mdFoam+, which is an MPI parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is aimed first and foremost as an MD research tool, in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well to another well-known MD code (e.g. LAMMPS) in terms of benchmark problems, although it also has additional functionality that does not exist in other open-source MD codes.
Advanced capabilities for materials modelling with Quantum ESPRESSO
NASA Astrophysics Data System (ADS)
Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Buongiorno Nardelli, M.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; Colonna, N.; Carnimeo, I.; Dal Corso, A.; de Gironcoli, S.; Delugas, P.; DiStasio, R. A., Jr.; Ferretti, A.; Floris, A.; Fratesi, G.; Fugallo, G.; Gebauer, R.; Gerstmann, U.; Giustino, F.; Gorni, T.; Jia, J.; Kawamura, M.; Ko, H.-Y.; Kokalj, A.; Küçükbenli, E.; Lazzeri, M.; Marsili, M.; Marzari, N.; Mauri, F.; Nguyen, N. L.; Nguyen, H.-V.; Otero-de-la-Roza, A.; Paulatto, L.; Poncé, S.; Rocca, D.; Sabatini, R.; Santra, B.; Schlipf, M.; Seitsonen, A. P.; Smogunov, A.; Timrov, I.; Thonhauser, T.; Umari, P.; Vast, N.; Wu, X.; Baroni, S.
2017-11-01
Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.
Extension of an Object-Oriented Optimization Tool: User's Reference Manual
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Truong, Samson S.
2015-01-01
The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O^3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O^3 tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O^3 tool. Instructions for preparing input data for the O^3 tool are detailed in this user's manual.
Recent results from the SIMECA code and VLTI observations
NASA Astrophysics Data System (ADS)
Meilland, A.; Stee, Ph.
We present recent results on active hot stars using the first VLTI/MIDI and VLTI/AMBER observations, and the SIMECA code developed by Stee (1994). α Arae was the first classical Be star observed with MIDI (June 2003) and AMBER (February 2005). The size of its circumstellar envelope was measured and, thanks to the spectrally resolved interferometric AMBER observations, we were able for the first time to evidence the Keplerian rotation of the circumstellar disc. MWC 297 is one of the brightest Herbig Be stars, and it was observed during the first commissioning run of AMBER in May 2004. The data obtained were good enough to allow us to infer the extension of the accretion disc, as well as to put strong constraints on the stellar wind geometry and kinematics.
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2013-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code; the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphics processing unit (GPU), and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
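As a rough illustration of what such a tool does at its core, the sketch below (generic Python/NumPy with invented field names and thresholds, not the NASA tool itself) sweeps a batch of Monte Carlo outputs and flags the runs that trip a failure criterion:

```python
# Illustrative sketch of the core idea: scan many Monte Carlo runs and
# point the analyst at the ones that trip a failure criterion.
# The quantity, threshold, and distribution are all made up.
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for thousands of simulation runs: touchdown speed per run (m/s).
touchdown_speed = rng.normal(loc=2.5, scale=0.6, size=10_000)

LIMIT = 4.0                                   # hypothetical failure threshold
failed = np.flatnonzero(touchdown_speed > LIMIT)

print(f"{failed.size} of {touchdown_speed.size} runs exceed {LIMIT} m/s")
print("first offending run indices:", failed[:10])
```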
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshimura, Ann S.; Brandt, Larry D.
2009-11-01
The NUclear EVacuation Analysis Code (NUEVAC) has been developed by Sandia National Laboratories to support the analysis of shelter-evacuate (S-E) strategies following an urban nuclear detonation. This tool can model a range of behaviors, including complex evacuation timing and path selection, as well as various sheltering or mixed evacuation and sheltering strategies. The calculations are based on externally generated, high-resolution fallout deposition and plume data. Scenario setup and calculation outputs make extensive use of graphics and interactive features. This software is designed primarily to produce quantitative evaluations of nuclear detonation response options. However, the outputs have also proven useful in the communication of technical insights concerning shelter-evacuate tradeoffs to urban planning or response personnel.
Low-density parity-check codes for volume holographic memory systems.
Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali
2003-02-10
We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to substantially reduce the bit error rate. The prior knowledge of the noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes outperform both Reed-Solomon (RS) codes and their regular LDPC counterparts. Our simulations show that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of the conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information-theoretic capacity.
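For readers unfamiliar with LDPC decoding, the toy below shows hard-decision bit-flipping on a small parity-check matrix (a (7,4) Hamming H standing in for a sparse LDPC matrix). It is only a minimal sketch of the decoder family; the paper's irregular codes and soft-decision decoding are substantially more involved:

```python
# Toy hard-decision bit-flipping decoder on a small parity-check matrix;
# a minimal sketch only, not the paper's optimized irregular LDPC codes.
import numpy as np

H = np.array([[1, 1, 1, 0, 1, 0, 0],     # (7,4) Hamming parity checks,
              [1, 1, 0, 1, 0, 1, 0],     # standing in for a sparse LDPC H
              [1, 0, 1, 1, 0, 0, 1]])

def bit_flip_decode(y, H, max_iters=10):
    y = y.copy()
    for _ in range(max_iters):
        syndrome = H @ y % 2
        if not syndrome.any():
            break                          # all parity checks satisfied
        fails = syndrome @ H               # per-bit count of failed checks
        y[np.argmax(fails)] ^= 1           # flip the most suspicious bit
    return y

received = np.zeros(7, dtype=int)
received[0] ^= 1                           # corrupt one bit of the all-zero word
print(bit_flip_decode(received, H))        # -> [0 0 0 0 0 0 0]
```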
Computational Infrastructure for Geodynamics (CIG)
NASA Astrophysics Data System (ADS)
Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.
2004-12-01
Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and, although this approach has proven successful, it is now starting to show its limitations as we try to share codes and algorithms or recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community, from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open-source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.
Supersonics Project - Airport Noise Tech Challenge
NASA Technical Reports Server (NTRS)
Bridges, James
2010-01-01
The Airport Noise Tech Challenge research effort under the Supersonics Project is reviewed. While the goal of "Improved supersonic jet noise models validated on innovative nozzle concepts" remains the same, the success of the research effort has caused the thrust of the research to be modified going forward. The main activities from FY06-10 focused on the development and validation of jet noise prediction codes. This required innovative diagnostic techniques to be developed and deployed, extensive jet noise and flow databases to be created, and computational tools to be developed and validated. Furthermore, in FY09-10 systems studies commissioned by the Supersonics Project showed that viable supersonic aircraft were within reach using variable cycle engine architectures if exhaust nozzle technology could provide 3-5 dB of suppression. The Project then began to focus on integrating the technologies being developed in its Tech Challenge areas to bring about successful system designs. Consequently, the Airport Noise Tech Challenge area has shifted efforts from developing jet noise prediction codes to using them to develop low-noise nozzle concepts for integration into supersonic aircraft. The new plan of research is briefly presented by technology and timelines.
Pop-Bica, Cecilia; Gulei, Diana; Cojocneanu-Petric, Roxana; Braicu, Cornelia; Petrut, Bogdan; Berindan-Neagoe, Ioana
2017-01-01
The mortality and morbidity that characterize bladder cancer place this malignancy among the hot topics of biomolecular research. A better knowledge of the specific molecular mechanisms that underlie the development and progression of bladder cancer is therefore needed. Tumor heterogeneity among patients with a similar diagnosis, as well as intratumor heterogeneity, generates difficulties for targeted therapy. Furthermore, late diagnosis remains an ongoing issue, significantly reducing the response to therapy and, inevitably, overall survival. The role of non-coding RNAs in bladder cancer emerged in the last decade, revealing that microRNAs (miRNAs) may act as tumor suppressors or oncogenes, and also as biomarkers for early diagnosis. Regarding other types of non-coding RNAs, especially the long non-coding RNAs (lncRNAs) extensively reviewed in this article, their exact roles in tumorigenesis are, for the time being, not as evident as those of miRNAs, but are still clearly suggested. This review therefore covers the non-coding RNA expression profile of bladder cancer patients and their validated target genes in bladder cancer cell lines, with repercussions on processes such as proliferation, invasiveness, apoptosis, cell cycle arrest, and other molecular pathways specific to the malignant transformation of cells. PMID:28703782
Optimizations of a Hardware Decoder for Deep-Space Optical Communications
NASA Technical Reports Server (NTRS)
Cheng, Michael K.; Nakashima, Michael A.; Moision, Bruce E.; Hamkins, Jon
2007-01-01
The National Aeronautics and Space Administration has developed a capacity-approaching modulation and coding scheme that comprises a serial concatenation of an inner accumulate pulse-position modulation (PPM) and an outer convolutional code [or serially concatenated PPM (SCPPM)] for deep-space optical communications. Decoding of this code uses the turbo principle. However, due to the nonbinary property of SCPPM, a straightforward application of classical turbo decoding is very inefficient. Here, we present various optimizations applicable to hardware implementations of the SCPPM decoder. More specifically, we feature a Super Gamma computation to efficiently handle parallel trellis edges, a pipeline-friendly 'maxstar top-2' circuit that reduces the max-only approximation penalty, a low-latency cyclic redundancy check circuit for window-based decoders, and a high-speed algorithmic polynomial interleaver that leads to memory savings. Using the featured optimizations, we implement a 6.72 megabits-per-second (Mbps) SCPPM decoder on a single field-programmable gate array (FPGA). Compared to the current data rate of 256 kilobits per second from Mars, the SCPPM coded scheme represents a throughput increase of more than twenty-six-fold. Extension to a 50-Mbps decoder on a board with multiple FPGAs follows naturally. We show through hardware simulations that the SCPPM coded system can operate within 1 dB of the Shannon capacity at nominal operating conditions.
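The 'max-only' penalty the abstract refers to comes from dropping the correction term of the max-star operation used throughout log-domain turbo decoding. A minimal sketch of the underlying arithmetic (standard log-MAP identities, not JPL's hardware circuit):

```python
# The max-star identity behind log-MAP decoding (standard textbook math,
# not the hardware 'maxstar top-2' circuit of the paper).
import math

def maxstar(a: float, b: float) -> float:
    """Exact: ln(e^a + e^b) = max(a, b) + ln(1 + e^-|a-b|)."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_only(a: float, b: float) -> float:
    """Max-log approximation: drops the correction term."""
    return max(a, b)

a, b = 1.2, 0.9
print(maxstar(a, b))   # ~1.754: exact log-sum-exp
print(max_only(a, b))  # 1.2: the approximation whose penalty the
                       # 'maxstar top-2' optimization aims to reduce
```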
An analysis of the code-switching phenomenon in bimodal bilinguals (Libras and Portuguese).
de Sousa, Aline Nunes; de Quadros, Ronice Müller
2012-01-01
An interesting linguistic phenomenon that occurs in interactions among bilingual people is code-switching. In this paper, we investigate code-switching between spoken Brazilian Portuguese and Brazilian Sign Language (Libras) within the same enunciative chain, with the goal of identifying and analyzing the use of code-switching in the speech of a child and an adult (both hearing children of deaf parents) interacting in an intermodal bilingual context with deaf and hearing interlocutors. Code-switching, in this case, occurs when a person stops speaking Portuguese and switches to signing. The present research is an initial study, with qualitative analysis of the data. Our corpus is composed of nine sessions of interactions in Libras and spoken Portuguese, recorded on video, part of the Bimodal Bilingual Development Project at UFSC. The data show that the adult's and the child's code-switching have both similarities and differences. The adult seems to switch with more concern for the course of the interaction, whereas the child did not seem to use code-switching for specific pragmatic reasons. With regard to the extent of the switching, both the child and the adult used sentences of more than one word. The role of the interlocutors seems to be decisive in the interactions investigated here, especially for the adult, since the child is still acquiring awareness of the role of the interlocutor in an interaction.
Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics
NASA Astrophysics Data System (ADS)
Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.
2015-12-01
Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications utilizing scientific software in the past 5 years that is available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software either through citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers with which to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that a lack of knowledge, tools, and workflows for citing codes is a barrier to effectively implementing the emerging citation norms. Generated on-demand attributions on software landing pages and a prototype extensible plug-in that automatically generates attributions in codes are the first steps towards reproducibility.
Application of Chimera Navier-Stokes Code for High Speed Flows
NASA Technical Reports Server (NTRS)
Ajmani, Kumud
1997-01-01
The primary task for this year was performed in support of the "Trailblazer" project. The purpose of the task was to perform an extensive CFD study of the shock boundary-layer interaction between the engine diverters and the primary body surfaces of the Trailblazer vehicle. Information gathered from this study would be used to determine the effectiveness of the diverters in preventing the boundary layer coming off of the vehicle forebody from entering the main engines. The PEGSUS code was used to define the "holes" and "boundaries" for each grid. Two sets of CFD calculations were performed, and extensive post-processing of the results was carried out.
Confined Turbulent Swirling Recirculating Flow Predictions. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Abujelala, M. T.
1984-01-01
Turbulent swirling flow, the STARPIC computer code, turbulence modeling of turbulent flows, the k-ε turbulence model and its extensions, the deduction of turbulence parameters from confined swirling flow measurements, the extension of the k-ε model to confined swirling recirculating flows, and general predictions for confined turbulent swirling flow are discussed.
78 FR 73915 - Notice of Request for Extension of Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-09
TENNESSEE VALLEY AUTHORITY. Notice of Request for Extension of Currently Approved Information Collection. SUMMARY: The information collection described below will be..., Enterprise Information Security & Policy. [FR Doc. 2013-29218 Filed 12-6-13; 8:45 am] BILLING CODE 8120-08-P ...
75 FR 63856 - Public Land Order No. 7753; Extension of Public Land Order No. 7464; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTL06000-L14300000.ET0000; MTM 89170] Public Land Order No. 7753; Extension of Public Land Order No. 7464; Montana AGENCY: Bureau of Land...:45 am] BILLING CODE 4310-DN-P ...
Genomic Sequence around Butterfly Wing Development Genes: Annotation and Comparative Analysis
Conceição, Inês C.; Long, Anthony D.; Gruber, Jonathan D.; Beldade, Patrícia
2011-01-01
Background: Analysis of genomic sequence allows characterization of genome content and organization, and access beyond gene-coding regions for identification of functional elements. BAC libraries, where relatively large genomic regions are made readily available, are especially useful for species without a fully sequenced genome and can increase genomic coverage of phylogenetic and biological diversity. For example, no butterfly genome is yet available despite the unique genetic and biological properties of this group, such as diversified wing color patterns. The evolution and development of these patterns is being studied in a few target species, including Bicyclus anynana, where a whole-genome BAC library allows targeted access to large genomic regions. Methodology/Principal Findings: We characterize ∼1.3 Mb of genomic sequence around 11 selected genes expressed in B. anynana developing wings. Extensive manual curation of in silico predictions, also making use of a large dataset of expressed genes for this species, identified repetitive elements and protein-coding sequence, and highlighted an expansion of Alcohol dehydrogenase genes. Comparative analysis with orthologous regions of the lepidopteran reference genome allowed assessment of conservation of fine-scale synteny (with detection of new inversions and translocations) and of DNA sequence (with detection of high levels of conservation of non-coding regions around some, but not all, developmental genes). Conclusions: The general properties and organization of the available B. anynana genomic sequence are similar to the lepidopteran reference, despite more than 140 MY of divergence. Our results lay the groundwork for further studies of new interesting findings in relation to both coding and non-coding sequence: 1) the Alcohol dehydrogenase expansion, with higher similarity between the five tandemly repeated B. anynana paralogs than with the corresponding B. mori orthologs, and 2) the high conservation of non-coding sequence around the genes wingless and Ecdysone receptor, both involved in multiple developmental processes including wing pattern formation. PMID:21909358
Profile of the Successful Recruiter
1987-06-01
... Personnel Research and Development Center (NPRDC) (Borman, et al., 1976-1986) provides the most extensive and promising work in this area. This work has... "Ambitious, working hard" correlated highest with the human relations skills category. "Unhappy, lack of confidence" related negatively to human relations.
Engineering Analysis Using a Web-based Protocol
NASA Technical Reports Server (NTRS)
Schoeffler, James D.; Claus, Russell W.
2002-01-01
This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
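The store/retrieve pattern can be illustrated with Python's standard XML library. The element and attribute names below are hypothetical, not the actual LAPIN data schema:

```python
# Minimal sketch, with hypothetical element names, of encapsulating
# engineering analysis inputs in XML and reading them back; this is the
# store/retrieve pattern described above, not the actual LAPIN schema.
import xml.etree.ElementTree as ET

case = ET.Element("analysisCase", name="inlet-study-01")
ET.SubElement(case, "parameter", name="machNumber").text = "3.0"
ET.SubElement(case, "parameter", name="altitude_m").text = "18000"

xml_text = ET.tostring(case, encoding="unicode")
print(xml_text)

# Retrieval: parse the document and recover a {name: value} dictionary.
root = ET.fromstring(xml_text)
params = {p.get("name"): float(p.text) for p in root.findall("parameter")}
print(params)   # {'machNumber': 3.0, 'altitude_m': 18000.0}
```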
An extensive coronagraphic simulation applied to LBT
NASA Astrophysics Data System (ADS)
Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.
2016-08-01
In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can then be computed, exploring a wide range of Strehl values and observing conditions.
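The essence of such end-to-end physical optics propagation is Fourier propagation from pupil to focal plane. The toy NumPy sketch below computes an unocculted PSF for a circular aperture; it only hints at the approach and is not SHARK-NIR's simulation code:

```python
# Toy Fraunhofer propagation: pupil plane -> focal plane via FFT,
# illustrating the building block of end-to-end physical optics codes.
import numpy as np

N = 512
y, x = np.indices((N, N)) - N // 2
pupil = (np.hypot(x, y) <= N // 8).astype(float)     # circular aperture

# Far-field amplitude is the FFT of the pupil; intensity is the PSF.
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
psf = np.abs(field) ** 2
psf /= psf.max()

print("peak-normalized intensity at 20 px offset:", psf[N // 2, N // 2 + 20])
```

A coronagraph simulation then inserts focal-plane masks and pupil stops between repeated transforms of this kind.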
ORAC-DR: Pipelining With Other People's Code
NASA Astrophysics Data System (ADS)
Economou, Frossie; Bridger, Alan; Wright, Gillian S.; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy
As part of the UKIRT ORAC project, we have developed a pipeline (orac-dr) for driving on-line data reduction using existing astronomical packages as algorithm engines and display tools. The design is modular and extensible on several levels, allowing it to be easily adapted to a wide variety of instruments. Here we briefly review the design, discuss the robustness and speed-of-execution issues inherent in such pipelines, and address what constitutes a desirable (in terms of "buy-in" effort) engine or tool.
ERIC Educational Resources Information Center
Mayer, Richard E.; Sims, Valerie K.
1994-01-01
In 2 experiments, 162 high- and low-spatial ability students viewed a computer-generated animation and heard a concurrent or successive explanation. The concurrent group generated more creative solutions to transfer problems and demonstrated a contiguity effect consistent with dual-coding theory. (SLD)
Error suppression via complementary gauge choices in Reed-Muller codes
NASA Astrophysics Data System (ADS)
Chamberland, Christopher; Jochym-O'Connor, Tomas
2017-09-01
Concatenation of two quantum error-correcting codes with complementary sets of transversal gates can provide a means toward universal fault-tolerant quantum computation. We first show that it is generally preferable to choose the inner code with the higher pseudo-threshold to achieve lower logical failure rates. We then explore the threshold properties of a wide range of concatenation schemes. Notably, we demonstrate that the concatenation of complementary sets of Reed-Muller codes can increase the code capacity threshold under depolarizing noise when compared to extensions of previously proposed concatenation models. We also analyze the properties of logical errors under circuit-level noise, showing that smaller codes perform better for all sampled physical error rates. Our work provides new insights into the performance of universal concatenated quantum codes for both code capacity and circuit-level noise.
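As background, the classical codes underlying these quantum constructions are easy to build. The sketch below constructs the generator matrix of the first-order Reed-Muller code RM(1, m); this is standard textbook material, not the paper's concatenation scheme:

```python
# Generator matrix of the first-order Reed-Muller code RM(1, m):
# rows are the all-ones function and the m coordinate functions on F_2^m.
import itertools
import numpy as np

def rm1_generator(m: int) -> np.ndarray:
    points = list(itertools.product([0, 1], repeat=m))   # 2**m evaluation points
    rows = [[1] * len(points)]                           # constant function
    rows += [[p[i] for p in points] for i in range(m)]   # x_1 ... x_m
    return np.array(rows)

G = rm1_generator(3)          # RM(1,3): an [8, 4, 4] binary code
print(G)
print("codeword for message 1011:", (np.array([1, 0, 1, 1]) @ G) % 2)
```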
Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.
Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting
2012-09-01
In H.264/Advanced Video Coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploiting the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little increase in computation, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
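The scan-order idea can be pictured with a toy example: if a mode's residual coefficients cluster in known positions, a mode-matched scan visits those positions first, leaving one long zero run for the entropy coder. The template below is invented for illustration and is not one of the paper's MD-templates:

```python
# Toy illustration of a mode-dependent scan order; the residual and the
# template are invented, not the paper's actual MD-templates.
import numpy as np

residual = np.array([[3, 0, 0, 0],
                     [2, 1, 0, 0],
                     [1, 0, 0, 0],
                     [1, 0, 0, 0]])   # residual of a hypothetical intra mode

# Hypothetical scan for this mode: walk the likely-nonzero positions first.
scan = [(0, 0), (1, 0), (2, 0), (3, 0), (1, 1), (0, 1), (2, 1), (3, 1),
        (0, 2), (1, 2), (2, 2), (3, 2), (0, 3), (1, 3), (2, 3), (3, 3)]

sequence = [int(residual[r, c]) for r, c in scan]
print(sequence)   # [3, 2, 1, 1, 1, 0, ...]: nonzeros up front, one long
                  # zero run at the tail, which entropy coders handle cheaply
```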
SYMTRAN - A Time-dependent Symmetric Tandem Mirror Transport Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hua, D; Fowler, T
2004-06-15
A time-dependent version of the steady-state radial transport model in symmetric tandem mirrors in Ref. [1] has been coded up and first tests performed. Our code, named SYMTRAN, is an adaptation of the earlier SPHERE code for spheromaks, now modified for tandem mirror physics. Motivated by Post's new concept of kinetic stabilization of symmetric mirrors, it is an extension of the earlier TAMRAC rate-equation code omitting radial transport [2], which successfully accounted for experimental results in TMX. The SYMTRAN code differs from the earlier tandem mirror radial transport code TMT in that our code is focused on axisymmetric tandem mirrors and classical diffusion, whereas TMT emphasized non-ambipolar transport in TMX and MFTF-B due to yin-yang plugs and non-symmetric transitions between the plugs and the axisymmetric center cell. Both codes exhibit interesting but different non-linear behavior.
Modelling of LOCA Tests with the BISON Fuel Performance Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williamson, Richard L; Pastore, Giovanni; Novascone, Stephen Rhead
2016-05-01
BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.
Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release
NASA Astrophysics Data System (ADS)
Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.
2017-11-01
We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.
MitoNuc: a database of nuclear genes coding for mitochondrial proteins. Update 2002.
Attimonelli, Marcella; Catalano, Domenico; Gissi, Carmela; Grillo, Giorgio; Licciulli, Flavio; Liuni, Sabino; Santamaria, Monica; Pesole, Graziano; Saccone, Cecilia
2002-01-01
Mitochondria, besides their central role in energy metabolism, have recently been found to be involved in a number of basic processes of cell life and to contribute to the pathogenesis of many degenerative diseases. All functions of mitochondria depend on the interaction of the nuclear and organelle genomes. Mitochondrial genomes have been extensively sequenced and analysed, and the data have been collected in several specialised databases. In order to collect information on nuclear-coded mitochondrial proteins, we developed MitoNuc, a database containing detailed information on sequenced nuclear genes coding for mitochondrial proteins in Metazoa. The MitoNuc database can be retrieved through SRS and is available via the web site http://bighost.area.ba.cnr.it/mitochondriome, where the other mitochondrial databases developed by our group, the complete list of the sequenced mitochondrial genomes, links to other mitochondrial sites, and related information are available. The MitoAln database, related to MitoNuc in the previous release and reporting the multiple alignments of the relevant homologous protein-coding regions, is no longer supported in the present release. In order to keep the links among MitoNuc entries derived from homologous proteins, a new field in the database has been defined: the cluster identifier, an alphanumeric code used to identify each cluster of homologous proteins. A comment field derived from the corresponding SWISS-PROT entry has been introduced; this reports clinical data related to dysfunction of the protein. The logical scheme of the MitoNuc database has been implemented in the ORACLE DBMS. This will allow end-users to retrieve data through a friendly interface that will soon be implemented.
Efficient population-scale variant analysis and prioritization with VAPr.
Birmingham, Amanda; Mark, Adam M; Mazzaferro, Carlo; Xu, Guorong; Fisch, Kathleen M
2018-04-06
With the growing availability of population-scale whole-exome and whole-genome sequencing, demand for reproducible, scalable variant analysis has spread within genomic research communities. To address this need, we introduce the Python package VAPr (Variant Analysis and Prioritization). VAPr leverages existing annotation tools ANNOVAR and MyVariant.info with MongoDB-based flexible storage and filtering functionality. It offers biologists and bioinformatics generalists easy-to-use and scalable analysis and prioritization of genomic variants from large cohort studies. VAPr is developed in Python and is available for free use and extension under the MIT License. An install package is available on PyPi at https://pypi.python.org/pypi/VAPr, while source code and extensive documentation are on GitHub at https://github.com/ucsd-ccbb/VAPr. kfisch@ucsd.edu.
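To give a flavor of the MongoDB-based filtering layer, the sketch below queries a variant collection directly with pymongo. The database, collection, field names, and thresholds are hypothetical; VAPr's actual filtering API is documented at the links above:

```python
# Sketch of the kind of MongoDB-backed variant prioritization VAPr builds
# on, written directly against pymongo; all field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
variants = client["cohort_db"]["annotated_variants"]

# Prioritize rare, predicted-deleterious coding variants across the cohort.
query = {
    "gnomad_af": {"$lt": 0.01},        # population allele frequency < 1%
    "cadd_phred": {"$gt": 20},         # high predicted deleteriousness
    "func_knowngene": "exonic",        # coding region
}
for doc in variants.find(query).limit(5):
    print(doc["chrom"], doc["pos"], doc["gene"])
```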
An assessment of laser velocimetry in hypersonic flow
NASA Technical Reports Server (NTRS)
1992-01-01
Although extensive progress has been made in computational fluid mechanics, reliable flight vehicle designs and modifications still cannot be made without recourse to extensive wind tunnel testing. Future progress in the computation of hypersonic flow fields is restricted by the need for a reliable mean flow and turbulence modeling data base which could be used to aid in the development of improved empirical models for use in numerical codes. Currently, there are few compressible flow measurements which could be used for this purpose. In this report, the results of experiments designed to assess the potential for laser velocimeter measurements of mean flow and turbulent fluctuations in hypersonic flow fields are presented. Details of a new laser velocimeter system which was designed and built for this test program are described.
Using hidden Markov models and observed evolution to annotate viral genomes.
McCauley, Stephen; Hein, Jotun
2006-06-01
ssRNA (single-stranded RNA) viral genomes are generally constrained in length and utilize overlapping reading frames to maximally exploit the coding potential within the genome length restrictions. This overlapping coding phenomenon leads to complex evolutionary constraints operating on the genome. In regions which code for more than one protein, silent mutations in one reading frame generally have a protein-coding effect in another. To maximize coding flexibility in all reading frames, overlapping regions are often compositionally biased towards amino acids which are 6-fold degenerate with respect to the 64-codon alphabet. Previous methodologies have used this fact in an ad hoc manner to look for overlapping genes by motif matching. In this paper, differentiated nucleotide compositional patterns in overlapping regions are incorporated into a probabilistic hidden Markov model (HMM) framework which is used to annotate ssRNA viral genomes. This work focuses on single-sequence annotation and applies an HMM framework to ssRNA viral annotation. A description is given of how the HMM is parameterized while annotating within a missing-data framework. A Phylogenetic HMM (Phylo-HMM) extension, as applied to 14 aligned HIV2 sequences, is also presented. This evolutionary extension serves as an illustration of the potential of the Phylo-HMM framework for ssRNA viral genomic annotation. The single-sequence annotation (SSA) procedure is applied to 14 different strains of the HIV2 virus. Further results on alternative ssRNA viral genomes are presented to illustrate more generally the performance of the method. The results of the SSA method are encouraging; however, there is still room for improvement, and since there is overwhelming evidence that comparative methods can improve coding sequence (CDS) annotation, the SSA method is extended to a Phylo-HMM to incorporate evolutionary information. The Phylo-HMM extension is applied to the same set of 14 pre-aligned HIV2 sequences. The performance improvement that results from including the evolutionary information in the analysis is illustrated.
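A stripped-down version of the machinery is a two-state (coding/non-coding) Viterbi annotator. The parameters below are invented toy values; the paper's model uses far richer states to capture overlapping reading frames and, in the Phylo-HMM, column likelihoods from an alignment:

```python
# Toy two-state Viterbi annotator (coding vs. non-coding) with invented
# parameters, sketching the HMM machinery the paper builds on.
import numpy as np

states = ["noncoding", "coding"]
log_trans = np.log([[0.95, 0.05],          # noncoding -> {noncoding, coding}
                    [0.10, 0.90]])         # coding    -> {noncoding, coding}
# Emission probabilities over A, C, G, T (coding state is GC-biased here).
log_emit = np.log([[0.30, 0.20, 0.20, 0.30],
                   [0.15, 0.35, 0.35, 0.15]])
idx = {b: i for i, b in enumerate("ACGT")}

def viterbi(seq):
    obs = [idx[b] for b in seq]
    V = np.full((len(obs), 2), -np.inf)    # best log-probability per state
    ptr = np.zeros((len(obs), 2), dtype=int)
    V[0] = np.log(0.5) + log_emit[:, obs[0]]
    for t in range(1, len(obs)):
        for s in range(2):
            scores = V[t - 1] + log_trans[:, s]
            ptr[t, s] = int(np.argmax(scores))
            V[t, s] = scores.max() + log_emit[s, obs[t]]
    path = [int(np.argmax(V[-1]))]         # backtrace the best state path
    for t in range(len(obs) - 1, 0, -1):
        path.append(int(ptr[t, path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi("ATATAGCGCGCCGCATAT"))
```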
General Mission Analysis Tool (GMAT) Architectural Specification. Draft
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel, J.
2007-01-01
Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general-purpose flight dynamics modeling tool built on open-source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command-line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform-agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux, and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open-source wxWidgets framework, or from a text console. The GMAT source code was written using open-source tools. GSFC has released the code using the NASA open source license.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Cristian Rabiti; Andrea Alfonsi
2012-10-01
PHISICS is a neutronics code system currently under development at the Idaho National Laboratory (INL). Its goal is to provide state-of-the-art simulation capability to reactor designers. The different modules of PHISICS currently under development are a nodal and semi-structured transport core solver (INSTANT), a depletion module (MRTAU), and a cross section interpolation (MIXER) module. The INSTANT module is the most developed of those mentioned above. Basic functionalities are ready to use, but the code is still in continuous development to extend its capabilities. This paper reports on the effort of coupling the nodal kinetics code package PHISICS (INSTANT/MRTAU/MIXER) to the thermal hydraulics system code RELAP5-3D, to enable full core and system modeling. This opens the possibility of modeling coupled (thermal-hydraulics and neutronics) problems with more options for 3D neutron kinetics, compared to the existing diffusion theory neutron kinetics module in RELAP5-3D (NESTLE). In the second part of the paper, an overview of the OECD/NEA MHTGR-350 MW benchmark is given. This benchmark has been approved by the OECD, and is based on the General Atomics 350 MW Modular High Temperature Gas Reactor (MHTGR) design. The benchmark includes coupled neutronics thermal hydraulics exercises that require more capabilities than RELAP5-3D with NESTLE offers. Therefore, the MHTGR benchmark makes extensive use of the new PHISICS/RELAP5-3D coupling capabilities. The paper presents the preliminary results of the three steady state exercises specified in Phase I of the benchmark using PHISICS/RELAP5-3D.
Quasi 1D Modeling of Mixed Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Kopasakis, George; Connolly, Joseph W.; Paxson, Daniel E.; Woolwine, Kyle J.
2012-01-01
The AeroServoElasticity task under the NASA Supersonics Project is developing dynamic models of the propulsion system and the vehicle in order to conduct research for integrated vehicle dynamic performance. As part of this effort, a nonlinear quasi 1-dimensional model of the 2-dimensional bifurcated mixed compression supersonic inlet is being developed. The model utilizes computational fluid dynamics for both the supersonic and subsonic diffusers. The oblique shocks are modeled utilizing compressible flow equations. This model also implements the variable geometry required to control the normal shock position. The model is flexible and can also be utilized to simulate other mixed compression supersonic inlet designs. The model was validated in both the time and frequency domains against the legacy LArge Perturbation INlet code, which has been previously verified using test data. This legacy code, written in FORTRAN, is quite extensive and complex in terms of the amount of software and the number of subroutines. Further, the legacy code is not suitable for closed-loop feedback controls design, and the simulation environment is not amenable to systems integration. Therefore, a solution is to develop an innovative, more simplified, mixed compression inlet model with the same steady-state and dynamic performance as the legacy code that can also be used for controls design. The new nonlinear dynamic model is implemented in MATLAB Simulink. This environment allows easier development of linear models for controls design for shock positioning. The new model is also well suited for integration with a propulsion system model to study inlet/propulsion system performance, and integration with an aero-servo-elastic system model to study integrated vehicle ride quality, vehicle stability, and efficiency.
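For reference, the kind of compressible-flow relation used for the oblique shocks is standard gas dynamics. A worked sketch using the textbook theta-beta-M and normal-shock relations (not the NASA model's implementation):

```python
# Worked illustration of standard oblique-shock relations (textbook gas
# dynamics, not the NASA inlet model itself).
import math

GAMMA = 1.4  # ratio of specific heats for air

def oblique_shock(M1: float, beta_deg: float):
    """Return flow deflection angle theta (deg) and downstream Mach M2."""
    beta = math.radians(beta_deg)
    M1n = M1 * math.sin(beta)                        # normal Mach component
    # Theta-beta-M relation for the deflection angle.
    tan_theta = (2.0 / math.tan(beta) * (M1n**2 - 1.0)
                 / (M1**2 * (GAMMA + math.cos(2.0 * beta)) + 2.0))
    theta = math.atan(tan_theta)
    # Normal-shock relation for the downstream normal Mach number.
    M2n = math.sqrt((1.0 + 0.5 * (GAMMA - 1.0) * M1n**2)
                    / (GAMMA * M1n**2 - 0.5 * (GAMMA - 1.0)))
    return math.degrees(theta), M2n / math.sin(beta - theta)

theta, M2 = oblique_shock(M1=3.0, beta_deg=35.0)
print(f"deflection = {theta:.2f} deg, downstream Mach = {M2:.2f}")
# For M1 = 3.0 and a 35-degree shock: theta is about 17.6 deg, M2 about 2.1.
```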
Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results
NASA Technical Reports Server (NTRS)
Jones, Scott
2015-01-01
Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses, simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al. (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial compressors and turbines at design and off-design conditions.
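The simple radial equilibrium equation mentioned above balances the radial pressure gradient against swirl, dp/dr = rho * Vtheta^2 / r. A toy numerical integration across the span, with made-up inlet numbers rather than OTAC's streamline solver:

```python
# Toy integration of simple radial equilibrium, dp/dr = rho * Vtheta**2 / r,
# across a blade span; numbers are invented, not OTAC's streamline solver.
import numpy as np

rho = 1.2                              # kg/m^3, held constant for simplicity
r = np.linspace(0.3, 0.5, 101)         # hub-to-tip radius (m)
v_theta = 120.0 * 0.4 / r              # free-vortex swirl: r * Vtheta = const

dp_dr = rho * v_theta**2 / r           # radial pressure gradient (Pa/m)
# Trapezoid-rule cumulative integral gives the spanwise pressure profile.
p = 101325.0 + np.concatenate(
    ([0.0], np.cumsum(0.5 * np.diff(r) * (dp_dr[1:] + dp_dr[:-1]))))

print(f"static pressure rise, hub to tip: {p[-1] - p[0]:.0f} Pa")  # ~9.8 kPa
```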
A Three-Dimensional Eulerian Code for Simulation of High-Speed Multimaterial Interactions
2011-08-01
PDE-based extension. The extension process is done on only the host cells on a particular processor. After extension the parallel communication is... condensation shocks, explosive debris transport, detonation in heterogeneous media, and so on. In these flows complex interactions occur between the... [A.22], where Ω_ij is the spin tensor. The Jaumann derivative is used to ensure objectivity of the stress tensor with respect to rotation.
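Since the extracted fragment only names it, the standard textbook definition of the Jaumann (corotational) stress rate and the spin tensor is worth restating; the report's own Eq. [A.22] may differ in notation:

```latex
% Standard Jaumann (corotational) stress rate and spin tensor definitions.
\[
  \overset{\triangledown}{\sigma}_{ij}
    = \dot{\sigma}_{ij} - \Omega_{ik}\,\sigma_{kj} + \sigma_{ik}\,\Omega_{kj},
  \qquad
  \Omega_{ij} = \tfrac{1}{2}\left(
      \frac{\partial v_i}{\partial x_j} - \frac{\partial v_j}{\partial x_i}\right).
\]
```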
Data-Driven Software Framework for Web-Based ISS Telescience
NASA Technical Reports Server (NTRS)
Tso, Kam S.
2005-01-01
Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
Using XML and Java for Astronomical Instrumentation Control
NASA Technical Reports Server (NTRS)
Ames, Troy; Koons, Lisa; Sall, Ken; Warsaw, Craig
2000-01-01
Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform-independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human-readable and machine-understandable way to describe structured data. A key aspect of the object-oriented architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, and communication mechanisms. Although the current effort is targeted for the High-resolution Airborne Wideband Camera, a first-light instrument of the Stratospheric Observatory for Infrared Astronomy, the framework is designed to be generic and extensible so that it can be applied to any instrument.
Tetley, Robert J; Blanchard, Guy B; Fletcher, Alexander G; Adams, Richard J; Sanson, Bénédicte
2016-01-01
Convergence and extension movements elongate tissues during development. Drosophila germ-band extension (GBE) is one example, which requires active cell rearrangements driven by Myosin II planar polarisation. Here, we develop novel computational methods to analyse the spatiotemporal dynamics of Myosin II during GBE, at the scale of the tissue. We show that initial Myosin II bipolar cell polarization gives way to unipolar enrichment at parasegmental boundaries and two further boundaries within each parasegment, concomitant with a doubling of cell number as the tissue elongates. These boundaries are the primary sites of cell intercalation, behaving as mechanical barriers and providing a mechanism for how cells remain ordered during GBE. Enrichment at parasegment boundaries during GBE is independent of Wingless signaling, suggesting pair-rule gene control. Our results are consistent with recent work showing that a combinatorial code of Toll-like receptors downstream of pair-rule genes contributes to Myosin II polarization via local cell-cell interactions. We propose an updated cell-cell interaction model for Myosin II polarization that we tested in a vertex-based simulation. DOI: http://dx.doi.org/10.7554/eLife.12094.001 PMID:27183005
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
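The core of the bit-for-bit and tolerance checks that LIVVkit automates can be sketched generically; the snippet below is plain NumPy with invented tolerances, while LIVVkit itself wraps such comparisons in configuration handling and report generation:

```python
# Generic sketch of bit-for-bit and tolerance-based regression checks of
# the kind LIVVkit automates; tolerances here are invented examples.
import numpy as np

def compare(test: np.ndarray, ref: np.ndarray, rtol: float = 1e-12) -> str:
    if np.array_equal(test, ref):
        return "bit-for-bit"                       # identical arrays
    if np.allclose(test, ref, rtol=rtol):
        return "within tolerance"                  # differences below rtol
    diff = np.max(np.abs(test - ref))
    return f"FAIL: max abs difference = {diff:.3e}"

ref = np.linspace(0.0, 1.0, 5)                     # stand-in model output
print(compare(ref.copy(), ref))                    # bit-for-bit
print(compare(ref + 1e-15, ref))                   # within tolerance
print(compare(ref + 1e-3, ref))                    # FAIL: ...
```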
Kis, Mihaly; Burbridge, Emma; Brock, Ian W; Heggie, Laura; Dix, Philip J; Kavanagh, Tony A
2004-03-01
Native horseradish (Armoracia rusticana) peroxidase, HRP (EC 1.11.1.7), isoenzyme C is synthesized with N-terminal and C-terminal peptide extensions, believed to be associated with protein targeting. This study aimed to explore the specific functions of these extensions, and to generate transgenic plants with expression patterns suitable for exploring the role of peroxidase in plant development and defence. Transgenic Nicotiana tabacum (tobacco) plants expressing different versions of a synthetic horseradish peroxidase, HRP, isoenzyme C gene were constructed. The gene was engineered to include additional sequences coding for either the natural N-terminal or the C-terminal extension or both. These constructs were placed under the control of a constitutive promoter (CaMV-35S) or the tobacco RUBISCO-SSU light inducible promoter (SSU) and introduced into tobacco using Agrobacterium-mediated transformation. To study the effects of the N- and C-terminal extensions, the localization of recombinant peroxidase was determined using biochemical and molecular techniques. Transgenic tobacco plants can exhibit a ten-fold increase in peroxidase activity compared with wild-type tobacco levels, and the majority of this activity is located in the symplast. The N-terminal extension is essential for the production of high levels of recombinant protein, while the C-terminal extension has little effect. Differences in levels of enzyme activity and recombinant protein are reflected in transcript levels. There is no evidence to support either preferential secretion or vacuolar targeting of recombinant peroxidase in this heterologous expression system. This leads us to question the postulated targeting roles of these peptide extensions. The N-terminal extension is essential for high level expression and appears to influence transcript stability or translational efficiency. Plants have been generated with greatly elevated cytosolic peroxidase activity, and smaller increases in apoplastic activity. These will be valuable for exploring the role of these enzymes in stress amelioration and plant development.
An Introduction to the Extensible Markup Language (XML).
ERIC Educational Resources Information Center
Bryan, Martin
1998-01-01
Describes Extensible Markup Language (XML), a subset of the Standard Generalized Markup Language (SGML) that is designed to make it easy to interchange structured documents over the Internet. Topics include Document Type Definition (DTD), components of XML, the use of XML, text and non-text elements, and uses for XML-coded files. (LRW)
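To make the ideas above concrete, the sketch below parses a minimal XML document with Python's standard library and walks its elements, attributes, and text content; the sample document and field names are invented for illustration and do not come from the article.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical XML document: nested elements carrying
# attributes (id, format, currency) and text content.
doc = """<catalog>
  <book id="bk101" format="hardcover">
    <title>Structured Documents on the Internet</title>
    <price currency="USD">29.95</price>
  </book>
</catalog>"""

root = ET.fromstring(doc)           # parse into an element tree
for book in root.findall("book"):   # iterate over <book> elements
    title = book.find("title").text
    price = book.find("price")
    print(book.get("id"), title, price.text, price.get("currency"))
```

A DTD (or an XML schema) would constrain which elements and attributes such a document may contain; the parsing itself is unchanged.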
76 FR 9358 - Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-17
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [MTM 067221] Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana AGENCY: Bureau of Land Management, Interior. ACTION.... Cynthia Staszak, Chief, Branch of Land Resources. [FR Doc. 2011-3617 Filed 2-16-11; 8:45 am] BILLING CODE...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vitart, Xavier; Foulon, Francois; Bodineau, Jean Christophe
Part of the French Alternative Energies and Atomic Energy Commission (CEA), the National Institute for Nuclear Science and Technology (INSTN) is a higher education institution whose mission is to provide students and professionals with a high level of scientific and technological qualification in all disciplines related to nuclear energy applications. Within this framework, INSTN carries out education and training (E and T) programs in nuclear instrumentation and radioprotection. Its strategy has always been to complement theoretical courses with practical training and laboratory work carried out on an extensive range of training tools, including a large panel of nuclear instrumentation as well as software applications. Since its creation in 1956, the INSTN has conducted both education and vocational programs on ionizing radiation detection. An extensive range of techniques has commonly been used during practical work with students and with employees of companies who need to acquire knowledge and specialization in this field. Today, the INSTN is equipped with the usual detectors and electronics in large numbers, so that up to 48 trainees can be accommodated at the same time in two classrooms, with only two trainees per workstation in order to optimize their learning. In the field of neutron detection systems, the INSTN has strongly developed its offering by taking advantage of research reactors, such as the ISIS reactor (700 kW) at Saclay. The implementation of neutron detection systems specific to the courses offers a unique way of observing and analysing the signal coming from neutron detectors, as well as of learning how to set the parameters of the detection system under real conditions. By providing the trainees with an extensive overview of each part of the neutron monitoring instrumentation applied to a nuclear reactor, hands-on measurements on the ISIS reactor play a major role in ensuring a practical and comprehensive understanding of neutron detection systems and their integration into the safety systems of nuclear reactors. They also give a solid background for the follow-up and development of neutron detection systems. Another field of activity of the INSTN is the development of new teaching tools based on software applications. We describe hereafter two applications recently developed by the institute, O.S.I.R.I.S. and DOSIMEX. The INSTN and the company OREKA have developed an innovative teaching tool named O.S.I.R.I.S. The tool is built on a virtual 3D environment in which users operate in a completely free way. The action is located in a pressurised water reactor, in the environment of a steam generator building. The users are students or professionals who want to learn and train, in concrete situations, how to protect workers against radiation. Through the use of this serious game, users must establish a predictive dose evaluation, implement the principles of radiation protection, record the dose resulting from the operation, analyse the gap between the predicted collective dose and the collective dose achieved, and decide on the lessons learned to provide feedback. Taking into account the increasing constraints related to radiation protection, it is essential to use computer codes to evaluate the dose rates generated by sources of ionizing radiation. Many powerful codes exist but are often relatively complex to implement and are dedicated to experts. In addition, these codes are often 'black boxes' that do not allow the user to understand the underlying physics.
The objective of the DOSIMEX codes developed by the INSTN is to provide, by means of a VBA code with a simple interface, the ability to easily handle a wide range of radiological exposure situations: calculation of gamma, beta and neutron equivalent dose rates, and calculation of internal contamination and atmospheric transfer.
NASA Technical Reports Server (NTRS)
Eklund, Dean R.; Northam, G. B.; Mcdaniel, J. C.; Smith, Cliff
1992-01-01
A CFD (Computational Fluid Dynamics) competition was held at the Third Scramjet Combustor Modeling Workshop to assess the current state of the art in CFD codes for the analysis of scramjet combustors. Solutions from six three-dimensional Navier-Stokes codes were compared for the case of staged injection of air behind a step into a Mach 2 flow. This case was investigated experimentally at the University of Virginia, and extensive in-stream data were obtained. Code-to-code comparisons have been made with regard to both accuracy and efficiency. The turbulence models employed in the solutions are believed to be a major source of discrepancy between the six solutions.
Measurement Requirements for Improved Modeling of Arcjet Facility Flows
NASA Technical Reports Server (NTRS)
Fletcher, Douglas G.
2000-01-01
Current efforts to develop new reusable launch vehicles and to pursue low-cost robotic planetary missions have led to a renewed interest in understanding arc-jet flows. Part of this renewed interest is concerned with improving the understanding of arc-jet test results and the potential use of available computational-fluid-dynamic (CFD) codes to aid in this effort. These CFD codes have been extensively developed and tested for application to nonequilibrium, hypersonic flow modeling. It is envisioned, perhaps naively, that the application of these CFD codes to the simulation of arc-jet flows would serve two purposes: first, the codes would help to characterize the nonequilibrium nature of the arc-jet flows; and second, arc-jet experiments could potentially be used to validate the flow models. These two objectives are, to some extent, mutually exclusive. However, the purpose of the present discussion is to address what role CFD codes can play in the current arc-jet flow characterization effort, and whether or not the simulation of arc-jet facility tests can be used to evaluate some of the modeling that is used to formulate these codes. This presentation is organized into several sections. In the introductory section, the development of large-scale, constricted-arc test facilities within NASA is reviewed, and the current state of flow diagnostics using conventional instrumentation is summarized. The motivation for using CFD to simulate arc-jet flows is addressed in the next section, and the basic requirements for CFD models that would be used for these simulations are briefly discussed. This section is followed by a more detailed description of experimental measurements that are needed to initiate credible simulations and to evaluate their fidelity in the different flow regions of an arc-jet facility. Observations from a recent combined computational and experimental investigation of shock-layer flows in a large-scale arc-jet facility are then used to illustrate the current state of development of diagnostic instrumentation, CFD simulations, and general knowledge in the field of arc-jet characterization. Finally, the main points are summarized and recommendations for future efforts are given.
Interleaved concatenated codes: new perspectives on approaching the Shannon limit.
Viterbi, A J; Viterbi, A M; Sindhushayana, N T
1997-09-02
The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit.
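The abstract does not spell out the construction, but the role of the interleaver in a concatenated scheme, spreading a burst of channel errors so that the component decoders see them as roughly independent, can be shown with a simple block interleaver; this is a generic sketch, not the authors' code design.

```python
def block_interleave(bits, rows, cols):
    # Write row by row into a rows x cols array, read column by column,
    # so that adjacent channel symbols come from distant data positions.
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    # Inverse permutation: a burst of adjacent channel errors is spread
    # out across the deinterleaved stream.
    assert len(bits) == rows * cols
    return [bits[c * rows + r] for r in range(rows) for c in range(cols)]

data = list(range(12))
tx = block_interleave(data, rows=3, cols=4)
assert block_deinterleave(tx, rows=3, cols=4) == data
```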
A TDM link with channel coding and digital voice.
NASA Technical Reports Server (NTRS)
Jones, M. W.; Tu, K.; Harton, P. L.
1972-01-01
The features of a TDM (time-division multiplexed) link model are described. A PCM telemetry sequence was coded for error correction and multiplexed with a digitized voice channel. An all-digital implementation of a variable-slope delta modulation algorithm was used to digitize the voice channel. The results of extensive testing are reported. The measured coding gain and the system performance over a Gaussian channel are compared with theoretical predictions and computer simulations. Word intelligibility scores are reported as a measure of voice channel performance.
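As a rough illustration of variable-slope delta modulation (the family to which the paper's all-digital algorithm belongs), the sketch below emits one bit per sample and adapts the step size to the bit history; all parameter values are invented for illustration.

```python
def cvsd_encode(samples, step_min=0.01, step_max=1.0, alpha=1.5):
    """Toy continuously-variable-slope delta encoder: the bit is the sign
    of the prediction error; runs of identical bits indicate slope
    overload, so the step grows; alternating bits shrink it."""
    bits, estimate, step, last_bit = [], 0.0, step_min, None
    for x in samples:
        bit = 1 if x >= estimate else 0
        if bit == last_bit:
            step = min(step * alpha, step_max)   # steepen on a run
        else:
            step = max(step / alpha, step_min)   # relax otherwise
        estimate += step if bit else -step
        bits.append(bit)
        last_bit = bit
    return bits

# The decoder repeats the same step adaptation from the bit stream alone,
# so no side information needs to be transmitted.
```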
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.
2010-01-01
The space radiation environment, particularly solar particle events (SPEs), poses the risk of acute radiation sickness (ARS) to humans, and organ doses from SPE exposure may reach critical levels during extra-vehicular activities (EVAs) or within lightly shielded spacecraft. NASA has developed an organ dose projection model using the BRYNTRN with SUMDOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). The codes BRYNTRN and SUMDOSE, written in FORTRAN, are a baryon transport code and an output data processing code, respectively. The ARR code is written in C. The risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. BRYNTRN code operation requires extensive input preparation. With a graphical user interface (GUI) to handle input and output for BRYNTRN, the response models can be connected easily and correctly to BRYNTRN in a user-friendly way. A GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operation of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the mission operations directorate (MOD), and space biophysics researchers. The ARRBOD GUI will serve as a proof-of-concept example for future integration of other human space applications risk projection models. The current version of the ARRBOD GUI is a new self-contained product and will have follow-on versions as options are added: 1) human geometries of MAX/FAX in addition to CAM/CAF; 2) shielding distributions for spacecraft, Mars surface and atmosphere; 3) various space environmental and biophysical models; and 4) other response models to be connected to BRYNTRN. The major components of the overall system, the subsystem interconnections, and external interfaces are described in this report, and the ARRBOD GUI product is explained step by step in order to serve as a tutorial.
ATLAS offline software performance monitoring and optimization
NASA Astrophysics Data System (ADS)
Chauhan, N.; Kabra, G.; Kittelmann, T.; Langenberg, R.; Mandrysch, R.; Salzburger, A.; Seuster, R.; Ritsch, E.; Stewart, G.; van Eldik, N.; Vitillo, R.; Atlas Collaboration
2014-06-01
In a complex multi-developer, multi-package software environment, such as the ATLAS offline framework Athena, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide the optimization work. The first tool we used to instrument the code is PAPI, which is a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles, instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event results in a good understanding of the algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pin tools can be used to obtain similar statistics as PAPI, but advantageously without requiring recompilation of the code. Fine-grained routine and instruction level instrumentation is also possible. Pin tools can additionally interrogate the arguments to functions, like those in linear algebra libraries, so that a detailed usage profile can be obtained. These tools have characterized the extensive use of vector and matrix operations in ATLAS tracking. Currently, CLHEP is used here, which is not an optimal choice. To help evaluate replacement libraries a testbed has been set up allowing comparison of the performance of different linear algebra libraries (including CLHEP, Eigen and SMatrix/SVector). Results are then presented via the ATLAS Performance Management Board framework, which runs daily with the current development branch of the code and monitors reconstruction and Monte-Carlo jobs. This framework analyses the CPU and memory performance of algorithms, and an overview of the results is presented on a web page. These tools have provided the insight necessary to plan and implement performance enhancements in ATLAS code by identifying the most common operations, with the call parameters well understood, and allowing improvements to be quantified in detail.
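The per-algorithm start/stop counting pattern described above can be mimicked in plain Python; the decorator below uses wall-clock time in place of PAPI's hardware counters, and all names are invented for illustration rather than taken from the ATLAS code.

```python
import time
from collections import defaultdict
from functools import wraps

perf = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def instrument(algorithm_name):
    # Start a counter on entry and stop it on exit, accumulating totals
    # per algorithm, per processed event -- the same bracketing pattern
    # used to trigger hardware counters around each algorithm.
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            t0 = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                perf[algorithm_name]["calls"] += 1
                perf[algorithm_name]["seconds"] += time.perf_counter() - t0
        return wrapper
    return decorator

@instrument("tracking")
def process_event(event):
    return sum(x * x for x in event)   # stand-in for reconstruction work
```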
Drakos, Nicole E; Wahl, Lindi M
2015-12-01
Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome. Copyright © 2015 Elsevier Inc. All rights reserved.
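For orientation, the classical linear birth-death process (without the diversification term that distinguishes the model analyzed here) has a closed-form extinction probability; the sketch below assumes constant per-element duplication and loss rates and covers only this simplest limiting case.

```python
def extinction_probability(birth_rate, death_rate, n0=1):
    # For a linear birth-death process started from n0 elements, each
    # lineage dies out with probability min(1, death_rate / birth_rate),
    # and lineages evolve independently.
    if birth_rate <= death_rate:
        return 1.0
    return (death_rate / birth_rate) ** n0

# Even a family whose elements duplicate slightly faster than they are
# lost is very likely to go extinct when seeded by a single element:
print(extinction_probability(birth_rate=1.1, death_rate=1.0))  # ~0.909
```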
GobyWeb: Simplified Management and Analysis of Gene Expression and DNA Methylation Sequencing Data
Dorff, Kevin C.; Chambwe, Nyasha; Zeno, Zachary; Simi, Manuele; Shaknovich, Rita; Campagne, Fabien
2013-01-01
We present GobyWeb, a web-based system that facilitates the management and analysis of high-throughput sequencing (HTS) projects. The software provides integrated support for a broad set of HTS analyses and offers a simple plugin extension mechanism. Analyses currently supported include quantification of gene expression for messenger and small RNA sequencing, estimation of DNA methylation (i.e., reduced representation bisulfite sequencing and whole-genome methyl-seq), or the detection of pathogens in sequenced data. In contrast to previous analysis pipelines developed for analysis of HTS data, GobyWeb requires significantly less storage space, runs analyses efficiently on a parallel grid, scales gracefully to process tens or hundreds of multi-gigabyte samples, yet can be used effectively by researchers who are comfortable using a web browser. We conducted performance evaluations of the software and found it to either outperform or have similar performance to analysis programs developed for specialized analyses of HTS data. We found that most biologists who took a one-hour GobyWeb training session were readily able to analyze RNA-Seq data with state-of-the-art analysis tools. GobyWeb can be obtained at http://gobyweb.campagnelab.org and is freely available for non-commercial use. GobyWeb plugins are distributed in source code and licensed under the open source LGPL3 license to facilitate code inspection, reuse and independent extensions http://github.com/CampagneLaboratory/gobyweb2-plugins. PMID:23936070
A graphics-card implementation of Monte-Carlo simulations for cosmic-ray transport
NASA Astrophysics Data System (ADS)
Tautz, R. C.
2016-05-01
A graphics card implementation of a test-particle simulation code is presented that is based on the CUDA extension of the C/C++ programming language. The original CPU version has been developed for the calculation of cosmic-ray diffusion coefficients in artificial Kolmogorov-type turbulence. In the new implementation, the magnetic turbulence generation, which is the most time-consuming part, is separated from the particle transport and is performed on a graphics card. In this article, the modification of the basic approach of integrating test particle trajectories to employ the SIMD (single instruction, multiple data) model is presented and verified. The efficiency of the new code is tested and several language-specific accelerating factors are discussed. For the example of isotropic magnetostatic turbulence, sample results are shown and a comparison to the results of the CPU implementation is performed.
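The SIMD (single instruction, multiple data) idea, applying one and the same trajectory update to many particles at once, can be sketched in vectorized NumPy; this toy explicit integrator with a uniform background field illustrates the programming model, not the production CUDA kernel.

```python
import numpy as np

def push_particles(x, v, b_field, q_over_m, dt, steps):
    # One instruction, many data: each line updates all N particles.
    # A production code would use a volume-preserving integrator.
    for _ in range(steps):
        B = b_field(x)                              # (N, 3) field values
        v = v + q_over_m * np.cross(v, B) * dt      # Lorentz force (E = 0)
        x = x + v * dt
    return x, v

# Uniform field along z: particles gyrate in the x-y plane.
uniform_bz = lambda x: np.tile([0.0, 0.0, 1.0], (x.shape[0], 1))
rng = np.random.default_rng(0)
x0, v0 = np.zeros((1024, 3)), rng.normal(size=(1024, 3))
xf, vf = push_particles(x0, v0, uniform_bz, q_over_m=1.0, dt=1e-3, steps=100)
```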
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, C.I.; Sha, W.T.; Kasza, K.E.
As a result of the uncertainties in the understanding of the influence of thermal-buoyancy effects on the flow and heat transfer in Liquid Metal Fast Breeder Reactor heat exchangers and steam generators under off-normal operating conditions, an extensive experimental program is being conducted at Argonne National Laboratory to eliminate these uncertainties. Concurrently, a parallel analytical effort is also being pursued to develop a three-dimensional transient computer code (COMMIX-IHX) to study and predict heat exchanger performance under mixed, forced, and free convection conditions. This paper presents computational results from a heat exchanger simulation and compares them with the results from amore » test case exhibiting strong thermal buoyancy effects. Favorable agreement between experiment and code prediction is obtained.« less
Aerothermal modeling program, phase 2. Element B: Flow interaction experiment
NASA Technical Reports Server (NTRS)
Nikjooy, M.; Mongia, H. C.; Murthy, S. N. B.; Sullivan, J. P.
1986-01-01
The design process was improved, and the efficiency, life, and maintenance costs of the turbine engine hot section were enhanced. Recently, there has been much emphasis on the need for improved numerical codes for the design of efficient combustors. For the development of improved computational codes, there is a need for an experimentally obtained database to be used as test cases for assessing the accuracy of the computations. The purpose of Element B is to establish benchmark-quality velocity and scalar measurements of the flow interaction of circular jets with swirling flow typical of that in the dome region of an annular combustor. In addition to the detailed experimental effort, extensive computations of the swirling flows are to be compared with the measurements for the purpose of assessing the accuracy of current and advanced turbulence and scalar transport models.
Statistical Analysis of CFD Solutions from the 6th AIAA CFD Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Derlaga, Joseph M.; Morrison, Joseph H.
2017-01-01
A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using both common and custom grid sequences as well as multiple turbulence models for the June 2016 6th AIAA CFD Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for both the 4th and 5th Drag Prediction Workshops. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.
Numerical Simulations of Free Surface Magnetohydrodynamic Flows
NASA Astrophysics Data System (ADS)
Samulyak, Roman; Glimm, James; Oh, Wonho; Prykarpatskyy, Yarema
2003-11-01
We have developed a numerical algorithm and performed simulations of magnetohydrodynamic (MHD) free surface flows. The corresponding system of MHD equations is a system of strongly coupled hyperbolic and parabolic/elliptic equations in moving and geometrically complex domains. The hyperbolic system is solved using the front tracking technique for the free fluid interface. Parallel algorithms for solving elliptic and parabolic equations are based on a finite element discretization on moving grids dynamically conforming to fluid interfaces. The method has been implemented as an MHD extension of the FronTier code. The code has been applied for modeling the behavior of lithium and mercury jets in magnetic fields, laser ablation plumes, and the Richtmyer-Meshkov instability of a liquid mercury jet interacting with a high energy proton pulse in a strong magnetic field. Such an instability occurs in the target for the Muon Collider.
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OSX. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
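For a flavor of what a logical-source-statement counter does, the naive sketch below handles Python files only; SLiC itself implements per-language counting rules (statement terminators, continuations, block comments) for 22 languages, and none of the code below is SLiC's.

```python
import sys

def logical_lines(path):
    # Count lines that carry a statement: skip blanks and comment-only
    # lines. Real counters also handle multi-line strings, continuations,
    # and per-language statement terminators.
    count = 0
    with open(path) as f:
        for line in f:
            stripped = line.strip()
            if stripped and not stripped.startswith("#"):
                count += 1
    return count

if __name__ == "__main__":
    for path in sys.argv[1:]:
        print(path, logical_lines(path))
```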
Features of MCNP6 Relevant to Medical Radiation Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, H. Grady III; Goorley, John T.
2012-08-29
MCNP (Monte Carlo N-Particle) is a general-purpose Monte Carlo code for simulating the transport of neutrons, photons, electrons, positrons, and more recently other fundamental particles and heavy ions. Over many years MCNP has found a wide range of applications in many different fields, including medical radiation physics. In this presentation we will describe and illustrate a number of significant recently-developed features in the current version of the code, MCNP6, having particular utility for medical physics. Among these are major extensions of the ability to simulate large, complex geometries, improvement in memory requirements and speed for large lattices, introduction of mesh-based isotopic reaction tallies, advances in radiography simulation, expanded variance-reduction capabilities, especially for pulse-height tallies, and a large number of enhancements in photon/electron transport.
Double Compton and Cyclo-Synchrotron in Super-Eddington Discs, Magnetized Coronae, and Jets
NASA Astrophysics Data System (ADS)
McKinney, Jonathan C.; Chluba, Jens; Wielgus, Maciek; Narayan, Ramesh; Sadowski, Aleksander
2017-05-01
Black hole accretion discs accreting near the Eddington rate are dominated by bremsstrahlung cooling, but above the Eddington rate, the double Compton process can dominate in radiation-dominated regions, while the cyclo-synchrotron can dominate in strongly magnetized regions like a corona or a jet. We present an extension to the general relativistic radiation magnetohydrodynamic code harmrad to account for emission and absorption by thermal cyclo-synchrotron, double Compton, bremsstrahlung, low-temperature opal opacities, as well as Thomson and Compton scattering. The harmrad code and associated analysis and visualization codes have been made open-source and are publicly available at the github repository website. We approximate the radiation field as a Bose-Einstein distribution and evolve it using the radiation number-energy-momentum conservation equations in order to track photon hardening. We perform various simulations to study how these extensions affect the radiative properties of magnetically arrested discs accreting at Eddington to super-Eddington rates. We find that double Compton dominates bremsstrahlung in the disc within a radius of r ˜ 15rg (gravitational radii) at hundred times the Eddington accretion rate, and within smaller radii at lower accretion rates. Double Compton and cyclo-synchrotron regulate radiation and gas temperatures in the corona, while cyclo-synchrotron regulates temperatures in the jet. Interestingly, as the accretion rate drops to Eddington, an optically thin corona develops whose gas temperature of T ˜ 109K is ˜100 times higher than the disc's blackbody temperature. Our results show the importance of double Compton and synchrotron in super-Eddington discs, magnetized coronae and jets.
NASA Astrophysics Data System (ADS)
Poli, E.; Bock, A.; Lochbrunner, M.; Maj, O.; Reich, M.; Snicker, A.; Stegmeir, A.; Volpe, F.; Bertelli, N.; Bilato, R.; Conway, G. D.; Farina, D.; Felici, F.; Figini, L.; Fischer, R.; Galperti, C.; Happel, T.; Lin-Liu, Y. R.; Marushchenko, N. B.; Mszanowski, U.; Poli, F. M.; Stober, J.; Westerhof, E.; Zille, R.; Peeters, A. G.; Pereverzev, G. V.
2018-04-01
The paraxial WKB code TORBEAM (Poli, 2001) is widely used for the description of electron-cyclotron waves in fusion plasmas, retaining diffraction effects through the solution of a set of ordinary differential equations. With respect to its original form, the code has undergone significant transformations and extensions, in terms of both the physical model and the spectrum of applications. The code has been rewritten in Fortran 90 and transformed into a library, which can be called from within different (not necessarily Fortran-based) workflows. The models for both absorption and current drive have been extended, including e.g. fully-relativistic calculation of the absorption coefficient, momentum conservation in electron-electron collisions and the contribution of more than one harmonic to current drive. The code can be run also for reflectometry applications, with relativistic corrections for the electron mass. Formulas that provide the coupling between the reflected beam and the receiver have been developed. Accelerated versions of the code are available, with the reduced physics goal of inferring the location of maximum absorption (including or not the total driven current) for a given setting of the launcher mirrors. Optionally, plasma volumes within given flux surfaces and corresponding values of minimum and maximum magnetic field can be provided externally to speed up the calculation of full driven-current profiles. These can be employed in real-time control algorithms or for fast data analysis.
Self-Taught Low-Rank Coding for Visual Learning.
Li, Sheng; Li, Kang; Fu, Yun
2018-03-01
The lack of labeled data presents a common challenge in many computer vision and machine learning tasks. Semisupervised learning and transfer learning methods have been developed to tackle this challenge by utilizing auxiliary samples from the same domain or from a different domain, respectively. Self-taught learning, which is a special type of transfer learning, has fewer restrictions on the choice of auxiliary data. It has shown promising performance in visual learning. However, existing self-taught learning methods usually ignore the structure information in data. In this paper, we focus on building a self-taught coding framework, which can effectively utilize the rich low-level pattern information abstracted from the auxiliary domain, in order to characterize the high-level structural information in the target domain. By leveraging a high quality dictionary learned across auxiliary and target domains, the proposed approach learns expressive codings for the samples in the target domain. Since many types of visual data have been proven to contain subspace structures, a low-rank constraint is introduced into the coding objective to better characterize the structure of the given target set. The proposed representation learning framework is called self-taught low-rank (S-Low) coding, which can be formulated as a nonconvex rank-minimization and dictionary learning problem. We devise an efficient majorization-minimization augmented Lagrange multiplier algorithm to solve it. Based on the proposed S-Low coding mechanism, both unsupervised and supervised visual learning algorithms are derived. Extensive experiments on five benchmark data sets demonstrate the effectiveness of our approach.
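A core building block of such nuclear-norm (low-rank) solvers is singular value thresholding, the proximal operator that augmented-Lagrange-multiplier iterations apply to the coefficient matrix; the sketch below is that building block alone, not the full S-Low algorithm.

```python
import numpy as np

def svt(Z, tau):
    # prox of tau * nuclear norm: soft-threshold the singular values.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

Z = np.random.default_rng(1).normal(size=(50, 40))
print(np.linalg.matrix_rank(Z))            # full rank: 40
print(np.linalg.matrix_rank(svt(Z, 3.0)))  # rank drops as tau grows
```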
Diffusive deposition of aerosols in Phebus containment during FPT-2 test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontautas, A.; Urbonavicius, E.
2012-07-01
At present, lumped-parameter codes are the main tool for investigating the complex response of the containment of a Nuclear Power Plant in case of an accident. Continuous development and validation of the codes is required to perform realistic investigation of the processes that determine the possible source term of radioactive products to the environment. Validation of the codes is based on the comparison of calculated results with measurements performed in experimental facilities. The most extensive experimental program to investigate fission product release from the molten fuel, transport through the cooling circuit and deposition in the containment is performed in the PHEBUS test facility. Test FPT-2, performed in this facility, is considered for analysis of the processes taking place in the containment. Earlier investigations using the COCOSYS code showed that the code could be successfully used for analysis of thermal-hydraulic processes and deposition of aerosols, but it was also noticed that diffusive deposition on the vertical walls did not fit well with the measured results. The CPA module of the ASTEC code implements a different model for diffusive deposition; therefore, the PHEBUS containment model was transferred from the COCOSYS code to ASTEC-CPA to investigate the influence of the diffusive deposition modelling. The analysis was performed using a PHEBUS containment model of 16 nodes. The calculated thermal-hydraulic parameters are in good agreement with the measured results, which gives a basis for realistic simulation of aerosol transport and deposition processes. The investigations showed that the diffusive deposition model influences the aerosol deposition distribution on the different surfaces of the test facility. (authors)
Coding response to a case-mix measurement system based on multiple diagnoses.
Preyra, Colin
2004-08-01
To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post.
Bergueiro, J; Igarzabal, M; Sandin, J C Suarez; Somacal, H R; Vento, V Thatar; Huck, H; Valda, A A; Repetto, M; Kreiner, A J
2011-12-01
Several ion sources have been developed, and an ion source test stand has been mounted, for the first stage of a Tandem-Electrostatic-Quadrupole facility for Accelerator-Based Boron Neutron Capture Therapy. The first source, which has been designed, fabricated and tested, is a dual-chamber, filament-driven, magnetically compressed volume-plasma proton ion source. A 4 mA beam has been accelerated and transported into the suppressed Faraday cup. Extensive simulations of the sources have been performed using both 2D and 3D self-consistent codes. Copyright © 2011 Elsevier Ltd. All rights reserved.
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
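Based on the package's documented interface, a basic session looks like the following; the synthetic Pareto sample is invented for illustration.

```python
import numpy as np
import powerlaw

# Synthetic heavy-tailed data: Pareto with exponent alpha ~ 2.5, xmin = 1.
u = np.random.default_rng(2).random(10_000)
data = (1.0 - u) ** (-1.0 / 1.5)

fit = powerlaw.Fit(data)                      # estimates alpha and xmin
print(fit.power_law.alpha, fit.power_law.xmin)

# Likelihood-ratio test against a candidate alternative distribution:
R, p = fit.distribution_compare('power_law', 'lognormal')
print(R, p)   # R > 0 favors the power law; p is the significance
```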
Electromagnetic processes in nucleus-nucleus collisions relating to space radiation research
NASA Technical Reports Server (NTRS)
Norbury, John W.
1992-01-01
Most of the papers within this report deal with electromagnetic processes in nucleus-nucleus collisions which are of concern in the space radiation program. In particular, the removal of one and two nucleons via both electromagnetic and strong interaction processes has been extensively investigated. The theory of relativistic Coulomb fission has also been developed. Several papers on quark models also appear. Finally, note that the theoretical methods developed in this work have been directly applied to the task of radiation protection of astronauts. This has been done by parameterizing the theoretical formalism in such a fashion that it can be used in cosmic ray transport codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, George J.
TOUGH+ v1.5 is a numerical code for the simulation of multi-phase, multi-component flow and transport of mass and heat through porous and fractured media, and represents the third update of the code since its first release [Moridis et al., 2008]. TOUGH+ is a successor to the TOUGH2 [Pruess et al., 1991; 2012] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstations, PC, Macintosh). TOUGH+ v1.5 employs dynamic memory allocation, thus minimizing storage requirements. It has a completely modular structure, follows the tenets of Object-Oriented Programming (OOP), and involves the advanced features of FORTRAN 95/2003, i.e., modules, derived data types, the use of pointers, lists and trees, data encapsulation, defined operators and assignments, operator extension and overloading, use of generic procedures, and maximum use of the powerful intrinsic vector and matrix processing operations. TOUGH+ v1.5 is the core code for its family of applications, i.e., the part of the code that is common to all its applications. It provides a description of the underlying physics and thermodynamics of non-isothermal flow, of the mathematical and numerical approaches, as well as a detailed explanation of the general (common to all applications) input requirements, options, capabilities and output specifications. The core code cannot run by itself: it needs to be coupled with the code for the specific TOUGH+ application option that describes a particular type of problem. The additional input requirements specific to particular TOUGH+ application options and related illustrative examples can be found in the corresponding User's Manual.
Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P; Palazzo, Alexander F; Moore, Melissa J; Roth, Frederick P
2017-03-01
Introns are found in 5' untranslated regions (5'UTRs) for 35% of all human transcripts. These 5'UTR introns are not randomly distributed: Genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5'UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5'UTR intron status, we developed a classifier that can predict 5'UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5' proximal-intron-minus-like coding regions ("5IM" transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5' cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5' proximal positions. Finally, N1-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5' proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N1-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. © 2017 Cenik et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
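The general recipe, turning the early coding region into sequence features and training a classifier on 5'UTR intron status, can be sketched as follows; the k-mer featurization, model choice, and toy sequences are illustrative assumptions, not the authors' pipeline.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def kmer_tokens(seq, k=3):
    # Represent an early coding region as its overlapping k-mers.
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

# Hypothetical early-coding-region sequences with 5'UTR intron labels.
sequences = ["ATGGCTGCTAAGGCTCCG", "ATGTTTAAATTTGGGAAA"]
labels = [1, 0]   # 1 = transcript has a 5'UTR intron

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit([kmer_tokens(s) for s in sequences], labels)
print(model.predict([kmer_tokens("ATGGCTGCTAAGGGTCCG")]))
```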
Validation of a program for supercritical power plant calculations
NASA Astrophysics Data System (ADS)
Kotowicz, Janusz; Łukowicz, Henryk; Bartela, Łukasz; Michalski, Sebastian
2011-12-01
This article describes the validation of a supercritical steam cycle model. The model was created with the commercial program GateCycle and validated against the in-house code of the Institute of Power Engineering and Turbomachinery. The Institute's in-house code has been used extensively for industrial power plant calculations with good results. In the first step of the validation process, assumptions were made about the live steam temperature and pressure, the net power, the characteristic quantities for the high- and low-pressure regenerative heat exchangers, and the pressure losses in the heat exchangers. These assumptions were then used to develop a steam cycle model in GateCycle and a model based on the code developed in-house at the Institute of Power Engineering and Turbomachinery. Properties such as the thermodynamic parameters at characteristic points of the steam cycle, net power values and efficiencies, heat provided to the steam cycle and heat taken from the steam cycle were compared. The last step of the analysis was the calculation of the relative errors of the compared values. The method used for the relative error calculations is presented in the paper. The resulting relative errors are very small, generally not exceeding 0.1%. Based on our analysis, it can be concluded that the GateCycle software can be used for calculations of supercritical power plants.
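The abstract defers the exact method to the paper; one standard definition consistent with the description (our assumption, not necessarily the authors' formula) is the percentage relative error of each compared quantity, with the in-house code taken as the reference:

```latex
\[
\delta_i \;=\; \frac{\left|\, x_i^{\mathrm{GateCycle}} - x_i^{\mathrm{in\text{-}house}} \,\right|}
                    {\left|\, x_i^{\mathrm{in\text{-}house}} \,\right|} \times 100\,\%
\]
```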
Visual communication with retinex coding.
Huck, F O; Fales, C L; Davis, R E; Alter-Gartenberg, R
2000-04-10
Visual communication with retinex coding seeks to suppress the spatial variation of the irradiance (e.g., shadows) across natural scenes and preserve only the spatial detail and the reflectance (or the lightness) of the surface itself. The separation of reflectance from irradiance begins with nonlinear retinex coding that sharply and clearly enhances edges and preserves their contrast, and it ends with a Wiener filter that restores images from this edge and contrast information. An approximate small-signal model of image gathering with retinex coding is found to consist of the familiar difference-of-Gaussian bandpass filter and a locally adaptive automatic-gain control. A linear representation of this model is used to develop expressions within the small-signal constraint for the information rate and the theoretical minimum data rate of the retinex-coded signal and for the maximum-realizable fidelity of the images restored from this signal. Extensive computations and simulations demonstrate that predictions based on these figures of merit correlate closely with perceptual and measured performance. Hence these predictions can serve as a general guide for the design of visual communication channels that produce images with a visual quality that consistently approaches the best possible sharpness, clarity, and reflectance constancy, even for nonuniform irradiances. The suppression of shadows in the restored image is found to be constrained inherently more by the sharpness of their penumbra than by their depth.
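The small-signal model described, a difference-of-Gaussian bandpass filter followed by a locally adaptive gain, can be sketched directly; the filter scales and the normalization choice below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def retinex_small_signal(image, sigma_center=1.0, sigma_surround=4.0, eps=1e-3):
    center = gaussian_filter(image, sigma_center)
    surround = gaussian_filter(image, sigma_surround)
    bandpass = center - surround        # difference-of-Gaussian bandpass
    return bandpass / (surround + eps)  # locally adaptive gain control

# Slowly varying irradiance (shadows) cancels in the bandpass and is
# further suppressed by the normalization; edges and contrast survive.
img = np.random.default_rng(3).random((64, 64))
out = retinex_small_signal(img)
```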
NASA Astrophysics Data System (ADS)
López-Coto, R.; Hahn, J.; BenZvi, S.; Dingus, B.; Hinton, J.; Nisa, M. U.; Parsons, R. D.; Greus, F. Salesa; Zhang, H.; Zhou, H.
2018-11-01
The positron excess measured by PAMELA and AMS can only be explained if one or several sources are injecting positrons. Moreover, at the highest energies, it requires the presence of nearby (∼hundreds of parsecs) and middle-aged (up to ∼hundreds of kyr) sources. Pulsars, as factories of electrons and positrons, are one of the proposed candidates to explain the origin of this excess. To calculate the contribution of these sources to the electron and positron flux at the Earth, we developed EDGE (Electron Diffusion and Gamma rays to the Earth), a code to treat the propagation of electrons and compute their diffusion from a central source with a flexible injection spectrum. Using this code, we can derive the source's gamma-ray spectrum, spatial extension, the all-electron density in space, the electron and positron flux reaching the Earth and the positron fraction measured at the Earth. We present in this paper the foundations of the code and study how different parameters affect the gamma-ray spectrum of a source and the electron flux measured at the Earth. We also study the effect of several approximations usually performed in these studies. This code has been used to derive the results on the positron flux measured at the Earth in [1].
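For scale, the "nearby and middle-aged" requirement follows from balancing diffusion against radiative losses; the sketch below uses typical interstellar parameter values that are our assumptions, not the values adopted in the EDGE code.

```python
import numpy as np

D0 = 4e28      # diffusion coefficient at 1 GeV, cm^2/s (typical ISM value)
delta = 0.4    # energy dependence of diffusion, D(E) = D0 * (E/GeV)**delta
b0 = 1.4e-16   # radiative-loss constant, dE/dt = -b0 * E**2, in GeV^-1 s^-1

def horizon_pc(E_GeV):
    # Electrons of energy E survive for t_loss = 1/(b0 E) and diffuse
    # a characteristic distance r ~ 2 sqrt(D(E) t_loss) in that time.
    D = D0 * E_GeV ** delta
    t_loss = 1.0 / (b0 * E_GeV)
    return 2.0 * np.sqrt(D * t_loss) / 3.086e18   # cm -> pc

for E in (10.0, 100.0, 1000.0):
    print(f"{E:7.1f} GeV -> {horizon_pc(E):8.0f} pc")
# The horizon shrinks with energy, which is why the highest-energy
# positrons must come from nearby, relatively young sources.
```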
Pietan, Lucas L.; Spradling, Theresa A.
2016-01-01
In animals, mitochondrial DNA (mtDNA) typically occurs as a single circular chromosome with 13 protein-coding genes and 22 tRNA genes. The various species of lice examined previously, however, have shown mitochondrial genome rearrangements with a range of chromosome sizes and numbers. Our research demonstrates that the mitochondrial genomes of two species of chewing lice found on pocket gophers, Geomydoecus aurei and Thomomydoecus minor, are fragmented with the 1,536 base-pair (bp) cytochrome-oxidase subunit I (cox1) gene occurring as the only protein-coding gene on a 1,916–1,964 bp minicircular chromosome in the two species, respectively. The cox1 gene of T. minor begins with an atypical start codon, while that of G. aurei does not. Components of the non-protein coding sequence of G. aurei and T. minor include a tRNA (isoleucine) gene, inverted repeat sequences consistent with origins of replication, and an additional non-coding region that is smaller than the non-coding sequence of other lice with such fragmented mitochondrial genomes. Sequences of cox1 minichromosome clones for each species reveal extensive length and sequence heteroplasmy in both coding and noncoding regions. The highly variable non-gene regions of G. aurei and T. minor have little sequence similarity with one another except for a 19-bp region of phylogenetically conserved sequence with unknown function. PMID:27589589
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-11
... DEPARTMENT OF JUSTICE [OMB Number 1103-0094] Agency Information Collection Activities: Extension... public comment until May 10, 2011. This process is conducted in accordance with 5 CFR 1320.10. If you.... Department of Justice. [FR Doc. 2011-5678 Filed 3-10-11; 8:45 am] BILLING CODE 4410-AT-P ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
... Extension of Public Comment Period for Water Quality Standards for the State of Florida's Lakes and Flowing... comment period. SUMMARY: On January 14, 2010, EPA signed a proposed rule entitled ``Water Quality.... Mail to: Water Docket, U.S. Environmental Protection Agency, Mail code: 2822T, 1200 Pennsylvania Avenue...
2015-07-01
Acronyms: ASCII, American Standard Code for Information Interchange; DAU, data acquisition unit; DDML, data display markup language; IHAL ... Transfer Standard; URI, uniform resource identifier; W3C, World Wide Web Consortium; XML, extensible markup language; XSD, XML schema definition. XML Style Guide, RCC 125-15, July 2015. Introduction: The next generation of telemetry systems will rely heavily on extensible markup language (XML
76 FR 31977 - Public Land Order No. 7768; Extension of Public Land Order No. 6861; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTB01000-L14300000.ET0000; MTM 79264] Public Land Order No. 7768; Extension of Public Land Order No. 6861; Montana AGENCY: Bureau of Land... and Minerals Management. [FR Doc. 2011-13720 Filed 6-1-11; 8:45 am] BILLING CODE P ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [61510-8451-0000; MTM 80092] Notice of Proposed Withdrawal Extension and Notification of a Public Meeting; Montana; Correction AGENCY: Bureau of... Staszak, Chief, Branch of Land Resources. [FR Doc. 2011-17716 Filed 7-13-11; 8:45 am] BILLING CODE 4310-55...
75 FR 55604 - Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-13
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTM01000-L14300000.ET0000; MTM 79264] Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana AGENCY: Bureau of... Miller, Acting Chief, Branch of Land Resources. [FR Doc. 2010-22740 Filed 9-10-10; 8:45 am] BILLING CODE...
Development of a Multifidelity Approach to Acoustic Liner Impedance Eduction
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Jones, Michael G.
2017-01-01
The use of acoustic liners has proven to be extremely effective in reducing aircraft engine fan noise transmission/radiation. However, the introduction of advanced fan designs and shorter engine nacelles has highlighted a need for novel acoustic liner designs that provide increased fan noise reduction over a broader frequency range. To achieve aggressive noise reduction goals, advanced broadband liner designs, such as zone liners and variable impedance liners, will likely depart from conventional uniform impedance configurations. Therefore, educing the impedance of these axial- and/or spanwise-variable impedance liners will require models that account for three-dimensional effects, thereby increasing computational expense. Thus, it would seem advantageous to investigate the use of multifidelity modeling approaches to impedance eduction for these advanced designs. This paper describes an extension of the use of the CDUCT-LaRC code to acoustic liner impedance eduction. The proposed approach is applied to a hardwall insert and conventional liner using simulated data. Educed values compare well with those educed using two extensively tested and validated approaches. The results are very promising and provide justification to further pursue the complementary use of CDUCT-LaRC with the currently used finite element codes to increase the efficiency of the eduction process for configurations involving three-dimensional effects.
Progressive video coding for noisy channels
NASA Astrophysics Data System (ADS)
Kim, Beong-Jo; Xiong, Zixiang; Pearlman, William A.
1998-10-01
We extend the work of Sherwood and Zeger to progressive video coding for noisy channels. By utilizing a 3D extension of the set partitioning in hierarchical trees (SPIHT) algorithm, we cascade the resulting 3D SPIHT video coder with a rate-compatible punctured convolutional channel coder for transmission of video over a binary symmetric channel. Progressive coding is achieved by increasing the target rate of the 3D embedded SPIHT video coder as the channel condition improves. The performance of our proposed coding system is acceptable at low transmission rate and bad channel conditions. Its low complexity makes it suitable for emerging applications such as video over wireless channels.
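The rate-compatible punctured convolutional (RCPC) half of the cascade can be illustrated by the puncturing operation itself; this is a generic sketch, since the abstract does not give the mother code or the patterns used.

```python
from itertools import cycle

def puncture(coded_bits, pattern):
    # Transmit only those outputs of the mother code where the pattern
    # has a 1; nested patterns yield a rate-compatible family, so more
    # important (earlier, embedded) bits can get stronger protection.
    return [b for b, keep in zip(coded_bits, cycle(pattern)) if keep]

# A rate-1/2 mother code punctured with [1, 1, 1, 0] keeps 3 of every 4
# coded bits, for an effective rate of 2/3.
mother_output = [0, 1, 1, 0, 0, 0, 1, 1]
print(puncture(mother_output, [1, 1, 1, 0]))   # -> [0, 1, 1, 0, 0, 1]
```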
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Christopher J.; Stone, James M.; Gammie, Charles F.
2016-08-01
We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.
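The single-intermediate-state HLL flux that HLLC and HLLD refine fits in a few lines; a minimal sketch follows (HLLC restores the contact wave and HLLD the Alfvén waves as well, which is what makes them less diffusive).

```python
import numpy as np

def hll_flux(U_L, U_R, F_L, F_R, S_L, S_R):
    # U: conserved states, F: physical fluxes, S: fastest left/right
    # wave-speed estimates at the cell interface.
    if S_L >= 0.0:
        return F_L                      # all waves move right
    if S_R <= 0.0:
        return F_R                      # all waves move left
    return (S_R * F_L - S_L * F_R + S_L * S_R * (U_R - U_L)) / (S_R - S_L)
```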
Definite Integrals, Some Involving Residue Theory Evaluated by Maple Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko o
2010-01-01
The calculus of residues is applied to evaluate certain integrals over the range $(-\infty, \infty)$ using the Maple symbolic code. These integrals are of the form $\int_{-\infty}^{\infty} \cos(x) / \left[(x^{2}+a^{2})(x^{2}+b^{2})(x^{2}+c^{2})\right] dx$ and similar extensions. The Maple code is also applied to expressions in maximum likelihood estimator moments when sampling from the negative binomial distribution. In general the Maple code approach to the integrals gives correct answers to specified decimal places, but the symbolic result may be extremely long and complex.
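As a worked instance of the technique, the single-factor member of this family follows by closing the contour in the upper half-plane and taking the residue at $z = ia$:

```latex
\[
\int_{-\infty}^{\infty} \frac{\cos x}{x^{2}+a^{2}}\,dx
  = \Re \oint_{C} \frac{e^{iz}}{z^{2}+a^{2}}\,dz
  = \Re\!\left[ 2\pi i \operatorname*{Res}_{z=ia} \frac{e^{iz}}{z^{2}+a^{2}} \right]
  = \Re\!\left[ 2\pi i \, \frac{e^{-a}}{2ia} \right]
  = \frac{\pi}{a}\,e^{-a}, \qquad a > 0,
\]
% where C is the real axis closed by a large upper semicircle whose
% contribution vanishes by Jordan's lemma. The three-factor integrand is
% handled the same way, with residues at ia, ib and ic.
```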
Not All Order Memory Is Equal: Test Demands Reveal Dissociations in Memory for Sequence Information
ERIC Educational Resources Information Center
Jonker, Tanya R.; MacLeod, Colin M.
2017-01-01
Remembering the order of a sequence of events is a fundamental feature of episodic memory. Indeed, a number of formal models represent temporal context as part of the memory system, and memory for order has been researched extensively. Yet, the nature of the code(s) underlying sequence memory is still relatively unknown. Across 4 experiments that…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-21
....onetcenter.org/ombclearance.html . DATES: Written comments must be submitted to the office listed in the... information in the O*NET Web sites including O*NET OnLine ( http://online.onetcenter.org ), My Next Move ( www.MyNextMove.gov ) and My Next Move for Veterans ( www.MyNextMove.org/vets ) O*NET Code Connector ( www...
Picture data compression coder using subband/transform coding with a Lempel-Ziv-based coder
NASA Technical Reports Server (NTRS)
Glover, Daniel R. (Inventor)
1995-01-01
Digital data coders/decoders are used extensively in video transmission. A digitally encoded video signal is separated into subbands. Separating the video into subbands allows transmission at low data rates. Once the data is separated into these subbands it can be coded and then decoded by statistical coders such as the Lempel-Ziv based coder.
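The Lempel-Ziv back end can be illustrated with a toy LZW dictionary encoder; this sketches the coder family in general, not the specific variant of the invention.

```python
def lzw_encode(data: bytes):
    # Emit the dictionary index of the longest already-seen phrase and
    # grow the dictionary with that phrase extended by one byte, so
    # statistical redundancy in the subband data shortens the output.
    dictionary = {bytes([i]): i for i in range(256)}
    phrase, codes = b"", []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in dictionary:
            phrase = candidate
        else:
            codes.append(dictionary[phrase])
            dictionary[candidate] = len(dictionary)
            phrase = bytes([byte])
    if phrase:
        codes.append(dictionary[phrase])
    return codes

print(lzw_encode(b"ABABABA"))   # repeated structure -> [65, 66, 256, 258]
```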
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonen, F.A.; Johnson, K.I.; Liebetrau, A.M.
The VISA-II (Vessel Integrity Simulation Analysis) code was originally developed as part of the NRC staff evaluation of pressurized thermal shock. VISA-II uses Monte Carlo simulation to evaluate the failure probability of a pressurized water reactor (PWR) pressure vessel subjected to a pressure and thermal transient specified by the user. Linear elastic fracture mechanics methods are used to model crack initiation and propagation. Parameters for initial crack size and location, copper content, initial reference temperature of the nil-ductility transition, fluence, crack-initiation fracture toughness, and arrest fracture toughness are treated as random variables. This report documents an upgraded version of the original VISA code as described in NUREG/CR-3384. Improvements include a treatment of cladding effects, a more general simulation of flaw size, shape and location, a simulation of inservice inspection, an updated simulation of the reference temperature of the nil-ductility transition, and treatment of vessels with multiple welds and initial flaws. The code has been extensively tested and verified and is written in FORTRAN for ease of installation on different computers. 38 refs., 25 figs.
3D numerical simulations of multiphase continental rifting
NASA Astrophysics Data System (ADS)
Naliboff, J.; Glerum, A.; Brune, S.
2017-12-01
Observations of rifted margin architecture suggest continental breakup occurs through multiple phases of extension with distinct styles of deformation. The initial rifting stages are often characterized by slow extension rates and distributed normal faulting in the upper crust decoupled from deformation in the lower crust and mantle lithosphere. Further rifting marks a transition to higher extension rates and coupling between the crust and mantle lithosphere, with deformation typically focused along large-scale detachment faults. Significantly, recent detailed reconstructions and high-resolution 2D numerical simulations suggest that rather than remaining focused on a single long-lived detachment fault, deformation in this phase may progress toward lithospheric breakup through a complex process of fault interaction and development. The numerical simulations also suggest that an initial phase of distributed normal faulting can play a key role in the development of these complex fault networks and the resulting finite deformation patterns. Motivated by these findings, we will present 3D numerical simulations of continental rifting that examine the role of temporal increases in extension velocity on rifted margin structure. The numerical simulations are developed with the massively parallel finite-element code ASPECT. While originally designed to model mantle convection using advanced solvers and adaptive mesh refinement techniques, ASPECT has been extended to model visco-plastic deformation that combines a Drucker-Prager yield criterion with non-linear dislocation and diffusion creep. To promote deformation localization, the internal friction angle and cohesion weaken as a function of accumulated plastic strain. Rather than prescribing a single zone of weakness to initiate deformation, an initial random perturbation of the plastic strain field combined with rapid strain weakening produces distributed normal faulting at relatively slow rates of extension in both 2D and 3D simulations. Our presentation will focus on both the numerical assumptions required to produce these results and variations in 3D rifted margin architecture arising from a transition from slow to rapid rates of extension.
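A minimal sketch of the strain-weakening rule the abstract describes, in which friction angle and cohesion decay linearly with accumulated plastic strain; the threshold strains and end-member values below are illustrative assumptions, not parameters from the study.

```python
# Linear weakening of a plastic parameter between two threshold strains.
import numpy as np

def weakened(value_initial, value_final, strain, strain_start=0.5, strain_end=1.5):
    """Interpolate a parameter from its initial to its weakened value."""
    frac = np.clip((strain - strain_start) / (strain_end - strain_start), 0.0, 1.0)
    return value_initial + frac * (value_final - value_initial)

eps = np.linspace(0.0, 2.0, 5)          # accumulated plastic strain
print(weakened(30.0, 5.0, eps))         # friction angle, degrees
print(weakened(20e6, 2e6, eps))         # cohesion, Pa
```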
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, Joseph J.; Golovkin, I. E.; Woodruff, P. R.
2009-08-07
This Final Report summarizes work performed under DOE STTR Phase II Grant No. DE-FG02-05ER86258 during the project period from August 2006 to August 2009. The project, “Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments,” was led by Prism Computational Sciences (Madison, WI), and involved collaboration with subcontractors University of Nevada-Reno and Voss Scientific (Albuquerque, NM). In this project, we have: Developed and implemented a multi-dimensional, multi-frequency radiation transport model in the LSP hybrid fluid-PIC (particle-in-cell) code [1,2]. Updated the LSP code to support the use of accurate equation-of-state (EOS) tables generated by Prism’s PROPACEOS [3] code to compute more accurate temperatures in high energy density physics (HEDP) plasmas. Updated LSP to support the use of Prism’s multi-frequency opacity tables. Generated equation of state and opacity data for LSP simulations for several materials being used in plasma jet experimental studies. Developed and implemented parallel processing techniques for the radiation physics algorithms in LSP. Benchmarked the new radiation transport and radiation physics algorithms in LSP and compared simulation results with analytic solutions and results from numerical radiation-hydrodynamics calculations. Performed simulations using Prism radiation physics codes to address issues related to radiative cooling and ionization dynamics in plasma jet experiments. Performed simulations to study the effects of radiation transport and radiation losses due to electrode contaminants in plasma jet experiments. Updated the LSP code to generate output using NetCDF to provide a better, more flexible interface to SPECT3D [4] in order to post-process LSP output. Updated the SPECT3D code to better support the post-processing of large-scale 2-D and 3-D datasets generated by simulation codes such as LSP. Updated atomic physics modeling to provide for more comprehensive and accurate atomic databases that feed into the radiation physics modeling (spectral simulations and opacity tables). Developed polarization spectroscopy modeling techniques suitable for diagnosing energetic particle characteristics in HEDP experiments. A description of these items is provided in this report. The above efforts lay the groundwork for utilizing the LSP and SPECT3D codes in providing simulation support for DOE-sponsored HEDP experiments, such as plasma jet and fast ignition physics experiments. We believe that taken together, the LSP and SPECT3D codes have unique capabilities for advancing our understanding of the physics of these HEDP plasmas. Based on conversations early in this project with our DOE program manager, Dr. Francis Thio, our efforts emphasized developing radiation physics and atomic modeling capabilities that can be utilized in the LSP PIC code, and performing radiation physics studies for plasma jets. A relatively minor component focused on the development of methods to diagnose energetic particle characteristics in short-pulse laser experiments related to fast ignition physics. The period of performance for the grant was extended by one year to August 2009 with a one-year no-cost extension, at the request of subcontractor University of Nevada-Reno.
A three-dimensional code for muon propagation through the rock: MUSIC
NASA Astrophysics Data System (ADS)
Antonioli, P.; Ghetti, C.; Korolkova, E. V.; Kudryavtsev, V. A.; Sartorelli, G.
1997-10-01
We present a new three-dimensional Monte-Carlo code MUSIC (MUon SImulation Code) for muon propagation through the rock. All processes of muon interaction with matter with high energy loss (including the knock-on electron production) are treated as stochastic processes. The angular deviation and lateral displacement of muons due to multiple scattering, as well as bremsstrahlung, pair production and inelastic scattering are taken into account. The code has been applied to obtain the energy distribution and angular and lateral deviations of single muons at different depths underground. The muon multiplicity distributions obtained with MUSIC and CORSIKA (Extensive Air Shower simulation code) are also presented. We discuss the systematic uncertainties of the results due to different muon bremsstrahlung cross-sections.
The Julia programming language: the future of scientific computing
NASA Astrophysics Data System (ADS)
Gibson, John
2017-11-01
Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
Cloudy - simulating the non-equilibrium microphysics of gas and dust, and its observed spectrum
NASA Astrophysics Data System (ADS)
Ferland, Gary J.
2014-01-01
Cloudy is an open-source plasma/spectral simulation code, last described in the open-access journal Revista Mexicana (Ferland et al. 2013, 2013RMxAA..49..137F). The project goal is a complete simulation of the microphysics of gas and dust over the full range of density, temperature, and ionization that we encounter in astrophysics, together with a prediction of the observed spectrum. Cloudy is one of the more widely used theory codes in astrophysics with roughly 200 papers citing its documentation each year. It is developed by graduate students, postdocs, and an international network of collaborators. Cloudy is freely available on the web at trac.nublado.org, the user community can post questions on http://groups.yahoo.com/neo/groups/cloudy_simulations/info, and summer schools are organized to learn more about Cloudy and its use (http://cloud9.pa.uky.edu/~gary/cloudy/CloudySummerSchool/). The code’s widespread use is possible because of extensive automatic testing. It is exercised over its full range of applicability whenever the source is changed. Changes in predicted quantities are automatically detected along with any newly introduced problems. The code is designed to be autonomous and self-aware. It generates a report at the end of a calculation that summarizes any problems encountered along with suggestions of potentially incorrect boundary conditions. This self-monitoring is a core feature since the code is now often used to generate large MPI grids of simulations, making it impossible for a user to verify each calculation by hand. I will describe some challenges in developing a large physics code, with its many interconnected physical processes, many at the frontier of research in atomic or molecular physics, all in an open environment.
Improvement of Mishchenko's T-matrix code for absorbing particles.
Moroz, Alexander
2005-06-10
The use of Gaussian elimination with backsubstitution for matrix inversion in scattering theories is discussed. Within the framework of the T-matrix method (the state-of-the-art code by Mishchenko is freely available at http://www.giss.nasa.gov/~crmim), it is shown that the domain of applicability of Mishchenko's FORTRAN 77 (F77) code can be substantially expanded in the direction of strongly absorbing particles where the current code fails to converge. Such an extension is especially important if the code is to be used in nanoplasmonic or nanophotonic applications involving metallic particles. At the same time, convergence can also be achieved for large nonabsorbing particles, in which case the non-Numerical Algorithms Group option of Mishchenko's code diverges. Computer F77 implementation of Mishchenko's code supplemented with Gaussian elimination with backsubstitution is freely available at http://www.wave-scattering.com.
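The numerical kernel at issue, Gaussian elimination with backsubstitution, is straightforward to sketch. The NumPy version below (rather than the paper's FORTRAN 77) shows pivoted elimination plus backsubstitution for a complex matrix, as would arise for absorbing particles; it is an illustration of the method, not the published code.

```python
# Gaussian elimination with partial pivoting and backsubstitution.
import numpy as np

def solve_gauss(A, B):
    """Solve A X = B by forward elimination plus backsubstitution."""
    A = A.astype(complex).copy()
    B = B.astype(complex).copy()
    n = A.shape[0]
    for k in range(n):
        p = k + np.argmax(np.abs(A[k:, k]))      # partial pivot for stability
        A[[k, p]], B[[k, p]] = A[[p, k]], B[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            B[i] -= m * B[k]
    X = np.zeros_like(B)
    for i in range(n - 1, -1, -1):               # backsubstitution
        X[i] = (B[i] - A[i, i + 1:] @ X[i + 1:]) / A[i, i]
    return X

A = np.array([[4, 1j], [2, 3]], dtype=complex)
print(np.allclose(solve_gauss(A, np.eye(2)) @ A, np.eye(2)))  # inverse check
```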
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source. Homepage: http://www.gcat.bio/
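One of the listed tests, comma-freeness, is easy to sketch: a codon set X is comma-free if no element of X appears in a shifted reading frame of any concatenation of two codons from X. The Python check below follows that textbook definition; it is not GCAT's own implementation.

```python
# Comma-freeness test for a set of codons (length-3 words).
def is_comma_free(codons):
    X = set(codons)
    for u in X:
        for v in X:
            w = u + v
            # reading frames shifted by 1 and 2 inside the concatenation
            if w[1:4] in X or w[2:5] in X:
                return False
    return True

print(is_comma_free({"ACG", "TAC"}))   # True: no codon straddles a boundary
print(is_comma_free({"ACG", "CGA"}))   # False: "CGA" appears shifted in "ACGACG"
```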
SU-E-T-37: A GPU-Based Pencil Beam Algorithm for Dose Calculations in Proton Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalantzis, G; Leventouri, T; Tachibana, H
Purpose: Recent developments in radiation therapy have been focused on applications of charged particles, especially protons. Over the years several dose calculation methods have been proposed in proton therapy. A common characteristic of all these methods is their extensive computational burden. In the current study we present for the first time, to the best of our knowledge, a GPU-based PBA for proton dose calculations in Matlab. Methods: In the current study we employed an analytical expression for the proton depth-dose distribution. The central-axis term is taken from the broad-beam central-axis depth dose in water modified by an inverse square correction while the distribution of the off-axis term was considered Gaussian. The serial code was implemented in MATLAB and was launched on a desktop with a quad core Intel Xeon X5550 at 2.67GHz with 8 GB of RAM. For the parallelization on the GPU, the parallel computing toolbox was employed and the code was launched on a GTX 770 with Kepler architecture. The performance comparison was established on the speedup factors. Results: The performance of the GPU code was evaluated for three different energies: low (50 MeV), medium (100 MeV) and high (150 MeV). Four square fields were selected for each energy, and the dose calculations were performed with both the serial and parallel codes for a homogeneous water phantom with size 300×300×300 mm3. The resolution of the PBs was set to 1.0 mm. The maximum speedup of ∼127 was achieved for the highest energy and the largest field size. Conclusion: A GPU-based PB algorithm for proton dose calculations in Matlab was presented. A maximum speedup of ∼127 was achieved. Future directions of the current work include extension of our method for dose calculation in heterogeneous phantoms.
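A sketch of the pencil-beam superposition the abstract outlines: a central-axis depth-dose term multiplied by a lateral Gaussian, summed over a grid of pencil beams. The toy depth-dose curve and sigma growth below are illustrative stand-ins for the paper's fitted model, and NumPy vectorization stands in for the GPU parallelization.

```python
# Pencil-beam dose: depth-dose term times a lateral Gaussian, summed over beams.
import numpy as np

z = np.linspace(0, 170, 171)[:, None, None]          # depth, mm
y, x = np.meshgrid(np.arange(-30, 31), np.arange(-30, 31), indexing="ij")

def pencil_beam_dose(x0, y0, range_mm=160.0):
    depth_dose = 1.0 + 4.0 * np.exp(-((z - range_mm) ** 2) / 40.0)  # toy Bragg peak
    sigma = 2.0 + 0.02 * z                                           # beam spreading
    lateral = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma**2))
    return depth_dose * lateral / (2 * np.pi * sigma**2)

# A "field" is a superposition of pencil beams on a 1 mm grid
dose = sum(pencil_beam_dose(xs, ys) for xs in range(-5, 6) for ys in range(-5, 6))
print(dose.shape, dose.max())
```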
Why is CDMA the solution for mobile satellite communication
NASA Technical Reports Server (NTRS)
Gilhousen, Klein S.; Jacobs, Irwin M.; Padovani, Roberto; Weaver, Lindsay A.
1989-01-01
It is demonstrated that spread spectrum Code Division Multiple Access (CDMA) systems provide an economically superior solution to satellite mobile communications by increasing the maximum system capacity with respect to single channel per carrier Frequency Division Multiple Access (FDMA) systems. Following the comparative analysis of CDMA and FDMA systems, the design of a model developed to test the feasibility of the approach and the performance of a spread spectrum system in a mobile environment is described. Results of extensive computer simulations as well as laboratory and field tests are presented.
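The core CDMA mechanism the paper relies on, direct-sequence spreading, can be sketched in a few lines: each data bit is multiplied by a pseudo-noise chip sequence, and correlating against the same sequence at the receiver recovers the bit with a processing gain. The chip rate, noise level, and sequence below are illustrative assumptions.

```python
# Direct-sequence spread spectrum: spread, add channel noise, despread.
import numpy as np

rng = np.random.default_rng(7)
chips_per_bit = 64
pn = rng.choice([-1, 1], size=chips_per_bit)          # PN spreading code

bits = rng.choice([-1, 1], size=200)
tx = (bits[:, None] * pn).ravel()                     # spread to chip rate

rx = tx + rng.normal(scale=2.0, size=tx.size)         # noisy mobile channel
corr = rx.reshape(-1, chips_per_bit) @ pn             # despread (correlate)
ber = np.mean(np.sign(corr) != bits)
print(f"BER at roughly -6 dB chip SNR: {ber:.3f}")    # processing gain ~ 18 dB
```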
An Avatar-Based Italian Sign Language Visualization System
NASA Astrophysics Data System (ADS)
Falletto, Andrea; Prinetto, Paolo; Tiotto, Gabriele
In this paper, we present an experimental system that supports the translation from Italian to Italian Sign Language (ISL) of the deaf and its visualization through a virtual character. Our objective is to develop a complete platform useful for any application and reusable on several platforms including Web, Digital Television and offline text translation. The system relies on a database that stores both a corpus of Italian words and words coded in the ISL notation system. An interface for the insertion of data is implemented that allows future extensions and integrations.
Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Cameron Russell; Mckigney, Edward Allen
The use of Bayesian inference in data analysis has become the standard for large scientific experiments [1, 2]. The Monte Carlo Codes Group (XCP-3) at Los Alamos has developed a simple set of algorithms currently implemented in C++ and Python to easily perform flat-prior Markov Chain Monte Carlo Bayesian inference with pure Metropolis sampling. These implementations are designed to be user friendly and extensible for customization based on specific application requirements. This document describes the algorithmic choices made and presents two use cases.
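The algorithm itself is compact enough to sketch. The following is an illustrative pure-Metropolis sampler with a flat prior and a symmetric Gaussian proposal; it mirrors the method the report names but is not the Metis code.

```python
# Pure Metropolis sampling: accept a move with probability min(1, L'/L).
import numpy as np

def metropolis(log_likelihood, x0, n_samples, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    logl = log_likelihood(x)
    chain = []
    for _ in range(n_samples):
        prop = x + rng.normal(scale=step, size=x.shape)   # symmetric proposal
        logl_prop = log_likelihood(prop)
        if np.log(rng.uniform()) < logl_prop - logl:      # Metropolis accept
            x, logl = prop, logl_prop
        chain.append(x.copy())
    return np.array(chain)

# Example: infer a mean from 50 noisy observations (flat prior on mu)
data = np.random.default_rng(1).normal(3.0, 1.0, size=50)
loglike = lambda mu: -0.5 * np.sum((data - mu) ** 2)
chain = metropolis(loglike, x0=[0.0], n_samples=5000)
print(chain[1000:].mean(), chain[1000:].std())            # posterior summary
```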
The RAVE/VERTIGO vertex reconstruction toolkit and framework
NASA Astrophysics Data System (ADS)
Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.
2008-07-01
A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.
Evaluation of flaws in carbon steel piping. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zahoor, A.; Gamble, R.M.; Mehta, H.S.
1986-10-01
The objective of this program was to develop flaw evaluation procedures and allowable flaw sizes for ferritic piping used in light water reactor (LWR) power generation facilities. The program results provide relevant ASME Code groups with the information necessary to define flaw evaluation procedures, allowable flaw sizes, and their associated bases for Section XI of the code. Because there are several possible flaw-related failure modes for ferritic piping over the LWR operating temperature range, three analysis methods were employed to develop the evaluation procedures. These include limit load analysis for plastic collapse, elastic plastic fracture mechanics (EPFM) analysis for ductile tearing, and linear elastic fracture mechanics (LEFM) analysis for non-ductile crack extension. To ensure the appropriate analysis method is used in an evaluation, a step-by-step procedure also is provided to identify the relevant acceptance standard or procedure on a case-by-case basis. The tensile strength and toughness properties required to complete the flaw evaluation for any of the three analysis methods are included in the evaluation procedure. The flaw evaluation standards are provided in tabular form for the plastic collapse and ductile tearing modes, where the allowable part-through flaw depth is defined as a function of load and flaw length. For non-ductile crack extension, linear elastic fracture mechanics analysis methods, similar to those in Appendix A of Section XI, are defined. Evaluation flaw sizes and procedures are developed for both longitudinal and circumferential flaw orientations and normal/upset and emergency/faulted operating conditions. The tables are based on margins on load of 2.77 and 1.39 for circumferential flaws and 3.0 and 1.5 for longitudinal flaws for normal/upset and emergency/faulted conditions, respectively.
A Theory of False Cognitive Expectancies in Airline Pilots
NASA Astrophysics Data System (ADS)
Cortes, Antonio I.
The Theory of False Cognitive Expectancies was developed by studying high reliability flight operations. Airline pilots depend extensively on cognitive expectancies to perceive, understand, and predict actions and events. Out of 1,363 incident reports submitted by airline pilots to the National Aeronautics and Space Administration Aviation Safety Reporting System over a year's time, 110 reports were found to contain evidence of 127 false cognitive expectancies in pilots. A comprehensive taxonomy was developed with six categories of interest. The dataset of 127 false expectancies was used to initially code tentative taxon values for each category. Intermediate coding through constant comparative analysis completed the taxonomy. The taxonomy was used for the advanced coding of chronological context-dependent visualizations of expectancy factors, known as strands, which depict the major factors in the creation and propagation of each expectancy. Strands were mapped into common networks to detect highly represented expectancy processes. Theoretical integration established 11 sources of false expectancies, the most common expectancy errors, and those conspicuous factors worthy of future study. The most prevalent source of false cognitive expectancies within the dataset was determined to be unconscious individual modeling based on past events. Integrative analyses also revealed relationships between expectancies and flight deck automation, unresolved discrepancies, and levels of situation awareness. Particularly noteworthy were the findings that false expectancies can combine in three possible permutations to diminish situation awareness and examples of how false expectancies can be unwittingly transmitted from one person to another. The theory resulting from this research can enhance the error coding process used during aircraft line-oriented safety audits, lay the foundation for developing expectancy management training programs, and allow researchers to proffer hypotheses for human testing using flight simulators.
Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics
NASA Astrophysics Data System (ADS)
Takabe, Hideaki
2016-10-01
This article accompanies the Edward Teller Medal memorial lecture presented at the IFSA03 conference held on September 12th, 2003, at Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the field of theory and computation, selecting five topics. The first is the anomalous resistivity to hot electrons penetrating the over-dense region through the ion-wave turbulence driven by the return current that compensates the current flow of the hot electrons. It is concluded that a potential of almost the same value as the average kinetic energy of the hot electrons is established, preventing the penetration of the hot electrons. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The third is the development of the integrated code ILESTA (1D & 2D) for the analysis and design of laser-produced plasmas, including implosion dynamics; it has also been applied to the design of high-gain targets. The fourth is on laboratory astrophysics with intense lasers. This consists of two parts: one is a review of its historical background, and the other concerns how laser plasmas relate to wide-ranging astrophysics and the purposes for promoting such research. In relation to one purpose, the author comments on the anomalous transport of relativistic electrons in the fast ignition laser fusion scheme. Finally, recent activity is briefly summarized in relation to applying the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics.
NASA Astrophysics Data System (ADS)
Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard
2017-07-01
Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user-interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
Interleaved concatenated codes: New perspectives on approaching the Shannon limit
Viterbi, A. J.; Viterbi, A. M.; Sindhushayana, N. T.
1997-01-01
The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit. PMID:11038568
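The interleaving step at the heart of such schemes is simple to sketch: a block interleaver writes code symbols row-wise and reads them column-wise, so a burst of channel errors is dispersed before the second decoder sees it. The dimensions below are arbitrary illustrative choices.

```python
# Row/column block interleaver and its inverse.
import numpy as np

def interleave(symbols, rows, cols):
    return np.asarray(symbols).reshape(rows, cols).T.ravel()

def deinterleave(symbols, rows, cols):
    return np.asarray(symbols).reshape(cols, rows).T.ravel()

block = np.arange(24)
tx = interleave(block, 4, 6)
print(tx[:8])                                  # adjacent symbols get dispersed
print(np.array_equal(deinterleave(tx, 4, 6), block))   # round-trip check
```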
Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, M.; Cuervo, D.; Ivanov, K.
2006-07-01
The advanced thermal-hydraulic subchannel code COBRA-TF has been recently improved and applied for stand-alone and coupled LWR core calculations at the Pennsylvania State Univ. in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To prepare COBRA-TF for academic and industrial applications, including safety margin evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)
NASA Technical Reports Server (NTRS)
Flemming, Robert J.; Britton, Randall K.; Bond, Thomas H.
1994-01-01
The cost and time to certify or qualify a rotorcraft for flight in forecast icing has been a major impediment to the development of ice protection systems for helicopter rotors. Development and flight test programs for those aircraft that have achieved certification or qualification for flight in icing conditions have taken many years, and the costs have been very high. NASA, Sikorsky, and others have been conducting research into alternative means for providing information for the development of ice protection systems, and subsequent flight testing to substantiate the airworthiness of a rotor ice protection system. Model rotor icing tests conducted in 1989 and 1993 have provided a data base for correlation of codes, and for the validation of wind tunnel icing test techniques. This paper summarizes this research, showing test and correlation trends as functions of cloud liquid water content, rotor lift, flight speed, and ambient temperature. Molds were made of several of the ice formations on the rotor blades. These molds were used to form simulated ice on the rotor blades, and the blades were then tested in a wind tunnel to determine flight performance characteristics. These simulated-ice rotor performance tests are discussed in the paper. The levels of correlation achieved and the role of these tools (codes and wind tunnel tests) in flight test planning, testing, and extension of flight data to the limits of the icing envelope are discussed. The potential application of simulated ice, the NASA LEWICE computer code, the Sikorsky Generalized Rotor Performance aerodynamic computer code, and NASA Icing Research Tunnel rotor tests in a rotorcraft certification or qualification program are also discussed. The correlation of these computer codes with tunnel test data is presented, and a procedure or process to use these methods as part of a certification or qualification program is introduced.
Integrated Composite Analyzer (ICAN): Users and programmers manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1986-01-01
The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
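The micromechanics stage the abstract mentions can be illustrated with the textbook rule-of-mixtures step from constituent properties to ply stiffness; the expressions and property values below are standard approximations, not ICAN's complete equation set.

```python
# Homogenize unidirectional-ply moduli from fiber and matrix properties.
def ply_moduli(Ef, Em, vf):
    """Longitudinal (Voigt) and transverse (Reuss) moduli of a ply."""
    E1 = vf * Ef + (1.0 - vf) * Em                 # fibers and matrix in parallel
    E2 = 1.0 / (vf / Ef + (1.0 - vf) / Em)         # fibers and matrix in series
    return E1, E2

E1, E2 = ply_moduli(Ef=230e9, Em=3.5e9, vf=0.6)    # nominal carbon/epoxy values
print(f"E1 = {E1/1e9:.1f} GPa, E2 = {E2/1e9:.1f} GPa")
```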
Thermodynamic Analysis of the Combustion of Metallic Materials
NASA Technical Reports Server (NTRS)
Wilson, D. Bruce; Stoltzfus, Joel M.
2000-01-01
Two types of computer codes are available to assist in the thermodynamic analysis of metallic materials combustion. One type of code calculates phase equilibrium data and is represented by CALPHAD. The other type calculates chemical reaction equilibria and is represented by the Gordon-McBride code. The first has seen significant application for alloy-phase diagrams, but only recently has it been considered for oxidation systems. The Gordon-McBride code has been applied to the combustion of metallic materials. Both codes are limited by their treatment of non-ideal solutions and by the fact that they treat volatile and gaseous species as ideal. This paper examines the significance of these limitations for combustion of metallic materials. In addition, the applicability of linear free-energy relationships for solid-phase oxidation and their possible extension to liquid-phase systems is examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Argo, P.E.; DeLapp, D.; Sutherland, C.D.
TRACKER is an extension of a three-dimensional Hamiltonian raytrace code developed some thirty years ago by R. Michael Jones. Subsequent modifications to this code, which is commonly called the "Jones Code," were documented by Jones and Stephensen (1975). TRACKER incorporates an interactive user's interface, modern differential equation integrators, graphical outputs, homing algorithms, and the Ionospheric Conductivity and Electron Density (ICED) ionosphere. TRACKER predicts the three-dimensional paths of radio waves through model ionospheres by numerically integrating Hamilton's equations, which are a differential expression of Fermat's principle of least time. By using continuous models, the Hamiltonian method avoids false caustics and discontinuous raypath properties often encountered in other raytracing methods. In addition to computing the raypath, TRACKER also calculates the group path (or pulse travel time), the phase path, the geometrical (or "real") pathlength, and the Doppler shift (if the time variation of the ionosphere is explicitly included). Computational speed can be traded for accuracy by specifying the maximum allowable integration error per step in the integration. Only geometrical optics are included in the main raytrace code; no partial reflections or diffraction effects are taken into account. In addition, TRACKER does not lend itself to statistical descriptions of propagation; it requires a deterministic model of the ionosphere.
The Forest Method as a New Parallel Tree Method with the Sectional Voronoi Tessellation
NASA Astrophysics Data System (ADS)
Yahagi, Hideki; Mori, Masao; Yoshii, Yuzuru
1999-09-01
We have developed a new parallel tree method which will be called the forest method hereafter. This new method uses the sectional Voronoi tessellation (SVT) for the domain decomposition. The SVT decomposes a whole space into polyhedra and allows their flat borders to move by assigning different weights. The forest method determines these weights based on the load balancing among processors by means of the overload diffusion (OLD). Moreover, since all the borders are flat, before receiving the data from other processors, each processor can collect enough data to calculate the gravity force with precision. Both the SVT and the OLD are coded in a highly vectorizable manner to suit vector-parallel processors. The parallel code based on the forest method with the Message Passing Interface is run on various platforms so that a wide portability is guaranteed. Extensive calculations with 15 processors of Fujitsu VPP300/16R indicate that the code can calculate the gravity force exerted on 10^5 particles per second for an idealized dark halo. This code is found to enable an N-body simulation with 10^7 or more particles for a wide dynamic range and is therefore a very powerful tool for the study of galaxy formation and large-scale structure in the universe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Automatic Multilevel Parallelization Using OpenMP
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Jost, Gabriele; Yan, Jerry; Ayguade, Eduard; Gonzalez, Marc; Martorell, Xavier; Biegel, Bryan (Technical Monitor)
2002-01-01
In this paper we describe the extension of the CAPO parallelization support tool to support multilevel parallelism based on OpenMP directives. CAPO generates OpenMP directives with extensions supported by the NanosCompiler to allow for directive nesting and definition of thread groups. We report first results for several benchmark codes and one full application that have been parallelized using our system.
76 FR 5564 - Grant of Interim Extension of the Term of U.S. Patent No. 4,971,802; MIFAMURTIDE
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-01
... DEPARTMENT OF COMMERCE Patent and Trademark Office [Docket No. PTO-P-2011-0001] Grant of Interim... Office, Commerce. ACTION: Notice of interim patent term extension. SUMMARY: The United States Patent and... States Patent and Trademark Office. [FR Doc. 2011-2088 Filed 1-31-11; 8:45 am] BILLING CODE 3510-16-P ...
Second order tensor finite element
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Fly, J.; Berry, C.; Tworzydlo, W.; Vadaketh, S.; Bass, J.
1990-01-01
The results of a research and software development effort are presented for the finite element modeling of the static and dynamic behavior of anisotropic materials, with emphasis on single crystal alloys. Various versions of two dimensional and three dimensional hybrid finite elements were implemented and compared with displacement-based elements. Both static and dynamic cases are considered. The hybrid elements developed in the project were incorporated into the SPAR finite element code. In an extension of the first phase of the project, optimization of experimental tests for anisotropic materials was addressed. In particular, the problems of calculating material properties from tensile tests and of calculating stresses from strain measurements were considered. For both cases, numerical procedures and software for the optimization of strain gauge and material axes orientation were developed.
Development tests for the 2.5 megawatt Mod-2 wind turbine generator
NASA Technical Reports Server (NTRS)
Andrews, J. S.; Baskin, J. M.
1982-01-01
The 2.5 megawatt MOD-2 wind turbine generator test program is discussed. The development of the 2.5 megawatt MOD-2 wind turbine generator included an extensive program of testing which encompassed verification of analytical procedures, component development, and integrated system verification. The test program was intended to assure achievement of the thirty-year design operational life of the wind turbine system as well as to minimize costly design modifications which would otherwise have been required during on-site system testing. Computer codes were modified, fatigue lives of structural and dynamic components were verified, mechanical and electrical components and subsystems were functionally checked and modified where necessary to meet system specifications, and measured dynamic responses of coupled systems confirmed analytical predictions.
Perl at the Joint Astronomy Centre
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie; Tilanus, Remo P. J.; Best, Casey; Prestage, Richard M.; Shimek, Pam; Glazebrook, Karl; Farrell, Tony J.
Perl is used extensively at the JAC (UKIRT and JCMT), and because of the language's flexibility (enabling us to interface Perl to any library) we are finding that it is possible to write all of our utilities in it. This simplifies support and aids code reuse (via the module system and object-oriented interface) as well as shortening development time. Currently we have developed interfaces to messaging systems (ADAM and DRAMA), I/O libraries (NDF, GSD), astronomical libraries (SLALIB) and the Starlink noticeboard system (NBS). We have also developed tools to aid in data taking (the JCMT observation desk) and data processing (surf and orac-dr). This paper will briefly review the facilities available, with an emphasis on those which might be of interest to other observatories.
Coding Response to a Case-Mix Measurement System Based on Multiple Diagnoses
Preyra, Colin
2004-01-01
Objective To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Data Sources Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Study Design Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Principal Findings Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Conclusions Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post. PMID:15230940
Computational analysis of Variable Thrust Engine (VTE) performance
NASA Technical Reports Server (NTRS)
Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.
1993-01-01
The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.
Computational fluid dynamic modelling of cavitation
NASA Technical Reports Server (NTRS)
Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.
1993-01-01
Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids into the computation. Extensions of the current two-dimensional steady state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.
Transonic flow about a thick circular-arc airfoil
NASA Technical Reports Server (NTRS)
Mcdevitt, J. B.; Levy, L. L., Jr.; Deiwert, G. S.
1975-01-01
An experimental and theoretical study of transonic flow over a thick airfoil, prompted by a need for adequately documented experiments that could provide rigorous verification of viscous flow simulation computer codes, is reported. Special attention is given to the shock-induced separation phenomenon in the turbulent regime. Measurements presented include surface pressures, streamline and flow separation patterns, and shadowgraphs. For a limited range of free-stream Mach numbers the airfoil flow field is found to be unsteady. Dynamic pressure measurements and high-speed shadowgraph movies were taken to investigate this phenomenon. Comparisons of experimentally determined and numerically simulated steady flows using a new viscous-turbulent code are also included. The comparisons show the importance of including an accurate turbulence model. When the shock-boundary layer interaction is weak the turbulence model employed appears adequate, but when the interaction is strong, and extensive regions of separation are present, the model is inadequate and needs further development.
Enhancements to TetrUSS for NASA Constellation Program
NASA Technical Reports Server (NTRS)
Pandya, Mohagna J.; Frink, Neal T.; Abdol-Hamid, Khaled S.; Samareh, Jamshid A.; Parlete, Edward B.; Taft, James R.
2011-01-01
The NASA Constellation program is utilizing Computational Fluid Dynamics (CFD) predictions for generating aerodynamic databases and design loads for the Ares I, Ares I-X, and Ares V launch vehicles and for aerodynamic databases for the Orion crew exploration vehicle and its launch abort system configuration. This effort presents several challenges to applied aerodynamicists due to complex geometries and flow physics, as well as the juxtaposition of short-schedule program requirements with high-fidelity CFD simulations. The NASA TetrUSS codes (GridTool/VGRID/USM3D) have been making extensive contributions to this effort. This paper will provide an overview of several enhancements made to the various elements of the TetrUSS suite of codes. Representative TetrUSS solutions for selected Constellation program elements will be shown. Best practices guidelines and scripting developed for generating TetrUSS solutions in a production environment will also be described.
A Design for Composing and Extending Vehicle Models
NASA Technical Reports Server (NTRS)
Madden, Michael M.; Neuhaus, Jason R.
2003-01-01
The Systems Development Branch (SDB) at NASA Langley Research Center (LaRC) creates simulation software products for research. Each product consists of an aircraft model with experiment extensions. SDB treats its aircraft models as reusable components, upon which experiments can be built. SDB has evolved aircraft model design with the following goals: 1. Avoid polluting the aircraft model with experiment code. 2. Discourage the copy and tailor method of reuse. The current evolution of that architecture accomplishes these goals by reducing experiment creation to extend and compose. The architecture mechanizes the operational concerns of the model's subsystems and encapsulates them in an interface inherited by all subsystems. Generic operational code exercises the subsystems through the shared interface. An experiment is thus defined by the collection of subsystems that it creates ("compose"). Teams can modify the aircraft subsystems for the experiment using inheritance and polymorphism to create variants ("extend").
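The extend-and-compose pattern is easy to sketch with inheritance and polymorphism; the class names below are hypothetical illustrations, not SDB's actual code.

```python
# Subsystems share one operational interface; generic code drives them, and an
# experiment is the collection of (possibly subclassed) subsystems it creates.
class Subsystem:
    def initialize(self): ...
    def update(self, dt): ...

class Engine(Subsystem):
    def update(self, dt):
        print("baseline engine update")

class IcedEngine(Engine):                      # "extend": an experiment variant
    def update(self, dt):
        super().update(dt)
        print("apply icing degradation")

class Simulation:
    def __init__(self, subsystems):            # "compose": pick the variants
        self.subsystems = subsystems
    def step(self, dt):
        for s in self.subsystems:              # generic operational code
            s.update(dt)

Simulation([IcedEngine()]).step(0.02)
```

The aircraft model stays free of experiment code because the variant lives entirely in the subclass the experiment composes in.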
NASA Technical Reports Server (NTRS)
Cebeci, T.; Chen, H. H.; Kaups, K.; Schimke, S.; Shin, J.
1992-01-01
A method for computing ice shapes along the leading edge of a wing and a method for predicting its aerodynamic performance degradation due to icing are described. Ice shapes are computed using an extension of the LEWICE code which was developed for airfoils. The aerodynamic properties of the iced wing are determined with an interactive scheme in which the solutions of the inviscid flow equations are obtained from a panel method and the solutions of the viscous flow equations are obtained from an inverse three-dimensional finite-difference boundary-layer method. A new interaction law is used to couple the inviscid and viscous flow solutions. The application of the LEWICE wing code to the calculation of ice shapes on an MS-317 swept wing shows good agreement with measurements. The interactive boundary-layer method is applied to a tapered iced wing in order to study the effect of icing on the aerodynamic properties of the wing at several angles of attack.
User-Defined Data Distributions in High-Level Programming Languages
NASA Technical Reports Server (NTRS)
Diaconescu, Roxana E.; Zima, Hans P.
2006-01-01
One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
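A user-defined distribution ultimately reduces to a mapping from a global index to an owning locale and a local offset. The sketch below shows a simple block mapping in Python for illustration only; Chapel expresses the same idea declaratively rather than through explicit message passing.

```python
# Block distribution: contiguous chunks of the index space per locale.
def block_map(i, n, p):
    """Owner and local offset of global index i for n elements over p locales."""
    size = -(-n // p)                  # ceil(n / p) elements per locale
    return i // size, i % size

n, p = 10, 4
print([block_map(i, n, p) for i in range(n)])
# [(0, 0), (0, 1), (0, 2), (1, 0), ...] -- contiguous blocks per locale
```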
Robot Task Commander with Extensible Programming Environment
NASA Technical Reports Server (NTRS)
Hart, Stephen W (Inventor); Wightman, Brian J (Inventor); Dinh, Duy Paul (Inventor); Yamokoski, John D. (Inventor); Gooding, Dustin R (Inventor)
2014-01-01
A system for developing distributed robot application-level software includes a robot having an associated control module which controls motion of the robot in response to a commanded task, and a robot task commander (RTC) in networked communication with the control module over a network transport layer (NTL). The RTC includes a script engine(s) and a GUI, with a processor and a centralized library of library blocks constructed from an interpretive computer programming code and having input and output connections. The GUI provides access to a Visual Programming Language (VPL) environment and a text editor. In executing a method, the VPL is opened, a task for the robot is built from the code library blocks, and data is assigned to input and output connections identifying input and output data for each block. A task sequence(s) is sent to the control module(s) over the NTL to command execution of the task.
Biomimetic Artificial Epigenetic Code for Targeted Acetylation of Histones.
Taniguchi, Junichi; Feng, Yihong; Pandian, Ganesh N; Hashiya, Fumitaka; Hidaka, Takuya; Hashiya, Kaori; Park, Soyoung; Bando, Toshikazu; Ito, Shinji; Sugiyama, Hiroshi
2018-06-13
While the central role of locus-specific acetylation of histone proteins in eukaryotic gene expression is well established, the availability of designer tools to regulate acetylation at particular nucleosome sites remains limited. Here, we develop a unique strategy to introduce acetylation by constructing a bifunctional molecule designated Bi-PIP. Bi-PIP has a P300/CBP-selective bromodomain inhibitor (Bi) as a P300/CBP recruiter and a pyrrole-imidazole polyamide (PIP) as a sequence-selective DNA binder. Biochemical assays verified that Bi-PIPs recruit P300 to the nucleosomes having their target DNA sequences and extensively accelerate acetylation. Bi-PIPs also activated transcription of genes that have corresponding cognate DNA sequences inside living cells. Our results demonstrate that Bi-PIPs could act as a synthetic programmable histone code of acetylation, which emulates the bromodomain-mediated natural propagation system of histone acetylation to activate gene expression in a sequence-selective manner.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driscoll, Frederick R.
The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbine and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer-based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single device and array modeling codes. As part of this effort UW-NNMREC will also work with NREL to run simulations on NREL's high performance computer system.
ProteoWizard: open source software for rapid proteomics tools development.
Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag
2008-11-01
The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
NASA Astrophysics Data System (ADS)
Tornga, Shawn R.
The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detection, identification and localization of weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile truck-based, hybrid gamma-ray imaging system able to quickly detect, identify, and localize radiation sources at standoff distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals 5x5x2 in3 each, arranged in a random coded aperture mask array (CA), followed by 30 position sensitive NaI bars each 24x2.5x3 in3 called the detection array (DA). The CA array acts as both a coded aperture mask and scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton scattered events and coded aperture events. In this thesis, the developed coded aperture, Compton, and hybrid imaging algorithms are described along with their performance. It will be shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as Global Positioning System (GPS) and Inertial Navigation System (INS) measurements, must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Also, algorithms were developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data. Results of image reconstruction algorithms at various speeds and distances will be presented as well as localization capability. Utilizing imaging information will show signal-to-noise gains over spectroscopic algorithms alone.
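The coded-aperture half of the reconstruction can be sketched with balanced correlation: the detector image is the scene convolved with the mask pattern, and correlating with the rebalanced mask recovers point sources. The geometry, mask statistics, and background model below are simplified stand-ins for the TMI's actual configuration.

```python
# Coded-aperture imaging by balanced correlation with a random mask.
import numpy as np
from scipy.signal import correlate2d, convolve2d

rng = np.random.default_rng(3)
mask = rng.integers(0, 2, size=(35, 35)).astype(float)   # random open/closed mask

scene = np.zeros((64, 64))
scene[20, 40] = 100.0                                    # one weak point source

detector = convolve2d(scene, mask, mode="same")          # shadowgram on detector
detector += rng.poisson(5.0, size=detector.shape)        # background counts

decoded = correlate2d(detector, 2 * mask - 1, mode="same")  # balanced decoding
print("true source:", (20, 40),
      "peak:", np.unravel_index(decoded.argmax(), decoded.shape))
```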
Effect of Light Water Reactor Water Environments on the Fatigue Life of Reactor Materials
Chopra, O. K.; Stevens, G. L.; Tregoning, R.; ...
2017-10-06
The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (Code) provides rules for the design of Class 1 components of nuclear power plants. Figures I-9.1 through I-9.6 of Appendix I to Section III of the Code specify fatigue design curves for applicable structural materials. However, the Code design curves do not explicitly address the effects of light water reactor (LWR) water environments. Existing fatigue strain-vs.-life (ε-N) laboratory data illustrate potentially significant effects of LWR water environments on the fatigue resistance of pressure vessel and piping steels. Extensive studies have been conducted at Argonne National Laboratory and elsewhere since 1990 to investigate the effects of LWR environments on the fatigue life of piping and pressure vessel steels. This article summarizes the results of these studies. Existing fatigue ε-N data were evaluated to identify the various material, environmental, and loading conditions that influence fatigue crack initiation; a methodology for estimating fatigue lives as a function of these parameters was developed. The effects were incorporated into the ASME Code Section III fatigue evaluations in terms of an environmental correction factor, F_en, which is defined as the ratio of fatigue life in air at room temperature to the fatigue life in the LWR water environment at reactor operating temperatures. Available fatigue data were used to develop fatigue design curves for carbon and low-alloy steels, austenitic stainless steels, and nickel-chromium-iron (Ni-Cr-Fe) alloys and their weld metals in air at room temperature. A review of the Code Section III fatigue adjustment factors of 2 on strain and 20 on life is also presented and the possible conservatism inherent in the choice of these adjustment factors is evaluated. A brief description of potential effects of neutron irradiation on fatigue crack initiation for these structural materials is also presented.
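Since F_en is defined as a life ratio, it enters a fatigue evaluation as a multiplicative penalty on allowable cycles. A toy sketch of that bookkeeping follows; the actual F_en expressions are material-specific functions of temperature, dissolved oxygen, and strain rate, and are not reproduced here:

# Illustrative use of an environmental correction factor F_en
# (F_en = fatigue life in room-temperature air / fatigue life in water).
def allowable_cycles_in_water(n_air_room_temp, f_en):
    return n_air_room_temp / f_en

def cumulative_usage(cycle_counts, allowable_air, f_en_values):
    # Miner's-rule usage factor with a per-transient environmental penalty.
    return sum(n / allowable_cycles_in_water(N, f)
               for n, N, f in zip(cycle_counts, allowable_air, f_en_values))

# Two hypothetical design transients: (applied cycles, allowable cycles
# in air, F_en); all numbers are invented for illustration.
print(cumulative_usage([200, 50], [1.0e5, 2.0e4], [4.0, 8.0]))   # 0.028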
NASA Astrophysics Data System (ADS)
Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.
2017-08-01
Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility, in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory Network (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as the first NEON EC data products, are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and the benefits to the science community at large. This software development model is applicable beyond EC and, more generally, builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.
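At the heart of any such EC processing chain is a covariance: the flux is the averaged product of vertical-wind and scalar fluctuations. A minimal sketch of that core step (in Python for brevity, whereas eddy4R is written in R; real processing adds despiking, coordinate rotation, detrending choices, and density corrections):

import numpy as np

# Minimal eddy-covariance flux estimate: F = mean(w' * c'), the
# covariance of vertical wind and scalar fluctuations over an
# averaging period (Reynolds decomposition about the period mean).
def ec_flux(w, c):
    w_prime = w - w.mean()
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)

rng = np.random.default_rng(1)
w = rng.normal(0.0, 0.3, 36000)                        # 1 h of 10 Hz w, m/s
co2 = 400.0 + 0.5 * w + rng.normal(0.0, 0.2, 36000)    # correlated scalar
print(ec_flux(w, co2))   # ~0.5 * var(w), i.e. about 0.045 in these toy units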
Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R
2018-01-01
Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.
Electromagnetic plasma simulation in realistic geometries
NASA Astrophysics Data System (ADS)
Brandon, S.; Ambrosiano, J. J.; Nielsen, D.
1991-08-01
Particle-in-Cell (PIC) calculations have become an indispensable tool to model the nonlinear collective behavior of charged particle species in electromagnetic fields. Traditional finite difference codes, such as CONDOR (2-D) and ARGUS (3-D), are used extensively to design experiments and develop new concepts. A wide variety of physical processes can be modeled simply and efficiently by these codes. However, experiments have become more complex. Geometrical shapes and length scales are becoming increasingly more difficult to model. Spatial resolution requirements for the electromagnetic calculation force large grids and small time steps. Many hours of CRAY YMP time may be required to complete a 2-D calculation -- many more for 3-D calculations. In principle, the number of mesh points and particles need only be increased until all relevant physical processes are resolved. In practice, the size of a calculation is limited by the computer budget. As a result, experimental design is being limited by the ability to calculate, not by the experimenters' ingenuity or understanding of the physical processes involved. Several approaches to meet these computational demands are being pursued. Traditional PIC codes continue to be the major design tools. These codes are being actively maintained, optimized, and extended to handle larger and more complex problems. Two new formulations are being explored to relax the geometrical constraints of the finite difference codes. A modified finite volume test code, TALUS, uses a data structure compatible with that of standard finite difference meshes. This allows a basic conformal boundary/variable grid capability to be retrofitted to CONDOR. We are also pursuing an unstructured grid finite element code, MadMax. The unstructured mesh approach provides maximum flexibility in the geometrical model while also allowing local mesh refinement.
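The basic PIC cycle the abstract takes for granted — weight particles to a charge grid, solve for the field, push the particles — fits in a few lines for the 1-D electrostatic case. A sketch follows (illustrative; CONDOR and ARGUS are electromagnetic finite-difference codes and far more elaborate):

import numpy as np

# Skeleton of a 1-D electrostatic PIC cycle in normalized units.
ng, n_p, L, dt = 64, 4096, 2 * np.pi, 0.1
dx = L / ng
rng = np.random.default_rng(2)
x = rng.uniform(0, L, n_p)                 # particle positions
v = rng.normal(0, 0.1, n_p)                # particle velocities
q = -L / n_p                               # electron macro-charge (ions fixed)

for step in range(100):
    # 1) charge assignment (nearest-grid-point for brevity; CIC is usual)
    cells = (x / dx).astype(int) % ng
    rho = np.bincount(cells, minlength=ng) * q / dx
    rho += 1.0                             # neutralizing ion background
    # 2) field solve: Poisson in k-space (phi_k = rho_k / k^2), E = -dphi/dx
    rho_k = np.fft.rfft(rho)
    k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E = np.fft.irfft(-1j * k * phi_k, n=ng)
    # 3) particle push (leapfrog), periodic boundaries; charge/mass = -1
    v += -E[cells] * dt
    x = (x + v * dt) % L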
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
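Two of the foundational quantities invoked above can be stated compactly in code: the Shannon entropy of a discrete distribution, and the mutual information between a stimulus and a noisy response. A small sketch with a toy binary channel (all numbers illustrative):

import numpy as np

# Shannon entropy of a discrete distribution, and mutual information
# between a stimulus and a noisy response via I(X;Y) = H(X)+H(Y)-H(X,Y).
def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    joint = np.asarray(joint, dtype=float)
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Joint distribution of (stimulus, response) for a channel that flips
# the signal 10% of the time: transmitted information is well below 1 bit.
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
print(entropy([0.5, 0.5]))         # 1.0 bit of stimulus entropy
print(mutual_information(joint))   # ~0.53 bits survive the noise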
Digital barcodes of suspension array using laser induced breakdown spectroscopy
He, Qinghua; Liu, Yixi; He, Yonghong; Zhu, Liang; Zhang, Yilong; Shen, Zhiyuan
2016-01-01
We show a coding method for suspension arrays based on laser-induced breakdown spectroscopy (LIBS), which promotes the barcodes from analog to digital. As the foundation of digital optical barcodes, nanocrystal-encoded microspheres are prepared with a self-assembly encapsulation method. We confirm that digital multiplexing of the LIBS-based coding method becomes feasible, since the microsphere can be coded with direct read-out data of wavelengths, and the method can avoid fluorescence signal crosstalk between barcodes and analyte tags, which leads to overall advantages in accuracy and stability over current fluorescent multicolor coding methods. This demonstration increases the capability of multiplexed detection and accurate filtrating, expanding the range of applications of suspension arrays in life science. PMID:27808270
Layered Wyner-Ziv video coding.
Xu, Qian; Xiong, Zixiang
2006-12-01
Following recent theoretical works on successive Wyner-Ziv coding (WZC), we propose a practical layered Wyner-Ziv video coder using the DCT, nested scalar quantization, and irregular LDPC code based Slepian-Wolf coding (or lossless source coding with side information at the decoder). Our main novelty is to use the base layer of a standard scalable video coder (e.g., MPEG-4/H.26L FGS or H.263+) as the decoder side information and perform layered WZC for quality enhancement. Similar to FGS coding, there is no performance difference between layered and monolithic WZC when the enhancement bitstream is generated in our proposed coder. Using an H.26L coded version as the base layer, experiments indicate that WZC gives slightly worse performance than FGS coding when the channel (for both the base and enhancement layers) is noiseless. However, when the channel is noisy, extensive simulations of video transmission over wireless networks conforming to the CDMA2000 1X standard show that H.26L base layer coding plus Wyner-Ziv enhancement layer coding are more robust against channel errors than H.26L FGS coding. These results demonstrate that layered Wyner-Ziv video coding is a promising new technique for video streaming over wireless networks.
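The core Wyner-Ziv mechanism — send only a coset index and let the decoder resolve the ambiguity with side information — can be illustrated with nested scalar quantization alone. A toy sketch (the actual coder adds the DCT and LDPC-based Slepian-Wolf coding of the indices):

import numpy as np

# Toy nested scalar quantization with decoder side information.
q = 1.0          # fine quantizer step
n_cosets = 4     # nesting ratio: only log2(4) = 2 bits sent per sample

def encode(x):
    return int(np.round(x / q)) % n_cosets          # coset index only

def decode(coset, side_info):
    # Pick the coset member nearest the side information.
    base = int(np.round(side_info / q))
    offset = (coset - base) % n_cosets
    candidates = [base + offset - n_cosets, base + offset]
    best = min(candidates, key=lambda k: abs(k * q - side_info))
    return best * q

x = 3.7                       # source sample (e.g., a DCT coefficient)
y = x + 0.4                   # correlated side info from the base layer
print(decode(encode(x), y))   # 4.0: the fine quantization of x, from 2 bits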
NASA Technical Reports Server (NTRS)
Dolinar, S.
1988-01-01
Over the past six to eight years, an extensive research effort was conducted to investigate advanced coding techniques which promised to yield more coding gain than is available with current NASA standard codes. The delay in Galileo's launch due to the temporary suspension of the shuttle program provided the Galileo project with an opportunity to evaluate the possibility of including some version of the advanced codes as a mission enhancement option. A study was initiated last summer to determine if substantial coding gain was feasible for Galileo and, if so, to recommend a suitable experimental code for use as a switchable alternative to the current NASA-standard code. The Galileo experimental code study resulted in the selection of a code with constraint length 15 and rate 1/4. The code parameters were chosen to optimize performance within cost and risk constraints consistent with retrofitting the new code into the existing Galileo system design and launch schedule. The particular code was recommended after a very limited search among good codes with the chosen parameters. It will theoretically yield about 1.5 dB enhancement under idealizing assumptions relative to the current NASA-standard code at Galileo's desired bit error rates. This ideal predicted gain includes enough cushion to meet the project's target of at least 1 dB enhancement under real, non-ideal conditions.
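For readers outside the coding community, a rate-1/4, constraint-length-15 convolutional encoder emits four parity bits per input bit from a 15-bit shift register. A sketch follows; the generator polynomials below are placeholders, not the taps actually selected for the Galileo experimental code:

# Sketch of a rate-1/4, constraint-length-15 convolutional encoder.
K = 15
GENS = [0o46321, 0o51271, 0o63667, 0o70535]   # hypothetical 15-bit taps

def parity(x):
    # XOR-fold the (up to 15) bits of x down to a single parity bit.
    x ^= x >> 8; x ^= x >> 4; x ^= x >> 2; x ^= x >> 1
    return x & 1

def encode(bits):
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)   # shift register
        out.extend(parity(state & g) for g in GENS)   # 4 outputs per input
    return out

print(encode([1, 0, 1, 1])[:8])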
Using ontologies to improve semantic interoperability in health data.
Liyanage, Harshana; Krause, Paul; De Lusignan, Simon
2015-07-10
The present-day health data ecosystem comprises a wide array of complex heterogeneous data sources. A wide range of clinical, health care, social and other clinically relevant information is stored in these data sources. These data exist either as structured data or as free-text. They are generally individual person-based records, but social care data are generally case based, and less formal data sources may be shared by groups. The structured data may be organised in a proprietary way or be coded using one of many coding, classification or terminology systems that have often evolved in isolation and were designed to meet the needs of the context in which they were developed. This has resulted in a wide range of semantic interoperability issues that make the integration of data held on these different systems challenging. We present semantic interoperability challenges and describe a classification of these. We propose a four-step process and a toolkit for those wishing to work more ontologically, progressing from the identification and specification of concepts to validating a final ontology. The four steps are: (1) the identification and specification of data sources; (2) the conceptualisation of semantic meaning; (3) defining to what extent routine data can be used as a measure of the process or outcome of care required in a particular study or audit and (4) the formalisation and validation of the final ontology. The toolkit is an extension of a previous schema created to formalise the development of ontologies related to chronic disease management. The extensions are focused on facilitating rapid building of ontologies for time-critical research studies.
Hose, D R; Lawford, P V; Narracott, A J; Penrose, J M T; Jones, I P
2003-01-01
Fluid-solid interaction is a primary feature of cardiovascular flows. There is increasing interest in the numerical solution of these systems as the extensive computational resource required for such studies becomes available. One form of coupling is an external weak coupling of separate solid and fluid mechanics codes. Information about the stress tensor and displacement vector at the wetted boundary is passed between the codes, and an iterative scheme is employed to move towards convergence of these parameters at each time step. This approach has the attraction that separate codes with the most extensive functionality for each of the separate phases can be selected, which might be important in the context of the complex rheology and contact mechanics that often feature in cardiovascular systems. Penrose and Staples describe a weak coupling of CFX for computational fluid mechanics to ANSYS for solid mechanics, based on a simple Jacobi iteration scheme. It is important to validate the coupled numerical solutions. An extensive analytical study of flow in elastic-walled tubes was carried out by Womersley in the late 1950s. This paper describes the performance of the coupling software for the straight elastic-walled tube, and compares the results with Womersley's analytical solutions. It also presents preliminary results demonstrating the application of the coupled software in the context of a stented vessel.
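The external weak-coupling scheme described above is, at its core, a fixed-point iteration between two black-box solvers. A toy sketch with scalar stand-ins for the fluid and solid codes (illustrative; this is not the CFX/ANSYS interface):

# Skeleton of an external weak-coupling loop: separate "fluid" and
# "solid" solvers exchange wall traction and displacement, iterating
# with under-relaxation until both agree at the time step.
def fluid_solve(wall_displacement):
    # returns wall traction given the current wall shape (toy model)
    return 1.0 - 0.5 * wall_displacement

def solid_solve(wall_traction):
    # returns wall displacement under the applied traction (toy model)
    return 0.8 * wall_traction

def coupled_step(omega=0.5, tol=1e-10, max_iter=100):
    d = 0.0                                  # initial wall displacement
    for it in range(max_iter):
        t = fluid_solve(d)                   # fluid pass on frozen wall
        d_new = solid_solve(t)               # solid pass on frozen load
        if abs(d_new - d) < tol:
            return d_new, it
        d = (1 - omega) * d + omega * d_new  # under-relaxed update
    raise RuntimeError("coupling iteration did not converge")

print(coupled_step())   # converges to d = 0.8 / 1.4, about 0.5714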
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moffat, Harry K.; Jove-Colon, Carlos F.
2009-06-01
In this report, we summarize our work on developing a production level capability for modeling brine thermodynamic properties using the open-source code Cantera. This implementation into Cantera allows for the application of chemical thermodynamics to describe the interactions between a solid and an electrolyte solution at chemical equilibrium. The formulations to evaluate the thermodynamic properties of electrolytes are based on Pitzer's model to calculate molality-based activity coefficients using a real equation-of-state (EoS) for water. In addition, the thermodynamic properties of solutes at elevated temperatures and pressures are computed using the revised Helgeson-Kirkham-Flowers (HKF) EoS for ionic and neutral aqueous species. The thermodynamic data parameters for the Pitzer formulation and HKF EoS are from the thermodynamic database compilation developed for the Yucca Mountain Project (YMP) used with the computer code EQ3/6. We describe the adopted equations and their implementation within Cantera and also provide several validated examples relevant to the calculations of extensive properties of electrolyte solutions.
AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE
NASA Technical Reports Server (NTRS)
Liever, P. A.; Sheta, E. F.; Habchi, S. D.
2006-01-01
A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.
Some Progress in Large-Eddy Simulation using the 3-D Vortex Particle Method
NASA Technical Reports Server (NTRS)
Winckelmans, G. S.
1995-01-01
This two-month visit at CTR was devoted to investigating possibilities in LES modeling in the context of the 3-D vortex particle method (= vortex element method, VEM) for unbounded flows. A dedicated code was developed for that purpose. Although O(N^2) and thus slow, it offers the advantage that it can easily be modified to try out many ideas on problems involving up to N ~ 10^4 particles. Energy spectra (which require O(N^2) operations per wavenumber) are also computed. Progress was realized in the following areas: particle redistribution schemes, relaxation schemes to maintain the solenoidal condition on the particle vorticity field, simple LES models and their VEM extension, and possible new avenues in LES. Model problems that involve strong interaction between vortex tubes were computed, together with diagnostics: total vorticity, linear and angular impulse, energy and energy spectrum, and enstrophy. More work is needed, however, especially regarding relaxation schemes and further validation and development of LES models for VEM. Finally, what works well will eventually have to be incorporated into the fast parallel tree code.
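The O(N^2) cost mentioned above comes from the direct Biot-Savart sum over all particle pairs. A sketch of that kernel with algebraic smoothing (illustrative; the study's particular regularization and diagnostics are not reproduced):

import numpy as np

# Direct O(N^2) Biot-Savart velocity evaluation for regularized vortex
# particles: u(x_i) = sum_j alpha_j x (x_i - x_j) / (4 pi |r|^3), with
# a smoothing length sigma to regularize the kernel.
def biot_savart(x, alpha, sigma):
    # x: (N,3) particle positions, alpha: (N,3) vorticity strengths
    n = x.shape[0]
    u = np.zeros_like(x)
    for i in range(n):                        # O(N^2) pairwise loop
        r = x[i] - x                          # (N,3) separations
        r2 = np.sum(r * r, axis=1) + sigma**2
        kern = 1.0 / (4.0 * np.pi * r2**1.5)  # smoothed 1/r^3 kernel
        u[i] = np.sum(np.cross(alpha, r) * kern[:, None], axis=0)
    return u

rng = np.random.default_rng(3)
x = rng.normal(size=(1000, 3))
alpha = rng.normal(size=(1000, 3)) * 1e-3
u = biot_savart(x, alpha, sigma=0.1)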
NASA Technical Reports Server (NTRS)
Mertens, Christopher J.; Moyers, Michael F.; Walker, Steven A.; Tweed, John
2010-01-01
Recent developments in NASA s deterministic High charge (Z) and Energy TRaNsport (HZETRN) code have included lateral broadening of primary ion beams due to small-angle multiple Coulomb scattering, and coupling of the ion-nuclear scattering interactions with energy loss and straggling. This new version of HZETRN is based on Green function methods, called GRNTRN, and is suitable for modeling transport with both space environment and laboratory boundary conditions. Multiple scattering processes are a necessary extension to GRNTRN in order to accurately model ion beam experiments, to simulate the physical and biological-effective radiation dose, and to develop new methods and strategies for light ion radiation therapy. In this paper we compare GRNTRN simulations of proton lateral broadening distributions with beam measurements taken at Loma Linda University Proton Therapy Facility. The simulated and measured lateral broadening distributions are compared for a 250 MeV proton beam on aluminum, polyethylene, polystyrene, bone substitute, iron, and lead target materials. The GRNTRN results are also compared to simulations from the Monte Carlo MCNPX code for the same projectile-target combinations described above.
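Small-angle multiple Coulomb scattering of the kind added to GRNTRN is commonly summarized by the Highland formula for the RMS projected scattering angle. A sketch of that estimate and the implied lateral spread after a thin slab (a textbook approximation with rounded numbers, not the GRNTRN formulation itself):

import numpy as np

# Highland formula: theta0 = (13.6 MeV / (beta*c*p)) * z * sqrt(x/X0)
#                            * (1 + 0.038 ln(x/X0)),
# giving the RMS projected scattering angle for a slab of thickness
# x/X0 radiation lengths; the RMS lateral offset follows geometrically.
def highland_theta0(p_mev, beta, z, x_over_X0):
    return (13.6 / (beta * p_mev)) * z * np.sqrt(x_over_X0) * \
           (1.0 + 0.038 * np.log(x_over_X0))

# 250 MeV proton: p ~ 729 MeV/c, beta ~ 0.61 (rounded kinematics)
theta0 = highland_theta0(729.0, 0.61, 1, 0.05)   # 5% of a radiation length
slab = 10.0                                      # cm of material traversed
print(theta0, slab * theta0 / np.sqrt(3.0))      # angle (rad), lateral RMS (cm)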
Latent hardening size effect in small-scale plasticity
NASA Astrophysics Data System (ADS)
Bardella, Lorenzo; Segurado, Javier; Panteghini, Andrea; Llorca, Javier
2013-07-01
We aim at understanding the multislip behaviour of metals subject to irreversible deformations at small-scales. By focusing on the simple shear of a constrained single-crystal strip, we show that discrete Dislocation Dynamics (DD) simulations predict a strong latent hardening size effect, with smaller being stronger in the range [1.5 µm, 6 µm] for the strip height. We attempt to represent the DD pseudo-experimental results by developing a flow theory of Strain Gradient Crystal Plasticity (SGCP), involving both energetic and dissipative higher-order terms and, as a main novelty, a strain gradient extension of the conventional latent hardening. In order to discuss the capability of the SGCP theory proposed, we implement it into a Finite Element (FE) code and set its material parameters on the basis of the DD results. The SGCP FE code is specifically developed for the boundary value problem under study so that we can implement a fully implicit (Backward Euler) consistent algorithm. Special emphasis is placed on the discussion of the role of the material length scales involved in the SGCP model, from both the mechanical and numerical points of view.
InSAR Scientific Computing Environment
NASA Technical Reports Server (NTRS)
Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.
2011-01-01
This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar) and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and new codes, abstraction and generalization of the data model for efficient manipulation of objects among modules, and well-designed module interfaces suitable for command-line execution or GUI programming. The framework is designed to allow user contributions to promote maximum utility and sophistication of the code, creating an open-source community that could extend the framework into the indefinite future.
Update On the Status of the FLUKA Monte Carlo Transport Code*
NASA Technical Reports Server (NTRS)
Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.
2006-01-01
The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies, the addition of the online capability to evolve radioactive products and get subsequent dose rates, and upgrading of the treatment of EM interactions with the elimination of the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photo-electric effect, an improved pair production model, and new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus-nucleus interactions, the electromagnetic dissociation of heavy ions has been added along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available along with rQMD 2.4 for heavy ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64-bit randomness, and new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool. On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation dose for beam target areas, and dose calculations for radiation therapy, as well as being adapted for use in the simulation of events in the ALICE detector at the LHC.
Subjective evaluation of next-generation video compression algorithms: a case study
NASA Astrophysics Data System (ADS)
De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio
2010-08-01
This paper describes the details and the results of the subjective quality evaluation performed at EPFL, as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) for the definition of the next-generation video coding standard. The performance of 27 coding technologies has been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.
2006-01-01
GDA (Geologic Data Assistant) is an extension to ArcPad, a mobile mapping software program by Environmental Systems Research Institute (ESRI) designed to run on personal digital assistant (PDA) computers. GDA and ArcPad allow a PDA to replace the paper notebook and field map traditionally used for geologic mapping. GDA allows easy collection of field data.
Development of the PARVMEC Code for Rapid Analysis of 3D MHD Equilibrium
NASA Astrophysics Data System (ADS)
Seal, Sudip; Hirshman, Steven; Cianciosa, Mark; Wingen, Andreas; Unterberg, Ezekiel; Wilcox, Robert; ORNL Collaboration
2015-11-01
The VMEC three-dimensional (3D) MHD equilibrium code has been used extensively for designing stellarator experiments and analyzing experimental data in such strongly 3D systems. Recent applications of VMEC include 2D systems such as tokamaks (in particular, the D3D experiment), where application of very small (delB/B ~ 10^-3) 3D resonant magnetic field perturbations renders the underlying assumption of axisymmetry invalid. In order to facilitate the rapid analysis of such equilibria (for example, for reconstruction purposes), we have undertaken the task of parallelizing the VMEC code (PARVMEC) to produce a scalable and temporally rapidly convergent equilibrium code for use on parallel distributed memory platforms. The parallelization task naturally splits into three distinct parts: 1) radial surfaces in the fixed-boundary part of the calculation; 2) two 2D angular meshes needed to compute the Green's function integrals over the plasma boundary for the free-boundary part of the code; and 3) the block tridiagonal matrix needed to compute the full (3D) pre-conditioner near the final equilibrium state. Preliminary results show that scalability is achieved for tasks 1 and 3, with task 2 still nearing completion. The impact of this work on the rapid reconstruction of D3D plasmas using PARVMEC in the V3FIT code will be discussed. Work supported by U.S. DOE under Contract DE-AC05-00OR22725 with UT-Battelle, LLC.
NASA Astrophysics Data System (ADS)
de Schryver, C.; Weithoffer, S.; Wasenmüller, U.; Wehn, N.
2012-09-01
Channel coding is a standard technique in all wireless communication systems. In addition to the typically employed methods like convolutional coding, turbo coding or low density parity check (LDPC) coding, algebraic codes are used in many cases. For example, outer BCH coding is applied in the DVB-S2 standard for satellite TV broadcasting. A key operation for BCH and the related Reed-Solomon codes are multiplications in finite fields (Galois Fields), where extension fields of prime fields are used. A lot of architectures for multiplications in finite fields have been published over the last decades. This paper examines four different multiplier architectures in detail that offer the potential for very high throughputs. We investigate the implementation performance of these multipliers on FPGA technology in the context of channel coding. We study the efficiency of the multipliers with respect to area, frequency and throughput, as well as configurability and scalability. The implementation data of the fully verified circuits are provided for a Xilinx Virtex-4 device after place and route.
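The object all of these architectures compute is simple to state in software: multiplication in GF(2^m) is polynomial multiplication over GF(2), reduced modulo the field polynomial. A bit-serial reference implementation for GF(2^8) follows; the hardware multipliers studied in the paper restructure this same recurrence for area and throughput:

# Bit-serial multiplication in GF(2^8). 0x11D encodes
# x^8 + x^4 + x^3 + x^2 + 1, a primitive polynomial commonly used for
# Reed-Solomon codes over GF(2^8).
def gf256_mul(a, b, poly=0x11D):
    result = 0
    while b:
        if b & 1:
            result ^= a            # add (XOR) the current multiple of a
        b >>= 1
        a <<= 1                    # multiply a by x
        if a & 0x100:
            a ^= poly              # reduce modulo the field polynomial
    return result

assert gf256_mul(0x02, 0x80) == 0x1D   # x * x^7 = x^8 = x^4+x^3+x^2+1
print(gf256_mul(0x53, 0xCA))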
Statistical Analysis of CFD Solutions from the Third AIAA Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.; Hemsch, Michael J.
2007-01-01
The first AIAA Drag Prediction Workshop, held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third Drag Prediction Workshop focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This work evaluated the effect of grid refinement on the code-to-code scatter for the clean attached flow test cases and the separated flow test cases.
NASA Technical Reports Server (NTRS)
Sandor, Aniko; Moses, Haifa
2016-01-01
Speech alarms have been used extensively in aviation and included in International Building Codes (IBC) and National Fire Protection Association's (NFPA) Life Safety Code. However, they have not been implemented on space vehicles. Previous studies conducted at NASA JSC showed that speech alarms lead to faster identification and higher accuracy. This research evaluated updated speech and tone alerts in a laboratory environment and in the Human Exploration Research Analog (HERA) in a realistic setup.
Extending Mondrian Memory Protection
2010-11-01
a kernel semaphore is locked or unlocked. In addition, we extended the system call interface to receive notifications about user-land locking ... operations (such as calls to the mutex and semaphore code provided by the C library). By patching the dynamically loadable GLibC, we are able to test ... semaphores, and spinlocks. ... to loading extension plugins. This prevents any untrusted code
Unsteady transonic viscous-inviscid interaction using Euler and boundary-layer equations
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar; Whitfield, Dave
1989-01-01
The Euler code is used extensively for computation of transonic unsteady aerodynamics. The boundary layer code solves the 3-D, compressible, unsteady, mean flow kinetic energy integral boundary layer equations in the direct mode. Inviscid-viscous coupling is handled using porosity boundary conditions. Some of the advantages and disadvantages of using the Euler and boundary layer equations for investigating unsteady viscous-inviscid interaction are examined.
NASA Astrophysics Data System (ADS)
Sriprisan, Sirikul; Townsend, Lawrence; Cucinotta, Francis A.; Miller, Thomas M.
Purpose: An analytical knockout-ablation-coalescence model capable of making quantitative predictions of the neutron spectra from high-energy nucleon-nucleus and nucleus-nucleus collisions is being developed for use in space radiation protection studies. The FORTRAN computer code that implements this model is called UBERNSPEC. The knockout or abrasion stage of the model is based on Glauber multiple scattering theory. The ablation part of the model uses the classical evaporation model of Weisskopf-Ewing. In earlier work, the knockout-ablation model has been extended to incorporate important coalescence effects into the formalism. Recently, alpha coalescence has been incorporated, and the ability to predict light ion spectra with the coalescence model added. The earlier versions were limited to nuclei with mass numbers less than 69. In this work, the UBERNSPEC code has been extended to make predictions of secondary neutrons and light ion production from the interactions of heavy charged particles with higher mass numbers (as large as 238). The predictions are compared with published measurements of neutron spectra and light ion energy for a variety of collision pairs. Furthermore, the predicted spectra from this work are compared with the predictions from the recently-developed heavy ion event generator incorporated in the Monte Carlo radiation transport code HETC-HEDS.
Development of the Code RITRACKS
NASA Technical Reports Server (NTRS)
Plante, Ianik; Cucinotta, Francis A.
2013-01-01
A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).
Limit analysis of hollow spheres or spheroids with Hill orthotropic matrix
NASA Astrophysics Data System (ADS)
Pastor, Franck; Pastor, Joseph; Kondo, Djimedo
2012-03-01
Recent theoretical studies in the literature are concerned with the hollow sphere or spheroid (confocal) problems with an orthotropic Hill-type matrix. They have been developed in the framework of the limit analysis kinematical approach by using very simple trial velocity fields. The present Note provides, through numerical upper and lower bounds, a rigorous assessment of the approximate criteria derived in these theoretical works. To this end, existing static 3D codes for a von Mises matrix have been easily extended to the orthotropic case. Conversely, instead of the non-obvious extension of the existing kinematic codes, a new original mixed approach has been elaborated on the basis of the plane strain structure formulation earlier developed by F. Pastor (2007). Indeed, such a formulation does not need the expressions of the unit dissipated powers. Interestingly, it delivers a numerical code better conditioned and notably more rapid than the previous one, while preserving the rigorous upper bound character of the corresponding numerical results. The efficiency of the whole approach is first demonstrated through comparisons of the results to the analytical upper bounds of Benzerga and Besson (2001) or Monchiet et al. (2008) in the case of spherical voids in the Hill matrix. Moreover, we provide upper and lower bound results for the hollow spheroid with the Hill matrix, which are compared to those of Monchiet et al. (2008).
Biochemical and genetic analysis of the role of the viral polymerase in enterovirus recombination.
Woodman, Andrew; Arnold, Jamie J; Cameron, Craig E; Evans, David J
2016-08-19
Genetic recombination in single-strand, positive-sense RNA viruses is a poorly understood mechanism responsible for generating extensive genetic change and novel phenotypes. By moving a critical cis-acting replication element (CRE) from the polyprotein coding region to the 3' non-coding region we have further developed a cell-based assay (the 3'CRE-REP assay) to yield recombinants throughout the non-structural coding region of poliovirus from dually transfected cells. We have additionally developed a defined biochemical assay in which the only protein present is the poliovirus RNA-dependent RNA polymerase (RdRp), which recapitulates the strand transfer events of the recombination process. We have used both assays to investigate the role of polymerase fidelity and nucleotide turnover rates in recombination. Our results, of both poliovirus intertypic and intratypic recombination in the CRE-REP assay and using a range of polymerase variants in the biochemical assay, demonstrate that RdRp fidelity is a fundamental determinant of recombination frequency. High fidelity polymerases exhibit reduced recombination and low fidelity polymerases exhibit increased recombination in both assays. These studies provide the basis for the analysis of poliovirus recombination throughout the non-structural region of the virus genome and provide a defined biochemical assay to further dissect this important evolutionary process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Rapid Assessment of Agility for Conceptual Design Synthesis
NASA Technical Reports Server (NTRS)
Biezad, Daniel J.
1996-01-01
This project consists of designing and implementing a real-time graphical interface for a workstation-based flight simulator. It is capable of creating a three-dimensional out-the-window scene of the aircraft's flying environment, with extensive information about the aircraft's state displayed in the form of a heads-up-display (HUD) overlay. The code, written in the C programming language, makes calls to Silicon Graphics' Graphics Library (GL) to draw the graphics primitives. Included in this report is a detailed description of the capabilities of the code, including graphical examples, as well as a printout of the code itself.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brower, Richard C.
This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimizations of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.
Fast H.264/AVC FRExt intra coding using belief propagation.
Milani, Simone
2011-01-01
In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly overcomes the previous still image coding standards, like JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase of the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of the Intra coding with a small loss in the compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60 % of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits an accurate control of the computational complexity unlike other methods where the computational complexity depends upon the coded sequence.
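The complexity-reduction strategy reduces, in outline, to ranking modes by estimated probability and paying the rate-distortion cost only for the top few. A sketch of that selection step (mode probabilities are taken as given here; in the paper they come from the belief-propagation procedure):

# Sketch of probability-driven mode pruning for intra coding: rank the
# prediction modes by estimated probability and run the (expensive)
# rate-distortion evaluation only on the most probable candidates.
def prune_modes(mode_probs, budget=3):
    ranked = sorted(range(len(mode_probs)), key=lambda m: -mode_probs[m])
    return ranked[:budget]

def best_mode(block, mode_probs, rd_cost, budget=3):
    # rd_cost(block, mode) is the costly RD evaluation being avoided
    candidates = prune_modes(mode_probs, budget)
    return min(candidates, key=lambda m: rd_cost(block, m))

# Toy example: 9 directional modes, 3 survive pruning.
probs = [0.30, 0.02, 0.25, 0.05, 0.01, 0.20, 0.07, 0.06, 0.04]
print(prune_modes(probs))    # [0, 2, 5]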
Tiltrotor Aeroacoustic Code (TRAC) Prediction Assessment and Initial Comparisons with Tram Test Data
NASA Technical Reports Server (NTRS)
Burley, Casey L.; Brooks, Thomas F.; Charles, Bruce D.; McCluer, Megan
1999-01-01
A prediction sensitivity assessment to inputs and blade modeling is presented for the TiltRotor Aeroacoustic Code (TRAC). For this study, the non-CFD prediction system option in TRAC is used. Here, the comprehensive rotorcraft code, CAMRAD.Mod1, coupled with the high-resolution sectional loads code HIRES, predicts unsteady blade loads to be used in the noise prediction code WOPWOP. The sensitivity of the predicted blade motions, blade airloads, wake geometry, and acoustics is examined with respect to rotor rpm, blade twist and chord, and to blade dynamic modeling. To accomplish this assessment, an interim input-deck for the TRAM test model and an input-deck for a reference test model are utilized in both rigid and elastic modes. Both of these test models are regarded as near scale models of the V-22 proprotor (tiltrotor). With basic TRAC sensitivities established, initial TRAC predictions are compared to results of an extensive test of an isolated model proprotor. The test was that of the TiltRotor Aeroacoustic Model (TRAM) conducted in the Duits-Nederlandse Windtunnel (DNW). Predictions are compared to measured noise for the proprotor operating over an extensive range of conditions. The variation of predictions demonstrates the great care that must be taken in defining the blade motion. However, even with this variability, the predictions using the different blade modeling successfully capture (bracket) the levels and trends of the noise for conditions ranging from descent to ascent.
Does theory influence the effectiveness of health behavior interventions? Meta-analysis.
Prestwich, Andrew; Sniehotta, Falko F; Whittington, Craig; Dombrowski, Stephan U; Rogers, Lizzie; Michie, Susan
2014-05-01
To systematically investigate the extent and type of theory use in physical activity and dietary interventions, as well as associations between extent and type of theory use with intervention effectiveness. An in-depth analysis of studies included in two systematic reviews of physical activity and healthy eating interventions (k = 190). Extent and type of theory use was assessed using the Theory Coding Scheme (TCS) and intervention effectiveness was calculated using Hedges's g. Metaregressions assessed the relationships between these measures. Fifty-six percent of interventions reported a theory base. Of these, 90% did not report links between all of their behavior change techniques (BCTs) with specific theoretical constructs and 91% did not report links between all the specified constructs with BCTs. The associations between a composite score or specific items on the TCS and intervention effectiveness were inconsistent. Interventions based on Social Cognitive Theory or the Transtheoretical Model were similarly effective and no more effective than interventions not reporting a theory base. The coding of theory in these studies suggested that theory was not often used extensively in the development of interventions. Moreover, the relationships between type of theory used and the extent of theory use with effectiveness were generally weak. The findings suggest that attempts to apply the two theories commonly used in this review more extensively are unlikely to increase intervention effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.
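For reference, Hedges's g, the effect-size measure used above, is the pooled-SD standardized mean difference with a small-sample bias correction. A short sketch (the example numbers are invented):

import math

# Hedges's g: standardized mean difference using the pooled SD, scaled
# by the small-sample correction factor J = 1 - 3 / (4*df - 1).
def hedges_g(m1, s1, n1, m2, s2, n2):
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                         / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)
    return j * d

# Intervention vs. control on, e.g., minutes of activity per week:
print(hedges_g(155.0, 40.0, 60, 140.0, 42.0, 55))   # ~0.36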
Incorporating Non-Linear Sorption into High Fidelity Subsurface Reactive Transport Models
NASA Astrophysics Data System (ADS)
Matott, L. S.; Rabideau, A. J.; Allen-King, R. M.
2014-12-01
A variety of studies, including multiple NRC (National Research Council) reports, have stressed the need for simulation models that can provide realistic predictions of contaminant behavior during the groundwater remediation process, most recently highlighting the specific technical challenges of "back diffusion and desorption in plume models". For a typically sized remediation site, a minimum of about 70 million grid cells is required to resolve, at the desired cm-level thickness, the low-permeability lenses responsible for driving the back-diffusion phenomenon. Such discretization is nearly three orders of magnitude finer than is typically seen in modeling practice using public-domain codes like RT3D (Reactive Transport in Three Dimensions). Consequently, various extensions have been made to the RT3D code to support efficient modeling of recently proposed dual-mode non-linear sorption processes (e.g., Polanyi with linear partitioning) at high-fidelity scales of grid resolution. These extensions have facilitated development of exploratory models in which contaminants are introduced into an aquifer via an extended multi-decade "release period" and allowed to migrate under natural conditions for centuries. These realistic simulations of contaminant loading and migration provide a high-fidelity representation of the underlying diffusion and sorption processes that control remediation. Coupling such models with decision support processes is expected to facilitate improved long-term management of complex remediation sites that have proven intractable to conventional remediation strategies.
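The dual-mode sorption model class mentioned above (a Polanyi-type adsorption term plus linear partitioning) can be sketched compactly. The Python function below is an illustration only; the functional form is the commonly used Polanyi-Dubinin variant, and every parameter value is made up rather than taken from the study.

```python
import numpy as np

R = 8.314e-3  # gas constant, kJ/(mol*K)

def dual_mode_sorption(C, Sw, Q0, a, b, Kd, T=293.15):
    """Sorbed mass per solid mass: a Polanyi-Dubinin adsorption term
    plus linear partitioning. All parameter values are illustrative."""
    eps = R * T * np.log(Sw / C)          # sorption potential, kJ/mol
    q_ads = Q0 * 10.0 ** (a * eps ** b)   # Polanyi-Dubinin term (a < 0)
    return q_ads + Kd * C                 # linear partitioning adds on

# Illustrative call: aqueous concentration C and solubility Sw in mg/L,
# with made-up isotherm coefficients.
print(dual_mode_sorption(C=0.5, Sw=1100.0, Q0=80.0, a=-0.01, b=1.5, Kd=0.3))
```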
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viktor K. Decyk
The UCLA work on this grant was to design and help implement an object-oriented version of the GTC code, which is written in Fortran90. The GTC code is the main global gyrokinetic code used in this project, and over the years multiple, incompatible versions have evolved. The reason for this effort is to allow multiple authors to work together on GTC and to simplify future enhancements to GTC. The effort was designed to proceed incrementally. Initially, an upper layer of classes (derived types and methods) was implemented which called the original GTC code 'under the hood.' The derived types pointed to data in the original GTC code, and the methods called the original GTC subroutines. The original GTC code was modified only very slightly. This allowed one to define (and refine) a set of classes which described the important features of the GTC code in a new, more abstract way, with a minimum of implementation. Furthermore, classes could be added one at a time, and at the end of each day the code continued to work correctly. This work was done in close collaboration with Y. Nishimura from UC Irvine and Stefan Ethier from PPPL. Ten classes were ultimately defined and implemented: gyrokinetic and drift kinetic particles, scalar and vector fields, a mesh, jacobian, FLR, equilibrium, interpolation, and particle species descriptors. In the second stage of this development, some of the scaffolding was removed. The constructors in the class objects now allocated the data, and the array data in the original GTC code was removed. This isolated the components and allowed multiple instantiations of the objects to be created, in particular, multiple ion species. Again, the work was done incrementally, one class at a time, so that the code was always working properly. This work was done in close collaboration with Y. Nishimura and W. Zhang from UC Irvine and Stefan Ethier from PPPL. The third stage of this work was to integrate the capabilities of the various versions of the GTC code into one flexible and extensible version. To do this, we developed a methodology to implement Design Patterns in Fortran90. Design Patterns are abstract solutions to generic programming problems, which allow one to handle increased complexity. This work was done in collaboration with Henry Gardner, a computer scientist (and former plasma physicist) from the Australian National University. As an example, the Strategy Pattern is being used in GTC to support multiple solvers. This new code is currently being used in the study of energetic particles. A document describing the evolution of the GTC code to this new object-oriented version is available to users of GTC.
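The abstract names the Strategy Pattern as the mechanism for supporting multiple solvers. The GTC implementation is in Fortran90; purely to illustrate the pattern itself, here is a minimal Python sketch with hypothetical class names: interchangeable solvers sit behind one interface, so the simulation code never changes when a solver is swapped.

```python
from abc import ABC, abstractmethod

class FieldSolver(ABC):
    """Abstract strategy: any concrete solver plugs in here."""
    @abstractmethod
    def solve(self, charge_density): ...

class IterativeSolver(FieldSolver):
    def solve(self, charge_density):
        return [0.5 * rho for rho in charge_density]   # placeholder physics

class SpectralSolver(FieldSolver):
    def solve(self, charge_density):
        return [0.25 * rho for rho in charge_density]  # placeholder physics

class Simulation:
    def __init__(self, solver: FieldSolver):
        self.solver = solver          # swap solvers without touching this class

    def step(self, charge_density):
        return self.solver.solve(charge_density)

sim = Simulation(SpectralSolver())
print(sim.step([1.0, 2.0]))           # -> [0.25, 0.5]
```

This same separation of interface from implementation is what allows incompatible code versions to be folded into one extensible code base.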
Using Rose and Compass for Authentication
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2009-07-09
Many recent non-proliferation software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project. ROSE is an LLNL-developed robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. It continues to be extended to support the automated analysis of binaries (x86, ARM, and PowerPC). We continue to extend ROSE to address a number of security specific requirements and apply it to software authentication for non-proliferation projects. We will give an update on the status of our work.
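ROSE targets C, C++, FORTRAN, and binaries, and its API is not reproduced here. As a language-agnostic toy analogue of the kind of automated flagging described, the sketch below walks a Python syntax tree and reports call sites a reviewer might inspect; the deny-list is purely illustrative.

```python
import ast

SUSPICIOUS_CALLS = {"eval", "exec", "system"}   # illustrative deny-list

def flag_suspicious(source: str):
    """Report call sites worth human review: a toy analogue of what a
    source-analysis tool highlights for authentication reviewers."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SUSPICIOUS_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

print(flag_suspicious("x = eval(input())"))     # -> [(1, 'eval')]
```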
Resonant scattering experiments with radioactive nuclear beams - Recent results and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teranishi, T.; Sakaguchi, S.; Uesaka, T.
2013-04-19
Resonant scattering with low-energy radioactive nuclear beams of E < 5 MeV/u has been studied at CRIB of CNS and at RIPS of RIKEN. As an extension to the present experimental technique, we will install an advanced polarized proton target for resonant scattering experiments. A Monte-Carlo simulation was performed to study the feasibility of future experiments with the polarized target. In the Monte-Carlo simulation, excitation functions and analyzing powers were calculated using a newly developed R-matrix calculation code. A project of a small-scale radioactive beam facility at Kyushu University is also briefly described.
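The R-matrix code itself is not described in the abstract. In its single-level limit, however, an excitation function reduces to the familiar Breit-Wigner form, which a simulation of this kind could evaluate as sketched below; the units, statistical factor, and prefactor argument are all illustrative.

```python
import numpy as np

def breit_wigner(E, E_r, Gamma_in, Gamma_out, g=1.0, C=1.0):
    """Single-level Breit-Wigner excitation function, the one-level limit
    of an R-matrix calculation. Energies in consistent units (e.g. MeV)."""
    Gamma = Gamma_in + Gamma_out
    return (C / E) * g * Gamma_in * Gamma_out / (
        (E - E_r) ** 2 + (Gamma / 2.0) ** 2)

E = np.linspace(0.5, 3.0, 6)       # illustrative beam energies, MeV
print(breit_wigner(E, E_r=1.6, Gamma_in=0.1, Gamma_out=0.2))
```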
Application of a Scalable, Parallel, Unstructured-Grid-Based Navier-Stokes Solver
NASA Technical Reports Server (NTRS)
Parikh, Paresh
2001-01-01
A parallel version of an unstructured-grid-based Navier-Stokes solver, USM3Dns, previously developed for efficient operation on a variety of parallel computers, has been enhanced to incorporate upgrades made to the serial version. The resultant parallel code has been extensively tested on a variety of problems of aerospace interest and on two sets of parallel computers to understand and document its characteristics. An innovative grid renumbering construct and use of non-blocking communication are shown to produce superlinear computing performance. Preliminary results from parallelization of a recently introduced "porous surface" boundary condition are also presented.
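Non-blocking communication, credited above (together with grid renumbering) for the superlinear performance, is easy to sketch in isolation. The fragment below shows the overlap idea with mpi4py on a toy 1-D array; it is not the USM3Dns implementation.

```python
from mpi4py import MPI
import numpy as np

# Toy 1-D halo exchange: each rank trades edge cells with its neighbors
# while computing on its interior, overlapping communication and work.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size

local = np.full(8, float(rank))
ghost_left, ghost_right = np.empty(1), np.empty(1)

reqs = [comm.Irecv(ghost_left, source=left),
        comm.Irecv(ghost_right, source=right),
        comm.Isend(local[:1], dest=left),
        comm.Isend(local[-1:], dest=right)]
interior = local[1:-1].sum()      # useful work while messages are in flight
MPI.Request.Waitall(reqs)         # ghosts now valid; proceed with stencil
```

Run under mpiexec, each rank posts its sends and receives first, computes on interior cells while edge values travel, and only then waits.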
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasari, Venkat; Sadlier, Ronald J; Geerhart, Mr. Billy
Well-defined and stable quantum networks are essential to realize functional quantum applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. We develop new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.
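Of the quantum applications named, superdense coding is compact enough to demonstrate with plain linear algebra. The toy below (numpy only, no networking or hardware) encodes two classical bits into the one qubit Alice sends from a pre-shared Bell pair.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
encode = {'00': I, '01': X, '10': Z, '11': X @ Z}    # Alice's local gate

def superdense(bits):
    state = np.kron(encode[bits], I) @ bell          # acts on Alice's qubit only
    basis = {b: np.kron(g, I) @ bell for b, g in encode.items()}
    return max(basis, key=lambda b: abs(basis[b] @ state))  # Bell measurement

print(superdense('10'))   # -> '10': two classical bits per transmitted qubit
```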
Program Helps To Determine Chemical-Reaction Mechanisms
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Radhakrishnan, K.
1995-01-01
General Chemical Kinetics and Sensitivity Analysis (LSENS) computer code developed for use in solving complex, homogeneous, gas-phase, chemical-kinetics problems. Provides efficient and accurate chemical-kinetics computations and sensitivity analysis for variety of problems, including problems involving nonisothermal conditions. Incorporates mathematical models for static system, steady one-dimensional inviscid flow, reaction behind incident shock wave (with boundary-layer correction), and perfectly stirred reactor. Computations of equilibrium properties performed for following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. Written in FORTRAN 77 with exception of NAMELIST extensions used for input.
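LSENS's NAMELIST input format and solvers are not reproduced here. As a generic illustration of the static-system problem class it addresses, the sketch below integrates a toy mass-action mechanism as an ODE system; the mechanism and rate constants are made up.

```python
from scipy.integrate import solve_ivp

# Toy mechanism A -> B -> C under isothermal, constant-volume conditions;
# rate constants are illustrative, not an LSENS input deck.
k1, k2 = 2.0, 1.0

def rhs(t, y):
    A, B, C = y
    return [-k1 * A, k1 * A - k2 * B, k2 * B]   # mass-action rates

sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0, 0.0], method='LSODA')
print(sol.y[:, -1])   # species concentrations at t = 5
```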
RAVE—a Detector-independent vertex reconstruction toolkit
NASA Astrophysics Data System (ADS)
Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian
2007-10-01
A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
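The Kalman filter cited as one of RAVE's core estimators follows the standard linear update equations, sketched below in numpy. This is the generic textbook update, not RAVE's C++ API.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One Kalman measurement update, the linear estimator at the core of
    geometric vertex fitting. x: state, P: covariance, z: measurement."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P   # corrected covariance
    return x_new, P_new

# Toy example: a 2-D vertex position updated with one measured crossing point.
x0, P0 = np.zeros(2), np.eye(2)
z, H, R = np.array([0.3, -0.1]), np.eye(2), 0.01 * np.eye(2)
print(kalman_update(x0, P0, z, H, R)[0])   # pulled close to the measurement
```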
Computational Nuclear Physics and Post Hartree-Fock Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietz, Justin; Novario, Sam; Hjorth-Jensen, M.
We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10 and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, allowing thereby an eventual reader to start writing her/his own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.
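In the spirit of the code examples the chapter provides, the skeleton below shows the structure of a Hartree-Fock self-consistency loop. The matrices are toy input, and the two-body index convention is ours, not necessarily the chapter's.

```python
import numpy as np

def hartree_fock(h, V, n_occ, tol=1e-8, max_iter=200):
    """Skeletal SCF loop. h: one-body matrix; V[p,q,r,s]: antisymmetrized
    two-body matrix elements (toy input)."""
    n = h.shape[0]
    C = np.eye(n)                                      # initial basis
    for _ in range(max_iter):
        rho = C[:, :n_occ] @ C[:, :n_occ].T            # density matrix
        fock = h + np.einsum('prqs,sr->pq', V, rho)    # add mean field
        eps, C_new = np.linalg.eigh(fock)              # diagonalize Fock matrix
        rho_new = C_new[:, :n_occ] @ C_new[:, :n_occ].T
        if np.linalg.norm(rho_new - rho) < tol:
            return eps, C_new                          # self-consistent
        C = C_new
    raise RuntimeError("SCF did not converge")

# Trivial check: with V = 0 the HF spectrum reproduces h's eigenvalues.
h = np.diag([1.0, 2.0, 3.0, 4.0])
V = np.zeros((4, 4, 4, 4))
print(hartree_fock(h, V, n_occ=2)[0])
```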
SP_ACE: A new code to estimate T_eff, log g, and elemental abundances
NASA Astrophysics Data System (ADS)
Boeche, C.
2016-09-01
SP_ACE is a FORTRAN95 code that derives stellar parameters and elemental abundances from stellar spectra. To derive these parameters, SP_ACE does not measure equivalent widths of lines, nor does it use templates of synthetic spectra; instead, it employs a new method based on a library of General Curve-Of-Growths. To date SP_ACE works on the wavelength ranges 5212-6860 Å and 8400-8921 Å, and at spectral resolutions R=2000-20000. Extensions of these limits are possible. SP_ACE is a highly automated code suitable for application to large spectroscopic surveys. A web front end to this service is publicly available at http://dc.g-vo.org/SP_ACE together with the library and the binary code.
Development of an Unstructured Mesh Code for Flows About Complete Vehicles
NASA Technical Reports Server (NTRS)
Peraire, Jaime; Gupta, K. K. (Technical Monitor)
2001-01-01
This report describes the research work undertaken at the Massachusetts Institute of Technology under NASA Research Grant NAG4-157. The aim of this research is to identify effective algorithms and methodologies for the efficient and routine solution of flow simulations about complete vehicle configurations. For over ten years we have received support from NASA to develop unstructured mesh methods for Computational Fluid Dynamics. As a result of this effort, a methodology based on the use of unstructured adapted meshes of tetrahedra and finite volume flow solvers has been developed. A number of gridding algorithms, flow solvers, and adaptive strategies have been proposed. The most successful algorithms developed form the basis of the unstructured mesh system FELISA. The FELISA system has been used extensively for the analysis of transonic and hypersonic flows about complete vehicle configurations. The system is highly automatic and allows for the routine aerodynamic analysis of complex configurations starting from CAD data. The code has been parallelized and utilizes efficient solution algorithms. For hypersonic flows, a version of the code which incorporates real-gas effects has been produced. The FELISA system is also a component of the STARS aeroservoelastic system developed at NASA Dryden. One of the latest developments before the start of this grant was to extend the system to include viscous effects. This required the development of viscous mesh generators, capable of generating the anisotropic grids required to represent boundary layers, and of viscous flow solvers. We show some sample hypersonic viscous computations using the developed viscous mesh generators and solvers. Although these initial results were encouraging, it became apparent that in order to develop a fully functional capability for viscous flows, several advances in solution accuracy, robustness, and efficiency were required. In this grant we set out to investigate some novel methodologies that could lead to the required improvements. In particular, we focused on two fronts: (1) finite element methods and (2) iterative algebraic multigrid solution techniques.
High Efficiency Low Cost CO2 Compression Using Supersonic Shock Wave Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, J; Aarnio, M; Grosvenor, A
2010-12-31
Development and testing results from a supersonic compressor are presented. The compressor achieved record pressure ratio for a fully-supersonic stage and successfully demonstrated the technology potential. Several tasks were performed in compliance with the DOE award objectives. A high-pressure ratio compressor was retrofitted to improve rotordynamics behavior and successfully tested. An outside review panel confirmed test results and design approach. A computational fluid dynamics code used to analyze the Ramgen supersonic flowpath was extensively and successfully modified to improve use on high-performance computing platforms. A comprehensive R&D implementation plan was developed and used to lay the groundwork for a future full-scale compressor demonstration. Conceptual design for a CO2 demonstration compressor was developed and reviewed.
Axon-Sorting Multifunctional Nerve Guides: Accelerating Restoration of Nerve Function
2014-10-01
factor (singly & in selected combinations) in the organotypic model system for preferential sensory or motor axon extension. Use confocal microscopy to...track axon extension of labeled sensory or motor neurons from spinal cord slices (motor) or dorsal root ganglia ( DRG ) (sensory). 20 Thy1-YFP mice...RESEARCH ACCOMPLISHMENTS: • Established a system of color-coded mixed nerve tracking using GFP and RFP expressing motor and sensory neurons (Figure 1
Automatic Multilevel Parallelization Using OpenMP
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Jost, Gabriele; Yan, Jerry; Ayguade, Eduard; Gonzalez, Marc; Martorell, Xavier; Biegel, Bryan (Technical Monitor)
2002-01-01
In this paper we describe the extension of the CAPO (CAPtools (Computer Aided Parallelization Toolkit) OpenMP) parallelization support tool to support multilevel parallelism based on OpenMP directives. CAPO generates OpenMP directives with extensions supported by the NanosCompiler to allow for directive nesting and definition of thread groups. We report some results for several benchmark codes and one full application that have been parallelized using our system.