Assessment of the MHD capability in the ATHENA code using data from the ALEX facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roth, P.A.
1989-03-01
The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code is a system transient analysis code with multi-loop, multi-fluid capabilities, which is available to the fusion community at the National Magnetic Fusion Energy Computing Center (NMFECC). The work reported here assesses the ATHENA magnetohydrodynamic (MHD) pressure drop model for liquid metals flowing through a strong magnetic field. An ATHENA model was developed for two simple-geometry, adiabatic test sections used in the Argonne Liquid Metal Experiment (ALEX) at Argonne National Laboratory (ANL). The pressure drops calculated by ATHENA agreed well with the experimental results from the ALEX facility.
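For orientation, the dominant MHD pressure drop for liquid-metal duct flow in a strong transverse field scales with σuB². The sketch below is an illustrative textbook-style estimate, not ATHENA's actual correlation; the wall conductance ratio `c` and the NaK-like property values are assumptions.

```python
import math

def hartmann_number(B, a, sigma, rho, nu):
    """Ha = B * a * sqrt(sigma / (rho * nu)): ratio of electromagnetic
    to viscous forces (B in T, half-width a in m, SI properties)."""
    return B * a * math.sqrt(sigma / (rho * nu))

def mhd_pressure_gradient(u, B, sigma, c):
    """Fully developed duct-flow estimate for thin conducting walls:
    dP/dx ~ sigma * u * B**2 * c / (1 + c), c = wall conductance ratio."""
    return sigma * u * B**2 * c / (1.0 + c)

# Illustrative numbers only (NaK-like properties; not the ALEX test matrix)
Ha = hartmann_number(B=2.0, a=0.05, sigma=2.6e6, rho=870.0, nu=1.0e-6)
dPdx = mhd_pressure_gradient(u=1.0, B=2.0, sigma=2.6e6, c=0.05)
```

At the field strengths of interest Ha is of order 10³ or more, which is why the MHD term dwarfs the ordinary friction pressure drop.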
NASA Astrophysics Data System (ADS)
Calafiura, Paolo; Leggett, Charles; Seuster, Rolf; Tsulaia, Vakhtang; Van Gemmeren, Peter
2015-12-01
AthenaMP is a multi-process version of the ATLAS reconstruction, simulation and data analysis framework Athena. By leveraging Linux fork and copy-on-write mechanisms, it allows for sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service) which allows the running of AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of AthenaMP in the diversity of ATLAS event processing workloads on various computing resources: Grid, opportunistic resources and HPC.
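The fork-and-copy-on-write idea behind AthenaMP can be sketched in a few lines. This is a schematic (POSIX-only), not AthenaMP's actual scheduler; the names and the event-chunk scheme are ours.

```python
import os

# Large read-only data loaded once in the parent; after fork(), child
# processes share these memory pages copy-on-write, so resident memory
# is not duplicated until a process actually writes to a page.
shared_table = list(range(100_000))

def process_events(events):
    # Pure reads: touches the parent's pages without triggering copies.
    return sum(shared_table[e % len(shared_table)] for e in events)

def run_workers(event_chunks):
    pids = []
    for chunk in event_chunks:
        pid = os.fork()
        if pid == 0:                  # child: do the work, exit immediately
            process_events(chunk)
            os._exit(0)
        pids.append(pid)
    for pid in pids:                  # parent: reap all children
        os.waitpid(pid, 0)

run_workers([[0, 1, 2], [3, 4, 5]])
```

The key point is that the application code inside `process_events` is unchanged; the memory sharing comes entirely from the operating system's copy-on-write semantics.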
ATHENA 3D: A finite element code for ultrasonic wave propagation
NASA Astrophysics Data System (ADS)
Rose, C.; Rupin, F.; Fouquet, T.; Chassignole, B.
2014-04-01
The understanding of wave propagation phenomena requires robust numerical models. 3D finite element (FE) models are generally prohibitively time consuming; however, advances in processor speed and memory are making them more and more competitive. In this context, EDF R&D developed the 3D version of the well-validated FE code ATHENA2D. The code is dedicated to the simulation of wave propagation in all kinds of elastic media and, in particular, in heterogeneous and anisotropic materials such as welds. It is based on solving the elastodynamic equations in the calculation zone, expressed in terms of stress and particle velocities. A distinctive feature of the code is that the calculation domain is discretized on a regular Cartesian 3D mesh, while a defect of complex geometry can be described on a separate (2D) mesh using the fictitious domains method. This combines the speed of regular-mesh computation with the ability to model arbitrarily shaped defects. Furthermore, the calculation domain is advanced with a quasi-explicit time evolution scheme, so only small local linear systems have to be solved. The final reduction in computation time comes from ATHENA3D having been parallelized and adapted to HPC resources. In this paper, the validation of the 3D FE model is discussed. A cross-validation of ATHENA 3D and CIVA is proposed for several inspection configurations. Performance in terms of calculation time is also presented for both local computer and computation cluster use.
Minerva: Cylindrical coordinate extension for Athena
NASA Astrophysics Data System (ADS)
Skinner, M. Aaron; Ostriker, Eve C.
2013-02-01
Minerva is a cylindrical coordinate extension of the Athena astrophysical MHD code of Stone, Gardiner, Teuben, and Hawley. The extension follows the approach of Athena's original developers and has been designed to alter the existing Cartesian-coordinates code as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport (CT), a central feature of the Athena algorithm, while making use of previously implemented code modules such as the Riemann solvers. Angular momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully.
IMPLEMENTATION OF SINK PARTICLES IN THE ATHENA CODE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong Hao; Ostriker, Eve C., E-mail: hgong@astro.umd.edu, E-mail: eco@astro.princeton.edu
2013-01-15
We describe the implementation and tests of sink particle algorithms in the Eulerian grid-based code Athena. The introduction of sink particles enables the long-term evolution of systems in which localized collapse occurs, and it is impractical (or unnecessary) to resolve the accretion shocks at the centers of collapsing regions. We discuss the similarities and differences of our methods compared to other implementations of sink particles. Our criteria for sink creation are motivated by the properties of the Larson-Penston collapse solution. We use standard particle-mesh methods to compute particle and gas gravity together. Accretion of mass and momenta onto sinks is computed using fluxes returned by the Riemann solver. A series of tests based on previous analytic and numerical collapse solutions is used to validate our method and implementation. We demonstrate use of our code for applications with a simulation of planar converging supersonic turbulent flow, in which multiple cores form and collapse to create sinks; these sinks continue to interact and accrete from their surroundings over several Myr.
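A creation criterion of the kind the abstract describes can be sketched as a density threshold taken from the Larson-Penston asymptotic profile evaluated at the grid scale. This is a sketch under our own assumptions: the evaluation radius `dx/2` and the extra potential-minimum check are illustrative, not necessarily the paper's exact recipe.

```python
import math

G = 6.674e-8  # gravitational constant, cgs units

def lp_density_threshold(cs, dx):
    """Larson-Penston asymptotic profile rho_LP(r) = 8.86 cs^2 / (4 pi G r^2),
    evaluated at half a grid cell, r = dx/2 (cs in cm/s, dx in cm)."""
    r = 0.5 * dx
    return 8.86 * cs**2 / (4.0 * math.pi * G * r**2)

def should_create_sink(rho_cell, cs, dx, at_potential_minimum):
    # Hypothetical two-part test: the cell must exceed the LP threshold
    # *and* sit at a local minimum of the gravitational potential.
    return at_potential_minimum and rho_cell > lp_density_threshold(cs, dx)
```

The point of tying the threshold to the grid spacing is that a cell only spawns a sink once collapse has proceeded beyond what the mesh can resolve.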
NASA Astrophysics Data System (ADS)
White, Christopher Joseph
We describe the implementation of sophisticated numerical techniques for general-relativistic magnetohydrodynamics simulations in the Athena++ code framework. Improvements over many existing codes include the use of advanced Riemann solvers and of staggered-mesh constrained transport. Combined with considerations for computational performance and parallel scalability, these allow us to investigate black hole accretion flows with unprecedented accuracy. The capability of the code is demonstrated by exploring magnetically arrested disks.
A validated non-linear Kelvin-Helmholtz benchmark for numerical hydrodynamics
NASA Astrophysics Data System (ADS)
Lecoanet, D.; McCourt, M.; Quataert, E.; Burns, K. J.; Vasil, G. M.; Oishi, J. S.; Brown, B. P.; Stone, J. M.; O'Leary, R. M.
2016-02-01
The non-linear evolution of the Kelvin-Helmholtz instability is a popular test for code verification. To date, most Kelvin-Helmholtz problems discussed in the literature are ill-posed: they do not converge to any single solution with increasing resolution. This precludes comparisons among different codes and severely limits the utility of the Kelvin-Helmholtz instability as a test problem. The lack of a reference solution has led various authors to assert the accuracy of their simulations based on ad hoc proxies, e.g. the existence of small-scale structures. This paper proposes well-posed two-dimensional Kelvin-Helmholtz problems with smooth initial conditions and explicit diffusion. We show that in many cases numerical errors/noise can seed spurious small-scale structure in Kelvin-Helmholtz problems. We demonstrate convergence to a reference solution using both ATHENA, a Godunov code, and DEDALUS, a pseudo-spectral code. Problems with constant initial density throughout the domain are relatively straightforward for both codes. However, problems with an initial density jump (which are the norm in astrophysical systems) exhibit rich behaviour and are more computationally challenging. In the latter case, ATHENA simulations are prone to an instability of the inner rolled-up vortex; this instability is seeded by grid-scale errors introduced by the algorithm, and disappears as resolution increases. Both ATHENA and DEDALUS exhibit late-time chaos. Inviscid simulations are riddled with extremely vigorous secondary instabilities which induce more mixing than simulations with explicit diffusion. Our results highlight the importance of running well-posed test problems with demonstrated convergence to a reference solution. To facilitate future comparisons, we include as supplementary material the resolved, converged solutions to the Kelvin-Helmholtz problems in this paper in machine-readable form.
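A well-posed setup of the kind advocated above combines a smooth shear profile, a smooth seed perturbation, and explicit diffusion. The profile below is an illustrative stand-in, not the paper's exact initial condition; all parameter values are assumptions.

```python
import numpy as np

def kh_initial_conditions(x, z, u0=1.0, a=0.05, amp=0.01, sigma=0.2):
    """Smooth (infinitely differentiable) shear layer at z = 0 plus a
    localized sinusoidal vz kick, so no grid-scale discontinuity can
    seed spurious secondary instabilities."""
    vx = u0 * np.tanh(z / a)
    vz = amp * np.sin(2.0 * np.pi * x) * np.exp(-((z / sigma) ** 2))
    return vx, vz

x = np.linspace(0.0, 1.0, 64)
z = np.zeros_like(x)
vx, vz = kh_initial_conditions(x, z)
```

With smooth initial data and an explicit viscosity/diffusivity, the solution at fixed Reynolds number is unique, so different codes can be compared against the same reference run.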
The Athena Astrophysical MHD Code in Cylindrical Geometry
NASA Astrophysics Data System (ADS)
Skinner, M. A.; Ostriker, E. C.
2011-10-01
We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained transport updates. Finally, we have developed a test suite of standard and novel problems in one-, two-, and three-dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Christopher J.; Stone, James M.; Gammie, Charles F.
2016-08-01
We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.
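For context, the HLL family of solvers starts from the two-wave average below; HLLC and HLLD reduce numerical diffusion by restoring the intermediate (contact and Alfvén) waves that this average smears out. A single-variable sketch, not the Athena++ implementation:

```python
def hll_flux(FL, FR, UL, UR, sL, sR):
    """Two-wave HLL flux for one conserved variable.
    FL, FR: left/right fluxes; UL, UR: left/right states;
    sL, sR: leftmost/rightmost signal-speed estimates."""
    if sL >= 0.0:        # all waves move right: upwind is the left state
        return FL
    if sR <= 0.0:        # all waves move left: upwind is the right state
        return FR
    # Integral average over the Riemann fan between sL and sR
    return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)
```

Because the single-state average cannot represent a stationary contact discontinuity, HLL smears contacts badly; that is precisely the diffusion the abstract's "less diffusive" HLLC/HLLD solvers remove.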
Chassignole, B; Duwig, V; Ploix, M-A; Guy, P; El Guerjouma, R
2009-12-01
Multipass welds made in austenitic stainless steel, in the primary circuit of nuclear power plants with pressurized water reactors, are characterized by an anisotropic and heterogeneous structure that disturbs ultrasonic propagation and makes ultrasonic non-destructive testing difficult. The ATHENA 2D finite element simulation code was developed to help understand the various physical phenomena at play. In this paper, we describe the attenuation model implemented in this code to account for the wave scattering phenomenon in polycrystalline materials. In particular, this model is based on the optimization of two tensors that characterize the material, starting from experimental values of ultrasonic velocities and attenuation coefficients. Three experimental configurations, two of which are representative of the industrial weld assessment case, are studied with a view to validating the model through comparison with the simulation results. We thus provide quantitative proof that taking the attenuation into account in the ATHENA code dramatically improves the results in terms of the amplitude of the echoes. The association of the code with a detailed characterization of a weld's structure constitutes a remarkable breakthrough in the interpretation of ultrasonic testing on this type of component.
AstroBlend: Visualization package for use with Blender
NASA Astrophysics Data System (ADS)
Naiman, J. P.
2015-12-01
AstroBlend is a visualization package for use in the three-dimensional animation and modeling software Blender. It reads data in via a text file or can use pre-fab isosurface files stored in the Wavefront OBJ format. AstroBlend supports a variety of codes such as FLASH (ascl:1010.082), Enzo (ascl:1010.072), and Athena (ascl:1010.014), and combines artistic 3D models with computational astrophysics datasets to create models and animations.
A Two-moment Radiation Hydrodynamics Module in ATHENA Using a Godunov Method
NASA Astrophysics Data System (ADS)
Skinner, M. A.; Ostriker, E. C.
2013-04-01
We describe a module for the Athena code that solves the grey equations of radiation hydrodynamics (RHD) using a local variable Eddington tensor (VET) based on the M1 closure of the two-moment hierarchy of the transfer equation. The variables are updated via a combination of explicit Godunov methods to advance the gas and radiation variables including the non-stiff source terms, and a local implicit method to integrate the stiff source terms. We employ the reduced speed of light approximation (RSLA) with subcycling of the radiation variables in order to reduce computational costs. The streaming and diffusion limits are well-described by the M1 closure model, and our implementation shows excellent behavior for problems containing both regimes simultaneously. Our operator-split method is ideally suited for problems with a slowly-varying radiation field and dynamical gas flows, in which the effect of the RSLA is minimal.
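The subcycling bookkeeping implied by the RSLA can be sketched as follows. This is schematic; the function name and the radiation CFL factor are our assumptions.

```python
import math

def radiation_subcycles(dt_hydro, dx, chat, cfl_rad=0.4):
    """With a reduced radiation signal speed chat (< c), the explicit
    radiation update is stable only for dt_rad <= cfl_rad * dx / chat;
    take enough equal substeps to cover one hydro step."""
    dt_rad_max = cfl_rad * dx / chat
    return max(1, math.ceil(dt_hydro / dt_rad_max))
```

Lowering chat directly lowers the substep count, which is the whole computational appeal of the RSLA when the radiation field varies slowly compared with the gas flow.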
NASA Technical Reports Server (NTRS)
Mehdipour, M.; Kaastra, J. S.; Kallman, T.
2016-01-01
Atomic data and plasma models play a crucial role in the diagnosis and interpretation of astrophysical spectra, thus influencing our understanding of the Universe. In this investigation we present a systematic comparison of the leading photoionization codes to determine how much their intrinsic differences impact X-ray spectroscopic studies of hot plasmas in photoionization equilibrium. We carry out our computations using the Cloudy, SPEX, and XSTAR photoionization codes, and compare their derived thermal and ionization states for various ionizing spectral energy distributions. We examine the resulting absorption-line spectra from these codes for the case of ionized outflows in active galactic nuclei. By comparing the ionic abundances as a function of the ionization parameter ξ, we find that on average there is about 30% deviation between the codes in where ionic abundances peak. For H-like to B-like sequence ions alone, this deviation in ξ is smaller, at about 10% on average. The comparison of the absorption-line spectra in the X-ray band shows that there is on average about 30% deviation between the codes in the optical depth of the lines produced at log ξ ~ 1 to 2, reducing to about 20% deviation at log ξ ~ 3. We also simulate spectra of the ionized outflows with the current and upcoming high-resolution X-ray spectrometers on board XMM-Newton, Chandra, Hitomi, and Athena. From these simulations we obtain the deviation in the best-fit model parameters arising from the use of different photoionization codes, which is about 10% to 40%. We compare these modeling uncertainties with the observational uncertainties from the simulations. The results highlight the importance of continuous development and enhancement of photoionization codes for the upcoming era of X-ray astronomy with Athena.
Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT
NASA Astrophysics Data System (ADS)
Wynne, Ben; ATLAS Collaboration
2017-10-01
We present an implementation of the ATLAS High Level Trigger (HLT) that provides parallel execution of trigger algorithms within the ATLAS multi-threaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges posed by the evolution of computing hardware and the upgrades of the Large Hadron Collider (LHC) and the ATLAS detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value; it will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will increase the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multi-process framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent execution of algorithms within an event. This has the potential to significantly reduce the memory footprint on future many-core devices. An additional benefit of the HLT implementation within AthenaMT is that it facilitates the integration of offline code into the HLT. The trigger must retain high rejection in the face of increasing numbers of pileup collisions; this will be achieved by greater use of offline algorithms that are designed to maximize the discrimination of signal from background. A unification of the HLT and offline reconstruction software environments is therefore required. This has been achieved while retaining important HLT-specific optimizations that minimize the computation performed to reach a trigger decision, such as early event rejection and reconstruction within restricted geometrical regions.
Promising results have been obtained with a prototype in which the need for HLT-specific components has been reduced to a minimum and which includes the key elements of trigger functionality, including regional reconstruction and early event rejection. We report on the first experience of migrating trigger selections to this new framework and present the next steps towards a full implementation of the ATLAS trigger.
A Radiation Transfer Solver for Athena Using Short Characteristics
NASA Astrophysics Data System (ADS)
Davis, Shane W.; Stone, James M.; Jiang, Yan-Fei
2012-03-01
We describe the implementation of a module for the Athena magnetohydrodynamics (MHD) code that solves the time-independent, multi-frequency radiative transfer (RT) equation on multidimensional Cartesian simulation domains, including scattering and non-local thermodynamic equilibrium (LTE) effects. The module is based on well-known and well-tested algorithms developed for modeling stellar atmospheres, including the method of short characteristics to solve the RT equation, accelerated Lambda iteration to handle scattering and non-LTE effects, and parallelization via domain decomposition. The module serves several purposes: it can be used to generate spectra and images, to compute a variable Eddington tensor (VET) for full radiation MHD simulations, and to calculate the heating and cooling source terms in the MHD equations in flows where radiation pressure is small compared with gas pressure. For the latter case, the module is combined with the standard MHD integrators using operator splitting: we describe this approach in detail, including a new constraint on the time step for stability due to radiation diffusion modes. Implementation of the VET method for radiation pressure dominated flows is described in a companion paper. We present results from a suite of test problems for both the RT solver itself and for dynamical problems that include radiative heating and cooling. These tests demonstrate that the radiative transfer solution is accurate and confirm that the operator split method is stable, convergent, and efficient for problems of interest. We demonstrate there is no need to adopt ad hoc assumptions of questionable accuracy to solve RT problems in concert with MHD: the computational cost for our general-purpose module for simple (e.g., LTE gray) problems can be comparable to or less than a single time step of Athena's MHD integrators, and only a few times more expensive than that for more general (non-LTE) problems.
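The core of a short-characteristics solver is the per-segment formal solution of the transfer equation. A minimal 1D sweep, assuming the source function is constant within each cell; this is a toy version for illustration, not the Athena module itself.

```python
import math

def sweep_short_characteristics(S, alpha, dx, I_in=0.0):
    """March along a ray: across each cell of optical depth
    dtau = alpha * dx, the formal solution gives
    I_out = I_in * exp(-dtau) + S * (1 - exp(-dtau))."""
    I = I_in
    intensities = []
    for s, a in zip(S, alpha):
        dtau = a * dx
        att = math.exp(-dtau)
        I = I * att + s * (1.0 - att)
        intensities.append(I)
    return intensities

# In an optically thick cell the intensity saturates to the local S
result = sweep_short_characteristics(S=[2.0], alpha=[50.0], dx=1.0)
```

In a real multidimensional solver the upwind intensity `I_in` is interpolated from neighboring grid points for each ray direction, and the sweep is repeated inside an accelerated Lambda iteration when scattering couples S to the radiation field.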
Organization and Management of Project Athena.
ERIC Educational Resources Information Center
Champine, George A.
1991-01-01
Project Athena is a $100 million, eight-year project to install a large network of high performance computer work stations for education and research at the Massachusetts Institute of Technology (MIT). Organizational, legal, and administrative aspects of the project allow two competitors (Digital Equipment Corporation and IBM) to work together…
How to review 4 million lines of ATLAS code
NASA Astrophysics Data System (ADS)
Stewart, Graeme A.; Lampl, Walter
2017-10-01
As the ATLAS Experiment prepares to move to a multi-threaded framework (AthenaMT) for Run 3, we are faced with the problem of how to migrate 4 million lines of C++ source code. This code has been written over the past 15 years and has often been adapted, re-written or extended to meet the changing requirements and circumstances of LHC data taking. The code was developed by different authors, many of whom are no longer active, and under the deep assumption that ATLAS data would be processed serially. In order to understand the scale of the problem faced by the ATLAS software community, and to plan appropriately for the significant effort posed by the new AthenaMT framework, ATLAS embarked on a wide-ranging review of our offline code, covering all areas of activity: event generation, simulation, trigger and reconstruction. We discuss the difficulties of even logistically organising such reviews in an already busy community, and of examining each area in sufficient depth to identify the key places in need of upgrade while still finishing the reviews in a timely fashion. We show how the reviews were organised and how the outputs were captured in a way that the sub-system communities could then tackle the problems uncovered on a realistic timeline. Further, we discuss how the review has influenced the overall planning for the Run 3 ATLAS offline code.
Computers, Electronic Networking and Education: Some American Experiences.
ERIC Educational Resources Information Center
McConnell, David
1991-01-01
Describes new developments in distributed educational computing at the Massachusetts Institute of Technology (MIT, "Athena"), Carnegie Mellon University ("Andrew"), Brown University ("Intermedia"), the Electronic University Network (California), the Western Behavioral Sciences Institute (California), and the University of California,…
I-NERI Quarterly Technical Report (April 1 to June 30, 2005)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang Oh; Prof. Hee Cheon NO; Prof. John Lee
2005-06-01
The objective of this Korean/United States laboratory/university collaboration is to develop new advanced computational methods for safety analysis codes for very-high-temperature gas-cooled reactors (VHTGRs) and to validate these computer codes numerically and experimentally. This study consists of five tasks for FY-03: (1) development of computational methods for the VHTGR, (2) theoretical modification of the aforementioned computer codes for molecular diffusion (RELAP5/ATHENA) and modeling of CO and CO2 equilibrium (MELCOR), (3) development of a state-of-the-art methodology for VHTGR neutronic analysis and calculation of accurate power distributions and decay heat deposition rates, (4) a reactor cavity cooling system experiment, and (5) a graphite oxidation experiment. Second quarter of Year 3: (A) Prof. NO and Kim continued Task 1. As a further plant application of the GAMMA code, we conducted two analyses: an IAEA GT-MHR benchmark calculation for LPCC and an air ingress analysis for the 600 MWt PMR. The GAMMA code shows peak fuel temperature trends comparable to those of other countries' codes. The air ingress results show a much different trend from that of the previous PBR analysis: later onset of natural circulation and a less significant rise in graphite temperature. (B) Prof. Park continued Task 2. We designed a new separate-effect test device having the same heat transfer area but a different diameter and total number of U-bends of air cooling pipe. The new design has a smaller pressure drop in the air cooling pipe than the previous one, as it uses a larger diameter and fewer U-bends. With this device, additional experiments have been performed to obtain temperature distributions of the water tank and of the surface and center of the cooling pipe along the axis. The results will be used to optimize the design of SNU-RCCS. (C) Prof. NO continued Task 3.
The experimental work on air ingress is proceeding without any concern: with nuclear graphite IG-110, various kinetic parameters and reaction rates for the C/CO2 reaction were measured. The rates of the C/CO2 reaction were then compared to those of the C/O2 reaction, and the rate equation for C/CO2 has been developed. (D) INL added models to RELAP5/ATHENA to calculate the chemical reactions in a VHTR during an air ingress accident. Limited testing of the models indicates that they calculate a correct spatial distribution of gas compositions. (E) INL benchmarked NACOK natural circulation data. (F) Professor Lee et al. at the University of Michigan (UM) worked on Task 5. The funding was received from the DOE Richland Office at the end of May, and the subcontract paperwork was delivered to UM on the sixth of June. The objective of this task is to develop a state-of-the-art neutronics model for determining power distributions and decay heat deposition rates in a VHTGR core. Our effort during the reporting period covered reactor physics analysis of coated particles and coupled nuclear/thermal-hydraulic (TH) calculations, together with initial calculations of decay heat deposition rates in the core.
Influence of Stress Corrosion Crack Morphology on Ultrasonic Examination Performances
NASA Astrophysics Data System (ADS)
Dupond, O.; Duwig, V.; Fouquet, T.
2009-03-01
Stress corrosion cracking represents a potential damage mechanism for several components in PWRs. For this reason, NDE of stress corrosion cracks is an important stake for Electricité de France (EDF), both for plant availability and for safety. This paper is dedicated to the ultrasonic examination of SCC defects. The study combines an experimental approach conducted on artificial flaws, meant to represent the characteristic morphologic features often encountered in SCC cracks, with 2D finite element modelling using the ATHENA 2D code developed by EDF. The results indicate that ATHENA correctly reproduces the interaction of the beam with the complex defect: specific ultrasonic responses resulting from the defect morphology have been observed experimentally and reproduced in the modelling.
NASA Astrophysics Data System (ADS)
Martínez-Núnez, S.; Barcons, X.; Barret, D.; Bozzo, E.; Carrera, F. J.; Ceballos, M. T.; Gómez, S.; Monterde, M. P.; Rau, A.
2017-03-01
The Athena Community Office (ACO) has been established by ESA's Athena Science Study Team (ASST) in order to obtain support in performing the tasks assigned to it by ESA, most especially in its role as "focal point for the interests of the broad scientific community". The ACO is led by the Instituto de Física de Cantabria (CSIC-UC), and its activities are funded by CSIC and UC. Further ACO contributors are the University of Geneva, MPE and IRAP. In this poster, we present ACO to the Spanish astronomical community and describe its main responsibilities, which are: assist the ASST in organising and collecting support from the Athena Working Groups and Topical Panels; organise and maintain the documentation generated by the Athena Working Groups and Topical Panels; manage the Working Group and Topical Panel membership lists; assist the ASST in promoting Athena's science capabilities in the research world through conferences and workshops; keep a record of all papers and presentations related to Athena; support the production of ASST documents; produce and distribute a regular Athena Newsletter informing the community about all mission and science developments; create and maintain the Athena Community web portal; maintain active communication channels; promote, organise and support Athena science-related public outreach, in coordination with ESA and the other agencies involved where appropriate; and design and produce outreach materials, providing pointers to materials produced by other parties. In summary, ACO is meant to become a focal point facilitating scientific exchange between the Athena activities and the scientific community at large, and to disseminate the Athena science objectives to the general public.
Multi-threaded ATLAS simulation on Intel Knights Landing processors
NASA Astrophysics Data System (ADS)
Farrell, Steven; Calafiura, Paolo; Leggett, Charles; Tsulaia, Vakhtang; Dotti, Andrea; ATLAS Collaboration
2017-10-01
The Knights Landing (KNL) release of the Intel Many Integrated Core (MIC) Xeon Phi line of processors is a potential game changer for HEP computing. With 72 cores and deep vector registers, the KNL cards promise significant performance benefits for highly parallel, compute-heavy applications. Cori, the newest supercomputer at the National Energy Research Scientific Computing Center (NERSC), was delivered to its users in two phases, with the first phase online at the end of 2015 and the second at the end of 2016. Cori Phase 2 is based on the KNL architecture and contains over 9000 compute nodes, each with 96 GB of DDR4 memory. ATLAS simulation with the multithreaded Athena Framework (AthenaMT) is a good potential use-case for the KNL architecture and supercomputers like Cori. ATLAS simulation jobs have a high ratio of CPU computation to disk I/O and have been shown to scale well in multi-threading and across many nodes. In this paper we give an overview of the ATLAS simulation application with details on its multi-threaded design. We then present a performance analysis of the application on KNL devices and compare it to a traditional x86 platform to demonstrate the capabilities of the architecture and evaluate the benefits of utilizing KNL platforms like Cori for ATLAS production.
Development of a Next Generation Concurrent Framework for the ATLAS Experiment
NASA Astrophysics Data System (ADS)
Calafiura, P.; Lampl, W.; Leggett, C.; Malon, D.; Stewart, G.; Wynne, B.
2015-12-01
The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from the early 2000s, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. With current memory consumption for 64-bit ATLAS reconstruction in a high-luminosity environment approaching 4 GB, it will become impossible to fully occupy all cores in a machine without exhausting available memory. However, since maximizing performance per watt will be a key metric, a mechanism must be found to use all cores as efficiently as possible. In this paper we report on our progress with a practical demonstration of the use of multithreading in the ATLAS reconstruction software, using the GaudiHive framework. We have expanded support to Calorimeter, Inner Detector, and Tracking code, discussing what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on the performance gains, on what general lessons were learned about the code patterns that had been employed in the software, and on which patterns were identified as particularly problematic for multi-threading. We also present our findings on implementing a hybrid multi-threaded/multi-process framework, to take advantage of the strengths of each type of concurrency while avoiding some of their corresponding limitations.
Evaluating Performances of Solar-Energy Systems
NASA Technical Reports Server (NTRS)
Jaffe, L. D.
1987-01-01
CONC11 computer program calculates performances of dish-type solar thermal collectors and power systems. Solar thermal power system consists of one or more collectors, power-conversion subsystems, and power-processing subsystems. CONC11 intended to aid system designer in comparing performance of various design alternatives. Written in Athena FORTRAN and Assembler.
NASA Technical Reports Server (NTRS)
Chamberlain, Robert G.; Duquette, William H.
2013-01-01
TRISA, the U.S. Army TRADOC G2 Intelligence Support Activity, received Athena 1 in 2009. They first used Athena 3 to support studies in 2011. This paper describes Athena 4, which they started using in October 2012. A final section discusses issues that are being considered for incorporation into Athena 5 and later. Athena's objective is to help skilled intelligence analysts anticipate the likely consequences of complex courses of action that use our country's entire power base, not just our military capabilities, for operations in troubled regions of the world. Measures of effectiveness emphasize who is in control and the effects of our actions on the attitudes and well-being of civilians. The planning horizon encompasses not weeks or months, but years. Athena is a scalable, laptop-based simulation with weekly resolution. Up to three months of simulated time can pass between game turns that require user interaction. Athena's geographic scope is nominally a country, but can be a region within a country. Geographic resolution is "neighborhoods", which are defined by the user and may be actual neighborhoods, provinces, or anything in between. Models encompass phenomena whose effects are expected to be relevant over a medium-term planning horizon of three months to three years. The scope and intrinsic complexity of the problem dictate a spiral development process. That is, the model is used during development and lessons learned are used to improve the model. Even more important is that while every version must consider the "big picture" at some level of detail, development priority is given to those issues that are most relevant to currently anticipated studies. For example, models of the delivery and effectiveness of information operations messaging were among the additions in Athena 4.
The Athena Pancam and Color Microscopic Imager (CMI)
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Herkenhoff, K. E.; Schwochert, M.; Morris, R. V.; Sullivan, R.
2000-01-01
The Athena Mars rover payload includes two primary science-grade imagers: Pancam, a multispectral, stereo, panoramic camera system, and the Color Microscopic Imager (CMI), a multispectral and variable depth-of-field microscope. Both of these instruments will help to achieve the primary Athena science goals by providing information on the geology, mineralogy, and climate history of the landing site. In addition, Pancam provides important support for rover navigation and target selection for Athena in situ investigations. Here we describe the science goals, instrument designs, and instrument performance of the Pancam and CMI investigations.
Current test results for the Athena radar responsive tag
NASA Astrophysics Data System (ADS)
Ormesher, Richard C.; Martinez, Ana; Plummer, Kenneth W.; Erlandson, David; Delaware, Sheri; Clark, David R.
2006-05-01
Sandia National Laboratories has teamed with General Atomics and Sierra Monolithics to develop the Athena tag for the Army's Radar Tag Engagement (RaTE) program. The radar-responsive Athena tag can be used for Blue Force tracking and Combat Identification (CID) as well as data collection, identification, and geolocation applications. The Athena tag is small (~4.5" x 2.4" x 4.2"), battery-powered, and has an integral antenna. Once remotely activated by a Synthetic Aperture Radar (SAR) or Moving Target Indicator (MTI) radar, the tag transponds modulated pulses to the radar at a low transmit power. The Athena tag can operate with Ku-band and X-band airborne SAR and MTI radars. This paper presents results from current tag development testing activities. Topics covered include recent field test results from the AN/APY-8 Lynx, F16/APG-66, and F15E/APG-63 V(1) radars and other Fire Control radars. Results show that the Athena tag successfully works with multiple radar platforms, in multiple radar modes, and for multiple applications. Radar-responsive tags such as Athena have numerous applications in military and government arenas. Military applications include battlefield situational awareness, combat identification, targeting, personnel recovery, and unattended ground sensors. Government applications exist in nonproliferation, counter-drug, search-and-rescue, and land-mapping activities.
A hybrid architecture for the implementation of the Athena neural net model
NASA Technical Reports Server (NTRS)
Koutsougeras, C.; Papachristou, C.
1989-01-01
The implementation of an earlier introduced neural net model for pattern classification is considered. Data flow principles are employed in the development of a machine that efficiently implements the model and can be useful for real time classification tasks. Further enhancement with optical computing structures is also considered.
NASA Astrophysics Data System (ADS)
Kim, Jeong-Gyu; Kim, Woong-Tae; Ostriker, Eve C.; Skinner, M. Aaron
2017-12-01
We present an implementation of an adaptive ray-tracing (ART) module in the Athena hydrodynamics code that accurately and efficiently handles the radiative transfer involving multiple point sources on a three-dimensional Cartesian grid. We adopt a recently proposed parallel algorithm that uses nonblocking, asynchronous MPI communications to accelerate transport of rays across the computational domain. We validate our implementation through several standard test problems, including the propagation of radiation in vacuum and the expansions of various types of H II regions. Additionally, scaling tests show that the cost of a full ray trace per source remains comparable to that of the hydrodynamics update on up to ~10^3 processors. To demonstrate application of our ART implementation, we perform a simulation of star cluster formation in a marginally bound, turbulent cloud, finding that its star formation efficiency is 12% when both radiation pressure forces and photoionization by UV radiation are treated. We directly compare the radiation forces computed from the ART scheme with those from the M1 closure relation. Although the ART and M1 schemes yield similar results on large scales, the latter is unable to resolve the radiation field accurately near individual point sources.
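The splitting logic behind adaptive ray tracing, refining rays with distance from the source so that every cell stays adequately sampled, can be sketched in a few lines. This is an illustrative toy in the spirit of the HEALPix-style schemes ART modules use, not the actual Athena implementation; all names (`solid_angle`, `rays_needed`, `trace`) are invented for the example.

```python
import math

# Toy adaptive ray tracing: rays are split as they move away from a point
# source so that each grid cell is crossed by at least ~4 rays.  Schematic
# of HEALPix-style splitting, not the actual Athena ART module.

N_SIDE0 = 12          # number of base rays (level 0)
FOUR_PI = 4.0 * math.pi

def solid_angle(level):
    """Solid angle carried by one ray at a given refinement level."""
    return FOUR_PI / (N_SIDE0 * 4 ** level)

def rays_needed(r, dx, rays_per_cell=4.0):
    """Refinement level required at radius r for a cell of size dx."""
    level = 0
    # split until the ray footprint (Omega * r^2) is smaller than a cell
    # face area divided by the desired rays-per-cell count
    while solid_angle(level) * r * r > dx * dx / rays_per_cell:
        level += 1
    return level

def trace(r_max, dx):
    """March outward, recording how many rays are alive at each radius."""
    counts = []
    r = dx
    while r <= r_max:
        lvl = rays_needed(r, dx)
        counts.append((r, N_SIDE0 * 4 ** lvl))
        r += dx
    return counts
```

By construction the splitting conserves solid angle: one parent ray at level l carries exactly the combined solid angle of its four children at level l+1.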
ATHENA, ARTEMIS, HEPHAESTUS: data analysis for X-ray absorption spectroscopy using IFEFFIT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravel, B.; Newville, M.; UC)
2010-07-20
A software package for the analysis of X-ray absorption spectroscopy (XAS) data is presented. This package is based on the IFEFFIT library of numerical and XAS algorithms and is written in the Perl programming language using the Perl/Tk graphics toolkit. The programs described here are: (i) ATHENA, a program for XAS data processing, (ii) ARTEMIS, a program for EXAFS data analysis using theoretical standards from FEFF and (iii) HEPHAESTUS, a collection of beamline utilities based on tables of atomic absorption data. These programs enable high-quality data analysis that is accessible to novices while still powerful enough to meet the demands of an expert practitioner. The programs run on all major computer platforms and are freely available under the terms of a free software license.
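The kind of processing the ATHENA program automates, for instance pre-edge background removal and edge-step normalization of a raw absorption scan mu(E), can be illustrated with a short numpy sketch. This is a simplified stand-in for the IFEFFIT algorithms, not their actual code; the function name and the default region bounds are assumptions.

```python
import numpy as np

# Toy edge-step normalization of an X-ray absorption spectrum: fit a line
# to the pre-edge region, a line to the post-edge region, subtract the
# pre-edge background, and divide by the edge step evaluated at E0.

def normalize_xas(energy, mu, e0, pre=(-150.0, -30.0), post=(50.0, 300.0)):
    energy = np.asarray(energy, dtype=float)
    mu = np.asarray(mu, dtype=float)

    def fit_line(lo, hi):
        # linear fit over an energy window given relative to the edge E0
        m = (energy >= e0 + lo) & (energy <= e0 + hi)
        return np.polyfit(energy[m], mu[m], 1)

    pre_fit = fit_line(*pre)      # pre-edge baseline
    post_fit = fit_line(*post)    # post-edge trend

    # edge step = distance between the two fitted lines at E0
    step = np.polyval(post_fit, e0) - np.polyval(pre_fit, e0)
    return (mu - np.polyval(pre_fit, energy)) / step
```

On an idealized step-shaped spectrum this maps the pre-edge to 0 and the post-edge to 1, which is the convention ATHENA's normalized plots use.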
NASA Astrophysics Data System (ADS)
Gueudré, C.; Marrec, L. Le; Chekroun, M.; Moysan, J.; Chassignole, B.; Corneloup, G.
2011-06-01
Multipass welds made of austenitic stainless steel, in the primary circuit of nuclear power plants with pressurized water reactors, are characterized by an anisotropic and heterogeneous structure that disturbs ultrasonic propagation and challenges ultrasonic non-destructive testing. Simulation in this type of structure is now possible thanks to the MINA code, which models grain orientation taking the welding process into account, and the ATHENA code, which simulates the ultrasonic propagation exactly. We study the case where the order of the passes is unknown, in order to estimate the possibility of reconstructing this important parameter from ultrasonic measurements. The first results are presented.
Science requirements and optimization of the silicon pore optics design for the Athena mirror
NASA Astrophysics Data System (ADS)
Willingale, R.; Pareschi, G.; Christensen, F.; den Herder, J.-W.; Ferreira, D.; Jakobsen, A.; Ackermann, M.; Collon, M.; Bavdaz, M.
2014-07-01
The science requirements for the Athena X-ray mirror are to provide a collecting area of 2 m2 at 1 keV, an angular resolution of ~5 arc seconds half energy width (HEW), and a field of view of 40-50 arc minutes in diameter. This combination of area and angular resolution over a wide field is possible because of unique features of the silicon pore optics (SPO) technology used. Here we describe the optimization and modifications of the SPO technology required to achieve the Athena mirror specification and demonstrate how the optical design of the mirror system impacts the scientific performance of Athena.
Evolution of the ATLAS Software Framework towards Concurrency
NASA Astrophysics Data System (ADS)
Jones, R. W. L.; Stewart, G. A.; Leggett, C.; Wynne, B. M.
2015-05-01
The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from the early 2000s, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather requirements for an updated framework, going back to first principles of how event processing occurs. In this paper we report on both of these aspects of our work. For the hive-based demonstrators, we discuss what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on what general lessons were learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. These lessons were fed into our considerations of a new framework, and we present preliminary conclusions on this work. In particular, we identify areas where the framework can be simplified in order to aid the implementation of a concurrent event processing scheme. Finally, we discuss the practical difficulties involved in migrating a large established code base to a multi-threaded framework and how this can be achieved for LHC Run 3.
Multiscale Pressure-Balanced Structures in Three-dimensional Magnetohydrodynamic Turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Liping; Zhang, Lei; Feng, Xueshang
2017-02-10
Observations of solar wind turbulence indicate the existence of multiscale pressure-balanced structures (PBSs) in the solar wind. In this work, we conduct a numerical simulation to investigate multiscale PBSs and in particular their formation in compressive magnetohydrodynamic turbulence. By the use of the higher-order Godunov code Athena, a driven compressible turbulence with an imposed uniform guide field is simulated. The simulation results show that both the magnetic pressure and the thermal pressure exhibit a turbulent spectrum with a Kolmogorov-like power law, and that in many regions of the simulation domain they are anticorrelated. The computed wavelet cross-coherence spectra of the magnetic pressure and the thermal pressure, as well as their space series, indicate the existence of multiscale PBSs, with the small PBSs being embedded in the large ones. These multiscale PBSs are likely to be related to the highly oblique-propagating slow-mode waves, as the traced multiscale PBS is found to be traveling in a certain direction at a speed consistent with that predicted theoretically for a slow-mode wave propagating in the same direction.
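The anticorrelation diagnostic described above reduces, along a one-dimensional cut, to the correlation between thermal pressure and magnetic pressure B^2/2 (code units with mu0 = 1). A minimal sketch with synthetic data, not simulation output:

```python
import numpy as np

# Minimal pressure-balance diagnostic: along a 1D cut, compute the magnetic
# pressure B^2/2 (code units, mu0 = 1) and its Pearson correlation with the
# thermal pressure.  A value near -1 flags a pressure-balanced structure.

def magnetic_pressure(bx, by, bz):
    return 0.5 * (bx**2 + by**2 + bz**2)

def pb_correlation(p_thermal, p_magnetic):
    """Pearson correlation between thermal and magnetic pressure."""
    return np.corrcoef(p_thermal, p_magnetic)[0, 1]
```

A coefficient near -1 along a cut signals pressure balance; the wavelet cross-coherence used in the paper generalizes this single number to a scale-by-scale correlation.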
Antiproton Trapping for Advanced Space Propulsion Applications
NASA Technical Reports Server (NTRS)
Smith, Gerald A.
1998-01-01
The Summary of Research parallels the Statement of Work (Appendix I) submitted with the proposal, and funded effective Feb. 1, 1997 for one year. A proposal was submitted to CERN in October, 1996 to carry out an experiment on the synthesis and study of fundamental properties of atomic antihydrogen. Since confined atomic antihydrogen is potentially the most powerful and elegant source of propulsion energy known, its confinement and properties are of great interest to the space propulsion community. Appendix II includes an article published in the technical magazine Compressed Air, June 1997, which describes CERN antiproton facilities, and ATHENA. During the period of this grant, Prof. Michael Holzscheiter served as spokesman for ATHENA and, in collaboration with Prof. Gerald Smith, worked on the development of the antiproton confinement trap, which is an important part of the ATHENA experiment. Appendix III includes a progress report submitted to CERN on March 12, 1997 concerning development of the ATHENA detector. Section 4.1 reviews technical responsibilities within the ATHENA collaboration, including the Antiproton System, headed by Prof. Holzscheiter. The collaboration was advised (see Appendix IV) on June 13, 1997 that the CERN Research Board had approved ATHENA for operation at the new Antiproton Decelerator (AD), presently under construction. First antiproton beams are expected to be delivered to experiments in about one year. Progress toward assembly of the ATHENA detector and initial testing expected in 1999 has been excellent. Appendix V includes a copy of the minutes of the most recently documented collaboration meeting held at CERN of October 24, 1997, which provides more information on development of systems, including the antiproton trapping apparatus. On February 10, 1998 Prof. Smith gave a 3 hour lecture on the Physics of Antimatter, as part of the Physics for the Third Millennium Lecture Series held at MSFC. 
Included in Appendix VI are notes and graphs presented on the ATHENA experiment. A portable antiproton trap has also been under development. The goal is to store and transport antiprotons from a production site, such as Fermilab near Chicago, to a distant site, such as Huntsville, AL, thus demonstrating the portability of antiprotons.
NASA Astrophysics Data System (ADS)
Vásquez-Ramírez, Raquel; Alor-Hernández, Giner; Sánchez-Ramírez, Cuauhtémoc; Guzmán-Luna, Jaime; Zatarain-Cabada, Ramón; Barrón-Estrada, María-Lucía
2014-07-01
Education has become a key component of any society since it is the means by which humanity functions and governs itself. It allows individuals to appropriately integrate into a given community. For this reason, new ways of interaction between students and educational contents are emerging in order to improve the quality of education. In this context, devices such as computers, smartphones, or electronic tablets represent new ways of accessing educational resources which do not limit students to their usage merely inside the classroom, since these devices are available anywhere. Nowadays, television has become one of these technological tools able to support the teaching-learning process through documentary films or movies, among others. However, two main issues appear. First, some of these educational contents are not those needed by a professor, since information is restricted; and second, the development of TV-based applications requires an integrative approach involving the support of several specialists in education who provide the guidelines needed to build high-quality contents, as well as application designers and developers who are able to deliver the educational applications demanded by students. This work presents a system called AthenaTV to generate Android-based educational applications for TV. AthenaTV takes into account the 10-foot design scheme used by Google to develop interfaces, based on the interface design patterns established in Google TV, and it follows the Android development guidelines and the HTML5 standard.
NASA Technical Reports Server (NTRS)
Bowman, C. D.; Bebak, M.; Bollen, D. M.; Curtis, K.; Daniel, C.; Grigsby, B.; Herman, T.; Haynes, E.; Lineberger, D. H.; Pieruccini, S.
2004-01-01
The exceptional imagery and data acquired by the Mars Exploration Rovers since their January 2004 landing have captured the attention of scientists, the public, and students and teachers worldwide. One aspect of particular interest lies with a group of high school teachers and students actively engaged in the Athena Student Interns Program. The Athena Student Interns Program (ASIP) is a joint effort between NASA's Mars Public Engagement Office and the Athena Science Investigation that began in early 1999 as a pilot student-scientist research partnership program associated with the FIDO prototype Mars rover field test. The program is designed to actively engage high school students and their teachers in Mars exploration and scientific inquiry. In ASIP, groups of students and teachers from around the country work with mentors from the mission's Athena Science Team to carry out an aspect of the mission.
Status of the ESA L1 mission candidate ATHENA
NASA Astrophysics Data System (ADS)
Rando, N.; Martin, D.; Lumb, D.; Verhoeve, P.; Oosterbroek, T.; Bavdaz, M.; Fransen, S.; Linder, M.; Peyrou-Lauga, R.; Voirin, T.; Braghin, M.; Mangunsong, S.; van Pelt, M.; Wille, E.
2012-09-01
ATHENA (Advanced Telescope for High Energy Astrophysics) was an L-class mission candidate within the science programme Cosmic Vision 2015-2025 of the European Space Agency, with a planned launch by 2022. ATHENA was conceived as an ESA-led project, open to the possibility of focused contributions from JAXA and NASA. By allowing astrophysical observations between 100 eV and 10 keV, it would represent the new generation X-ray observatory, following the XMM-Newton, Astro-H and Chandra heritage. The main scientific objectives of ATHENA include the study of large scale structures, the evolution of black holes, strong gravity effects, and neutron star structure, as well as investigations into dark matter. The ATHENA mission concept would be based on a focal length of 12 m, achieved via a rigid metering tube, and a two-aperture X-ray telescope. Two identical X-ray mirrors would illuminate fixed focal plane instruments: a cryogenic imaging spectrometer (XMS) and a wide field imager (WFI). The S/C is designed to be fully compatible with Ariane 5 ECA. The observatory would operate at SE-L2, with a nominal lifetime of 5 yr. This paper provides a summary of the reformulation activities, completed in December 2011. An overview of the spacecraft design and of the payload is provided, including both telescope and instruments. Following the ESA Science Programme Committee decision on the L1 mission in May 2012, ATHENA was not selected to enter Definition Phase.
Project ATHENA Creates Surrogate Human Organ Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacQueen, Luke; Knospel, Fanny; Sherrod, Stacy
2015-06-15
The development of miniature surrogate human organs, coupled with highly sensitive mass spectrometry technologies, could one day revolutionize the way new drugs and toxic agents are studied. “By developing this ‘homo minutus,’ we are stepping beyond the need for animal or Petri dish testing: There are huge benefits in developing drug and toxicity analysis systems that can mimic the response of actual human organs,” said Rashi Iyer, a senior scientist at Los Alamos National Laboratory. ATHENA, the Advanced Tissue-engineered Human Ectypal Network Analyzer project team, is nearing the full integration of four human organ constructs — liver, heart, lung and kidney — each organ component is about the size of a smartphone screen, and the whole ATHENA “body” of interconnected organs will fit neatly on a desk. A new video available on the Los Alamos National Laboratory YouTube channel updates the ATHENA project as it begins to integrate the various organ systems into a single system. Some 40 percent of pharmaceuticals fail their clinical trials and there are thousands of chemicals whose effects on humans are simply unknown. Providing a realistic, cost-effective and rapid screening system such as ATHENA with high-throughput capabilities could provide major benefits to the medical field, screening more accurately and offering a greater chance of clinical trial success. ATHENA is funded by the Defense Threat Reduction Agency (DTRA) and is a collaboration of Los Alamos National Laboratory, Harvard University, Vanderbilt University, Charité Universitätsmedizin, Berlin, Germany, CFD Research Corporation, and the University of California San Francisco.
The Athena X-ray Integral Field Unit (X-IFU)
NASA Astrophysics Data System (ADS)
Pajot, F.; Barret, D.; Lam-Trong, T.; den Herder, J.-W.; Piro, L.; Cappi, M.; Huovelin, J.; Kelley, R.; Mas-Hesse, J. M.; Mitsuda, K.; Paltani, S.; Rauw, G.; Rozanska, A.; Wilms, J.; Barbera, M.; Douchin, F.; Geoffray, H.; den Hartog, R.; Kilbourne, C.; Le Du, M.; Macculi, C.; Mesnager, J.-M.; Peille, P.
2018-04-01
The X-ray Integral Field Unit (X-IFU) of the Advanced Telescope for High-ENergy Astrophysics (Athena) large-scale mission of ESA will provide spatially resolved high-resolution X-ray spectroscopy from 0.2 to 12 keV, with 5″ pixels over a field of view of 5 arc minute equivalent diameter and a spectral resolution of 2.5 eV (FWHM) up to 7 keV. The core scientific objectives of Athena drive the main performance parameters of the X-IFU. We present the current reference configuration of the X-IFU, and the key issues driving the design of the instrument.
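As a quick sanity check on the figures quoted above, a 2.5 eV FWHM at energies up to 7 keV corresponds to a resolving power E/dE of up to 2800; a one-line helper (function name invented for illustration):

```python
# Resolving power implied by the X-IFU figures quoted above:
# R = E / dE, with dE the 2.5 eV FWHM spectral resolution.

def resolving_power(e_ev, fwhm_ev):
    """Resolving power R = E/dE for energy and FWHM given in eV."""
    return e_ev / fwhm_ev
```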
Impact of Gene Patents and Licensing Practices on Access to Genetic Testing for Alzheimer’s Disease
Skeehan, Katie; Heaney, Christopher; Cook-Deegan, Robert
2010-01-01
Genetic testing for Alzheimer’s disease (AD) includes genotyping for apolipoprotein E, for late-onset AD, and three rare autosomal dominant, early-onset forms of AD associated with different genes (APP, PSEN1 and PSEN2). According to researchers, patents have not impeded research in the field, nor were patents an important consideration in the quest for the genetic risk factors. Athena Diagnostics holds exclusive licenses from Duke University for three “method” patents covering APOE genetic testing. Athena offers tests for APOE and genes associated with early onset, autosomal dominant AD. One of those presenilin genes is patented and exclusively licensed to Athena; the other presenilin gene was patented but the patent was allowed to lapse; and one (APP) is patented only as a research tool and patent claims do not cover diagnostic use. Direct-to-consumer testing is available for some AD-related genes, apparently without a license. Athena Diagnostics consolidated its position in the market for AD genetic testing by collecting exclusive rights to patents arising from university research. Duke University also used its licenses to Athena to enforce adherence to clinical guidelines, including elimination of the service from Smart Genetics, which was offering direct-to-consumer risk assessment based on APOE genotyping. PMID:20393312
Axisymmetric Shearing Box Models of Magnetized Disks
NASA Astrophysics Data System (ADS)
Guan, Xiaoyue; Gammie, Charles F.
2008-01-01
The local model, or shearing box, has proven a useful model for studying the dynamics of astrophysical disks. Here we consider the evolution of magnetohydrodynamic (MHD) turbulence in an axisymmetric local model in order to evaluate the limitations of global axisymmetric models. An exploration of the model parameter space shows the following: (1) The magnetic energy and α decay approximately exponentially after an initial burst of turbulence. For our code, HAM, the decay time τ ∝ Res, where Res/2 is the number of zones per scale height. (2) In the initial burst of turbulence the magnetic energy is amplified by a factor proportional to Res^(3/4) λ_R, where λ_R is the radial scale of the initial field. This scaling applies only if the most unstable wavelength of the magnetorotational instability is resolved and the final field is subthermal. (3) The shearing box is a resonant cavity and in linear theory exhibits a discrete set of compressive modes. These modes are excited by the MHD turbulence and are visible as quasi-periodic oscillations (QPOs) in temporal power spectra of fluid variables at low spatial resolution. At high resolution the QPOs are hidden by a noise continuum. (4) In axisymmetry disk turbulence is local. The correlation function of the turbulence is limited in radial extent, and the peak magnetic energy density is independent of the radial extent of the box L_R for L_R > 2H. (5) Similar results are obtained for the HAM, ZEUS, and ATHENA codes; ATHENA has an effective resolution that is nearly double that of HAM and ZEUS. (6) Similar results are obtained for 2D and 3D runs at similar resolution, but only for particular choices of the initial field strength and radial scale of the initial magnetic field.
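The resolution criterion mentioned in point (2), that the most unstable wavelength of the magnetorotational instability must be resolved, is often quantified by a quality factor Q, the number of zones per wavelength λ ≈ 2π v_A/Ω. A hedged sketch (code units with mu0 = 1; the function names and the Q ≥ 6 threshold are illustrative conventions, not taken from this paper):

```python
import math

# Quality-factor check for MRI resolution: the most unstable MRI wavelength
# is roughly lambda ~ 2*pi*v_A/Omega, with Alfven speed v_A = B/sqrt(rho)
# in code units (mu0 = 1).  Q = lambda/dz counts zones per wavelength;
# Q >~ 6 is a commonly quoted rule of thumb for "resolved".

def alfven_speed(b, rho):
    return b / math.sqrt(rho)

def mri_quality_factor(b, rho, omega, dz):
    lam = 2.0 * math.pi * alfven_speed(b, rho) / omega
    return lam / dz

def mri_resolved(b, rho, omega, dz, q_min=6.0):
    return mri_quality_factor(b, rho, omega, dz) >= q_min
```

For example, with B = 0.1, ρ = 1 and Ω = 1, a grid spacing of 0.01 gives Q ≈ 63, comfortably resolved, while a hundred times weaker field at the same resolution is not.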
The Advanced Telescope for High Energy Astrophysics
NASA Astrophysics Data System (ADS)
Guainazzi, Matteo
2017-08-01
Athena (the Advanced Telescope for High Energy Astrophysics) is a next-generation X-ray observatory currently under study by ESA for launch in 2028. Athena is designed to address the Hot and Energetic Universe science theme, which poses two key questions: 1) How did ordinary matter evolve into the large scale structures we see today? 2) How do black holes grow and shape the Universe? To address these topics Athena employs an innovative X-ray telescope based on Silicon Pore Optics technology to deliver an extremely lightweight, high-throughput mirror, while retaining excellent angular resolution. The mirror can be adjusted to focus onto one of two focal plane instruments: the X-ray Integral Field Unit (X-IFU), which provides spatially-resolved, high resolution spectroscopy, and the Wide Field Imager (WFI), which provides spectral imaging over a large field of view, as well as high time resolution and count rate tolerance. Athena is currently in Phase A, and the study status will be reviewed, along with the scientific motivations behind the mission.
Ovseiko, Pavel V; Chapple, Alison; Edmunds, Laurel D; Ziebland, Sue
2017-02-21
While higher education and research institutions in the United Kingdom, Ireland, and Australia are widely engaged with the Athena SWAN Charter for Women in Science to advance gender equality, empirical research on this process and its impact is rare. This study combined two data sets (free-text comments from a survey and qualitative interviews) to explore the range of experiences and perceptions of participation in Athena SWAN in medical science departments of a research-intensive university in Oxford, United Kingdom. The study is based on the secondary analysis of data from two projects: 59 respondents to an anonymous online survey (42 women, 17 men) provided relevant free-text comments and, separately, 37 women participated in face-to-face narrative interviews. Free-text survey comments and narrative interviews were analysed thematically using constant comparison. Both women and men said that participation in Athena SWAN had brought about important structural and cultural changes, including increased support for women's careers, greater appreciation of caring responsibilities, and efforts to challenge discrimination and bias. Many said that these positive changes would not have happened without the linkage of Athena SWAN to government research funding, while others thought there were unintended consequences. Concerns about the programme's design and implementation included perceptions that Athena SWAN has limited ability to address longstanding and entrenched power and pay imbalances, that work-life balance in academic medicine remains poor, that the sustainability of positive changes is uncertain, that achieving the award could become an end in itself, resentment about perceived positive discrimination, and a sense that further structural and cultural changes were needed in the university and wider society.
The findings from this study suggest that Athena SWAN has a positive impact in advancing gender equality, but there may be limits to how much it can improve gender equality without wider institutional and societal changes. Addressing the fundamental causes of gender inequality would require cultural change and welfare state policies incentivising men to increase their participation in unpaid work in the family, which is beyond the scope of higher education and research policy.
NASA Astrophysics Data System (ADS)
Burgarella, Denis; Ciesla, Laure; Boquien, Mederic; Buat, Veronique; Roehlly, Yannick
2015-09-01
The star formation rate density traces the formation of stars in the universe. To estimate the star formation rate of galaxies, we can use a wide range of star formation tracers: continuum measurements in most wavelength domains, lines, supernovae and GRBs... All of them have pros and cons. Most of the monochromatic tracers are hampered by the effects of dust, so before being able to estimate the star formation rate, we first need a reliable estimate of the dust attenuation. The advantage of the X-ray wavelength range is that we can consider it free from the effects of dust. In this talk, we will estimate how many galaxies we could detect with ATHENA to obtain the star formation density. For this, I will use my recent Herschel paper, where the total (UV + IR) star formation rate density was evaluated up to z ~ 4, and provide quantitative figures for what ATHENA will detect as a function of redshift and luminosity. ATHENA will need predictions that are in agreement with what we observe in the other wavelength ranges. I will present the code CIGALE (http://cigale.lam.fr). The new and public version of CIGALE (released in April 2015) makes it possible to model the emission of galaxies from the far-ultraviolet to the radio and to make predictions in any of these wavelength ranges. I will show how galaxy star formation rates can be estimated in a way that combines the advantages of monochromatic tracers without their caveats. It should be stressed that we can model the emission of AGNs in the FUV-to-FIR range using several models. Finally, I will explain why we are seriously considering extending CIGALE to the X-ray range to predict the X-ray emission of galaxies, including any AGN.
NASA Astrophysics Data System (ADS)
Schuh, Terance; Li, Yutong; Elghossain, Geena; Wiita, Paul J.
2018-06-01
We have computed a suite of simulations of propagating three-dimensional relativistic jets, covering substantial ranges of initial jet Lorentz factors and ratios of jet density to external medium density. These allow us to categorize the respective AGN into Fanaroff-Riley class I (jet-dominated) and FR class II (lobe-dominated) based upon the stability and morphology of the simulations. We used the Athena code to produce a substantial collection of large 3D variations of jets, many of which propagate stably and quickly for over 100 jet radii, but others of which eventually go unstable and inflate slowly advancing lobes. Most of these simulations have jet-to-ambient-medium density ratios between 0.005 and 0.5 and velocities between 0.90c and 0.995c. Comparing the times when some jets go unstable with these initial parameters allows us to find a threshold where radio-loud AGNs transition from class II to class I. With these high-resolution, fully 3D relativistic simulations we can represent the jets more accurately and thus improve upon and refine earlier results that were based on 2D simulations.
TESSIM: a simulator for the Athena-X-IFU
NASA Astrophysics Data System (ADS)
Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; den Hartog, R. H.; Bandler, S. R.; de Plaa, J.; den Herder, J.-W. A.
2016-07-01
We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS- files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).
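The kind of device physics tessim integrates can be sketched generically. The block below advances the standard coupled electro-thermal ODEs of a transition edge sensor with explicit Euler steps; this is NOT tessim's actual implementation, and every parameter value is invented for illustration.

```python
import math

# Generic sketch of the coupled electro-thermal ODEs that transition edge
# sensor simulators integrate (not tessim's implementation; all parameter
# values below are invented for illustration).
L_ind  = 1e-7     # circuit inductance [H]
C      = 1e-12    # heat capacity of the sensor [J/K]
G      = 2e-10    # thermal conductance to the bath [W/K]
R_L    = 1e-3     # load (shunt) resistance [ohm]
V_b    = 1e-7     # bias voltage [V]
T_bath = 0.055    # bath temperature [K]

def R_tes(T):
    """Toy smooth resistive transition centered at Tc."""
    Tc, width = 0.060, 0.005
    return 0.01 / (1.0 + math.exp(-(T - Tc) / width))

# Explicit-Euler integration to the electro-thermal equilibrium:
#   L dI/dt = V_b - I*R_L - I*R(T)
#   C dT/dt = I^2 R(T) - G (T - T_bath)
I, T, dt = 0.0, T_bath, 1e-6
for _ in range(50_000):
    dI = (V_b - I * R_L - I * R_tes(T)) / L_ind
    dT = (I * I * R_tes(T) - G * (T - T_bath)) / C
    I += dI * dt
    T += dT * dt
# Negative electro-thermal feedback drives T into the transition and
# holds it there, a few mK above the bath.
```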
TESSIM: A Simulator for the Athena-X-IFU
NASA Technical Reports Server (NTRS)
Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; Den Hartog, R. H.; Bandler, S. R.; De Plaa, J.;
2016-01-01
We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS-files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).
NASA Technical Reports Server (NTRS)
Chamberlain, Robert G.; Duquette, William H.; Provenzano, Joseph P.; Brunzie, Theodore J.; Jordan, Benjamin
2011-01-01
The Athena simulation software supports analysts from DoD or other federal agencies in making stability and reconstruction projections for operational analyses in areas like Iraq or Afghanistan. It encompasses the use of all elements of national power: diplomatic, information, military, and economic (DIME), and anticipates their effects on political, military, economic, social, information, and infrastructure (PMESII) variables in real-world battle space environments. Athena is a stand-alone model that provides analysts with insights into the effectiveness of complex operations by anticipating second-, third-, and higher-order effects. For example, the first-order effect of executing a curfew may be to reduce insurgent activity, but it may also reduce consumer spending and keep workers home as second-order effects. Reduced spending and reduced labor may reduce the gross domestic product (GDP) as a third-order effect. Damage to the economy will have further consequences. The Athena approach has also been considered for application in studies related to climate change and the smart grid. It can be applied to any project where the impacts on the population, and the population's perceptions, are important to the success of the project.
NASA Technical Reports Server (NTRS)
Wang, Alian; Haskin, Larry A.; Jolliff, Bradley; Wdowiak, Tom; Agresti, David; Lane, Arthur L.
2000-01-01
Raman spectroscopy provides a powerful tool for in situ mineralogy, petrology, and detection of water and carbon. The Athena Raman spectrometer is a microbeam instrument intended for close-up analyses of targets (rocks or soils) selected by the Athena Pancam and Mini-TES. It will take 100 Raman spectra along a linear traverse of approximately one centimeter (a point-counting procedure) in one to four hours during the Martian night. From these spectra, the following information about the target will be extracted: (1) the identities of major, minor, and trace mineral phases, organic species (e.g., PAH or kerogen-like polymers), reduced inorganic carbon, and water-bearing phases; (2) chemical features (e.g., Mg/Fe ratio) of major minerals; and (3) rock textural features (e.g., mineral clusters, amygdular fillings and veins). Part of the Athena payload, the miniaturized Raman spectrometer has been under development in a highly interactive collaboration between a science team at Washington University and the University of Alabama at Birmingham and an engineering team at the Jet Propulsion Laboratory. The development has completed the brassboard stage and has produced the design for the engineering model.
The close environment of Supermassive Black Holes
NASA Astrophysics Data System (ADS)
Matt, Giorgio
2016-07-01
There are two main scientific goals of the "Close environment of SMBH" Athena Topical Panel: the determination of the BH spin distribution in the local Universe, and of the geometry of the hot X-ray emitting corona via time lag measurements. The rationale behind these goals, and how they will be achieved with Athena, will be discussed in this talk.
The Athena Mars Rover Investigation
NASA Technical Reports Server (NTRS)
Squyres, S. W.; Arvidson, R. E.; Bell, J. F., III; Carr, M.; Christensen, P.; DesMarais, D.; Economou, T.; Gorevan, S.; Haskin, L.; Herkenhoff, K.
2000-01-01
The Mars Surveyor program requires tools for martian surface exploration, including remote sensing, in-situ sensing, and sample collection. The Athena Mars rover payload is a suite of scientific instruments and sample collection tools designed to: (1) Provide color stereo imaging of martian surface environments, and remotely-sensed point discrimination of mineralogical composition; (2) Determine the elemental and mineralogical composition of martian surface materials; (3) Determine the fine-scale textural properties of these materials; and (4) Collect and store samples. The Athena payload is designed to be implemented on a long-range rover such as the one now under consideration for the 2003 Mars opportunity. The payload is at a high state of maturity, and most of the instruments have now been built for flight.
The third stage of Lunar Prospector's Athena is placed atop the second stage at LC 46 at CCAS
NASA Technical Reports Server (NTRS)
1997-01-01
The third stage of the Lockheed Martin Athena launch vehicle is placed atop the vehicle's second stage at Launch Complex 46 at Cape Canaveral Air Station. Athena is scheduled to carry the Lunar Prospector spacecraft for an 18-month mission that will orbit the Earth's moon to collect data from the lunar surface. Scientific experiments to be conducted by the Prospector include locating water ice that may exist near the lunar poles, gathering data to understand the evolution of the lunar highland crust and the lunar magnetic field, finding radon outgassing events, and describing the lunar gravity field by means of Doppler tracking. The launch is now scheduled for early January 1998.
The third stage of Lunar Prospector's Athena is lifted at LC 46 at CCAS
NASA Technical Reports Server (NTRS)
1997-01-01
The third stage of the Lockheed Martin Athena launch vehicle is lifted at Launch Complex 46 at Cape Canaveral Air Station before mating to the second stage already on the pad. Athena is scheduled to carry the Lunar Prospector spacecraft for an 18-month mission that will orbit the Earth's moon to collect data from the lunar surface. Scientific experiments to be conducted by the Prospector include locating water ice that may exist near the lunar poles, gathering data to understand the evolution of the lunar highland crust and the lunar magnetic field, finding radon outgassing events, and describing the lunar gravity field by means of Doppler tracking. The launch is now scheduled for early January 1998.
Protonium production in ATHENA
NASA Astrophysics Data System (ADS)
Venturelli, L.; Amoretti, M.; Amsler, C.; Bonomi, G.; Carraro, C.; Cesar, C. L.; Charlton, M.; Doser, M.; Fontana, A.; Funakoshi, R.; Genova, P.; Hayano, R. S.; Jørgensen, L. V.; Kellerbauer, A.; Lagomarsino, V.; Landua, R.; Rizzini, E. Lodi; Macrì, M.; Madsen, N.; Manuzio, G.; Mitchard, D.; Montagna, P.; Posada, L. G.; Pruys, H.; Regenfus, C.; Rotondi, A.; Testera, G.; van der Werf, D. P.; Variola, A.; Yamazaki, Y.; Zurlo, N.; Athena Collaboration
2007-08-01
The ATHENA experiment at CERN, after producing cold antihydrogen atoms for the first time in 2002, has synthesised protonium atoms in vacuum at very low energies. Protonium, i.e. the antiproton-proton bound system, is of interest for testing fundamental physical theories. In the nested Penning trap of the ATHENA apparatus, protonium has been produced as a result of a chemical reaction between an antiproton and the simplest matter molecule, H2+. The protonium atoms formed have kinetic energies in the range 40-700 meV and are metastable, with mean lifetimes of the order of 1 μs. Our result shows that it will be possible to start measurements on protonium at low-energy antiproton facilities, such as the AD at CERN or FLAIR at GSI.
The ATHENA telescope and optics status
NASA Astrophysics Data System (ADS)
Bavdaz, Marcos; Wille, Eric; Ayre, Mark; Ferreira, Ivo; Shortt, Brian; Fransen, Sebastiaan; Collon, Maximilien; Vacanti, Giuseppe; Barriere, Nicolas; Landgraf, Boris; Haneveld, Jeroen; van Baren, Coen; Zuknik, Karl-Heintz; Della Monica Ferreira, Desiree; Massahi, Sonny; Christensen, Finn; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Spiga, Daniele; Valsecchi, Giuseppe; Vernani, Dervis; Oliver, Paul; Seidel, André
2017-08-01
The work on the definition and technological preparation of the ATHENA (Advanced Telescope for High ENergy Astrophysics) mission continues to progress. In parallel to the study of the accommodation of the telescope, many aspects of the X-ray optics are being evolved further. The optics technology chosen for ATHENA is the Silicon Pore Optics (SPO), which hinges on technology spin-in from the semiconductor industry, and uses a modular approach to produce large effective area lightweight telescope optics with a good angular resolution. Both system studies and the technology developments are guided by ESA and implemented in industry, with participation of institutional partners. In this paper an overview of the current status of the telescope optics accommodation and technology development activities is provided.
Benchmarking the Multidimensional Stellar Implicit Code MUSIC
NASA Astrophysics Data System (ADS)
Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.
2017-04-01
We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment dominated by radiative effects, a setting in which the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed, each analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to reproduce both the behaviour of established and widely used codes and results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
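A scalar diagnostic of the kind used for direct code comparison can be as simple as one number per snapshot. The sketch below computes the volume-averaged kinetic energy of an initial 2D Taylor-Green vortex, whose analytic value is 1/4; the grid size and normalization are assumptions for illustration, not the MUSIC setup.

```python
import numpy as np

# One-number diagnostic for code comparison: volume-averaged kinetic
# energy of the initial 2D Taylor-Green vortex on a periodic [0, 2π)² grid.
# Grid size and normalization are assumed for this sketch.
n = 128
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.cos(X) * np.sin(Y)           # initial velocity field of the vortex
v = -np.sin(X) * np.cos(Y)
ke = 0.5 * np.mean(u**2 + v**2)     # analytically 1/4 for this field
```

Tracking such a quantity over time in each code gives directly comparable decay curves without needing an analytic solution.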
Background simulations for the wide field imager aboard the ATHENA X-ray Observatory
NASA Astrophysics Data System (ADS)
Hauf, Steffen; Kuster, Markus; Hoffmann, Dieter H. H.; Lang, Philipp-Michael; Neff, Stephan; Pia, Maria Grazia; Strüder, Lothar
2012-09-01
The ATHENA X-ray observatory was a European Space Agency project for an L-class mission. ATHENA was to be based upon a simplified IXO design, with the number of instruments and the focal length of the Wolter optics reduced. One of the two instruments, the Wide Field Imager (WFI), was to be a DePFET-based focal plane pixel detector, allowing for high time and spatial resolution spectroscopy in the energy range between 0.1 and 15 keV. In order to fulfill the mission goals a high sensitivity is essential, especially to study faint and extended sources. Thus a detailed understanding of the detector background induced by cosmic ray particles is crucial. During mission design, extensive Monte Carlo simulations are generally used to estimate the detector background in order to optimize shielding components and software rejection algorithms. The Geant4 toolkit is frequently the tool of choice for this purpose. Alongside validation of the simulation environment with XMM-Newton EPIC-pn and Space Shuttle STS-53 data, we present estimates for the ATHENA WFI cosmic-ray-induced background, including long-term activation, which demonstrate that DEPFET-technology-based detectors are able to achieve the required sensitivity.
Silicon pore optics development for ATHENA
NASA Astrophysics Data System (ADS)
Collon, Maximilien J.; Vacanti, Giuseppe; Günther, Ramses; Yanson, Alex; Barrière, Nicolas; Landgraf, Boris; Vervest, Mark; Chatbi, Abdelhakim; Beijersbergen, Marco W.; Bavdaz, Marcos; Wille, Eric; Haneveld, Jeroen; Koelewijn, Arenda; Leenstra, Anne; Wijnperle, Maurice; van Baren, Coen; Müller, Peter; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Conconi, Paolo; Christensen, Finn E.
2015-09-01
The ATHENA mission, a European large (L) class X-ray observatory to be launched in 2028, will essentially consist of an X-ray lens and two focal plane instruments. The lens, based on a Wolter-I type double-reflection grazing-incidence design, will be very large (~3 m in diameter) to meet the science requirement of large effective area (1-2 m² at a few keV) at a focal length of 12 m. To meet the high angular resolution requirement (5 arc seconds) the X-ray lens will also need to be very accurate. Silicon Pore Optics (SPO) technology has been invented to enable building such a lens and thus enable the ATHENA mission. We report in this paper on the latest status of the development, including details of X-ray test campaigns.
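The quoted figures imply how tight the optical tolerances are. As a hedged back-of-the-envelope check (not a calculation from the paper), the 5 arc second resolution requirement at a 12 m focal length corresponds to a spot of roughly 0.3 mm on the focal plane:

```python
import math

# Back-of-the-envelope: focal-plane distance corresponding to the 5 arcsec
# angular-resolution requirement at the 12 m focal length quoted above.
ARCSEC_PER_RAD = 180.0 * 3600.0 / math.pi   # ≈ 206265 arcsec per radian
focal_length_m = 12.0
resolution_arcsec = 5.0
blur_m = focal_length_m * resolution_arcsec / ARCSEC_PER_RAD  # ≈ 2.9e-4 m
```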
The Wide Field Imager for Athena
NASA Astrophysics Data System (ADS)
Rau, A.; Nandra, K.; Meidinger, N.; Plattner, M.
2017-10-01
The Wide Field Imager (WFI) is one of the two scientific instruments of Athena, ESA's next large X-ray observatory, with launch in 2028. The instrument will provide two defining capabilities to the mission: sensitive wide-field imaging spectroscopy and excellent high-count-rate performance. It will do so with two separate detector systems: the Large Detector Array (LDA), optimized for its large field of view (40'×40') with a 100-fold survey-speed increase compared to existing X-ray missions, and the Fast Detector (FD), tweaked for high throughput and low pile-up for point sources as bright as the Crab. In my talk I will present the key performance parameters of the instrument and their links to the scientific goals of Athena, and summarize the status of the ongoing development activities.
Reynolds, Matthew R; Nilsson, Jonas; Akerborg, Orjan; Jhaveri, Mehul; Lindgren, Peter
2013-01-01
The first antiarrhythmic drug to demonstrate a reduced rate of cardiovascular hospitalization in atrial fibrillation/flutter (AF/AFL) patients was dronedarone in a placebo-controlled, double-blind, parallel arm Trial to assess the efficacy of dronedarone 400 mg bid for the prevention of cardiovascular Hospitalization or death from any cause in patiENts with Atrial fibrillation/atrial flutter (ATHENA trial). The potential cost-effectiveness of dronedarone in this patient population has not been reported in a US context. This study assesses the cost-effectiveness of dronedarone from a US health care payers' perspective. ATHENA patient data were applied to a patient-level health state transition model. Probabilities of health state transitions were derived from ATHENA and published data. Associated costs used in the model (2010 values) were obtained from published sources when trial data were not available. The base-case model assumed that patients were treated with dronedarone for the duration of ATHENA (mean 21 months) and were followed over a lifetime. Cost-effectiveness, from the payers' perspective, was determined using a Monte Carlo microsimulation (1 million fictitious patients). Dronedarone plus standard care provided 0.13 life years gained (LYG), and 0.11 quality-adjusted life years (QALYs), over standard care alone; cost/QALY was $19,520 and cost/LYG was $16,930. Compared to lower risk patients, patients at higher risk of stroke (Congestive heart failure, history of Hypertension, Age ≥ 75 years, Diabetes mellitus, and past history of Stroke or transient ischemic attack (CHADS2) scores 3-6 versus 0) had a lower cost/QALY ($9580-$16,000 versus $26,450). Cost/QALY was highest in scenarios assuming lifetime dronedarone therapy, no cardiovascular mortality benefit, no cost associated with AF/AFL recurrence on standard care, and when discounting of 5% was compared with 0%.
By extrapolating the results of a large, multicenter, randomized clinical trial (ATHENA), this model suggests that dronedarone is a cost-effective treatment option for approved indications (paroxysmal/persistent AF/AFL) in the US.
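The patient-level health-state transition model described above can be sketched as a toy microsimulation that accumulates costs and QALYs per arm and reports an incremental cost-effectiveness ratio. All probabilities, costs and utilities below are invented for illustration; none are ATHENA trial values.

```python
import random

# Toy patient-level microsimulation in the spirit of a health-state
# transition model. Every probability, cost and utility is invented;
# none are ATHENA trial values.
P_EVENT = {"treated": 0.05, "control": 0.08}   # annual CV-hospitalization probability
COST_DRUG, COST_EVENT = 2000.0, 15000.0        # annual drug cost, cost per event [$]
UTILITY = {"well": 0.85, "post_event": 0.70}   # QALY weight per year in each state

def simulate(arm, years=10, n=20_000, seed=1):
    """Return (mean cost, mean QALYs) per patient for one treatment arm."""
    rng = random.Random(seed)   # common random numbers across arms
    total_cost = total_qaly = 0.0
    for _ in range(n):
        state = "well"
        for _ in range(years):
            if rng.random() < P_EVENT[arm]:     # hospitalization this year?
                state = "post_event"
                total_cost += COST_EVENT
            if arm == "treated":
                total_cost += COST_DRUG
            total_qaly += UTILITY[state]
    return total_cost / n, total_qaly / n

c_t, q_t = simulate("treated")
c_c, q_c = simulate("control")
icer = (c_t - c_c) / (q_t - q_c)   # incremental cost per QALY gained
```

Using the same seed in both arms (common random numbers) reduces Monte Carlo noise in the incremental comparison, a standard trick in such microsimulations.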
Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A
2006-01-01
Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for sharing of such knowledge bases.
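The declarative condition-to-intervention statements can be sketched as a tiny rule table evaluated by a decision-support loop. The EON/ATHENA and SAGE languages are far richer than this; every rule, name and threshold below is invented for illustration.

```python
# Minimal sketch of declarative guideline statements mapping patient
# conditions to candidate interventions. All rules, field names and
# thresholds are invented; this is not the EON/ATHENA or SAGE language.
RULES = [
    {"if": lambda p: p["sbp"] >= 140 and "ACE_inhibitor" not in p["meds"],
     "then": "consider ACE inhibitor",
     "comment": "BP above target and not on first-line therapy"},
    {"if": lambda p: p["potassium"] > 5.5,
     "then": "avoid potassium-sparing diuretic",
     "comment": "hyperkalemia risk"},
]

def evaluate(patient):
    """Return the interventions whose declarative conditions hold."""
    return [(r["then"], r["comment"]) for r in RULES if r["if"](patient)]

patient = {"sbp": 152, "potassium": 4.2, "meds": {"thiazide"}}
advice = evaluate(patient)   # only the blood-pressure rule fires here
```

Because the statements are data rather than hard-coded logic, the same rule base can feed both decision-making (use case 1) and targeted commentary in the clinical information system (use case 2).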
Caffrey, Louise; Mattingley, Helena; Williamson, Catherine; McKevitt, Christopher
2016-01-01
Objectives: Gender inequity has persisted in academic medicine. Yet equity is vital for countries to achieve their full potential in terms of translational research and patient benefit. This study sought to understand how the gender equity programme, Athena SWAN, can be enabled and constrained by interactions between the programme and the context it is implemented into, and whether these interactions might produce unintended consequences. Design: Multimethod qualitative case studies using a realist evaluation approach. Setting: 5 departments from a university medical school hosting a Translational Research Organisation. Participants: 25 hours of observations of gender equality committee meetings; 16 in-depth interviews with Heads of Departments, Committee Leads and key personnel involved in the initiative; 4 focus groups with 15 postdoctoral researchers, lecturers and senior lecturers. Results: The implementation of Athena SWAN principles was reported to have created social space to address gender inequity and to have highlighted problematic practices to staff. However, a number of factors reduced the programme's potential to impact gender inequity. Gender inequity was reproduced in the programme's enactment, as female staff were undertaking a disproportionate amount of Athena SWAN work, with potential negative impacts on individual women's career progression. Early career researchers experienced problems accessing Athena SWAN initiatives. Furthermore, the impact of the programme was perceived to be undermined by wider institutional practices, national policies and societal norms, which are beyond the programme's remit. Conclusions: Gender equity programmes have the potential to address inequity. However, paradoxically, they can also unintentionally reproduce and reinforce gender inequity through their enactment.
Potential programme impacts may be undermined by barriers to staff availing of career development and training initiatives, and by wider institutional practices, national policies and societal norms. PMID:27609850
The third stage of Lunar Prospector's Athena arrives at LC 46 at CCAS
NASA Technical Reports Server (NTRS)
1997-01-01
The third stage of the Lockheed Martin Athena launch vehicle arrives at Launch Complex 46 at Cape Canaveral Air Station before it is mated to the second stage. The protective covering for safe transportation is removed before the third stage is lifted onto the launch pad. Athena is scheduled to carry the Lunar Prospector spacecraft for an 18-month mission that will orbit the Earth's moon to collect data from the lunar surface. Scientific experiments to be conducted by the Prospector include locating water ice that may exist near the lunar poles, gathering data to understand the evolution of the lunar highland crust and the lunar magnetic field, finding radon outgassing events, and describing the lunar gravity field by means of Doppler tracking. The launch is now scheduled for early January 1998.
1997-12-09
NASA's Lunar Prospector is taken out of its crate at Astrotech, a commercial payload processing facility, in Titusville, Fla. The small robotic spacecraft, to be launched for NASA on an Athena 2 rocket by Lockheed Martin, is designed to provide the first global maps of the Moon's surface compositional elements and its gravitational and magnetic fields. While at Astrotech, Lunar Prospector will be fueled with its attitude control propellant and then mated to a Trans-Lunar Injection Stage, which is a solid propellant upper stage motor. The combination will next be spin tested to verify proper balance, then encapsulated into an Athena nose fairing. Then the Lunar Prospector will be transported from Astrotech to Cape Canaveral Air Station and mated to an Athena rocket. The launch of Lunar Prospector is scheduled for Jan. 5, 1998 at 8:31 p.m.
Athena Research Ship System (Users Guide)
1988-05-01
Users may arrange for their own account any logistic support that does not impact the ship directly, such as crane service, drayage, small craft, flying...craft, photographic services, and the like. Any services that impact the ships' structural, propulsion and electrical or electronic systems must be... This manual was developed to provide general information regarding the ATHENA RESEARCH SHIP SYSTEM and specific data relative to the
ATHENA, the Desktop Human "Body"
Iyer, Rashi; Harris, Jennifer
2018-05-18
Creating surrogate human organs, coupled with insights from highly sensitive mass spectrometry technologies, a new project is on the brink of revolutionizing the way we screen new drugs and toxic agents. ATHENA, the Advanced Tissue-engineered Human Ectypal Network Analyzer project team, is developing four human organ constructs - liver, heart, lung and kidney - that are based on a significantly miniaturized platform. Each organ component will be about the size of a smartphone screen, and the whole ATHENA "body" of interconnected organs would fit neatly on a desk. "By developing this 'homo minutus,' we are stepping beyond the need for animal or Petri dish testing: There are huge benefits in developing drug and toxicity analysis systems that can mimic the response of actual human organs," said Rashi Iyer, a senior scientist at Los Alamos National Laboratory, the lead laboratory on the five-year, $19 million multi-institutional effort. The project is supported by the Defense Threat Reduction Agency (DTRA). Some 40 percent of pharmaceuticals fail their clinical trials, Iyer noted, and there are thousands of chemicals whose effects on humans are simply unknown. Providing a realistic, cost-effective and rapid screening system such as ATHENA with high-throughput capabilities could provide major benefits to the medical field, screening more accurately and offering a greater chance of clinical trial success.
AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading
NASA Astrophysics Data System (ADS)
Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration
2017-10-01
ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event and time dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, concurrent I/O, as well as ensuring thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
ATHENA, the Desktop Human "Body"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iyer, Rashi; Harris, Jennifer
2014-09-29
Creating surrogate human organs, coupled with insights from highly sensitive mass spectrometry technologies, a new project is on the brink of revolutionizing the way we screen new drugs and toxic agents. ATHENA, the Advanced Tissue-engineered Human Ectypal Network Analyzer project team, is developing four human organ constructs - liver, heart, lung and kidney - that are based on a significantly miniaturized platform. Each organ component will be about the size of a smartphone screen, and the whole ATHENA "body" of interconnected organs would fit neatly on a desk. "By developing this 'homo minutus,' we are stepping beyond the need for animal or Petri dish testing: There are huge benefits in developing drug and toxicity analysis systems that can mimic the response of actual human organs," said Rashi Iyer, a senior scientist at Los Alamos National Laboratory, the lead laboratory on the five-year, $19 million multi-institutional effort. The project is supported by the Defense Threat Reduction Agency (DTRA). Some 40 percent of pharmaceuticals fail their clinical trials, Iyer noted, and there are thousands of chemicals whose effects on humans are simply unknown. Providing a realistic, cost-effective and rapid screening system such as ATHENA with high-throughput capabilities could provide major benefits to the medical field, screening more accurately and offering a greater chance of clinical trial success.
Performance Characteristics of Lithium Ion Prototype Cells for 2003 Mars Sample Return Athena Rover
NASA Technical Reports Server (NTRS)
Ratnakumar, B. V.; Smart, M. C.; Ewell, R.; Surampudi, S.; Marsh, R. A.
2000-01-01
A viewgraph presentation outlines the mission objectives and power subsystem for the Mars Sample Return (MSR) Athena Rover. The NASA-DOD (depth of discharge) Interagency Li Ion program objectives are discussed. Evaluation tests performed at JPL are listed, and test results are shown for the Li-Ion cell initial capacity, charge/discharge capacity, voltage and ratio, specific energy, watt-hour efficiency, and cell voltage at various temperatures.
Industrialization of the mirror plate coatings for the ATHENA mission
NASA Astrophysics Data System (ADS)
Massahi, S.; Christensen, F. E.; Ferreira, D. D. M.; Shortt, B.; Collon, M.; Sforzini, J.; Landgraf, B.; Hinze, F.; Aulhorn, S.; Biedermann, R.
2017-08-01
In the frame of the development of the Advanced Telescope for High-ENergy Astrophysics (Athena) mission, currently in phase A, ESA is continuing to mature the optics technology and the associated mass production techniques. These efforts are driven by the programmatic and technical requirement of reaching TRL 6 prior to proposing the mission for formal adoption (planned for 2020). A critical part of the current phase A preparation activities is addressing the industrialization of the Silicon Pore Optics mirror plate coating. This includes transferring the well-established coating processes and techniques developed at DTU Space to an industrial-scale facility suitable for coating the more than 100,000 mirror plates required for Athena. In this paper, we explain the considerations for the planned coating facility, including requirement specification, equipment and supplier selection, preparation of the coating facility for the deposition equipment, and design and fabrication.
ATHENA: Remote Sensing Science Center for Cultural Heritage in Cyprus
NASA Astrophysics Data System (ADS)
Hadjimitsis, Diofantos G.; Agapiou, Athos; Lysandrou, Vasiliki; Themistocleous, Kyriakos; Cuca, Branka; Lasaponara, Rosa; Masini, Nicola; Krauss, Thomas; Cerra, Daniele; Gessner, Ursula; Schreier, Gunter
2016-04-01
The Cultural Heritage (CH) sector, especially that of monuments and sites, has always faced a number of challenges, from environmental pressure, pollution and human intervention through tourism to destruction by terrorism. Within this context, CH professionals are seeking to improve currently used methodologies in order to better understand, protect and valorise the common European past and common identity. The "ATHENA" H2020-TWINN-2015 project will seek to improve and expand the capabilities of the Cyprus University of Technology, involving professionals dealing with remote sensing technologies for supporting the CH sector from the National Research Council of Italy (CNR) and the German Aerospace Centre (DLR). The ATHENA centre will be devoted to the development, introduction and systematic use of advanced remote sensing science and technologies in the field of archaeology, built cultural heritage, their multi-temporal analysis and interpretation, and the distant monitoring of their natural and anthropogenic environment in the area of the Eastern Mediterranean.
The Athena X-ray Integral Field Unit
NASA Astrophysics Data System (ADS)
Barret, D.
2017-10-01
The Athena X-ray Integral Field Unit (X-IFU) is a high-resolution X-ray spectrometer, providing 2.5 eV spectral resolution, over a 5' (equivalent diameter) field of view, and count rate capabilities up to 1 Crab in the 0.2-12 keV range. Approaching the end of its feasibility study (scheduled around the end of 2017), I will briefly recall the scientific objectives of Athena driving the X-IFU specifications and will describe its current baseline configuration and the expected performances. I will outline the on-going technology developments that will enable the X-IFU. The X-IFU will be developed by an international consortium led by France (IRAP/CNES), the Netherlands (SRON), Italy (IAPS), with ESA member state contributions from Belgium, Finland, Germany, Poland, Spain and Switzerland, and international partner contributions from Japan and the United States. This talk is given on behalf of the X-IFU Consortium.
TES-Based X-Ray Microcalorimeter Performances Under AC Bias and FDM for Athena
NASA Technical Reports Server (NTRS)
Akamatsu, H.; Gottardi, L.; de Vries, C. P.; Adams, J. S.; Bandler, S. R.; Bruijn, M. P.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Gao, J. R.;
2016-01-01
Athena is a European X-ray observatory, scheduled for launch in 2028. Athena will employ a high-resolution imaging spectrometer called the X-ray Integral Field Unit (X-IFU), consisting of an array of 4000 transition edge sensor (TES) microcalorimeter pixels. For the readout of the X-IFU, we are developing frequency domain multiplexing, which is the baseline readout system. In this paper, we report on the performance, at MHz bias frequencies, of a TES X-ray calorimeter array fabricated at Goddard Space Flight Center (GSFC) for the baseline X-IFU detector. During single-pixel AC bias characterization, we measured X-ray energy resolutions (at 6 keV) of about 2.9 eV at both 2.3 and 3.7 MHz. Furthermore, in the multiplexing mode, we measured X-ray energy resolutions of about 2.9 eV at 1.3 and 1.7 MHz.
SIRENA software for Athena X-IFU event reconstruction
NASA Astrophysics Data System (ADS)
Ceballos, M. T.; Cobo, B.; Peille, P.; Wilms, J.; Brand, T.; Dauser, T.; Bandler, S.; Smith, S.
2017-03-01
The X-ray Observatory Athena was proposed in April 2014 as the mission to implement the science theme "The Hot and Energetic Universe" selected by ESA for L2 (the second Large-class mission in ESA’s Cosmic Vision science programme). One of the two X-ray detectors designed to be onboard Athena is the X-IFU, a cryogenic microcalorimeter based on Transition Edge Sensor (TES) technology that will provide spatially resolved high-resolution spectroscopy. The X-IFU will be developed by an international consortium led by IRAP (PI), SRON (co-PI) and IAPS/INAF (co-PI) and involving ESA Member States, Japan and the United States. In Spain, IFCA (CSIC-UC) has an anticipated contribution to the X-IFU through the Digital Readout Electronics (DRE) unit, in particular in the Event Processor Subsystem. For this purpose, and in collaboration with the Athena end-to-end simulations team, we are currently developing the SIRENA package as part of the publicly available SIXTE end-to-end simulator. SIRENA comprises a set of processing algorithms aimed at recognizing, in a noisy signal, the intensity pulses generated by the absorption of X-ray photons, and then reconstructing their energy, position and arrival time. This poster describes the structure of the package and the different algorithms currently implemented, as well as their comparative performance in the energy resolution achieved in the reconstruction of the instrument events.
The X-Ray Integral Field Unit and the Athena mission
NASA Astrophysics Data System (ADS)
Piro, Luigi; Barret, Didier; Den herder, Jan-willem
The Athena+ mission concept is designed to implement the Hot and Energetic Universe science theme submitted to the European Space Agency in response to the call for White Papers for the definition of the L2 and L3 missions of its science program. The Athena+ science payload consists of a large-aperture, high-angular-resolution X-ray optic and, twelve meters away, two interchangeable focal plane instruments: the X-ray Integral Field Unit (X-IFU) and the Wide Field Imager (WFI). The X-IFU is a cryogenic X-ray spectrometer, based on a large array of Transition Edge Sensors (TES), offering 2.5 eV spectral resolution, with ˜ 5’’ pixels, over a field of view of 5 arc minutes in diameter. In this talk, we briefly describe the Athena+ mission concept and the X-IFU performance being driven by science requirements. We then present the X-IFU detector and readout electronics principles, the current design of the focal plane assembly, the cooling chain, and review the global architecture design. We next describe the current performance estimates, in terms of effective area, particle background rejection, count rate capability and velocity measurements. Finally, we emphasize the latest technology developments concerning TES array fabrication, spectral resolution and readout performance, showing that significant progress is being made towards the demanding X-IFU requirements.
Caffrey, Louise; Wyatt, David; Fudge, Nina; Mattingley, Helena; Williamson, Catherine; McKevitt, Christopher
2016-09-08
Gender inequity has persisted in academic medicine. Yet equity is vital for countries to achieve their full potential in terms of translational research and patient benefit. This study sought to understand how the gender equity programme, Athena SWAN, can be enabled and constrained by interactions between the programme and the context it is implemented into, and whether these interactions might produce unintended consequences. Multimethod qualitative case studies using a realist evaluation approach. 5 departments from a university medical school hosting a Translational Research Organisation. 25 hours of observations of gender equality committee meetings, 16 in-depth interviews with Heads of Departments, Committee Leads and key personnel involved in the initiative. 4 focus groups with 15 postdoctoral researchers, lecturers and senior lecturers. The implementation of Athena SWAN principles was reported to have created social space to address gender inequity and to have highlighted problematic practices to staff. However, a number of factors reduced the programme's potential to impact gender inequity. Gender inequity was reproduced in the programme's enactment as female staff was undertaking a disproportionate amount of Athena SWAN work, with potential negative impacts on individual women's career progression. Early career researchers experienced problems accessing Athena SWAN initiatives. Furthermore, the impact of the programme was perceived to be undermined by wider institutional practices, national policies and societal norms, which are beyond the programme's remit. Gender equity programmes have the potential to address inequity. However, paradoxically, they can also unintentionally reproduce and reinforce gender inequity through their enactment. Potential programme impacts may be undermined by barriers to staff availing of career development and training initiatives, and by wider institutional practices, national policies and societal norms. 
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Student Participation in Rover Field Trials
NASA Astrophysics Data System (ADS)
Bowman, C. D.; Arvidson, R. E.; Nelson, S. V.; Sherman, D. M.; Squyres, S. W.
2001-12-01
The LAPIS program was developed in 1999 as part of the Athena Science Payload education and public outreach, funded by the JPL Mars Program Office. For the past three years, the Athena Science Team has been preparing for 2003 Mars Exploration Rover Mission operations using the JPL prototype Field Integrated Design and Operations (FIDO) rover in extended rover field trials. Students and teachers participating in LAPIS work with them each year to develop a complementary mission plan and implement an actual portion of the annual tests using FIDO and its instruments. LAPIS is designed to mirror an end-to-end mission: Small, geographically distributed groups of students form an integrated mission team, working together with Athena Science Team members and FIDO engineers to plan, implement, and archive a two-day test mission, controlling FIDO remotely over the Internet using the Web Interface for Telescience (WITS) and communicating with each other by email, the web, and teleconferences. The overarching goal of LAPIS is to get students excited about science and related fields. The program provides students with the opportunity to apply knowledge learned in school, such as geometry and geology, to a "real world" situation and to explore careers in science and engineering through continuous one-on-one interactions with teachers, Athena Science Team mentors, and FIDO engineers. A secondary goal is to help students develop improved communication skills and appreciation of teamwork, enhanced problem-solving skills, and increased self-confidence. The LAPIS program will provide a model for outreach associated with future FIDO field trials and the 2003 Mars mission operations. The base of participation will be broadened beyond the original four sites by taking advantage of the wide geographic distribution of Athena team member locations. 
This will provide greater numbers of students with the opportunity to actively engage in rover testing and to explore the possibilities of science, engineering, and technology.
NASA Technical Reports Server (NTRS)
Jaffe, L. D.
1984-01-01
The CONC/11 computer program designed for calculating the performance of dish-type solar thermal collectors and power systems is discussed. This program is intended to aid the system or collector designer in evaluating the performance to be expected with possible design alternatives. From design or test data on the characteristics of the various subsystems, CONC/11 calculates the efficiencies of the collector and the overall power system as functions of the receiver temperature for a specified insolation. If desired, CONC/11 will also determine the receiver aperture and the receiver temperature that will provide the highest efficiencies at a given insolation. The program handles both simple and compound concentrators. CONC/11 is written in Athena Extended FORTRAN (similar to FORTRAN 77) to operate primarily in an interactive mode on a Sperry 1100/81 computer. It could also be used on many small computers. A user's manual is also provided for this program.
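The kind of calculation the abstract describes, collector efficiency as a function of receiver temperature with a search for the best operating point, can be illustrated with a minimal sketch. This is not CONC/11's actual model: the loss terms, parameter values and function names below are simplified illustrative assumptions (only aperture re-radiation is modelled, and a crude half-Carnot engine stands in for the real power-conversion subsystem).

```python
STEFAN_BOLTZMANN = 5.670e-8  # W m^-2 K^-4

def collector_efficiency(temp_k, insolation=1000.0, concentration=2000.0,
                         eta_optical=0.90, emissivity=0.90):
    """Illustrative collector efficiency: optical efficiency minus the
    re-radiation loss from the receiver aperture (convection neglected)."""
    radiated = emissivity * STEFAN_BOLTZMANN * temp_k ** 4
    return eta_optical - radiated / (insolation * concentration)

def system_efficiency(temp_k, t_ambient=300.0, **kw):
    """Overall efficiency: collector efficiency times a crude heat-engine
    efficiency (half of the Carnot limit), a stand-in for a real engine."""
    return collector_efficiency(temp_k, **kw) * 0.5 * (1.0 - t_ambient / temp_k)

# Sweep receiver temperatures to find the best operating point, analogous
# to determining the temperature that gives the highest efficiency.
temps = range(400, 2001, 50)
best_temp = max(temps, key=system_efficiency)
```

The optimum exists because the collector term falls with temperature (T^4 radiation loss) while the engine term rises toward the Carnot limit; the sweep simply picks the temperature where the product peaks.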
Surveys with Athena: results from detailed SIXTE simulations
NASA Astrophysics Data System (ADS)
Lanzuisi, G.; Comastri, A.; Aird, J.; Brusa, M.; Cappelluti, N.; Gilli, R.; Matute, I.
2017-10-01
"Formation and early growth of black holes" and "Accretion by supermassive black holes through cosmic time" are two of the scientific objectives of the Athena mission. A large fraction (20-25%) of the Athena Mock Observing Plan is devoted to these and other topics (e.g. first galaxy groups, cold and warm obscuration, and feedback signatures in AGN at high z), in the form of a multi-tiered (deep-medium-wide) survey with the WFI. We used the flexible SIXTE simulator to study the impact of different instrumental configurations, in terms of WFI FOV, mirror PSF and background levels, on the performance in the three layers of the WFI survey. We mainly focus on the scientific objective that drives the survey configuration: the detection of at least 10 AGN at z=6-8 with Log(LX)=43-43.5 erg/s and 10 at z=8-10 with Log(LX)=44-44.5 erg/s. Implications for other scientific objectives involved in the survey are also discussed.
The Athena X-ray Integral Field Unit (X-IFU)
NASA Technical Reports Server (NTRS)
Barret, Didier; Trong, Thein Lam; Den Herder, Jan-Willem; Piro, Luigi; Barcons, Xavier; Huovelin, Juhani; Kelley, Richard; Mas-Hesse, J. Miquel; Mitsuda, Kazuhisa; Paltani, Stephane;
2016-01-01
The X-ray Integral Field Unit (X-IFU) on board the Advanced Telescope for High-ENergy Astrophysics (Athena) will provide spatially resolved high-resolution X-ray spectroscopy from 0.2 to 12 keV, with 5'' pixels over a field of view of 5 arc minute equivalent diameter and a spectral resolution of 2.5 eV up to 7 keV. In this paper, we first review the core scientific objectives of Athena driving the main performance parameters of the X-IFU, namely the spectral resolution, the field of view, the effective area, the count rate capabilities and the instrumental background. We also illustrate the breakthrough potential of the X-IFU for some observatory science goals. Then we briefly describe the X-IFU design as defined at the time of the mission consolidation review concluded in May 2016, and report on its predicted performance. Finally, we discuss some options to improve the instrument performance while not increasing its complexity and resource demands (e.g. count rate capability, spectral resolution).
NASA Astrophysics Data System (ADS)
Ehle, Matthias
2015-09-01
The Advanced Telescope for High Energy Astrophysics (Athena) is a large-class mission of the European Space Agency (ESA). It is currently entering an assessment study phase, with launch planned for 2028. Athena has been designed to address the science theme "The Hot and Energetic Universe", which poses two key questions: - How does ordinary matter assemble into the large-scale structures we see today? - How do black holes grow and influence the Universe? The mission will employ a variety of techniques to address these topics in a comprehensive manner, including spatially-resolved high resolution spectroscopy, sensitive wide field imaging, high throughput spectral-timing, and fast follow-up of transient phenomena. The purpose of this conference is to gather together all members of the astronomical community worldwide who have an interest in Athena. The main focus of the meeting is to discuss the key science questions which will be addressed by the mission. A significant portion of the programme is devoted to presenting the status of the project and discussing the synergies with other future large multi-wavelength facilities and missions. Scientific topics include: - Formation, evolution and physical properties of clusters of galaxies - Cosmic feedback - The missing baryons and the WHIM - Supermassive black hole evolution - Accretion physics and strong gravity - High energy transient phenomena - Solar system and exoplanets - Star formation and evolution - The physics of compact objects - Supernovae, supernova remnants and the ISM - Multiwavelength synergies
Evolutionary Models of Cold, Magnetized, Interstellar Clouds
NASA Technical Reports Server (NTRS)
Gammie, Charles F.; Ostriker, Eve; Stone, James M.
2004-01-01
We modeled the long-term and small-scale evolution of molecular clouds using direct 2D and 3D magnetohydrodynamic (MHD) simulations. This work followed up on previous research by our group under the auspices of the ATP, in which we studied the energetics of turbulent, magnetized clouds and their internal structure on intermediate scales. Our new work focused on both global and small-scale aspects of the evolution of turbulent, magnetized clouds, and in particular studied the response of turbulent proto-cloud material to passage through the Galactic spiral potential, and the dynamical collapse of turbulent, magnetized (supercritical) clouds into fragments to initiate the formation of a stellar cluster. Technical advances under this program include developing an adaptive-mesh MHD code as a successor to ZEUS (ATHENA) in order to follow cloud fragmentation, developing a shearing-sheet MHD code which includes self-gravity and externally-imposed gravity to follow the evolution of clouds in the Galactic potential, and developing radiative transfer models to evaluate the internal ionization of clumpy clouds exposed to external photoionizing UV and CR radiation. Gammie's work at UIUC focused on the radiative transfer aspects of this program.
NASA Technical Reports Server (NTRS)
Peille, Phillip; Ceballos, Maria Teresa; Cobo, Beatriz; Wilms, Joern; Bandler, Simon; Smith, Stephen J.; Dauser, Thomas; Brand, Thorsten; den Hartog, Roland; de Plaa, Jelle;
2016-01-01
The X-ray Integral Field Unit (X-IFU) microcalorimeter, on board Athena, with its focal plane comprising 3840 Transition Edge Sensors (TESs) operating at 90 mK, will provide unprecedented spectral-imaging capability in the 0.2-12 keV energy range. It will rely on the on-board digital processing of current pulses induced by the heat deposited in the TES absorber, so as to recover the energy of each individual event. Assessing the capabilities of the pulse reconstruction is required to understand the overall scientific performance of the X-IFU, notably in terms of energy resolution degradation with both increasing energies and count rates. Using synthetic data streams generated by the X-IFU End-to-End simulator, we present here a comprehensive benchmark of various pulse reconstruction techniques, ranging from standard optimal filtering to more advanced algorithms based on noise covariance matrices. Besides deriving the spectral resolution achieved by the different algorithms, a first assessment of the computing power and ground calibration needs is presented. Overall, all methods show similar performances, with the reconstruction based on noise covariance matrices showing the best improvement with respect to the standard optimal filtering technique. Due to prohibitive calibration needs, this method might however not be applicable to the X-IFU, and the best compromise currently appears to be the so-called resistance space analysis, which also features very promising high count rate capabilities.
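The "standard optimal filtering" used as the benchmark baseline above is a textbook technique: the pulse amplitude (proportional to photon energy) is estimated by a noise-weighted projection of the data onto a known pulse template in the frequency domain. The sketch below is a generic illustration under that assumption, not X-IFU or SIRENA code; the function name, the toy exponential template and the flat noise spectrum are all illustrative choices.

```python
import numpy as np

def optimal_filter_amplitude(record, template, noise_psd):
    """Estimate a pulse's amplitude with a frequency-domain optimal filter:
    the data are projected onto the template, each frequency bin weighted
    inversely by the noise power in that bin."""
    d = np.fft.rfft(record)
    s = np.fft.rfft(template)
    w = np.conj(s) / noise_psd            # filter: template over noise power
    return np.real(np.sum(w * d)) / np.real(np.sum(w * s))

# Toy demonstration: a double-exponential pulse template, and a noiseless
# record that is exactly 3x the template, with a white (flat) noise PSD.
t = np.arange(1024)
template = np.exp(-t / 100.0) - np.exp(-t / 10.0)
record = 3.0 * template
flat_noise = np.ones(1024 // 2 + 1)       # rfft of n samples has n//2+1 bins
amplitude = optimal_filter_amplitude(record, template, flat_noise)  # -> 3.0
```

With real data the weighting by the measured noise spectrum is what suppresses the noisiest frequency bins; the covariance-matrix methods mentioned in the abstract generalize this to non-stationary noise at a much higher calibration cost.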
Crosstalk in an FDM Laboratory Setup and the Athena X-IFU End-to-End Simulator
NASA Astrophysics Data System (ADS)
den Hartog, R.; Kirsch, C.; de Vries, C.; Akamatsu, H.; Dauser, T.; Peille, P.; Cucchetti, E.; Jackson, B.; Bandler, S.; Smith, S.; Wilms, J.
2018-04-01
The impact of various crosstalk mechanisms on the performance of the Athena X-IFU instrument has been assessed with detailed end-to-end simulations. For the crosstalk in the electrical circuit, a detailed model has been developed. In this contribution, we test this model against measurements made with an FDM laboratory setup and discuss the assumption of deterministic crosstalk in the context of the weak link effect in the detectors. We conclude that crosstalk levels predicted by the model are conservative with respect to the observed levels.
NASA Astrophysics Data System (ADS)
Bavdaz, Marcos; Wille, Eric; Shortt, Brian; Fransen, Sebastiaan; Collon, Maximilien; Barriere, Nicolas; Yanson, Alexei; Vacanti, Giuseppe; Haneveld, Jeroen; van Baren, Coen; Zuknik, Karl-Heinz; Christensen, Finn; Della Monica Ferreira, Desiree; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Spiga, Daniele; Valsecchi, Giuseppe; Vernani, Dervis
2016-07-01
ATHENA (Advanced Telescope for High ENergy Astrophysics) is being studied by the European Space Agency (ESA) as the second large science mission, with a launch slot in 2028. System studies and technology preparation activities are on-going. The optics of the telescope is based on the modular Silicon Pore Optics (SPO), a novel X-ray optics technology significantly benefiting from spin-in from the semiconductor industry. Several technology development activities are being implemented by ESA in collaboration with European industry and institutions. The related programmatic background, technology development approach and the associated implementation planning are presented.
2007-08-01
at 0 m (a,d), 1 m (b,e) and 2 m (c,f) distance from the door opening, respectively. A.1.2 Scenario 11: Orientation. The individual images of the HV and...
2001-05-31
KODIAK ISLAND, Alaska -- Technicians prepare the Athena I launch vehicle for flight at Kodiak Island, Alaska, as processing for the launch of Kodiak Star proceeds. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
GAMERA - The New Magnetospheric Code
NASA Astrophysics Data System (ADS)
Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.
2017-12-01
The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project GAMERA, Grid Agnostic MHD for Extended Research Applications, has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary, but logically rectangular, grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. One other improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure - multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations, ranging from the inner and outer heliosphere to the magnetospheres of Venus, the Earth, Jupiter and Saturn. We present example results for the Earth's magnetosphere including a coupled ring current (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.
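The "maintenance of div B = 0 through the use of the Yee grid" mentioned above rests on a simple discrete identity: if face-centered magnetic fields are built from the discrete curl of an edge/corner vector potential on a staggered grid, the cell-centered discrete divergence vanishes identically. This 2D numpy sketch demonstrates the property; it is a schematic illustration of the staggered-grid idea, not GAMERA code, and assumes unit cell spacing.

```python
import numpy as np

def faces_from_potential(Az):
    """Face-centered B from the discrete curl of a corner-centered vector
    potential Az (2D, unit cell spacing): Bx = dAz/dy, By = -dAz/dx."""
    Bx = Az[:, 1:] - Az[:, :-1]       # Bx lives on x-faces
    By = -(Az[1:, :] - Az[:-1, :])    # By lives on y-faces
    return Bx, By

def cell_divergence(Bx, By):
    """Discrete divergence of the face fields, evaluated at cell centers."""
    return (Bx[1:, :] - Bx[:-1, :]) + (By[:, 1:] - By[:, :-1])

# Any Az whatsoever yields a discretely divergence-free field: every corner
# value of Az enters the divergence stencil twice with opposite signs.
rng = np.random.default_rng(0)
Az = rng.standard_normal((33, 33))
Bx, By = faces_from_potential(Az)
max_div = np.abs(cell_divergence(Bx, By)).max()   # zero to rounding error
```

Constrained-transport MHD schemes preserve this property during time evolution by updating the face fields with edge-centered electric fields, so an initially divergence-free field stays divergence-free to machine precision.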
Athena Mars rover science investigation
NASA Astrophysics Data System (ADS)
Squyres, Steven W.; Arvidson, Raymond E.; Baumgartner, Eric T.; Bell, James F.; Christensen, Philip R.; Gorevan, Stephen; Herkenhoff, Kenneth E.; Klingelhöfer, Göstar; Madsen, Morten Bo; Morris, Richard V.; Rieder, Rudolf; Romero, Raul A.
2003-12-01
Each Mars Exploration Rover carries an integrated suite of scientific instruments and tools called the Athena science payload. The primary objective of the Athena science investigation is to explore two sites on the Martian surface where water may once have been present, and to assess past environmental conditions at those sites and their suitability for life. The remote sensing portion of the payload uses a mast called the Pancam Mast Assembly (PMA) that provides pointing for two instruments: the Panoramic Camera (Pancam), and the Miniature Thermal Emission Spectrometer (Mini-TES). Pancam provides high-resolution, color, stereo imaging, while Mini-TES provides spectral cubes at mid-infrared wavelengths. For in-situ study, a five degree-of-freedom arm called the Instrument Deployment Device (IDD) carries four more tools: a Microscopic Imager (MI) for close-up imaging, an Alpha Particle X-Ray Spectrometer (APXS) for elemental chemistry, a Mössbauer Spectrometer (MB) for the mineralogy of Fe-bearing materials, and a Rock Abrasion Tool (RAT) for removing dusty and weathered surfaces and exposing fresh rock underneath. The payload also includes magnets that allow the instruments to study the composition of magnetic Martian materials. All of the Athena instruments have undergone extensive calibration, both individually and using a set of geologic reference materials that are being measured with all the instruments. Using a MER-like rover and payload in a number of field settings, we have devised operations processes that will enable us to use the MER rovers to formulate and test scientific hypotheses concerning past environmental conditions and habitability at the landing sites.
NASA Astrophysics Data System (ADS)
Basso, Stefano; Civitani, Marta; Pareschi, Giovanni; Buratti, Enrico; Eder, Josef; Friedrich, Peter; Fürmetz, Maria
2015-09-01
The Athena mission was selected as the second large-class mission, due for launch in 2028, in ESA's Cosmic Vision program. The current solution for the optics is based on the Silicon Pore Optics (SPO) technology, with the goal of 2 m² effective area at 1 keV (aperture about 3 m diameter) and a focal length of 12 m. The SPO advantages are the compactness along the axial direction and the high conductivity of the silicon. Recent development in the fabrication of mirror shells based on Slumped Glass Optics (SGO) makes this technology an attractive solution for the mirror modules for Athena or similar telescopes. The SGO advantages are a potentially high collecting area with limited vignetting, due to the lower shadowing, and the ability to curve the glass plates down to small radii of curvature. This study shows an alternative mirror design based on SGO technology, tailored to Athena's needs. The main challenges are the optimization of the manufacturing technology with respect to the required accuracy and the thermal control of the large surface in conjunction with the low conductivity of the glass. A concept has been elaborated which considers the specific benefits of the SGO technology and provides an efficient thermal control. The output of the study is a preliminary design substantiated by analyses and technological studies. The study proposes interfaces and predicts performances and budgets. It also describes how such a mirror system could be implemented as a modular assembly for an X-ray telescope with a large collecting area.
Athena Mars rover science investigation
Squyres, S. W.; Arvidson, R. E.; Baumgartner, E.T.; Bell, J.F.; Christensen, P.R.; Gorevan, S.; Herkenhoff, K. E.; Klingelhofer, G.; Madsen, M.B.; Morris, R.V.; Rieder, R.; Romero, R.A.
2003-01-01
Each Mars Exploration Rover carries an integrated suite of scientific instruments and tools called the Athena science payload. The primary objective of the Athena science investigation is to explore two sites on the Martian surface where water may once have been present, and to assess past environmental conditions at those sites and their suitability for life. The remote sensing portion of the payload uses a mast called the Pancam Mast Assembly (PMA) that provides pointing for two instruments: the Panoramic Camera (Pancam), and the Miniature Thermal Emission Spectrometer (Mini-TES). Pancam provides high-resolution, color, stereo imaging, while Mini-TES provides spectral cubes at mid-infrared wavelengths. For in-situ study, a five degree-of-freedom arm called the Instrument Deployment Device (IDD) carries four more tools: a Microscopic Imager (MI) for close-up imaging, an Alpha Particle X-Ray Spectrometer (APXS) for elemental chemistry, a Mössbauer Spectrometer (MB) for the mineralogy of Fe-bearing materials, and a Rock Abrasion Tool (RAT) for removing dusty and weathered surfaces and exposing fresh rock underneath. The payload also includes magnets that allow the instruments to study the composition of magnetic Martian materials. All of the Athena instruments have undergone extensive calibration, both individually and using a set of geologic reference materials that are being measured with all the instruments. Using a MER-like rover and payload in a number of field settings, we have devised operations processes that will enable us to use the MER rovers to formulate and test scientific hypotheses concerning past environmental conditions and habitability at the landing sites. Copyright 2003 by the American Geophysical Union.
Improving the Multi-Wavelength Capability of Chandra Large Programs
NASA Astrophysics Data System (ADS)
Pacucci, Fabio
2017-09-01
In order to fully exploit the joint Chandra/JWST/HST ventures to detect faint sources, we urgently need an advanced matching algorithm between optical/NIR and X-ray catalogs/images. This will be of paramount importance in bridging the gap between upcoming optical/NIR facilities (JWST) and later X-ray ones (Athena, Lynx). We propose to develop an advanced and automated tool to improve the identification of Chandra X-ray counterparts detected in deep optical/NIR fields, based on T-PHOT, a software package widely used in the community. The developed code will incorporate more than 20 years of advancements in X-ray data analysis and will be released to the public. Finally, we will release an updated catalog of X-ray sources in the CANDELS regions: a leap forward in our endeavor of charting the Universe.
The X-ray Integral Field Unit (X-IFU) for Athena
NASA Technical Reports Server (NTRS)
Ravera, Laurent; Barret, Didier; Willem den Herder, Jan; Piro, Luigi; Cledassou, Rodolphe; Pointecouteau, Etienne; Peille, Philippe; Pajot, Francois; Arnaud, Monique; Pigot, Claude;
2014-01-01
Athena is designed to implement the Hot and Energetic Universe science theme selected by the European Space Agency for the second large mission of its Cosmic Vision program. The Athena science payload consists of large-aperture, high-angular-resolution X-ray optics (2 m2 at 1 keV) and, twelve meters away, two interchangeable focal plane instruments: the X-ray Integral Field Unit (X-IFU) and the Wide Field Imager. The X-IFU is a cryogenic X-ray spectrometer, based on a large array of Transition Edge Sensors (TES), offering 2.5 eV spectral resolution, with approximately 5" pixels, over a field of view of 5' in diameter. In this paper, we present the X-IFU detector and readout electronics principles, some elements of the current design for the focal plane assembly, and the cooling chain. We describe the current performance estimates in terms of spectral resolution, effective area, particle background rejection, and count rate capability. Finally, we highlight the technology developments necessary to meet the demanding requirements of the X-IFU, for the sensor, the readout electronics, and the cooling chain.
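The quoted 2.5 eV spectral resolution translates directly into a resolving power E/ΔE. A minimal sketch of that arithmetic follows; note that the 6 keV reference energy is an illustrative assumption of ours, not a figure stated in this abstract:

```python
# Back-of-the-envelope resolving power for a TES calorimeter.
# The 2.5 eV FWHM is from the abstract; the 6 keV reference energy
# is an assumed illustrative value, not stated above.
def resolving_power(energy_ev, fwhm_ev):
    """Resolving power E / dE for a given line energy and FWHM."""
    return energy_ev / fwhm_ev

print(resolving_power(6000.0, 2.5))  # E/dE = 2400.0 at the assumed 6 keV
```

At lower energies the same 2.5 eV FWHM yields proportionally lower resolving power, which is why calorimeter resolutions are usually quoted together with a reference energy.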
Mass production of silicon pore optics for ATHENA
NASA Astrophysics Data System (ADS)
Wille, Eric; Bavdaz, Marcos; Collon, Maximilien
2016-07-01
Silicon Pore Optics (SPO) provide high angular resolution with low effective area density, as required for the Advanced Telescope for High Energy Astrophysics (Athena). The x-ray telescope consists of several hundred SPO mirror modules. During the development of the process steps of the SPO technology, the specific requirements of a future mass production have been considered right from the beginning. The manufacturing methods heavily utilise off-the-shelf equipment from the semiconductor industry, robotic automation and parallel processing. This allows the present production flow to be scaled up in a cost-effective way, to produce hundreds of mirror modules per year. Considering manufacturing predictions based on the current technology status, we present an analysis of the time and resources required for the Athena flight programme. This includes the full production process, starting with Si wafers up to the integration of the mirror modules. We present the times required for the individual process steps and identify the equipment required to produce two mirror modules per day. A preliminary timeline for building and commissioning the required infrastructure, and for flight model production of about 1000 mirror modules, is presented.
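The production figures quoted above (about 1000 mirror modules at two modules per day) imply a first-order schedule estimate. A hedged sketch of that arithmetic; the 230 working days per year is our assumption, not a figure from the abstract:

```python
# First-order flight-production schedule from the figures quoted above:
# ~1000 mirror modules at a rate of 2 modules per day.
MODULES_TOTAL = 1000          # from the abstract
MODULES_PER_DAY = 2           # from the abstract
WORKING_DAYS_PER_YEAR = 230   # assumed, not stated in the abstract

production_days = MODULES_TOTAL / MODULES_PER_DAY
production_years = production_days / WORKING_DAYS_PER_YEAR
print(production_days)              # 500.0 production days
print(round(production_years, 1))   # ~2.2 years of pure production time
```

Such a first-order estimate ignores ramp-up, rework and integration margins, which is presumably why the abstract also presents a timeline for building and commissioning the infrastructure.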
A systematic analysis of the XMM-Newton background: III. Impact of the magnetospheric environment
NASA Astrophysics Data System (ADS)
Ghizzardi, Simona; Marelli, Martino; Salvetti, David; Gastaldello, Fabio; Molendi, Silvano; De Luca, Andrea; Moretti, Alberto; Rossetti, Mariachiara; Tiengo, Andrea
2017-12-01
A detailed characterization of the particle-induced background is fundamental for many of the scientific objectives of the Athena X-ray telescope; an adequate knowledge of the background that will be encountered by Athena is therefore desirable. Current X-ray telescopes have shown that the intensity of the particle-induced background can be highly variable. Different regions of the magnetosphere can have very different environmental conditions, which can, in principle, affect the particle-induced background detected by the instruments in different ways. We present results concerning the influence of the magnetospheric environment on the background detected by the EPIC instrument on board XMM-Newton, through an estimate of the variation of the in-Field-of-View background excess along the XMM-Newton orbit. An important contribution to the XMM background, which may affect the Athena background as well, comes from soft proton flares. Along with the flaring component, a low-intensity component is also present. We find that both show modest variations in the different magnetozones and that the soft proton component shows a strong trend with distance from Earth.
LC Filters for FDM Readout of the X-IFU TES Calorimeter Instrument on Athena
NASA Astrophysics Data System (ADS)
Bruijn, Marcel P.; van der Linden, Anton J.; Ferrari, Lorenza; Gottardi, Luciano; van der Kuur, Jan; den Hartog, Roland H.; Akamatsu, Hiroki; Jackson, Brian D.
2018-05-01
The current status of lithographic superconducting LC filters for use in the Athena X-IFU instrument is described. We present the fabrication process and characterization results at room temperature, 4 K and 50 mK. We also present an optimization study of the filter quality factors, in which finite element modeling is used together with experimental validation structures. For the a-Si:H-based capacitors and Nb-based coils, the component fabrication yield is presently about 99%, and the effective series resistance at 50 mK is below 1.5 mΩ.
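A 99% per-component yield compounds quickly when a readout chip carries many resonators. A minimal sketch of that compounding, assuming independent component failures; the chip sizes used below are illustrative assumptions, not figures from the abstract:

```python
# Probability that every component on a chip works, given an
# independent per-component yield. The 0.99 yield is from the
# abstract; the chip sizes n are assumed illustrative values.
def chip_yield(per_component_yield, n_components):
    """Yield of a chip whose n components must all function."""
    return per_component_yield ** n_components

for n in (10, 40, 100):
    print(n, round(chip_yield(0.99, n), 2))
```

The point of the sketch is only that a per-component yield near unity is essential once tens of LC resonators share a chip, which motivates the yield optimization described above.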
The Spirit Rover's Athena science investigation at Gusev Crater, Mars
NASA Technical Reports Server (NTRS)
Squyres, S. W.; Arvidson, R. E.; Bell, J. F., III; Brueckner, J.; Cabrol, N. A.; Calvin, W.; Carr, M. H.; Christensen, P. R.; Clark, B. C.; Crumpler, L.;
2004-01-01
The Mars Exploration Rover Spirit and its Athena science payload have been used to investigate a landing site in Gusev crater. Gusev is hypothesized to be the site of a former lake, but no clear evidence for lacustrine sedimentation has been found to date. Instead, the dominant lithology is basalt, and the dominant geologic processes are impact events and eolian transport. Many rocks exhibit coatings and other characteristics that may be evidence for minor aqueous alteration. Any lacustrine sediments that may exist at this location within Gusev apparently have been buried by lavas that have undergone subsequent impact disruption.
The Spirit Rover's Athena science investigation at Gusev crater, Mars
Squyres, S. W.; Arvidson, R. E.; Bell, J.F.; Brückner, J.; Cabrol, N.A.; Calvin, W.; Carr, M.H.; Christensen, P.R.; Clark, B. C.; Crumpler, L.; Des Marais, D.J.; D'Uston, C.; Economou, T.; Farmer, J.; Farrand, W.; Folkner, W.; Golombek, M.; Gorevan, S.; Grant, J. A.; Greeley, R.; Grotzinger, J.; Haskin, L.; Herkenhoff, K. E.; Hviid, S.; Johnson, J.; Klingelhofer, G.; Knoll, A.; Landis, G.; Lemmon, M.; Li, R.; Madsen, M.B.; Malin, M.C.; McLennan, S.M.; McSween, H.Y.; Ming, D. W.; Moersch, J.; Morris, R.V.; Parker, T.; Rice, J. W.; Richter, L.; Rieder, R.; Sims, M.; Smith, M.; Smith, P.; Soderblom, L.A.; Sullivan, R.; Wanke, H.; Wdowiak, T.; Wolff, M.; Yen, A.
2004-01-01
The Mars Exploration Rover Spirit and its Athena science payload have been used to investigate a landing site in Gusev crater. Gusev is hypothesized to be the site of a former lake, but no clear evidence for lacustrine sedimentation has been found to date. Instead, the dominant lithology is basalt, and the dominant geologic processes are impact events and eolian transport. Many rocks exhibit coatings and other characteristics that may be evidence for minor aqueous alteration. Any lacustrine sediments that may exist at this location within Gusev apparently have been buried by lavas that have undergone subsequent impact disruption.
2001-08-09
KODIAK ISLAND, Alaska -- The PICOSat and Starshine 3 (back) payloads wait for their launch aboard the Athena 1 launch vehicle at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5 p.m. to 7 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Castor 120, the first stage of the Athena 1 launch vehicle, is raised off a truck at the launch pad at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-08-09
KODIAK ISLAND, Alaska -- The PCSat payload waits for its launch aboard the Athena 1 launch vehicle at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5 p.m. to 7 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-07-31
KODIAK ISLAND, Alaska -- Technicians prepare the Starshine 3 payload for its launch aboard the Athena 1 launch vehicle at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-08-09
KODIAK ISLAND, Alaska -- Technicians prepare the PICOSat payload for its launch aboard the Athena 1 launch vehicle at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5 p.m. to 7 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Technicians install Orbis 21D Equipment Section Boost Motor, the second stage of the Athena 1 launch vehicle, at Kodiak Island, Alaska, as processing for the launch of Kodiak Star proceeds. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-07-31
KODIAK ISLAND, Alaska -- Technicians prepare the Starshine 3 payload, while the payload fairing of the Athena 1 launch vehicle awaits servicing at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Castor 120, the first stage of the Athena 1 launch vehicle, is lowered into place at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Trucks transporting Orbis 21D Equipment Section Boost Motor, the second stage of the Athena 1 launch vehicle, arrive at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-29
KODIAK ISLAND, Alaska -- A convoy of trucks transports the stages of an Athena launch vehicle and supporting launch equipment to the pad at Kodiak Island, Alaska, as preparations to launch the Kodiak Star continue. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Castor 120, the first stage of the Athena 1 launch vehicle, is lifted into a vertical position at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Technicians inspect Castor 120, the first stage of the Athena 1 launch vehicle, at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
NASA Astrophysics Data System (ADS)
Simon, Jacob B.; Armitage, Philip J.; Li, Rixin; Youdin, Andrew N.
2016-05-01
We study the formation of planetesimals in protoplanetary disks from the gravitational collapse of solid over-densities generated via the streaming instability. To carry out these studies, we implement and test a particle-mesh self-gravity module for the Athena code that enables the simulation of aerodynamically coupled systems of gas and collisionless self-gravitating solid particles. Upon applying our algorithm to planetesimal formation simulations, we find that (when a direct comparison is possible) the Athena simulations yield predicted planetesimal properties that agree well with those found in prior work using different numerical techniques. In particular, the gravitational collapse of streaming-initiated clumps leads to an initial planetesimal mass function that is well represented by a power law, dN/dM_p ∝ M_p^{-p}, with p ≃ 1.6 ± 0.1, which equates to a differential size distribution dN/dR_p ∝ R_p^{-q}, with q ≃ 2.8 ± 0.1. We find no significant trends with resolution from a convergence study of up to 512^3 grid zones and N_par ≈ 1.5 × 10^8 particles. Likewise, the power-law slope appears indifferent to changes in the relative strength of self-gravity and tidal shear, and to the time when (for reasons of numerical economy) self-gravity is turned on, though the strength of these claims is limited by small number statistics. For a typically assumed radial distribution of minimum mass solar nebula solids (assumed here to have dimensionless stopping time τ = 0.3), our results support the hypothesis that bodies on the scale of large asteroids or Kuiper Belt Objects could have formed as the high-mass tail of a primordial planetesimal population.
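The mass-function slope p and size-distribution slope q quoted above are related by a change of variables: for constant material density, M ∝ R^3, so dN/dR = (dN/dM)(dM/dR) ∝ M^{-p} R^2 ∝ R^{2-3p}, giving q = 3p - 2. A quick sketch verifying the conversion (the constant-density assumption is implicit in M ∝ R^3):

```python
# Convert a planetesimal mass-function slope p (dN/dM ∝ M^-p) to the
# equivalent size-distribution slope q (dN/dR ∝ R^-q), assuming
# constant density so that M ∝ R^3:
#   dN/dR = (dN/dM)(dM/dR) ∝ M^-p · R^2 ∝ R^(2-3p)  =>  q = 3p - 2
def mass_to_size_slope(p):
    return 3.0 * p - 2.0

print(mass_to_size_slope(1.6))  # ≈ 2.8, matching the quoted q ≃ 2.8 ± 0.1
```

The quoted uncertainties are consistent too: an uncertainty of 0.1 in p maps to 0.3 in q under this relation, so the ±0.1 quoted on q is the tighter of the two statements.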
Simulation and modeling of silicon pore optics for the ATHENA x-ray telescope
NASA Astrophysics Data System (ADS)
Spiga, D.; Christensen, F. E.; Bavdaz, M.; Civitani, M. M.; Conconi, P.; Della Monica Ferreira, D.; Knudsen, E. B.; Massahi, S.; Pareschi, G.; Salmaso, B.; Shortt, B.; Tayabaly, K.; Westergaard, N. J.; Wille, E.
2016-07-01
The ATHENA X-ray observatory is a large-class ESA approved mission, with launch scheduled in 2028. The technology of silicon pore optics (SPO) was selected as the baseline to assemble ATHENA's optics from more than 1000 mirror modules, obtained by stacking wedged and ribbed silicon wafer plates onto silicon mandrels to form the Wolter-I configuration. Although the current baseline design fulfills the required effective area of 2 m2 at 1 keV on-axis, alternative design solutions, e.g., privileging the field of view or the off-axis angular resolution, are also possible. Moreover, the stringent requirement of a 5 arcsec HEW angular resolution at 1 keV entails very small profile errors and excellent surface smoothness, as well as a precise alignment of the 1000 mirror modules, to avoid imaging degradation and effective area loss. Finally, the stray light issue has to be kept under control. In this paper we show the preliminary results of simulations of optical systems based on SPO for the ATHENA X-ray telescope, from pore to telescope level, carried out at INAF/OAB and DTU Space under ESA contract. We show ray-tracing results, including an assessment of the misalignments of mirror modules and the impact of stray light. We also provide a detailed description of the diffractive effects expected in an SPO module, from UV light, where aperture diffraction prevails, to X-rays, where surface diffraction plays the major role. Finally, we analyze the results of X-ray tests performed at the BESSY synchrotron, compare them with surface finishing measurements, and estimate the expected HEW degradation caused by X-ray scattering.
Pisters, Ron; Hohnloser, Stefan H; Connolly, Stuart J; Torp-Pedersen, Christian; Naditch-Brûlé, Lisa; Page, Richard L; Crijns, Harry J G M
2014-02-01
This study aimed to assess the safety and cardiovascular outcomes of dronedarone in patients with paroxysmal or persistent atrial fibrillation (AF) and coronary heart disease (CHD). Coronary heart disease is prevalent among AF patients and limits antiarrhythmic drug use because of the drugs' potentially life-threatening ventricular proarrhythmic effects. This post hoc analysis evaluated 1405 patients with paroxysmal or persistent AF and CHD from the ATHENA trial. Follow-up lasted 2.5 years, during which patients received either dronedarone (400 mg twice daily) or a double-blind matching placebo. The primary outcome was time to first cardiovascular hospitalization or death due to any cause. Secondary end points included first hospitalization due to cardiovascular events. The primary outcome occurred in 350 of 737 (47%) placebo patients vs. 252 of 668 (38%) dronedarone patients [hazard ratio (HR) = 0.73; 95% confidence interval (CI) = 0.62-0.86; P = 0.0002], without a significant increase in the number of adverse events. In addition, 42 of 668 patients receiving dronedarone suffered a first acute coronary syndrome, compared with 67 of 737 patients in the placebo group (HR = 0.67; 95% CI = 0.46-0.99; P = 0.04). In this post hoc analysis, dronedarone on top of standard care in AF patients with CHD reduced cardiovascular hospitalization or death to an extent similar to that in the overall ATHENA population, and reduced the rate of a first acute coronary syndrome. Importantly, the safety profile in this subpopulation was also similar to that of the overall ATHENA population, with no excess of proarrhythmias. The mechanism of the cardiovascular protective effects is unclear and warrants further investigation.
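The 47% and 38% figures quoted above follow directly from the event counts; a minimal check of that arithmetic. Note that the hazard ratio itself comes from a time-to-event (Cox) analysis and cannot be recovered from these crude proportions:

```python
# Crude event proportions from the counts quoted in the abstract.
# These reproduce the 47% vs. 38% figures; the hazard ratio (0.73)
# is a time-to-event quantity and is NOT recoverable from them.
placebo_events, placebo_n = 350, 737
drone_events, drone_n = 252, 668

placebo_rate = placebo_events / placebo_n
drone_rate = drone_events / drone_n
print(f"placebo: {placebo_rate:.0%}, dronedarone: {drone_rate:.0%}")
```

The crude relative risk (about 0.79 here) differs from the reported HR of 0.73 because the HR also accounts for when events occurred during the 2.5-year follow-up.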
2010-01-01
Background Growing interest and burgeoning technology for discovering genetic mechanisms that influence disease processes have ushered in a flood of genetic association studies over the last decade, yet little heritability in highly studied complex traits has been explained by genetic variation. Non-additive gene-gene interactions, which are not often explored, are thought to be one source of this "missing" heritability. Methods Stochastic methods employing evolutionary algorithms have demonstrated promise in being able to detect and model gene-gene and gene-environment interactions that influence human traits. Here we demonstrate modifications to a neural network algorithm in ATHENA (the Analysis Tool for Heritable and Environmental Network Associations) resulting in clear performance improvements for discovering gene-gene interactions that influence human traits. We employed an alternative tree-based crossover, backpropagation for locally fitting neural network weights, and incorporation of domain knowledge obtainable from publicly accessible biological databases for initializing the search for gene-gene interactions. We tested these modifications in silico using simulated datasets. Results We show that the alternative tree-based crossover modification resulted in a modest increase in the sensitivity of the ATHENA algorithm for discovering gene-gene interactions. The performance increase was highly statistically significant when backpropagation was used to locally fit NN weights. We also demonstrate that using domain knowledge to initialize the search for gene-gene interactions results in a large performance increase, especially when the search space is larger than the search coverage. 
Conclusions We show that a hybrid optimization procedure, alternative crossover strategies, and incorporation of domain knowledge from publicly available biological databases can result in marked increases in sensitivity and performance of the ATHENA algorithm for detecting and modelling gene-gene interactions that influence a complex human trait. PMID:20875103
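The tree-based crossover described above is, in essence, the standard genetic-programming subtree-swap operator. A minimal, self-contained sketch of that operator follows; the data structures and function names are ours for illustration, not ATHENA's API, and real grammatical-evolution crossover adds constraints (depth limits, type matching) omitted here:

```python
import copy
import random

# Minimal sketch of tree-based (subtree-swap) crossover. Expression
# trees are nested lists: [op, child, child, ...]; leaves are strings.

def collect_paths(tree, path=()):
    """Return the path (tuple of child indices) to every node."""
    paths = [path]
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            paths.extend(collect_paths(child, path + (i,)))
    return paths

def get_node(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def set_node(tree, path, subtree):
    for i in path[:-1]:
        tree = tree[i]
    tree[path[-1]] = subtree

def subtree_crossover(parent_a, parent_b, rng):
    """Swap a random subtree of a copy of each parent; parents unchanged."""
    a, b = copy.deepcopy(parent_a), copy.deepcopy(parent_b)
    pa = rng.choice([p for p in collect_paths(a) if p])  # skip the root
    pb = rng.choice([p for p in collect_paths(b) if p])
    sa, sb = get_node(a, pa), get_node(b, pb)
    set_node(a, pa, sb)
    set_node(b, pb, sa)
    return a, b
```

Usage: `subtree_crossover(["add", "x1", "x2"], ["mul", "x3", "x4"], random.Random(0))` returns two offspring in which one operand of each parent has been exchanged, while the multiset of leaves across the pair is preserved.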
X-ray mirror development and testing for the ATHENA mission
NASA Astrophysics Data System (ADS)
Della Monica Ferreira, Desiree; Jakobsen, Anders C.; Massahi, Sonny; Christensen, Finn E.; Shortt, Brian; Garnæs, Jørgen; Torras-Rosell, Antoni; Krumrey, Michael; Cibik, Levent; Marggraf, Stefanie
2016-07-01
This study reports the development and testing of coatings on silicon pore optics (SPO) substrates, including pre- and post-coating characterisation of the x-ray mirrors using Atomic Force Microscopy (AFM) and X-ray reflectometry (XRR), performed at the 8 keV X-ray facility at DTU Space and with synchrotron radiation in the laboratory of PTB at BESSY II. We report our findings on the surface roughness and coating reflectivity of Ir/B4C coatings at the grazing incidence angles and energies relevant to ATHENA, and on the long-term stability of Ir/B4C, Pt/B4C, W/Si and W/B4C coatings.
A systematic analysis of the XMM-Newton background: I. Dataset and extraction procedures
NASA Astrophysics Data System (ADS)
Marelli, Martino; Salvetti, David; Gastaldello, Fabio; Ghizzardi, Simona; Molendi, Silvano; Luca, Andrea De; Moretti, Alberto; Rossetti, Mariachiara; Tiengo, Andrea
2017-12-01
XMM-Newton is the direct precursor of the future ESA ATHENA mission. A study of its particle-induced background therefore provides significant insight for the ATHENA mission design. We make use of ~12 years of data, products from the third XMM-Newton catalog as well as from the FP7 EXTraS project, to avoid contamination from celestial sources and to disentangle the different components of the XMM-Newton particle-induced background. Within the ESA R&D AREMBES collaboration, we built new analysis pipelines to study the different components of this background, covering time behavior as well as spectral and spatial characteristics.
Simulating x-ray telescopes with McXtrace: a case study of ATHENA's optics
NASA Astrophysics Data System (ADS)
Ferreira, Desiree D. M.; Knudsen, Erik B.; Westergaard, Niels J.; Christensen, Finn E.; Massahi, Sonny; Shortt, Brian; Spiga, Daniele; Solstad, Mathias; Lefmann, Kim
2016-07-01
We use the X-ray ray-tracing package McXtrace to simulate the performance of X-ray telescopes based on Silicon Pore Optics (SPO) technologies. As a reference we use the design of the optics of the planned X-ray mission, the Advanced Telescope for High ENergy Astrophysics (ATHENA), which is designed as a single X-ray telescope populated with stacked SPO substrates forming mirror modules to focus X-ray photons. We show that it is possible to simulate the SPO pores in detail, and qualify the use of McXtrace for in-depth analysis of in-orbit performance and laboratory X-ray test results.
2001-05-29
KODIAK ISLAND, Alaska -- A special platform connects the barge with a ramp to allow Castor 120, the first stage of the Athena 1 launch vehicle, to safely move onto the dock at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-07-31
KODIAK ISLAND, Alaska -- Technicians prepare the Starshine 3 payload for its launch aboard the Athena 1 launch vehicle, while the payload fairing awaits processing, at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-31
KODIAK ISLAND, Alaska -- Technicians inspect and secure Castor 120, the first stage of the Athena 1 launch vehicle, on the launch mount at Kodiak Island, Alaska, as processing for the launch of Kodiak Star proceeds. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
2001-05-29
KODIAK ISLAND, Alaska -- A boat moves a ramp into place that will allow Castor 120, the first stage of the Athena 1 launch vehicle, to safely move onto the dock at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
Development and production of a multilayer-coated x-ray reflecting stack for the Athena mission
NASA Astrophysics Data System (ADS)
Massahi, S.; Ferreira, D. D. M.; Christensen, F. E.; Shortt, B.; Girou, D. A.; Collon, M.; Landgraf, B.; Barriere, N.; Krumrey, M.; Cibik, L.; Schreiber, S.
2016-07-01
The Advanced Telescope for High-Energy Astrophysics, Athena, selected as the European Space Agency's second large-class mission, is based on the novel Silicon Pore Optics X-ray mirror technology. DTU Space has been working for several years on the development of multilayer coatings on the Silicon Pore Optics, in an effort to optimize the throughput of the Athena optics. A linearly graded Ir/B4C multilayer has been deposited on the mirrors at DTU Space via the direct-current magnetron sputtering technique. This specific multilayer has been shown through simulations to produce the highest reflectivity at 6 keV, which is a goal for the scientific objectives of the mission. A critical aspect of the coating process concerns the use of photolithography techniques, on which we present the most recent developments, in particular those related to the cleanliness of the plates. Experiments regarding the lift-off and stacking of the mirrors have been performed, and the results obtained are presented. Furthermore, characterization of the deposited thin films was performed with X-ray reflectometry at DTU Space and in the laboratory of the Physikalisch-Technische Bundesanstalt at the synchrotron radiation facility BESSY II.
ATLAS offline software performance monitoring and optimization
NASA Astrophysics Data System (ADS)
Chauhan, N.; Kabra, G.; Kittelmann, T.; Langenberg, R.; Mandrysch, R.; Salzburger, A.; Seuster, R.; Ritsch, E.; Stewart, G.; van Eldik, N.; Vitillo, R.; Atlas Collaboration
2014-06-01
In a complex multi-developer, multi-package software environment such as the ATLAS offline framework Athena, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide the optimization work. The first tool we used to instrument the code is PAPI, a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles, instructions and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event yields a good understanding of the algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pin tools can be used to obtain statistics similar to PAPI's, but advantageously without requiring recompilation of the code. Fine-grained routine- and instruction-level instrumentation is also possible. Pin tools can additionally interrogate the arguments to functions, like those in linear algebra libraries, so that a detailed usage profile can be obtained. These tools have characterized the extensive use of vector and matrix operations in ATLAS tracking. Currently, CLHEP is used here, which is not an optimal choice. To help evaluate replacement libraries, a testbed has been set up allowing comparison of the performance of different linear algebra libraries (including CLHEP, Eigen and SMatrix/SVector). Results are then presented via the ATLAS Performance Management Board framework, which runs daily with the current development branch of the code and monitors reconstruction and Monte-Carlo jobs. This framework analyses the CPU and memory performance of algorithms, and an overview of the results is presented on a web page.
These tools have provided the insight necessary to plan and implement performance enhancements in ATLAS code by identifying the most common operations, characterizing their call parameters, and allowing improvements to be quantified in detail.
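The per-algorithm start/stop counting pattern described above can be sketched in a few lines. PAPI itself is a C library, so the snippet below is only a hypothetical Python analogue that accumulates wall-clock time (rather than hardware counter values) per named algorithm across events; none of the names are real ATLAS or PAPI identifiers:

```python
import time
from collections import defaultdict

class AlgorithmProfiler:
    """Accumulate a per-algorithm counter across processed events.

    Stands in for PAPI-style start/stop counting; here the 'counter'
    is wall-clock time instead of a hardware event count.
    """
    def __init__(self):
        self.totals = defaultdict(float)  # seconds per algorithm
        self.calls = defaultdict(int)     # invocations per algorithm

    def execute(self, name, algorithm, event):
        start = time.perf_counter()                        # PAPI start analogue
        result = algorithm(event)
        self.totals[name] += time.perf_counter() - start   # PAPI stop analogue
        self.calls[name] += 1
        return result

profiler = AlgorithmProfiler()
for event in range(100):
    profiler.execute("tracking", lambda e: sum(i * i for i in range(200)), event)

print(profiler.calls["tracking"], "calls,",
      f"{profiler.totals['tracking']:.4f}s total")
```

In the real framework the equivalent hooks would wrap hardware counter reads around each algorithm's execution for every event.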
NASA Astrophysics Data System (ADS)
Bavdaz, Marcos; Wille, Eric; Shortt, Brian; Fransen, Sebastiaan; Collon, Maximilien; Vacanti, Giuseppe; Günther, Ramses; Yanson, Alexei; Vervest, Mark; Haneveld, Jeroen; van Baren, Coen; Zuknik, Karl-Heinz; Christensen, Finn; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Valsecchi, Giuseppe
2015-09-01
The Advanced Telescope for High ENergy Astrophysics (Athena) was selected in 2014 as the second large-class mission (L2) of the ESA Cosmic Vision Science Programme within the Directorate of Science and Robotic Exploration. The mission development is proceeding via system studies and, in parallel, a comprehensive series of technology preparation activities [1-3]. The core enabling technology for the high-performance mirror is the Silicon Pore Optics (SPO), a modular X-ray optics technology that utilises processes and equipment developed for the semiconductor industry [4-31]. This paper provides an overview of the programmatic background and the status of SPO technology, and gives an outline of the development roadmap and the activities undertaken and planned by ESA.
Development of ATHENA mirror modules
NASA Astrophysics Data System (ADS)
Collon, Maximilien J.; Vacanti, Giuseppe; Barrière, Nicolas M.; Landgraf, Boris; Günther, Ramses; Vervest, Mark; van der Hoeven, Roy; Dekker, Danielle; Chatbi, Abdel; Girou, David; Sforzini, Jessica; Beijersbergen, Marco W.; Bavdaz, Marcos; Wille, Eric; Fransen, Sebastiaan; Shortt, Brian; Haneveld, Jeroen; Koelewijn, Arenda; Booysen, Karin; Wijnperle, Maurice; van Baren, Coen; Eigenraam, Alexander; Müller, Peter; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Massahi, Sonny; Christensen, Finn E.; Della Monica Ferreira, Desirée.; Valsecchi, Giuseppe; Oliver, Paul; Checquer, Ian; Ball, Kevin; Zuknik, Karl-Heinz
2017-08-01
Silicon Pore Optics (SPO), developed at cosine with the European Space Agency (ESA) and several academic and industrial partners, provides lightweight, yet stiff, high-resolution X-ray optics. This technology enables ATHENA to reach an unprecedentedly large effective area in the 0.2-12 keV band with an angular resolution better than 5''. After the technology was developed for 50 m and 20 m focal lengths, this year has seen the first 12 m focal length mirror modules being produced. The technology development is also gaining momentum, with three different radii under study: mirror modules for the inner radius (Rmin = 250 mm), outer radius (Rmax = 1500 mm) and middle radius (Rmid = 737 mm) are being developed in parallel.
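As a rough back-of-the-envelope check (not a calculation from the paper), the grazing angle of a Wolter-I type optic scales with mirror radius R and focal length f approximately as θ ≈ R/(4f). Applying this textbook small-angle relation to the quoted radii at the 12 m focal length gives graze angles from about 0.3° to 1.8°:

```python
import math

def graze_angle_deg(radius_mm, focal_length_m=12.0):
    """Approximate Wolter-I grazing angle, theta ~ R / (4 f), in degrees."""
    return math.degrees((radius_mm / 1000.0) / (4.0 * focal_length_m))

for name, r_mm in {"Rmin": 250.0, "Rmid": 737.0, "Rmax": 1500.0}.items():
    print(f"{name}: {graze_angle_deg(r_mm):.2f} deg")  # 0.30, 0.88, 1.79
```

The outer modules thus operate at the shallower end of the X-ray reflectivity budget at high energies, which is one reason the three radius classes are developed in parallel.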
2001-06-19
KODIAK ISLAND, Alaska -- Technicians lower the fueled Orbit Adjust Module (OAM), which navigates payloads into the correct orbit, onto the Orbis 21D Equipment Section Boost Motor, the second stage of the Athena 1 launch vehicle, at the launch pad at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed.
2001-06-19
KODIAK ISLAND, Alaska -- The fueled Orbit Adjust Module (OAM), which navigates payloads into the correct orbit, is installed onto the Orbis 21D Equipment Section Boost Motor, the second stage of the Athena 1 launch vehicle, at the launch pad at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed.
2001-05-29
KODIAK ISLAND, Alaska -- The Orbis 21D Equipment Section Boost Motor, the second stage of the Athena 1 launch vehicle, waits for the first stage, Castor 120, to be towed up the steepest part of the road, as preparations to launch Kodiak Star proceed.
NASA Astrophysics Data System (ADS)
Barbera, M.; Branduardi-Raymont, G.; Collura, A.; Comastri, A.; Eder, J.; Kamisiński, T.; Lo Cicero, U.; Meidinger, N.; Mineo, T.; Molendi, S.; Parodi, G.; Pilch, A.; Piro, L.; Rataj, M.; Rauw, G.; Sciortino, L.; Sciortino, S.; Wawer, P.
2015-08-01
ATHENA is the L2 mission selected by ESA to pursue the science theme "Hot and Energetic Universe" (launch scheduled in 2028). One of the key instruments of ATHENA is the Wide Field Imager (WFI), which will provide imaging in the 0.1-15 keV band over a large 40' x 40' field of view, together with spectrally and time-resolved photon counting. The WFI camera, based on arrays of DEPFET active pixel sensors, is also sensitive to UV/visible photons. Optically generated electron-hole pairs may degrade the spectral resolution as well as change the energy scale by introducing a signal offset. For this reason, an X-ray transparent optical blocking filter is needed to allow the observation of all types of X-ray sources that have a bright UV/visible counterpart. In this paper, we describe the main activities that we are carrying out for the conceptual design of the optical blocking filter, which will be mounted on the filter wheel, in order to satisfy the scientific requirements on the optical load from bright UV/visible astrophysical sources, to maximize the X-ray transmission, and to withstand the severe acoustic and vibration loads foreseen during launch.
The Emission and Chemistry of Reactive Nitrogen Species in the Plume of an Athena II Rocket
NASA Astrophysics Data System (ADS)
Popp, P. J.; Gao, R. S.; Neuman, J. A.; Northway, M. J.; Holecek, J. C.; Fahey, D. W.; Wiedinmyer, C.; Brock, C. A.; Ridley, B. A.; Walega, J. G.; Grahek, F. E.; Wilson, J. C.; Reeves, J. M.; Toohey, D. W.; Avallone, L. M.; Thornton, B. F.; Gates, A. M.; Ross, M. N.; Zittel, P. F.
2001-12-01
In situ measurements of total reactive nitrogen (NOy), nitric acid (HNO3), and particles were conducted in the plume of an Athena II rocket launched from Vandenberg AFB on September 24, 1999. These measurements were obtained onboard the NASA WB-57F high-altitude research aircraft as part of the Atmospheric Chemistry of Combustion Emissions near the Tropopause (ACCENT) mission. The calculated NOy emission index, determined from measurements made during the first 3 of 6 plume intercepts, was 2.1 ± 1.0 g NO2/kg propellant, consistent with far-field rocket plume model calculations. Although nitric oxide (NO) is thought to be the primary NOy species formed in the Athena solid rocket motor (SRM) and by hot afterburning in the plume, measurements in the plume as soon as 4 minutes after emission indicate that HNO3 is the dominant NOy species. In the chlorine-rich plume, NO is converted to chlorine nitrate (ClONO2), which reacts with water on emitted alumina particles to form HNO3. The data suggest HNO3 remains absorbed on alumina particles. With the potential increase in launch vehicle traffic in the coming decades, accurate modeling of the global impact of current and future rocket fleets will require the use of emission indices validated by observations.
NASA Astrophysics Data System (ADS)
Kouhartsiouk, Demetris; Agapiou, Athos; Lynsadrou, Vasiliki; Themistocleous, Kyriacos; Nisantzi, Argyro; Hadjimitsis, Diofantos G.; Lasaponara, Rosa; Masini, Nicola; Brcic, Ramon; Eineder, Michael; Krauss, Thomas; Cerra, Daniele; Gessner, Ursula; Schreier, Gunter
2017-04-01
Non-invasive landscape investigation for archaeological purposes includes a wide range of survey techniques, most of them in-situ methods. In recent years, a major advance in non-invasive surveying has been the introduction of active remote sensing technologies. One such technology is spaceborne radar, known as Synthetic Aperture Radar (SAR). SAR has proven to be a valuable tool in the analysis of potential archaeological marks and in systematic cultural heritage site monitoring. With SAR, it is possible to monitor slight variations in vegetation and soil that are often interpreted as archaeological signs, while radar sensors frequently have penetrating capabilities that offer insight into shallow underground remains. Radar remote sensing for immovable cultural heritage and archaeological applications has recently been introduced to Cyprus through the ongoing ATHENA project. The ATHENA project, under the Horizon 2020 programme, aims at building a bridge between research institutions of the low-performing Member States and internationally leading counterparts at EU level, mainly through training workshops and a series of knowledge transfer activities, frequently taking place on the basis of capacity development. The project consortium comprises the Remote Sensing and Geo-Environment Research Laboratory of the Cyprus University of Technology (CUT), the National Research Council of Italy (CNR) and the German Aerospace Centre (DLR). As part of the project, a number of cultural heritage sites in Cyprus have been studied, testing different methodologies involving SAR imagery such as Amplitude Change Detection, Coherence Calculation and fusion techniques.
ATHENA's prospective agenda includes continuing the capacity building programme with upcoming training workshops, while expanding the knowledge of radar applications for conservation and risk monitoring of cultural heritage sites through SAR interferometry. The current paper presents some preliminary results from the archaeological site of "Nea Paphos", addressing the potential use of radar technology.
Reference payload of the ESA L1 mission candidate ATHENA
NASA Astrophysics Data System (ADS)
Martin, Didier; Rando, Nicola; Lumb, David; Verhoeve, Peter; Oosterbroek, Tim; Bavdaz, Marcos
2012-09-01
The Advanced Telescope for High ENergy Astrophysics (ATHENA) is one of the three candidates that competed for the first large-class mission (L1) in ESA's Cosmic Vision 2015-2025 programme, with a launch planned by 2022, and is the result of the IXO reformulation activities. ATHENA is an ESA-led project conceived as the next-generation X-ray observatory. It is meant to address fundamental questions about accretion around black holes, reveal the physics underpinning cosmic feedback, and trace the large-scale structure of baryons in galaxy clusters and the cosmic web, as well as a large number of astrophysics and fundamental physics phenomena. The observatory consists of two identical mirrors, each illuminating a fixed focal-plane instrument, collectively providing 1 m2 effective area at 1 keV. The reference payload consists of a medium-resolution wide field imager (WFI) and a high-resolution X-ray micro-calorimeter spectrometer (XMS). The WFI is based on a monolithic Si DepFET array providing imaging over a 24 × 24 arcmin2 field of view with good PSF oversampling. The sensor will measure X-rays in the range 0.1-15 keV and provides near Fano-limited energy resolution (150 eV at 6 keV). The XMS is based on a micro-calorimeter array operating at its transition temperature of ~100 mK and provides <3 eV resolution. The detector array consists of 32 × 32 pixels covering a 2.3 × 2.3 arcmin2 field of view, co-aligned with the WFI. This paper summarizes the results of the reformulation exercise and provides details on the payload complement and its accommodation on the spacecraft. Following the ESA Science Programme Committee decision on the L1 mission in May 2012, ATHENA was not selected to enter Definition Phase.
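The quoted WFI figure of 150 eV FWHM at 6 keV can be compared against the theoretical Fano limit for a silicon detector, ΔE_FWHM = 2.355·√(F·w·E). With the commonly used silicon values F ≈ 0.115 and w ≈ 3.71 eV per electron-hole pair (values assumed here, not taken from the paper), the limit comes out near 120 eV, so 150 eV is indeed "near Fano limited":

```python
import math

def fano_limit_fwhm_ev(energy_ev, fano=0.115, w_pair_ev=3.71):
    """FWHM of the Fano-limited energy resolution of a Si detector, in eV.

    2.355 converts the Gaussian sigma sqrt(F * w * E) to FWHM.
    """
    return 2.355 * math.sqrt(fano * w_pair_ev * energy_ev)

print(fano_limit_fwhm_ev(6000.0))  # ≈ 119 eV at 6 keV
```

The sqrt(E) scaling also explains why the resolution figure is always quoted together with a reference energy.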
NASA Astrophysics Data System (ADS)
Spiga, D.; Della Monica Ferreira, D.; Shortt, B.; Bavdaz, M.; Bergback Knudsen, E.; Bianucci, G.; Christensen, F.; Civitani, M.; Collon, M.; Conconi, P.; Fransen, S.; Marioni, F.; Massahi, S.; Pareschi, G.; Salmaso, B.; Jegers, A. S.; Tayabaly, K.; Valsecchi, G.; Westergaard, N.; Wille, E.
2017-09-01
The ATHENA X-ray observatory is a large-class ESA approved mission, with launch scheduled in 2028. The technology of silicon pore optics (SPO) was selected as the baseline to assemble ATHENA's optics from hundreds of mirror modules, obtained by stacking wedged and ribbed silicon wafer plates onto silicon mandrels to form the Wolter-I configuration. In the current configuration, the optical assembly has a 3 m diameter and a 2 m2 effective area at 1 keV, with a required angular resolution of 5 arcsec. The achievable angular resolution is chiefly the combination of 1) the focal spot size determined by pore diffraction, 2) the focus degradation caused by surface and profile errors, 3) the aberrations introduced by misalignments between primary and secondary segments, and 4) imperfections in the co-focality of the mirror modules in the optical assembly. A detailed simulation of these aspects is required in order to assess the fabrication and alignment tolerances; moreover, the achievable effective area and angular resolution depend on the mirror module design. Guaranteeing these optical performances therefore requires a fast design tool to find the best-performing solution in terms of mirror module geometry and population, and an accurate point spread function simulation based on local metrology and positioning information.
In this paper, we present the results of simulations carried out in the framework of ESA-financed projects (SIMPOSiuM, ASPHEA, SPIRIT) in preparation for the ATHENA X-ray telescope, analyzing the points mentioned above: 1) we give a detailed description of diffractive effects in an SPO mirror module, 2) we show ray-tracing results including surface and profile defects of the reflective surfaces, 3) we assess the effective area and angular resolution degradation caused by alignment errors between the segments of an SPO mirror module, and 4) we simulate the effects of co-focality errors in X-rays and in the UV optical bench used to study mirror module alignment and integration.
2003 Mars Exploration Rover Mission: Robotic Field Geologists for a Mars Sample Return Mission
NASA Technical Reports Server (NTRS)
Ming, Douglas W.
2008-01-01
The Mars Exploration Rover (MER) Spirit landed in Gusev crater on Jan. 4, 2004, and the rover Opportunity arrived on the plains of Meridiani Planum on Jan. 25, 2004. The rovers continue to return new discoveries after 4 continuous Earth years of operations on the surface of the red planet. Spirit has successfully traversed 7.5 km over the Gusev crater plains, ascended to the top of Husband Hill, and entered the Inner Basin of the Columbia Hills. Opportunity has traveled nearly 12 km over the flat plains of Meridiani and descended into several impact craters. Spirit and Opportunity carry an integrated suite of scientific instruments and tools called the Athena science payload. The Athena science payload consists of the 1) Panoramic Camera (Pancam) that provides high-resolution, color stereo imaging, 2) Miniature Thermal Emission Spectrometer (Mini-TES) that provides spectral cubes at mid-infrared wavelengths, 3) Microscopic Imager (MI) for close-up imaging, 4) Alpha Particle X-Ray Spectrometer (APXS) for elemental chemistry, 5) Moessbauer Spectrometer (MB) for the mineralogy of Fe-bearing materials, 6) Rock Abrasion Tool (RAT) for removing dusty and weathered surfaces and exposing fresh rock underneath, and 7) Magnetic Properties Experiment that allows the instruments to study the composition of magnetic martian materials [1]. The primary objective of the Athena science investigation is to explore two sites on the martian surface where water may once have been present, and to assess past environmental conditions at those sites and their suitability for life. The Athena science instruments have made numerous scientific discoveries over more than 4 years of operations. The objectives of this paper are to 1) describe the major scientific discoveries of the MER robotic field geologists and 2) briefly summarize the major outstanding questions that MER did not answer and that might be addressed by returning samples to our laboratories on Earth.
Preliminary Mechanical Characterization of Thermal Filters for the X-IFU Instrument on Athena
NASA Astrophysics Data System (ADS)
Barbera, Marco; Lo Cicero, Ugo; Sciortino, Luisa; Parodi, Giancarlo; D'Anca, Fabio; Giglio, Paolo; Ferruggia Bonura, Salvatore; Nuzzo, Flavio; Jimenez Escobar, Antonio; Ciaravella, Angela; Collura, Alfonso; Varisco, Salvatore; Samain, Valerie
2018-05-01
The X-ray Integral Field Unit (X-IFU) is one of the two instruments of the Athena astrophysics space mission approved by ESA in the Cosmic Vision Science Program. The X-IFU consists of a large array of TES microcalorimeters that will operate at 50 mK inside a sophisticated cryostat. A set of thin filters, highly transparent to X-rays, will be mounted on the cryostat thermal shields in order to attenuate the IR radiative load, to attenuate RF electromagnetic interference, and to protect the detector from contamination. In this paper, we present the current thermal filter design, describe the filter samples developed and procured so far, and present preliminary results from the ongoing characterization tests.
2001-06-19
KODIAK ISLAND, Alaska -- The Orbis 21D Equipment Section Boost Motor, the second stage of the Athena 1 launch vehicle, awaits the installation of the Orbit Adjust Module (OAM), which navigates the payloads into the correct orbit, at the launch pad at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed.
2001-09-05
KODIAK ISLAND, Alaska -- Inside the Launch Service Structure at Kodiak Launch Complex (KLC), the final stage of the Athena I launch vehicle, with the Kodiak Star spacecraft, is maneuvered into place. KLC is the newest commercial launch complex in the United States, ideal for launching payloads requiring low-Earth polar or sun-synchronous orbits.
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1992-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11/780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in reverse engineering as well as forward engineering modes, with a level of flexibility suitable for practical application.
NASA Astrophysics Data System (ADS)
van der Kuur, J.; Gottardi, L. G.; Akamatsu, H.; van Leeuwen, B. J.; den Hartog, R.; Haas, D.; Kiviranta, M.; Jackson, B. J.
2016-07-01
Athena is a space-based X-ray observatory intended for the exploration of the hot and energetic universe. One of the science instruments on Athena will be the X-ray Integral Field Unit (X-IFU), a cryogenic X-ray spectrometer based on a large cryogenic imaging array of Transition Edge Sensor (TES) microcalorimeters operating at a temperature of 100 mK. The imaging array consists of 3800 pixels providing 2.5 eV spectral resolution, and covers a field of view with a diameter of 5 arcminutes. Multiplexed readout of the cryogenic microcalorimeter array is essential to comply with the cooling power and complexity constraints on a spacecraft. Frequency domain multiplexing has been under development for the readout of TES-based detectors for this purpose, not only for the X-IFU detector arrays but also for the TES-based bolometer arrays of the Safari instrument on the Japanese SPICA observatory. This paper discusses the design considerations applicable to optimising the multiplex factor within the boundary conditions set by the spacecraft. More specifically, the interplay between science requirements such as pixel dynamic range, pixel speed, and cross talk, and spacecraft requirements such as the power dissipation budget, available bandwidth, and electromagnetic compatibility will be discussed.
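The principle of frequency domain multiplexing can be illustrated with a minimal, idealized sketch: each pixel's slow signal amplitude-modulates its own carrier, the carriers are summed onto a single line, and each signal is recovered by lock-in demodulation against its carrier. The frequencies and amplitudes below are illustrative only, not X-IFU parameters:

```python
import math

fs = 200_000.0             # sample rate, Hz (illustrative)
f1, f2 = 1_000.0, 3_000.0  # carrier frequencies, Hz
a1, a2 = 0.7, 0.3          # slowly varying pixel "signals" (held constant here)
n = 20_000                 # 0.1 s: an integer number of periods of both carriers

t = [i / fs for i in range(n)]
# Summed comb: each pixel amplitude-modulates its own carrier on one shared line
comb = [a1 * math.sin(2 * math.pi * f1 * ti) +
        a2 * math.sin(2 * math.pi * f2 * ti) for ti in t]

def demodulate(signal, f):
    # Lock-in: multiply by the reference carrier and average;
    # the factor 2 restores the modulation amplitude.
    return 2.0 * sum(s * math.sin(2 * math.pi * f * ti)
                     for s, ti in zip(signal, t)) / len(signal)

print(demodulate(comb, f1))  # ≈ 0.7
print(demodulate(comb, f2))  # ≈ 0.3
```

Carrier orthogonality over the integration window is what keeps cross talk low; in a real system, finite bandwidth and non-ideal filters reintroduce it, which is one of the trade-offs the paper discusses.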
The new Athena alpha particle X-ray spectrometer for the Mars Exploration Rovers
NASA Astrophysics Data System (ADS)
Rieder, R.; Gellert, R.; Brückner, J.; Klingelhöfer, G.; Dreibus, G.; Yen, A.; Squyres, S. W.
2003-11-01
The new alpha particle X-ray spectrometer (APXS) is part of the Athena payload of the two Mars Exploration Rovers (MER). The APXS sensor head is attached to the turret of the instrument deployment device (IDD) of the rover. The APXS is a very lightweight instrument for determining the major and minor elemental composition of Martian soils, rocks, and other geological materials at the MER landing sites. The sensor head simply has to be docked by the IDD on the surface of the selected sample. X-ray radiation, excited by alpha particles and X-rays from the radioactive sources, is recorded by a high-resolution X-ray detector. The X-ray spectra show elements from sodium up to yttrium, depending on their concentrations. The backscattered alpha spectra, measured by a ring of detectors, provide additional data on carbon and oxygen. By means of a proper calibration, the elemental concentrations are derived. Together with data from the two other Athena instruments mounted on the IDD, the samples under investigation can be fully characterized. Key APXS objectives are the determination of the chemistry of crustal rocks and soils and the examination of water-related deposits, sediments, or evaporites. Using the rock abrasion tool attached to the IDD, issues of weathering can be addressed by measuring natural and abraded surfaces of rocks.
Athena X-IFU event reconstruction software: SIRENA
NASA Astrophysics Data System (ADS)
Ceballos, Maria Teresa; Cobo, Beatriz; Peille, Philippe; Wilms, Joern; Brand, Thorsten; Dauser, Thomas; Bandler, Simon; Smith, Stephen
2015-09-01
This contribution describes the status and technical details of the SIRENA package, the software currently in development to perform the on-board event energy reconstruction for the Athena calorimeter X-IFU. This on-board processing will be done in the X-IFU DRE unit and will consist of an initial triggering of event pulses followed by an analysis (with the SIRENA package) to determine the energy content of those events. The current algorithm used by SIRENA is the optimal filtering technique (also used by the ASTRO-H processor), although other algorithms are being tested as well. Here we present these studies and some preliminary results on the energy resolution of the instrument, based on simulations performed with the SIXTE simulator (http://www.sternwarte.uni-erlangen.de/research/sixte/), in which SIRENA is integrated.
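For stationary white noise, the optimal filtering technique mentioned above reduces to projecting each triggered pulse record onto a pulse template. The sketch below uses an illustrative double-exponential template and a made-up noise level; nothing here is taken from SIRENA:

```python
import math
import random

random.seed(1)
n = 512
tau_r, tau_f = 5.0, 50.0  # rise/fall times in samples (illustrative)

# Pulse-shape template: double exponential, unit-energy by construction
template = [math.exp(-i / tau_f) - math.exp(-i / tau_r) for i in range(n)]

def reconstruct_energy(record, tmpl):
    """Matched (optimal) filter for white noise: least-squares amplitude of
    the template in the record, E = sum(r*t) / sum(t*t)."""
    return sum(r * t for r, t in zip(record, tmpl)) / sum(t * t for t in tmpl)

true_e = 5.9  # arbitrary energy units
pulse = [true_e * t + random.gauss(0.0, 0.05) for t in template]
print(reconstruct_energy(pulse, template))  # close to 5.9
```

For colored noise the projection is instead done in the frequency domain with noise-weighted templates, which is the form actually used in calorimeter processors.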
The filter and calibration wheel for the ATHENA wide field imager
NASA Astrophysics Data System (ADS)
Rataj, M.; Polak, S.; Palgan, T.; Kamisiński, T.; Pilch, A.; Eder, J.; Meidinger, N.; Plattner, M.; Barbera, M.; Parodi, G.; D'Anca, Fabio
2016-07-01
The planned filter and calibration wheel for the Wide Field Imager (WFI) instrument on Athena is presented. With four selectable positions, it provides the necessary functions, in particular a UV/VIS blocking filter for the WFI detectors and a calibration source. Challenges for the filter wheel design are the large volume and mass of the subsystem, the implementation of a robust mechanism, and the protection of the ultra-thin filter with an area of 160 mm square. This paper describes the trade-offs performed on the basis of simulation results and describes the baseline design in detail. Reliable solutions are envisaged for the conceptual design of the filter and calibration wheel. Four variants with different filter positions are presented. Risk mitigation and compliance with the design requirements are demonstrated.
NASA Astrophysics Data System (ADS)
Lumb, D.
2016-07-01
Athena has been selected by ESA for its second large mission opportunity of the Cosmic Vision programme, to address the theme of the Hot and Energetic Universe. Following the submission of a proposal from the community, the technical and programmatic aspects of the mission design were reviewed in ESA's Concurrent Design Facility. The proposed concept was deemed to be technically feasible, but with potential constraints from cost and schedule. Two parallel industry study contracts have been conducted to explore these conclusions more thoroughly, with the key aim of providing consolidated inputs to a Mission Consolidation Review conducted in April-May 2016. This MCR has recommended a baseline design, which allows the agency to solicit proposals for a community-provided payload. Key design aspects arising from the studies are described, and the new reference design is summarised.
Optical integration of SPO mirror modules in the ATHENA telescope
NASA Astrophysics Data System (ADS)
Valsecchi, G.; Marioni, F.; Bianucci, G.; Zocchi, F. E.; Gallieni, D.; Parodi, G.; Ottolini, M.; Collon, M.; Civitani, M.; Pareschi, G.; Spiga, D.; Bavdaz, M.; Wille, E.
2017-08-01
ATHENA (Advanced Telescope for High-ENergy Astrophysics) is the next high-energy astrophysics mission selected by the European Space Agency, with launch planned in 2028. The X-ray telescope consists of 1062 silicon pore optics mirror modules with a target angular resolution of 5 arcsec. Each module must be integrated on a 3 m structure with an accuracy of 1.5 arcsec for alignment and assembly. An industrial and scientific team is developing the alignment and integration process for the SPO mirror modules based on ultraviolet imaging at the 12 m focal plane. This technique promises to meet the accuracy requirement while, at the same time, allowing an arbitrary integration sequence and mirror module exchangeability. Moreover, it enables monitoring of the telescope point spread function during the planned 3-year integration phase.
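For a sense of scale (an illustrative calculation, not from the paper): at a 12 m focal distance, an angular error of 1.5 arcsec corresponds to a focal-plane image displacement of roughly 90 µm, which indicates the positional precision the UV imaging metrology has to resolve:

```python
import math

ARCSEC_RAD = math.pi / (180.0 * 3600.0)  # one arcsecond in radians

def focal_plane_shift_um(arcsec, focal_length_m=12.0):
    """Lateral focal-plane displacement (micrometres) for a small angular error."""
    return arcsec * ARCSEC_RAD * focal_length_m * 1e6

print(f"{focal_plane_shift_um(1.5):.0f} um")  # 87 um
```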
1975-09-01
This report assumes familiarity with the GIFT and MAGIC computer codes. EDIT-COMGEOM is a FORTRAN computer code that converts the target description data used in the MAGIC computer code into target description data that can be used in the GIFT computer code.
Overview of Athena Microscopic Imager Results
NASA Technical Reports Server (NTRS)
Herkenhoff, K.; Squyres, S.; Arvidson, R.; Bass, D.; Bell, J., III; Bertelsen, P.; Cabrol, N.; Ehlmann, B.; Farrand, W.; Gaddis, L.
2005-01-01
The Athena science payload on the Mars Exploration Rovers (MER) includes the Microscopic Imager (MI). The MI is a fixed-focus camera mounted on an extendable arm, the Instrument Deployment Device (IDD). The MI acquires images at a spatial resolution of 31 microns/pixel over a broad spectral range (400-700 nm). The MI uses the same electronics design as the other MER cameras, but its optics yield a field of view of 32 × 32 mm across a 1024 × 1024 pixel CCD image. The MI acquires images using only solar or skylight illumination of the target surface. The MI science objectives, instrument design and calibration, operation, and data processing were described by Herkenhoff et al. Initial results of the MI experiment on both MER rovers (Spirit and Opportunity) have been published previously. Highlights of these and more recent results are described.
Optical design for ATHENA X-ray telescope based on slumped mirror segments
NASA Astrophysics Data System (ADS)
Proserpio, Laura; Breunig, Elias; Friedrich, Peter; Winter, Anita
2014-07-01
The Hot and Energetic Universe will be the focus of future ESA missions: in late 2013 the theme was selected for the second large-class mission in the Cosmic Vision science program. Fundamental questions about how and why ordinary matter assembles into galaxies and clusters, and how black holes grow and influence their surroundings, can be addressed with an advanced X-ray observatory. The currently proposed ATHENA mission has the potential to answer these outstanding questions. It is based on the heritage of XMM-Newton and on the previous studies for the IXO mission. The scientific payload will require state-of-the-art instrumentation. In particular, the baseline for the X-ray optical system, delivering a combination of large area, high angular resolution, and large field of view, is the Silicon Pore Optics (SPO) technology developed by ESA in conjunction with cosine Measurement Systems. Slumping technology is also under development for the manufacturing of future X-ray telescopes: for several years the Max Planck Institute for Extraterrestrial Physics (MPE) has been investigating the indirect slumping approach, in which segmented X-ray shells are manufactured by shaping thin glass foils at high temperature over concave moulds, avoiding any contact of the optical surface with other materials during the process and thus preserving the original X-ray quality of the glass surface. The paper presents an alternative optical design for ATHENA based on the use of thin glass mirror segments obtained through slumping.
Naccarelli, Gerald V; Wolbrette, Deborah L; Samii, Soraya; Banchs, Javier E; Penny-Peterson, Erica; Gonzalez, Mario D
2010-12-01
Dronedarone is a multichannel blocker with electrophysiologic effects similar to amiodarone. Dronedarone has been documented to prevent atrial fibrillation recurrences and also has efficacy in slowing the ventricular response during episodes of atrial fibrillation. However, in the ANDROMEDA trial, dronedarone was associated with increased mortality when tested in New York Heart Association (NYHA) class III/IV patients with left ventricular ejection fractions of less than 35% who also had a recent hospitalization for decompensated heart failure. When such high-risk patients with heart failure were excluded in the ATHENA trial, dronedarone treatment resulted in a statistically significant reduction in the composite primary end point of all-cause mortality or cardiovascular hospitalization. In ATHENA, dronedarone reduced cardiovascular hospitalizations even though, in the DIONYSOS trial, dronedarone had less effect than amiodarone on suppressing atrial fibrillation recurrences. The most appropriate patients for treatment with dronedarone are those with a recent history of paroxysmal or persistent atrial fibrillation/atrial flutter (AF/AFL) who have associated risk factors per the inclusion criteria of ATHENA. Inappropriate patients are those with class IV heart failure or those hospitalized for heart failure within the last month because of an acute decompensation, the main inclusion criteria in ANDROMEDA. Dronedarone is a novel, multichannel-blocking antiarrhythmic agent that may have some pleiotropic effects in addition to its ability to maintain sinus rhythm and control the rate during AF/AFL recurrences.
Event processing in X-IFU detector onboard Athena.
NASA Astrophysics Data System (ADS)
Ceballos, M. T.; Cobos, B.; van der Kuurs, J.; Fraga-Encinas, R.
2015-05-01
The X-ray Observatory ATHENA was proposed in April 2014 as the mission to implement the science theme "The Hot and Energetic Universe" selected by ESA for L2 (the second Large-class mission in ESA's Cosmic Vision science programme). One of the two X-ray detectors designed to be on board ATHENA is X-IFU, a cryogenic microcalorimeter based on Transition Edge Sensor (TES) technology that will provide spatially resolved high-resolution spectroscopy. X-IFU will be developed by a consortium of European research institutions, currently from France (leadership), Italy, The Netherlands, Belgium, the UK, Germany and Spain. From Spain, IFCA (CSIC-UC) is involved in the Digital Readout Electronics (DRE) unit of the X-IFU detector, in particular in the Event Processor Subsystem. We at IFCA are in charge of the development and implementation in the DRE unit of the Event Processing algorithms, designed to recognize, in a noisy signal, the intensity pulses generated by the absorption of X-ray photons and subsequently extract their main parameters (coordinates, energy, arrival time, grade, etc.). Here we present the design and performance of the algorithms developed for event recognition (adjusted derivative) and pulse grading/qualification, as well as the progress in the algorithms designed to extract the energy content of the pulses (pulse optimal filtering). IFCA will finally have responsibility for the on-board implementation in the (TBD) FPGAs or microprocessors of the DRE unit, where this Event Processing will take place, so that the output fits into the limited telemetry of the instrument.
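The event-recognition step described above can be illustrated with a minimal sketch: flag candidate pulses wherever the first difference of the record exceeds a noise-derived threshold. The function, threshold value, and synthetic record below are illustrative assumptions only; the actual X-IFU adjusted-derivative algorithm differs in detail.

```python
import numpy as np

def detect_pulses(signal, threshold, min_separation=10):
    """Return sample indices where the first difference of the record
    exceeds `threshold`, suppressing re-triggers closer than
    `min_separation` samples (a crude stand-in for the X-IFU
    'adjusted derivative' event recognition)."""
    deriv = np.diff(signal)
    events = []
    for idx in np.flatnonzero(deriv > threshold):
        if not events or idx - events[-1] >= min_separation:
            events.append(int(idx))
    return events

# Synthetic record: two exponentially decaying pulses on Gaussian noise.
rng = np.random.default_rng(0)
t = np.arange(2000)
record = rng.normal(0.0, 0.02, t.size)
for start in (400, 1300):
    record[start:] += np.exp(-(t[start:] - start) / 50.0)

events = detect_pulses(record, threshold=0.3)
print(events)
```

A real implementation would additionally grade each event by its separation from its neighbours (pulse pile-up) before passing it to the optimal filter.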
The Interaction of High-Speed Turbulence with Flames
NASA Astrophysics Data System (ADS)
Poludnenko, Alexei Y.; Oran, E. S.
2010-01-01
Interaction of flames with turbulence occurs in systems ranging from chemical flames on Earth to thermonuclear burning fronts, which are presently believed to be the key component of the explosion mechanism powering Type Ia supernovae. A number of important questions remain concerning the dynamics of turbulent flames in the presence of high-speed turbulence, the flame structure and stability, and the ability of the turbulent cascade to penetrate and disrupt the flame, creating the distributed mode of burning. We present results of a systematic study of the dynamics and properties of turbulent flames formed under the action of high-speed turbulence, using simplified one-step kinetics similar to those used to describe hydrogen combustion. This approach makes large-scale, highly resolved simulations computationally feasible and allows one to focus on the turbulence-flame interaction in a simplified, controlled setting. Numerical simulations were performed using the massively parallel reactive-flow code Athena-RFX. We discuss global properties of the turbulent flame in this regime (flame width, speed, etc.) and the internal structure of the flame brush. A method is presented for directly reconstructing the internal flame structure, and it is shown that correct characterization of the flame regime can be very sensitive to the choice of diagnostic method. We discuss the ability of the turbulent cascade to penetrate the internal flame structure. Finally, we consider the processes that determine the turbulent burning velocity and identify two distinct regimes of flame evolution. This work was supported in part by the National Research Council, Naval Research Laboratory, and the Office of Naval Research, and by the National Science Foundation through the TeraGrid resources.
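As a concrete illustration of the simplified chemistry mentioned above, a single-step Arrhenius model expresses the fuel consumption rate as dY/dt = −A·Y·exp(−T_act/T). The constants below are placeholders, not the calibrated hydrogen-model values used in Athena-RFX.

```python
import numpy as np

def one_step_rate(Y, T, A=1.0e8, T_act=8000.0):
    """Fuel mass-fraction consumption rate for one-step Arrhenius kinetics.
    A is a pre-exponential factor [1/s] and T_act an activation
    temperature [K]; both values are illustrative placeholders."""
    return -A * Y * np.exp(-T_act / T)

cold = one_step_rate(1.0, 300.0)    # chemistry effectively frozen
hot = one_step_rate(1.0, 1500.0)    # rapid burning at flame temperatures
```

The strong temperature sensitivity of such a rate is what confines burning to a thin reaction zone, the structure that the turbulent cascade must penetrate to produce the distributed mode of burning.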
Model-based POD study of manual ultrasound inspection and sensitivity analysis using metamodel
NASA Astrophysics Data System (ADS)
Ribay, Guillemette; Artusi, Xavier; Jenson, Frédéric; Reece, Christopher; Lhuillier, Pierre-Emile
2016-02-01
The reliability of NDE can be quantified using the Probability of Detection (POD) approach. Previous studies have shown the potential of the model-assisted POD (MAPOD) approach to replace expensive experimental determination of POD curves. In this paper, we use the CIVA software to determine POD curves for a manual ultrasonic inspection of a heavy component for which a full experimental POD campaign was not available. The influential parameters were determined by expert analysis. The semi-analytical models used in CIVA for wave propagation and beam-defect interaction were validated over the range of variation of the influential parameters by comparison with finite element modelling (Athena). The POD curves are computed for both "hit/miss" and "â versus a" analyses. The Berens hypotheses are verified using statistical tools. A sensitivity study is performed to measure the relative influence of the parameters on the variance of the defect response amplitude, using the Sobol sensitivity index. A metamodel is also built to reduce computing cost and enhance the precision of the estimated index.
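For reference, the "â versus a" analysis mentioned above fits the Berens signal-response model ln â = β₀ + β₁ ln a + N(0, σ²) and converts it into a POD curve via the probability that the response exceeds the decision threshold. A minimal sketch, with all coefficient values hypothetical:

```python
import math

def pod_signal_response(a, beta0, beta1, sigma, a_dec):
    """Berens 'â versus a' POD curve: a flaw of size `a` is detected when
    its response â exceeds the decision threshold `a_dec`, so
    POD(a) = Phi((beta0 + beta1*ln(a) - ln(a_dec)) / sigma)."""
    z = (beta0 + beta1 * math.log(a) - math.log(a_dec)) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Hit/miss data, by contrast, are usually fitted with a logistic or probit regression on ln a.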
Preliminary assessment of the ATHENA/WFI non-X-ray background
NASA Astrophysics Data System (ADS)
Perinati, Emanuele; Barbera, Marco; Diebold, Sebastian; Guzman, Alejandro; Santangelo, Andrea; Tenzer, Chris
2017-12-01
We present a preliminary assessment of the non-X-ray background for the WFI on board ATHENA, conducted at IAAT in the context of the collaborative background and radiation damage working group activities. Our main result is that, in the baseline configuration originally assumed for the camera, the requirement on the level of non-X-ray background could not be met. In light of the results of Geant4 simulations, we propose and discuss a possible optimization of the camera design and pinpoint some open issues to be addressed in the next phase of investigation. One of these concerns the possible contribution to the non-X-ray background from soft protons and ions funneled to the focal plane through the optics. This contribution is not quantified at this stage; here we briefly report on our ongoing activities aimed at validating the mechanisms of proton scattering at grazing incidence.
Investigation of photolithography process on SPOs for the Athena mission
NASA Astrophysics Data System (ADS)
Massahi, S.; Girou, D. A.; Ferreira, D. D. M.; Christensen, F. E.; Jakobsen, A. C.; Shortt, B.; Collon, M.; Landgraf, B.
2015-09-01
As part of the ongoing effort to optimize the throughput of the Athena optics, we have produced mirrors with a state-of-the-art cleaning process. We report on studies of the importance of the photolithographic process. Pre-coating characterization of the mirrors has consistently shown photoresist remnants on the SiO2-rib bonding zones, which influence the quality of the metallic coating and ultimately the mirror performance. The size of the photoresist remnants is on the order of 10 nm, about half the thickness of the final metallic coating. An improved photoresist process has been developed, including cleaning with O2 plasma, in order to remove the remaining photoresist prior to coating. Surface roughness results indicate that the SiO2-rib bonding zones are as clean as before the photolithography process is performed.
ATHENA: system design and implementation for a next-generation x-ray telescope
NASA Astrophysics Data System (ADS)
Ayre, M.; Bavdaz, M.; Ferreira, I.; Wille, E.; Lumb, D.; Linder, M.; Stefanescu, A.
2017-08-01
ATHENA, Europe's next-generation X-ray telescope, is currently under Assessment Phase study with parallel candidate industrial Prime contractors, after selection for the 'L2' slot in ESA's Cosmic Vision Programme with a mandate to address the 'Hot and Energetic Universe' science theme. This paper considers the main technical requirements of the mission and their mapping to the resulting design choices at both mission and spacecraft level. The reference mission architecture and current reference spacecraft design are then described, with particular emphasis on the Science Instrument Module (SIM) design, currently under the responsibility of the ESA Study Team. The SIM is a very challenging item, due primarily to the need to provide the instruments with (i) a soft ride during launch, and (ii) a very large (~3 kW) heat dissipation capability at varying interface temperatures and locations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.
1995-12-31
In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computations. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results computed with the NESTLE code (USA); these computations were performed in the geometry and with the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole, with imitation of control and protection system (CPS) controls movement in a core.
Conceptual design of the X-IFU Instrument Control Unit on board the ESA Athena mission
NASA Astrophysics Data System (ADS)
Corcione, L.; Ligori, S.; Capobianco, V.; Bonino, D.; Valenziano, L.; Guizzo, G. P.
2016-07-01
Athena is one of the L-class missions selected in the ESA Cosmic Vision 2015-2025 program for the science theme of the Hot and Energetic Universe. The Athena model payload includes the X-ray Integral Field Unit (X-IFU), an advanced, actively shielded X-ray microcalorimeter spectrometer for high-spectral-resolution imaging, utilizing cooled Transition Edge Sensors. This paper describes the preliminary architecture of the Instrument Control Unit (ICU), which is aimed at operating all of X-IFU's subsystems as well as implementing the main functional interfaces of the instrument with the S/C control unit. The ICU functions include TC/TM management with the S/C, science data formatting and transmission to the S/C Mass Memory, housekeeping data handling, time distribution for synchronous operations, and the management of the X-IFU components (i.e., CryoCoolers, Filter Wheel, Detector Readout Electronics, Event Processor, Power Distribution Unit). The baseline ICU implementation for the phase-A study foresees the use of standard, space-qualified components from the heritage of past and current space missions (e.g., Gaia, Euclid), which currently encompasses Leon2/Leon3-based CPU boards and standard space-qualified interfaces for the exchange of commands and data between the ICU and the X-IFU subsystems. An alternative architecture, arranged around a powerful PowerPC-based CPU, is also briefly presented, with the aim of endowing the system with enhanced hardware resources and processing power for handling control and science data processing tasks not yet defined at this stage of the mission study.
Development and application of the GIM code for the Cyber 203 computer
NASA Technical Reports Server (NTRS)
Stainaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.
1982-01-01
The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding, and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code was used to compute a number of example cases. Turbulence models, both algebraic and differential-equation, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.
Students, Teachers, and Scientists Partner to Explore Mars
NASA Astrophysics Data System (ADS)
Bowman, C. D.; Bebak, M.; Curtis, K.; Daniel, C.; Grigsby, B.; Herman, T.; Haynes, E.; Lineberger, D. H.; Pieruccini, S.; Ransom, S.; Reedy, K.; Spencer, C.; Steege, A.
2003-12-01
The Mars Exploration Rovers began their journey to the red planet in the summer of 2003 and, in early 2004, will begin an unprecedented level of scientific exploration on Mars, attracting the attention of scientists and the public worldwide. In an effort to engage students and teachers in this exciting endeavor, NASA's Mars Public Engagement Office, partnering with the Athena Science Investigation, coordinates a student-scientist research partnership program called the Athena Student Interns Program. The Athena Student Interns Program (ASIP) began in early 1999 as the LAPIS program, a pilot hands-on educational effort associated with the FIDO prototype Mars rover field tests (Arvidson, 2000). In ASIP, small groups of students and teachers selected through a national application process are paired with mentors from the mission's Athena Science Team to carry out an aspect of the mission. To prepare for actual operations during the landed rover mission, the students and teachers participate in one of the Science Team's Operational Readiness Tests (ORTs) at JPL using a prototype rover in a simulated Mars environment (Crisp, et al., in press; see also http://mars.jpl.nasa.gov/mer/fido/). Once the rovers have landed, each ASIP group will spend one week at JPL in mission operations, working as part of their mentor's own team to help manage and interpret data coming from Mars. To reach other teachers and students, each group gives school and community presentations, contributes to publications such as web articles and conference abstracts, and participates in NASA webcasts and webchats. Partnering with other groups and organizations, such as NASA's Solar System Ambassadors and the Housing and Urban Development Neighborhood Networks, helps reach an even broader audience. ASIP is evaluated through the use of empowerment evaluation, a technique that actively involves participants in program assessment (Fetterman and Bowman, 2002).
With the knowledge they gain through the ASIP program and their participation in the empowerment evaluation, ASIP members will help refine the current program and provide a model for student-scientist research partnerships associated with future space missions to Mars and beyond. Arvidson, R.E., et al. (2000) Students participate in Mars Sample Return Rover field tests. Eos, 81(11). Crisp, J.A., et al. (in press) The Mars Exploration Rover Mission. J. Geophys. Research-Planets. Fetterman, D. and C.D. Bowman. (2002) Experiential Education and Empowerment Evaluation: Mars Rover Educational Program Case Example. J. Experiential Education, 25(2).
Distributed Software for Observations in the Near Infrared
NASA Astrophysics Data System (ADS)
Gavryusev, V.; Baffa, C.; Giani, E.
We have developed an integrated system that performs astronomical observations in near-infrared bands, operating the two-dimensional instruments ARNICA (http://helios.arcetri.astro.it:/home/idefix/Mosaic/instr/arnica/arnica.html) and LONGSP (http://helios.arcetri.astro.it:/home/idefix/Mosaic/instr/longsp/longsp.html) at the Italian National Infrared Facility. The software consists of several communicating processes, generally executed across a network, but possibly also on a single computer. The user interface is organized as a widget-based X11 client. Interprocess communication is provided by sockets and uses TCP/IP. The processes that control hardware (telescope and other instruments) must currently be executed on a PC dedicated to this task under DESQview/X, while all other components (user interface, tools for data analysis, etc.) can also run under UNIX. The hardware-independent part of the software is based on the Athena Widget Set and is compiled with GNU C to provide maximum portability.
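The socket-based process structure described above can be sketched with a toy TCP command/reply exchange; the command vocabulary and the "ACK/NAK" protocol are invented for illustration and are not the actual ARNICA/LONGSP wire format.

```python
import socket
import threading

def control_process(server_sock):
    """Toy stand-in for the hardware-control process: accept one client
    and answer a single newline-terminated command."""
    conn, _ = server_sock.accept()
    with conn:
        cmd = conn.recv(1024).decode().strip()
        reply = ("ACK " + cmd) if cmd.startswith("EXPOSE") else ("NAK " + cmd)
        conn.sendall((reply + "\n").encode())

# The 'user interface' side connects over TCP, as the real system does
# between the UI, the data-analysis tools, and the control PC.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=control_process, args=(server,), daemon=True).start()

ui = socket.create_connection(server.getsockname())
ui.sendall(b"EXPOSE 10\n")
reply = ui.recv(1024).decode().strip()
ui.close()
server.close()
print(reply)
```

Because the transport is plain TCP/IP, the same exchange works whether the two processes share one machine or sit on opposite ends of a network, which is the portability the paper emphasizes.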
Automating Guidelines for Clinical Decision Support: Knowledge Engineering and Implementation.
Tso, Geoffrey J; Tu, Samson W; Oshiro, Connie; Martins, Susana; Ashcraft, Michael; Yuen, Kaeli W; Wang, Dan; Robinson, Amy; Heidenreich, Paul A; Goldstein, Mary K
2016-01-01
As utilization of clinical decision support (CDS) increases, it is important to continue the development and refinement of methods to accurately translate the intention of clinical practice guidelines (CPG) into a computable form. In this study, we validate and extend the 13 steps that Shiffman et al. identified for translating CPG knowledge for use in CDS. During an implementation project of ATHENA-CDS, we encoded complex CPG recommendations for five common chronic conditions for integration into an existing clinical dashboard. Major decisions made during the implementation process were recorded and categorized according to the 13 steps. During the implementation period, we categorized 119 decisions and identified 8 new categories required to complete the project. We provide details on an updated model that outlines all of the steps used to translate CPG knowledge into a CDS integrated with existing health information technology.
Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd
2005-01-01
Modern NASA planetary exploration missions employ complex systems of hardware and software, managed by large teams of engineers and scientists, in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover (MER) mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.
Final Report on ITER Task Agreement 81-08
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard L. Moore
As part of an ITER Implementing Task Agreement (ITA) between the ITER US Participant Team (PT) and the ITER International Team (IT), the INL Fusion Safety Program was tasked to provide the ITER IT with upgrades to the fusion version of the MELCOR 1.8.5 code, including a beryllium dust oxidation model. The purpose of this model is to allow the ITER IT to investigate hydrogen production from beryllium dust layers on hot surfaces inside the ITER vacuum vessel (VV) during in-vessel loss-of-cooling accidents (LOCAs). Also included in the ITER ITA was a task to construct a RELAP5/ATHENA model of the ITER divertor cooling loop to model the draining of the loop during a large ex-vessel pipe break followed by an in-vessel divertor break, and to compare the results to a similar MELCOR model developed by the ITER IT. This report, which is the final report for this agreement, documents the completion of the work scope under this ITER TA, designated as TA 81-08.
Radio-loud AGN Variability from Propagating Relativistic Jets
NASA Astrophysics Data System (ADS)
Li, Yutong; Schuh, Terance; Wiita, Paul J.
2018-06-01
The great majority of variable emission in radio-loud AGNs is understood to arise from the relativistic flows of plasma along two oppositely directed jets. We study this process using the Athena hydrodynamics code to simulate propagating three-dimensional relativistic jets for a wide range of input jet velocities and jet-to-ambient matter density ratios. We then focus on those simulations that remain essentially stable for extended distances (60-120 times the jet radius). Adopting the densities, pressures, and velocities from these propagating simulations, we estimate emissivities from each cell. The observed emissivity from each cell depends strongly on its variable Doppler boosting factor, which in turn depends on the changing bulk velocity in that zone with respect to our viewing angle to the jet. We then sum the approximate fluxes from a large number of zones upstream of the primary reconfinement shock. The light curves so produced are similar to those of blazars, although turbulence on sub-grid scales is likely to be important for the variability on the shortest timescales.
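The Doppler-boosting dependence described above follows the standard beaming formulas; the sketch below uses the discrete-blob scaling F_obs = δ^(3+α) F_int with an assumed spectral index, not the paper's exact cell-by-cell emissivity prescription.

```python
import math

def doppler_factor(beta, theta):
    """Relativistic Doppler factor delta = 1 / (Gamma * (1 - beta*cos(theta)))
    for bulk speed beta (in units of c) at viewing angle theta (radians)."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return 1.0 / (gamma * (1.0 - beta * math.cos(theta)))

def observed_flux(f_intrinsic, beta, theta, alpha=0.5):
    """Beamed flux of a discrete emitting cell with spectral index alpha:
    F_obs = delta**(3 + alpha) * F_int (standard beaming scaling)."""
    return doppler_factor(beta, theta) ** (3.0 + alpha) * f_intrinsic
```

Because δ varies steeply with both β and θ, modest changes in a zone's bulk velocity translate into large swings in its observed flux, which is why summing many such zones yields blazar-like variability.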
AtomDB Progress Report: Atomic data and new models for X-ray spectroscopy.
NASA Astrophysics Data System (ADS)
Smith, Randall K.; Foster, Adam; Brickhouse, Nancy S.; Stancil, Phillip C.; Cumbee, Renata; Mullen, Patrick Dean; AtomDB Team
2018-06-01
The AtomDB project collects atomic data from both theoretical and observational/experimental sources, providing a convenient interface (http://www.atomdb.org/Webguide/webguide.php) as well as input to spectral models for many types of astrophysical X-ray plasmas. We have released several updates to AtomDB in response to the Hitomi data, including new data for the Fe K complex, and have expanded the range of models available in AtomDB to include the Kronos charge exchange models from Mullen et al. (2016, ApJS, 224, 2). Combined with the previous AtomDB charge exchange model (http://www.atomdb.org/CX/), these data enable a velocity-dependent model for X-ray and EUV charge exchange spectra. We also present a new Kappa-distribution spectral model, enabling plasmas with non-Maxwellian electron distributions to be modeled with AtomDB. Tools are provided within pyAtomDB to explore and exploit these new plasma models. This presentation will review these enhancements and describe plans for the next few years of database and code development in preparation for XARM, Athena, and (hopefully) Arcus.
The Athena Microscopic Imager Investigation
NASA Technical Reports Server (NTRS)
Herkenhoff, K. E.; Aquyres, S. W.; Bell, J. F., III; Maki, J. N.; Arneson, H. M.; Brown, D. I.; Collins, S. A.; Dingizian, A.; Elliot, S. T.; Geotz, W.
2003-01-01
The Athena science payload on the Mars Exploration Rovers (MER) includes the Microscopic Imager (MI) [1]. The MI is a fixed-focus camera mounted on the end of an extendable instrument arm, the Instrument Deployment Device (IDD; see Figure 1). The MI was designed to acquire images at a spatial resolution of 30 microns/pixel over a broad spectral range (400-700 nm; see Table 1). Technically, the Microscopic Imager is not a microscope: it has a fixed magnification of 0.4 and is intended to produce images that simulate a geologist's view through a common hand lens. In photographers' parlance, the system makes use of a macro lens. The MI uses the same electronics design as the other MER cameras [2, 3] but has optics that yield a field of view of 31 × 31 mm across a 1024 × 1024 pixel CCD image (Figure 2). The MI acquires images using only solar or skylight illumination of the target surface. A contact sensor is used to place the MI slightly closer to the target surface than its best focus distance (about 66 mm), allowing concave surfaces to be imaged in good focus. Because the MI has a relatively small depth of field (3 mm), a single MI image of a rough surface will contain both focused and unfocused areas. Coarse focusing will be achieved by moving the IDD away from a rock target after the contact sensor is activated. Multiple images taken at various distances will be acquired to ensure good focus on all parts of rough surfaces. By combining a set of images acquired in this way, a completely focused image can be assembled. Stereoscopic observations can be obtained by moving the MI laterally relative to its boresight. Estimates of the position and orientation of the MI for each acquired image will be stored in the rover computer and returned to Earth with the image data. The MI optics will be protected from the Martian environment by a retractable dust cover. The dust cover includes a Kapton window that is tinted orange to restrict the spectral bandpass to 500-700 nm, allowing color information to be obtained by taking images with the dust cover open and closed. The MI will image the same materials measured by other Athena instruments (including surfaces prepared by the Rock Abrasion Tool), as well as rock and soil targets of opportunity. Subsets of the full image array can be selected and/or pixels can be binned to reduce data volume. Image compression will be used to maximize the information contained in the data returned to Earth. The resulting MI data will place other MER instrument data in context and aid in petrologic and geologic interpretations of rocks and soils on Mars.
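The focus-merging procedure described in the passage, assembling a completely focused image from frames taken at several standoff distances, can be sketched by keeping, at each pixel, the frame with the strongest local Laplacian response. This is a generic focus-stacking illustration, not the actual MI ground-processing pipeline.

```python
import numpy as np

def focus_stack(images):
    """Merge a focus stack: at each pixel keep the frame whose local
    sharpness (absolute discrete Laplacian) is largest. Edges wrap
    around (np.roll) purely for brevity."""
    stack = np.stack(images).astype(float)
    lap = np.abs(
        -4.0 * stack
        + np.roll(stack, 1, axis=1) + np.roll(stack, -1, axis=1)
        + np.roll(stack, 1, axis=2) + np.roll(stack, -1, axis=2)
    )
    best = np.argmax(lap, axis=0)      # sharpest frame index per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Two toy frames, each 'in focus' (non-flat) in a different spot.
frame_a = np.zeros((8, 8)); frame_a[2, 2] = 1.0
frame_b = np.zeros((8, 8)); frame_b[5, 5] = 1.0
merged = focus_stack([frame_a, frame_b])
```

A production pipeline would smooth the sharpness map and blend across frame boundaries, but the per-pixel best-focus selection is the core idea.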
Update of GRASP/Ada reverse engineering tools for Ada
NASA Technical Reports Server (NTRS)
Cross, James H., II
1993-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty-printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update '92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update '93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation.
The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as a forward engineering mode, with a level of flexibility suitable for practical application. An overview of the GRASP/Ada project, with an emphasis on the current update, is provided.
Implementation of a 3D mixing layer code on parallel computers
NASA Technical Reports Server (NTRS)
Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.
1995-01-01
This paper summarizes our progress and experience in the development of a computational fluid dynamics code on parallel computers to simulate three-dimensional, spatially developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers and then converted for use on parallel computers using the conventional message-passing technique, although we have not been able to compile the code with the present version of the HPF compilers.
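As a minimal illustration of a finite-volume explicit time-marching update (simplified here to 1D linear advection rather than the 3D Euler equations of the paper), the first-order upwind scheme reads:

```python
import numpy as np

def upwind_advect(u, c, dx, dt, nsteps):
    """First-order upwind finite-volume update for u_t + c u_x = 0 with
    c > 0 on a periodic domain -- a toy stand-in for the explicit
    time-marching algorithm described in the paper."""
    nu = c * dt / dx                 # CFL number; stability needs nu <= 1
    assert 0.0 < nu <= 1.0
    for _ in range(nsteps):
        flux = c * u                 # upwind interface flux for c > 0
        u = u - (dt / dx) * (flux - np.roll(flux, 1))
    return u
```

With ν = 1 the update reduces to an exact shift of the solution by one cell per step, a convenient correctness check. Parallelizing such a stencil amounts to exchanging one ghost cell per subdomain boundary each step, the message-passing pattern the paper refers to.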
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
40 CFR 194.23 - Models and computer codes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...
X-ray telescope mirrors made of slumped glass sheets
NASA Astrophysics Data System (ADS)
Winter, A.; Breunig, E.; Friedrich, P.; Proserpio, L.
2017-11-01
For several decades, the field of X-ray astronomy has been playing a major role in understanding the processes in our universe. From binary stars and black holes up to galaxy clusters and dark matter, highly energetic events have been observed and analysed using powerful X-ray telescopes such as Rosat, Chandra, and XMM-Newton [1,2,3], giving us detailed and unprecedented views of the high-energy universe. In November 2013, the theme of "The Hot and Energetic Universe" was rated as of highest importance for future exploration, and in June 2014 the ATHENA Advanced Telescope for High Energy Astrophysics was selected by ESA for the second large science mission (L2) in the ESA Cosmic Vision program, with launch foreseen in 2028 [4]. By combining a large X-ray telescope with state-of-the-art scientific instruments, ATHENA will address key questions in astrophysics, including: How and why does ordinary matter assemble into the galaxies and galactic clusters that we see today? How do black holes grow and influence their surroundings? In order to answer these questions, ATHENA needs a powerful mirror system which exceeds the capabilities of current missions, especially in terms of collecting area. However, current technologies have reached the mass limits of the launching rocket, creating the need for more light-weight mirror systems in order to enhance the effective area without increasing the telescope mass. Hence, new mirror technologies are being developed which aim for low-weight systems with large collecting areas. Lightweight materials such as glass can be used, which are shaped to form an X-ray reflecting system via the method of thermal glass slumping.
Impact of dronedarone in atrial fibrillation and flutter on stroke reduction.
Christiansen, Christine Benn; Torp-Pedersen, Christian; Køber, Lars
2010-04-07
Dronedarone has been developed for treatment of atrial fibrillation (AF) or atrial flutter (AFL). It is an amiodarone analogue but noniodinized and without the same adverse effects as amiodarone. This is a review of 7 studies (DAFNE, ADONIS, EURIDIS, ATHENA, ANDROMEDA, ERATO and DIONYSOS) on dronedarone focusing on efficacy, safety and prevention of stroke. There was a dose-finding study (DAFNE), 3 studies focusing on maintenance of sinus rhythm (ADONIS, EURIDIS and DIONYSOS), 1 study focusing on rate control (ERATO) and 2 studies investigating mortality and morbidity (ANDROMEDA and ATHENA). The target dose for dronedarone was established in the DAFNE study to be 400 mg twice daily. Both the EURIDIS and ADONIS studies demonstrated that dronedarone was superior to placebo for maintaining sinus rhythm. However, DIONYSOS found that dronedarone is less efficient at maintaining sinus rhythm than amiodarone. ERATO concluded that dronedarone reduces ventricular rate in patients with chronic AF. The ANDROMEDA study in patients with severe heart failure was discontinued because of increased mortality in the dronedarone group. Dronedarone reduced cardiovascular hospitalizations and mortality in patients with AF or AFL in the ATHENA trial. Second, a post hoc analysis of ATHENA observed a significant reduction in stroke (annual rate 1.2% on dronedarone vs 1.8% on placebo [hazard ratio 0.66, confidence interval 0.46 to 0.96, P = 0.027]). In total, 54 cases of stroke occurred in 3439 patients receiving dronedarone (crude rate 1.6%) compared to 76 strokes in 3048 patients on placebo (crude rate 2.5%). Dronedarone can be used for maintenance of sinus rhythm and can reduce stroke in patients with AF who receive usual care, which includes antithrombotic therapy and heart rate control.
Shapiro, Shelley; Torres, Fernando; Feldman, Jeremy; Keogh, Anne; Allard, Martine; Blair, Christiana; Gillies, Hunter; Tislow, James; Oudiz, Ronald J
2017-05-01
Pulmonary arterial hypertension (PAH) is a condition which may lead to right ventricular failure and premature death. While recent data support the initial combination of ambrisentan (a selective ERA) and tadalafil (a PDE5i) in functional class II or III patients, there are no published data describing the safety and efficacy of ambrisentan when added for patients currently receiving a PDE5i and exhibiting a suboptimal response. The ATHENA-1 study describes the safety and efficacy of the addition of ambrisentan in this patient population. PAH patients with a suboptimal response to current PDE5i monotherapy were assigned ambrisentan in an open-label fashion and evaluated for up to 48 weeks. Cardiopulmonary hemodynamics (change in PVR as primary endpoint) were evaluated at week 24, and functional parameters and biomarkers were measured through week 48. Time to clinical worsening (TTCW) and survival are also described. Thirty-three subjects were included in the analysis. At week 24, statistically significant improvements in PVR (-32%), mPAP (-11%), and CI (+25%) were observed. Hemodynamic improvements at week 24 were further supported by improvements in the secondary endpoints: 6-min walk distance (+18 m), NT-proBNP (-31%), and maintenance or improvement in WHO FC in 97% of patients. Adverse events were consistent with known effects of ambrisentan. The hemodynamic, functional, and biomarker improvements observed in the ATHENA-1 study suggest that the sequential addition of ambrisentan for patients not having a satisfactory response to established PDE5i monotherapy is a reasonable option. Published by Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Arvidson, R. E.; Lindemann, R.; Matijevic, J.; Richter, L.; Sullivan, R.; Haldemann, A.; Anderson, R.; Snider, N.
2003-01-01
The two 2003 Mars Exploration Rovers (MERs), in combination with the Athena Payload, will be used as virtual instrument systems to infer terrain properties during traverses, in addition to using the rover wheels to excavate trenches, exposing subsurface materials for remote and in-situ observations. The MERs are being modeled using finite element-based rover system transfer functions that utilize the distribution of masses associated with the vehicle, together with suspension and wheel dynamics, to infer surface roughness and mechanical properties from traverse time series data containing vehicle yaw, pitch, roll, encoder counts, and motor currents. These analyses will be supplemented with imaging and other Athena Payload measurements. The approach is being validated using Sojourner data, the FIDO rover, and experiments with MER testbed vehicles. In addition to conducting traverse science and associated analyses, trenches will be excavated by the MERs to depths of approximately 10-20 cm by locking all but one of the front wheels and rotating that wheel backwards so that the excavated material is piled up on the side of the trench away from the vehicle. Soil cohesion and angle of internal friction will be determined from the trench telemetry data. Emission spectroscopy and in-situ observations will be made using the Athena payload before and after imaging. Trenching and observational protocols have been developed using Sojourner results; data from the FIDO rover, including trenches dug into sand, mud cracks, and weakly indurated bedrock; and experiments with MER testbed rovers. Particular attention will be focused on Mini-TES measurements designed to determine the abundance and state of subsurface water (e.g. hydrated, in zeolites, residual pore ice?) predicted to be present from Odyssey GRS/NS/HEND data.
Computer Description of Black Hawk Helicopter
1979-06-01
Combinatorial Geometry Models; Black Hawk Helicopter; GIFT Computer Code; Geometric Description of Targets. ABSTRACT: The description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code, which generates... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents...
The X-IFU end-to-end simulations performed for the TES array optimization exercise
NASA Astrophysics Data System (ADS)
Peille, Philippe; Wilms, J.; Brand, T.; Cobo, B.; Ceballos, M. T.; Dauser, T.; Smith, S. J.; Barret, D.; den Herder, J. W.; Piro, L.; Barcons, X.; Pointecouteau, E.; Bandler, S.; den Hartog, R.; de Plaa, J.
2015-09-01
The focal plane assembly of the Athena X-ray Integral Field Unit (X-IFU) includes as the baseline an array of ~4000 single size calorimeters based on Transition Edge Sensors (TES). Other sensor array configurations could however be considered, combining TES of different properties (e.g. size). In attempting to improve the X-IFU performance in terms of field of view, count rate performance, and even spectral resolution, two alternative TES array configurations to the baseline have been simulated, each combining a small and a large pixel array. With the X-IFU end-to-end simulator, a sub-sample of the Athena core science goals, selected by the X-IFU science team as potentially driving the optimal TES array configuration, has been simulated for the results to be scientifically assessed and compared. In this contribution, we will describe the simulation set-up for the various array configurations, and highlight some of the results of the test cases simulated.
User manual for semi-circular compact range reflector code: Version 2
NASA Technical Reports Server (NTRS)
Gupta, Inder J.; Burnside, Walter D.
1987-01-01
A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability as well as being samples of input/output sets.
Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burk, K.W.; Andrews, G.L.
1989-02-01
The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in data bases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those data bases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the data base custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.
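The report does not spell out the screening logic of its QA codes, but a typical building block for flagging erroneous meteorological data is a range check. The sketch below is purely illustrative: the field names and limits are our assumptions, not values from the HMS report.

```python
# Minimal QA range screening in the spirit of meteorological data checks.
# Field names and physical limits are illustrative assumptions only.
RANGE_LIMITS = {"temp_c": (-40.0, 50.0), "wind_mps": (0.0, 60.0)}

def qa_screen(records):
    """Return (record_index, field, value) for every out-of-range value."""
    flagged = []
    for i, rec in enumerate(records):
        for field, (lo, hi) in RANGE_LIMITS.items():
            v = rec.get(field)
            if v is not None and not (lo <= v <= hi):
                flagged.append((i, field, v))
    return flagged

obs = [{"temp_c": 21.4, "wind_mps": 3.2},
       {"temp_c": 72.0, "wind_mps": 4.1},   # implausible temperature
       {"temp_c": 18.9, "wind_mps": 75.0}]  # implausible wind speed
print(qa_screen(obs))  # [(1, 'temp_c', 72.0), (2, 'wind_mps', 75.0)]
```

A production QA suite would layer further tests (persistence, rate-of-change, cross-sensor consistency) on the same flag-and-report structure, writing the flagged tuples to the detailed output files the abstract mentions.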
User's manual for semi-circular compact range reflector code
NASA Technical Reports Server (NTRS)
Gupta, Inder J.; Burnside, Walter D.
1986-01-01
A computer code was developed to analyze a semi-circular paraboloidal reflector antenna with a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the antenna or its individual components at a given distance from the center of the paraboloid. Thus, it is very effective in computing the size of the sweet spot for RCS or antenna measurement. The operation of the code is described. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability as well as being samples of input/output sets.
Highly fault-tolerant parallel computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spielman, D.A.
We re-introduce the coded model of fault-tolerant computation in which the input and output of a computational device are treated as words in an error-correcting code. A computational device correctly computes a function in the coded model if its input and output, once decoded, are a valid input and output of the function. In the coded model, it is reasonable to hope to simulate all computational devices by devices whose size is greater by a constant factor but which are exponentially reliable even if each of their components can fail with some constant probability. We consider fine-grained parallel computations in which each processor has a constant probability of producing the wrong output at each time step. We show that any parallel computation that runs for time t on w processors can be performed reliably on a faulty machine in the coded model using w log^O(1) w processors and time t log^O(1) w. The failure probability of the computation will be at most t · exp(-w^1/4). The codes used to communicate with our fault-tolerant machines are generalized Reed-Solomon codes and can thus be encoded and decoded in O(n log^O(1) n) sequential time and are independent of the machine they are used to communicate with. We also show how coded computation can be used to self-correct many linear functions in parallel with arbitrarily small overhead.
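The generalized Reed-Solomon codes used to communicate with the fault-tolerant machine can be illustrated with a toy erasure-recovery example over a small prime field. This is only a sketch of the encode/decode idea (systematic evaluation encoding plus Lagrange interpolation); the field size, parameters, and function names are ours, and the paper's codes and decoders are far more general and efficient.

```python
# Toy systematic Reed-Solomon code over GF(P): a message of k symbols is
# the value of a degree-(k-1) polynomial at x = 1..k; parity symbols are
# its values at x = k+1..n. Any k surviving symbols recover the message.
P = 97  # small prime field modulus, chosen only for illustration

def lagrange_eval(pts, x):
    # Evaluate the unique polynomial through pts at x, mod P.
    total = 0
    for i, (xi, yi) in enumerate(pts):
        num, den = 1, 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

def encode(msg, n):
    # Codeword symbol j is the message polynomial's value at x = j+1.
    pts = list(enumerate(msg, start=1))
    return [lagrange_eval(pts, x) for x in range(1, n + 1)]

def decode(received, k):
    # received: (position, symbol) pairs; any k of them determine the
    # polynomial, hence the message symbols at x = 1..k.
    pts = received[:k]
    return [lagrange_eval(pts, x) for x in range(1, k + 1)]

msg = [5, 17, 42]                                # k = 3 message symbols
cw = encode(msg, 7)                              # n = 7 codeword symbols
survivors = [(i + 1, cw[i]) for i in (0, 4, 6)]  # 4 of 7 symbols erased
assert decode(survivors, 3) == msg               # message still recovered
```

The point of the coded model is visible even in this toy: a machine whose inputs and outputs are such codewords can lose a constant fraction of its symbols to faults and still, once decoded, present a valid input/output pair of the function.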
An emulator for minimizing computer resources for finite element analysis
NASA Technical Reports Server (NTRS)
Melosh, R.; Utku, S.; Islam, M.; Salama, M.
1984-01-01
A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
A generalized one-dimensional computer code for turbomachinery cooling passage flow calculations
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Roelke, Richard J.; Meitner, Peter L.
1989-01-01
A generalized one-dimensional computer code for analyzing the flow and heat transfer in turbomachinery cooling passages was developed. This code is capable of handling rotating cooling passages with turbulators, 180 degree turns, pin fins, finned passages, by-pass flows, tip cap impingement flows, and flow branching. The code is an extension of a one-dimensional code developed by P. Meitner. In the subject code, correlations for both heat transfer coefficient and pressure loss computations were developed to model each of the above mentioned types of coolant passages. The code has the capability of independently computing the friction factor and heat transfer coefficient on each side of a rectangular passage. Either the mass flow at the inlet to the channel or the exit plane pressure can be specified. For a specified inlet total temperature, inlet total pressure, and exit static pressure, the code computes the flow rates through the main branch and the subbranches and the flow through the tip cap for impingement cooling, in addition to computing the coolant pressure, temperature, and heat transfer coefficient distribution in each coolant flow branch. Predictions from the subject code for both nonrotating and rotating passages agree well with experimental data. The code was used to analyze the cooling passage of a research cooled radial rotor.
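The core of such a one-dimensional code is a segment-by-segment march that applies a pressure-loss correlation along each branch. A minimal sketch of that march, using the generic Darcy-Weisbach loss in place of the code's passage-specific correlations (all numbers and names here are illustrative assumptions, not values from the code):

```python
# One-dimensional pressure-drop march along a cooling passage (sketch).
# A real code would also update temperature and heat transfer coefficient
# per segment and iterate branch flows to match the specified exit pressure.

def darcy_dp(f, L, D, rho, v):
    # Darcy-Weisbach pressure loss over one segment: f*(L/D)*rho*v^2/2.
    return f * (L / D) * 0.5 * rho * v * v

def march_passage(p_inlet, segments):
    """segments: dicts with friction factor f, length L, diameter D,
    density rho, velocity v. Returns the static pressure profile [Pa]."""
    p = [p_inlet]
    for s in segments:
        p.append(p[-1] - darcy_dp(s["f"], s["L"], s["D"], s["rho"], s["v"]))
    return p

segs = [{"f": 0.02, "L": 0.1, "D": 0.005, "rho": 1.2, "v": 50.0}] * 3
profile = march_passage(101325.0, segs)
# each identical segment drops 0.02 * 20 * 0.5 * 1.2 * 2500 = 600 Pa
assert abs(profile[-1] - (101325.0 - 3 * 600.0)) < 1e-6
```

Specifying the inlet mass flow instead of the exit pressure, as the abstract notes the code allows, amounts to wrapping this march in an outer iteration that adjusts the branch flows until the computed exit pressure matches the target.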
The Owl of Athena: History, Philosophy, and Humanism in Comparative Education
ERIC Educational Resources Information Center
Kazamias, Andreas
2018-01-01
Since the 1960s, comparative education in the United States, Canada, and Europe has shown considerable growth and vitality, in terms of membership in professional organizations, participation in international conferences, research, and publications. Epistemologically and methodologically, new modernist and postmodernist paradigms have been…
Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †
Murdani, Muhammad Harist; Hong, Bonghee
2018-01-01
In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.
Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee
2018-03-24
In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.
Volume accumulator design analysis computer codes
NASA Technical Reports Server (NTRS)
Whitaker, W. D.; Shimazaki, T. T.
1973-01-01
The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.
"Hour of Code": Can It Change Students' Attitudes toward Programming?
ERIC Educational Resources Information Center
Du, Jie; Wimmer, Hayden; Rada, Roy
2016-01-01
The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…
Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses
ERIC Educational Resources Information Center
Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan
2013-01-01
Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…
Guidelines for developing vectorizable computer programs
NASA Technical Reports Server (NTRS)
Miner, E. W.
1982-01-01
Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months effort in the development of a user oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.
Enhanced fault-tolerant quantum computing in d-level systems.
Campbell, Earl T
2014-12-05
Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Nonuniform code concatenation for universal fault-tolerant quantum computing
NASA Astrophysics Data System (ADS)
Nikahd, Eesa; Sedighi, Mehdi; Saheb Zamani, Morteza
2017-09-01
Using transversal gates is a straightforward and efficient technique for fault-tolerant quantum computing. Since transversal gates alone cannot be computationally universal, they must be combined with other approaches such as magic state distillation, code switching, or code concatenation to achieve universality. In this paper we propose an alternative approach for universal fault-tolerant quantum computing, mainly based on the code concatenation approach proposed in [T. Jochym-O'Connor and R. Laflamme, Phys. Rev. Lett. 112, 010505 (2014), 10.1103/PhysRevLett.112.010505], but in a nonuniform fashion. The proposed approach is described based on nonuniform concatenation of the 7-qubit Steane code with the 15-qubit Reed-Muller code, as well as the 5-qubit code with the 15-qubit Reed-Muller code, which lead to two 49-qubit and 47-qubit codes, respectively. These codes can correct any arbitrary single physical error with the ability to perform a universal set of fault-tolerant gates, without using magic state distillation.
Athena's Daughters: Women's Perceptions of Mentoring and the Workplace.
ERIC Educational Resources Information Center
Lash, Christine F.
The purpose of this study was to determine if Egan's theory of women's mentoring styles, and related attitudes toward mentoring and the workplace, generalize to women in higher education administration and to women of color. Egan's theory of women's mentoring, based upon the epistemologies conceptualized by Belenky, Clinchy, Goldberger, and…
Green's function methods in heavy ion shielding
NASA Technical Reports Server (NTRS)
Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.
1993-01-01
An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.
1982-01-01
A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.
Automated apparatus and method of generating native code for a stitching machine
NASA Technical Reports Server (NTRS)
Miller, Jeffrey L. (Inventor)
2000-01-01
A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
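The generate-unless-constrained loop the abstract describes can be sketched directly: emit a stitch command for the next point, inserting a head-condition change whenever a constraint lies between the present and next stitching points. The G-code-like mnemonics and function names below are invented for illustration and are not the patent's actual CNC dialect.

```python
# Sketch of automatic CNC stitch-code generation (illustrative only).
# A constraint between two consecutive stitching points triggers a
# head-condition command (here, a direction change) before the stitch.

def generate_stitch_code(points, constraints):
    """points: list of (x, y) stitch locations, in order.
    constraints: set of (i, i+1) index pairs where a constraint lies
    between the present point i and the next point i+1."""
    program = []
    for i in range(len(points) - 1):
        x, y = points[i + 1]
        if (i, i + 1) in constraints:
            program.append("M100 REVERSE")        # change head condition
        program.append(f"G01 X{x:.1f} Y{y:.1f}")  # stitch at next point
    return program

code = generate_stitch_code([(0, 0), (1, 0), (2, 0)], constraints={(1, 2)})
print(code)  # ['G01 X1.0 Y0.0', 'M100 REVERSE', 'G01 X2.0 Y0.0']
```

The essential structure matches the abstract: one command per stitch in the unconstrained case, with a machine-state command interposed only where a constraint is detected.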
Computer codes developed and under development at Lewis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1992-01-01
The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru
2010-12-15
The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.
NASA Astrophysics Data System (ADS)
Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.
2010-12-01
The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.
NASA Astrophysics Data System (ADS)
Goodson, Matthew D.; Heitsch, Fabian; Eklund, Karl; Williams, Virginia A.
2017-07-01
Turbulence models attempt to account for unresolved dynamics and diffusion in hydrodynamical simulations. We develop a common framework for two-equation Reynolds-averaged Navier-Stokes turbulence models, and we implement six models in the athena code. We verify each implementation with the standard subsonic mixing layer, although the level of agreement depends on the definition of the mixing layer width. We then test the validity of each model in the supersonic regime, showing that compressibility corrections can improve agreement with experiment. For models with buoyancy effects, we also verify our implementation via the growth of the Rayleigh-Taylor instability in a stratified medium. The models are then applied to the ubiquitous astrophysical shock-cloud interaction in three dimensions. We focus on the mixing of shock and cloud material, comparing results from turbulence models to high-resolution simulations (up to 200 cells per cloud radius) and ensemble-averaged simulations. We find that the turbulence models lead to increased spreading and mixing of the cloud, although no two models predict the same result. Increased mixing is also observed in inviscid simulations at resolutions greater than 100 cells per radius, which suggests that the turbulent mixing begins to be resolved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng Wei; Zhang Bing; Li Hui
The early optical afterglow emission of several gamma-ray bursts (GRBs) shows a high linear polarization degree (PD) of tens of percent, suggesting an ordered magnetic field in the emission region. The light curves are consistent with being of a reverse shock (RS) origin. However, the magnetization parameter, σ, of the outflow is unknown. If σ is too small, an ordered field in the RS may be quickly randomized due to turbulence driven by various perturbations so that the PD may not be as high as observed. Here we use the “Athena++” relativistic MHD code to simulate a relativistic jet with an ordered magnetic field propagating into a clumpy ambient medium, with a focus on how density fluctuations may distort the ordered magnetic field and reduce PD in the RS emission for different σ values. For a given density fluctuation, we discover a clear power-law relationship between the relative PD reduction and the σ value of the outflow. Such a relation may be applied to estimate σ of the GRB outflows using the polarization data of early afterglows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Wei; Zhang, Bing; Li, Hui
We report that the early optical afterglow emission of several gamma-ray bursts (GRBs) shows a high linear polarization degree (PD) of tens of percent, suggesting an ordered magnetic field in the emission region. The light curves are consistent with being of a reverse shock (RS) origin. However, the magnetization parameter, σ, of the outflow is unknown. If σ is too small, an ordered field in the RS may be quickly randomized due to turbulence driven by various perturbations so that the PD may not be as high as observed. Here we use the "Athena++" relativistic MHD code to simulate a relativistic jet with an ordered magnetic field propagating into a clumpy ambient medium, with a focus on how density fluctuations may distort the ordered magnetic field and reduce PD in the RS emission for different σ values. For a given density fluctuation, we discover a clear power-law relationship between the relative PD reduction and the σ value of the outflow. Finally, such a relation may be applied to estimate σ of the GRB outflows using the polarization data of early afterglows.
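The power-law relation between the relative PD reduction and σ reported above can be recovered from simulation outputs by a straight-line fit in log-log space. A minimal sketch with synthetic data; the normalization (0.5) and exponent (0.8) are illustrative placeholders, not values from the paper:

```python
import numpy as np

# Hypothetical (sigma, relative-PD-reduction) pairs following an assumed
# power law dPD/PD0 = A * sigma**(-k); A = 0.5 and k = 0.8 are placeholders.
sigma = np.array([0.01, 0.03, 0.1, 0.3, 1.0])
pd_reduction = 0.5 * sigma ** -0.8

# A power law is a straight line in log-log space:
# log(dPD/PD0) = log(A) - k * log(sigma)
slope, intercept = np.polyfit(np.log(sigma), np.log(pd_reduction), 1)
A_fit, k_fit = np.exp(intercept), -slope
```

With noiseless synthetic data the fit recovers the assumed parameters exactly; on real simulation outputs the scatter about the fitted line indicates how clean the power law is.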
Density diagnostics of ionized outflows in active galactic nuclei
NASA Astrophysics Data System (ADS)
Mao, J.; Kaastra, J.; Mehdipour, M.; Raassen, T.; Gu, L.
2017-10-01
Ionized outflows in Active Galactic Nuclei are thought to influence their nuclear and local galactic environment. However, the distance of outflows with respect to the central engine is poorly constrained, which limits our understanding of the kinetic power of the outflows. Therefore, the impact of AGN outflows on their host galaxies is uncertain. Given the density of the outflows, their distance can be immediately obtained from the definition of the ionization parameter. Here we carry out a theoretical study of density diagnostics of AGN outflows using absorption lines from metastable levels in Be-like to F-like ions. With the new self-consistent photoionization model (PION) in the SPEX code, we are able to calculate ground and metastable level populations. This enables us to determine under what physical conditions these levels are significantly populated. We then identify characteristic transitions from these metastable levels in the X-ray band. Firm detections of absorption lines from such metastable levels are challenging for current grating instruments. The next generation of spectrometers like X-IFU onboard Athena will certainly identify the presence or absence of these density-sensitive absorption lines, thus tightly constraining the location and the kinetic power of AGN outflows.
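Once the density is measured, the distance follows from the definition of the ionization parameter, ξ = L / (n r²), inverted as r = sqrt(L / (n ξ)). A minimal sketch with illustrative numbers; the luminosity, density, and ξ values below are assumptions for demonstration, not results from this work:

```python
import math

def outflow_distance_cm(L_ion, n, xi):
    """Invert the ionization parameter xi = L / (n * r**2) for the distance:
    r = sqrt(L / (n * xi)).  Units: L_ion in erg/s, n in cm^-3,
    xi in erg cm s^-1; the returned distance is in cm."""
    return math.sqrt(L_ion / (n * xi))

# Illustrative numbers only: L = 1e45 erg/s, n = 1e8 cm^-3, xi = 100 erg cm/s
r_cm = outflow_distance_cm(1e45, 1e8, 100.0)
r_pc = r_cm / 3.086e18  # centimetres per parsec
```

This is why a density diagnostic translates directly into a location (and hence kinetic power) constraint: L and ξ are observables, so n is the only missing quantity.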
Simulations of a dense plasma focus on a high impedance generator
NASA Astrophysics Data System (ADS)
Beresnyak, Andrey; Giuliani, John; Jackson, Stuart; Richardson, Steve; Swanekamp, Steve; Schumer, Joe; Commisso, Robert; Mosher, Dave; Weber, Bruce; Velikovich, Alexander
2017-10-01
We study the connection between plasma instabilities and fast ion acceleration for neutron production on a Dense Plasma Focus (DPF). The experiments will be performed on the HAWK generator (665 kA), which has a fast rise time (1.2 μs) and a high inductance (607 nH). It is hypothesized that high impedance may enhance the neutron yield because the current will not be reduced during the collapse, resulting in higher magnetization. To prevent upstream breakdown, we will inject plasma far from the insulator stack. We simulated rundown and collapse dynamics with Athena, a 3D Eulerian unsplit finite-volume MHD code that includes shock capturing with Riemann solvers, resistive diffusion, and the Hall term. The simulations are coupled to an equivalent circuit model for HAWK. We will report the dynamics and implosion time as a function of the initial injected plasma distribution and the implications of non-ideal effects. We also traced test particles in MHD fields and confirmed the presence of stochastic acceleration, which was limited by the size of the system and the strength of the magnetic field. Supported by DOE/NNSA and the Naval Research Laboratory Base Program.
Users manual and modeling improvements for axial turbine design and performance computer code TD2-2
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
Computer code TD2 computes design-point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.
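The velocity-diagram quantities such a streamline code computes feed the Euler turbine equation for stage work. A minimal sketch for a constant-mean-radius axial stage; the velocity numbers are illustrative assumptions, not TD2-2 inputs:

```python
def stage_specific_work(U, c_u1, c_u2):
    """Euler turbine equation for an axial stage at constant mean radius:
    specific work w = U * (c_u1 - c_u2), where U is the blade speed and
    c_u1, c_u2 are the tangential (swirl) components of the absolute
    velocity at rotor inlet and exit.  Velocities in m/s, w in J/kg."""
    return U * (c_u1 - c_u2)

# Illustrative velocity-diagram numbers: 340 m/s blade speed, 500 m/s
# inlet swirl, -60 m/s exit swirl (counter-swirl at rotor exit).
w = stage_specific_work(U=340.0, c_u1=500.0, c_u2=-60.0)
```

Summing this quantity over stages, with cooling-flow and loss corrections of the kind the upgraded code models, is what turns velocity diagrams into a performance prediction.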
An Object-Oriented Approach to Writing Computational Electromagnetics Codes
NASA Technical Reports Server (NTRS)
Zimmerman, Martin; Mallasch, Paul G.
1996-01-01
Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
Computer Description of the Field Artillery Ammunition Supply Vehicle
1983-04-01
Keywords: Combinatorial Geometry (COM-GEOM); GIFT computer code; computer target description. The vehicle is represented by a Combinatorial Geometry (COM-GEOM) description; the "Geometric Information for Targets" (GIFT) computer code accepts the COM-GEOM description as input and is used to generate target vulnerability data.
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2011 CFR
2011-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2012 CFR
2012-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2014 CFR
2014-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2010 CFR
2010-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...
NASA Technical Reports Server (NTRS)
Harper, Warren
1989-01-01
Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The existing codes and certain supplementary software were updated, and the codes were installed on a computer to be delivered to the customer, to provide capability for graphic display of the data computed by the codes and to assist the customer in the solution of specific problems that demonstrate their use. With the exception of one code revision, all of these tasks were performed.
Non-axisymmetric line-driven disc winds - I. Disc perturbations
NASA Astrophysics Data System (ADS)
Dyda, Sergei; Proga, Daniel
2018-04-01
We study mass outflows driven from accretion discs by radiation pressure due to spectral lines. To investigate non-axisymmetric effects, we use the ATHENA++ code and develop a new module to account for radiation pressure driving. In 2D, our new simulations are consistent with previous 2D axisymmetric solutions by Proga et al., who used the ZEUS 2D code. Specifically, we find that the disc winds are time dependent, characterized by a dense stream confined to ˜45° relative to the disc mid-plane and bounded on the polar side by a less dense, fast stream. In 3D, we introduce a vertical, ϕ-dependent, subsonic velocity perturbation in the disc mid-plane. The perturbation does not change the overall character of the solution, but global outflow properties such as the mass, momentum, and kinetic energy fluxes are altered by up to 100 per cent. Non-axisymmetric density structures develop and persist mainly at the base of the wind. They are relatively small, and their densities can be a few times higher than the azimuthal average. The structure of the non-axisymmetric and axisymmetric solutions also differs in other ways. Perhaps most important from the observational point of view are the differences in the so-called clumping factors, which serve as a proxy for emissivity due to two-body processes. In particular, the clumping factor spatially averaged over the entire fast stream is of comparable value in both solutions, but it varies about 10 times faster in the non-axisymmetric case.
48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.
Code of Federal Regulations, 2013 CFR
2013-10-01
... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...
Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB
NASA Technical Reports Server (NTRS)
Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.
2017-01-01
Demonstrating speedup for parallel code on a multicore shared memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit potential for improvement of serial code even for so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed, such that commercial finite element (FE) and multi-body-dynamic (MBD) codes attempt to minimize computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches for implementing parallel code were investigated, but only the single program multiple data (spmd) method using composite objects provided positive results. Parallel code speedup is demonstrated, but the goal of linear speedup through the addition of processors was not achieved due to PC architecture.
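The per-column independence that makes numerical Jacobian evaluation embarrassingly parallel is visible in a serial finite-difference sketch (written in Python rather than MATLAB; the example function is hypothetical):

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Forward-difference Jacobian J[i, j] = d f_i / d x_j.  Each column
    requires one perturbed evaluation of f and is independent of the
    others, which is what makes the computation embarrassingly parallel:
    a parallel version distributes the loop over j across workers."""
    fx = np.asarray(f(x), dtype=float)
    J = np.empty((fx.size, x.size))
    for j in range(x.size):  # this loop is the parallelizable bottleneck
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (np.asarray(f(xp), dtype=float) - fx) / eps
    return J

# Example system f(x, y) = [x**2 + y, x*y]; the analytic Jacobian at
# (1, 2) is [[2, 1], [2, 1]].
J = jacobian_fd(lambda v: np.array([v[0] ** 2 + v[1], v[0] * v[1]]),
                np.array([1.0, 2.0]))
```

In MATLAB's spmd model, each worker would evaluate a slice of the columns into a composite object; the work per column is identical, so load balancing is trivial and any sublinear speedup reflects hardware or framework overhead rather than the algorithm.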
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.
2004-09-14
This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability; the suite performs many functions.
Gender roles for Alice and Bob
NASA Astrophysics Data System (ADS)
Harris, Philip
2013-04-01
As the head of a department that is striving to achieve bronze status under the Athena SWAN (Scientific Women's Academic Network) programme, I have become extremely sensitive to gender stereotyping, and I am afraid that the "Alice and Bob" image on the cover of your March issue on quantum frontiers set off some alarm bells.
Preventing Substance Use among High School Athletes: The ATLAS and ATHENA Programs
ERIC Educational Resources Information Center
Goldberg, Linn; Eliot, Diane
2005-01-01
This article will provide information about two worthwhile programs that deal with education of high school athletes about use and abuse of steroids and other areas. Based on rationale and expressed need, program descriptions will be provided including summaries of relevant program results. Guidelines for what practitioners need to consider when…
Environmental testing of the ATHENA mirror modules (Conference Presentation)
NASA Astrophysics Data System (ADS)
Landgraf, Boris; Girou, David; Collon, Maximilien J.; Vacanti, Giuseppe; Barrière, Nicolas M.; Günther, Ramses; Vervest, Mark; van der Hoeven, Roy; Beijersbergen, Marco W.; Bavdaz, Marcos; Wille, Eric; Fransen, Sebastiaan; Shortt, Brian; van Baren, Coen; Eigenraam, Alexander
2017-09-01
The European Space Agency (ESA) is studying the ATHENA (Advanced Telescope for High ENergy Astrophysics) X-ray telescope, the second L-class mission in their Cosmic Vision 2015 - 2025 program with a launch spot in 2028. The baseline technology for the X-ray lens is the newly developed high-performance, light-weight and modular Silicon Pore Optics (SPO). As part of the technology preparation, ruggedisation and environmental testing studies are being conducted to ensure mechanical stability and optical performance of the optics during and after launch, respectively. At cosine, a facility with shock, vibration, tensile strength, long time storage and thermal testing equipment has been set up in order to test SPO mirror module (MM) materials for compliance with an Ariane launch vehicle and the mission requirements. In this paper, we report on the progress of our ongoing investigations regarding tests on mechanical and thermal stability of MM components like single SPO stacks with and without multilayer coatings and complete MMs of inner (R = 250 mm), middle (R = 737 mm) and outer (R = 1500 mm) radii.
The light up and early evolution of high redshift Supermassive Black Holes
NASA Astrophysics Data System (ADS)
Comastri, Andrea; Brusa, Marcella; Aird, James; Lanzuisi, Giorgio
2016-07-01
The known AGN population at z > 6 is made up of luminous optical QSOs hosting Supermassive Black Holes (M > 10^{9} solar masses), likely to represent the tip of the iceberg of the luminosity and mass function. According to theoretical models for structure formation, Massive Black Holes (M_{BH} ≈ 10^{4-7} solar masses) are predicted to be abundant in the early Universe (z > 6). The majority of these lower luminosity objects are expected to be obscured and severely underrepresented in current optical and near-infrared surveys. The detection of such a population would provide unique constraints on the Massive Black Hole formation mechanism and subsequent growth, and is within the capabilities of deep and large area ATHENA surveys. After a summary of the state of the art of present deep XMM and Chandra surveys at z > 3-6, also mentioning the expectations for the forthcoming eROSITA all sky survey, I will present the observational strategy of future multi-cone ATHENA Wide Field Imager (WFI) surveys and the expected breakthroughs in the determination of the luminosity function and its evolution at high (z > 4) and very high (z > 6) redshifts.
A deep-branching clade of retrovirus-like retrotransposons in bdelloid rotifers
Gladyshev, Eugene A.; Meselson, Matthew; Arkhipova, Irina R.
2007-01-01
Rotifers of class Bdelloidea, a group of aquatic invertebrates in which males and meiosis have never been documented, are also unusual in their lack of multicopy LINE-like and gypsy-like retrotransposons, groups inhabiting the genomes of nearly all other metazoans. Bdelloids do contain numerous DNA transposons, both intact and decayed, and domesticated Penelope-like retroelements Athena, concentrated at telomeric regions. Here we describe two LTR retrotransposons, each found at low copy number in a different bdelloid species, which define a clade different from previously known clades of LTR retrotransposons. Like bdelloid DNA transposons and Athena, these elements have been found preferentially in telomeric regions. Unlike bdelloid DNA transposons, many of which are decayed, the newly described elements, named Vesta and Juno, inhabiting the genomes of Philodina roseola and Adineta vaga, respectively, appear to be intact and to represent recent insertions, possibly from an exogenous source. We describe the retrovirus-like structure of the new elements, containing gag, pol, and env-like open reading frames, and discuss their possible origins, transmission, and behavior in bdelloid genomes. PMID:17129685
ATHENA: system design and implementation for a next generation x-ray telescope
NASA Astrophysics Data System (ADS)
Ayre, M.; Bavdaz, M.; Ferreira, I.; Wille, E.; Lumb, D.; Linder, M.
2015-08-01
ATHENA, Europe's next generation x-ray telescope, has recently been selected for the 'L2' slot in ESA's Cosmic Vision Programme, with a mandate to address the 'Hot and Energetic Universe' Cosmic Vision science theme. The mission is currently in the Assessment/Definition Phase (A/B1), with a view to formal adoption after a successful System Requirements Review in 2019. This paper will describe the reference mission architecture and spacecraft design produced during Phase 0 by the ESA Concurrent Design Facility (CDF), in response to the technical requirements and programmatic boundary conditions. The main technical requirements and their mapping to resulting design choices will be presented, at both mission and spacecraft level. An overview of the spacecraft design down to subsystem level will then be presented (including the telescope and instruments), remarking on the critically-enabling technologies where appropriate. Finally, a programmatic overview will be given of the on-going Assessment Phase, and a snapshot of the prospects for securing the `as-proposed' mission within the cost envelope will be given.
Development of a model and computer code to describe solar grade silicon production processes
NASA Technical Reports Server (NTRS)
Gould, R. K.; Srivastava, R.
1979-01-01
Two computer codes were developed for describing flow reactors in which high purity, solar grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.
Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO
NASA Technical Reports Server (NTRS)
Stallworth, R.; Meyers, C. A.; Stinson, H. C.
1989-01-01
Results are presented from the comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two computer codes gave compatible, conservative results when the part-through-crack analysis solutions were compared against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.
Computational Predictions of the Performance of Wright 'Bent End' Propellers
NASA Technical Reports Server (NTRS)
Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)
2002-01-01
Computational analyses of two 1911 Wright brothers 'Bent End' wooden propeller reproductions have been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a project on the propeller performance of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG Navier-Stokes code is a CFD (Computational Fluid Dynamics) code based on the Navier-Stokes equations. It is mainly used to compute the lift and drag coefficients at specified angles of attack at different radii. Those calculated data are intermediate results of the computation and part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.
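The thrust coefficient, power coefficient, and propulsive efficiency mentioned above follow the standard nondimensional propeller definitions. A minimal sketch; the operating-point numbers are illustrative, not Wright propeller data:

```python
def propeller_coefficients(thrust, power, rho, n, D, V):
    """Standard nondimensional propeller quantities:
    thrust coefficient    CT  = T / (rho * n^2 * D^4)
    power coefficient     CP  = P / (rho * n^3 * D^5)
    advance ratio         J   = V / (n * D)
    propulsive efficiency eta = J * CT / CP  (equivalently T*V / P).
    n is the rotational speed in rev/s, D the diameter in m."""
    CT = thrust / (rho * n ** 2 * D ** 4)
    CP = power / (rho * n ** 3 * D ** 5)
    J = V / (n * D)
    return CT, CP, J, J * CT / CP

# Illustrative operating point: 400 N thrust, 10 kW shaft power,
# sea-level air density, 20 rev/s, 2 m diameter, 20 m/s airspeed.
CT, CP, J, eta = propeller_coefficients(
    thrust=400.0, power=10000.0, rho=1.225, n=20.0, D=2.0, V=20.0)
```

Because eta reduces algebraically to T*V/P, efficiency can be checked against the raw tunnel measurements independently of the nondimensionalization.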
Understanding Accretion Disks through Three Dimensional Radiation MHD Simulations
NASA Astrophysics Data System (ADS)
Jiang, Yan-Fei
I study the structures and thermal properties of black hole accretion disks in the radiation pressure dominated regime. Angular momentum transfer in the disk is provided by the turbulence generated by the magneto-rotational instability (MRI), which is calculated self-consistently with a recently developed 3D radiation magneto-hydrodynamics (MHD) code based on Athena. This code, developed by my collaborators and myself, couples both the radiation momentum and energy source terms with the ideal MHD equations by modifying the standard Godunov method to handle the stiff radiation source terms. We solve the two momentum equations of the radiation transfer equations with a variable Eddington tensor (VET), which is calculated with a time independent short characteristic module. This code is well tested and accurate in both optically thin and optically thick regimes. It is also accurate for both radiation pressure and gas pressure dominated flows. With this code, I find that when photon viscosity becomes significant, the ratio between Maxwell stress and Reynolds stress from the MRI turbulence can increase significantly with radiation pressure. The thermal instability of the radiation pressure dominated disk is then studied with vertically stratified shearing box simulations. Unlike the previous results claiming that the radiation pressure dominated disk with MRI turbulence can reach a steady state without showing any unstable behavior, I find that the radiation pressure dominated disks always either collapse or expand until we have to stop the simulations. During the thermal runaway, the heating and cooling rates from the simulations are consistent with the general criterion of thermal instability. However, details of the thermal runaway are different from the predictions of the standard alpha disk model, as many assumptions in that model are not satisfied in the simulations. We also identify the key reasons why previous simulations do not find the instability. 
The thermal instability has many important implications for understanding the observations of both X-ray binaries and Active Galactic Nuclei (AGNs). However, direct comparisons between observations and the simulations require global radiation MHD simulations, which will be the main focus of my future work.
Proceduracy: Computer Code Writing in the Continuum of Literacy
ERIC Educational Resources Information Center
Vee, Annette
2010-01-01
This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…
Computer Code Aids Design Of Wings
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1993-01-01
AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.
The Wide Field Imager instrument for Athena
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Barbera, Marco; Emberger, Valentin; Fürmetz, Maria; Manhart, Markus; Müller-Seidlitz, Johannes; Nandra, Kirpal; Plattner, Markus; Rau, Arne; Treberspurg, Wolfgang
2017-08-01
ESA's next large X-ray mission ATHENA is designed to address the Cosmic Vision science theme 'The Hot and Energetic Universe'. It will provide answers to the two key astrophysical questions of how ordinary matter assembles into the large-scale structures we see today and how black holes grow and shape the Universe. The ATHENA spacecraft will be equipped with two focal plane cameras, a Wide Field Imager (WFI) and an X-ray Integral Field Unit (X-IFU). The WFI instrument is optimized for state-of-the-art resolution spectroscopy over a large field of view of 40 amin x 40 amin and high count rates up to and beyond 1 Crab source intensity. The cryogenic X-IFU camera is designed for high-spectral-resolution imaging. Both cameras alternately share a mirror system based on silicon pore optics with a focal length of 12 m and a large effective area of about 2 m² at an energy of 1 keV. Although the mission is still in phase A, i.e. studying the feasibility and developing the necessary technology, the definition and development of the instrumentation have already made significant progress. The herein described WFI focal plane camera covers the energy band from 0.2 keV to 15 keV with 450 μm thick, fully depleted, back-illuminated silicon active pixel sensors of DEPFET type. The spatial resolution will be provided by one million pixels, each with a size of 130 μm x 130 μm. The time resolution requirement for the WFI large detector array is 5 ms and for the WFI fast detector 80 μs. The large effective area of the mirror system will be complemented by a high quantum efficiency above 90% for medium and higher energies. The status of the various WFI subsystems to achieve this performance will be described and recent changes will be explained here.
Thermal analysis of the WFI on the ATHENA observatory
NASA Astrophysics Data System (ADS)
Fürmetz, Maria; Pietschner, Daniel; Meidinger, Norbert
2016-07-01
The WFI (Wide-Field Imager) instrument is one of the two instruments of the ATHENA (Advanced Telescope for High-ENergy Astrophysics) mission. ATHENA is the second L-class mission in ESA's Cosmic Vision plan, with launch planned for 2028, and will address the science theme "The Hot and Energetic Universe" by measuring hot gas in clusters and groups of galaxies as well as matter flows into black holes. A moveable mirror assembly focusses the X-ray light onto the focal plane of the WFI. The instrument consists of two separate detectors: one with a large DEPFET array of 512x512 pixels, and one small, fast detector with 64x64 DEPFET pixels and a readout time of only 80 μs. The mirror system will achieve an angular resolution of 5" HEW. The rather large field of view of 40'x40', combined with a rather high power consumption, is challenging, not least for the thermal control system. The DEPFET sensors as well as the front-end electronics and electronics boxes have to be cooled, and a completely passive cooling system with radiators and heat pipes is highly favored. In order to reduce the necessary radiator area, three separate cooling chains at three different temperature levels are foreseen: only the DEPFET sensors are cooled down to the lowest temperature of about 190 K, while the front-end electronics is operated between 250 K and 290 K. The electronics boxes can be operated at room temperature; nevertheless, their excess heat has to be removed. After initial estimates of heat loads and radiator areas, a more detailed model of the camera head was used to identify gradients between the cooling interfaces and the components to be cooled. This information is being used within phase A1 of the project to further optimize the design of the instrument, e.g. through material selection.
NASA Astrophysics Data System (ADS)
Carpano, Stefania; Wilms, Jörn; Rau, Arne
2016-07-01
One of the science goals of the Athena mission is to detect and characterise, in the X-ray domain, transits of hot Jupiter-like planets orbiting their parent stars. To date, the only candidate for this kind of study is HD 189733b, a Jupiter-size planet in a 2-day orbit, for which a transit depth of 6-8% has been observed by accumulating several Chandra and XMM-Newton observations. In this work we simulate realistic light curves of exoplanet transits using the Athena end-to-end simulator, SIXTE, and derive the expected signal-to-noise ratios (SNR) for different instrument configurations and planetary system parameters. We first produce light curves for the currently existing WFI instrument designs and for different source fluxes to extract the expected (white noise) standard deviation. Next, moderate levels of correlated noise and transits of different depths are added to the light curves. As expected, for pure white noise the SNR is proportional to the square root of the flux, of the light curve bin size and of the number of co-added transits, and by definition proportional to the transit depth. When correlated noise becomes significant, rebinning the data only slightly increases the SNR, depending on the noise characteristics. Considering only white noise, a transit in a source like HD 189733, which has a flux around 5x10-13 erg s-1 cm-2 and a transit depth of about 5%, can be detected with SNR>3 in a single transit. With correlated noise, several transits might be necessary. We also simulate trapezoidal transits and try to recover the ingress/egress times after the addition of noise. The relative error on the fitted ingress times is below 10% for most of the light curves with SNR>1.
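The white-noise scalings quoted in this abstract can be made concrete with a small sketch. The helper function and the parameter values below are hypothetical illustrations of the stated proportionalities, not part of the SIXTE simulator:

```python
import math

def transit_snr_per_bin(depth, flux_rate, bin_size, n_transits):
    """Per-bin SNR of a box-shaped transit under pure white (Poisson)
    noise, after co-adding n_transits transits.

    Hypothetical helper: flux_rate is in counts per second, bin_size
    in seconds, depth is the fractional transit depth.
    """
    counts = flux_rate * bin_size * n_transits  # photons accumulated per bin
    sigma = 1.0 / math.sqrt(counts)             # fractional white-noise level
    return depth / sigma                        # = depth * sqrt(counts)

# SNR is linear in transit depth, and grows as the square root of the
# flux, the bin size, and the number of co-added transits:
base = transit_snr_per_bin(0.05, 100.0, 10.0, 1)
```

Doubling the flux, the bin size, or the number of co-added transits each multiplies the SNR by sqrt(2), while doubling the depth doubles it, matching the abstract's white-noise result (correlated noise breaks the sqrt(N) gain from rebinning).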
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes could run and compute trivial small-scale problems on several differently configured servers in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial, large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
APC: A New Code for Atmospheric Polarization Computations
NASA Technical Reports Server (NTRS)
Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.
2014-01-01
A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.
Internal Wave Generation by Convection
NASA Astrophysics Data System (ADS)
Lecoanet, Daniel Michael
In nature, it is not unusual to find stably stratified fluid adjacent to convectively unstable fluid. This can occur in the Earth's atmosphere, where the troposphere is convective and the stratosphere is stably stratified; in lakes, where surface solar heating can drive convection above stably stratified fresh water; in the oceans, where geothermal heating can drive convection near the ocean floor, but the water above is stably stratified due to salinity gradients; possibly in the Earth's liquid core, where gradients in thermal conductivity and compositional diffusivities may lead to different layers of stable or unstable liquid metal; and in stars, as most stars contain at least one convective and at least one radiative (stably stratified) zone. Internal waves propagate in stably stratified fluids. The characterization of the internal waves generated by convection is an open problem in geophysical and astrophysical fluid dynamics. Internal waves can play a dynamically important role via nonlocal transport. Momentum transport by convectively excited internal waves is thought to generate the quasi-biennial oscillation of zonal wind in the equatorial stratosphere, an important physical phenomenon used to calibrate global climate models. Angular momentum transport by convectively excited internal waves may play a crucial role in setting the initial rotation rates of neutron stars. In the last year of life of a massive star, convectively excited internal waves may transport enough energy to the surface layers to unbind them, launching a wind. In each of these cases, internal waves are able to transport some quantity--momentum, angular momentum, energy--across large, stable buoyancy gradients. Thus, internal waves represent an important, if unusual, transport mechanism. This thesis advances our understanding of internal wave generation by convection. Chapter 2 provides an underlying theoretical framework to study this problem.
It describes a detailed calculation of the internal gravity wave spectrum, using the Lighthill theory of wave excitation by turbulence. We use a Green's function approach, in which we convolve a convective source term with the Green's function of different internal gravity waves. The remainder of the thesis is an attempt to verify these analytical predictions. I test the predictions of Chapter 2 via numerical simulation. The first step is to identify a code suitable for this study. I helped develop the Dedalus code framework to study internal wave generation by convection. Dedalus can solve many different partial differential equations using the pseudo-spectral numerical method. In Chapter 3, I demonstrate Dedalus' ability to solve different equations used to model convection in astrophysics. I consider both the propagation and damping of internal waves, and the properties of low Rayleigh number convective steady states, in six different equation sets used in the astrophysics literature. This shows that Dedalus can be used to solve the equations of interest. Next, in Chapter 4, I verify the high accuracy of Dedalus by comparing it to the popular astrophysics code Athena in a standard Kelvin-Helmholtz instability test problem. Dedalus performs admirably in comparison to Athena, and provides a high standard for other codes solving the fully compressible Navier-Stokes equations. Chapter 5 demonstrates that Dedalus can simulate convection adjacent to a stably stratified region, by studying convective mixing near carbon flames. The convective overshoot and mixing is well-resolved, and is able to generate internal waves. Confident in Dedalus' ability to study the problem at hand, Chapter 6 describes simulations inspired by water experiments of internal wave generation by convection. The experiments exploit water's unusual property that its density maximum is at 4°C, rather than at 0°C.
We use a similar equation of state in Dedalus, and study internal gravity wave generation by convection in a water-like fluid. We test two models of wave generation: bulk excitation (equivalent to the Lighthill theory described in Chapter 2), and surface excitation. We find that the bulk excitation model accurately reproduces the waves generated in the simulations, validating the calculations of Chapter 2.
The Impact of Crosstalk in the X-IFU Instrument on Athena Science Cases
NASA Technical Reports Server (NTRS)
Hartog, R. Den; Peille, P.; Dauser, T.; Jackson, B.; Bandler, S.; Barret, D.; Brand, T.; Herder, J-W Den; Kiviranta, M.; Kuur, J. Van Der;
2016-01-01
In this paper we present a first assessment of the impact of various forms of instrumental crosstalk on the science performance of the X-ray Integral Field Unit (X-IFU) on the Athena X-ray mission. This assessment is made using the SIXTE end-to-end simulator in the context of one of the more technically challenging science cases for the X-IFU instrument. Crosstalk considerations may influence or drive various aspects of the design of the array of high-count-rate Transition Edge Sensor (TES) detectors and its Frequency Domain Multiplexed (FDM) readout architecture. The Athena X-ray mission was selected as the second L-class mission in ESA's Cosmic Vision 2015-25 plan, with a launch foreseen in 2028, to address the theme "The Hot and Energetic Universe". One of the two instruments on board Athena is the X-ray Integral Field Unit (X-IFU), which is based on an array of 3800 Transition Edge Sensors (TESs) operated at a temperature of 90 mK. The science cases pose an interesting challenge for this instrument, as they require a combination of high energy resolution (2.5 eV FWHM or better), high spatial resolution (5 arcsec or better) and high count rate capability (several tens of counts per second per detector for point sources as bright as 10 mCrab). The performance at the single sensor level has been demonstrated, but the operation of such detectors in an array, using multiplexed readout, brings additional challenges, both for the design of the array in which the sensors are placed and for the readout of the sensors. The readout of the detector array will be based on Frequency Domain Multiplexing (FDM). In this system of detectors and readout, crosstalk can arise through various mechanisms: on the TES array, neighboring sensors can couple through thermal crosstalk. Detectors adjacent in carrier frequency may suffer from electrical crosstalk due to the finite width of the bandpass filters, and shared sources of impedance in their signal lines.
The signals from the individual detectors are summed and then amplified by a pair of SQUID amplifiers before being sent to warm front-end electronics. The transfer function of the SQUID amplifiers is non-linear, which will give rise to higher harmonics of carriers and intermodulation products when multiple signal pulses are simultaneously present in the SQUID. Under high count rate conditions this is another source of crosstalk. The effect of all these crosstalk sources is that parasitic pulses will appear in the record of a signal pulse, which will create a stochastic offset of the measured energy and thus a degradation of the energy resolution.
Sommerer, Claudia; Suwelack, Barbara; Dragun, Duska; Schenker, Peter; Hauser, Ingeborg A; Nashan, Björn; Thaiss, Friedrich
2016-02-17
Immunosuppression with calcineurin inhibitors remains the mainstay of treatment after kidney transplantation; however, long-term use of these drugs may be associated with nephrotoxicity. In this regard, the current approach is to optimise available immunosuppressive regimens to reduce the calcineurin inhibitor dose while protecting renal function without affecting the efficacy. The ATHENA study is designed to evaluate renal function in two regimens: an everolimus and reduced calcineurin inhibitor-based regimen versus a standard treatment protocol with mycophenolic acid and tacrolimus in de novo kidney transplant recipients. ATHENA is a 12-month, multicentre, open-label, prospective, randomised, parallel-group study in de novo kidney transplant recipients (aged 18 years or older) receiving renal allografts from deceased or living donors. Eligible patients are randomised (1:1:1) prior to transplantation to one of the following three treatment arms: everolimus (starting dose 1.5 mg/day; C0 3-8 ng/mL) with cyclosporine or everolimus (starting dose 3 mg/day; C0 3-8 ng/mL) with tacrolimus or mycophenolic acid (enteric-coated mycophenolate sodium at 1.44 g/day or mycophenolate mofetil at 2 g/day) with tacrolimus; in combination with corticosteroids. All patients receive induction therapy with basiliximab. The primary objective is to demonstrate non-inferiority of renal function (eGFR by the Nankivell formula) in one of the everolimus arms compared with the standard group at month 12 post transplantation. The key secondary objective is to assess the incidence of treatment failure, defined as biopsy-proven acute rejection, graft loss, or death, among the treatment groups. Other objectives include assessment of the individual components of treatment failure, incidence and severity of viral infections, incidence and duration of delayed graft function, incidence of indication biopsies, slow graft function and wound healing complications, and overall safety and tolerability. 
Exploratory objectives include evaluation of left ventricular hypertrophy assessed by the left ventricular mass index, evolution of human leukocyte antigen and non-human leukocyte antigen antibodies, and a cytomegalovirus substudy. As one of the largest European multicentre kidney transplant studies, ATHENA will determine whether a de novo everolimus-based regimen can preserve renal function versus the standard of care. This study further assesses a number of clinical issues which impact long-term outcomes post transplantation; hence, its results will have a major clinical impact. Clinicaltrials.gov: NCT01843348, registered 18 April 2013; EUDRACT number: 2011-005238-21, registered 20 March 2012.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. This includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
NASA Technical Reports Server (NTRS)
Norment, H. G.
1980-01-01
Calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Any subsonic, external, non-lifting flow can be accommodated; flow into, but not through, inlets also can be simulated. Experimental water drop drag relations are used in the water drop equations of motion, and effects of gravity settling are included. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Code descriptions include operating instructions, card inputs and printouts for example problems, and listings of the FORTRAN codes. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
Utilizing GPUs to Accelerate Turbomachinery CFD Codes
NASA Technical Reports Server (NTRS)
MacCalla, Weylin; Kulkarni, Sameer
2016-01-01
GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.
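The conclusion that the parallel work within a kernel must dominate the total runtime before GPGPU pays off is essentially Amdahl's law. A minimal sketch (the function name and the fractions used are illustrative, not figures from the APNASA profiling):

```python
def amdahl_speedup(parallel_fraction, accel_factor):
    """Overall speedup when only a fraction of the runtime is
    accelerated (Amdahl's law); the serial remainder runs at its
    original speed.
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / accel_factor)

# If only 30% of the runtime is offloadable, even an arbitrarily fast
# GPU kernel cannot push the overall speedup past 1/(1-0.30) ~= 1.43x:
limit = amdahl_speedup(0.30, 1e12)
```

This is why a modular code whose runtime is spread across many small routines, as described above, is a poor fit for GPU offload: no single kernel carries enough of the total computation time.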
PASCO: Structural panel analysis and sizing code: Users manual - Revised
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.
1981-01-01
A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.
Computation of Reacting Flows in Combustion Processes
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Chen, Kuo-Huey
1997-01-01
The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D. The ALLSPD-3D computer program is developed for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. Also, it is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code is written in such a way that it can be run on various computer platforms (supercomputers, workstations and parallel processors), and the GUI (Graphical User Interface) should provide a user-friendly tool for setting up and running the code.
NASA Rotor 37 CFD Code Validation: Glenn-HT Code
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2010-01-01
In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.
Final report for the Tera Computer TTI CRADA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, G.S.; Pavlakos, C.; Silva, C.
1997-01-01
Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.
Operations analysis (study 2.1). Program listing for the LOVES computer code
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1974-01-01
A listing of the LOVES computer program is presented. The program is coded partially in SIMSCRIPT and FORTRAN. This version of LOVES is compatible with both the CDC 7600 and the UNIVAC 1108 computers. The code has been compiled, loaded, and executed successfully on the EXEC 8 system for the UNIVAC 1108.
ERIC Educational Resources Information Center
Knowlton, Marie; Wetzel, Robin
2006-01-01
This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…
A MATLAB based 3D modeling and inversion code for MT data
NASA Astrophysics Data System (ADS)
Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.
2017-07-01
The development of a MATLAB-based computer code, AP3DMT, for modeling and inversion of 3D magnetotelluric (MT) data is presented. The code comprises two independent components: a grid generator code and a modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs the core computations in modular form: forward modeling, data functionals, sensitivity computations, and regularization. These modules can be readily extended to other similar inverse problems such as controlled-source EM (CSEM). The modular structure of the code provides a framework useful for the implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes the code more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities, the results of inversion for two complex models are presented.
Applications of automatic differentiation in computational fluid dynamics
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.
1994-01-01
Automatic differentiation (AD) is a powerful computational method for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
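The chain-rule mechanics that ADIFOR applies at the FORTRAN source level can be illustrated with a tiny forward-mode sketch. ADIFOR itself is a source transformation tool; the `Dual` class below is an illustrative stand-in in Python showing how exact derivatives propagate through arithmetic without divided differences:

```python
class Dual:
    """Minimal forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (f + g)' = f' + g'
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (f * g)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    # an ordinary program; no derivative code written by hand
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)   # seed the derivative of the independent variable
y = f(x)             # y.val = f(2) = 17, y.der = f'(2) = 6*2 + 2 = 14
```

The derivatives are exact to machine precision, which is the advantage over the centered divided differences mentioned above.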
NASA Astrophysics Data System (ADS)
Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.
2016-02-01
The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of developing a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena are singled out that require a detailed analysis and the development of models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, the introduction of new models into it, and the enhancement of its usability.
It is shown that the program of development and practical application of the code will make it possible, in the near future, to carry out computations for analyzing the safety of potential NPP projects at a qualitatively higher level.
Performance assessment of KORAT-3D on the ANL IBM-SP computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.
1999-09-01
The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Thomas; Hamilton, Steven; Slattery, Stuart
Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base, and it is export controlled, which makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.
ERIC Educational Resources Information Center
Yoder, Jan D.; And Others
In recent years, there has been a dramatic increase in the volume of empirical research directed toward the issue of sex-roles, including the development of evaluative instruments such as the Attitudes Toward Women Scale (AWS) and the Personal Attributes Questionnaire (PAQ). The United States Military Academy's Project Athena, designed to examine…
In Athena’s Camp; Preparing for a Conflict in the Information Age
1997-01-01
capital, as man remains the purest, richest information-hurling system. In the words of pulp cinema icon John Rambo, "the mind is the greatest...tion camp guards shortly after World War II; the crash of a hijacked Malaysian passenger plane in 1977; the arson attack at an Abadan movie theater
A Case Study of Project ATHENA: Tactical Level Technological Innovation Aboard the USS Benfold
2014-12-01
case studies, grounded theory research, phenomenological research and narrative research. ...Qualitative Methods: Selection. The researcher's... The Case Study Method: Introduction. The case study is a type of qualitative research that enables the researcher to chronicle and...objectives and the research environment ultimately determine the
2012-01-01
training encompasses several concepts, including cognitive knowledge, a performance assessment or pretest, training, a repeat assessment or posttest...significantly decreased mortality. For the lessons learned in casualty care to be passed on to the next group of surgeons, the training for deployed...unpaid consultant to Athena GTX, Blackhawk Products Group, CHI Systems, Combat Medical Systems, Composite Resources, Compression Works, Creative
ERIC Educational Resources Information Center
Elliot, Diane L.; Goldberg, Linn; Moe, Esther L.; DeFrancesco, Carol A.; Durham, Melissa B.; McGinnis, Wendy; Lockwood, Chondra
2008-01-01
Adolescence and emerging adulthood are critical windows for establishing life-long behaviors. We assessed long-term outcomes of a prospective randomized harm reduction/health promotion program for female high school athletes. The intervention's immediate beneficial effects on diet pill use and unhealthy eating behaviors have been reported;…
Fast H.264/AVC FRExt intra coding using belief propagation.
Milani, Simone
2011-01-01
In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly surpasses that of previous still-image coding standards, like JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase in the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by a belief-propagation procedure. Experimental results show that the proposed method saves up to 60% of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits accurate control of the computational complexity, unlike other methods where the complexity depends upon the coded sequence.
2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries
ERIC Educational Resources Information Center
Colby, Jennifer
2015-01-01
This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.
1991-01-01
Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.
User's Manual for FEMOM3DR. Version 1.0
NASA Technical Reports Server (NTRS)
Reddy, C. J.
1998-01-01
FEMoM3DR is a computer code written in FORTRAN 77 to compute the radiation characteristics of antennas on a 3D body using a combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code is written to handle different feeding structures such as coaxial line, rectangular waveguide, and circular waveguide. It uses tetrahedral elements with vector edge basis functions for the FEM and triangular elements with roof-top basis functions for the MoM. By virtue of the FEM, the code can handle arbitrarily shaped three-dimensional bodies with inhomogeneous lossy materials, and owing to the MoM, the computational domain can be terminated in any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.
Selection of a computer code for Hanford low-level waste engineered-system performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGrail, B.P.; Mahoney, L.A.
Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest-ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.
User's manual for a material transport code on the Octopus Computer Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naymik, T.G.; Mendez, G.D.
1978-09-15
A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.
NASA Technical Reports Server (NTRS)
Logan, Terry G.
1994-01-01
The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray Y-MP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 62, and 128 nodes, along with those on a Cray Y-MP with a single processor. The comparison of the performance indicates that the parallel CM-FORTRAN code nearly matches or outperforms the equivalent serial FORTRAN code for some cases.
Computer Description of the M561 Utility Truck
1984-10-01
GIFT Computer Code Sustainability Predictions for Army Spare Components Requirements for Combat (SPARC). ...used as input to the GIFT computer code to generate target vulnerability data. ...analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eyler, L L; Trent, D S; Budden, M J
During the course of TEMPEST computer code development, a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor applications. 47 refs., 94 figs., 6 tabs.
Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August
2018-07-01
Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
Adiabatic topological quantum computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesare, Chris; Landahl, Andrew J.; Bacon, Dave
Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.
Adiabatic topological quantum computing
Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; ...
2015-07-31
Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.
Fast Computation of the Two-Point Correlation Function in the Age of Big Data
NASA Astrophysics Data System (ADS)
Pellegrino, Andrew; Timlin, John
2018-01-01
We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute the auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to that of other clustering codes, and verify the code's accuracy against known and analytic results.
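The jackknife error estimate mentioned above is cheap because it can reuse per-subsample results. The sketch below is not the authors' code; the function name and the use of one precomputed correlation estimate per sky subsample as input are assumptions for illustration.

```python
import numpy as np

def jackknife_error(subsample_estimates):
    """Leave-one-out jackknife error bar for a statistic (e.g., the
    correlation function in one separation bin), given one estimate
    of that statistic per sky subsample."""
    x = np.asarray(subsample_estimates, dtype=float)
    n = len(x)
    # Leave-one-out averages: drop each subsample in turn.
    loo = (x.sum() - x) / (n - 1)
    # Jackknife variance carries the (n-1)/n inflation factor.
    return np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
```

Because each leave-one-out value is formed arithmetically from the precomputed per-subsample estimates, the extra runtime stays negligible no matter how many subsamples are used, consistent with the claim in the abstract.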
Design of convolutional tornado code
NASA Astrophysics Data System (ADS)
Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu
2017-09-01
As a linear block code, the traditional tornado (tTN) code is inefficient in a burst-erasure environment, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which improves the burst-erasure protection capability by applying the convolution property to the tTN code, and reduces computational complexity by abrogating the multi-level structure. The simulation results show that the cTN code provides better packet-loss protection with lower computational complexity than the tTN code.
Three-dimensional turbopump flowfield analysis
NASA Technical Reports Server (NTRS)
Sharma, O. P.; Belford, K. A.; Ni, R. H.
1992-01-01
A program was conducted to develop a flow prediction method applicable to rocket turbopumps. The complex nature of the flowfield in turbopumps is described, and examples of flowfields are discussed to illustrate that physics-based models and analytical calculation procedures based on computational fluid dynamics (CFD) are needed to develop reliable design procedures for turbopumps. A CFD code developed at NASA ARC was used as the base code. The turbulence model and boundary conditions in the base code were modified, respectively, to: (1) compute transitional flows and account for extra rates of strain, e.g., rotation; and (2) compute surface heat transfer coefficients and allow computation through multistage turbomachines. Benchmark-quality data from two- and three-dimensional cascades were used to verify the code. The predictive capabilities of the present CFD code were demonstrated by computing the flow through a radial impeller and a multistage axial-flow turbine. Results of the program indicate that the present code operated in a two-dimensional mode is a cost-effective alternative to full three-dimensional calculations, and that it permits realistic predictions of unsteady loadings and losses for multistage machines.
NASA Technical Reports Server (NTRS)
Smith, S. D.
1984-01-01
A users manual for the RAMP2 computer code is provided. The RAMP2 code can be used to model the dominant phenomena which affect the prediction of liquid and solid rocket nozzle and orbital plume flow fields. The general structure and operation of RAMP2 are discussed. A user input/output guide for the modified TRAN72 computer code and the RAMP2F code is given. The application and use of the BLIMPJ module are considered. Sample problems involving the space shuttle main engine and motor are included.
NASA Technical Reports Server (NTRS)
Chan, William M.
1995-01-01
Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.
NASA Technical Reports Server (NTRS)
Wigton, Larry
1996-01-01
Improvements to the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.
ISSYS: An integrated synergistic Synthesis System
NASA Technical Reports Server (NTRS)
Dovi, A. R.
1980-01-01
Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.
User's manual for a two-dimensional, ground-water flow code on the Octopus computer network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naymik, T.G.
1978-08-30
A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.
Interactive Synthesis of Code Level Security Rules
2017-04-01
Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.
NASA Technical Reports Server (NTRS)
1986-01-01
AGDISP, a computer code written for Langley by Continuum Dynamics, Inc., aids crop dusting airplanes in targeting pesticides. The code is commercially available and can be run on a personal computer by an inexperienced operator. Called SWA+H, it is used by the Forest Service, FAA, DuPont, etc. DuPont uses the code to "test" equipment on the computer using a laser system to measure particle characteristics of various spray compounds.
The adaption and use of research codes for performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebetrau, A.M.
1987-05-01
Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.
Topological color codes on Union Jack lattices: a stable implementation of the whole Clifford group
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katzgraber, Helmut G.; Theoretische Physik, ETH Zurich, CH-8093 Zurich; Bombin, H.
We study the error threshold of topological color codes on Union Jack lattices that allow for the full implementation of the whole Clifford group of quantum gates. After mapping the error-correction process onto a statistical mechanical random three-body Ising model on a Union Jack lattice, we compute its phase diagram in the temperature-disorder plane using Monte Carlo simulations. Surprisingly, topological color codes on Union Jack lattices have a similar error stability to color codes on triangular lattices, as well as to the Kitaev toric code. The enhanced computational capabilities of the topological color codes on Union Jack lattices with respect to triangular lattices and the toric code, combined with the inherent robustness of this implementation, show good prospects for future stable quantum computer implementations.
Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter
2007-08-31
latitude) for 3 different grid spacings. Low-altitude fields produced by a 10-kHz source computed using the FD and TD codes; the agreement is excellent, validating the new FD code. High-altitude fields produced by a 10-kHz source computed using the FD and TD codes; the agreement is again excellent. Low-altitude fields produced by a 20-kHz source computed using the FD and TD codes. High-altitude fields produced
Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation
NASA Technical Reports Server (NTRS)
Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.
2000-01-01
An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a highspeed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.
NASA Technical Reports Server (NTRS)
Hartenstein, Richard G., Jr.
1985-01-01
Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.
NASA Astrophysics Data System (ADS)
Wei, Xiaohui; Li, Weishan; Tian, Hailong; Li, Hongliang; Xu, Haixiao; Xu, Tianfu
2015-07-01
The numerical simulation of multiphase flow and reactive transport in porous media for complex subsurface problems is a computationally intensive application. To meet the increasing computational requirements, this paper presents a parallel computing method and architecture. Derived from TOUGHREACT, a well-established code for simulating subsurface multiphase flow and reactive transport problems, we developed a high-performance computing code, THC-MP, for massively parallel computers, which greatly extends the computational capability of the original code. The domain decomposition method was applied to the coupled numerical computing procedure in THC-MP. We designed the distributed data structure and implemented the data initialization and exchange between the computing nodes and the core solving module using hybrid parallel iterative and direct solvers. Numerical accuracy of THC-MP was verified on a CO2 injection-induced reactive transport problem by comparing the results obtained from the parallel computation with those from sequential computation (the original code). Execution efficiency and code scalability were examined through field-scale carbon sequestration applications on a multicore cluster. The results demonstrate the enhanced performance achieved using THC-MP on parallel computing facilities.
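The domain decomposition approach can be illustrated with a serial toy model. This is not THC-MP code: a 1D explicit diffusion step stands in for the coupled flow and transport solve, and the one-cell ghost exchange mimics the data exchange between computing nodes.

```python
import numpy as np

def decompose(field, nparts):
    """Split a 1D field into contiguous subdomains; in a real run each
    part would live on its own compute node."""
    return np.array_split(field, nparts)

def step_with_ghosts(parts):
    """One explicit diffusion step per subdomain. Each subdomain first
    obtains one ghost value from each neighbour (zero-gradient ghosts
    at the outer boundaries), then updates its cells locally."""
    new_parts = []
    for i, p in enumerate(parts):
        left = parts[i - 1][-1] if i > 0 else p[0]
        right = parts[i + 1][0] if i < len(parts) - 1 else p[-1]
        ext = np.concatenate(([left], p, [right]))
        new_parts.append(ext[1:-1] + 0.25 * (ext[:-2] - 2.0 * ext[1:-1] + ext[2:]))
    return new_parts
```

Because only boundary cells cross subdomain edges, the decomposed update reproduces the single-domain result exactly; preserving this equivalence while distributing the work is the essential property a parallel code must maintain.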
Code of Ethical Conduct for Computer-Using Educators: An ICCE Policy Statement.
ERIC Educational Resources Information Center
Computing Teacher, 1987
1987-01-01
Prepared by the International Council for Computers in Education's Ethics and Equity Committee, this code of ethics for educators using computers covers nine main areas: curriculum issues, issues relating to computer access, privacy/confidentiality issues, teacher-related issues, student issues, the community, school organizational issues,…
ERIC Educational Resources Information Center
Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler
2018-01-01
Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…
Searching for the Soul: Athena's Owl in the Comparative Education Cosmos
ERIC Educational Resources Information Center
Silova, Iveta
2018-01-01
Professor Kazamias has argued that comparative education has lost its "soul," by abandoning its historical and humanist episteme in the first half of the 20th century and turning to an ahistorical and nonhumanist social science today. This essay takes the readers on a journey across time and space in search of comparative education's…
Report of the Admission of Women to the U.S. Military Academy. Project Athena. 2 September 1977.
ERIC Educational Resources Information Center
Vitters, Alan G.; Kinzer, Nora Scott
Significant actions taken from June 1975 to June 1977 to integrate women into the U.S. Military Academy (USMA) and results of research on coeducation are summarized. Three time periods are discussed: pre-admission phase; cadet basic training, and the initial academic year. Data are presented on characteristics of entering cadets; resignation…
Report of the Admission of Women to the U.S. Military Academy. Project Athena II. 1 June 1978.
ERIC Educational Resources Information Center
Vitters, Alan G.
Coeducation at the U.S. Military Academy from June 1977 to April 1978 is analyzed. Summaries of individual research projects conducted to understand and evaluate specific aspects of coeducation are included. An open-systems model served as a conceptual guide for the study. The following areas were explored: individual adjustment, attitudes, social…
Fahim, Muhammad; Idris, Muhammad; Ali, Rahman; Nugent, Christopher; Kang, Byeong; Huh, Eui-Nam; Lee, Sungyoung
2014-01-01
Technology provides ample opportunities for the acquisition and processing of physical, mental and social health primitives. However, several challenges remain for researchers in defining the relationships among reported physical activities, mood and social interaction that characterize an active lifestyle. We are conducting a project, ATHENA (activity-awareness for human-engaged wellness applications), to design and integrate the relationships among these basic health primitives to approximate the human lifestyle and provide real-time recommendations for wellbeing services. Our goal is to develop a system to promote an active lifestyle for individuals and to recommend to them valuable interventions by making comparisons to their past habits. The proposed system processes sensory data through our developed machine learning algorithms inside smart devices and utilizes cloud infrastructure to reduce the cost. We exploit big data infrastructure for massive sensory data storage and fast retrieval for recommendations. Our contributions include the development of a prototype system to promote an active lifestyle and a visual design capable of engaging users in the goal of increasing self-motivation. We believe that our study will impact the design of future ubiquitous wellness applications. PMID:24859031
New x-ray parallel beam facility XPBF 2.0 for the characterization of silicon pore optics
NASA Astrophysics Data System (ADS)
Krumrey, Michael; Müller, Peter; Cibik, Levent; Collon, Max; Barrière, Nicolas; Vacanti, Giuseppe; Bavdaz, Marcos; Wille, Eric
2016-07-01
A new X-ray parallel beam facility (XPBF 2.0) has been installed in the laboratory of the Physikalisch-Technische Bundesanstalt at the synchrotron radiation facility BESSY II in Berlin to characterize silicon pore optics (SPOs) for the future X-ray observatory ATHENA. Like the existing XPBF, which has been in operation since 2005, the new beamline provides a pencil beam of very low divergence, a vacuum chamber with a hexapod system for accurate positioning of the SPO to be investigated, and a vertically movable CCD-based camera system to register the direct and the reflected beam. In contrast to the existing beamline, a multilayer-coated toroidal mirror is used for beam monochromatization at 1.6 keV and collimation, enabling the use of beam sizes between about 100 μm and at least 5 mm. Thus the quality of individual pores as well as the focusing properties of large groups of pores can be investigated. The new beamline also features increased travel ranges for the hexapod to cope with larger SPOs, and a sample-to-detector distance of 12 m corresponding to the envisaged focal length of ATHENA.
XMM-Newton, powerful AGN winds and galaxy feedback
NASA Astrophysics Data System (ADS)
Pounds, K.; King, A.
2016-06-01
The discovery that ultra-fast ionized winds - sufficiently powerful to disrupt growth of the host galaxy - are a common feature of luminous AGN is a major scientific breakthrough led by XMM-Newton. An extended observation in 2014 of the prototype UFO, PG1211+143, has revealed an unusually complex outflow, with distinct and persisting velocities detected in both hard and soft X-ray spectra. While the general properties of UFOs are consistent with being launched - at the local escape velocity - from the inner disc where the accretion rate is modestly super-Eddington (King and Pounds, Ann Rev Astron Astrophys 2015), these more complex flows have raised questions about the outflow geometry and the importance of shocks and enhanced cooling. XMM-Newton seems likely to remain the best observatory to study UFOs prior to Athena, and further extended observations, of PG1211+143 and other bright AGN, have the exciting potential to establish the typical wind dynamics, while providing new insights on the accretion geometry and continuum source structure. An emphasis on such large, coordinated observing programmes with XMM-Newton over the next decade will continue the successful philosophy pioneered by EXOSAT, while helping to inform the optimum planning for Athena.
Hoy, Sheridan M; Keam, Susan J
2009-08-20
Oral dronedarone is a non-iodinated benzofuran derivative structurally related to amiodarone. Although it is considered a class III antiarrhythmic agent like amiodarone, it demonstrates multi-class electrophysiological activity. Data from the ATHENA study demonstrated that patients receiving oral dronedarone 400 mg twice daily for 12-30 months had a significantly lower risk of experiencing first hospitalization due to a cardiovascular event or death from any cause than those receiving placebo. Dronedarone exhibited rate- and rhythm-controlling properties in patients with atrial fibrillation (AF) or atrial flutter, significantly reducing the risk of a first recurrence of AF versus placebo following 12 months' therapy in the ADONIS and EURIDIS studies. In the ERATO study, dronedarone was also significantly more effective than placebo in terms of ventricular rate control. Furthermore, the beneficial effects of oral dronedarone on ventricular rate control were maintained during exercise and sustained with continued therapy. Oral dronedarone was generally well tolerated in the treatment of adult patients with AF and/or atrial flutter in clinical studies. The incidence of diarrhoea, nausea, bradycardia, rash and QT-interval prolongation was significantly higher with oral dronedarone than placebo in the large ATHENA study; however, serious cardiac-related adverse events were observed in <1% of oral dronedarone recipients.
Drug safety evaluation of dronedarone in atrial fibrillation.
De Ferrari, Gaetano M; Dusi, Veronica
2012-11-01
Dronedarone was developed with the intent of replicating the antiarrhythmic effects of amiodarone, while minimizing its side effects. Side effects reported in eight randomized clinical trials are discussed, comparing dronedarone and placebo (DAFNE, EURIDIS, ADONIS, ERATO, ANDROMEDA, ATHENA, PALLAS, total number of patients treated with dronedarone 5,347), or dronedarone and amiodarone (DIONYSOS, total number of patients treated with dronedarone 249). The results of the first trials, including ATHENA, set high expectations by suggesting that dronedarone may decrease the risk of hospitalization (and even cardiovascular mortality) among patients with paroxysmal and persistent atrial fibrillation (AF), and that it could be regarded as an easy-to-use drug that could be prescribed by general practitioners; unfortunately, dronedarone has not met these expectations. Dronedarone may increase mortality and heart failure hospitalization in patients with advanced NYHA class and in patients with permanent AF, preventing its use in these settings. In addition to gastrointestinal side effects that may lead to discontinuation in 5-10% of patients, dronedarone may induce very rare but severe liver and lung toxicity. Despite these limitations and its relatively limited antiarrhythmic potency, dronedarone may still be a useful drug for well-selected patients.
Mitigation strategies against radiation-induced background for space astronomy missions
NASA Astrophysics Data System (ADS)
Davis, C. S. W.; Hall, D.; Keelan, J.; O'Farrell, J.; Leese, M.; Holland, A.
2018-01-01
The Advanced Telescope for High ENergy Astrophysics (ATHENA) mission is a major upcoming space-based X-ray observatory due to be launched in 2028 by ESA, with the purpose of mapping the early universe and observing black holes. Background radiation is expected to constitute a large fraction of the total system noise in the Wide Field Imager (WFI) instrument on ATHENA, and designing an effective system to reduce the background radiation impacting the WFI will be crucial for maximising its sensitivity. Significant background sources are expected to include high-energy protons, X-ray fluorescence lines, 'knock-on' electrons and Compton electrons. Due to the variety of background sources, multiple shielding methods may be required to achieve maximum sensitivity in the WFI. These techniques may also be of great interest for use in future space-based X-ray experiments. Simulations have been developed to model the effect of a graded-Z shield on the X-ray fluorescence background. In addition, the effect of a 90 nm optical blocking filter on the secondary electron background has been investigated and shown to modify the requirements of any secondary electron shielding that is to be used.
The Focal Plane Assembly for the Athena X-Ray Integral Field Unit Instrument
NASA Technical Reports Server (NTRS)
Jackson, B. D.; Van Weers, H.; van der Kuur, J.; den Hartog, R.; Akamatsu, H.; Argan, A.; Bandler, S. R.; Barbera, M.; Barret, D.; Bruijn, M. P.;
2016-01-01
This paper summarizes a preliminary design concept for the focal plane assembly of the X-ray Integral Field Unit on the Athena spacecraft, an imaging microcalorimeter that will enable high spectral resolution imaging and point-source spectroscopy. The instrument's sensor array will be a 3840-pixel transition edge sensor (TES) microcalorimeter array, with a frequency domain multiplexed SQUID readout system allowing this large-format sensor array to be operated within the thermal constraints of the instrument's cryogenic system. A second TES detector will be operated in close proximity to the sensor array to detect cosmic rays and secondary particles passing through the sensor array for off-line coincidence detection to identify and reject events caused by the in-orbit high-energy particle background. The detectors, operating at 55 mK or less, will be thermally isolated from the instrument cryostat's 2 K stage, while shielding and filtering within the FPA will protect the instrument's sensitive sensor array, during both on-ground testing and in-flight operation, from stray light from the cryostat environment, low-energy photons entering through the X-ray aperture, low-frequency magnetic fields, and high-frequency electric fields.
NASA Technical Reports Server (NTRS)
Norment, H. G.
1985-01-01
Subsonic, external flow about nonlifting bodies, lifting bodies or combinations of lifting and nonlifting bodies is calculated by a modified version of the Hess lifting code. Trajectory calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Inlet flow can be accommodated, and high Mach number compressibility effects are corrected for approximately. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
...Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that... ...of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve...
A COTS-Based Replacement Strategy for Aging Avionics Computers
2001-12-01
Keywords: Communication Control Unit; COTS microprocessor; real-time operating system; native code objects and threads; legacy function; virtual component environment; context-switch thunk.
PARAVT: Parallel Voronoi tessellation code
NASA Astrophysics Data System (ADS)
González, R. E.
2016-10-01
In this study, we present a new open source code for massive parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open source, parallel implementation has been available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. The domain decomposition ensures consistent boundary computation between tasks and includes periodic boundary conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
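The core quantities the abstract mentions (Voronoi neighbors, cell volumes, and densities) can be sketched serially with SciPy's Qhull bindings; this is an illustrative reconstruction under stated assumptions, not code from the PARAVT repository, and the particle counts are arbitrary:

```python
# Serial sketch of the Voronoi quantities PARAVT computes; the real code
# distributes this work over MPI tasks with consistent domain boundaries.
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

rng = np.random.default_rng(0)
points = rng.random((200, 3))          # stand-in for particle positions
vor = Voronoi(points)                  # Qhull-backed tessellation

# Neighbor list: two particles are neighbors if their cells share a ridge.
neighbors = {i: set() for i in range(len(points))}
for a, b in vor.ridge_points:
    neighbors[a].add(b)
    neighbors[b].add(a)

# Voronoi density: inverse cell volume, defined only for bounded cells.
def cell_density(i):
    region = vor.regions[vor.point_region[i]]
    if -1 in region or len(region) == 0:   # unbounded cell at the domain edge
        return np.nan
    return 1.0 / ConvexHull(vor.vertices[region]).volume

densities = np.array([cell_density(i) for i in range(len(points))])
print(np.nanmean(densities))
```

A parallel version would decompose the domain across tasks and exchange ghost particles near boundaries so that edge cells are computed consistently, which is the part PARAVT handles via MPI.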
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Brogan, F. A.
1978-01-01
Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.
Holonomic surface codes for fault-tolerant quantum computation
NASA Astrophysics Data System (ADS)
Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco
2018-02-01
Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.
NASA Technical Reports Server (NTRS)
Chima, R. V.; Strazisar, A. J.
1982-01-01
Two and three dimensional inviscid solutions for the flow in a transonic axial compressor rotor at design speed are compared with probe and laser anemometer measurements at near-stall and maximum-flow operating points. Experimental details of the laser anemometer system and computational details of the two dimensional axisymmetric code and three dimensional Euler code are described. Comparisons are made between relative Mach number and flow angle contours, shock location, and shock strength. A procedure for using an efficient axisymmetric code to generate downstream pressure input for computationally expensive Euler codes is discussed. A film supplement shows the calculations of the two operating points with the time-marching Euler code.
Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution
NASA Astrophysics Data System (ADS)
Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi
2015-05-01
In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited to and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and the particles' interactions with the materials in the environment and the detector volume. In the second step, the number of scintillation photons produced by charged particles such as electrons, alphas, protons, and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons through the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources; hence, neutron/gamma discrimination in mixed fields may be performed using MCNPX-ESUT. The main feature of the MCNPX-ESUT code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using MCNPX-ESUT. The simulated neutron pulse height distributions are validated by comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment 664 (2012) 304-309) and with results obtained from similar computer codes such as SCINFUL, NRESP7 and Geant4.
The simulated gamma pulse height distribution for a 137Cs source is also compared with the experimental data.
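The final, resolution step of such a simulation chain can be illustrated with a toy example: an ideal light-output spectrum is smeared with a light-dependent Gaussian before comparison with experiment. The empirical FWHM form and its coefficients below are illustrative placeholders, not the values used by MCNPX-ESUT or measured for NE-213:

```python
# Toy sketch of resolution broadening, the last step of the chain above.
import numpy as np

rng = np.random.default_rng(1)
ideal_light = rng.uniform(0.2, 2.0, 50_000)   # ideal event light outputs (MeVee)

def fwhm(L, a=0.05, b=0.10, c=0.01):
    # A common empirical form: FWHM/L = sqrt(a^2 + b^2/L + c^2/L^2);
    # the coefficients here are placeholders for illustration only.
    return L * np.sqrt(a**2 + b**2 / L + c**2 / L**2)

sigma = fwhm(ideal_light) / 2.355             # convert FWHM to Gaussian sigma
measured = rng.normal(ideal_light, sigma)     # broadened pulse heights
hist, edges = np.histogram(measured, bins=100, range=(0.0, 2.5))
```

In a full code, `ideal_light` would come from the transport and photon-production steps rather than a random draw.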
EAC: A program for the error analysis of STAGS results for plates
NASA Technical Reports Server (NTRS)
Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.
1989-01-01
A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example application of the code is presented, and instructions for its use on the Cyber and VAX machines are provided.
CFD Modeling of Free-Piston Stirling Engines
NASA Technical Reports Server (NTRS)
Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.
2001-01-01
NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.
Indirect and Direct Signatures of Young Planets in Protoplanetary Disks
NASA Astrophysics Data System (ADS)
Zhu, Zhaohuan; Stone, James M.; Dong, Ruobing; Rafikov, Roman; Bai, Xue-Ning
2015-12-01
Directly finding young planets around protostars is challenging since protostars are highly variable and obscured by dust. However, young planets will interact with protoplanetary disks, inducing disk features such as gaps, spiral arms, and asymmetric features, which are much easier to detect. Transitional disks, which are protoplanetary disks with gaps and holes, are excellent candidates for finding young planets. Although these disks have been studied extensively in observations (e.g. using Subaru, VLT, ALMA, EVLA), theoretical models still need to be developed to explain observations. We have constructed numerical simulations, including dust particle dynamics and MHD effects, to study planet-disk interaction, with an emphasis on explaining observations. Our simulations have successfully reproduced spiral arms, gaps and asymmetric features observed in transitional disks. Furthermore, by comparing with observations, we have constrained protoplanetary disk properties and pinpointed potential planets in these disks. We will present progress in constructing global simulations to study transitional disks, including using our recently developed Athena++ code with static-mesh-refinement for MHD. Finally we suggest that accreting circumplanetary disks can release an observable amount of energy and could be the key to detecting young planets directly. We will discuss how JWST and next generation telescopes can help to find these young planets with circumplanetary disks.
On the error statistics of Viterbi decoding and the performance of concatenated codes
NASA Technical Reports Server (NTRS)
Miller, R. L.; Deutsch, L. J.; Butman, S. A.
1981-01-01
Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.
New double-byte error-correcting codes for memory systems
NASA Technical Reports Server (NTRS)
Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.
1996-01-01
Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacMillan, D.B.
1960-06-01
A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)
Computer-assisted coding and clinical documentation: first things first.
Tully, Melinda; Carmichael, Angela
2012-10-01
Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.
1987-01-01
The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube with 128 megabytes of memory for a 32-node configuration with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
Bistatic radar cross section of a perfectly conducting rhombus-shaped flat plate
NASA Astrophysics Data System (ADS)
Fenn, Alan J.
1990-05-01
The bistatic radar cross section of a perfectly conducting flat plate that has a rhombus shape (equilateral parallelogram) is investigated. The Ohio State University electromagnetic surface patch code (ESP version 4) is used to compute the theoretical bistatic radar cross section of a 35- x 27-in rhombus plate at 1.3 GHz over the bistatic angles 15 deg to 142 deg. The ESP-4 computer code is a method of moments FORTRAN-77 program which can analyze general configurations of plates and wires. This code has been installed and modified at Lincoln Laboratory on a SUN 3 computer network. Details of the code modifications are described. Comparisons of the method of moments simulations and measurements of the rhombus plate are made. It is shown that the ESP-4 computer code provides a high degree of accuracy in the calculation of copolarized and cross-polarized bistatic radar cross section patterns.
ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
Navier-Stokes Simulation of Homogeneous Turbulence on the CYBER 205
NASA Technical Reports Server (NTRS)
Wu, C. T.; Ferziger, J. H.; Chapman, D. R.; Rogallo, R. S.
1984-01-01
A computer code which solves the Navier-Stokes equations for three dimensional, time-dependent, homogeneous turbulence has been written for the CYBER 205. The code has options for both 64-bit and 32-bit arithmetic. With 32-bit computation, mesh sizes up to 64³ are contained within core of a 2 million 64-bit word memory. Timing runs were made for various vector lengths up to 6144. With this code, speeds a little over 100 Mflops have been achieved on a 2-pipe CYBER 205. Several problems encountered in the coding are discussed.
The investigation of tethered satellite system dynamics
NASA Technical Reports Server (NTRS)
Lorenzini, E.
1985-01-01
The tether control law to retrieve the satellite was modified in order to have a smooth retrieval trajectory of the satellite that minimizes the thruster activation. The satellite thrusters were added to the rotational dynamics computer code and a preliminary control logic was implemented to simulate them during the retrieval maneuver. The high resolution computer code for modelling the three dimensional dynamics of untensioned tether, SLACK3, was made fully operative and a set of computer simulations of possible tether breakages was run. The distribution of the electric field around an electrodynamic tether in vacuo severed at some length from the shuttle was computed with a three dimensional electrodynamic computer code.
Experimental and computational surface and flow-field results for an all-body hypersonic aircraft
NASA Technical Reports Server (NTRS)
Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.
1990-01-01
The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.
Bezemer, Daniela; Cori, Anne; Ratmann, Oliver; van Sighem, Ard; Hermanides, Hillegonda S; Dutilh, Bas E; Gras, Luuk; Rodrigues Faria, Nuno; van den Hengel, Rob; Duits, Ashley J; Reiss, Peter; de Wolf, Frank; Fraser, Christophe
2015-11-01
The HIV-1 subtype B epidemic amongst men who have sex with men (MSM) is resurgent in many countries despite the widespread use of effective combination antiretroviral therapy (cART). In this combined mathematical and phylogenetic study of observational data, we aimed to find out the extent to which the resurgent epidemic is the result of newly introduced strains or of growth of already circulating strains. As of November 2011, the ATHENA observational HIV cohort of all patients in care in the Netherlands since 1996 included HIV-1 subtype B polymerase sequences from 5,852 patients. Patients who were diagnosed between 1981 and 1995 were included in the cohort if they were still alive in 1996. The ten most similar sequences to each ATHENA sequence were selected from the Los Alamos HIV Sequence Database, and a phylogenetic tree was created of a total of 8,320 sequences. Large transmission clusters that included ≥10 ATHENA sequences were selected, with a local support value ≥ 0.9 and median pairwise patristic distance below the fifth percentile of distances in the whole tree. Time-varying reproduction numbers of the large MSM-majority clusters were estimated through mathematical modeling. We identified 106 large transmission clusters, including 3,061 (52%) ATHENA and 652 Los Alamos sequences. Half of the HIV sequences from MSM registered in the cohort in the Netherlands (2,128 of 4,288) were included in 91 large MSM-majority clusters. Strikingly, at least 54 (59%) of these 91 MSM-majority clusters were already circulating before 1996, when cART was introduced, and have persisted to the present. Overall, 1,226 (35%) of the 3,460 diagnoses among MSM since 1996 were found in these 54 long-standing clusters. The reproduction numbers of all large MSM-majority clusters were around the epidemic threshold value of one over the whole study period. A tendency towards higher numbers was visible in recent years, especially in the more recently introduced clusters. 
The mean age of MSM at diagnosis increased by 0.45 years/year within clusters, but new clusters appeared with lower mean age. Major strengths of this study are the high proportion of HIV-positive MSM with a sequence in this study and the combined application of phylogenetic and modeling approaches. Main limitations are the assumption that the sampled population is representative of the overall HIV-positive population and the assumption that the diagnosis interval distribution is similar between clusters. The resurgent HIV epidemic amongst MSM in the Netherlands is driven by several large, persistent, self-sustaining, and, in many cases, growing sub-epidemics shifting towards new generations of MSM. Many of the sub-epidemics have been present since the early epidemic, to which new sub-epidemics are being added.
Computer search for binary cyclic UEP codes of odd length up to 65
NASA Technical Reports Server (NTRS)
Lin, Mao-Chao; Lin, Chi-Chang; Lin, Shu
1990-01-01
Using an exhaustive computation, the unequal error protection capabilities of all binary cyclic codes of odd length up to 65 that have minimum distances at least 3 are found. For those codes that can only have upper bounds on their unequal error protection capabilities computed, an analytic method developed by Dynkin and Togonidze (1976) is used to show that the upper bounds meet the exact unequal error protection capabilities.
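The flavor of such an exhaustive computation can be shown on a miniature case: enumerate every codeword of a small binary cyclic code and read off its minimum distance (for a linear code, the minimum nonzero weight). The (7,4) code generated by g(x) = x³ + x + 1 below is a standard textbook example, not one of the length-65 codes from the study:

```python
# Exhaustive enumeration of a small binary cyclic code over GF(2).
from itertools import product

n, g = 7, 0b1011          # length 7, generator polynomial x^3 + x + 1
k = n - 3                 # deg g = 3, so the code has k = 4 message bits

def encode(msg_bits):
    # Codeword polynomial c(x) = m(x) * g(x), multiplied over GF(2).
    m = int("".join(map(str, msg_bits)), 2)
    c, shift = 0, 0
    while m:
        if m & 1:
            c ^= g << shift
        m >>= 1
        shift += 1
    return c

weights = [bin(encode(msg)).count("1")
           for msg in product([0, 1], repeat=k) if any(msg)]
print(min(weights))       # minimum distance of the code; prints 3
```

The real search additionally computes the unequal error protection separation vector per message position, which requires weights restricted to cosets rather than a single minimum; that refinement is omitted here.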
A Combinatorial Geometry Computer Description of the MEP-021A Generator Set
1979-02-01
Keywords: generator computer description; gasoline generator; GIFT; MEP-021A. ...The GIFT code is also stored on magnetic tape for future vulnerability analysis. ...the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack...
Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)
NASA Technical Reports Server (NTRS)
Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.
1972-01-01
A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes on the CDC 6600 computer.
Unaligned instruction relocation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
Unaligned instruction relocation
Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.
2018-01-23
In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.
Computer algorithm for coding gain
NASA Technical Reports Server (NTRS)
Dodd, E. E.
1974-01-01
Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
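The abstract does not reproduce the report's empirical coding-gain formula, so as a stand-in, the following is the standard asymptotic soft-decision coding-gain estimate, G = 10 log10(r * d_free), applied to the classic rate-1/2, constraint-length-7 nonsystematic convolutional code (an assumed example, not necessarily the report's own formula or code):

```python
import math

def asymptotic_coding_gain_db(rate, d_free):
    """Asymptotic soft-decision coding gain, 10*log10(r * d_free), in dB."""
    return 10.0 * math.log10(rate * d_free)

# Rate-1/2, K=7 convolutional code (d_free = 10), widely used with
# soft-decision Viterbi decoding in space communications links.
gain = asymptotic_coding_gain_db(0.5, 10)
print(round(gain, 2))  # 6.99, i.e. about 7 dB
```

Real link-design tools refine this asymptotic figure with measured performance data at the target bit-error rate, as the abstract describes.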
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
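As a toy illustration of the metamodeling idea (a generic polynomial response surface, not any specific technique from the paper), a cheap surrogate can be fit to a handful of runs of an "expensive" analysis code and then queried in its place; `expensive_code` and all numbers below are hypothetical:

```python
import numpy as np

# Stand-in for a costly analysis code; in practice each call might take hours.
def expensive_code(x):
    return np.sin(2.0 * x) + 0.5 * x

x_train = np.linspace(0.0, 2.0, 8)        # design of experiments: 8 "runs"
y_train = expensive_code(x_train)          # expensive evaluations
coeffs = np.polyfit(x_train, y_train, 4)   # quartic response-surface fit
surrogate = np.poly1d(coeffs)              # metamodel: cheap to evaluate

# Query the surrogate where no run was performed and compare.
x_new = 1.3
print(abs(surrogate(x_new) - expensive_code(x_new)))
```

The surrogate is then handed to an optimizer in place of the analysis code, which is exactly the "orders of magnitude cheaper" usage the paper reviews; the paper's warning is that such fits can mislead when deterministic code output is treated as if it contained random error.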
Design and optimization of a portable LQCD Monte Carlo code using OpenACC
NASA Astrophysics Data System (ADS)
Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele
The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processing Units (GPUs), exploiting aggressive data-parallelism and delivering higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent, making it tedious and error-prone to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.
Development of a thermal and structural analysis procedure for cooled radial turbines
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Deanna, Russell G.
1988-01-01
A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi three-dimensional code computes the external free stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous one-dimensional internal flow code for the momentum and energy equation. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.
COMPUTATION OF GLOBAL PHOTOCHEMISTRY WITH SMVGEAR II (R823186)
A computer model was developed to simulate global gas-phase photochemistry. The model solves chemical equations with SMVGEAR II, a sparse-matrix, vectorized Gear-type code. To obtain SMVGEAR II, the original SMVGEAR code was modified to allow computation of different sets of chem...
NASA Technical Reports Server (NTRS)
Weed, Richard Allen; Sankar, L. N.
1994-01-01
An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has led to research to develop procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1979-01-01
The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.
A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle
1984-12-01
The program requires as input the M9 target description as processed by the Geometric Information for Targets (GIFT) computer code. The first step is to build a combinatorial geometry (COM-GEOM) model of the target. This COM-GEOM target description is used as input to the GIFT computer code. Among other things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit.
Richard D. Bergman
2015-01-01
Developing wood product LCI data helps construct product LCAs that are then incorporated into developing whole building LCAs in environmental footprint software such as the Athena Impact Estimator for Buildings (ASMI 2015). Conducting whole building LCAs provide for points that go toward green building certification in rating systems such as LEED v4, Green Globes, and...
The Athena Microscopic Imager on the Mars Exploration Rovers
NASA Astrophysics Data System (ADS)
Herkenhoff, K. E.; Squyres, S. W.; Bell, J. F.; Maki, J. N.; Schwochert, M. A.
2002-12-01
The Athena science payload on the Mars Exploration Rovers (MER) includes the Microscopic Imager (MI). The MI is a fixed-focus camera mounted on the end of the Instrument Deployment Device (IDD). The MI was designed to acquire images at a spatial resolution of 30 microns/pixel over a broad spectral range (400-700 nm). Technically speaking, the ''microscopic'' imager is not a microscope: it has a fixed magnification of 0.4, and is intended to produce images that simulate a geologist's view when using a common hand lens. The MI uses the same electronics design as the other MER cameras, but has optics that yield a field of view of 31 x 31 mm. The MI will acquire images using only solar or skylight illumination of the target surface. A contact sensor will be used to place the MI slightly closer to the target surface than its best focus distance (about 66 mm), allowing concave surfaces to be imaged in good focus. Because the MI has a relatively small depth of field (+/- 3 mm), a single MI image of a rough surface will contain both focused and unfocused areas. Coarse (~2 mm precision) focusing will be achieved by moving the IDD away from a target after the contact sensor is activated. Multiple images taken at various distances will be acquired to ensure good focus on all parts of rough surfaces. By combining a set of images acquired in this way, a completely focused image will be assembled. The MI optics will be protected from the martian environment by a dust cover. The dust cover includes a polycarbonate window that is tinted yellow to restrict the spectral bandpass to 500-700 nm and allow color information to be obtained by taking images with the dust cover open and closed. The MI will be used to image the same materials measured by other Athena instruments, as well as targets of opportunity (before rover traverses). The resulting images will be used to place other instrumental data in context and to aid in the petrologic interpretation of rocks and soils on Mars.
Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.
1999-01-01
A micromechanics-based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. This computer code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber which had been chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on the bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and good agreement is achieved.
Fault tolerant computing: A preamble for assuring viability of large computer systems
NASA Technical Reports Server (NTRS)
Lim, R. S.
1977-01-01
The need for fault-tolerant computing is addressed from the viewpoints of (1) why it is needed, (2) how to apply it in the current state of technology, and (3) what it means in the context of the Phoenix computer system and other related systems. To this end, the value of concurrent error detection and correction is described. User protection, program retry, and repair are among the factors considered. The technology of algebraic codes to protect memory systems and arithmetic codes to protect arithmetic operations is discussed.
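As a concrete, minimal instance of the algebraic memory-protection codes mentioned (an illustrative choice, not necessarily the specific code discussed in the report), the Hamming(7,4) code corrects any single bit flip in a 7-bit word:

```python
import numpy as np

# Hamming(7,4): G = [I | P] encodes 4 data bits into 7; H = [P^T | I]
# computes a 3-bit syndrome that points at any single flipped bit.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(d):
    return d @ G % 2

def decode(r):
    s = H @ r % 2
    if s.any():
        # the syndrome equals the column of H at the error position
        err = int(np.argmax((H.T == s).all(axis=1)))
        r = r.copy()
        r[err] ^= 1
    return r[:4]

data = np.array([1, 0, 1, 1])
code = encode(data)
code[5] ^= 1                 # inject a single-bit "memory" error
print(decode(code))          # recovers [1 0 1 1]
```

This is the kind of concurrent detection-and-correction the report argues for: the check is algebraic and runs on every memory read, with no program retry needed for single-bit faults.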
The Advanced Software Development and Commercialization Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallopoulos, E.; Canfield, T.R.; Minkoff, M.
1990-09-01
This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities: COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used both for nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.
Source Code Plagiarism--A Student Perspective
ERIC Educational Resources Information Center
Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.
2011-01-01
This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2004-01-01
This viewgraph presentation provides samples of computer code which have characteristics of poetic verse, and addresses the theoretical underpinnings of artistic coding, as well as how computer language influences software style, and the possible style of future coding.
Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing
NASA Astrophysics Data System (ADS)
Salamone, Joseph A., III
Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was implemented numerically in a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.
Visual Computing Environment Workshop
NASA Technical Reports Server (NTRS)
Lawrence, Charles (Compiler)
1998-01-01
The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.
Force user's manual: A portable, parallel FORTRAN
NASA Technical Reports Server (NTRS)
Jordan, Harry F.; Benten, Muhammad S.; Arenstorf, Norbert S.; Ramanan, Aruna V.
1990-01-01
The use of Force, a parallel, portable FORTRAN, on shared memory parallel computers is described. Force simplifies writing code for parallel computers and, once the parallel code is written, it is easily ported to computers on which Force is installed. Although Force is nearly the same for all computers, specific details are included for the Cray-2, Cray Y-MP, Convex 220, Flex/32, Encore, Sequent, and Alliant computers on which it is installed.
Monte Carlo simulation of Ising models by multispin coding on a vector computer
NASA Astrophysics Data System (ADS)
Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus
1984-11-01
Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
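A minimal sketch of the multispin-coding idea, assuming one spin per bit on a 1-D ring (the paper's vectorized Cyber 205 implementation is far more elaborate): a single XOR plus a population count evaluates all nearest-neighbor bonds of a packed word at once.

```python
# Multispin coding packs one Ising spin (0 = down, 1 = up) into each bit,
# so bitwise operations update or measure many spins per machine word.
def unsatisfied_bonds_packed(s, n_bits=8):
    """Count antiparallel neighbor pairs on a ring of n_bits spins packed in int s."""
    mask = (1 << n_bits) - 1
    rotated = ((s >> 1) | ((s & 1) << (n_bits - 1))) & mask  # neighbor word
    return bin((s ^ rotated) & mask).count("1")              # XOR + popcount

def unsatisfied_bonds_naive(spins):
    n = len(spins)
    return sum(spins[i] != spins[(i + 1) % n] for i in range(n))

s = 0b10110010
spins = [(s >> i) & 1 for i in range(8)]
print(unsatisfied_bonds_packed(s), unsatisfied_bonds_naive(spins))  # 6 6
```

The packed count agrees with the spin-by-spin loop while touching the word only twice, which is the source of the method's speed on vector hardware.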
NASA Technical Reports Server (NTRS)
Chan, J. S.; Freeman, J. A.
1984-01-01
The viscous, axisymmetric flow in the thrust chamber of the space shuttle main engine (SSME) was computed on the CRAY 205 computer using the general interpolants method (GIM) code. Results show that the Navier-Stokes codes can be used for these flows to study trends and viscous effects as well as determine flow patterns; but further research and development is needed before they can be used as production tools for nozzle performance calculations. The GIM formulation, numerical scheme, and computer code are described. The actual SSME nozzle computation showing grid points, flow contours, and flow parameter plots is discussed. The computer system and run times/costs are detailed.
Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials
NASA Technical Reports Server (NTRS)
Luebbers, Raymond J.; Beggs, John H.
1991-01-01
Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. The extension of FDTD to more complicated materials was made. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far zone time domain results in two dimensions. Also the capability to model nonlinear materials was incorporated into FDTD and validated.
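The submitted codes themselves are not part of this abstract, so the following is only a minimal sketch of the non-dispersive 1-D FDTD leapfrog update at the "magic" (Courant number 1) time step in normalized units; the grid size, Gaussian source, and node indices are illustrative assumptions, not details of the submitted codes.

```python
import math

nz, nt = 200, 150                # grid cells, time steps (illustrative)
ez = [0.0] * nz                  # electric field nodes
hy = [0.0] * nz                  # magnetic field nodes

for t in range(nt):
    for k in range(nz - 1):      # H update from the spatial difference of E
        hy[k] += ez[k + 1] - ez[k]
    ez[0] = math.exp(-((t - 30) / 10.0) ** 2)  # hard Gaussian source at the left edge
    for k in range(1, nz):       # E update from the spatial difference of H
        ez[k] += hy[k] - hy[k - 1]

# At Courant number 1 the pulse advances exactly one cell per time step.
peak_cell = ez.index(max(ez))
print(peak_cell)
```

The dispersive and nonlinear material models described in the report replace the simple E update with one that convolves the field history against the material's frequency-dependent susceptibility.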
Multitasking the code ARC3D. [for computational fluid dynamics
NASA Technical Reports Server (NTRS)
Barton, John T.; Hsiung, Christopher C.
1986-01-01
The CRAY multitasking system was developed in order to utilize all four processors and sharply reduce the wall clock run time. This paper describes the techniques used to modify the computational fluid dynamics code ARC3D for this run and analyzes the achieved speedup. The ARC3D code solves either the Euler or thin-layer N-S equations using an implicit approximate factorization scheme. Results indicate that multitask processing can be used to achieve wall clock speedup factors of over three times, depending on the nature of the program code being used. Multitasking appears to be particularly advantageous for large-memory problems running on multiple CPU computers.
Addressing the challenges of standalone multi-core simulations in molecular dynamics
NASA Astrophysics Data System (ADS)
Ocaya, R. O.; Terblans, J. J.
2017-07-01
Computational modelling in material science involves mathematical abstractions of force fields between particles with the aim to postulate, develop and understand materials by simulation. The aggregated pairwise interactions of the material's particles lead to a deduction of its macroscopic behaviours. For practically meaningful macroscopic scales, a large amount of data is generated, leading to vast execution times. Simulation times of hours, days or weeks for moderately sized problems are not uncommon. The reduction of simulation times, improved result accuracy and the associated software and hardware engineering challenges are the main motivations for much of the ongoing research in the computational sciences. This contribution is concerned mainly with simulations that can be done on a "standalone" computer using Message Passing Interface (MPI) parallel code running on hardware platforms with wide specifications, such as single- or multi-processor, multi-core machines, with minimal reconfiguration for upward scaling of computational power. The widely available, documented and standardized MPI library provides this functionality through the MPI_Comm_size(), MPI_Comm_rank() and MPI_Reduce() functions. A survey of the literature shows that relatively little is written with respect to the efficient extraction of the inherent computational power in a cluster. In this work, we discuss the main avenues available to tap into this extra power without compromising computational accuracy. We also present methods to overcome the high inertia encountered in single-node-based computational molecular dynamics. We begin by surveying the current state of the art and discuss what it takes to achieve parallelism, efficiency and enhanced computational accuracy through program threads and message passing interfaces. Several code illustrations are given. The pros and cons of writing raw code as opposed to using heuristic, third-party code are also discussed.
The growing trend towards graphical processor units and virtual computing clouds for high-performance computing is also discussed. Finally, we present the comparative results of vacancy formation energy calculations using our own parallelized standalone code called Verlet-Stormer velocity (VSV) operating on 30,000 copper atoms. The code is based on the Sutton-Chen implementation of the Finnis-Sinclair pairwise embedded atom potential. A link to the code is also given.
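The VSV code itself is only linked, not listed, so the following sketches just the velocity-Verlet scheme it is named for, using a single harmonic "bond" as a stand-in for the Sutton-Chen embedded-atom forces; all names and numbers are illustrative:

```python
# Velocity-Verlet integration of one particle in a harmonic well,
# a stand-in for a pairwise potential; total energy should be conserved.
def force(x):
    return -x            # F = -dU/dx for U = x^2 / 2

def velocity_verlet(x, v, dt, steps):
    a = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position half of the update
        a_new = force(x)                  # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity from averaged accelerations
        a = a_new
    return x, v

x0, v0, dt = 1.0, 0.0, 0.01
x, v = velocity_verlet(x0, v0, dt, 5000)
e0 = 0.5 * v0 ** 2 + 0.5 * x0 ** 2
e = 0.5 * v ** 2 + 0.5 * x ** 2
print(abs(e - e0))       # energy drift stays tiny over 5000 steps
```

Because the inner force loop dominates the cost (pairwise sums over 30,000 atoms in the paper's case), it is the natural target for the MPI_Reduce-style parallel reduction the article discusses.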
Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing
2008-01-01
[Abstract garbled in extraction; recoverable fragments, in order:] "...complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC..." "...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called..." (AFRL-RI-RS-TR-2007-288, Final Technical Report, January 2008.)
Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D
NASA Technical Reports Server (NTRS)
Carle, Alan; Fagan, Mike; Green, Lawrence L.
1998-01-01
This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
NASA Technical Reports Server (NTRS)
Gliebe, P.; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition,
the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Garg, Vijay; Ameri, Ali
2005-01-01
The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.
Development of a 3-D upwind PNS code for chemically reacting hypersonic flowfields
NASA Technical Reports Server (NTRS)
Tannehill, J. C.; Wadawadigi, G.
1992-01-01
Two new parabolized Navier-Stokes (PNS) codes were developed to compute the three-dimensional, viscous, chemically reacting flow of air around hypersonic vehicles such as the National Aero-Space Plane (NASP). The first code (TONIC) solves the gas dynamic and species conservation equations in a fully coupled manner using an implicit, approximately-factored, central-difference algorithm. This code was upgraded to include shock fitting and the capability of computing the flow around complex body shapes. The revised TONIC code was validated by computing the chemically-reacting (M(sub infinity) = 25.3) flow around a 10 deg half-angle cone at various angles of attack and the Ames All-Body model at 0 deg angle of attack. The results of these calculations were in good agreement with the results from the UPS code. One of the major drawbacks of the TONIC code is that the central-differencing of fluxes across interior flowfield discontinuities tends to introduce errors into the solution in the form of local flow property oscillations. The second code (UPS), originally developed for a perfect gas, has been extended to permit either perfect gas, equilibrium air, or nonequilibrium air computations. The code solves the PNS equations using a finite-volume, upwind TVD method based on Roe's approximate Riemann solver that was modified to account for real gas effects. The dissipation term associated with this algorithm is sufficiently adaptive to flow conditions that, even when attempting to capture very strong shock waves, no additional smoothing is required. For nonequilibrium calculations, the code solves the fluid dynamic and species continuity equations in a loosely-coupled manner. This code was used to calculate the hypersonic, laminar flow of chemically reacting air over cones at various angles of attack. 
In addition, the flow around the McDonnell Douglas generic option blended-wing-body was computed and comparisons were made among the perfect gas, equilibrium air, and nonequilibrium air results.
Linear chirp phase perturbing approach for finding binary phased codes
NASA Astrophysics Data System (ADS)
Li, Bing C.
2017-05-01
Binary phased codes have many applications in communication and radar systems. These applications require binary phased codes with low sidelobes in order to reduce interference and false detection. Barker codes satisfy these requirements and have the lowest maximum sidelobes; however, Barker codes have very limited code lengths (13 or less), while many applications, including low probability of intercept radar and spread spectrum communication, require much greater code lengths. The conventional techniques for finding binary phased codes in the literature include exhaustive search, neural networks, and evolutionary methods, all of which require very expensive computation for large code lengths. These techniques are therefore limited to finding binary phased codes with small code lengths (less than 100). In this paper, by analyzing Barker code, linear chirp, and P3 phases, we propose a new approach to finding binary codes. Experiments show that the proposed method is able to find long low-sidelobe binary phased codes (code length >500) with reasonable computational cost.
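The low-sidelobe property the abstract relies on is easy to check directly: the aperiodic autocorrelation of the length-13 Barker code has a mainlobe of 13 and a peak sidelobe magnitude of 1. The sketch below is illustrative only; it does not reproduce the paper's search method.

```python
def aperiodic_autocorr(code):
    """Aperiodic autocorrelation of a +/-1 sequence for all non-negative lags."""
    n = len(code)
    return [sum(code[i] * code[i + lag] for i in range(n - lag))
            for lag in range(n)]

# Length-13 Barker code: mainlobe 13, all sidelobes of magnitude <= 1.
barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
acf = aperiodic_autocorr(barker13)
mainlobe = acf[0]
peak_sidelobe = max(abs(v) for v in acf[1:])
print(mainlobe, peak_sidelobe)  # 13 1
```

The ratio of mainlobe to peak sidelobe (13:1) is what makes Barker-13 attractive for pulse compression, and what long searched-for codes try to approach at greater lengths.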
Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Bartels, Robert E.
2002-01-01
A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
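The ROM idea can be shown in miniature: excite a system with a unit impulse, record the response, and then predict the response to any other input by convolution with that impulse response. The toy scalar system below is an illustrative assumption, not the CFL3D/SIMULINK workflow itself; the exact agreement with direct simulation mirrors the paper's ROM-vs-CFD comparison for a linear system.

```python
def simulate(a, b, inputs):
    """Direct time-marching of x[k+1] = a*x[k] + b*u[k], y[k] = x[k], x[0] = 0."""
    x, ys = 0.0, []
    for u in inputs:
        ys.append(x)
        x = a * x + b * u
    return ys

def convolve(h, u):
    """Discrete convolution y[k] = sum_j h[j] * u[k-j]."""
    return [sum(h[j] * u[k - j] for j in range(k + 1)) for k in range(len(u))]

a, b, n = 0.9, 0.5, 40
impulse = [1.0] + [0.0] * (n - 1)
h = simulate(a, b, impulse)              # identified impulse response
u = [0.3, -1.0, 0.7] + [0.0] * (n - 3)   # arbitrary test input
rom = convolve(h, u)                     # ROM prediction by convolution
direct = simulate(a, b, u)               # "full" direct simulation
print(max(abs(r - d) for r, d in zip(rom, direct)))  # ~0 (linear system)
```

The paper goes one step further and converts the impulse responses into state-space form, which SIMULINK can then couple to a structural model.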
NASA Technical Reports Server (NTRS)
Baumeister, Joseph F.
1994-01-01
A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.
Manual for obscuration code with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Takacs, L.
1986-01-01
The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multi-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Johnson, J.D.; Blond, R.M.
The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.
Progressive fracture of fiber composites
NASA Technical Reports Server (NTRS)
Irvin, T. B.; Ginty, C. A.
1983-01-01
Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.
Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1994-01-01
An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.
Design geometry and design/off-design performance computer codes for compressors and turbines
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1995-01-01
This report summarizes some NASA Lewis (i.e., government owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses. Losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.
PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations
NASA Astrophysics Data System (ADS)
Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.
2017-12-01
Peta-op SupErcomputing Unconventional System (PerSEUS) aims to explore the use for High Performance Scientific Computing (HPC) of ultra-low-power mixed signal unconventional computational elements developed by Johns Hopkins University (JHU), and demonstrate that capability on both fluid and particle Plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE), and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code, and a UCLA general purpose relativistic Particle-In-Cell (PIC) code.
Multiple grid problems on concurrent-processing computers
NASA Technical Reports Server (NTRS)
Eberhardt, D. S.; Baganoff, D.
1986-01-01
Three computer codes were studied which make use of concurrent processing computer architectures in computational fluid dynamics (CFD). The three parallel codes were tested on a two processor multiple-instruction/multiple-data (MIMD) facility at NASA Ames Research Center, and are suggested for efficient parallel computations. The first code is a well-known program which makes use of the Beam and Warming, implicit, approximate factored algorithm. This study demonstrates the parallelism found in a well-known scheme and it achieved speedups exceeding 1.9 on the two processor MIMD test facility. The second code studied made use of an embedded grid scheme which is used to solve problems having complex geometries. The particular application for this study considered an airfoil/flap geometry in an incompressible flow. The scheme eliminates some of the inherent difficulties found in adapting approximate factorization techniques onto MIMD machines and allows the use of chaotic relaxation and asynchronous iteration techniques. The third code studied is an application of overset grids to a supersonic blunt body problem. The code addresses the difficulties encountered when using embedded grids on a compressible, and therefore nonlinear, problem. The complex numerical boundary system associated with overset grids is discussed and several boundary schemes are suggested. A boundary scheme based on the method of characteristics achieved the best results.
Binary weight distributions of some Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Pollara, F.; Arnold, S.
1992-01-01
The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
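The MacWilliams transform used in the paper maps the weight distribution of a code directly to that of its dual, without enumerating the dual. A minimal binary illustration, using the (7,4) Hamming code instead of the RS codes of the paper (the choice of code and generator matrix here is an assumption for the sketch):

```python
from itertools import product
from math import comb

def weight_distribution(gen_matrix, n):
    """Enumerate all codewords spanned by gen_matrix over GF(2), counted by weight."""
    k = len(gen_matrix)
    dist = [0] * (n + 1)
    for msg in product([0, 1], repeat=k):
        word = [sum(m * g for m, g in zip(msg, col)) % 2
                for col in zip(*gen_matrix)]
        dist[sum(word)] += 1
    return dist

def macwilliams(dist, n, k):
    """Dual weight distribution B_j = 2^-k * sum_i A_i * K_j(i) (binary Krawtchouk)."""
    def kraw(j, i):
        return sum((-1) ** l * comb(i, l) * comb(n - i, j - l) for l in range(j + 1))
    return [sum(dist[i] * kraw(j, i) for i in range(n + 1)) // 2 ** k
            for j in range(n + 1)]

G = [[1, 0, 0, 0, 1, 1, 0],   # a generator matrix of the (7,4) Hamming code
     [0, 1, 0, 0, 0, 1, 1],
     [0, 0, 1, 0, 1, 1, 1],
     [0, 0, 0, 1, 1, 0, 1]]
A = weight_distribution(G, 7)
B = macwilliams(A, 7, 4)
print(A)  # [1, 0, 0, 7, 7, 0, 0, 1]
print(B)  # [1, 0, 0, 0, 7, 0, 0, 0]  (the (7,3) simplex dual: all nonzero words weight 4)
```

For the RS codes of the paper the same transform is applied to the binary image of the code under each symbol-to-bit mapping, which is what makes comparing mappings tractable.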
NASA Technical Reports Server (NTRS)
Mcgaw, Michael A.; Saltsman, James F.
1993-01-01
A recently developed high-temperature fatigue life prediction computer code is presented and an example of its usage given. The code discussed is based on the Total Strain version of Strainrange Partitioning (TS-SRP). Included in this code are procedures for characterizing the creep-fatigue durability behavior of an alloy according to TS-SRP guidelines and predicting cyclic life for complex cycle types for both isothermal and thermomechanical conditions. A reasonably extensive materials properties database is included with the code.
Turbomachinery Heat Transfer and Loss Modeling for 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Ameri, Ali
2005-01-01
This report focuses on the use of NASA Glenn on-site computational facilities to develop, validate, and apply models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes, enhancing the capability to compute heat transfer and losses in turbomachinery.
Real-time computer treatment of THz passive device images with the high image quality
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2012-06-01
We demonstrate a real-time computer code that significantly improves the quality of images captured by passive THz imaging systems. The code is not restricted to a particular passive THz device: it can be applied to any such device, as well as to active THz imaging systems. We applied the code to computer processing of images captured by four passive THz imaging devices manufactured by different companies; processing images produced by different devices usually requires different spatial filters. The performance of the current version of the code exceeds one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction in image quality, and its performance can be increased many times by using parallel image-processing algorithms. We developed original spatial filters that allow one to see objects with sizes less than 2 cm. The imagery is produced by passive THz imaging devices that captured images of objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise after computer processing and yields a good quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate, and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are very promising for security applications.
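The abstract does not disclose its proprietary spatial filters, but the general noise-suppression idea can be sketched with a standard stand-in: a 3x3 median filter removes isolated "salt" noise pixels while preserving smooth structure (the filter choice and toy image here are assumptions for illustration).

```python
def median_filter_3x3(img):
    """Apply a 3x3 median filter to a 2-D grayscale image (edges left unfiltered)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = sorted(img[r + dr][c + dc]
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1))
            out[r][c] = window[4]  # median of the 9 neighborhood values
    return out

# Flat background with one isolated noise spike: the filter removes it.
noisy = [[10] * 5 for _ in range(5)]
noisy[2][2] = 255
clean = median_filter_3x3(noisy)
print(clean[2][2])  # 10
```

Running several such filters over one frame, as the paper does with its 20 filter variants, is embarrassingly parallel, which is why the authors note the performance scales with parallel processing.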
Fingerprinting Communication and Computation on HPC Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean
2010-06-02
How do we identify what is actually running on high-performance computing systems? Names of binaries, dynamic libraries loaded, or other elements in a submission to a batch queue can give clues, but binary names can be changed, and libraries provide limited insight and resolution on the code being run. In this paper, we present a method for "fingerprinting" code running on HPC machines using elements of communication and computation. We then discuss how that fingerprint can be used to determine whether the code is consistent with certain other types of codes, with what a user usually runs, or with what the user requested an allocation to do. In some cases, our techniques enable us to fingerprint HPC codes using runtime MPI data with a high degree of accuracy.
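One hedged reading of the fingerprinting idea: summarize a run as a vector of per-category MPI call counts and compare it against stored signatures by cosine similarity. The feature set, signature values, and matching rule below are illustrative assumptions, not the paper's actual method.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Feature order (hypothetical): counts of MPI_Send, MPI_Allreduce, MPI_Bcast, MPI_Alltoall
signatures = {
    "stencil_solver": [9000, 50, 10, 0],     # halo-exchange dominated
    "spectral_code":  [100, 40, 20, 5000],   # all-to-all dominated
}
observed = [8500, 60, 12, 3]  # counts gathered from a monitored run

best = max(signatures, key=lambda name: cosine(observed, signatures[name]))
print(best)  # stencil_solver
```

Comparing the best match against what the user's allocation request claimed is then a policy question rather than a technical one.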
Practices in source code sharing in astrophysics
NASA Astrophysics Data System (ADS)
Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly
2013-02-01
While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.
Development of V/STOL methodology based on a higher order panel method
NASA Technical Reports Server (NTRS)
Bhateley, I. C.; Howell, G. A.; Mann, H. W.
1983-01-01
The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential flow code. The PAN AIR aerodynamic code, which is a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive support for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, requiring no changes to the basic code and permitting easy replacement of updated modules.
Lattice surgery on the Raussendorf lattice
NASA Astrophysics Data System (ADS)
Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco
2018-07-01
Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that eases physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable over braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.
40 CFR 1033.110 - Emission diagnostics-general requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...
Airfoil Vibration Dampers program
NASA Technical Reports Server (NTRS)
Cook, Robert M.
1991-01-01
The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.
Computer optimization of reactor-thermoelectric space power systems
NASA Technical Reports Server (NTRS)
Maag, W. L.; Finnegan, P. M.; Fishbach, L. H.
1973-01-01
A computer simulation and optimization code that has been developed for nuclear space power systems is described. The results of using this code to analyze two reactor-thermoelectric systems are presented.
A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.
1994-01-01
Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven year effort was established in 1990 by NASA's Office of Aeronautics Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.
Ascent Aerodynamic Pressure Distributions on WB001
NASA Technical Reports Server (NTRS)
Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.
1996-01-01
To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamics (CFD) codes at several flow conditions. Comparisons were made code to code, code to the aerodynamic database, and against available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual designs. These CFD results have also been provided to the structures group for wing loading analysis.
NASA Technical Reports Server (NTRS)
Kumar, A.; Graves, R. A., Jr.; Weilmuenster, K. J.
1980-01-01
A vectorized code, EQUIL, was developed for calculating the equilibrium chemistry of a reacting gas mixture on the Control Data STAR-100 computer. The code provides species mole fractions, mass fractions, and thermodynamic and transport properties of the mixture for given temperature, pressure, and elemental mass fractions. The code is set up for the electron, H, He, C, O, N system of elements. In all, 24 chemical species are included.
Computer code for charge-exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Kaufman, H. R.
1981-01-01
The propagation of the charge-exchange plasma from an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, are described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
Self-Scheduling Parallel Methods for Multiple Serial Codes with Application to WOPWOP
NASA Technical Reports Server (NTRS)
Long, Lyle N.; Brentner, Kenneth S.
2000-01-01
This paper presents a scheme for efficiently running a large number of serial jobs on parallel computers. Two examples are given of computer programs that run relatively quickly, but often they must be run numerous times to obtain all the results needed. It is very common in science and engineering to have codes that are not massive computing challenges in themselves, but due to the number of instances that must be run, they do become large-scale computing problems. The two examples given here represent common problems in aerospace engineering: aerodynamic panel methods and aeroacoustic integral methods. The first example simply solves many systems of linear equations. This is representative of an aerodynamic panel code where someone would like to solve for numerous angles of attack. The complete code for this first example is included in the appendix so that it can be readily used by others as a template. The second example is an aeroacoustics code (WOPWOP) that solves the Ffowcs Williams Hawkings equation to predict the far-field sound due to rotating blades. In this example, one quite often needs to compute the sound at numerous observer locations, hence parallelization is utilized to automate the noise computation for a large number of observers.
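The self-scheduling pattern described above can be sketched in a few lines, with threads standing in for the paper's parallel processes (an assumption for this sketch): workers pull the next case from a shared queue as soon as they finish one, so fast and slow cases balance automatically. Each "case" here is a small linear solve, echoing the panel-code example of many angles of attack.

```python
import threading
import queue

def solve_2x2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] [x, y]^T = [e, f]^T by Cramer's rule (one small 'case')."""
    det = a * d - b * c
    return ((e * d - b * f) / det, (a * f - e * c) / det)

tasks = queue.Queue()
for angle in range(100):                     # e.g. 100 angles of attack
    tasks.put((angle, (2.0, 1.0, 1.0, 3.0, 1.0 + angle, 2.0)))

results = {}
lock = threading.Lock()

def worker():
    while True:
        try:
            angle, system = tasks.get_nowait()   # self-scheduling: grab the next case
        except queue.Empty:
            return
        sol = solve_2x2(*system)
        with lock:
            results[angle] = sol

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # 100
```

No worker is assigned a fixed share of the 100 cases up front; that is the essential difference from static scheduling, and it is what keeps all processors busy when case run times vary.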
Computer Code for Transportation Network Design and Analysis
DOT National Transportation Integrated Search
1977-01-01
This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...
Current and anticipated uses of the thermal hydraulics codes at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caruso, R.
1997-07-01
The focus of thermal-hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from thermal-hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.
Analyzing Pulse-Code Modulation On A Small Computer
NASA Technical Reports Server (NTRS)
Massey, David E.
1988-01-01
System for analysis of pulse-code modulation (PCM) comprises personal computer, computer program, and peripheral interface adapter on circuit board that plugs into expansion bus of computer. Functions essentially as "snapshot" PCM decommutator, which accepts and stores thousands of frames of PCM data, then sifts through them repeatedly to process them according to routines specified by operator. Enables faster testing and involves less equipment than older testing systems.
A fast technique for computing syndromes of BCH and RS codes. [deep space network
NASA Technical Reports Server (NTRS)
Reed, I. S.; Truong, T. K.; Miller, R. L.
1979-01-01
A combination of the Chinese Remainder Theorem and Winograd's algorithm is used to compute transforms of odd length over GF(2 to the m power). Such transforms are used to compute the syndromes needed for decoding BCH and RS codes. The present scheme requires substantially fewer multiplications and additions than the conventional method of computing the syndromes directly.
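The syndromes themselves, computed the "conventional" direct way that the paper's transform method improves upon, are just evaluations of the received polynomial at powers of a primitive element alpha. A sketch over GF(16), assuming the primitive polynomial x^4 + x + 1 (the field, error value, and position below are illustrative choices):

```python
# Build exp/log tables for GF(16) with primitive polynomial x^4 + x + 1.
EXP, LOG = [0] * 30, [0] * 16
x = 1
for i in range(15):
    EXP[i] = EXP[i + 15] = x
    LOG[x] = i
    x <<= 1
    if x & 0b10000:
        x ^= 0b10011          # reduce modulo x^4 + x + 1

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def syndromes(received, num):
    """S_j = r(alpha^j) for j = 1..num, evaluated by Horner's rule."""
    out = []
    for j in range(1, num + 1):
        alpha_j = EXP[j % 15]
        s = 0
        for coeff in reversed(received):
            s = gf_mul(s, alpha_j) ^ coeff
        out.append(s)
    return out

r = [0] * 15                   # the zero codeword...
assert syndromes(r, 4) == [0, 0, 0, 0]
r[6] ^= 5                      # ...hit by a single error of value 5 at position 6
S = syndromes(r, 4)
# For a single error, S2/S1 = alpha^position, which recovers the error location.
position = (LOG[S[1]] - LOG[S[0]]) % 15
print(position)  # 6
```

Each syndrome costs on the order of n field multiplications this way; the CRT/Winograd scheme of the paper reorganizes these evaluations as short transforms to cut that count substantially.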
Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs
NASA Technical Reports Server (NTRS)
Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.
1977-01-01
The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.
Numerical computation of space shuttle orbiter flow field
NASA Technical Reports Server (NTRS)
Tannehill, John C.
1988-01-01
A new parabolized Navier-Stokes (PNS) code has been developed to compute the hypersonic, viscous chemically reacting flow fields around 3-D bodies. The flow medium is assumed to be a multicomponent mixture of thermally perfect but calorically imperfect gases. The new PNS code solves the gas dynamic and species conservation equations in a coupled manner using a noniterative, implicit, approximately factored, finite difference algorithm. The space-marching method is made well-posed by special treatment of the streamwise pressure gradient term. The code has been used to compute hypersonic laminar flow of chemically reacting air over cones at angle of attack. The results of the computations are compared with the results of reacting boundary-layer computations and show excellent agreement.
NASA Technical Reports Server (NTRS)
Warren, Gary
1988-01-01
The SOS code is used to compute the resonance modes (frequency-domain information) of sample devices and, separately, to compute the transient behavior of the same devices. A code, DOT, was created to compute appropriate dot products of the time-domain and frequency-domain results. The transient behavior of individual modes in the device is then plotted. Modes in a coupled-cavity traveling-wave tube (CCTWT) section excited by a beam are analyzed in separate simulations. Mode energy vs. time and mode phase vs. time are computed, and it is determined whether the transient waves are forward or backward waves in each case. Finally, the hot-test mode frequencies of the CCTWT section are computed.
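The DOT step can be illustrated with a toy version: given orthonormal mode shapes from a frequency-domain solve, the dot product of each mode with a time-domain snapshot yields that mode's instantaneous amplitude. The mode shapes below (rows of a DCT basis) and the snapshot are assumptions for the sketch; SOS and DOT themselves are not reproduced.

```python
from math import sqrt, cos, pi

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

n = 8
# Two orthonormal "mode shapes" (rows 1 and 2 of the DCT-II basis on n points).
mode1 = [sqrt(2.0 / n) * cos(pi * 1 * (i + 0.5) / n) for i in range(n)]
mode2 = [sqrt(2.0 / n) * cos(pi * 2 * (i + 0.5) / n) for i in range(n)]

# A time-domain snapshot constructed to contain 0.7 of mode1 and -0.2 of mode2:
snapshot = [0.7 * a - 0.2 * b for a, b in zip(mode1, mode2)]

# Dot products against the mode shapes recover each mode's instantaneous amplitude.
amp1, amp2 = dot(snapshot, mode1), dot(snapshot, mode2)
print(round(amp1, 6), round(amp2, 6))  # 0.7 -0.2
```

Repeating the projection for every time step gives the mode-amplitude and mode-phase histories that the abstract describes plotting.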
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, D.G.; Watkins, J.C.
This report documents an evaluation of the TRAC-PF1/MOD1 reactor safety analysis computer code during computer simulations of feedwater line break transients. The experimental data base for the evaluation included the results of three bottom feedwater line break tests performed in the Semiscale Mod-2C test facility. The tests modeled 14.3% (S-FS-7), 50% (S-FS-11), and 100% (S-FS-6B) breaks. The test facility and the TRAC-PF1/MOD1 model used in the calculations are described. Evaluations of the accuracy of the calculations are presented in the form of comparisons of measured and calculated histories of selected parameters associated with the primary and secondary systems. In addition to evaluating the accuracy of the code calculations, the computational performance of the code during the simulations was assessed. A conclusion was reached that the code is capable of making feedwater line break transient calculations efficiently, but there is room for significant improvements in the simulations that were performed. Recommendations are made for follow-on investigations to determine how to improve future feedwater line break calculations and for code improvements to make the code easier to use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji
A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran77, Fortran 90 or C. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
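The checkpoint/restart facility described above can be sketched in a few lines. This is a minimal single-process illustration of the idea, not the framework's actual C++ API: the function name, the pickle file format, and the per-event tally are all assumptions made for the example.

```python
import os
import pickle
import tempfile

def run_with_checkpoint(n_events, state_file, process_event):
    """Accumulate a tally over n_events, checkpointing after each event.

    If state_file exists, resume from the saved event index and tally
    instead of starting over (analogous to the framework's restart mode).
    """
    start, tally = 0, 0.0
    if os.path.exists(state_file):
        with open(state_file, "rb") as f:
            start, tally = pickle.load(f)
    for i in range(start, n_events):
        tally += process_event(i)
        # Write atomically so a crash never leaves a corrupt checkpoint.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(state_file) or ".")
        with os.fdopen(fd, "wb") as f:
            pickle.dump((i + 1, tally), f)
        os.replace(tmp, state_file)
    return tally
```

A killed job rerun with the same `state_file` picks up at the saved event index, which is the behavior the abstract attributes to the checkpoint facility in both single-process and parallel regimes.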
Prediction of sound radiated from different practical jet engine inlets
NASA Technical Reports Server (NTRS)
Zinn, B. T.; Meyer, W. L.
1980-01-01
Existing computer codes for calculating the far field radiation patterns surrounding various practical jet engine inlet configurations under different excitation conditions were upgraded. The computer codes were refined and expanded so that they are now more computationally efficient by a factor of about three, and they are now capable of producing accurate results up to nondimensional wave numbers of twenty. Computer programs were also developed to help generate accurate geometrical representations of the inlets to be investigated. These data are required as input for the computer programs which calculate the sound fields. This new geometry-generating computer program considerably reduces the time required to generate the input data, which was one of the most time-consuming steps in the process. The results of sample runs using the NASA-Lewis QCSEE inlet are presented, and comparisons of run times and accuracy are made between the old and upgraded computer codes. The overall accuracy of the computations is determined by comparison of the results of the computations with simple source solutions.
Dynamic Decision Making under Uncertainty and Partial Information
2013-11-14
integral under the natural filtration generated by the Brownian motions. This compact expression potentially enables us to design sub-optimal penalties... bounds on Bermudan option price under jump diffusion processes. Quantitative Finance, 2013. Under review, available at http://arxiv.org/abs/1305.4321... Finance, 19:53-71, 2009. [3] D.P. Bertsekas. Dynamic Programming and Optimal Control. Athena Scientific, 4th edition, 2012. [4] D.B. Brown and J.E
Assessing Consequential Scenarios in a Complex Operational Environment Using Agent Based Simulation
2017-03-16
RWISE) 93 5.1.5 Conflict Modeling, Planning, and Outcomes Experimentation Program (COMPOEX) 94 5.1.6 Joint Non-Kinetic Effects Model (JNEM)/Athena... experimental design and testing. 4.3.8 Types and Attributes of Agent-Based Model Design Patterns Using the aforementioned ABM flowchart design methodology... speed, or flexibility during tactical US Army wargaming. The report considers methodologies to improve analysis of the human domain, identifies
mirPub: a database for searching microRNA publications.
Vergoulis, Thanasis; Kanellos, Ilias; Kostoulas, Nikos; Georgakilas, Georgios; Sellis, Timos; Hatzigeorgiou, Artemis; Dalamagas, Theodore
2015-05-01
Identifying, amongst millions of publications available in MEDLINE, those that are relevant to specific microRNAs (miRNAs) of interest based on keyword search faces major obstacles. References to miRNA names in the literature often deviate from standard nomenclature for various reasons, since even the official nomenclature evolves. For instance, a single miRNA name may identify two completely different molecules or two different names may refer to the same molecule. mirPub is a database with a powerful and intuitive interface, which facilitates searching for miRNA literature, addressing the aforementioned issues. To provide effective search services, mirPub applies text mining techniques on MEDLINE, integrates data from several curated databases and exploits data from its user community following a crowdsourcing approach. Other key features include an interactive visualization service that illustrates intuitively the evolution of miRNA data, tag clouds summarizing the relevance of publications to particular diseases, cell types or tissues and access to TarBase 6.0 data to oversee genes related to miRNA publications. mirPub is freely available at http://www.microrna.gr/mirpub/. vergoulis@imis.athena-innovation.gr or dalamag@imis.athena-innovation.gr Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
Error Suppression for Hamiltonian-Based Quantum Computation Using Subsystem Codes
NASA Astrophysics Data System (ADS)
Marvian, Milad; Lidar, Daniel A.
2017-01-01
We present general conditions for quantum error suppression for Hamiltonian-based quantum computation using subsystem codes. This involves encoding the Hamiltonian performing the computation using an error detecting subsystem code and the addition of a penalty term that commutes with the encoded Hamiltonian. The scheme is general and includes the stabilizer formalism of both subspace and subsystem codes as special cases. We derive performance bounds and show that complete error suppression results in the large penalty limit. To illustrate the power of subsystem-based error suppression, we introduce fully two-local constructions for protection against local errors of the swap gate of adiabatic gate teleportation and the Ising chain in a transverse field.
Error Suppression for Hamiltonian-Based Quantum Computation Using Subsystem Codes.
Marvian, Milad; Lidar, Daniel A
2017-01-20
We present general conditions for quantum error suppression for Hamiltonian-based quantum computation using subsystem codes. This involves encoding the Hamiltonian performing the computation using an error detecting subsystem code and the addition of a penalty term that commutes with the encoded Hamiltonian. The scheme is general and includes the stabilizer formalism of both subspace and subsystem codes as special cases. We derive performance bounds and show that complete error suppression results in the large penalty limit. To illustrate the power of subsystem-based error suppression, we introduce fully two-local constructions for protection against local errors of the swap gate of adiabatic gate teleportation and the Ising chain in a transverse field.
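The penalty mechanism described in the abstract can be illustrated numerically on the smallest possible example. The construction below is our own toy instance, not the paper's: the code space is the span of |00> and |11> stabilized by ZZ, the "encoded Hamiltonian" is built from XX and Z1 (both of which commute with ZZ), and the penalty term charges an energy E_p for leaving the code space, suppressing the leakage caused by a local error term.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Stabilizer ZZ defines the code space span{|00>, |11>}.
ZZ = kron(Z, Z)
# Encoded Hamiltonian built from operators commuting with ZZ.
H_enc = 0.7 * kron(X, X) + 0.3 * kron(Z, I2)
# Penalty projector onto the error space (eigenvalue -1 sector of ZZ).
P = 0.5 * (np.eye(4) - ZZ)

def leakage(E_p, eps):
    """Ground-state weight outside the code space for penalty strength E_p.

    The local error eps * X on qubit 1 anticommutes with ZZ, so it maps
    code states into the penalized sector; a large E_p suppresses it.
    """
    H = H_enc + E_p * P + eps * kron(X, I2)
    vals, vecs = np.linalg.eigh(H)
    ground = vecs[:, 0]
    return float(ground @ P @ ground)
```

Increasing the penalty strength drives the leakage toward zero at roughly the second-order rate (eps/E_p)^2, which is the qualitative content of the "complete error suppression in the large penalty limit" result.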
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA Input into an XML file that is used as input to different VERA codes.
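The single-common-input-to-XML step that the abstract attributes to VERAIn can be sketched generically. The actual VERA input format and XML schema are not described in the abstract, so the tag and attribute names below are hypothetical placeholders; the point is only the mechanical shape of such a parser stage.

```python
import xml.etree.ElementTree as ET

def to_xml(params):
    """Convert a flat dict of input parameters into an XML string.

    Stand-in for a parser stage that turns one common input file into
    XML consumed by several downstream physics codes. The element names
    ("ParameterList", "Parameter") are illustrative, not VERA's schema.
    """
    root = ET.Element("ParameterList")
    for name, value in params.items():
        ET.SubElement(root, "Parameter", name=name, value=str(value))
    return ET.tostring(root, encoding="unicode")
```

Keeping the parser as a separate front end, as the abstract describes, lets each physics code read the same machine-friendly XML rather than reimplementing the human-oriented input syntax.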
Navier-Stokes and Comprehensive Analysis Performance Predictions of the NREL Phase VI Experiment
NASA Technical Reports Server (NTRS)
Duque, Earl P. N.; Burklund, Michael D.; Johnson, Wayne
2003-01-01
A vortex lattice code, CAMRAD II, and a Reynolds-Averaged Navier-Stokes code, OVERFLOW-D2, were used to predict the aerodynamic performance of a two-bladed horizontal axis wind turbine. All computations were compared with experimental data collected at the NASA Ames Research Center 80- by 120-Foot Wind Tunnel. Computations were performed for both axial as well as yawed operating conditions. Various stall delay models and dynamic stall models were used by the CAMRAD II code. Comparisons between the experimental data and computed aerodynamic loads show that the OVERFLOW-D2 code can accurately predict the power and spanwise loading of a wind turbine rotor.
Fault-tolerance in Two-dimensional Topological Systems
NASA Astrophysics Data System (ADS)
Anderson, Jonas T.
This thesis is a collection of ideas with the general goal of building, at least in the abstract, a local fault-tolerant quantum computer. The connection between quantum information and topology has proven to be an active area of research in several fields. The introduction of the toric code by Alexei Kitaev demonstrated the usefulness of topology for quantum memory and quantum computation. Many quantum codes used for quantum memory are modeled by spin systems on a lattice, with operators that extract syndrome information placed on vertices or faces of the lattice. It is natural to wonder whether the useful codes in such systems can be classified. This thesis presents work that leverages ideas from topology and graph theory to explore the space of such codes. Homological stabilizer codes are introduced and it is shown that, under a set of reasonable assumptions, any qubit homological stabilizer code is equivalent to either a toric code or a color code. Additionally, the toric code and the color code correspond to distinct classes of graphs. Many systems have been proposed as candidate quantum computers. It is very desirable to design quantum computing architectures with two-dimensional layouts and low complexity in parity-checking circuitry. Kitaev's surface codes provided the first example of codes satisfying this property. They provided a new route to fault tolerance with more modest overheads and thresholds approaching 1%. The recently discovered color codes share many properties with the surface codes, such as the ability to perform syndrome extraction locally in two dimensions. Some families of color codes admit a transversal implementation of the entire Clifford group. This work investigates color codes on the 4.8.8 lattice known as triangular codes. I develop a fault-tolerant error-correction strategy for these codes in which repeated syndrome measurements on this lattice generate a three-dimensional space-time combinatorial structure. 
I then develop an integer program that analyzes this structure and determines the most likely set of errors consistent with the observed syndrome values. I implement this integer program to find the threshold for depolarizing noise on small versions of these triangular codes. Because the threshold for magic-state distillation is likely to be higher than this value and because logical
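The decoding step described above, finding the most likely (minimum-weight) set of errors consistent with an observed syndrome, can be illustrated on a much smaller code than the triangular color codes of the thesis. The sketch below uses a 5-bit repetition code and replaces the integer program with an exhaustive search over error patterns; for codes this small the two are equivalent, since both minimize error weight subject to the GF(2) syndrome constraints.

```python
from itertools import product
import numpy as np

# Parity-check matrix of the 5-bit repetition code: each row checks
# that two neighbouring bits agree.
H = np.array([
    [1, 1, 0, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 1],
])

def decode(syndrome):
    """Return a minimum-weight error pattern matching the syndrome.

    Brute-force stand-in for the integer program: minimize the number
    of flipped bits subject to the parity (syndrome) constraints.
    """
    best = None
    for e in product([0, 1], repeat=H.shape[1]):
        if np.array_equal(H @ np.array(e) % 2, syndrome):
            if best is None or sum(e) < sum(best):
                best = e
    return np.array(best)
```

For the fault-tolerant setting in the thesis the same objective is posed over a three-dimensional space-time structure of repeated measurements, which is why an integer-programming formulation (rather than exhaustive search) is needed there.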
System, methods and apparatus for program optimization for multi-threaded processor architectures
Bastoul, Cedric; Lethin, Richard A; Leung, Allen K; Meister, Benoit J; Szilagyi, Peter; Vasilache, Nicolas T; Wohlford, David E
2015-01-06
Methods, apparatus and computer software product for source code optimization are provided. In an exemplary embodiment, a first custom computing apparatus is used to optimize the execution of source code on a second computing apparatus. In this embodiment, the first custom computing apparatus contains a memory, a storage medium and at least one processor with at least one multi-stage execution unit. The second computing apparatus contains at least two multi-stage execution units that allow for parallel execution of tasks. The first custom computing apparatus optimizes the code for parallelism, locality of operations and contiguity of memory accesses on the second computing apparatus. This Abstract is provided for the sole purpose of complying with the Abstract requirement rules. This Abstract is submitted with the explicit understanding that it will not be used to interpret or to limit the scope or the meaning of the claims.
NASA Technical Reports Server (NTRS)
Liu, D. D.; Kao, Y. F.; Fung, K. Y.
1989-01-01
A transonic equivalent strip (TES) method was further developed for unsteady flow computations of arbitrary wing planforms. The TES method consists of two consecutive correction steps to a given nonlinear code such as LTRAN2; namely, the chordwise mean flow correction and the spanwise phase correction. The computation procedure requires direct pressure input from other computed or measured data. Otherwise, it does not require airfoil shape or grid generation for given planforms. To validate the computed results, four swept wings of various aspect ratios, including those with control surfaces, are selected as computational examples. Overall trends in unsteady pressures are established with those obtained by XTRAN3S codes, Isogai's full potential code and measured data by NLR and RAE. In comparison with these methods, the TES has achieved considerable saving in computer time and reasonable accuracy which suggests immediate industrial applications.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
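The formula stated twice in the abstract reduces to a three-term sum; a minimal sketch follows, with illustrative numbers (the abstract does not specify units or example values).

```python
def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
    """Future facility conditions per the abstract: the sum of the time-period-
    specific maintenance cost, modernization factor, and backlog factor."""
    return maintenance_cost + modernization_factor + backlog_factor
```

For example, a period with a maintenance cost of 100, a modernization factor of 25, and a backlog factor of 10 yields a future-conditions value of 135.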
Development Of A Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Yoon, Seokkwan; Kwak, Dochan
1993-01-01
Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.
A Flexible and Non-instrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
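MC/DC, the metric the abstract singles out as expensive to instrument, can be checked after the fact from recorded condition outcomes, which is the spirit of tracking coverage without modifying the program. The checker below is a simplified illustration of the criterion itself (each condition must be shown to independently flip the decision), not the paper's partial-evaluation technique; the function name and data layout are assumptions.

```python
from itertools import combinations

def mcdc_covered(condition_vectors, decision):
    """Return the set of condition indices satisfying MC/DC.

    condition_vectors: list of tuples of condition truth values, one per
    recorded test execution. A condition is covered if some pair of runs
    differs only in that condition and flips the decision outcome.
    """
    n = len(condition_vectors[0])
    covered = set()
    for a, b in combinations(condition_vectors, 2):
        diff = [i for i in range(n) if a[i] != b[i]]
        if len(diff) == 1 and decision(*a) != decision(*b):
            covered.add(diff[0])
    return covered
```

For the decision `c1 and c2`, the three runs (T,T), (T,F), (F,T) cover both conditions, which is the classic minimal MC/DC test set for a two-input conjunction.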
"SMART": A Compact and Handy FORTRAN Code for the Physics of Stellar Atmospheres
NASA Astrophysics Data System (ADS)
Sapar, A.; Poolamäe, R.
2003-01-01
A new computer code SMART (Spectra from Model Atmospheres by Radiative Transfer) for computing the stellar spectra forming in plane-parallel atmospheres has been compiled by us and A. Aret. To guarantee wide compatibility of the code with the shell environment, we chose FORTRAN-77 as the programming language and tried to confine ourselves to the common part of its numerous versions under both WINDOWS and LINUX. SMART can be used for studies of several processes in stellar atmospheres. The current version of the programme is undergoing rapid changes due to our goal to elaborate a simple, handy and compact code. Instead of linearisation (a mathematical method of recurrent approximations), we propose to use physical evolutionary changes, in other words the relaxation of quantum state populations from LTE to NLTE, which has been studied using a small number of NLTE states. This computational scheme is essentially simpler and more compact than linearisation. The relaxation scheme makes it possible to replace the Λ-iteration procedure with a physically changing emissivity (or source function) which incorporates the changing Menzel coefficients for NLTE quantum state populations. However, light scattering on free electrons is, in terms of Feynman graphs, a real second-order quantum process and cannot be reduced to consecutive processes of absorption and emission, as it can be in the case of radiative transfer in spectral lines. With duly chosen input parameters, the code SMART enables computing the radiative acceleration imparted to the matter of the stellar atmosphere in turbulence clumps. This also enables connecting the model atmosphere in more detail with the problem of stellar wind triggering. Another problem incorporated into the computer code SMART is the diffusion of chemical elements and their isotopes in the atmospheres of chemically peculiar (CP) stars due to the usual radiative acceleration and the essential additional acceleration generated by the light-induced drift.
As a special case, using duly chosen pixels on the stellar disk, the spectrum of a rotating star can be computed. No instrumental broadening has been incorporated into SMART. To facilitate the study of stellar spectra, a GUI (Graphical User Interface) with selection of labels by ions has been compiled to study the spectral lines of different elements and ions in the computed emergent flux. An amazing feature of SMART is that its code is very short: it occupies only 4 two-sided two-column A4 sheets in landscape format. In addition, if well commented, it is quite easily readable and understandable. We have used the tactic of writing the comments in the right-side margin (columns starting from 73). Such a short code has been composed by widely using unified input physics (for example, the ionisation cross-sections for bound-free transitions and the electron and ion collision rates). A current restriction on the application area of the present version of SMART is that molecules are so far ignored. Thus, it can be used only for lukewarm and hot stellar atmospheres. In the computer code we have tried to avoid bulky, often over-optimised methods, primarily meant to spare computation time. For instance, we compute the continuous absorption coefficient at every wavelength. Nevertheless, within an hour on the personal computer at our disposal (AMD Athlon XP 1700+, 512 MB DDRAM), a stellar spectrum with spectral resolution λ/dλ = 100,000 for the spectral interval 700 -- 30,000 Å is computed. The model input data and the line data used by us are both computed and compiled by R. Kurucz. In order to follow the presence and representability of quantum states and to enumerate them for NLTE studies, a C++ code transforming the needed data to LATEX has been compiled. Thus we have composed a quantum state list for all neutrals and ions in the Kurucz file 'gfhyperall.dat'.
The list enables a more adequate composition of the concept of super-states, including partly correlating super-states. We are grateful to R. Kurucz for making available on CD-ROMs and via the Internet his computer codes ATLAS and SYNTHE, used by us as a starting point in composing the new computer code. We are also grateful to the Estonian Science Foundation for grant ESF-4701.
Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold
1997-01-01
The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.
Transferring ecosystem simulation codes to supercomputers
NASA Technical Reports Server (NTRS)
Skiles, J. W.; Schulbach, C. H.
1995-01-01
Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectoring and 'in-line' capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.
Duct flow nonuniformities for Space Shuttle Main Engine (SSME)
NASA Technical Reports Server (NTRS)
1987-01-01
A three-duct Space Shuttle Main Engine (SSME) Hot Gas Manifold geometry code was developed for use. The methodology of the program is described, recommendations on its implementation made, and an input guide, input deck listing, and a source code listing provided. The code listing is strewn with an abundance of comments to assist the user in following its development and logic. A working source deck will be provided. A thorough analysis was made of the proper boundary conditions and chemistry kinetics necessary for an accurate computational analysis of the flow environment in the SSME fuel side preburner chamber during the initial startup transient. Pertinent results were presented to facilitate incorporation of these findings into an appropriate CFD code. The computation must be a turbulent computation, since the flow field turbulent mixing will have a profound effect on the chemistry. Because of the additional equations demanded by the chemistry model it is recommended that for expediency a simple algebraic mixing length model be adopted. Performing this computation for all or selected time intervals of the startup time will require an abundance of computer CPU time regardless of the specific CFD code selected.
War of Ontology Worlds: Mathematics, Computer Code, or Esperanto?
Rzhetsky, Andrey; Evans, James A.
2011-01-01
The use of structured knowledge representations—ontologies and terminologies—has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies. PMID:21980276
Verifying a computational method for predicting extreme ground motion
Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.
2011-01-01
In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.
An evaluation of four single element airfoil analytic methods
NASA Technical Reports Server (NTRS)
Freuler, R. J.; Gregorek, G. M.
1979-01-01
A comparison of four computer codes for the analysis of two-dimensional single element airfoil sections is presented for three classes of section geometries. Two of the computer codes utilize vortex singularities methods to obtain the potential flow solution. The other two codes solve the full inviscid potential flow equation using finite differencing techniques, allowing results to be obtained for transonic flow about an airfoil including weak shocks. Each program incorporates boundary layer routines for computing the boundary layer displacement thickness and boundary layer effects on aerodynamic coefficients. Computational results are given for a symmetrical section represented by an NACA 0012 profile, a conventional section illustrated by an NACA 65A413 profile, and a supercritical type section for general aviation applications typified by a NASA LS(1)-0413 section. The four codes are compared and contrasted in the areas of method of approach, range of applicability, agreement among each other and with experiment, individual advantages and disadvantages, computer run times and memory requirements, and operational idiosyncrasies.
War of ontology worlds: mathematics, computer code, or Esperanto?
Rzhetsky, Andrey; Evans, James A
2011-09-01
The use of structured knowledge representations-ontologies and terminologies-has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies.
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
48 CFR 1819.1005 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... System (NAICS) codes are: NAICS code Industry category 334111 Electronic Computer Manufacturing. 334418... Manufacturing. 334119 Other Computer Peripheral Equipment Manufacturing. 33422 Radio and Television Broadcasting and Wireless Communication Equipment Manufacturing. 336415 Guided Missile and Space Vehicle Propulsion...
40 CFR 1048.110 - How must my engines diagnose malfunctions?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., the MIL may stay off during later engine operation. (d) Store trouble codes in computer memory. Record and store in computer memory any diagnostic trouble codes showing a malfunction that should illuminate...
Recent applications of the transonic wing analysis computer code, TWING
NASA Technical Reports Server (NTRS)
Subramanian, N. R.; Holst, T. L.; Thomas, S. D.
1982-01-01
An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate-factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of the code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full-potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.
Response surface method in geotechnical/structural analysis, phase 1
NASA Astrophysics Data System (ADS)
Wong, F. S.
1981-02-01
In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
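The workflow described in this abstract can be sketched in a few lines. Everything below is illustrative: the two-parameter stand-in "code", the design points, and the parameter distributions are our own assumptions, not the five-parameter centrifuge problem from the report.

```python
import numpy as np

# Hypothetical stand-in for a long-running computer code: a factor of
# safety as a function of two soil parameters (the real study used five).
def expensive_code(c, phi):
    return 0.02 * c + 1.5 * np.tan(np.radians(phi)) - 0.8

# Step 1: run the code at a small number of design points (four here,
# matching the "as few as four code calculations" result).
design = np.array([[20.0, 25.0], [40.0, 25.0], [20.0, 35.0], [40.0, 35.0]])
y = np.array([expensive_code(c, p) for c, p in design])

# Step 2: fit a linear response surface  y ~ b0 + b1*c + b2*phi.
X = np.column_stack([np.ones(len(design)), design])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 3: replace the code by the surface in the repetitive statistical
# analysis -- here a 100,000-sample Monte Carlo failure-probability estimate.
rng = np.random.default_rng(0)
samples = rng.normal([30.0, 30.0], [5.0, 3.0], size=(100_000, 2))
fs = np.column_stack([np.ones(len(samples)), samples]) @ beta
p_fail = float(np.mean(fs < 1.0))
```

Each Monte Carlo sample costs one dot product instead of one full code run, which is the source of the efficiency claim.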
User's Manual for FEMOM3DS. Version 1.0
NASA Technical Reports Server (NTRS)
Reddy, C.J.; Deshpande, M. D.
1997-01-01
FEMOM3DS is a computer code written in FORTRAN 77 to compute the electromagnetic (EM) scattering characteristics of a three-dimensional object with complex materials using a combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code uses tetrahedral elements with vector edge basis functions for the FEM in the volume of the cavity, and triangular elements with basis functions similar to those described for the MoM at the outer boundary. By virtue of the FEM, the code can handle arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.
Performance measures for transform data coding.
NASA Technical Reports Server (NTRS)
Pearl, J.; Andrews, H. C.; Pratt, W. K.
1972-01-01
This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
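As a rough illustration of why transform choice matters, the sketch below computes the classical coding gain (a simpler stand-in for the paper's R(D) measure) of an 8-point DCT versus no transform for a first-order Gauss-Markov source. The block size and correlation coefficient are our assumptions, not values from the paper.

```python
import numpy as np

N, rho = 8, 0.95
# Covariance of a first-order Gauss-Markov (AR(1)) source, a common test model.
R = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

# Orthonormal DCT-II matrix built from its definition.
k, n = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
T = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
T[0] /= np.sqrt(2.0)

def coding_gain(T, R):
    # Ratio of arithmetic to geometric mean of coefficient variances:
    # larger gain means better energy compaction, hence a better
    # rate-distortion trade-off at a fixed bit budget.
    var = np.diag(T @ R @ T.T)
    return float(var.mean() / np.exp(np.log(var).mean()))

g_dct = coding_gain(T, R)           # > 1: the DCT compacts energy
g_none = coding_gain(np.eye(N), R)  # = 1: no transform, no gain
```

The same comparison could be repeated with Walsh, Haar, or Karhunen-Loeve matrices in place of T to mirror the paper's numerical examples.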
ERIC Educational Resources Information Center
Holbrook, M. Cay; MacCuspie, P. Ann
2010-01-01
Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…
ERIC Educational Resources Information Center
Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela
2015-01-01
Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
NASA Technical Reports Server (NTRS)
Rathjen, K. A.
1977-01-01
A digital computer code, CAVE (Conduction Analysis Via Eigenvalues), which finds application in the analysis of two-dimensional transient heating of hypersonic vehicles, is described. CAVE is written in FORTRAN IV and is operational on both IBM 360-67 and CDC 6600 computers. The method of solution is a hybrid analytical-numerical technique that is inherently stable, permitting large time steps even with the best of conductors having the finest of mesh sizes. The aerodynamic heating boundary conditions are calculated by the code based on the input flight trajectory, or can optionally be calculated external to the code and then entered as input data. The code computes the network conduction and convection links, as well as capacitance values, given basic geometrical and mesh sizes, for four geometries (leading edges, cooled panels, X-24C structure, and slabs). Input and output formats are presented and explained. Sample problems are included. A brief summary of the hybrid analytical-numerical technique, which utilizes eigenvalues (thermal frequencies) and eigenvectors (thermal mode vectors), is given, along with the aerodynamic heating equations that have been incorporated in the code and flow charts.
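The hybrid analytical-numerical idea (diagonalize the conduction network once, then advance the solution exactly in time so that stability never limits the step size) can be sketched for a toy 1-D network. The geometry and property values below are hypothetical, not taken from CAVE.

```python
import numpy as np

# Toy 1-D conduction network: 5 interior nodes, ends held at zero,
# giving the linear system dT/dt = A @ T (values are hypothetical).
n, alpha, dx = 5, 1.0e-4, 0.01
A = (alpha / dx**2) * (np.diag(-2.0 * np.ones(n))
                       + np.diag(np.ones(n - 1), 1)
                       + np.diag(np.ones(n - 1), -1))

# Diagonalize once: A = V diag(lam) V^T (A is symmetric).
lam, V = np.linalg.eigh(A)

def step(T, dt):
    # Exact update T(t+dt) = V exp(lam*dt) V^T T(t). Every lam < 0
    # (the "thermal frequencies"), so the scheme remains stable for
    # arbitrarily large dt, unlike explicit finite differencing.
    return (V * np.exp(lam * dt)) @ (V.T @ T)

T0 = np.ones(n)
T_huge_step = step(T0, 1.0e4)   # one enormous time step, no instability
```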
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Method for rapid high-frequency seismogram calculation
NASA Astrophysics Data System (ADS)
Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo
2009-02-01
We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with an appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account some existing constraints for ray paths and some physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian: "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete-wavenumber method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude on the whole seismogram is negligible. Moreover, the time needed to compute the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3-4-fold less than that needed for the AXITRA code (up to a frequency of 25 Hz).
Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor
NASA Technical Reports Server (NTRS)
Ajmani, Kumud; Liu, Nan-Suey
2005-01-01
The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, with comparisons against experimental data and solutions from the FPVortex code. The NCC code was then used to study fuel-air mixing in the combustor of a candidate rocket-based combined-cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.
A supersonic three-dimensional code for flow over blunt bodies: Program documentation and test cases
NASA Technical Reports Server (NTRS)
Chaussee, D. S.; Mcmillan, O. J.
1980-01-01
The use of a computer code for the calculation of steady, supersonic, three-dimensional, inviscid flow over blunt bodies is illustrated. Input and output are given and explained for two cases: a pointed cone of 20 deg half-angle at 15 deg angle of attack in a free stream with M∞ = 7, and a cone-ogive-cylinder at 10 deg angle of attack with M∞ = 2.86. A source listing of the computer code is provided.
PLASIM: A computer code for simulating charge exchange plasma propagation
NASA Technical Reports Server (NTRS)
Robinson, R. S.; Deininger, W. D.; Winder, D. R.; Kaufman, H. R.
1982-01-01
The propagation of the charge exchange plasma for an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is described, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, is described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.
Extension, validation and application of the NASCAP code
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.
1979-01-01
Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.
NASA Technical Reports Server (NTRS)
Weilmuenster, K. J.; Hamilton, H. H., II
1983-01-01
The computer code HALIS, designed to compute the three-dimensional flow about shuttle-like configurations at angles of attack greater than 25 deg, is described. Results from HALIS are compared where possible with those of an existing flow-field code; such comparisons show excellent agreement. HALIS results are also compared with experimental pressure distributions on shuttle models over a wide range of angles of attack, and these comparisons are excellent. It is demonstrated that the HALIS code can incorporate equilibrium air chemistry in flow-field computations.
Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.
Henry, R; Tiselj, I; Snoj, L
2015-03-01
A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP model. The same modelling assumptions were used in order to check the differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed with analyses of the normalized reaction rates and computations of kinetic parameters for various core configurations.
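For readers unfamiliar with the unit, a keff discrepancy of the size quoted above can be expressed in pcm (per cent mille, 1 pcm = 10^-5 in reactivity). The keff values below are illustrative only, not the actual TRIPOLI or MCNP results.

```python
# Express a keff difference between two Monte Carlo codes in pcm
# (per cent mille: 1 pcm = 1e-5 in reactivity). Illustrative values only.
k_tripoli, k_mcnp = 1.00250, 1.00150

def reactivity(k):
    return (k - 1.0) / k

diff_pcm = (reactivity(k_tripoli) - reactivity(k_mcnp)) * 1.0e5
```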
A comparison of two central difference schemes for solving the Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Maksymiuk, C. M.; Swanson, R. C.; Pulliam, T. H.
1990-01-01
Five viscous transonic airfoil cases were computed by two significantly different computational fluid dynamics codes: an explicit finite-volume algorithm with multigrid, and an implicit finite-difference approximate-factorization method with eigenvector diagonalization. Both methods are described in detail, and their performance on the test cases is compared. The codes utilized the same grids, turbulence model, and computer to provide the truest test of the algorithms. The two approaches produce very similar results, which, for attached flows, also agree well with experimental results; however, the explicit code is considerably faster.
Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D; Volz, Kerstin
2017-06-01
We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.
Evaluation of the efficiency and fault density of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1993-01-01
Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while others allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, a comparison against in-house, manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.
NASA Technical Reports Server (NTRS)
White, P. R.; Little, R. R.
1985-01-01
A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.
1984-11-01
TERRA is a computer code which calculates concentrations of radionuclides and ingrowing daughters in surface and root-zone soil, produce and feed, beef, and milk from a given deposition rate at any location in the conterminous United States. The code is fully integrated with seven other computer codes which together comprise a Computerized Radiological Risk Investigation System, CRRIS. Output from either the long range (>100 km) atmospheric dispersion code RETADD-II or the short range (<80 km) atmospheric dispersion code ANEMOS, in the form of radionuclide air concentrations and ground deposition rates by downwind location, serves as input to TERRA. User-defined deposition rates and air concentrations may also be provided as input to TERRA through use of the PRIMUS computer code. The environmental concentrations of radionuclides predicted by TERRA serve as input to the ANDROS computer code which calculates population and individual intakes, exposures, doses, and risks. TERRA incorporates models to calculate uptake from soil and atmospheric deposition on four groups of produce for human consumption and four groups of livestock feeds. During the environmental transport simulation, intermediate calculations of interception fraction for leafy vegetables, produce directly exposed to atmospherically depositing material, pasture, hay, and silage are made based on location-specific estimates of standing crop biomass. Pasture productivity is estimated by a model which considers the number and types of cattle and sheep, pasture area, and annual production of other forages (hay and silage) at a given location. Calculations are made of the fraction of grain imported from outside the assessment area. TERRA output includes the above calculations and estimated radionuclide concentrations in plant produce, milk, and a beef composite by location.
A survey to identify the clinical coding and classification systems currently in use across Europe.
de Lusignan, S; Minmagh, C; Kennedy, J; Zeimet, M; Bommezijn, H; Bryant, J
2001-01-01
This is a survey to identify the clinical coding systems currently in use across the European Union and the states seeking membership of it. We sought to identify which systems are currently used and to what extent they are subject to local adaptation. Clinical coding should facilitate identifying key medical events in a computerised medical record and aggregating information across groups of records. The emerging new driver is its role as the enabler of the life-long computerised medical record. A prerequisite for this level of functionality is the transfer of information between different computer systems. This transfer can be facilitated either by working on the interoperability problems between disparate systems or by harmonising the underlying data. This paper examines the extent to which the latter has occurred across Europe. The methods were a literature and Internet search, together with requests for information via electronic mail to pan-European mailing lists of health informatics professionals. Coding systems are now a de facto part of health information systems across Europe. There are relatively few coding systems in existence across Europe; ICD-9 and ICD-10, ICPC, and Read were the most established. However, the local adaptation of these classification systems, whether by country or by computer software manufacturer, significantly reduces the ability for the meaning coded within patients' computer records to be easily transferred from one medical record system to another. There is no longer any debate as to whether a coding or classification system should be used. Convergence of different classification systems should be encouraged. Countries and computer manufacturers within the EU should be encouraged to stop making local modifications to coding and classification systems, as this practice risks significantly slowing progress towards easy transfer of records between computer systems.
PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)
NASA Astrophysics Data System (ADS)
Vincenti, Henri
2016-03-01
The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that were previously out of reach of current petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Indeed, achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption and will imply hardware developments directly impacting the way PIC codes are implemented. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce energy consumption related to data movement by using more and more cores on each compute node ('fat nodes'), with a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process Multiple Instructions on Multiple Data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and vectorization (for multicore/manycore CPUs) to fully take advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high-performance skeleton PIC code PICSAR to achieve both good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba Python compiler.
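A minimal sketch of the memory-locality and vectorization point: storing particles as contiguous structure-of-arrays data lets the position/velocity update run as unit-stride SIMD operations. NumPy stands in here for the vectorized kernels discussed in the talk; the field values and units are toy assumptions, not PICSAR code.

```python
import numpy as np

# Structure-of-arrays particle storage: x, v, E are each contiguous,
# so the update below maps cleanly onto SIMD registers.
n_p = 100_000
x = np.zeros(n_p)                 # particle positions
v = np.zeros(n_p)                 # particle velocities
E = np.full(n_p, 2.0)             # field already gathered at the particles
qm, dt = -1.0, 1.0e-3             # charge-to-mass ratio and time step (toy units)

def push(x, v, E):
    v += qm * E * dt              # one fused multiply-add per particle
    x += v * dt                   # streaming update, unit-stride access
    return x, v

x, v = push(x, v, E)
```

The contrast is with an array-of-structures layout (one record per particle), where the same loop would gather strided fields and defeat vector loads.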
Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers
NASA Technical Reports Server (NTRS)
Skiles, J. W.; Schulbach, C. H.
1994-01-01
Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer since there are significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To more appropriately match the application to the architecture (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We show the Cray shared-memory vector-architecture, and discuss our rationale for selecting the Cray. We describe porting the model to the Cray and executing and verifying a baseline version, and we discuss the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
The wide field imager instrument for Athena
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Eder, Josef; Eraerds, Tanja; Nandra, Kirpal; Pietschner, Daniel; Plattner, Markus; Rau, Arne; Strecker, Rafael
2016-07-01
The WFI (Wide Field Imager) instrument is planned to be one of two complementary focal plane cameras on ESA's next X-ray observatory Athena. It combines unprecedented survey power through its large field of view of 40 amin x 40 amin together with excellent count rate capability (>= 1 Crab). The energy resolution of the silicon sensor is state-of-the-art in the energy band of interest from 0.2 keV to 15 keV, e.g. the full width at half maximum of a line at 7 keV will be <= 170 eV until the end of the nominal mission phase. This performance is accomplished by using DEPFET active pixel sensors with a pixel size of 130 μm x 130 μm, well suited to the on-axis angular resolution of 5 arcsec half energy width (HEW) of the mirror system. Each DEPFET pixel is a combined sensor-amplifier structure with a MOSFET integrated onto a fully depleted 450 μm thick silicon bulk. Two detectors are planned for the WFI instrument: a large-area detector comprising four sensors with a total of 1024 x 1024 pixels, and a fast detector optimized for high count rate observations. For bright point sources with an intensity of 1 Crab, this high-count-rate detector permits a throughput of more than 80% with pile-up of less than 1%. The fast readout of the DEPFET pixel matrices is facilitated by an ASIC development called VERITAS-2. Together with the Switcher-A, a control ASIC that allows for operation of the DEPFET in rolling shutter mode, these elements form the key components of the WFI detectors. The detectors are surrounded by a graded-Z shield, whose particular purpose is to suppress fluorescence lines that would otherwise contribute to the instrument background. Together with ultra-thin coating of the sensor and particle identification by the detector itself, the particle-induced background shall be minimized in order to achieve the scientific requirement of a total instrumental background smaller than 5 x 10^-3 cts/cm2/s/keV.
Each detector has its dedicated detector electronics (DE) for supply and data acquisition. Due to the high frame rate in combination with the large pixel array, signal correction and event filtering have to be done on-board and in real-time as the raw data rate would by far exceed the feasible telemetry rate. The data streams are merged and compressed in the Instrument Control and Power distribution Unit (ICPU). The ICPU is the data, control and power interface of the WFI to the Athena spacecraft. The WFI instrument comprises in addition a filter wheel (FW) in front of the camera as well as an optical stray-light baffle. In the current phase A of the Athena project, the technology development is performed. At its end, breadboard models will be developed and tested to demonstrate a technical readiness level (TRL) of at least 5 for the various WFI subsystems before mission adoption in 2020.
User's manual: Subsonic/supersonic advanced panel pilot code
NASA Technical Reports Server (NTRS)
Moran, J.; Tinoco, E. N.; Johnson, F. T.
1978-01-01
Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation, and it should not be construed to represent a finished production program. The pilot code is based on a higher-order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given, as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.
Experimental aerothermodynamic research of hypersonic aircraft
NASA Technical Reports Server (NTRS)
Cleary, Joseph W.
1987-01-01
The 2-D and 3-D advanced computer codes being developed for use in the design of such hypersonic aircraft as the National Aero-Space Plane require comparison of the computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. The objective, therefore, is to provide the hypersonic experimental data base required for validation of advanced computational fluid dynamics (CFD) computer codes and for developing a more thorough understanding of the flow physics these codes must capture. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.
Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis
NASA Technical Reports Server (NTRS)
Ingraham, Daniel; Hixon, Ray
2015-01-01
The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
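The heart of a verification study like this one is the observed order of accuracy: comparing the error on two grids against the scheme's formal order. A minimal sketch of that standard check follows; the function name and the sample error values are illustrative, not taken from the paper.

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed order of accuracy from the errors measured on two
    grids whose spacings differ by the refinement ratio r:
        p = log(e_coarse / e_fine) / log(r)
    A verified code shows p approaching the scheme's formal order."""
    return math.log(e_coarse / e_fine) / math.log(r)

# Hypothetical errors from halving the grid spacing of a
# fourth-order scheme: the error drops by a factor of ~16.
p = observed_order(1.6e-4, 1.0e-5, 2.0)
```

For a high-order CAA code, the verification passes when `p` matches the design order over a range of refinements.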
Additional extensions to the NASCAP computer code, volume 3
NASA Technical Reports Server (NTRS)
Mandell, M. J.; Cooke, D. L.
1981-01-01
The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as resistive, with an electron relaxation time comparable to the inverse plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.
Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers
NASA Astrophysics Data System (ADS)
Ogino, T.
High Performance Fortran (HPF) is one of the modern, widely used techniques for achieving high-performance parallel computation. We have translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran, in which the MHD code was fully vectorized and fully parallelized, to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer. The overall performance and capability of the HPF MHD code proved almost comparable to that of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, an efficiency of 76.5% of the theoretical peak of the VPP5000/56 in vector and parallel computation, permitting comparison with catalog values. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... analysis and design, and computer software design and coding. Given the fact that over $500 million were... acoustic algorithms, computer architecture, and source code that dated to the 1970s. Since that time... 2012. Version 3.0 is an entirely new, state-of-the-art computer program used for predicting noise...
ERIC Educational Resources Information Center
Good, Jonathon; Keenan, Sarah; Mishra, Punya
2016-01-01
The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…
NASA Technical Reports Server (NTRS)
Athavale, Mahesh; Przekwas, Andrzej
2004-01-01
The objectives of the program were to develop computational fluid dynamics (CFD) codes and simpler industrial codes for analyzing and designing advanced seals for air-breathing and space propulsion engines. The CFD code SCISEAL is capable of producing full three-dimensional flow field information for a variety of cylindrical configurations. An implicit multidomain capability allows complex flow domains to be divided for optimum use of computational cells. SCISEAL also has the unique capability to produce cross-coupled stiffness and damping coefficients for rotordynamic computations. The industrial codes consist of a series of separate stand-alone modules designed for expeditious parametric analyses and optimization of a wide variety of cylindrical and face seals. Coupled through a Knowledge-Based System (KBS) that provides a user-friendly Graphical User Interface (GUI), the industrial codes are PC-based, running under the OS/2 operating system. These codes were designed to treat film seals, where a clearance exists between the rotating and stationary components. Leakage is inhibited by surface roughness, small but stiff clearance films, and viscous pumping devices. The codes have proven to be a valuable resource for seal development for future air-breathing and space propulsion engines.
Establishment of a Beta Test Center for the NPARC Code at Central State University
NASA Technical Reports Server (NTRS)
Okhio, Cyril B.
1996-01-01
Central State University has received a supplementary award to purchase computer workstations for the NPARC (National Propulsion Ames Research Center) computational fluid dynamics code Beta Test Center. The computational code has also been acquired for installation on the workstations. The acquisition of this code is an initial step for CSU in joining an alliance composed of NASA, AEDC, the aerospace industry, and academia. A post-doctoral research fellow from a neighboring university will assist the PI in preparing a template for tutorial documents for the Beta Test Center. The major objective of the alliance is to establish a national applications-oriented CFD capability, centered on the NPARC code. By joining the alliance, the Beta Test Center at CSU will allow the PI, as well as undergraduate and post-graduate students, to test the capability of the NPARC code in predicting the physics of aerodynamic/geometric configurations that are of interest to the alliance. Currently, CSU is developing an annual, hands-on conference/workshop based upon the experience acquired from running other codes similar to the NPARC code in the first year of this grant.
Microgravity computing codes. User's guide
NASA Astrophysics Data System (ADS)
1982-01-01
Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes, compatible with the floppy disk drives of the Apple II. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any standard 48K Apple II system with a single floppy disk drive. The programs are protected against invalid operator commands. The programs are described step by step in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.
Computer access security code system
NASA Technical Reports Server (NTRS)
Collins, Earl R., Jr. (Inventor)
1990-01-01
A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
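The rectangle-completion challenge described in this patent abstract can be made concrete with a minimal sketch; all function names, the matrix size, and the character set here are illustrative choices, not taken from the patent. The computer issues two cells lying in different rows and different columns, and the correct response is the characters at the other two corners of the rectangle they define.

```python
import random
import string

def make_matrix(rows=4, cols=4, seed=0):
    """Matrix of distinct alphanumeric characters in random order."""
    rng = random.Random(seed)
    chars = rng.sample(string.ascii_uppercase + string.digits, rows * cols)
    return [chars[r * cols:(r + 1) * cols] for r in range(rows)]

def challenge(matrix, rng):
    """Pick two cells in different rows AND different columns,
    i.e. opposite corners of a theoretical rectangle."""
    rows, cols = len(matrix), len(matrix[0])
    r1, r2 = rng.sample(range(rows), 2)
    c1, c2 = rng.sample(range(cols), 2)
    return (r1, c1), (r2, c2)

def expected_response(matrix, cell_a, cell_b):
    """The proper response completes the rectangle: the characters
    at the remaining two corners."""
    (r1, c1), (r2, c2) = cell_a, cell_b
    return {matrix[r1][c2], matrix[r2][c1]}
```

In the patent's scheme, each character-subset pair is discarded after one use, so an eavesdropper who records a challenge/response pair learns nothing reusable.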
Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.M.; Hochstedler, R.D.
1997-02-01
Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
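The first technique listed, replacing linear searches with binary versions, can be sketched for a task typical of transport codes: locating the interval of a sorted energy grid that contains a sample point. This is illustrative code, not the ITS FORTRAN source.

```python
import bisect

def find_bin_linear(grid, x):
    """Original-style linear scan: O(n) per lookup."""
    for i in range(len(grid) - 1):
        if grid[i] <= x < grid[i + 1]:
            return i
    raise ValueError("x outside grid")

def find_bin_binary(grid, x):
    """Binary-search replacement: O(log n) per lookup, returning
    the same interval index as the linear scan."""
    i = bisect.bisect_right(grid, x) - 1
    if i < 0 or i >= len(grid) - 1:
        raise ValueError("x outside grid")
    return i
```

Because such lookups run once per collision over millions of histories, the logarithmic version contributes directly to the factor-of-two speed-ups reported above.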
Quantum computing with Majorana fermion codes
NASA Astrophysics Data System (ADS)
Litinski, Daniel; von Oppen, Felix
2018-05-01
We establish a unified framework for Majorana-based fault-tolerant quantum computation with Majorana surface codes and Majorana color codes. All logical Clifford gates are implemented with zero-time overhead. This is done by introducing a protocol for Pauli product measurements with tetrons and hexons which only requires local 4-Majorana parity measurements. An analogous protocol is used in the fault-tolerant setting, where tetrons and hexons are replaced by Majorana surface code patches, and parity measurements are replaced by lattice surgery, still only requiring local few-Majorana parity measurements. To this end, we discuss twist defects in Majorana fermion surface codes and adapt the technique of twist-based lattice surgery to fermionic codes. Moreover, we propose a family of codes that we refer to as Majorana color codes, which are obtained by concatenating Majorana surface codes with small Majorana fermion codes. Majorana surface and color codes can be used to decrease the space overhead and stabilizer weight compared to their bosonic counterparts.
Computational Nuclear Physics and Post Hartree-Fock Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietz, Justin; Sam, Novario; Hjorth-Jensen, M.
We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10, and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, thereby allowing the reader to start writing his or her own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.
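As a flavor of the kind of code example such lectures provide, here is a toy sketch of a second-order (MP2-like) many-body perturbation correction on top of a Hartree-Fock reference. The data layout (a list of single-particle energies plus a dictionary of antisymmetrized two-body elements) is an assumption made for illustration, not the chapter's actual code.

```python
from itertools import combinations

def mp2_energy(eps, v, n_holes):
    """Second-order perturbation correction to the HF energy:
        E2 = sum_{i<j, a<b} |<ij||ab>|^2 / (e_i + e_j - e_a - e_b)
    The restricted sums (i<j, a<b) absorb the usual 1/4 factor.
    eps: single-particle energies, holes first;
    v[(i, j, a, b)]: antisymmetrized elements <ij||ab> (0 if absent)."""
    holes = range(n_holes)
    parts = range(n_holes, len(eps))
    e2 = 0.0
    for i, j in combinations(holes, 2):
        for a, b in combinations(parts, 2):
            me = v.get((i, j, a, b), 0.0)
            denom = eps[i] + eps[j] - eps[a] - eps[b]
            e2 += me * me / denom
    return e2
```

For a two-hole/two-particle toy system with a 2-unit gap and a single element of 0.5, this gives E2 = 0.25 / (-4) = -0.0625, the expected attractive correction.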
Computer codes for thermal analysis of a solid rocket motor nozzle
NASA Technical Reports Server (NTRS)
Chauhan, Rajinder Singh
1988-01-01
A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate the ablation or decomposition response of the nozzle material. A code called Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. A part of the verification work on FANTASTIC was done by using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved by using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC. These results will then be compared for verification of FANTASTIC.
New Parallel computing framework for radiation transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostin, M.A. (Michigan State U., NSCL); Mokhov, N.V.
A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it is used with, and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
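The checkpoint-merging facility can be illustrated with a hedged sketch: if each run's checkpoint stores raw tally accumulators rather than finished statistics, merging reduces to summing the accumulators and recomputing the mean and its error. The dictionary layout and function name below are illustrative assumptions, not the framework's actual C++ API.

```python
import math

def merge_checkpoints(checkpoints):
    """Merge independent Monte Carlo runs by pooling raw accumulators
    (history count n, sum of scores, sum of squared scores), then
    recompute the pooled mean and its standard error."""
    n = sum(c["n"] for c in checkpoints)
    s = sum(c["sum"] for c in checkpoints)
    s2 = sum(c["sum_sq"] for c in checkpoints)
    mean = s / n
    # variance of the mean from pooled first and second moments
    var = (s2 / n - mean * mean) / (n - 1)
    return mean, math.sqrt(max(var, 0.0))
```

Storing raw sums rather than per-run means is what makes the merge exact: the combined result is identical to a single run over all histories.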
Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.
2005-01-01
In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures, and we also discuss the development of a new Green's function code. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation, using marching procedures and Green's function techniques, are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's Initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.
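The marching procedures mentioned above can be conveyed with a deliberately simplified sketch: a straight-ahead, one-dimensional attenuation march through shielding. Real HZETRN marching also accumulates secondary-particle source terms at each depth step; the function below is a toy illustration, not the HZETRN algorithm.

```python
import math

def march_fluence(phi0, sigma, dx, n_steps):
    """Toy marching procedure for 1-D straight-ahead transport:
    step the primary fluence through the material,
        phi_{k+1} = phi_k * exp(-sigma * dx),
    returning the fluence profile at each depth grid point."""
    phi = [phi0]
    for _ in range(n_steps):
        phi.append(phi[-1] * math.exp(-sigma * dx))
    return phi
```

The marching structure, advancing the solution one depth slab at a time, is what makes the method efficient enough for full three-dimensional extension.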
User's manual for CBS3DS, version 1.0
NASA Astrophysics Data System (ADS)
Reddy, C. J.; Deshpande, M. D.
1995-10-01
CBS3DS is a computer code written in FORTRAN 77 to compute the backscattering radar cross section of cavity-backed apertures in an infinite ground plane and slots in a thick infinite ground plane. CBS3DS implements the hybrid Finite Element Method (FEM) and Method of Moments (MoM) techniques. This code uses tetrahedral elements with vector edge basis functions for the FEM in the volume of the cavity/slot, and triangular elements with the corresponding basis functions for the MoM at the apertures. By virtue of the FEM, this code can handle arbitrarily shaped three-dimensional cavities filled with inhomogeneous lossy materials; by virtue of the MoM, the apertures can be of any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computer on which the code is intended to run.
Program optimizations: The interplay between power, performance, and energy
Leon, Edgar A.; Karlin, Ian; Grant, Ryan E.; ...
2016-05-16
Practical considerations for future supercomputer designs will impose limits on both instantaneous power consumption and total energy consumption. Working within these constraints while providing the maximum possible performance, application developers will need to optimize their code for speed alongside power and energy concerns. This paper analyzes the effectiveness of several code optimizations including loop fusion, data structure transformations, and global allocations. A per-component measurement and analysis of different architectures is performed, enabling the examination of code optimizations on different compute subsystems. Using an explicit hydrodynamics proxy application from the U.S. Department of Energy, LULESH, we show how code optimizations impact different computational phases of the simulation. This provides insight for simulation developers into the best optimizations to use during particular simulation compute phases when optimizing code for future supercomputing platforms. Here, we examine and contrast both x86 and Blue Gene architectures with respect to these optimizations.
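Loop fusion, the first optimization analyzed, can be shown in a minimal sketch: two traversals of the data are merged into one, eliminating the intermediate array and roughly halving memory traffic, which matters for both speed and energy. This is illustrative Python, not LULESH code.

```python
def unfused(a, b):
    """Two separate passes over the data with an intermediate
    array: extra memory traffic and allocation."""
    tmp = [x * 2.0 for x in a]          # pass 1: scale
    return [t + y for t, y in zip(tmp, b)]  # pass 2: add

def fused(a, b):
    """Fused version: one pass, no intermediate array,
    identical results."""
    return [x * 2.0 + y for x, y in zip(a, b)]
```

On cache-constrained subsystems, the fused loop touches each element of `a` and `b` exactly once, which is the behavior the paper's per-component measurements are designed to expose.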
Computational simulation of progressive fracture in fiber composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.
NASA Technical Reports Server (NTRS)
Stack, S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic/representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
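How predictive coding can realize Bayesian inference is easy to show in a toy Gaussian setting: iterating on precision-weighted prediction errors converges to the Bayesian posterior mean. This is a minimal sketch under assumed Gaussian likelihood and prior; the function name and learning rate are illustrative choices, not the paper's model.

```python
def predictive_coding_estimate(obs, prior_mean, prior_var, obs_var,
                               lr=0.1, n_iter=200):
    """Infer a latent x from one observation by gradient steps on
    two prediction errors. The fixed point is the Bayesian
    (precision-weighted) posterior mean."""
    x = prior_mean
    for _ in range(n_iter):
        err_obs = (obs - x) / obs_var          # sensory prediction error
        err_prior = (prior_mean - x) / prior_var  # prior prediction error
        x += lr * (err_obs + err_prior)
    return x
```

With equal variances the estimate lands halfway between prior and observation; more precise observations pull it proportionally closer, exactly as Bayesian cue weighting predicts. This is the sense in which the motif (error-driven updating) serves the goal (posterior inference) without being identical to it.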
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. 
CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler.
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. 
Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. 
CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. 
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. 
Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or Microsoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
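The rule-based paradigm described above (facts matched against rule conditions, an agenda ordered by rule priority, and refraction so a fired rule does not refire) can be sketched as a toy forward-chaining engine in Python. This is only an illustration of the idea, not CLIPS syntax or any of its seven actual conflict-resolution strategies; the rule names, facts, and salience values are invented.

```python
# Toy forward-chaining rule engine: rules match facts and fire actions;
# the agenda is resolved by a numeric priority (salience), loosely
# analogous to CLIPS's dynamic rule priorities. Illustrative only.

class Rule:
    def __init__(self, name, condition, action, salience=0):
        self.name = name
        self.condition = condition  # callable: facts -> bool
        self.action = action        # callable: facts -> None (may add facts)
        self.salience = salience

def run(rules, facts):
    fired = []
    while True:
        # Agenda: rules whose conditions match and that have not yet fired
        agenda = [r for r in rules
                  if r.name not in fired and r.condition(facts)]
        if not agenda:
            break
        # Conflict resolution: highest salience fires first
        rule = max(agenda, key=lambda r: r.salience)
        rule.action(facts)
        fired.append(rule.name)
    return fired

rules = [
    Rule("alarm", lambda f: "smoke" in f,
         lambda f: f.add("alarm-on"), salience=10),
    Rule("notify", lambda f: "alarm-on" in f,
         lambda f: f.add("notified")),
]
facts = {"smoke"}
order = run(rules, facts)
```

Asserting "smoke" makes the "alarm" rule eligible; firing it asserts "alarm-on", which in turn activates "notify", the same chaining behavior the rule-based paradigm provides.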
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Riley, G.
1994-01-01
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
1997-12-16
The access tower around the Athena II rocket for the Lunar Prospector spacecraft, to be launched for NASA by Lockheed Martin, was rolled back today at Launch Complex 46 at Cape Canaveral Air Station for final prelaunch preparations. The small robotic spacecraft is designed to provide the first global maps of the Moon's surface compositional elements and its gravitational and magnetic fields. The launch of Lunar Prospector is currently scheduled for Jan. 5, 1998 at 8:31 p.m.
Report of the Admission of Women to the U.S. Military Academy (Project Athena III)
1979-06-01
[Table-of-contents fragments from the scanned report, including "Attitudes Toward Women's Roles," "Cadet Basic Training: Male and Female Performances," and "Comparison of Attitude Toward Women Scores (WANS) for Men and Women at Several Institutions." The recoverable text notes that attitudes toward women's roles in society and in the military are more traditional for men.]
NASA Technical Reports Server (NTRS)
Ming, D. W.; Morris, R. V.; Gellert, R.; Yen, A.; Bell, J. F., III; Blaney, D.; Christensen, P. R.; Crumpler, L.; Chu, P.; Farrand, W. H.
2005-01-01
The primary objective of the MER Spirit and Opportunity Rovers is to identify and investigate rocks, outcrops, and soils that have the highest possible chance of preserving evidence of water activity on Mars. The Athena Science Instrument Payload onboard the two rovers has provided geochemical and mineralogical information that indicates a variety of aqueous processes and various degrees of alteration at the two landing sites.
Report of the Admission of Women to the U.S. Military Academy (Project Athena IV),
1980-06-01
analysis of coeducation at West Point from June 1979 to June 1980. Included are highlights of individual research projects conducted to understand...which focused on attitudes, values, and performances of cadets in the first three years of coeducation (Adams, 1979). Graduate Assessment phase which has...important organizational process -- coeducation at a military institution. The second program goal is an outgrowth of the first -- to provide knowledge to
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wemhoff, A P; Burnham, A K
2006-04-05
Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins model were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and that the isoconversional kinetic model gives results very similar to those of the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 C/hr, and further exploration of that effect is warranted.
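The kind of kinetics being cross-validated above can be sketched with the classic Prout-Tompkins autocatalytic rate law, dalpha/dt = k*alpha*(1-alpha), with an Arrhenius rate constant k = A*exp(-E/RT). The A and E values below are invented round numbers for illustration; they are not the calibrated HMX parameters, and this is the classic form rather than the extended model used in the report.

```python
import math

# Sketch of autocatalytic decomposition kinetics in the classic
# Prout-Tompkins form dalpha/dt = k * alpha * (1 - alpha), with an
# Arrhenius rate constant k = A * exp(-E / (R * T)). A and E are
# assumed illustrative values, not calibrated HMX kinetics.

R = 8.314    # gas constant, J/(mol K)
A = 1.0e12   # assumed pre-exponential factor, 1/s
E = 1.5e5    # assumed activation energy, J/mol

def time_to_fraction(T, alpha_stop=0.5, alpha0=1e-4, steps_per_efold=100):
    """Euler-integrate dalpha/dt = k*alpha*(1-alpha) at fixed temperature
    T (K); return the time for the reacted fraction to reach alpha_stop."""
    k = A * math.exp(-E / (R * T))
    dt = 1.0 / (k * steps_per_efold)   # time step resolves the rate 1/k
    alpha, t = alpha0, 0.0
    while alpha < alpha_stop:
        alpha += k * alpha * (1.0 - alpha) * dt
        t += dt
    return t

# Hotter confinement reacts sooner: a simple "time to explosion" trend.
t500 = time_to_fraction(500.0)
t550 = time_to_fraction(550.0)
```

The monotonic decrease of reaction time with temperature is the qualitative behavior the ODTX and STEX tests probe; the actual codes couple such kinetics to heat transport.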
Ducted-Fan Engine Acoustic Predictions using a Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Rumsey, C. L.; Biedron, R. T.; Farassat, F.; Spence, P. L.
1998-01-01
A Navier-Stokes computer code is used to predict one of the ducted-fan engine acoustic modes that results from rotor-wake/stator-blade interaction. A patched sliding-zone interface is employed to pass information between the moving rotor row and the stationary stator row. The code produces averaged aerodynamic results downstream of the rotor that agree well with a widely used average-passage code. The acoustic mode of interest is generated successfully by the code and is propagated well upstream of the rotor; temporal and spatial numerical resolution are fine enough such that attenuation of the signal is small. Two acoustic codes are used to find the far-field noise. Near-field propagation is computed by using Eversman's wave envelope code, which is based on a finite-element model. Propagation to the far field is accomplished by using the Kirchhoff formula for moving surfaces with the results of the wave envelope code as input data. Comparisons of measured and computed far-field noise levels show fair agreement in the range of directivity angles where the peak radiation lobes from the inlet are observed. Although only a single acoustic mode is targeted in this study, the main conclusion is a proof-of-concept: Navier-Stokes codes can be used both to generate and propagate rotor/stator acoustic modes forward through an engine, where the results can be coupled to other far-field noise prediction codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avara, Mark J.; Reynolds, Christopher S.; Bogdanovic, Tamara, E-mail: mavara@astro.umd.edu, E-mail: chris@astro.umd.edu, E-mail: tamarab@gatech.edu
2013-08-20
The role played by magnetic fields in the intracluster medium (ICM) of galaxy clusters is complex. The weakly collisional nature of the ICM leads to thermal conduction that is channeled along field lines. This anisotropic heat conduction profoundly changes the instabilities of the ICM atmosphere, with convective instabilities being driven by temperature gradients of either sign. Here, we employ the Athena magnetohydrodynamic code to investigate the local non-linear behavior of the heat-flux-driven buoyancy instability (HBI) relevant in the cores of cooling-core clusters where the temperature increases with radius. We study a grid of two-dimensional simulations that span a large range of initial magnetic field strengths and numerical resolutions. For very weak initial fields, we recover the previously known result that the HBI wraps the field in the horizontal direction, thereby shutting off the heat flux. However, we find that simulations that begin with intermediate initial field strengths have a qualitatively different behavior, forming HBI-stable filaments that resist field-line wrapping and enable sustained vertical conductive heat flux at a level of 10%-25% of the Spitzer value. While astrophysical conclusions regarding the role of conduction in cooling cores require detailed global models, our local study proves that systems dominated by the HBI do not necessarily quench the conductive heat flux.
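The field-aligned conduction driving the HBI can be sketched from its standard form, q = -chi * b_hat * (b_hat . grad T): only the component of the temperature gradient along the magnetic field carries heat. The vectors and the unit conductivity below are illustrative; in the actual problem chi would be the (temperature-dependent) Spitzer conductivity.

```python
import math

# Anisotropic (field-aligned) conductive heat flux:
#   q = -chi * b_hat * (b_hat . grad_T)
# Only the temperature-gradient component along B conducts heat.
# chi = 1.0 is an illustrative placeholder, not a Spitzer value.

def aligned_heat_flux(B, grad_T, chi=1.0):
    bmag = math.sqrt(sum(c * c for c in B))
    b_hat = [c / bmag for c in B]
    par = sum(bh * g for bh, g in zip(b_hat, grad_T))  # b_hat . grad_T
    return [-chi * par * bh for bh in b_hat]

# Horizontal field with a vertical temperature gradient: zero conductive
# flux, which is how HBI field-line wrapping shuts off vertical conduction.
q_wrapped = aligned_heat_flux([1.0, 0.0, 0.0], [0.0, 0.0, 1.0])
# Vertical field: the full flux flows along the gradient.
q_vertical = aligned_heat_flux([0.0, 0.0, 1.0], [0.0, 0.0, 1.0])
```

The two cases bracket the simulation outcomes described above: fully wrapped fields quench vertical conduction, while the HBI-stable filaments preserve a fraction of the vertical-field flux.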
NASA Astrophysics Data System (ADS)
Sekiya, Minoru; Onishi, Isamu K.
2018-06-01
The streaming instability and Kelvin–Helmholtz instability are considered the two major sources causing clumping of dust particles and turbulence in the dust layer of a protoplanetary disk, as long as we consider the dead zone where the magnetorotational instability does not grow. Extensive numerical simulations have been carried out in order to elucidate the condition for the development of particle clumping caused by the streaming instability. In this paper, a set of two parameters suitable for classifying the numerical results is proposed. One is the Stokes number that has been employed in previous works, and the other is the dust particle column density nondimensionalized using the gas density in the midplane, the Keplerian angular velocity, and the difference between the Keplerian and gaseous orbital velocities. The magnitude of dust clumping is a measure of the behavior of the dust layer. Using three-dimensional numerical simulations of dust particles and gas based on Athena code v. 4.2, it is confirmed that the magnitudes of dust clumping for two disk models are similar if the corresponding sets of values of the two parameters are identical to each other, even if the values of the metallicity (i.e., the ratio of the column density of the dust particles to that of the gas) are different.
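The abstract lists the quantities used to nondimensionalize the dust column density but not the exact combination. One dimensionally consistent combination of those quantities (an assumption here, not necessarily the paper's definition) is Sigma_tilde = Sigma_d * Omega / (rho_g * dv), which is dimensionless and can be sketched as:

```python
# Hypothetical nondimensional dust column density built from the quantities
# named in the abstract: dust column density Sigma_d (kg/m^2), Keplerian
# angular velocity Omega (1/s), midplane gas density rho_g (kg/m^3), and
# the Kepler-gas velocity difference dv (m/s). This specific combination
# is an assumption for illustration, not the paper's exact definition.

def sigma_tilde(sigma_d, omega, rho_g, dv):
    return sigma_d * omega / (rho_g * dv)

# Two disk models with different absolute scales but identical Sigma_tilde
# (and identical Stokes number) would be expected to clump similarly,
# even if their metallicities differ. Numbers are illustrative only.
a = sigma_tilde(sigma_d=10.0, omega=2.0e-7, rho_g=1.0e-6, dv=50.0)
b = sigma_tilde(sigma_d=5.0, omega=4.0e-7, rho_g=1.0e-6, dv=50.0)
```

Here the two parameter sets collapse onto the same nondimensional value, which is the kind of classification the two-parameter scheme enables.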
NASA Technical Reports Server (NTRS)
Bonhaus, Daryl L.; Wornom, Stephen F.
1991-01-01
Two codes which solve the 3-D Thin Layer Navier-Stokes (TLNS) equations are used to compute the steady state flow for two test cases representing typical finite wings at transonic conditions. Several grids of C-O topology and varying point densities are used to determine the effects of grid refinement. After a description of each code and test case, standards for determining code efficiency and accuracy are defined and applied to determine the relative performance of the two codes in predicting turbulent transonic wing flows. Comparisons of computed surface pressure distributions with experimental data are made.
Computer Simulation of the VASIMR Engine
NASA Technical Reports Server (NTRS)
Garrison, David
2005-01-01
The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.
Procedures for the computation of unsteady transonic flows including viscous effects
NASA Technical Reports Server (NTRS)
Rizzetta, D. P.
1982-01-01
Modifications of the code LTRAN2, developed by Ballhaus and Goorjian, which account for viscous effects in the computation of planar unsteady transonic flows are presented. Two models are considered, and their theoretical development and numerical implementation are discussed. Computational examples employing both models are compared with inviscid solutions and with experimental data. Use of the modified code is described.
ERIC Educational Resources Information Center
Pon-Barry, Heather; Packard, Becky Wai-Ling; St. John, Audrey
2017-01-01
A dilemma within computer science departments is developing sustainable ways to expand capacity within introductory computer science courses while remaining committed to inclusive practices. Training near-peer mentors for peer code review is one solution. This paper describes the preparation of near-peer mentors for their role, with a focus on…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... analysis and design, and computer software design and coding. Given the fact that over $500 million were... acoustic algorithms, computer architecture, and source code that dated to the 1970s. Since that time... towards the end of 2012. Version 3.0 is an entirely new, state-of-the-art computer program used for...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administration COURT SERVICES AND OFFENDER SUPERVISION AGENCY FOR THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS... proprietary interest in the information. (e) Computer software means tools by which records are created, stored, and retrieved. Normally, computer software, including source code, object code, and listings of...
Error threshold for color codes and random three-body Ising models.
Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A
2009-08-28
We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p_c = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.
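The mapping above leads to Monte Carlo sampling of an Ising model with three-spin couplings. As a minimal one-dimensional sketch (the paper's model lives on a two-dimensional lattice; the chain geometry, parameters, and function names here are illustrative assumptions), a Metropolis simulation with random three-body bonds can be written as:

```python
import math
import random

def three_body_ising(n=64, p=0.1, beta=0.5, sweeps=100, seed=1):
    """Metropolis sampling of a 1-D chain with random three-spin couplings.

    Each coupling J_i acts on the triple (s_i, s_{i+1}, s_{i+2}) and is set
    to -1 with probability p, a toy version of the random-bond disorder
    used in the error-threshold mapping."""
    rng = random.Random(seed)
    spins = [1] * n
    J = [-1 if rng.random() < p else 1 for _ in range(n)]

    def triple(i):
        # energy -J_i s_i s_{i+1} s_{i+2} of the triple starting at site i
        return -J[i % n] * spins[i % n] * spins[(i + 1) % n] * spins[(i + 2) % n]

    for _ in range(sweeps):
        for j in range(n):
            # flipping s_j negates exactly the three triples that contain it
            dE = -2 * sum(triple(i) for i in (j - 2, j - 1, j))
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spins[j] = -spins[j]

    return sum(spins) / n  # magnetization per spin

m = three_body_ising()
```

Locating a threshold such as p_c = 0.109(2) requires scanning p and beta on the proper lattice and performing finite-size scaling; this sketch only shows the elementary update step.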
Photoionization and High Density Gas
NASA Technical Reports Server (NTRS)
Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)
2002-01-01
We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code, which has been available for public use for some time. However, it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and an improved physical description of ionization/excitation. In particular, it is now applicable to high-density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high-density situations.
The mathematical theory of signal processing and compression-designs
NASA Astrophysics Data System (ADS)
Feria, Erlan H.
2006-05-01
The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with compressing signal source memory space, while processor coding deals with compressing signal processor computational time. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.
Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code
NASA Technical Reports Server (NTRS)
Weinberg, B. C.; Mcdonald, H.
1980-01-01
There is considerable interest in developing a numerical scheme for solving the time dependent viscous compressible three dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations for several Cartesian test cases are presented. The computer code can be applied to more complex flow fields such as those encountered on rotating airfoils.
NASA Technical Reports Server (NTRS)
Teske, M. E.
1984-01-01
This is a user manual for the computer code "AGDISP" (AGricultural DISPersal) which has been developed to predict the deposition of material released from fixed and rotary wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are simultaneously predicted. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy and terrain on the deposition pattern.
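The novelty noted above is the simultaneous prediction of the mean trajectory and the variance about it. A minimal sketch of that idea (not AGDISP's actual equations; the long-time Taylor-dispersion growth rate and all parameter names are illustrative assumptions) marches both quantities together in one loop:

```python
def settle(z0, vs, sigma_w2, tau, dt, steps):
    """March the mean height and vertical-position variance of a released
    droplet together: the mean falls at the settling velocity vs, while the
    variance grows from turbulent velocity fluctuations of variance sigma_w2
    with correlation time tau (long-time Taylor result: dvar/dt = 2*sigma_w2*tau)."""
    z, var, t, history = z0, 0.0, 0.0, []
    for _ in range(steps):
        z -= vs * dt                     # mean trajectory: gravitational settling
        var += 2.0 * sigma_w2 * tau * dt # spread about the mean from turbulence
        t += dt
        history.append((t, z, var))
    return history

hist = settle(z0=10.0, vs=1.0, sigma_w2=0.04, tau=0.5, dt=0.1, steps=20)
```

Carrying the variance alongside the mean is what allows a single pass to yield a deposition pattern (a distribution) rather than a single impact point.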
HOMAR: A computer code for generating homotopic grids using algebraic relations: User's manual
NASA Technical Reports Server (NTRS)
Moitra, Anutosh
1989-01-01
A computer code for fast automatic generation of quasi-three-dimensional grid systems for aerospace configurations is described. The code employs a homotopic method to algebraically generate two-dimensional grids in cross-sectional planes, which are stacked to produce a three-dimensional grid system. Implementation of the algebraic equivalents of the homotopic relations for generating body geometries and grids is explained. Procedures for controlling grid orthogonality and distortion are described. Test cases with description and specification of inputs are presented in detail. The FORTRAN computer program and notes on implementation and use are included.
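The core of a homotopic method is a one-parameter family of curves blending an inner boundary into an outer one. As a minimal sketch of the idea in a single cross-sectional plane (HOMAR itself is FORTRAN and includes orthogonality and distortion control; the linear blend and all names here are illustrative assumptions), the algebraic relation H(s, t) = (1 - t) * inner(s) + t * outer(s) fills the region between two parameterized curves:

```python
import math

def homotopic_grid(inner, outer, ni=33, nj=17):
    """Fill the region between two parameterized boundary curves with the
    linear homotopy H(s, t) = (1 - t)*inner(s) + t*outer(s); stacking such
    planes along a third axis gives a quasi-three-dimensional system."""
    grid = []
    for j in range(nj):
        t = j / (nj - 1)                 # t = 0 on the body, t = 1 on the outer boundary
        row = []
        for i in range(ni):
            s = i / (ni - 1)             # arc-length-like parameter along each curve
            xi, yi = inner(s)
            xo, yo = outer(s)
            row.append(((1 - t) * xi + t * xo, (1 - t) * yi + t * yo))
        grid.append(row)
    return grid

# example: a unit circle blended out to a concentric circle of radius 3
inner = lambda s: (math.cos(2 * math.pi * s), math.sin(2 * math.pi * s))
outer = lambda s: (3 * math.cos(2 * math.pi * s), 3 * math.sin(2 * math.pi * s))
g = homotopic_grid(inner, outer)
```

Because the blend is algebraic, every grid point is computed directly with no iteration, which is what makes this class of methods fast; higher-order blending functions can replace the linear one to improve orthogonality near the body.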
Advances in Computational Capabilities for Hypersonic Flows
NASA Technical Reports Server (NTRS)
Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip
1997-01-01
The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980s to the present day. The current status of code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.