Sample records for station computer codes

  1. Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burk, K.W.; Andrews, G.L.

    1989-02-01

    The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in data bases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those data bases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the data base custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.

  2. Manual for obscuration code with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Takacs, L.

    1986-01-01

    The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multiple-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.

  3. Near Zone: Basic scattering code user's manual with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Silvestro, J. W.

    1989-01-01

    The Electromagnetic Code - Basic Scattering Code, Version 3, is a user-oriented computer code to analyze near and far zone patterns of antennas in the presence of scattering structures, to provide coupling between antennas in a complex environment, and to perform radiation hazard calculations at UHF and above. The analysis is based on uniform asymptotic techniques formulated in terms of the Uniform Geometrical Theory of Diffraction (UTD). Complicated structures can be simulated by arbitrarily oriented flat plates and an infinite ground plane that can be perfectly conducting or dielectric. Also, perfectly conducting finite elliptic cylinders, elliptic cone frustum sections, and finite composite ellipsoids can be used to model the superstructure of a ship, the body of a truck, an airplane, a satellite, etc. This manual gives special consideration to space station modeling applications. It is a user manual designed to give an overall view of the operation of the computer code, to instruct a user in how to model structures, and to show the validity of the code by comparing various computed results against measured and alternative calculations, such as method of moments, whenever available.

  4. The Remote Analysis Station (RAS) as an instructional system

    NASA Technical Reports Server (NTRS)

    Rogers, R. H.; Wilson, C. L.; Dye, R. H.; Jaworski, E.

    1981-01-01

    "Hands-on" training in LANDSAT data analysis techniques can be obtained using a desk-top, interactive remote analysis station (RAS) which consists of a color CRT imagery display, with alphanumeric overwrite and keyboard, as well as a cursor controller and modem. This portable station can communicate via modem and dial-up telephone with a host computer at 1200 baud or it can be hardwired to a host computer at 9600 baud. A Z80 microcomputer controls the display refresh memory and remote station processing. LANDSAT data is displayed as three-band false-color imagery, one-band color-sliced imagery, or color-coded processed imagery. Although the display memory routinely operates at 256 x 256 picture elements, a display resolution of 128 x 128 can be selected to fill the display faster. In the false color mode the computer packs the data into one 8-bit character. When the host is not sending pictorial information the characters sent are in ordinary ASCII code. System capabilities are described.

  5. Space station integrated wall design and penetration damage control

    NASA Technical Reports Server (NTRS)

    Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.

    1987-01-01

    The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system and is written in FORTRAN 77 with no VAX extensions. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines are described for constructing a FEM for the space station under consideration. The methods used to analyze the sensitivity of PNP to variations in design are described. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, and the data flow within the analysis is illustrated.
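
    At its core, the probability of no penetration that such a code accumulates over the finite elements is a Poisson zero-impact probability. A minimal sketch (Python) of that quantity, with illustrative areas and fluxes rather than values from the report:

      import math

      # Poisson probability of no penetration (PNP) summed over elements.
      # Areas and penetrating-particle fluxes are illustrative only.

      def pnp(elements, years):
          """elements: iterable of (area_m2, penetrating_flux_per_m2_yr) pairs."""
          expected_hits = sum(area * flux for area, flux in elements) * years
          return math.exp(-expected_hits)  # probability of zero penetrations

      print(pnp([(10.0, 1e-5), (25.0, 4e-6)], years=10.0))  # about 0.998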

  6. Project summaries

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis of the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters, was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consist of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.

  7. Field estimates of gravity terrain corrections and Y2K-compatible method to convert from gravity readings with multiple base stations to tide- and long-term drift-corrected observations

    USGS Publications Warehouse

    Plouff, Donald

    2000-01-01

    Gravity observations are directly made or are obtained from other sources by the U.S. Geological Survey in order to prepare maps of the anomalous gravity field and consequently to interpret the subsurface distribution of rock densities and associated lithologic or geologic units. Observations are made in the field with gravity meters at new locations and at reoccupations of previously established gravity "stations." This report illustrates an interactively-prompted series of steps needed to convert gravity "readings" to values that are tied to established gravity datums and includes computer programs to implement those steps. Inasmuch as individual gravity readings have small variations, gravity-meter (instrument) drift may not be smoothly variable, and accommodations may be needed for ties to previously established stations, the reduction process is iterative. Decision-making by the program user is prompted by lists of best values and graphical displays. Notes about irregularities of topography, which affect the value of observed gravity but are not shown in sufficient detail on topographic maps, must be recorded in the field. This report illustrates ways to record field notes (distances, heights, and slope angles) and includes computer programs to convert field notes to gravity terrain corrections. This report includes approaches that may serve as models for other applications, for example: portrayal of system flow; style of quality control to document and validate computer applications; lack of dependence on proprietary software except source code compilation; method of file-searching with a dwindling list; interactive prompting; computer code to write directly in the PostScript (Adobe Systems Incorporated) printer language; and highlighting the four-digit year on the first line of time-dependent data sets for assured Y2K compatibility. Computer source codes provided are written in the Fortran scientific language. In order for the programs to operate, they first must be converted (compiled) into an executable form on the user's computer. Although program testing was done in a UNIX (tradename of American Telephone and Telegraph Company) computer environment, it is anticipated that only a system-dependent date-and-time function may need to be changed for adaptation to other computer platforms that accept standard Fortran code.
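
    The long-term drift step of such a reduction chain can be made concrete with a small sketch (Python). It assumes a linear meter drift between two occupations of a base station; tide corrections, ties to multiple base stations, and datum adjustments, which the report also treats, are omitted, and all values are illustrative.

      # Remove a linear instrument drift estimated from two readings at the
      # same base station; times in hours, readings in mGal (illustrative).

      def drift_corrected(readings, base_t0, base_r0, base_t1, base_r1):
          """readings: list of (time_hr, mGal) pairs between the occupations."""
          rate = (base_r1 - base_r0) / (base_t1 - base_t0)  # mGal per hour
          return [(t, r - rate * (t - base_t0)) for t, r in readings]

      print(drift_corrected([(1.5, 981234.56), (3.2, 981230.12)],
                            base_t0=0.0, base_r0=981200.00,
                            base_t1=5.0, base_r1=981200.25))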

  8. A Fast Code for Jupiter Atmospheric Entry

    NASA Technical Reports Server (NTRS)

    Tauber, Michael E.; Wercinski, Paul; Yang, Lily; Chen, Yih-Kanq; Arnold, James (Technical Monitor)

    1998-01-01

    A fast code was developed to calculate the forebody heating environment and heat shielding that is required for Jupiter atmospheric entry probes. A carbon phenolic heat shield material was assumed and, since computational efficiency was a major goal, analytic expressions were used, primarily, to calculate the heating, ablation and the required insulation. The code was verified by comparison with flight measurements from the Galileo probe's entry; the calculation required 3.5 sec of CPU time on a work station. The computed surface recessions from ablation were compared with the flight values at six body stations. The average, absolute, predicted difference in the recession was 12.5% too high. The forebody's mass loss was overpredicted by 5.5% and the heat shield mass was calculated to be 15% less than the probe's actual heat shield. However, the calculated heat shield mass did not include contingencies for the various uncertainties that must be considered in the design of probes. Therefore, the agreement with the Galileo probe's values was considered satisfactory, especially in view of the code's fast running time and the methods' approximations.
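
    To make the role of "analytic expressions" concrete, one classic closed-form correlation of this kind is the Sutton-Graves stagnation-point convective heating formula, sketched below (Python). The constant is the Earth-air value, and radiative heating, which dominates at Galileo entry speeds, is ignored, so this illustrates the analytic approach rather than the report's actual formulas.

      import math

      # Sutton-Graves stagnation heating, q = k * sqrt(rho/Rn) * V**3.
      # k below is the Earth-air constant; a Jupiter code would use constants
      # fit to a hydrogen/helium atmosphere. Inputs are illustrative.

      def stagnation_heating(rho, velocity, nose_radius, k=1.7415e-4):
          """Heating in W/m^2; rho in kg/m^3, velocity in m/s, radius in m."""
          return k * math.sqrt(rho / nose_radius) * velocity**3

      q = stagnation_heating(rho=1e-4, velocity=47_000.0, nose_radius=0.3)
      print(f"{q / 1e4:.0f} W/cm^2")  # tens of kW/cm^2, the right order for Jupiter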

  9. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desk-top radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  10. Antenna pattern control using impedance surfaces

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Liu, Kefeng

    1992-01-01

    During this research period, we transferred existing computer codes from the CRAY supercomputer to workstation-based systems. The workstation-based version of our code preserves the accuracy of the numerical computations while giving a much better turnaround time than the CRAY supercomputer. This task relieved us of heavy dependence on the supercomputer account budget and made the codes developed in this research project more practical for applications. The analysis of pyramidal horns with impedance surfaces was our major focus during this research period. Three different algorithms for modeling lossy impedance surfaces were investigated and compared with measured data. Through this investigation, we discovered that a hybrid Fourier transform technique, which uses the eigenmodes in the stepped waveguide section and the Fourier-transformed field distributions across the stepped discontinuities for lossy impedance coatings, gives better accuracy in analyzing lossy coatings. After further refinement of the present technique, we will perform an accurate radiation pattern synthesis in the coming reporting period.

  11. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Expert systems that require access to data bases, complex simulations, and real-time instrumentation have both symbolic as well as algorithmic computing needs. These needs could both be met using a general computing workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed to demonstrate the ability of an expert system to autonomously control the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. Integration options are explored and several possible solutions are presented.

  12. A Fast Code for Jupiter Atmospheric Entry Analysis

    NASA Technical Reports Server (NTRS)

    Tauber, Michael E.; Wercinski, Paul; Yang, Lily; Chen, Yih-Kanq

    1999-01-01

    A fast code was developed to calculate the forebody heating environment and heat shielding that is required for Jupiter atmospheric entry probes. A carbon phenolic heat shield material was assumed and, since computational efficiency was a major goal, analytic expressions were used, primarily, to calculate the heating, ablation and the required insulation. The code was verified by comparison with flight measurements from the Galileo probe's entry. The calculation required 3.5 sec of CPU time on a work station, or three to four orders of magnitude less than for previous Jovian entry heat shields. The computed surface recessions from ablation were compared with the flight values at six body stations. The average, absolute, predicted difference in the recession was 13.7% too high. The forebody's mass loss was overpredicted by 5.3% and the heat shield mass was calculated to be 15% less than the probe's actual heat shield. However, the calculated heat shield mass did not include contingencies for the various uncertainties that must be considered in the design of probes. Therefore, the agreement with the Galileo probe's values was satisfactory in view of the code's fast running time and the methods' approximations.

  13. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    NASA Technical Reports Server (NTRS)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special purpose real-time hardware, and associated generalized software systems, which will permit instrument system analysts, design engineers, and instrument scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  14. Geomagnetic Storm Impact On GPS Code Positioning

    NASA Astrophysics Data System (ADS)

    Uray, Fırat; Varlık, Abdullah; Kalaycı, İbrahim; Öǧütcü, Sermet

    2017-04-01

    This paper deals with the impact of geomagnetic storms on GPS code processing using the GIPSY/OASIS research software. 12 IGS stations at mid-latitudes were chosen to conduct the experiment. These IGS stations were classified as non-cross-correlation receivers reporting P1 and P2 (NONCC-P1P2), non-cross-correlation receivers reporting C1 and P2 (NONCC-C1P2), and cross-correlation (CC-C1P2) receivers. In order to keep the code processing consistent between the classified receivers, only P2 code observations from the GPS satellites were processed. Four extreme geomagnetic storms (the Halloween storm of October 2003, days 29 and 30; November 2003, day 20; November 2004, day 08) and four geomagnetically quiet days in 2005 (day of year (DOY) 92, 98, 99, 100) were chosen for this study. 24-hour RINEX data from the IGS stations were processed on an epoch-by-epoch basis. In this way, the receiver clock and Earth-Centered Earth-Fixed (ECEF) Cartesian coordinates were solved on a per-epoch basis for each day. The IGS combined broadcast ephemeris file (brdc) was used to partly compensate the ionospheric effect on the P2 code observations. No tropospheric model was used in the processing. Jet Propulsion Laboratory Application Technology Satellites (JPL ATS) computed coordinates of the stations were taken as the true coordinates. The differences between the computed ECEF coordinates and the assumed true coordinates were resolved into topocentric coordinates (north, east, up). Root mean square (RMS) errors for each component were calculated for each day. The results show that two-dimensional and vertical accuracy decrease significantly on geomagnetic storm days compared with geomagnetically quiet days. Vertical accuracy is observed to be much more affected by a geomagnetic storm than horizontal accuracy. Errors of up to 50 meters in the vertical component were observed on a geomagnetic storm day. It is also observed that the Klobuchar ionospheric correction parameters cannot guarantee improved accuracy during geomagnetic storm days due to ionospheric scintillation.
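
    The coordinate step described above (ECEF position differences resolved into north, east, up) is standard and easy to sketch (Python with NumPy). The rotation uses the station's geodetic latitude and longitude; the sample numbers are illustrative.

      import numpy as np

      # Rotate ECEF difference vectors into local east/north/up components and
      # form per-component RMS, as in the processing described above.

      def ecef_to_enu(d_ecef, lat, lon):
          """d_ecef: (dx, dy, dz) in meters; lat/lon in radians."""
          sl, cl = np.sin(lat), np.cos(lat)
          so, co = np.sin(lon), np.cos(lon)
          rot = np.array([[-so,       co,      0.0],
                          [-sl * co, -sl * so, cl ],
                          [ cl * co,  cl * so, sl ]])
          return rot @ np.asarray(d_ecef)

      diffs = np.array([ecef_to_enu((0.8, -0.3, 1.2),
                                    np.radians(40.0), np.radians(33.0))])
      print(np.sqrt((diffs ** 2).mean(axis=0)))  # RMS of east, north, up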

  15. Modelling the performance of the monogroove with screen heat pipe for use in the radiator of the solar dynamic power system of the NASA Space Station

    NASA Technical Reports Server (NTRS)

    Evans, Austin Lewis

    1987-01-01

    A computer code to model the steady-state performance of a monogroove heat pipe for the NASA Space Station is presented, including the effects on heat pipe performance of a screen in the evaporator section which deals with transient surges in the heat input. Errors in a previous code have been corrected, and the new code adds additional loss terms in order to model several different working fluids. Good agreement with existing performance curves is obtained. From a preliminary evaluation of several of the radiator design parameters it is found that an optimum fin width could be achieved but that structural considerations limit the thickness of the fin to a value above optimum.

  16. Computational study of duct and pipe flows using the method of pseudocompressibility

    NASA Technical Reports Server (NTRS)

    Williams, Robert W.

    1991-01-01

    A viscous, three-dimensional, incompressible, Navier-Stokes Computational Fluid Dynamics code employing pseudocompressibility is used for the prediction of laminar primary and secondary flows in two 90-degree bends of constant cross section. Under study are a square cross section duct bend with 2.3 radius ratio and a round cross section pipe bend with 2.8 radius ratio. Sensitivity of predicted primary and secondary flow to inlet boundary conditions, grid resolution, and code convergence is investigated. Contour and velocity versus spanwise coordinate plots comparing prediction to experimental data flow components are shown at several streamwise stations before, within, and after the duct and pipe bends. Discussion includes secondary flow physics, computational method, computational requirements, grid dependence, and convergence rates.
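
    For reference, the pseudocompressibility (artificial compressibility) method named above replaces the incompressible continuity constraint with a pseudo-time pressure equation. In Chorin's standard formulation (the abstract does not spell out the code's exact variant):

      \frac{\partial p}{\partial \tau} + \beta \, \nabla \cdot \mathbf{u} = 0

    where beta is the pseudocompressibility parameter and tau the pseudo-time. The momentum equations are marched in tau alongside this relation, and a divergence-free velocity field is recovered as the pseudo-time derivative vanishes at convergence.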

  17. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1] developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capability and features of the codes in their present state.

  18. Wireless Headset Communication System

    NASA Technical Reports Server (NTRS)

    Lau, Wilfred K.; Swanson, Richard; Christensen, Kurt K.

    1995-01-01

    System combines features of pagers, walkie-talkies, and cordless telephones. Wireless headset communication system uses digital modulation on spread spectrum to avoid interference among units. Consists of base station, 4 radio/antenna modules, and as many as 16 remote units with headsets. Base station serves as network controller, audio-mixing network, and interface to such outside services as computers, telephone networks, and other base stations. Developed for use at Kennedy Space Center, system also useful in industrial maintenance, emergency operations, construction, and airport operations. Also, digital capabilities can be exploited by adding bar-code readers for use in taking inventories.

  19. Computation of Estonian CORS data using Bernese 5.2 and Gipsy 6.4 software

    NASA Astrophysics Data System (ADS)

    Kollo, Karin; Kall, Tarmo; Liibusk, Aive

    2017-04-01

    The GNSS permanent station network in Estonia (ESTREF) was established in 2007. In 2014-15 an extensive reconstruction of ESTREF was carried out, including the establishment of 18 new stations, a change of hardware at the CORS stations, and the establishment of a GNSS-RTK service for the whole of Estonia. For a GNSS-RTK service one needs precise coordinates in a well-defined reference frame, i.e., ETRS89. For long-term stability of the stations and for time-series analysis, re-processing of the Estonian CORS data is ongoing. We re-process data from 2007 until 2015 with the Bernese GNSS 5.2 program (Dach, 2015). For the set of ESTREF stations established in 2007, we also perform computations with the GIPSY 6.4 software (Ries et al., 2015). In the computations a daily GPS-only solution was used. For precise orbits, final products from CODE (the Center for Orbit Determination in Europe, at the Astronomical Institute of the University of Bern) and JPL (Jet Propulsion Laboratory) were used for the Bernese and GIPSY solutions, respectively. The cut-off angle was set to 10 degrees in order to avoid near-field multipath influence. In GIPSY, the precise point positioning method with ambiguity fixing was used. The Bernese calculations were based on double-difference processing. Antenna phase centers were modelled based on the igs08.atx and epnc_08.atx files. The Vienna mapping function was used for mapping tropospheric delays. For the GIPSY solution, the higher-order ionospheric term was modelled based on the IRI-2012b model; for the Bernese solution the higher-order ionospheric term was neglected. The FES2004 ocean tide loading model was used for both computation strategies. As a result, two solutions using different scientific GNSS computation programs were obtained. The results from the Bernese and GIPSY solutions were compared using station repeatability values, RMS, and coordinate differences. KEYWORDS: GNSS reference station network, Bernese GNSS 5.2, Gipsy 6.4, Estonia. References: Dach, R., S. Lutz, P. Walser, P. Fridez (Eds.), 2015: Bernese GNSS Software Version 5.2. User manual, Astronomical Institute, University of Bern, Bern Open Publishing. DOI: 10.7892/boris.72297; ISBN: 978-3-906813-05-9. Ries, P., Bertiger, W., Desai, S., & Miller, K. (2015). GIPSY 6.4 Release Notes. Jet Propulsion Laboratory, California Institute of Technology. Retrieved from https://gipsy-oasis.jpl.nasa.gov/docs/index.php

  20. Advanced Communication Techniques

    DTIC Science & Technology

    1988-07-01

    networks with different structures have been developed. In some networks, stations (i.e., computers and/or their peripherals, such as printers, etc.) are...existence of codes which exceed the Gilbert-Varshamov bound as demonstrated by Tsfasman, Vladut, and Zink. Geometric methods will then be used to analyze

  1. Development of the 3DHZETRN code for space radiation protection

    NASA Astrophysics Data System (ADS)

    Wilson, John; Badavi, Francis; Slaba, Tony; Reddell, Brandon; Bahadori, Amir; Singleterry, Robert

    Space radiation protection requires computationally efficient shield assessment methods that have been verified and validated. The HZETRN code is the engineering design code used for low Earth orbit dosimetric analysis and astronaut record keeping with end-to-end validation to twenty percent in Space Shuttle and International Space Station operations. HZETRN treated diffusive leakage only at the distal surface limiting its application to systems with a large radius of curvature. A revision of HZETRN that included forward and backward diffusion allowed neutron leakage to be evaluated at both the near and distal surfaces. That revision provided a deterministic code of high computational efficiency that was in substantial agreement with Monte Carlo (MC) codes in flat plates (at least to the degree that MC codes agree among themselves). In the present paper, the 3DHZETRN formalism capable of evaluation in general geometry is described. Benchmarking will help quantify uncertainty with MC codes (Geant4, FLUKA, MCNP6, and PHITS) in simple shapes such as spheres within spherical shells and boxes. Connection of the 3DHZETRN to general geometry will be discussed.

  2. Photovoltaic conversion of laser power to electrical power

    NASA Technical Reports Server (NTRS)

    Walker, G. H.; Heinbockel, J. H.

    1986-01-01

    Photovoltaic laser-to-electric converters are attractive for use with a space-based laser power station. This paper presents the results of modeling studies for a silicon vertical junction converter used with a Nd laser. A computer code was developed for the model and this code was used to conduct a parametric study of a Si vertical junction converter consisting of one p-n junction irradiated with a Nd laser. These calculations predict an efficiency over 50 percent for an optimized converter.

  3. Study of the GPS inter-frequency calibration of timing receivers

    NASA Astrophysics Data System (ADS)

    Defraigne, P.; Huang, W.; Bertrand, B.; Rovera, D.

    2018-02-01

    When calibrating Global Positioning System (GPS) stations dedicated to timing, the hardware delays of P1 and P2, the P(Y)-codes on frequencies L1 and L2, are determined separately. In the international atomic time (TAI) network the GPS stations of the time laboratories are calibrated relatively against reference stations. This paper aims at determining the consistency between the P1 and P2 hardware delays (called dP1 and dP2) of these reference stations, and to look at the stability of the inter-signal hardware delays dP1-dP2 of all the stations in the network. The method consists of determining the dP1-dP2 directly from the GPS pseudorange measurements corrected for the frequency-dependent antenna phase center and the frequency-dependent ionosphere corrections, and then to compare these computed dP1-dP2 to the calibrated values. Our results show that the differences between the computed and calibrated dP1-dP2 are well inside the expected combined uncertainty of the two quantities. Furthermore, the consistency between the calibrated time transfer solution obtained from either single-frequency P1 or dual-frequency P3 for reference laboratories is shown to be about 1.0 ns, well inside the 2.1 ns uB uncertainty of a time transfer link based on GPS P3 or Precise Point Positioning. This demonstrates the good consistency between the P1 and P2 hardware delays of the reference stations used for calibration in the TAI network. The long-term stability of the inter-signal hardware delays is also analysed from the computed dP1-dP2. It is shown that only variations larger than 2 ns can be detected for a particular station, while variations of 200 ps can be detected when differentiating the results between two stations. Finally, we also show that in the differential calibration process as used in the TAI network, using the same antenna phase center or using different positions for L1 and L2 signals gives maximum differences of 200 ps on the hardware delays of the separate codes P1 and P2; however, the final impact on the P3 combination is less than 10 ps.
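
    Two standard code combinations sit behind this analysis. A minimal sketch (Python) of both: the geometry-free difference P1 - P2, which carries the ionospheric delay difference plus the dP1 - dP2 bias, and the ionosphere-free P3 combination; the pseudorange values are illustrative.

      # GPS L1/L2 carrier frequencies in Hz; pseudoranges in meters.
      F1, F2 = 1575.42e6, 1227.60e6

      def geometry_free(p1, p2):
          """P1 - P2: ionospheric delay difference plus inter-signal biases."""
          return p1 - p2

      def iono_free(p1, p2):
          """P3 = (f1^2*P1 - f2^2*P2) / (f1^2 - f2^2), removing the first-order
          ionospheric delay while combining the code hardware delays."""
          g = (F1 / F2) ** 2
          return (g * p1 - p2) / (g - 1.0)

      print(geometry_free(20_000_000.0, 20_000_012.0))
      print(iono_free(20_000_000.0, 20_000_012.0))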

  4. Solar water heater for NASA's Space Station

    NASA Technical Reports Server (NTRS)

    Somers, Richard E.; Haynes, R. Daniel

    1988-01-01

    The feasibility of using a solar water heater for NASA's Space Station is investigated using computer codes developed to model the Space Station configuration, orbit, and heating systems. Numerous orbit variations, system options, and geometries for the collector were analyzed. Results show that a solar water heater that would provide 100 percent of the design heating load without significantly impacting the overall Space Station design is feasible. A heat pipe or pumped fluid radial plate collector of about 10 sq m, placed on top of the habitat module, was found to be well suited for satisfying the water demand of the Space Station. Due to the relatively small area required by a radial plate, a concentrator is unnecessary. The system would use only 7 to 10 percent as much electricity as an electric water-heating system.

  5. A Computer Program to Model Passive Acoustic Antisubmarine Search Using Monte Carlo Simulation Techniques.

    DTIC Science & Technology

    1983-09-01

    duplicate a continuous function on a digital computer, and thus the machine representation of the GMA is only a close approximation of the continuous...error process. Thus, the manner in which the GMA process is digitally replicated has an effect on the results of the simulation. The parameterization of...Information Center 2 Cameron Station Alexandria, Virginia 22314 2. Library, Code 0142 2 Naval Postgraduate School Monterey, California 93943 3. Professor

  6. CFD Sensitivity Analysis of a Modern Civil Transport Near Buffet-Onset Conditions

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Allison, Dennis O.; Biedron, Robert T.; Buning, Pieter G.; Gainer, Thomas G.; Morrison, Joseph H.; Rivers, S. Melissa; Mysko, Stephen J.; Witkowski, David P.

    2001-01-01

    A computational fluid dynamics (CFD) sensitivity analysis is conducted for a modern civil transport at several conditions ranging from mostly attached flow to flow with substantial separation. Two different Navier-Stokes computer codes and four different turbulence models are utilized, and results are compared both to wind tunnel data at flight Reynolds number and to flight data. In-depth CFD sensitivities to grid, code, spatial differencing method, aeroelastic shape, and turbulence model are described for conditions near buffet onset (a condition at which significant separation exists). In summary, given a grid of sufficient density for a given aeroelastic wing shape, the combined approximate error band in CFD at conditions near buffet onset due to code, spatial differencing method, and turbulence model is: 6% in lift, 7% in drag, and 16% in moment. The two biggest contributors to this uncertainty are the turbulence model and the code. Computed results agree well with wind tunnel surface pressure measurements both for an overspeed 'cruise' case and for a case with small trailing edge separation. At and beyond buffet onset, computed results agree well over the inner half of the wing, but shock location is predicted too far aft at some of the outboard stations. Lift, drag, and moment curves are predicted in good agreement with experimental results from the wind tunnel.

  7. Computer program for aerodynamic and blading design of multistage axial-flow compressors

    NASA Technical Reports Server (NTRS)

    Crouse, J. E.; Gorrell, W. T.

    1981-01-01

    A code for computing the aerodynamic design of a multistage axial-flow compressor and, if desired, the associated blading geometry input for internal flow analysis codes is presented. Compressible flow, which is assumed to be steady and axisymmetric, is the basis for a two-dimensional solution in the meridional plane with viscous effects modeled by pressure loss coefficients and boundary layer blockage. The radial equation of motion and the continuity equation are solved with the streamline curvature method on calculation stations outside the blade rows. The annulus profile, mass flow, pressure ratio, and rotative speed are input. A number of other input parameters specify and control the blade row aerodynamics and geometry. In particular, blade element centerlines and thicknesses can be specified with fourth degree polynomials for two segments. The output includes a detailed aerodynamic solution and, if desired, blading coordinates that can be used for internal flow analysis codes.

  8. Space Station Freedom electrical performance model

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Green, Robert D.; Kerslake, Thomas W.; Mckissock, David B.; Trudell, Jeffrey J.

    1993-01-01

    The baseline Space Station Freedom electric power system (EPS) employs photovoltaic (PV) arrays and nickel hydrogen (NiH2) batteries to supply power to housekeeping and user electrical loads via a direct current (dc) distribution system. The EPS was originally designed for an operating life of 30 years through orbital replacement of components. As the design and development of the EPS continues, accurate EPS performance predictions are needed to assess design options, operating scenarios, and resource allocations. To meet these needs, NASA Lewis Research Center (LeRC) has, over a 10 year period, developed SPACE (Station Power Analysis for Capability Evaluation), a computer code designed to predict EPS performance. This paper describes SPACE, its functionality, and its capabilities.

  9. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Traditional expert systems, such as diagnostic and training systems, interact with users only through a keyboard and screen, and are usually symbolic in nature. Expert systems that require access to data bases, complex simulations and real-time instrumentation have both symbolic as well as algorithmic computing needs. These needs could both be met using a general purpose workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by NASA Ames Research Center in conjunction with Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. This paper will explore the integration options, and present several possible solutions.

  10. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  11. Surface code quantum communication.

    PubMed

    Fowler, Austin G; Wang, David S; Hill, Charles D; Ladd, Thaddeus D; Van Meter, Rodney; Hollenberg, Lloyd C L

    2010-05-07

    Quantum communication typically involves a linear chain of repeater stations, each capable of reliable local quantum computation and connected to their nearest neighbors by unreliable communication links. The communication rate of existing protocols is low as two-way classical communication is used. By using a surface code across the repeater chain and generating Bell pairs between neighboring stations with probability of heralded success greater than 0.65 and fidelity greater than 0.96, we show that two-way communication can be avoided and quantum information can be sent over arbitrary distances with arbitrarily low error at a rate limited only by the local gate speed. This is achieved by using the unreliable Bell pairs to measure nonlocal stabilizers and feeding heralded failure information into post-transmission error correction. Our scheme also applies when the probability of heralded success is arbitrarily low.

  12. GLOBECOM '86 - Global Telecommunications Conference, Houston, TX, Dec. 1-4, 1986, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Papers are presented on local area networks; formal methods for communication protocols; computer simulation of communication systems; spread spectrum and coded communications; tropical radio propagation; VLSI for communications; strategies for increasing software productivity; multiple access communications; advanced communication satellite technologies; and spread spectrum systems. Topics discussed include Space Station communication and tracking development and design; transmission networks; modulation; data communications; computer network protocols and performance; and coding and synchronization. Consideration is given to free space optical communications systems; VSAT communication networks; network topology design; advances in adaptive filtering echo cancellation and adaptive equalization; advanced signal processing for satellite communications; the elements, design, and analysis of fiber-optic networks; and advances in digital microwave systems.

  13. A new method for computing the reliability of consecutive k-out-of-n:F systems

    NASA Astrophysics Data System (ADS)

    Gökdere, Gökhan; Gürcan, Mehmet; Kılıç, Muhammet Burak

    2016-01-01

    Consecutive k-out-of-n system models have been applied to reliability evaluation in many physical systems, such as those encountered in telecommunications, the design of integrated circuits, microwave relay stations, oil pipeline systems, vacuum systems in accelerators, computer ring networks, and spacecraft relay stations. These systems are characterized by logical connections among the components of the systems placed in lines or circles. In the literature, a great deal of attention has been paid to the study of the reliability evaluation of consecutive k-out-of-n systems. In this paper, we propose a new method to compute the reliability of consecutive k-out-of-n:F systems, with n linearly and circularly arranged components. The proposed method provides a simple way of determining the system failure probability. We also provide R code, based on our proposed method, to compute the reliability of linear and circular systems with a great number of components.
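
    The paper's own method and its R code are not reproduced here, but the target quantity is easy to pin down. Below is a standard dynamic-programming baseline (Python) for the linear consecutive k-out-of-n:F system with independent, identically distributed components; the state tracked is the length of the current trailing run of failed components.

      def linear_consec_kn_F(n, k, q):
          """Reliability of a linear consecutive k-out-of-n:F system:
          the probability that no k consecutive components fail, with
          i.i.d. component failure probability q."""
          p = 1.0 - q
          state = [0.0] * k      # state[j]: P(alive, trailing run of j failures)
          state[0] = 1.0
          for _ in range(n):
              nxt = [0.0] * k
              nxt[0] = p * sum(state)        # a working component resets the run
              for j in range(1, k):
                  nxt[j] = q * state[j - 1]  # a failure extends the run
              state = nxt                    # runs reaching length k are absorbed
          return sum(state)

      print(linear_consec_kn_F(n=2, k=2, q=0.1))  # equals 1 - q**2 = 0.99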

  14. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1988-01-01

    Expert systems that require access to data bases, complex simulations, and real-time instrumentation have both symbolic and algorithmic needs. Both of these needs could be met using a general purpose workstation running both symbolic and algorithmic codes, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by the NASA Ames Research Center in conjunction with the Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. The integration options and several possible solutions are presented.

  15. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

  16. United States data collection activities and requirements, volume 1

    NASA Technical Reports Server (NTRS)

    Hrin, S.; Mcgregor, D.

    1977-01-01

    The potential market for a data collection system was investigated to determine whether user needs would be sufficient to support a satellite relay data collection system design. The activities of 107,407 data collection stations were studied to determine user needs in agriculture, climatology, environmental monitoring, forestry, geology, hydrology, meteorology, and oceanography. Fifty distinct data collection networks are described and used to form the user data base. The computer program used to analyze the station data base is discussed, and results of the analysis are presented in maps and graphs. Information format and coding are described in the appendix.

  17. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    NASA Technical Reports Server (NTRS)

    Wu, Honglu; Atwell, William; Cucinotta, Francis A.; Yang, Chui-hsu

    1996-01-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  18. Mir Cooperative Solar Array Flight Performance Data and Computational Analysis

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hoffman, David J.

    1997-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  19. Report on the Program and Contract Infrastructure Technical Requirements Development for the Guam Realignment Program

    DTIC Science & Technology

    2012-02-08

    Office; GRN Guam Road Network; GWA Guam Waterworks Authority; ICG Interagency Coordination Group; JFY Japanese Fiscal Year; JRM Joint...PAC) (Pacific); NCTS Naval Computer and Telecommunications Station; NEPA National Environmental Policy Act; NPDES National Pollutant Discharge Elimination System; OPNAV Operational Navy; UFC Unified Facilities Criteria; U.S. United States; USC United States Code; USDA United States

  20. Internet Protocol Over Telemetry Testing for Earth Science Capability Demo Summary

    NASA Technical Reports Server (NTRS)

    Franz, Russ; Pestana, Mark; Bessent, Shedrick; Hang, Richard; Ng, Howard

    2006-01-01

    The development and flight tests described here focused on utilizing existing pulse code modulation (PCM) telemetry equipment to enable on-vehicle networks of instruments and computers to be a simple extension of the ground station network. This capability is envisioned as a necessary component of a global range that supports test and development of manned and unmanned airborne vehicles.

  1. Transitional flow in thin tubes for space station freedom radiator

    NASA Technical Reports Server (NTRS)

    Loney, Patrick; Ibrahim, Mounir

    1995-01-01

    A two-dimensional finite volume method is used to predict the film coefficients in the transitional flow region (laminar or turbulent) for the radiator panel tubes. The code used to perform this analysis is CAST (Computer Aided Simulation of Turbulent Flows). The information gathered from this code is then used to augment a Sinda85 model that predicts the overall performance of the radiator. A final comparison is drawn between the results generated with a Sinda85 model using the Sinda85-provided transition-region heat transfer correlations and the Sinda85 model using the CAST-generated data.
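
    Film-coefficient correlations of the kind such network models apply are well known. The sketch below (Python) uses the Gnielinski correlation (valid roughly for 3000 < Re < 5e6); it is not necessarily the correlation Sinda85 provides and is shown only to make the computed quantity concrete.

      import math

      def gnielinski_nu(re, pr):
          """Nusselt number for developed pipe flow, transitional/turbulent."""
          f = (0.79 * math.log(re) - 1.64) ** -2  # Petukhov friction factor
          return ((f / 8.0) * (re - 1000.0) * pr /
                  (1.0 + 12.7 * math.sqrt(f / 8.0) * (pr ** (2.0 / 3.0) - 1.0)))

      def film_coefficient(re, pr, k_fluid, diameter):
          """h = Nu * k / D in W/m^2-K."""
          return gnielinski_nu(re, pr) * k_fluid / diameter

      # Illustrative water-like values near the low end of the transition range.
      print(film_coefficient(re=5000.0, pr=6.0, k_fluid=0.6, diameter=0.01))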

  2. Identification of Trends into Dose Calculations for Astronauts through Performing Sensitivity Analysis on Calculational Models Used by the Radiation Health Office

    NASA Technical Reports Server (NTRS)

    Adams, Thomas; VanBaalen, Mary

    2009-01-01

    The Radiation Health Office (RHO) determines each astronaut's cancer risk by using models to assess the radiation dose that astronauts receive from spaceflight missions. The baryon transport code (BRYNTRN), the high charge (Z) and energy transport code (HZETRN), and computer risk models are used to determine the effective dose received by astronauts in Low Earth orbit (LEO). This code uses an approximation of the Boltzmann transport equation. The purpose of the project is to run this code for various International Space Station (ISS) flight parameters in order to gain a better understanding of how the code responds to different scenarios. The project will determine how variations in one set of parameters, such as the point in the solar cycle and the altitude, can affect the radiation exposure of astronauts during ISS missions. This project will benefit NASA by improving mission dosimetry.

  3. Satellite freeze forecast system: Executive summary

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D. (Principal Investigator)

    1983-01-01

    A satellite-based temperature monitoring and prediction system consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer was developed and transferred to the National Weather Service. This satellite freeze forecasting system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data in the form of color-coded thermal maps and in tables of automated weather station temperatures, computes predicted thermal maps when requested and displays such maps either automatically or manually, archives the data acquired, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: a physical energy budget of the soil and atmosphere interface, and a statistical relationship between the sites at which the physical model predicts temperatures and each of the pixels of the satellite thermal map.
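
    The second (statistical) model described above can be sketched simply. Assuming a per-pixel linear relation to a forecast site, fitted from past maps (the report does not state the exact regression form), the physical model's site forecast is spread across the whole thermal map (Python with NumPy):

      import numpy as np

      def fit_pixel_models(site_history, pixel_history):
          """Least-squares intercept and slope per pixel.
          site_history: (epochs,) temps at a site; pixel_history: (epochs, npix)."""
          x = np.column_stack([np.ones_like(site_history), site_history])
          coef, *_ = np.linalg.lstsq(x, pixel_history, rcond=None)
          return coef  # shape (2, npix)

      def predict_map(coef, site_forecast):
          """Predicted pixel temperatures for one forecast site value."""
          return coef[0] + coef[1] * site_forecast

      rng = np.random.default_rng(0)
      coef = fit_pixel_models(np.array([2.0, 5.0, 1.0, 7.0]),
                              rng.normal(4.0, 2.0, (4, 6)))
      print(predict_map(coef, site_forecast=3.0))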

  4. Space Station solar water heater

    NASA Technical Reports Server (NTRS)

    Horan, D. C.; Somers, Richard E.; Haynes, R. D.

    1990-01-01

    The feasibility of directly converting solar energy for crew water heating on the Space Station Freedom (SSF) and other human-tended missions such as a geosynchronous space station, lunar base, or Mars spacecraft was investigated. Computer codes were developed to model the systems, and a proof-of-concept thermal vacuum test was conducted to evaluate system performance in an environment simulating the SSF. The results indicate that a solar water heater is feasible. It could provide up to 100 percent of the design heating load without a significant configuration change to the SSF or other missions. The solar heater system requires only 15 percent of the electricity that an all-electric system on the SSF would require. This allows a reduction in the solar array or a surplus of electricity for onboard experiments.

  5. A Dynamic/Anisotropic Low Earth Orbit (LEO) Ionizing Radiation Model

    NASA Technical Reports Server (NTRS)

    Badavi, Francis F.; West, Katie J.; Nealy, John E.; Wilson, John W.; Abrahms, Briana L.; Luetke, Nathan J.

    2006-01-01

    The International Space Station (ISS) provides the proving ground for future long duration human activities in space. Ionizing radiation measurements in ISS form the ideal tool for the experimental validation of ionizing radiation environmental models, nuclear transport code algorithms, and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) have provided vital information impacting both the environmental models and the nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations with Thermo Luminescent Detector (TLD) area monitors, demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate 6 degree of freedom (DOF) description of ISS trajectory and orientation.

  6. HART-II Acoustic Predictions using a Coupled CFD/CSD Method

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.

    2009-01-01

    This paper documents results to date from the Rotorcraft Acoustic Characterization and Mitigation activity under the NASA Subsonic Rotary Wing Project. The primary goal of this activity is to develop a NASA rotorcraft impulsive noise prediction capability which uses first-principles fluid dynamics and structural dynamics. During this effort, elastic blade motion and co-processing capabilities have been included in a recent version of the computational fluid dynamics (CFD) code. The CFD code is loosely coupled to a computational structural dynamics (CSD) code using new interface codes. The coupled CFD/CSD solution is then used to compute impulsive noise on a plane under the rotor using the Ffowcs Williams-Hawkings solver. This code system is then applied to a range of cases from the Higher Harmonic Aeroacoustic Rotor Test II (HART-II) experiment. For all cases presented, the full experimental configuration (i.e., rotor and wind tunnel sting mount) is used in the coupled CFD/CSD solutions. Results show good correlation between measured and predicted loading and loading time derivative at the only measured radial station. A contributing factor in the loading mean-value offset typically seen between measured and predicted data is examined. Impulsive noise predictions on the measured microphone plane under the rotor compare favorably with measured mid-frequency noise for all cases. Flow visualization of the BL and MN cases shows that the vortex structures generated by the prediction method are consistent with measurements. Future application of the prediction method is discussed.

  7. Satellite interference analysis and simulation using personal computers

    NASA Astrophysics Data System (ADS)

    Kantak, Anil

    1988-03-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles, generally related to the land mask of the receiving station site, for both satellites. Formulas for the Doppler effect due to satellite motion as well as the Earth's rotation are developed. The effects of the interfering satellite's signal modulation and of the Doppler shift on the received power are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. Finally, a computer program suitable for microcomputers such as the IBM AT is provided, with a flowchart, a sample run, results of the run, and the program code.
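
    As an illustration of the quantities the report tabulates, the following sketch computes a first-order Doppler shift from a relative radial velocity and builds the histogram of the carrier-to-interference (C/I) power ratio. All values here (carrier frequency, velocity and power distributions) are placeholders, not the report's formulas.

      import numpy as np

      C = 299_792_458.0          # speed of light, m/s
      f0 = 12e9                  # assumed downlink carrier frequency, Hz (placeholder)

      # placeholder samples of relative radial velocity between station and satellite
      rel_radial_vel = np.random.uniform(-3e3, 3e3, 10_000)       # m/s
      doppler = f0 * rel_radial_vel / C                           # first-order Doppler shift, Hz

      # placeholder received powers (dBW) for desired and interfering satellites
      p_desired = np.random.normal(-120.0, 1.0, 10_000)
      p_interf = np.random.normal(-135.0, 3.0, 10_000)
      ci_db = p_desired - p_interf                                # C/I ratio in dB

      hist, edges = np.histogram(ci_db, bins=40)                  # the report's histogram step
      print("median C/I [dB]:", np.median(ci_db))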

  8. Satellite Interference Analysis and Simulation Using Personal Computers

    NASA Technical Reports Server (NTRS)

    Kantak, Anil

    1988-01-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles, generally related to the land mask of the receiving station site, for both satellites. Formulas for the Doppler effect due to satellite motion as well as the Earth's rotation are developed. The effects of the interfering satellite's signal modulation and of the Doppler shift on the received power are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. Finally, a computer program suitable for microcomputers such as the IBM AT is provided, with a flowchart, a sample run, results of the run, and the program code.

  9. Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets.

    PubMed

    Scharfe, Michael; Pielot, Rainer; Schreiber, Falk

    2010-01-11

    Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. We evaluate the CBE-driven PlayStation 3 as a high-performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch-reduction and loop-unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. The results demonstrate that the CBE processor in a PlayStation 3 accelerates computationally intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low-cost CBE-based platform offers an efficient alternative to conventional hardware for solving computational problems in image processing and bioinformatics.
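
    The partitioning idea generalises beyond the CBE. The sketch below scores candidate slices of a 3D volume against a 2D section in parallel worker processes, with standard Python multiprocessing standing in for the SPE-level partitioning described above; the similarity measure and data are placeholders, not the paper's alignment procedure.

      import numpy as np
      from multiprocessing import Pool

      np.random.seed(0)                        # deterministic placeholder data
      VOLUME = np.random.rand(64, 64, 64)      # placeholder 3D dataset
      SECTION = VOLUME[20] + 0.01 * np.random.rand(64, 64)   # noisy copy of slice 20

      def score(z):
          """Similarity of the 2D section to volume slice z (sum of squared differences)."""
          return z, float(((VOLUME[z] - SECTION) ** 2).sum())

      if __name__ == "__main__":
          with Pool() as pool:
              # partition the candidate poses across the available cores
              results = pool.map(score, range(VOLUME.shape[0]))
          best = min(results, key=lambda t: t[1])
          print("best-matching slice:", best)    # should recover slice 20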

  10. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan for the integrated testing and benchmarking of the Phase 1 Ada-based ESBT Design Research Project is described. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX minicomputers and VAXstations running the VMS operating system.

  11. Temporal Constraint Propagation for Airlift Planning Analysis

    DTIC Science & Technology

    1989-12-01

    [Garbled OCR fragment: Lisp-style airlift load records with fields such as LOAD-DESIGNATOR, ONLOAD-STATION, OFFLOAD-STATION (EGUN), AVAILABLE-TIME, EARLIEST-ARRIVAL-TIME, LATEST-ARRIVAL-TIME, PRIORITY, BULK-CARGO, and CATEGORY-CODES.]

  12. Inlet flowfield investigation. Part 2: Computation of the flow about a supercruise forebody at supersonic speeds

    NASA Technical Reports Server (NTRS)

    Paynter, G. C.; Salemann, V.; Strom, E. E. I.

    1984-01-01

    A numerical procedure which solves the parabolized Navier-Stokes (PNS) equations on a body-fitted mesh was used to compute the flow about the forebody of an advanced tactical supercruise fighter configuration, in an effort to explore the use of a PNS method for the design of supersonic cruise forebody geometries. Forebody flow fields were computed at Mach numbers of 1.5, 2.0, and 2.5, and at angles of attack of 0 deg, 4 deg, and 8 deg at each Mach number. Computed results are presented at several body stations and include contour plots of Mach number, total pressure, upwash angle, sidewash angle, and cross-plane velocity. The computational analysis procedure was found reliable for evaluating forebody flow fields of advanced aircraft configurations for flight conditions where the vortex shed from the wing leading edge is not a dominant flow phenomenon. Static pressure distributions and boundary layer profiles on the forebody and wing were surveyed in a wind tunnel test, and the analytical results are compared to the data. The current status of the parabolized flow field code is described, along with desirable improvements in the code.

  13. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    NASA Astrophysics Data System (ADS)

    Chow, James C. L.; Lam, Phil; Jaffray, David A.

    2012-02-01

    This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured with the profiler gprof, which records the number of executions and the total time spent in each function. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation 3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change of the EGSnrc code is made. However, as the EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.
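
    The 20-80 profiling step itself is tool-agnostic. As a hedged illustration, the following uses Python's cProfile in place of gprof to time a workload and list the few functions that dominate runtime; hownear_like is a stand-in, not EGSnrc's HOWNEAR.

      import cProfile
      import pstats
      import math
      import random

      def hownear_like(n):
          # stand-in hot function: some arithmetic-heavy inner loop
          return sum(math.sqrt(random.random()) for _ in range(n))

      def simulate():
          for _ in range(200):
              hownear_like(5_000)

      pr = cProfile.Profile()
      pr.enable()
      simulate()
      pr.disable()

      # the top few entries by cumulative time are the parallelisation candidates
      pstats.Stats(pr).sort_stats("cumulative").print_stats(5)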

  14. Pressure measurements in a low-density nozzle plume for code verification

    NASA Technical Reports Server (NTRS)

    Penko, Paul F.; Boyd, Iain D.; Meissner, Dana L.; Dewitt, Kenneth J.

    1991-01-01

    Measurements of Pitot pressure were made in the exit plane and plume of a low-density, nitrogen nozzle flow. Two numerical computer codes were used to analyze the flow, including one based on continuum theory using the explicit MacCormack method, and the other on kinetic theory using the method of direct-simulation Monte Carlo (DSMC). The continuum analysis was carried to the nozzle exit plane and the results were compared to the measurements. The DSMC analysis was extended into the plume of the nozzle flow and the results were compared with measurements at the exit plane and axial stations 12, 24 and 36 mm into the near-field plume. Two experimental apparatus were used that differed in design and gave slightly different profiles of pressure measurements. The DSMC method compared well with the measurements from each apparatus at all axial stations and provided a more accurate prediction of the flow than the continuum method, verifying the validity of DSMC for such calculations.

  15. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
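
    Software reliability estimation of the kind described is often based on reliability growth models. As one common example, not necessarily the model used by the JSC tools, the sketch below evaluates the probability of failure-free operation under a Goel-Okumoto model with assumed fitted parameters.

      import math

      def m(t, a, b):
          """Expected cumulative failures by time t (Goel-Okumoto NHPP)."""
          return a * (1.0 - math.exp(-b * t))

      def reliability(x, t, a, b):
          """P(no failure in (t, t+x]) given the failures observed up to time t."""
          return math.exp(-(m(t + x, a, b) - m(t, a, b)))

      # a = total expected defects, b = detection rate; both assumed fitted values
      print(reliability(x=10.0, t=100.0, a=120.0, b=0.02))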

  16. A users manual for the method of moments Aircraft Modeling Code (AMC), version 2

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1994-01-01

    This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.

  17. A user's manual for the method of moments Aircraft Modeling Code (AMC)

    NASA Technical Reports Server (NTRS)

    Peters, M. E.; Newman, E. H.

    1989-01-01

    This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.

  18. Flow characteristics at U.S. Geological Survey streamgages in the conterminous United States

    USGS Publications Warehouse

    Wolock, David

    2003-01-01

    This dataset represents point locations and flow characteristics for current (as of November 20, 2001) and historical U.S. Geological Survey (USGS) streamgages in the conterminous United States. The flow characteristics were computed from the daily streamflow data recorded at each streamgage for the period of record. The attributes associated with each streamgage include:

    - Station number, station name, station latitude and longitude (decimal degrees in North American Datum of 1983, NAD 83)
    - First and last dates (year, month, day) of streamflow data, and number of days of streamflow data
    - Minimum and maximum daily flow for the period of record (cubic feet per second)
    - Percentiles (1, 5, 10, 20, 25, 50, 75, 80, 90, 95, 99) of daily flow for the period of record (cubic feet per second)
    - Average and standard deviation of daily flow for the period of record (cubic feet per second)
    - Mean annual base-flow index (BFI; see supplemental information) computed for the period of record (fraction, ranging from 0 to 1), its year-to-year standard deviation (fraction), and the number of years of data used to compute it
    - Reported drainage area and reported contributing drainage area (square miles)
    - National Water Information System (NWIS)-Web page URL for the streamgage
    - Hydrologic Unit Code (HUC, 8 digit) and hydrologic landscape region (HLR)
    - River Reach File 1 (RF1) segment identification number (E2RF1##)

    Station numbers, names, locations, and drainage areas were acquired through the National Water Information System (NWIS)-Web (http://water.usgs.gov/nwis) on November 20, 2001. The streamflow data used to compute flow characteristics were copied from the Water server (water.usgs.gov:/www/htdocs/nwisweb/data1/discharge/) on November 2, 2001. The missing value indicator for all attributes is -99. Some streamflow characteristics are missing for: (1) streamgages measuring flow subject to tidal effects, which cause flow to reverse directions; (2) streamgages with site information but no streamflow data at the time the data were retrieved; and (3) streamgages with record length too short to compute the base-flow index.
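
    Most of the per-gage flow characteristics above are simple order statistics of the daily series. A minimal sketch of their computation with NumPy follows (synthetic data; base-flow separation for the BFI is omitted).

      import numpy as np

      daily_flow = np.random.lognormal(mean=3.0, sigma=1.0, size=3650)  # ~10 years, cfs

      pcts = [1, 5, 10, 20, 25, 50, 75, 80, 90, 95, 99]
      summary = {
          "min": daily_flow.min(),
          "max": daily_flow.max(),
          "mean": daily_flow.mean(),
          "std": daily_flow.std(ddof=1),
          **{f"p{p}": np.percentile(daily_flow, p) for p in pcts},
      }
      for name, value in summary.items():
          print(f"{name:>5}: {value:10.2f}")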

  19. Code of conduct for the International Space Station Crew. National Aeronautics and Space Administration (NASA). Interim final rule.

    PubMed

    2000-12-21

    NASA is issuing new regulations entitled "International Space Station Crew," to implement certain provisions of the International Space Station (ISS) Intergovernmental Agreement (IGA) regarding ISS crewmembers' observance of an ISS Code of Conduct.

  20. 2nd Generation QUATARA Flight Computer Project

    NASA Technical Reports Server (NTRS)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Iugo, Pedro; Peeples, Steven

    2015-01-01

    Single-core flight computer boards have been designed, developed, and tested (DD&T) to be flown in small satellites over the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators, and transmitting to and receiving from a ground station. In addition, this flight computer will be designed to be fault tolerant by creating a robust physical hardware connection and by using a software voting scheme to assess the processors' performance. This voting scheme will leverage the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components, which are estimated to survive for two years in a low-Earth orbit.

  1. DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Williams, C. H.; Spurlock, O. F.

    2014-01-01

    From the late 1960's through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960's is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
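
    The two numerical kernels named above are standard. The toy sketch below shows a classical 4th-order Runge-Kutta step and a Newton-Raphson update used as a shooting method for a two point boundary value problem; the dynamics are a stand-in, not DUKSUP's equations of motion.

      import numpy as np

      def rk4_step(f, t, y, h):
          # classical 4th-order Runge-Kutta step
          k1 = f(t, y)
          k2 = f(t + h / 2, y + h / 2 * k1)
          k3 = f(t + h / 2, y + h / 2 * k2)
          k4 = f(t + h, y + h * k3)
          return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      def integrate(v0, tf=1.0, n=100):
          # toy vertical ballistics: state = [altitude, velocity]
          f = lambda t, y: np.array([y[1], -9.81])
          y, h = np.array([0.0, v0]), tf / n
          for i in range(n):
              y = rk4_step(f, i * h, y, h)
          return y[0]                         # terminal altitude

      target, v0 = 5.0, 1.0                   # boundary condition and initial guess
      for _ in range(20):                     # Newton-Raphson on the unknown v0
          r = integrate(v0) - target
          drdv = (integrate(v0 + 1e-6) - integrate(v0)) / 1e-6
          v0 -= r / drdv
          if abs(r) < 1e-10:
              break
      print("initial velocity meeting the boundary condition:", v0)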

  2. DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Spurlock, O. Frank; Williams, Craig H.

    2015-01-01

    From the late 1960s through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960s is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.

  3. New Developments in Modeling MHD Systems on High Performance Computing Architectures

    NASA Astrophysics Data System (ADS)

    Germaschewski, K.; Raeder, J.; Larson, D. J.; Bhattacharjee, A.

    2009-04-01

    Modeling the wide range of time and length scales present even in fluid models of plasmas like MHD and X-MHD (Extended MHD including two-fluid effects like the Hall term, electron inertia, and the electron pressure gradient) is challenging even on state-of-the-art supercomputers. In recent years, HPC capacity has continued to grow exponentially, but at the expense of making computer systems more and more difficult to program in order to get maximum performance. In this paper, we present a new approach to managing the complexity caused by the need to write efficient codes: separating the numerical description of the problem, in our case a discretized right-hand side (r.h.s.), from the actual implementation of efficiently evaluating it. An automatic code generator is used to describe the r.h.s. in a quasi-symbolic form while leaving the translation into efficient and parallelized code to a computer program itself. We implemented this approach for OpenGGCM (Open General Geospace Circulation Model), a model of the Earth's magnetosphere, which was accelerated by a factor of three on regular x86 architecture and a factor of 25 on the Cell BE architecture (commonly known for its deployment in Sony's PlayStation 3).
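
    A miniature analogue of the describe-then-generate workflow, with SymPy's lambdify standing in for the paper's custom code generator; the flux expression is an arbitrary stand-in r.h.s., not OpenGGCM's equations.

      import sympy as sp

      rho, u, p = sp.symbols("rho u p")
      # quasi-symbolic description of a toy 1D flux vector (the "r.h.s.")
      flux = sp.Matrix([rho * u, rho * u**2 + p])

      # the "code generator": translate the symbolic form into a NumPy evaluator
      flux_fn = sp.lambdify((rho, u, p), flux, "numpy")
      print(flux_fn(1.0, 0.5, 1.0))        # -> [[0.5], [1.25]]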

  4. Computation of asymmetric supersonic flows around cones at large incidence

    NASA Technical Reports Server (NTRS)

    Degani, David

    1987-01-01

    The Schiff-Steger parabolized Navier-Stokes (PNS) code has been modified to allow computation of conical flowfields around cones at high incidence. The improved algorithm of Degani and Schiff has been incorporated into the PNS code. This algorithm adds the cross-derivative and circumferential viscous terms to the original PNS code and modifies the algebraic eddy-viscosity turbulence model to take into account regions of so-called cross-flow separation. Assuming the flowfield is conical (but not necessarily symmetric), a marching stepback procedure is used: the solution is marched one step downstream using the improved PNS code, and the flow variables are then scaled to place the solution back at the original station. The process is repeated until no change in the flow variables is observed with further marching. The flow variables are then constant along rays of the flowfield. The experiments of Bannik and Nebbeling were chosen as a test case. In these experiments, a cone of 7.5 deg half angle at Mach number 2.94 and Reynolds number 1.372 x 10^7 was tested up to 34 deg angle of attack. At high angle of attack, nonconical asymmetric leeward-side vortex patterns were observed. In the first set of computations, using an earlier solution of the above cone at an angle of attack of 22.6 deg and station x = 0.5 as a starting solution, the angle of attack was gradually increased up to 34 deg. During this procedure the grid was carefully adjusted to capture the bow shock. A stable, converged symmetric solution was obtained. Since the numerical code converged to a symmetric solution, which is not the physical one, the stability was tested by a random perturbation at each point. The possible effect of surface roughness or imperfect body shape was also investigated. It was concluded that although the assumption of conical viscous flow can be very useful for certain cases, it cannot be used for the present case. Thus, the second part of the investigation attempted to obtain a marching (in space) solution with the PNS method using the conical solution as initial data. Finally, a solution of the full Navier-Stokes equations was carried out.

  5. Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets

    PubMed Central

    2010-01-01

    Background: Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. Results: We evaluate the CBE-driven PlayStation 3 as a high-performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch-reduction and loop-unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. Conclusions: The results demonstrate that the CBE processor in a PlayStation 3 accelerates computationally intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low-cost CBE-based platform offers an efficient alternative to conventional hardware for solving computational problems in image processing and bioinformatics. PMID:20064262

  6. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  7. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn

    2016-07-15

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by a subsequent curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, the system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.
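
    The classification stage maps naturally onto off-the-shelf tools. The sketch below trains a random forest on per-voxel feature vectors, using synthetic features and labels in place of the paper's CT-derived intensity, shape, and spatial-prior channels.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      # columns stand in for intensity, shape, and spatial-prior features
      X = rng.normal(size=(5000, 6))
      # synthetic ground truth loosely tied to two of the feature channels
      y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=5000)) > 1.0

      clf = RandomForestClassifier(n_estimators=100, random_state=0)
      clf.fit(X[:4000], y[:4000])
      probs = clf.predict_proba(X[4000:])[:, 1]     # candidate scores for held-out voxels
      print("mean candidate score:", probs.mean())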

  8. Ionospheric Modelling using GPS to Calibrate the MWA. I: Comparison of First Order Ionospheric Effects between GPS Models and MWA Observations

    NASA Astrophysics Data System (ADS)

    Arora, B. S.; Morgan, J.; Ord, S. M.; Tingay, S. J.; Hurley-Walker, N.; Bell, M.; Bernardi, G.; Bhat, N. D. R.; Briggs, F.; Callingham, J. R.; Deshpande, A. A.; Dwarakanath, K. S.; Ewall-Wice, A.; Feng, L.; For, B.-Q.; Hancock, P.; Hazelton, B. J.; Hindson, L.; Jacobs, D.; Johnston-Hollitt, M.; Kapińska, A. D.; Kudryavtseva, N.; Lenc, E.; McKinley, B.; Mitchell, D.; Oberoi, D.; Offringa, A. R.; Pindor, B.; Procopio, P.; Riding, J.; Staveley-Smith, L.; Wayth, R. B.; Wu, C.; Zheng, Q.; Bowman, J. D.; Cappallo, R. J.; Corey, B. E.; Emrich, D.; Goeke, R.; Greenhill, L. J.; Kaplan, D. L.; Kasper, J. C.; Kratzenberg, E.; Lonsdale, C. J.; Lynch, M. J.; McWhirter, S. R.; Morales, M. F.; Morgan, E.; Prabu, T.; Rogers, A. E. E.; Roshi, A.; Shankar, N. Udaya; Srivani, K. S.; Subrahmanyan, R.; Waterson, M.; Webster, R. L.; Whitney, A. R.; Williams, A.; Williams, C. L.

    2015-08-01

    We compare first-order (refractive) ionospheric effects seen by the MWA with the ionosphere as inferred from GPS data. The first-order ionosphere manifests itself as a bulk position shift of the observed sources across an MWA field of view. These effects can be computed from global ionosphere maps provided by GPS analysis centres, namely the Center for Orbit Determination in Europe (CODE). However, for precision radio astronomy applications, data from local GPS networks need to be incorporated into ionospheric modelling. For GPS observations, the ionospheric parameters are biased by GPS receiver instrument delays, among other effects, known as receiver differential code biases (DCBs). The receiver DCBs need to be estimated for any non-CODE GPS station used for ionosphere modelling. In this work, single-station GPS-based ionospheric modelling is performed at a time resolution of 10 min. The receiver DCBs are also estimated for selected Geoscience Australia GPS receivers, located at Murchison Radio Observatory, Yarragadee, Mount Magnet, and Wiluna. The ionospheric gradients estimated from GPS are compared with those inferred from the MWA. The ionospheric gradients at all the GPS stations show a correlation with the gradients observed with the MWA. The ionosphere estimates obtained using GPS measurements show promise in terms of providing calibration information for the MWA.
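
    The slant TEC that such modelling starts from is conventionally obtained from the geometry-free combination of dual-frequency observables. A minimal sketch follows, deliberately ignoring the receiver and satellite DCBs whose estimation is the point of the paper; the pseudorange values are placeholders.

      # GPS L1/L2 carrier frequencies (Hz) and the ionospheric constant (m^3/s^2)
      F1, F2 = 1575.42e6, 1227.60e6
      K = 40.3

      def slant_tec_tecu(p1, p2):
          """Slant TEC in TEC units from L1/L2 pseudoranges (metres), DCBs ignored."""
          tec_el_m2 = (p2 - p1) / (K * (1.0 / F2**2 - 1.0 / F1**2))
          return tec_el_m2 / 1e16          # 1 TECU = 1e16 electrons/m^2

      # an 8 m inter-frequency delay corresponds to roughly 76 TECU
      print(slant_tec_tecu(p1=22_000_000.0, p2=22_000_008.0))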

  9. Interconnection of the Astrophysical Station of Bosque Alegre with the local data network of the Astronomical Observatory of Córdoba

    NASA Astrophysics Data System (ADS)

    Nicotra, M. A.; Anun, S.; Montes, M.; Goldes, G.; Carranza, G.

    We describe the outline of a project for the interconnection of the Astrophysical Station of Bosque Alegre with the wide area network of the University of Córdoba. The Astrophysical Station is located 38.55 km (23.96 miles) from the Observatory of Córdoba, a distance suitable for radio links at centimetre wavelengths. In recent years, Spread-Spectrum technology equipment has become popular. Spread-Spectrum signals, in contrast to narrowband radio signals, operate within a bandwidth 20 to 200 times broader than the bandwidth of the modulated information. Signals are modulated by special spreading codes in such a way that they resemble noise. These codes are known under the generic designation of pseudo-random or pseudo-noise codes. In addition, the wide band is correlated with a low power density in the emitted signals. Spread-Spectrum links are stable, exhibit low interference with conventional radio transmitters, and their commercial prices are remarkably lower than those of conventional microwave devices. Data links are compliant with Ethernet protocol networks and operate at data transmission rates of up to 4 Mbits per second. The described equipment will enable access to full Internet services for visiting astronomers at Bosque Alegre. It will also make possible fast transfer of observational data from the telescope to computers on the local area network in Córdoba. This project should be considered the second stage of a broader project whose main purpose is to transform the Bosque Alegre Station into a fully robotic station controlled from the computational centre at the Observatory in Córdoba. The advantages of robotic telescopes have recently been the subject of several discussions. It is now widely accepted that an automatic station enables some important options in the use of astronomical instruments, such as the possibility of running parallel programs, one of which is selected according to environmental conditions at the time of the observation.

  10. Simulation of Combustion Systems with Realistic g-jitter

    NASA Technical Reports Server (NTRS)

    Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.

    2003-01-01

    In this project, a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The simulation code is capable of simulating flame spread on a solid and nonpremixed or premixed gaseous combustion in nonturbulent flow with simple combustion models. Simple combustion models were used to preserve computational efficiency, since this is meant to be an engineering code. The use of sophisticated turbulence models was also not pursued (a simple Smagorinsky-type model can be implemented if deemed appropriate), because if flow velocities are large enough for turbulence to develop in a reduced-gravity combustion scenario, it is unlikely that g-jitter disturbances (in NASA's reduced-gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can be easily included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) is presented. The results of this series of simulations suggested that a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.

  11. Instructor/Operator Station Design Handbook for Aircrew Training Devices.

    DTIC Science & Technology

    1987-10-01

    [Garbled OCR fragment: design guidance on restricting workstation lighting to necessary work areas and baffling it from the CRT, use of a selective-spectrum lighting system, and a note that the device provides features supporting training such as a debrief facility and a computer-based instructor training module; the remainder is report-documentation boilerplate.]

  12. Site classification for National Strong Motion Observation Network System (NSMONS) stations in China using an empirical H/V spectral ratio method

    NASA Astrophysics Data System (ADS)

    Ji, Kun; Ren, Yefei; Wen, Ruizhi

    2017-10-01

    Reliable site classifications have not yet been assigned to the stations of the China National Strong Motion Observation Network System (NSMONS) because of a lack of borehole data. This study used an empirical horizontal-to-vertical (H/V) spectral ratio (hereafter, HVSR) site classification method to overcome this problem. First, according to their borehole data, stations selected from KiK-net in Japan were individually assigned a site class (CL-I, CL-II, or CL-III) as defined in the Chinese seismic code. Then, the mean HVSR curve for each site class was computed using strong motion recordings captured during the period 1996-2012. These curves were compared with those proposed by Zhao et al. (2006a) for four types of site classes (SC-I, SC-II, SC-III, and SC-IV) defined in the Japanese seismic code (JRA, 1980). It was found that an approximate range of the predominant period Tg could be identified by the predominant peak of the HVSR curve for the CL-I and SC-I sites, CL-II and SC-II sites, and CL-III and SC-III + SC-IV sites. Second, an empirical site classification method was proposed based on comprehensive consideration of the peak period, amplitude, and shape of the HVSR curve. The selected stations from KiK-net were classified using the proposed method. The results showed that the success rates of the proposed method in identifying CL-I, CL-II, and CL-III sites were 63%, 64%, and 58%, respectively. Finally, the HVSRs of 178 NSMONS stations were computed based on recordings from 2007 to 2015, and the sites were classified using the proposed method. The mean HVSR curves were re-calculated for the three site classes and compared with those from KiK-net data. It was found that both the peak period and the amplitude were similar for the mean HVSR curves derived from the NSMONS classification results and the KiK-net borehole data, implying the effectiveness of the proposed method in identifying different site classes. The classification results are in good agreement with site classes based on borehole data for 81 stations in China, which indicates that our site classification results are acceptable and that the proposed method is practicable.
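
    A bare-bones version of the H/V computation follows, under common conventions: the geometric mean of the two horizontal amplitude spectra divided by the vertical, with simple Hann windowing. The paper's smoothing and record-selection procedures are more elaborate; the three-component record here is synthetic.

      import numpy as np

      def hvsr(north, east, vert, dt):
          w = np.hanning(len(vert))                       # taper to reduce leakage
          freqs = np.fft.rfftfreq(len(vert), dt)
          n = np.abs(np.fft.rfft(north * w))
          e = np.abs(np.fft.rfft(east * w))
          v = np.abs(np.fft.rfft(vert * w))
          return freqs, np.sqrt(n * e) / np.maximum(v, 1e-12)

      # synthetic three-component ambient noise record, 100 Hz sampling
      t = np.arange(0, 60.0, 0.01)
      rec = lambda: np.random.normal(size=t.size)
      f, ratio = hvsr(rec(), rec(), rec(), dt=0.01)
      peak = np.argmax(ratio[1:]) + 1                     # skip the DC bin
      print("peak H/V near", f[peak], "Hz")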

  13. Position surveillance using one active ranging satellite and time-of-arrival of a signal from an independent satellite

    NASA Technical Reports Server (NTRS)

    Anderson, R. E.; Frey, R. L.; Lewis, J. R.

    1980-01-01

    Position surveillance using one active ranging/communication satellite and the time-of-arrival of signals from an independent satellite was shown to be feasible and practical. A towboat on the Mississippi River was equipped with a tone-code ranging transponder and a receiver tuned to the timing signals of the GOES satellite. A similar transponder was located at the office of the towing company. Tone-code ranging interrogations were transmitted from the General Electric Earth Station Laboratory through ATS-6 to the towboat and to the ground truth transponder office. Their automatic responses included digital transmissions of time-of-arrival measurements derived from the GOES signals. The Earth Station Laboratory determined ranges from the satellites to the towboat and computed position fixes. The ATS-6 lines-of-position were more precise than 0.1 NMi, 1 sigma, and the GOES lines-of-position were more precise than 1.6 NMi, 1 sigma. High quality voice communications were accomplished with the transponders using a nondirectional antenna on the towboat. The simple and effective surveillance technique merits further evaluation using operational maritime satellites.

  14. Identification coding schemes for modulated reflectance systems

    DOEpatents

    Coates, Don M [Santa Fe, NM; Briles, Scott D [Los Alamos, NM; Neagley, Daniel L [Albuquerque, NM; Platts, David [Santa Fe, NM; Clark, David D [Santa Fe, NM

    2006-08-22

    An identifying coding apparatus employing modulated reflectance technology, involving a base station emitting an RF signal and a tag, located remotely from the base station and containing at least one antenna and predetermined other passive circuit components, that receives the RF signal and reflects back to the base station a modulated signal indicative of characteristics related to the tag.

  15. Study of Unsteady Flows with Concave Wall Effect

    NASA Technical Reports Server (NTRS)

    Wang, Chi R.

    2003-01-01

    This paper presents computational fluid dynamics studies of the effects of inlet turbulence and wall curvature on flow steadiness at near-wall locations in boundary layer flows. The time-stepping numerical solver of the NASA Glenn-HT RANS code and a one-equation turbulence model, with a uniform inlet turbulence level of the order of 10 percent of the molecular viscosity, were used to perform the numerical computations. The approach was first calibrated against its predictions of friction factor, velocity, and temperature at near-surface locations within a transitional boundary layer over a concave wall. The approach was then used to predict the velocity and friction factor variations in a boundary layer recovering from concave curvature. As time iteration proceeded in the computations, the computed friction factors converged to their values from existing experiments. The computed friction factors, velocities, and static temperatures at near-wall locations oscillated periodically in terms of time iteration steps and physical locations along the spanwise direction. At the upstream stations, the relationship among the normal and tangential velocities showed vortex effects on the velocity variations. The coherent vortex effect on the velocity components broke down at downstream stations. The computations also predicted the vortex effects on the velocity variations within a boundary layer flow developed along a concave wall surface with a downstream recovery flat wall surface. It was concluded that the computational approach may have the potential to analyze flow steadiness in a turbine blade flow.

  16. Solar dynamic power for the Space Station

    NASA Technical Reports Server (NTRS)

    Archer, J. S.; Diamant, E. S.

    1986-01-01

    This paper describes a computer code which provides a significant advance in the systems analysis capabilities of solar dynamic power modules. While the code can be used to advantage in the preliminary analysis of terrestrial solar dynamic modules, its real value lies in the adaptations which make it particularly useful for the conceptualization of optimized power modules for space applications. In particular, as illustrated in the paper, the code can be used to establish optimum values of concentrator diameter, concentrator surface roughness, concentrator rim angle, and receiver aperture corresponding to the main heat cycle options - Organic Rankine and Brayton - and for certain receiver design options. The code can also be used to establish system sizing margins to account for the loss of reflectivity in orbit or the seasonal variation of insolation. By simulating the interactions among the major components of a solar dynamic module, and through simplified formulations of the major thermal-optic-thermodynamic interactions, the code adds a powerful, efficient, and economic analytical tool to the repertory of techniques available for the design of advanced space power systems.

  17. LOX/LH2 vane pump for auxiliary propulsion systems

    NASA Technical Reports Server (NTRS)

    Hemminger, J. A.; Ulbricht, T. E.

    1985-01-01

    Positive displacement pumps offer potential efficiency advantages over centrifugal pumps for future low thrust space missions. Low flow rate applications, such as space station auxiliary propulsion or dedicated low thrust orbiter transfer vehicles, are typical of missions where low flow and high head rise challenge centrifugal pumps. The positive displacement vane pump for pumping LOX and LH2 is investigated. This effort has included: (1) a testing program in which pump performance was investigated for differing pump clearances and pump materials while pumping LN2, LOX, and LH2; and (2) an analysis effort, in which a comprehensive pump performance analysis computer code was developed and exercised. An overview of the theoretical framework of the performance analysis computer code is presented, along with a summary of analysis results. Experimental results are presented for the pump operating in liquid nitrogen, including data on the effects of pump clearance, speed, and pressure rise on pump performance. Pump suction performance is also presented.

  18. Comparison between air pollution concentrations measured at the nearest monitoring station to the delivery hospital and those measured at stations nearest the residential postal code regions of pregnant women in Fukuoka.

    PubMed

    Michikawa, Takehiro; Morokuma, Seiichi; Nitta, Hiroshi; Kato, Kiyoko; Yamazaki, Shin

    2017-06-13

    Numerous earlier studies examining the association of air pollution with maternal and foetal health estimated maternal exposure to air pollutants based on the women's residential addresses. However, residential addresses, which are personally identifiable information, are not always obtainable. Since a majority of pregnant women reside near their delivery hospitals, the concentrations of air pollutants at the respective delivery hospitals may be surrogate markers of pollutant exposure at home. We compared air pollutant concentrations measured at the nearest monitoring station to Kyushu University Hospital with those measured at the closest monitoring stations to the respective residential postal code regions of pregnant women in Fukuoka. Aggregated postal code data for the home addresses of pregnant women who delivered at Kyushu University Hospital in 2014 was obtained from Kyushu University Hospital. For each of the study's 695 women who resided in Fukuoka Prefecture, we assigned pollutant concentrations measured at the nearest monitoring station to Kyushu University Hospital and pollutant concentrations measured at the nearest monitoring station to their respective residential postal code regions. Among the 695 women, 584 (84.0%) resided in the proximity of the nearest monitoring station to the hospital or one of the four other stations (as the nearest stations to their respective residential postal code region) in Fukuoka city. Pearson's correlation for daily mean concentrations among the monitoring stations in Fukuoka city was strong for fine particulate matter (PM2.5), suspended particulate matter (SPM), and photochemical oxidants (Ox) (coefficients ≥0.9), but moderate for coarse particulate matter (the result of subtracting the PM2.5 from the SPM concentrations), nitrogen dioxide, and sulphur dioxide. Hospital-based and residence-based concentrations of PM2.5, SPM, and Ox were comparable. For PM2.5, SPM, and Ox, exposure estimation based on the delivery hospital is likely to approximate that based on the home of pregnant women.

  19. A New Model for Real-Time Regional Vertical Total Electron Content and Differential Code Bias Estimation Using IGS Real-Time Service (IGS-RTS) Products

    NASA Astrophysics Data System (ADS)

    Abdelazeem, Mohamed; Çelik, Rahmi N.; El-Rabbany, Ahmed

    2016-04-01

    The international global navigation satellite system (GNSS) real-time service (IGS-RTS) products have been used extensively for real-time precise point positioning and ionosphere modeling applications. In this study, we develop a regional model for real-time vertical total electron content (RT-VTEC) and differential code bias (RT-DCB) estimation over Europe using the IGS-RTS satellite orbit and clock products. The developed model has spatial and temporal resolutions of 1° × 1° and 15 minutes, respectively. GPS observations from a regional network consisting of 60 IGS and EUREF reference stations are processed in zero-difference mode using the Bernese 5.2 software package in order to extract the geometry-free linear combination of the smoothed code observations. A spherical harmonic expansion is used to model the VTEC and the receiver and satellite DCBs. To validate the proposed model, the RT-VTEC values are computed and compared with the final IGS global ionospheric map (IGS-GIM) counterparts on three successive days of high solar activity, including one day of extreme geomagnetic activity. The real-time satellite DCBs are also estimated and compared with the IGS-GIM counterparts. Moreover, the real-time receiver DCBs for six IGS stations are obtained and compared with the IGS-GIM counterparts. The examined stations are located at different latitudes and use different receiver types. The findings reveal that the estimated RT-VTEC values agree with the IGS-GIM counterparts, with root-mean-square errors (RMSEs) of less than 2 TEC units. In addition, the RMSEs of the satellite and receiver DCBs are less than 0.85 ns and 0.65 ns, respectively, in comparison with the IGS-GIM values.
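
    The spherical harmonic form of the model can be sketched as follows, with arbitrary placeholder coefficients and SciPy's sph_harm evaluating the basis functions; the paper's degree/order choices and mapping-function details are not reproduced here.

      import numpy as np
      from scipy.special import sph_harm

      def vtec(lat_deg, lon_deg, coeffs, nmax=2):
          """Evaluate a low-degree spherical harmonic expansion at one point."""
          polar = np.radians(90.0 - lat_deg)       # colatitude
          az = np.radians(lon_deg % 360.0)         # azimuth (longitude)
          total = 0.0
          for n in range(nmax + 1):
              for m in range(-n, n + 1):
                  # SciPy convention: sph_harm(m, n, azimuthal, polar)
                  total += coeffs[(n, m)] * sph_harm(m, n, az, polar).real
          return total

      # arbitrary placeholder coefficients, decaying with degree and order
      coeffs = {(n, m): 1.0 / (1 + n + abs(m))
                for n in range(3) for m in range(-n, n + 1)}
      print("VTEC at 45N, 10E:", vtec(45.0, 10.0, coeffs))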

  20. Performance of the MIR Cooperative Solar Array After 2.5 Years in Orbit

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hoffman, David J.

    1999-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States and Russia to produce 6 kW of power for the Russian space station Mir. Four multi-orbit test sequences were executed between June 1996 and December 1998 to measure MCSA electrical performance. A dedicated Fortran computer code was developed to analyze the detailed thermal-electrical performance of the MCSA. The computational performance results compared very favorably with the measured flight data in most cases. Minor performance degradation was detected in one current generating section of the MCSA. Yet overall, the flight data indicated the MCSA was meeting and exceeding performance expectations. There was no precipitous performance loss due to contamination or other causes after 2.5 years of operation. In this paper, we review the MCSA flight electrical performance tests, data, and computational modeling, and discuss findings from data comparisons with the computational results.

  1. Three-dimensional time-dependent STAR reactor kinetics analyses coupled with RETRAN and MCPWR system response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.

    1989-11-01

    The operation of a nuclear power plant must be regularly supported by various reactor dynamics and thermal-hydraulic analyses, which may include final safety analysis report (FSAR) design-basis calculations, and conservative and best-estimate analyses. The development and improvement of computer codes and analysis methodologies provide many advantages, including the ability to evaluate the effect of modeling simplifications and assumptions made in previous reactor kinetics and thermal-hydraulic calculations. This paper describes the results of using the RETRAN, MCPWR, and STAR codes in a tandem, predictive-corrective manner for three pressurized water reactor (PWR) transients: (a) loss of feedwater (LOF) anticipated transient without scram (ATWS), (b) station blackout ATWS, and (c) loss of total reactor coolant system (RCS) flow with a scram.

  2. Validation of the technique for absolute total electron content and differential code biases estimation

    NASA Astrophysics Data System (ADS)

    Mylnikova, Anna; Yasyukevich, Yury; Yasyukevich, Anna

    2017-04-01

    We have developed a technique for vertical total electron content (TEC) and differential code bias (DCB) estimation using data from a single GPS/GLONASS station. The algorithm is based on TEC expansion into a Taylor series in space and time (TayAbsTEC). We validate the technique using Global Ionospheric Maps (GIM) computed by the Center for Orbit Determination in Europe (CODE) and the Jet Propulsion Laboratory (JPL). We compared the differences between absolute vertical TEC (VTEC) from GIM and VTEC evaluated by TayAbsTEC for 2009 (solar activity minimum, sunspot number about 0) and for 2014 (solar activity maximum, sunspot number about 110). Since there is a difference between VTEC from CODE and VTEC from JPL, we compare TayAbsTEC VTEC with both. We found that TayAbsTEC VTEC is closer to CODE VTEC than to JPL VTEC. The difference between TayAbsTEC VTEC and GIM VTEC is more noticeable at solar activity maximum (2014) than at solar activity minimum (2009) for both CODE and JPL. The distribution of VTEC differences is close to a Gaussian distribution, so we conclude that the results of TayAbsTEC are in agreement with GIM VTEC. We also compared DCBs evaluated by TayAbsTEC with DCBs from GIM computed by CODE. The TayAbsTEC DCBs are in good agreement with CODE DCBs for GPS satellites, but differ noticeably for GLONASS. We used the DCBs to correct slant TEC to find out which DCBs give better results. Slant TEC correction with CODE DCBs produces negative, nonphysical TEC values; slant TEC correction with TayAbsTEC DCBs does not produce such artifacts. The technique we developed calculates VTEC and DCBs using only data from local GPS/GLONASS networks. The evaluated VTEC data are provided in the GIM framework, which is convenient for various data analyses.
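
    A plausible low-order form of the expansion, shown for illustration only (the paper's actual truncation orders in space and time may differ), is:

      % first-order Taylor expansion of VTEC about the station coordinates and epoch
      V(\varphi, \lambda, t) \approx V_0
        + \frac{\partial V}{\partial \varphi}\,(\varphi - \varphi_0)
        + \frac{\partial V}{\partial \lambda}\,(\lambda - \lambda_0)
        + \frac{\partial V}{\partial t}\,(t - t_0)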

  3. Channel coding in the space station data system network

    NASA Technical Reports Server (NTRS)

    Healy, T.

    1982-01-01

    A detailed discussion of the use of channel coding for error correction, privacy/secrecy, channel separation, and synchronization is presented. Channel coding, in one form or another, is an established and common element in data systems. No analysis and design of a major new system would fail to consider ways in which channel coding could make the system more effective. The presence of channel coding on TDRS, Shuttle, the Advanced Communication Technology Satellite Program system, the JSC-proposed Space Operations Center, and the proposed 30/20 GHz Satellite Communication System strongly supports the requirement for the utilization of coding for the communications channel. The designers of the space station data system have to consider the use of channel coding.
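
    As a concrete instance of the error-correction role discussed above, the sketch below encodes and corrects a single-bit error with Hamming(7,4), one of the simplest channel codes; it is illustrative only and not a code proposed in the report.

      import numpy as np

      # generator and parity-check matrices for Hamming(7,4): G = [I4 | P], H = [P^T | I3]
      G = np.array([[1, 0, 0, 0, 1, 1, 0],
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])
      H = np.array([[1, 1, 0, 1, 1, 0, 0],
                    [1, 0, 1, 1, 0, 1, 0],
                    [0, 1, 1, 1, 0, 0, 1]])

      def encode(bits4):
          return (bits4 @ G) % 2

      def correct(word7):
          syndrome = (H @ word7) % 2
          if syndrome.any():
              # the syndrome equals the H-column of the flipped bit
              err = int(np.where((H.T == syndrome).all(axis=1))[0][0])
              word7 = word7.copy()
              word7[err] ^= 1
          return word7

      msg = np.array([1, 0, 1, 1])
      tx = encode(msg)
      rx = tx.copy()
      rx[2] ^= 1                       # single-bit channel error
      print("corrected:", correct(rx), "sent:", tx)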

  4. Preliminary consideration on the seismic actions recorded during the 2016 Central Italy seismic sequence

    NASA Astrophysics Data System (ADS)

    Carlo Ponzo, Felice; Ditommaso, Rocco; Nigro, Antonella; Nigro, Domenico S.; Iacovino, Chiara

    2017-04-01

    After the Mw 6.0 mainshock of August 24, 2016, at 03:36 a.m. (local time), with the epicenter located between the towns of Accumoli (province of Rieti), Amatrice (province of Rieti), and Arquata del Tronto (province of Ascoli Piceno), several activities were started in order to perform preliminary evaluations of the characteristics of the recent seismic sequence in the areas affected by the earthquake. Ambient vibration acquisitions have been performed using two three-directional velocimetric synchronized stations, with a natural frequency of 0.5 Hz and a digitizer resolution of 24 bit. The activities continued after the events of the seismic sequence of October 26 and October 30, 2016. In this paper, in order to compare recorded values with code provisions in terms of peak (PGA, PGV, and PGD), spectral, and integral (Housner intensity) seismic parameters, several preliminary analyses have been performed on accelerometric time histories acquired by three near-fault stations of the RAN (Italian Accelerometric Network): Amatrice (station code AMT), Norcia (station code NRC), and Castelsantangelo sul Nera (station code CNE). Several comparisons between the elastic response spectra derived from the accelerometric recordings and the elastic demand spectra provided by the Italian seismic code (NTC 2008) have been performed. Preliminary results from these analyses highlight several apparent differences between the experimental data and the conventional code provisions. The ongoing seismic sequence appears compatible with the historical seismicity in terms of integral parameters, but not in terms of peak and spectral values. It therefore seems appropriate to reconsider the need to revise the simplified design approach based on conventional spectral values. Acknowledgements: This study was partially funded by the Italian Department of Civil Protection within the project DPC-RELUIS 2016 - RS4 "Seismic observatory of structures and health monitoring" and by the "Centre of Integrated Geomorphology for the Mediterranean Area - CGIAM" within the Framework Agreement with the University of Basilicata "Study, Research and Experimentation in the Field of Analysis and Monitoring of Seismic Vulnerability of Strategic and Relevant Buildings for the purposes of Civil Protection and Development of Innovative Strategies of Seismic Reinforcement".
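
    The integral parameter mentioned, the Housner intensity, is commonly defined as the integral of the 5%-damped pseudo-velocity response spectrum between periods of 0.1 s and 2.5 s. A toy computation on a synthetic accelerogram follows; the SDOF time-stepping is a simple semi-implicit Euler march chosen for brevity, not the integrator used in the study.

      import numpy as np

      def psv_spectrum(acc, dt, periods, damping=0.05):
          """Pseudo-velocity spectrum by time-stepping an SDOF oscillator per period."""
          psv = []
          for T in periods:
              wn = 2 * np.pi / T
              u, v, umax = 0.0, 0.0, 0.0
              for a in acc:                   # semi-implicit Euler march
                  v += (-2 * damping * wn * v - wn**2 * u - a) * dt
                  u += v * dt
                  umax = max(umax, abs(u))
              psv.append(wn * umax)           # PSV = wn * max displacement
          return np.array(psv)

      def housner_intensity(acc, dt):
          periods = np.linspace(0.1, 2.5, 25)
          return np.trapz(psv_spectrum(acc, dt, periods), periods)

      acc = np.random.normal(scale=0.5, size=2000)    # synthetic accelerogram, m/s^2
      print("Housner intensity ~", housner_intensity(acc, dt=0.01), "m")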

  5. 47 CFR 95.119 - Station identification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... station identification is the call sign assigned to the GMRS station or system. (c) A unit number may be...: (1) Voice in the English language; or (2) International Morse code telegraphy. (e) A station need not...

  6. Analysis of a Radiation Model of the Shuttle Space Suit

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke M.; Nealy, John E.; Kim, Myung-Hee; Qualls, Garry D.; Wilson, John W.

    2003-01-01

    The extravehicular activity (EVA) required to assemble the International Space Station (ISS) will take approximately 1500 hours, with 400 hours of EVA per year in operations and maintenance. With the Space Station at an inclination of 51.6 deg, the radiation environment is highly variable, with solar activity being of great concern. Thus, it is important to study the dose gradients about the body during an EVA to help determine the cancer risk associated with the different environments the ISS will encounter. In this paper we are concerned only with the trapped radiation (electrons and protons). Two different scenarios are considered: the first is the quiet geomagnetic period in low Earth orbit (LEO), and the second is a large solar particle event in the deep space environment. This study includes a description of how the space suit's computer-aided design (CAD) model was developed, along with a description of the human model. Also included is a brief description of the transport codes used to determine the total integrated dose at several locations within the body. Finally, the results of the transport codes applied to the space suit and human model are presented and briefly discussed.

  7. Calibration Method for IATS and Application in Multi-Target Monitoring Using Coded Targets

    NASA Astrophysics Data System (ADS)

    Zhou, Yueyin; Wagner, Andreas; Wunderlich, Thomas; Wasmeier, Peter

    2017-06-01

    The technique of Image Assisted Total Stations (IATS) has been studied for over ten years and is composed of two major parts: one is the calibration procedure, which establishes the relationship between the camera system and the theodolite system; the other is automatic target detection in the image by various methods of photogrammetry or computer vision. Several calibration methods have been developed, mostly using prototypes with an add-on camera rigidly mounted on the total station. However, these prototypes are not commercially available. This paper proposes a calibration method based on the Leica MS50, which has two built-in cameras, each with a resolution of 2560 × 1920 px: an overview camera and a telescope (on-axis) camera. Our work in this paper is based on the on-axis camera, which uses the 30-times magnification of the telescope. The calibration estimates 7 parameters. We use coded targets, which are common tools in photogrammetry for orientation, to detect different targets in IATS images instead of prisms and traditional ATR functions. We test and verify the efficiency and stability of this monitoring method with multiple targets.

  8. A Review of Australian Investigations on Aeronautical Fatigue during the Period April 1979 to March 1981.

    DTIC Science & Technology

    1981-03-01

    An advanced iso-parametric element is also being developed specifically for the analysis of disbonds and internal flaws in composite laminates. (Figure captions: fatigue failure, station 119; FIG. 9.3 NOMAD structural fatigue test; FIG. 9.4 failed NOMAD strut upper end fitting; FIG. 9.5 fracture faces of failed …)

  9. Cyberinfrastructure for the Unified Study of Earth Structure and Earthquake Sources in Complex Geologic Environments

    NASA Astrophysics Data System (ADS)

    Zhao, L.; Chen, P.; Jordan, T. H.; Olsen, K. B.; Maechling, P.; Faerman, M.

    2004-12-01

    The Southern California Earthquake Center (SCEC) is developing a Community Modeling Environment (CME) to facilitate the computational pathways of physics-based seismic hazard analysis (Maechling et al., this meeting). Major goals are to facilitate the forward modeling of seismic wavefields in complex geologic environments, including the strong ground motions that cause earthquake damage, and the inversion of observed waveform data for improved models of Earth structure and fault rupture. Here we report on a unified approach to these coupled inverse problems that is based on the ability to generate and manipulate wavefields in densely gridded 3D Earth models. A main element of this approach is a database of receiver Green tensors (RGT) for the seismic stations, which comprises all of the spatial-temporal displacement fields produced by the three orthogonal unit impulsive point forces acting at each of the station locations. Once the RGT database is established, synthetic seismograms for any earthquake can be simply calculated by extracting a small, source-centered volume of the RGT from the database and applying the reciprocity principle. The partial derivatives needed for point- and finite-source inversions can be generated in the same way. Moreover, the RGT database can be employed in full-wave tomographic inversions launched from a 3D starting model, because the sensitivity (Fréchet) kernels for travel-time and amplitude anomalies observed at seismic stations in the database can be computed by convolving the earthquake-induced displacement field with the station RGTs. We illustrate all elements of this unified analysis with an RGT database for 33 stations of the California Integrated Seismic Network in and around the Los Angeles Basin, which we computed for the 3D SCEC Community Velocity Model (SCEC CVM3.0) using a fourth-order staggered-grid finite-difference code. For a spatial grid spacing of 200 m and a time resolution of 10 ms, the calculations took ~19,000 node-hours on the Linux cluster at USC's High-Performance Computing Center. The 33-station database with a volume of ~23.5 TB was archived in the SCEC digital library at the San Diego Supercomputer Center using the Storage Resource Broker (SRB). From a laptop, anyone with access to this SRB collection can compute synthetic seismograms for an arbitrary source in the CVM in a matter of minutes. Efficient approaches have been implemented to use this RGT database in the inversions of waveforms for centroid and finite moment tensors and tomographic inversions to improve the CVM. Our experience with these large problems suggests areas where the cyberinfrastructure currently available for geoscience computation needs to be improved.
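
    The reciprocity step described above can be sketched compactly: once the receiver Green tensors are stored, a synthetic seismogram at a station reduces to indexing a source-centered volume and convolving with a source time function. The array layout, names, and random stand-in data below are hypothetical, and the sketch uses a simple point-force source; a moment-tensor source would additionally require spatial derivatives of the RGT, which this sketch omits.

        import numpy as np

        rng = np.random.default_rng(0)
        nx = ny = nz = 8
        nt, dt = 256, 0.01
        # rgt[i, j, x, y, z, t]: j-component of displacement at grid point (x, y, z)
        # from a unit impulse force in direction i applied at the station
        # (random stand-in for a stored RGT volume).
        rgt = rng.standard_normal((3, 3, nx, ny, nz, nt))

        def synthetic(rgt, src_ijk, force, stf, dt):
            """Reciprocity: station seismogram for a point-force source at src_ijk."""
            gx, gy, gz = src_ijk
            g = rgt[:, :, gx, gy, gz, :]              # (force dir, component, time)
            impulse = np.tensordot(force, g, axes=1)  # weight by force direction
            return np.array([np.convolve(c, stf)[:nt] * dt for c in impulse])

        t = np.arange(nt) * dt
        stf = np.exp(-((t - 0.3) / 0.05) ** 2)        # Gaussian source time function
        seis = synthetic(rgt, (4, 4, 2), np.array([0.0, 0.0, 1.0]), stf, dt)
        print(seis.shape)                             # (3, 256): three components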

  10. United States Historical Climatology Network (US HCN) monthly temperature and precipitation data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniels, R.C.; Boden, T.A.; Easterling, D.R.

    1996-01-11

    This document describes a database containing monthly temperature and precipitation data for 1221 stations in the contiguous United States. This network of stations, known as the United States Historical Climatology Network (US HCN), and the resulting database were compiled by the National Climatic Data Center, Asheville, North Carolina. These data represent the best available data from the United States for analyzing long-term climate trends on a regional scale. The data for most stations extend through December 31, 1994, and a majority of the station records are serially complete for at least 80 years. Unlike many data sets that have been used in past climate studies, these data have been adjusted to remove biases introduced by station moves, instrument changes, time-of-observation differences, and urbanization effects. These monthly data are available free of charge as a numeric data package (NDP) from the Carbon Dioxide Information Analysis Center. The NDP includes this document and 27 machine-readable data files consisting of supporting data files, a descriptive file, and computer access codes. This document describes how the stations in the US HCN were selected and how the data were processed, defines limitations and restrictions of the data, describes the format and contents of the magnetic media, and provides reprints of literature that discuss the editing and adjustment techniques used in the US HCN.

  11. Modeling of temporal variation of very low frequency radio waves over long paths as observed from Indian Antarctic stations

    NASA Astrophysics Data System (ADS)

    Sasmal, Sudipta; Basak, Tamal; Chakraborty, Suman; Palit, Sourav; Chakrabarti, Sandip K.

    2017-07-01

    The characteristics of a very low frequency (VLF) signal depend on the solar illumination across the propagation path. For a long path, the solar zenith angle varies widely over the path, and this has a significant influence on the propagation characteristics. To study this effect, the Indian Centre for Space Physics participated in the 27th and 35th Scientific Expeditions to Antarctica. VLF signals transmitted from the transmitters VTX (18.2 kHz), Vijayanarayanam, India, and NWC (19.8 kHz), North West Cape, Australia, were recorded simultaneously at the Indian permanent stations Maitri and Bharati, having respective geographic coordinates 70.75°S, 11.67°E and 69.4°S, 76.17°E. A very stable diurnal variation of the signal has been obtained from both stations. We reproduced the variations of the VLF signal using a solar zenith angle model coupled with the Long Wavelength Propagation Capability (LWPC) code. We divided the whole path into several segments and computed the solar zenith angle (χ) profile, as sketched below. We assumed a linear relationship between Wait's exponential model parameters, the effective reflection height (h') and the steepness parameter (β), and the solar zenith angle. The h' and β values were then used in the LWPC code to obtain the VLF signal amplitude at a particular time. The same procedure was repeated to obtain the whole-day signal. The whole-day signal variation from the theoretical modeling is also found to match our observations to some extent.
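
    A minimal sketch of the path-segmentation step is given below: it computes an approximate solar zenith angle along sampled points of the path and maps it linearly onto Wait's h' and β. The declination formula is a textbook approximation, the path sampling is a crude lat/lon interpolation rather than a great circle, and the linear coefficients are placeholders, not the values fitted in the paper.

        import numpy as np

        def solar_zenith(lat_deg, lon_deg, day_of_year, utc_hours):
            """Approximate solar zenith angle (deg) from a simple declination model."""
            decl = -23.44 * np.cos(np.radians(360.0 / 365.0 * (day_of_year + 10)))
            hour_angle = 15.0 * (utc_hours + lon_deg / 15.0 - 12.0)
            lat, dec, ha = np.radians(lat_deg), np.radians(decl), np.radians(hour_angle)
            cosz = np.sin(lat) * np.sin(dec) + np.cos(lat) * np.cos(dec) * np.cos(ha)
            return np.degrees(np.arccos(np.clip(cosz, -1.0, 1.0)))

        def wait_parameters(chi_deg):
            """Placeholder linear day-side mapping of h' (km) and beta (1/km)."""
            c = np.cos(np.radians(np.minimum(chi_deg, 90.0)))
            return 74.0 - 4.0 * c, 0.30 + 0.15 * c

        # Crude straight-line lat/lon sampling of an NWC-to-Bharati-like path.
        lats = np.linspace(-21.8, -69.4, 20)
        lons = np.linspace(114.2, 76.17, 20)
        chi = solar_zenith(lats, lons, day_of_year=80, utc_hours=6.0)
        h_prime, beta = wait_parameters(chi)   # per-segment inputs for an LWPC run
        print(np.round(h_prime, 1))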

  12. ICC '86; Proceedings of the International Conference on Communications, Toronto, Canada, June 22-25, 1986, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, the computer network theory and multiple-access, microwave single sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.

  13. High data rate coding for the space station telemetry links.

    NASA Technical Reports Server (NTRS)

    Lumb, D. R.; Viterbi, A. J.

    1971-01-01

    Coding systems for high data rates were examined from the standpoint of potential application in space-station telemetry links. Approaches considered included convolutional codes with sequential, Viterbi, and cascaded-Viterbi decoding. It was concluded that a high-speed (40 Mbps) sequential decoding system best satisfies the requirements for the assumed growth potential and specified constraints. Trade-off studies leading to this conclusion are reviewed, and some sequential (Fano) algorithm improvements are discussed, together with real-time simulation results.
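
    Although the study favored sequential (Fano) decoding for the 40 Mbps requirement, the Viterbi alternative it compares against is easy to sketch. The following minimal example implements a rate-1/2, constraint-length-3 convolutional encoder with hard-decision Viterbi decoding; the generator polynomials (7, 5 octal) are a common textbook choice, not those of the study.

        G = (0b111, 0b101)   # rate-1/2 generators (7, 5 octal), constraint length 3
        K = 3
        NSTATES = 1 << (K - 1)

        def parity(x):
            return bin(x).count("1") & 1

        def encode(bits):
            state, out = 0, []
            for b in bits:
                reg = (b << (K - 1)) | state       # newest bit in the high position
                out += [parity(reg & g) for g in G]
                state = reg >> 1
            return out

        def viterbi(received):
            INF = 10 ** 9
            metric = [0] + [INF] * (NSTATES - 1)   # start in the all-zero state
            paths = [[] for _ in range(NSTATES)]
            for i in range(0, len(received), 2):
                r = received[i:i + 2]
                new_metric = [INF] * NSTATES
                new_paths = [None] * NSTATES
                for s in range(NSTATES):
                    if metric[s] == INF:
                        continue
                    for b in (0, 1):
                        reg = (b << (K - 1)) | s
                        branch = sum(x != parity(reg & g) for x, g in zip(r, G))
                        ns = reg >> 1
                        if metric[s] + branch < new_metric[ns]:
                            new_metric[ns] = metric[s] + branch
                            new_paths[ns] = paths[s] + [b]
                metric, paths = new_metric, new_paths
            return paths[min(range(NSTATES), key=metric.__getitem__)]

        msg = [1, 0, 1, 1, 0, 0]      # trailing zeros flush the encoder
        rx = encode(msg)
        rx[3] ^= 1                    # one hard-decision channel error
        assert viterbi(rx) == msg     # the single error is corrected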

  14. An EMTP system level model of the PMAD DC test bed

    NASA Technical Reports Server (NTRS)

    Dravid, Narayan V.; Kacpura, Thomas J.; Tam, Kwa-Sur

    1991-01-01

    A power management and distribution direct current (PMAD DC) test bed was set up at the NASA Lewis Research Center to investigate Space Station Freedom Electric Power Systems issues. Efficiency of test bed operation significantly improves with a computer simulation model of the test bed as an adjunct tool of investigation. Such a model is developed using the Electromagnetic Transients Program (EMTP) and is available to the test bed developers and experimenters. The computer model is assembled on a modular basis. Device models of different types can be incorporated into the system model with only a few lines of code. A library of the various model types is created for this purpose. Simulation results and corresponding test bed results are presented to demonstrate model validity.

  15. Predicting the magnetospheric plasma weather

    NASA Technical Reports Server (NTRS)

    Dawson, John M.

    1986-01-01

    The prediction of the plasma environment in time, the plasma weather, is discussed. It is important to be able to predict when large magnetic storms will produce auroras, which will affect the space station operating in low orbit, and what precautions to take both for personnel and sensitive control (computer) equipment onboard. It is also important to start to establish a set of plasma weather records and a record of the ability to predict this weather. A successful forecasting system requires a set of satellite weather stations to provide data from which predictions can be made and a set of plasma weather codes capable of accurately forecasting the status of the Earth's magnetosphere. A numerical magnetohydrodynamic fluid model which is used to model the flow in the magnetosphere, the currents flowing into and out of the auroral regions, the magnetopause, the bow shock location and the magnetotail of the Earth is discussed.

  16. International Space Station (ISS) Meteoroid/Orbital Debris Shielding

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.

    1999-01-01

    Design practices to provide protection for International Space Station (ISS) crew and critical equipment from meteoroid and orbital debris (M/OD) impacts have been developed. Damage modes and failure criteria are defined for each spacecraft system. Hypervelocity impact tests and analyses are used to develop ballistic limit equations (BLEs) for each exposed spacecraft system. BLEs define the impact particle sizes that result in threshold failure of a particular spacecraft system as a function of impact velocity, angle, and particle density. The BUMPER computer code is used to determine the probability of no penetration (PNP) of the spacecraft shielding based on NASA standard meteoroid/debris models, a spacecraft geometry model, and the BLEs. BUMPER results are used to verify spacecraft shielding requirements. Low-weight, high-performance shielding alternatives have been developed at the NASA Johnson Space Center (JSC) Hypervelocity Impact Technology Facility (HITF) to meet spacecraft protection requirements.

  17. Alternative Fuels Data Center: Natural Gas Fueling Station Locations

    Science.gov Websites

    Find natural gas fueling stations by address, ZIP code, or along a route in the United States.

  18. The EDIT-COMGEOM Code

    DTIC Science & Technology

    1975-09-01

    This report assumes a familiarity with the GIFT and MAGIC computer codes. The EDIT-COMGEOM code is a FORTRAN computer code that converts the target description data used by the MAGIC computer code into the target description data that can be used by the GIFT computer code.

  19. Incident Energy Focused Design and Validation for the Floating Potential Probe

    NASA Technical Reports Server (NTRS)

    Fincannon, James

    2002-01-01

    Utilizing the spacecraft shadowing and incident energy analysis capabilities of the SPACE (System Power Analysis for Capability Evaluation) computer code from the NASA Glenn Research Center Power and Propulsion Office, this paper documents the analyses for various International Space Station (ISS) Floating Potential Probe (FPP) preliminary design options. These options include various solar panel orientations and configurations as well as deployment locations on the ISS. The incident energy for the final selected option is characterized. A good correlation between the predicted data and on-orbit operational telemetry is demonstrated. Minor deviations are postulated to be induced by degradation or sensor drift.

  20. Analysis of physical-chemical processes governing SSME internal fluid flows

    NASA Technical Reports Server (NTRS)

    Singhal, A. K.; Owens, S. F.; Mukerjee, T.; Keeton, L. W.; Prakash, C.; Przekwas, A. J.

    1984-01-01

    The efforts to adapt CHAM's computational fluid dynamics code, PHOENICS, to the analysis of flow within the high pressure fuel turbopump (HPFTP) aft-platform seal cavity of the SSME are summarized. In particular, the special-purpose PHOENICS satellite and ground station specifically formulated for this application are listed and described, and the preliminary results of the initial two-dimensional analyses are presented and discussed. Planned three-dimensional analyses are also briefly outlined. To further understand the mixing and combustion processes in the SSME fuel-side preburners, a single oxygen-hydrogen jet element was investigated.

  1. Performance Assessment of a Gnss-Based Troposphere Path Delay Estimation Software

    NASA Astrophysics Data System (ADS)

    Mariotti, Gilles; Avanzi, Alessandro; Graziani, Alberto; Tortora, Paolo

    2013-04-01

    Error budgets of deep space radio science experiments are heavily affected by interplanetary and Earth transmission media, which, owing to their non-unitary refractive index, corrupt the radiometric information of signals coming from the spacecraft. An effective removal of these noise sources is crucial to achieve the accuracy and signal stability levels required by radio science applications. Depending on the nature of the refraction, transmission media are divided into dispersive media (ionized particles, i.e., the solar wind and the ionosphere) and non-dispersive media (refraction caused by neutral particles, i.e., the Earth's troposphere). While dispersive noises are successfully removed by multifrequency combinations (as for GPS with the well-known ionofree combination), the most accurate estimation of tropospheric noise is obtained using microwave radiometers (MWR). As the use of MWRs suffers from strong operational limitations (rain and heavy cloud conditions), GNSS-based processing is still widely adopted to provide a cost-effective, all-weather estimation of the troposphere path delay. This work describes the development process and reports the results of a GNSS analysis code aimed specifically at estimating the path delays introduced by the troposphere above deep space complexes, to be used for the calibration of range and Doppler radiometric data. The code has been developed by the Radio Science Laboratory of the University of Bologna in Forlì and is currently in the testing phase. To this aim, the preliminary output is compared to MWR measurements and IGS TropoSINEX products in order to assess the reliability of the estimate. The software works on ionofree carrier-phase observables and is based upon a double-difference approach, in which the GNSS receiver placed near the deep space receiver acts as the rover station. Several baselines are then created with various IGS and EUREF stations (master or reference stations) in order to perform the differentiation. The code relies on several IGS products, such as SP3 precise orbits and SINEX positions available for the master stations, to remove several error components, while the phase ambiguities (both wide and narrow lane) are resolved using the modified LAMBDA (MLAMBDA) method. The double-differenced data are then processed by a Kalman filter that estimates the contingent positioning error of the rover station, its zenith wet delay (ZWD), and the residual phase ambiguities. The zenith hydrostatic delay (ZHD), on the other hand, is preliminarily computed using a mathematical model based on surface meteorological measurements. The final product of the developed code is an output file containing the estimated ZWD and ZHD time series in a format compatible with major orbit determination software, e.g., the CSP card format (TRK-2-23) used by NASA JPL's Orbit Determination Program.
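
    The abstract notes that the ZHD is computed from a model based on surface meteorology. A common choice for such a model is the Saastamoinen formula, sketched below; the record does not say which model the Bologna code actually uses, so this is illustrative only.

        import math

        def saastamoinen_zhd(pressure_hpa, lat_deg, height_m):
            """Zenith hydrostatic delay (m) from surface pressure (Saastamoinen)."""
            return (0.0022768 * pressure_hpa /
                    (1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
                         - 0.28e-6 * height_m))

        # A mid-latitude site at 250 m altitude with 1013 hPa surface pressure.
        print(f"{saastamoinen_zhd(1013.0, 44.0, 250.0):.4f} m")   # about 2.31 m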

  2. A Three-Dimensional Unsteady CFD Model of Compressor Stability

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.

    2006-01-01

    A three-dimensional unsteady CFD code called CSTALL has been developed and used to investigate compressor stability. The code solved the Euler equations through the entire annulus and all blade rows. Blade row turning, losses, and deviation were modeled using body force terms which required input data at stations between blade rows. The input data was calculated using a separate Navier-Stokes turbomachinery analysis code run at one operating point near stall, and was scaled to other operating points using overall characteristic maps. No information about the stalled characteristic was used. CSTALL was run in a 2-D throughflow mode for very fast calculations of operating maps and estimation of stall points. Calculated pressure ratio characteristics for NASA stage 35 agreed well with experimental data, and results with inlet radial distortion showed the expected loss of range. CSTALL was also run in a 3-D mode to investigate inlet circumferential distortion. Calculated operating maps for stage 35 with 120 degree distortion screens showed a loss in range and pressure rise. Unsteady calculations showed rotating stall with two part-span stall cells. The paper describes the body force formulation in detail, examines the computed results, and concludes with observations about the code.

  3. Influence of seismic anisotropy on the cross correlation tensor: numerical investigations

    NASA Astrophysics Data System (ADS)

    Saade, M.; Montagner, J. P.; Roux, P.; Cupillard, P.; Durand, S.; Brenguier, F.

    2015-05-01

    Temporal changes in seismic anisotropy can be interpreted as variations in the orientation of cracks in seismogenic zones, and thus as variations in the stress field. Such temporal changes have been observed in seismogenic zones before and after earthquakes, although they are still not well understood. In this study, we investigate the azimuthal polarization of surface waves in anisotropic media with respect to the orientation of anisotropy, from a numerical point of view. This technique is based on the observation of the signature of anisotropy on the nine-component cross-correlation tensor (CCT) computed from seismic ambient noise recorded on pairs of three-component sensors. If noise sources are spatially distributed in a homogeneous medium, the CCT allows the reconstruction of the surface wave Green's tensor between the station pairs. In a homogeneous, isotropic medium, four off-diagonal terms of the surface wave Green's tensor are null, but not in an anisotropic medium. This technique is applied to three-component synthetic seismograms computed in a transversely isotropic medium with a horizontal symmetry axis, using a spectral element code. The CCT is computed between each pair of stations and then rotated, to approximate the surface wave Green's tensor by minimizing the off-diagonal components. This procedure allows the calculation of the azimuthal variation of quasi-Rayleigh and quasi-Love waves. In an anisotropic medium, in some cases, the azimuth of seismic anisotropy can induce a large variation in the horizontal polarization of surface waves. This variation depends on the relative angle between a pair of stations and the direction of anisotropy, the amplitude of the anisotropy, the frequency band of the signal, and the depth of the anisotropic layer.
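
    The rotation-and-minimization step can be sketched in a few lines: rotate the 3 × 3 × Nt cross-correlation tensor about the vertical axis and pick the azimuth that minimizes the off-diagonal energy. The synthetic tensor below is a stand-in built to be diagonal in a frame rotated by 25°, and the grid search recovers that angle (modulo the 90° ambiguity of the horizontal components).

        import numpy as np

        def rotation(theta):
            """Rotate the horizontal (N, E) components; Z is unchanged."""
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

        def best_rotation(cct, n_angles=361):
            """Azimuth minimizing off-diagonal energy of a 3 x 3 x Nt tensor."""
            off = ~np.eye(3, dtype=bool)

            def cost(theta):
                r = rotation(theta)
                rotated = np.einsum("ij,jkt,lk->ilt", r, cct, r)   # R C R^T per lag
                return (rotated[off] ** 2).sum()

            thetas = np.linspace(0.0, np.pi, n_angles)
            return thetas[int(np.argmin([cost(t) for t in thetas]))]

        # Stand-in CCT: diagonal in a frame rotated by 25 degrees.
        rng = np.random.default_rng(1)
        diag = np.zeros((3, 3, 200))
        diag[[0, 1, 2], [0, 1, 2], :] = rng.standard_normal((3, 200))
        r0 = rotation(np.radians(25.0))
        cct = np.einsum("ij,jkt,lk->ilt", r0.T, diag, r0.T)
        print(np.degrees(best_rotation(cct)))   # ~25 (up to the 90-degree ambiguity)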

  4. Space Station UCS antenna pattern computation and measurement. [UHF Communication Subsystem

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Lu, Ba P.; Johnson, Larry A.; Fournet, Jon S.; Panneton, Robert J.; Ngo, John D.; Eggers, Donald S.; Arndt, G. D.

    1993-01-01

    The purpose of this paper is to analyze the interference to the Space Station Ultrahigh Frequency (UHF) Communication Subsystem (UCS) antenna radiation pattern due to its environment, the Space Station structure. A hybrid Computational Electromagnetics (CEM) technique was applied in this study. The antenna was modeled using the Method of Moments (MOM), and the radiation patterns were computed using the Uniform Geometrical Theory of Diffraction (UTD), in which the effects of the reflected and diffracted fields from surfaces, edges, and vertices of the Space Station structures were included. In order to validate the CEM techniques and to provide confidence in the computer-generated results, a comparison with experimental measurements was made for a 1/15-scale Space Station mockup. Good agreement between experimental and computed results was obtained. The computed results using the CEM techniques for Space Station UCS antenna pattern predictions have thus been validated.

  5. Comparison of liquid rocket engine base region heat flux computations using three turbulence models

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Griffith, Dwaine O., II; Prendergast, Maurice J.; Seaford, C. M.

    1993-01-01

    The flow in the base region of launch vehicles is characterized by flow separation, flow reversals, and reattachment. Computation of the convective heat flux in the base region and on the nozzle external surface of Space Shuttle Main Engine and Space Transportation Main Engine (STME) is an important part of defining base region thermal environments. Several turbulence models were incorporated in a CFD code and validated for flow and heat transfer computations in the separated and reattaching regions associated with subsonic and supersonic flows over backward facing steps. Heat flux computations in the base region of a single STME engine and a single S1C engine were performed using three different wall functions as well as a renormalization-group based k-epsilon model. With the very limited data available, the computed values are seen to be of the right order of magnitude. Based on the validation comparisons, it is concluded that all the turbulence models studied have predicted the reattachment location and the velocity profiles at various axial stations downstream of the step very well.
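
    The wall-function approach mentioned above can be illustrated with the standard log law, solved for the friction velocity by fixed-point iteration; wall heat flux then follows from a Reynolds-analogy relation. The constants and first-cell values below are generic illustrations, not those of the SSME or STME computations.

        import math

        KAPPA, E = 0.41, 9.8                       # standard log-law constants

        def wall_shear(u_p, y_p, nu, rho):
            """Friction velocity and wall shear from the log law u+ = ln(E y+)/kappa."""
            u_tau = math.sqrt(nu * u_p / y_p)      # viscous-sublayer first guess
            for _ in range(100):
                y_plus = y_p * u_tau / nu
                if y_plus < 11.0:                  # laminar sublayer: u+ = y+
                    return math.sqrt(nu * u_p / y_p), rho * nu * u_p / y_p
                u_tau = u_p * KAPPA / math.log(E * y_plus)
            return u_tau, rho * u_tau ** 2

        # First-cell values for an air-like flow (illustrative numbers only).
        u_tau, tau_w = wall_shear(u_p=30.0, y_p=1.0e-3, nu=1.5e-5, rho=1.2)
        print(f"u_tau = {u_tau:.3f} m/s, tau_w = {tau_w:.2f} Pa")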

  6. Empirical transfer functions for stations in the Central California seismological network

    USGS Publications Warehouse

    Bakun, W.H.; Dratler, Jay

    1976-01-01

    A sequence of calibration signals composed of a station identification code, a transient from the release of the seismometer mass at rest from a known displacement from the equilibrium position, and a transient from a known step in voltage applied to the amplifier input is generated by the automatic daily calibration system (ADCS) now operational in the U.S. Geological Survey central California seismographic network. Documentation of a sequence of interactive programs to compute, from the calibration data, the complex transfer functions for the seismographic system (ground motion through digitizer), the electronics (amplifier through digitizer), and the seismometer alone is presented. The analysis uses the Fourier transform technique originally suggested by Espinosa et al. (1962); a minimal sketch of the approach follows below. Section I is a general description of seismographic calibration. Section II contrasts the 'Fourier transform' and 'least-squares' techniques for analyzing transient calibration signals. Theoretical considerations for the Fourier transform technique used here are described in Section III. Section IV is a detailed description of the sequence of calibration signals generated by the ADCS. Section V is a brief 'cookbook description' of the calibration programs; Section VI contains a detailed sample program execution. Section VII suggests uses of the resultant empirical transfer functions. Supplemental interactive programs that produce smooth response functions, suitable for reducing seismic data to ground motion, are also documented in Section VII. Appendices A and B contain complete listings of the Fortran source codes, while Appendix C is an update containing preliminary results obtained from an analysis of some of the calibration signals from stations in the seismographic network near Oroville, California.
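
    The promised sketch of the Fourier transform technique: the empirical transfer function is the ratio of the output spectrum to the input spectrum, here regularized and demonstrated on a synthetic voltage step driving a single-pole system. The time constant and signals are invented for illustration and do not correspond to the network instruments.

        import numpy as np

        def empirical_transfer(inp, out, dt, eps=1e-12):
            """Empirical transfer function H(f) = FFT(output) / FFT(input)."""
            f = np.fft.rfftfreq(len(inp), dt)
            I, O = np.fft.rfft(inp), np.fft.rfft(out)
            return f, O * np.conj(I) / (np.abs(I) ** 2 + eps)   # regularized ratio

        # Synthetic calibration: a voltage step driving a single-pole (RC-like) system.
        dt, n, tau = 0.01, 4096, 0.2
        t = np.arange(n) * dt
        step = np.where(t >= 1.0, 1.0, 0.0)
        resp = np.where(t >= 1.0, 1.0 - np.exp(-(t - 1.0) / tau), 0.0)

        f, H = empirical_transfer(step, resp, dt)
        # Away from f = 0 the estimate should track |H| = 1/sqrt(1 + (2 pi f tau)^2).
        print(abs(H[40]), 1.0 / np.sqrt(1.0 + (2.0 * np.pi * f[40] * tau) ** 2))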

  7. Description of a MIL-STD-1553B Data Bus Ada Driver for the LeRC EPS Testbed

    NASA Technical Reports Server (NTRS)

    Mackin, Michael A.

    1995-01-01

    This document describes the software designed to provide communication between control computers in the NASA Lewis Research Center Electrical Power System Testbed using MIL-STD-1553B. The software drivers are coded in the Ada programming language and were developed on an MSDOS-based computer workstation. The Electrical Power System (EPS) Testbed is a reduced-scale prototype space station electrical power system. The power system manages and distributes electrical power from the sources (batteries or photovoltaic arrays) to the end-user loads. The primary electrical system operates at 120 volts DC, and the secondary system operates at 28 volts DC. The devices which direct the flow of electrical power are controlled by a network of six control computers. Data and control messages are passed between the computers using the MIL-STD-1553B network. One of the computers, the Power Management Controller (PMC), controls the primary power distribution, and another, the Load Management Controller (LMC), controls the secondary power distribution. Each of these computers communicates with two other computers which act as subsidiary controllers. These subsidiary controllers are, in turn, connected to the devices which directly control the flow of electrical power.

  8. Trash Diverter Orientation Angle Optimization at Run-Off River Type Hydro-power Plant using CFD

    NASA Astrophysics Data System (ADS)

    Munisamy, Kannan M.; Kamal, Ahmad; Shuaib, Norshah Hafeez; Yusoff, Mohd. Zamri; Hasini, Hasril; Rashid, Azri Zainol; Thangaraju, Savithry K.; Hamid, Hazha

    2010-06-01

    Tenom Pangi Hydro Power Station in Tenom, Sabah suffers from poor river water quality with large amounts of suspended trash. This problem necessitates a trash diverter to divert the trash away from the intake region. Previously, a trash diverter (called Trash Diverter I) was installed at the site but survived only a short period of time due to an impact with a huge log during a heavy flood. In the current project, a second trash diverter structure (called Trash Diverter II) is designed with improved features compared to Trash Diverter I. A Computational Fluid Dynamics (CFD) analysis is performed to evaluate the interaction of the river flow with the trash diverter; CFD is a numerical approach to solving the fluid flow profile for different inlet conditions. In this work, the river geometry is modeled using the commercial CFD code FLUENT®. The computational model consists of the Reynolds-Averaged Navier-Stokes (RANS) equations coupled with other related models using the properties of the fluids under investigation. The model is validated with site measurements taken at Tenom Pangi Hydro Power Station. Different operating conditions of river flow rate and weir opening are also considered. The optimum orientation angle is determined in this simulation, and the data will be used further for 3D simulation and structural analysis.

  9. Software for storage and processing coded messages for the international exchange of meteorological information

    NASA Astrophysics Data System (ADS)

    Popov, V. N.; Botygin, I. A.; Kolochev, A. S.

    2017-01-01

    The approach represents the data of international codes for the exchange of meteorological information using a metadescription formalism associated with certain categories of resources. Development of the metadata components was based on an analysis of data from surface meteorological observations, atmospheric vertical sounding, atmospheric wind sounding, weather radar observations, observations from satellites, and others. A common set of metadata components was formed, including classes, divisions, and groups for a generalized description of the meteorological data. The structure and content of the main components of a generalized metadescription are presented in detail using the example of meteorological observations from land and sea stations. The functional structure of a distributed computing system is described that allows organizing the storage of large volumes of meteorological data for further processing in the solution of problems of the analysis and forecasting of climatic processes.

  10. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience ran only on an Xbox 360. One of the desires was to take this experience and turn it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences; a sketch of one common approach is given below. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
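
    A common way to procedurally generate cloud textures, and a plausible starting point for the algorithm described above, is fractal Brownian motion: summing several octaves of smooth value noise at doubling frequencies and halving amplitudes. The sketch below is a generic NumPy illustration; the actual AGEA/Unity implementation is not described in the record, and all parameters here are arbitrary.

        import numpy as np

        def value_noise(shape, freq, rng):
            """Bilinear interpolation of a coarse random lattice -> smooth noise."""
            coarse = rng.random((freq + 1, freq + 1))
            ys = np.linspace(0, freq, shape[0], endpoint=False)
            xs = np.linspace(0, freq, shape[1], endpoint=False)
            y0, x0 = ys.astype(int), xs.astype(int)
            ty, tx = (ys - y0)[:, None], (xs - x0)[None, :]
            c00 = coarse[np.ix_(y0, x0)];     c01 = coarse[np.ix_(y0, x0 + 1)]
            c10 = coarse[np.ix_(y0 + 1, x0)]; c11 = coarse[np.ix_(y0 + 1, x0 + 1)]
            top = c00 * (1 - tx) + c01 * tx
            bot = c10 * (1 - tx) + c11 * tx
            return top * (1 - ty) + bot * ty

        def clouds(shape=(256, 256), octaves=5, seed=7):
            """Fractal Brownian motion: noise octaves with halving amplitude."""
            rng = np.random.default_rng(seed)
            img = np.zeros(shape)
            amp, freq, total = 1.0, 4, 0.0
            for _ in range(octaves):
                img += amp * value_noise(shape, freq, rng)
                total += amp
                amp *= 0.5
                freq *= 2
            img /= total
            return np.clip((img - 0.45) * 3.0, 0.0, 1.0)  # threshold into clouds

        density = clouds()
        print(density.shape, float(density.min()), float(density.max()))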

  11. Identification and Classification of Orthogonal Frequency Division Multiple Access (OFDMA) Signals Used in Next Generation Wireless Systems

    DTIC Science & Technology

    2012-03-01

    … advanced antenna systems; AMC: adaptive modulation and coding; AWGN: additive white Gaussian noise; BPSK: binary phase shift keying; BS: base station; BTC: … QAM-16, and QAM-64, and coding types include convolutional coding (CC), convolutional turbo coding (CTC), block turbo coding (BTC), zero-terminating …

  12. Development of the NASA/FLAGRO computer program for analysis of airframe structures

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Newman, J. C., Jr.

    1994-01-01

    The NASA/FLAGRO (NASGRO) computer program was developed for fracture control analysis of space hardware and is currently the standard computer code in NASA, the U.S. Air Force, and the European Space Agency (ESA) for this purpose. The significant attributes of the NASGRO program are the numerous crack case solutions, the large materials file, the improved growth rate equation based on crack closure theory, and the user-friendly promptive input features. In support of the National Aging Aircraft Research Program (NAARP), NASGRO is being further developed to provide advanced state-of-the-art capability for damage tolerance and crack growth analysis of aircraft structural problems, including mechanical systems and engines. The project currently involves a cooperative development effort by NASA, the FAA, and ESA. The primary tasks underway are the incorporation of advanced methodology for crack growth rate retardation resulting from spectrum loading and improved analysis for determining crack instability. Also, the current weight function solutions in NASGRO for nonlinear stress gradient problems are being extended to more crack cases, and the 2-D boundary integral routine for stress analysis and stress-intensity factor solutions is being extended to 3-D problems. Lastly, effort is underway to enhance the program to operate on personal computers and workstations in a Windows environment. Because of the increasing and already wide usage of NASGRO, the code offers an excellent mechanism for technology transfer of new fatigue and fracture mechanics capabilities developed within NAARP.

  13. A Study of the Behavior of Children in a Preschool Equipped with Computers.

    ERIC Educational Resources Information Center

    Klinzing, Dene G.

    A study was conducted: (1) to compare the popularity of computer stations with nine other activity stations; (2) to determine the differences in the type of play displayed by the children in preschool and note the type of play displayed at the computer stations versus the other activity stations; (3) to determine whether the preschool activities,…

  14. 47 CFR 97.207 - Space station.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....05 GHz segments. (d) A space station may automatically retransmit the radio signals of Earth stations... transmissions may consist of specially coded messages intended to facilitate communications or related to the... remaining source of stored energy, or through other equivalent procedures specifically disclosed in the...

  15. 47 CFR 97.207 - Space station.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....05 GHz segments. (d) A space station may automatically retransmit the radio signals of Earth stations... transmissions may consist of specially coded messages intended to facilitate communications or related to the... remaining source of stored energy, or through other equivalent procedures specifically disclosed in the...

  16. 47 CFR 97.207 - Space station.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....05 GHz segments. (d) A space station may automatically retransmit the radio signals of Earth stations... transmissions may consist of specially coded messages intended to facilitate communications or related to the... remaining source of stored energy, or through other equivalent procedures specifically disclosed in the...

  17. 47 CFR 97.207 - Space station.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....05 GHz segments. (d) A space station may automatically retransmit the radio signals of Earth stations... transmissions may consist of specially coded messages intended to facilitate communications or related to the... remaining source of stored energy, or through other equivalent procedures specifically disclosed in the...

  18. 47 CFR 97.207 - Space station.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....05 GHz segments. (d) A space station may automatically retransmit the radio signals of Earth stations... transmissions may consist of specially coded messages intended to facilitate communications or related to the... remaining source of stored energy, or through other equivalent procedures specifically disclosed in the...

  19. Digital optical computer II

    NASA Astrophysics Data System (ADS)

    Guilfoyle, Peter S.; Stone, Richard V.

    1991-12-01

    OptiComp is currently completing a 32-bit, fully programmable digital optical computer (DOC II) that is designed to operate in a UNIX environment running RISC microcode. OptiComp's DOC II architecture is focused toward parallel microcode implementation where data is input in a dual-rail format. By exploiting the physical principles inherent to optics (speed and low power consumption), an architectural balance of optical interconnects and software code efficiency can be achieved, including high fan-in and fan-out. OptiComp's DOC II program is jointly sponsored by the Office of Naval Research (ONR), the Strategic Defense Initiative Office (SDIO), the NASA space station group, and Rome Laboratory (USAF). This paper not only describes the motivational basis behind DOC II but also provides an optical overview and architectural summary of the device, which allows the emulation of any digital instruction set.

  20. Verification of BWR Turbine Skyshine Dose with the MCNP5 Code Based on an Experiment Made at SHIMANE Nuclear Power Station

    NASA Astrophysics Data System (ADS)

    Tayama, Ryuichi; Wakasugi, Kenichi; Kawanaka, Ikunori; Kadota, Yoshinobu; Murakami, Yasuhiro

    We measured the skyshine dose from turbine buildings at Shimane Nuclear Power Station Unit 1 (NS-1) and Unit 2 (NS-2), and then compared it with the dose calculated with the Monte Carlo transport code MCNP5. The skyshine dose values calculated with the MCNP5 code agreed with the experimental data within a factor of 2.8, when the roof of the turbine building was precisely modeled. We concluded that our MCNP5 calculation was valid for BWR turbine skyshine dose evaluation.

  1. Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bekar, Kursat B.; Ibrahim, Ahmad M.

    2017-05-01

    This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three-dimensional (3D)/two-dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) 2D deterministic code. The analysis with a proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.

  2. Coded mask telescopes for X-ray astronomy

    NASA Astrophysics Data System (ADS)

    Skinner, G. K.; Ponman, T. J.

    1987-04-01

    The principles of the coded mask technique are discussed, together with the methods of image reconstruction. The coded mask telescopes built at the University of Birmingham, including the SL 1501 coded mask X-ray telescope flown on the Skylark rocket and the Coded Mask Imaging Spectrometer (COMIS) projected for the Soviet space station Mir, are described. A diagram of a coded mask telescope and some designs for coded masks are included.
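
    The image reconstruction methods referred to above commonly reduce to cross-correlating the detector image with a decoding array derived from the mask. The sketch below demonstrates the principle with a random mask and a cyclic shadowing model; real instruments typically use uniformly redundant arrays for flatter sidelobes, and everything here is illustrative.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 64
        mask = (rng.random((n, n)) < 0.5).astype(float)   # half-open random mask
        sky = np.zeros((n, n))
        sky[20, 40], sky[45, 12] = 100.0, 60.0            # two point sources

        # Detector counts: each sky pixel casts a shifted mask shadow (cyclic model).
        detector = np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(mask)))
        detector += rng.poisson(5.0, (n, n))              # background counts

        # Cross-correlation decoding with the balanced array G = 2M - 1.
        G = 2.0 * mask - 1.0
        recon = np.real(np.fft.ifft2(np.fft.fft2(detector) * np.conj(np.fft.fft2(G))))
        print(np.unravel_index(np.argmax(recon), recon.shape))   # ~(20, 40)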

  3. Application of Adjoint Method and Spectral-Element Method to Tomographic Inversion of Regional Seismological Structure Beneath Japanese Islands

    NASA Astrophysics Data System (ADS)

    Tsuboi, S.; Miyoshi, T.; Obayashi, M.; Tono, Y.; Ando, K.

    2014-12-01

    Recent progress in large-scale computing using waveform modeling techniques and high-performance computing facilities has demonstrated the possibility of performing full-waveform inversion of three-dimensional (3D) seismological structure inside the Earth. We apply the adjoint method (Liu and Tromp, 2006) to obtain the 3D structure beneath the Japanese Islands. First we implemented the Spectral-Element Method on the K computer in Kobe, Japan. We optimized SPECFEM3D_GLOBE (Komatitsch and Tromp, 2002) by using OpenMP so that the code fits the hybrid architecture of the K computer. We could use 82,134 nodes of the K computer (657,072 cores) to compute synthetic waveforms with about 1 s accuracy for a realistic 3D Earth model, and the performance was 1.2 PFLOPS. We use this optimized SPECFEM3D_GLOBE code, take one chunk around the Japanese Islands from the global mesh, and compute synthetic seismograms with an accuracy of about 10 s. We use the GAP-P2 mantle tomography model (Obayashi et al., 2009) as an initial 3D model and use as many broadband seismic stations available in this region as possible to perform the inversion. We then use the time windows for body waves and surface waves to compute adjoint sources and calculate adjoint kernels for the seismic structure. We have performed several iterations and obtained an improved 3D structure beneath the Japanese Islands. The result demonstrates that waveform misfits between observed and theoretical seismograms improve as the iteration proceeds. We are now preparing to use much shorter periods in our synthetic waveform computation and to obtain seismic structure for basin-scale models, such as the Kanto basin, where there is a dense seismic network and high seismic activity. Acknowledgements: This research was partly supported by the MEXT Strategic Program for Innovative Research. We used F-net seismograms of the National Research Institute for Earth Science and Disaster Prevention.

  4. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data, and computational networks now allow the development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational resources and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially for helping to analyze EarthScope's USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect that includes geological models accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, the number of time steps to compute, and the number of stations. By enabling portal-based access to such a computational environment, coupled with its dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and to speed up the scientific discovery process.

  5. Probabilistic structural analysis of a truss typical for space station

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.

    1990-01-01

    A three-bay cantilever space truss is probabilistically evaluated using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) to identify and quantify the uncertainties and respective sensitivities associated with corresponding uncertainties in the primitive variables (structural, material, and loads parameters) that define the truss. The distribution of each of these primitive variables is described in terms of one of several available distributions, such as the Weibull, exponential, normal, and log-normal. The cumulative distribution functions (CDFs) for the response functions considered and the sensitivities associated with the primitive variables for a given response are investigated. These sensitivities help in determining the dominant primitive variables for that response.
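
    The kind of probabilistic evaluation described can be sketched with plain Monte Carlo: sample the primitive variables from their assumed distributions, evaluate a response function, and read off the empirical CDF and rank-correlation sensitivities. The response surrogate and all distribution parameters below are invented stand-ins, not the NESSUS truss model or its inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        # Primitive variables (distributions are illustrative, not NESSUS inputs).
        E = rng.lognormal(np.log(70e9), 0.05, N)      # Young's modulus, Pa
        A = rng.normal(4.0e-4, 2.0e-5, N)             # member cross-section, m^2
        P = rng.weibull(6.0, N) * 1.2e4               # applied load, N
        L = rng.normal(15.0, 0.05, N)                 # cantilever length, m

        # Stand-in response: tip deflection of an equivalent cantilever,
        # delta = P L^3 / (3 E I), with I ~ A r^2 for an assumed section radius.
        r = 0.5
        delta = P * L ** 3 / (3.0 * E * A * r ** 2)

        # Empirical CDF of the response and rank-correlation sensitivities.
        cdf_x = np.sort(delta)
        cdf_y = np.arange(1, N + 1) / N
        ranks = lambda x: np.argsort(np.argsort(x))
        for name, x in (("E", E), ("A", A), ("P", P), ("L", L)):
            rho = np.corrcoef(ranks(x), ranks(delta))[0, 1]
            print(f"rank correlation of delta with {name}: {rho:+.2f}")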

  6. The use of automatic programming techniques for fault tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Wild, C.

    1985-01-01

    It is conjectured that the production of software for ultra-reliable computing systems, such as those required by the Space Station, aircraft, nuclear power plants, and the like, will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault-tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection, as well as the automatic generation of assertions and test cases from abstract data type specifications, are outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge-based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  7. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the deterministic method optimized design.

  8. The expanded role of computers in Space Station Freedom real-time operations

    NASA Technical Reports Server (NTRS)

    Crawford, R. Paul; Cannon, Kathleen V.

    1990-01-01

    The challenges that NASA and its international partners face in their real-time operation of Space Station Freedom necessitate an increased role on the part of computers. In building the operational concepts concerning the role of the computer, the Space Station program is using lessons-learned experience from past programs, knowledge of the needs of future space programs, and technical advances in the computer industry. The computer is expected to contribute most significantly in real-time operations by forming a versatile operating architecture, a responsive operations tool set, and an environment that promotes effective and efficient utilization of Space Station Freedom resources.

  9. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  10. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.

    1995-12-31

    In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western codes," VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for safety assessment computations of RBMK-type reactors. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA), performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These kinetics computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole, with imitation of control and protection system (CPS) control movement in the core.

  11. Analysis and Design of Crew Sleep Station for ISS

    NASA Technical Reports Server (NTRS)

    Keener, John F.; Paul, Thomas; Eckhardt, Bradley; Smith, Fredrick

    2002-01-01

    This paper details the analysis and design of the Temporary Sleep Station (TeSS) environmental control system for the International Space Station (ISS). The TeSS will provide crewmembers with a private and personal space to accommodate sleeping, donning and doffing of clothing, personal communication, and performance of recreational activities. The need for privacy to accommodate these activities requires adequate ventilation inside the TeSS. This study considers whether temperature, carbon dioxide, and humidity within the TeSS remain within crew comfort and safety levels for various expected operating scenarios. Evaluation of these scenarios required the use and integration of various simulation codes. An approach was adopted for this study whereby results from a particular code were integrated with other codes when necessary. Computational Fluid Dynamics (CFD) methods were used to evaluate the flow field inside the TeSS, from which local gradients for temperature, velocity, and species concentrations such as CO2 could be determined. A model of the TeSS, containing a human as well as equipment such as a laptop computer, was developed in FLUENT, a finite-volume code. Other factors, such as detailed analysis of the heat transfer through the structure, radiation, and air circulation from the TeSS to the US Laboratory Aisle, where the TeSS is housed, were considered in the model. A complementary model was developed in G189A, a code which has been used by NASA/JSC for environmental control systems analyses since the Apollo program. Boundary conditions were exchanged between the FLUENT and G189A TeSS models: G189A provides human respiration rates to the FLUENT model, while the FLUENT model provides local convective heat transfer coefficients to the G189A model. An additional benefit of using both a systems simulation and a CFD model is the capability to verify the results of each model by comparison to the results of the other. The G189A and FLUENT models were used to evaluate various ventilation designs for the TeSS over a range of operating conditions with varying crew metabolic load, equipment operating modes, ventilation flow rates, and with the TeSS doors open and closed. Results from the study were instrumental in the optimization of a design for the TeSS ventilation hardware. A special case was considered where failure of the TeSS ventilation system occurred. In this case, a study was conducted in order to determine the time required for the CO2 concentration inside the TeSS to increase to ISS limit values under transient conditions. A lumped-capacitance code, SINDA-FLUINT, was used in this case to provide accurate predictions of the human reaction to the TeSS cabin conditions, including core and skin temperatures and body heat storage. A simple two-dimensional CFD model of a crewmember inside the TeSS was developed in FLUENT in order to determine the volume envelope of the respired air from the human, which maintained a minimum velocity profile. This volume was then used in the SINDA-FLUINT model to facilitate the calculations of CO2 concentrations, dry bulb temperatures, and humidity levels inside the TeSS.
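
    The FLUENT/G189A exchange described above is, in essence, a fixed-point iteration between two codes. A minimal sketch follows (stand-in functions with hypothetical response curves, not the real models; the actual exchange involved full boundary-condition fields):

        # Stand-ins for the two codes (hypothetical response curves, not the
        # real G189A and FLUENT models).
        def systems_model(h_conv):
            """Return a respiration rate given a convective coefficient (G189A role)."""
            return 0.01 + 0.001 * h_conv

        def cfd_model(respiration_rate):
            """Return a local convective coefficient given respiration (FLUENT role)."""
            return 5.0 + 50.0 * respiration_rate

        h_conv, tol = 5.0, 1e-6
        for iteration in range(100):                 # fixed-point boundary exchange
            h_next = cfd_model(systems_model(h_conv))
            if abs(h_next - h_conv) < tol:
                break
            h_conv = h_next
        print(f"converged in {iteration + 1} exchanges: h = {h_conv:.4f}")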

  12. 78 FR 44561 - Ortho-Phthalaldehyde; Receipt of Application for Emergency Exemption, Solicitation of Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... code 111). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). B. What should I consider as I prepare my comments for EPA? 1. Submitting CBI... coolant additives. Non-use of OPA in the requested manner would leave NASA's International Space Station...

  13. Miscellaneous streamflow measurements in the State of Washington, January 1961 to September 1985

    USGS Publications Warehouse

    Williams, John R.; Riis, S.A.

    1989-01-01

    This report is a compilation of previously published miscellaneous streamflow measurements made in Washington State by the U.S. Geological Survey between January 1961 and September 1985. It is a supplement to a volume of similar data for the period 1890 to January 1961. The data include stream name and stream to which it is tributary, latitude and longitude, county code, hydrologic unit code, land-line location, drainage area, and measurement dates and discharges. In general, the data sites are not at gaging stations; however, some data are given for gaging station sites during periods when the stations were not in operation. All data in this report have been entered into a computerized data base that includes the data for the period 1890 to January 1961. The data can be retrieved in a variety of ways, such as by county, by hydrologic unit code, by river basin, or by size of drainage area. (USGS)

  14. Computational/Experimental Aeroheating Predictions for X-33 Phase 2 Vehicle

    NASA Technical Reports Server (NTRS)

    Hamilton, H. Harris, II; Weilmuenster, K. James; Horvath, Thomas J.; Berry, Scott A.

    1998-01-01

    Laminar and turbulent heating-rate calculations from an "engineering" code and laminar calculations from a "benchmark" Navier-Stokes code are compared with experimental wind-tunnel data obtained on several candidate configurations for the X-33 Phase 2 flight vehicle. The experimental data were obtained at a Mach number of 6 and a freestream Reynolds number ranging from 1 to 8 x 10^6/ft. Comparisons are presented along the windward symmetry plane and in a circumferential direction around the body at several axial stations at angles of attack from 20 to 40 deg. The experimental results include both laminar and turbulent flow. At the highest angle of attack, some of the measured heating data exhibited a "non-laminar" behavior which caused the heating to increase above the laminar level long before "classical" transition to turbulent flow was observed. This trend was not observed at the lower angles of attack. When the flow was laminar, both codes predicted the heating along the windward symmetry plane reasonably well but under-predicted the heating in the chine region. When the flow was turbulent the LATCH code accurately predicted the measured heating rates. Both codes were used to calculate heating rates over the X-33 vehicle at the peak heating point on the design trajectory, and they were found to be in very good agreement over most of the vehicle windward surface.

  15. Water-quality, streamflow, and meteorological data for the Tualatin River Basin, Oregon, 1991-93

    USGS Publications Warehouse

    Doyle, M.C.; Caldwell, J.M.

    1996-01-01

    Surface-water-quality data, ground-water-quality data, streamflow data, field measurements, aquatic-biology data, meteorological data, and quality-assurance data were collected in the Tualatin River Basin from 1991 to 1993 by the U.S. Geological Survey (USGS) and the Unified Sewerage Agency of Washington County, Oregon (USA). The data from that study, which are part of this report, are presented in American Standard Code for Information Interchange (ASCII) format in subject-specific data files on a Compact Disk-Read Only Memory (CD-ROM). The text of this report describes the objectives of the study, the location of sampling sites, sample-collection and processing techniques, equipment used, laboratory analytical methods, and quality-assurance procedures. The data files on CD-ROM contain the analytical results of water samples collected in the Tualatin River Basin, streamflow measurements of the main-stem Tualatin River and its major tributaries, flow data from the USA wastewater-treatment plants, flow data from stations that divert water from the main-stem Tualatin River, aquatic-biology data, and meteorological data from the Tualatin Valley Irrigation District (TVID) Agrimet Weather Station located in Verboort, Oregon. Specific information regarding the contents of each data file is given in the text. The data files use a series of letter codes that distinguish each line of data. These codes are defined in data tables accompanying the text. Presenting data on CD-ROM offers several advantages: (1) the data can be accessed easily and manipulated by computers, (2) the data can be distributed readily over computer networks, and (3) the data may be more easily transported and stored than a large printed report. These data have been used by the USGS to (1) identify the sources, transport, and fate of nutrients in the Tualatin River Basin, (2) quantify relations among nutrient loads, algal growth, low dissolved-oxygen concentrations, and high pH, and (3) develop and calibrate a water-quality model that allows managers to test options for alleviating water-quality problems.

  16. A Working Model for the System Alumina-Magnesia.

    DTIC Science & Technology

    1983-05-01

    Several regions in the resulting diagram appear rather uncertain: the liquidus ... (The remainder of this record consists of a report distribution list.)

  17. Development and application of the GIM code for the Cyber 203 computer

    NASA Technical Reports Server (NTRS)

    Stainaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.

    1982-01-01

    The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding, and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code was used to compute a number of example cases. Turbulence models, both algebraic and differential-equation types, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.

  18. Reduction and coding of synthetic aperture radar data with Fourier transforms

    NASA Technical Reports Server (NTRS)

    Tilley, David G.

    1995-01-01

    Recently, aboard the Space Radar Laboratory (SRL), the two roles of Fourier Transforms for ocean image synthesis and surface wave analysis have been implemented with a dedicated radar processor to significantly reduce Synthetic Aperture Radar (SAR) ocean data before transmission to the ground. The objective was to archive the SAR image spectrum, rather than the SAR image itself, to reduce data volume and capture the essential descriptors of the surface wave field. SAR signal data are usually sampled and coded in the time domain for transmission to the ground, where Fourier Transforms are applied both to individual radar pulses and to long sequences of radar pulses to form two-dimensional images. High resolution images of the ocean often contain no striking features, and subtle image modulations by wind generated surface waves are only apparent when large ocean regions are studied, with Fourier transforms, to reveal periodic patterns created by wind stress over the surface wave field. Major ocean currents and atmospheric instability in coastal environments are apparent as large scale modulations of SAR imagery. This paper explores the possibility of computing complex Fourier spectrum codes representing SAR images, transmitting the coded spectra to Earth for data archives and creating scenes of surface wave signatures and air-sea interactions via inverse Fourier transformations with ground station processors.
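
    The core reduction idea, archiving the two-dimensional image spectrum and reconstructing scenes by inverse transformation, can be sketched in a few lines of NumPy (illustrative only; the SRL processor operated on raw radar pulses rather than a ready-made image array):

        import numpy as np

        image = np.random.rand(512, 512)            # stand-in for a SAR ocean scene
        spectrum = np.fft.fft2(image)               # the archived product

        # Keep only the strongest spectral components to reduce data volume.
        mag = np.abs(spectrum)
        reduced = np.where(mag > np.percentile(mag, 99.0), spectrum, 0.0)

        scene = np.fft.ifft2(reduced).real          # ground-station reconstruction
        print("untruncated round trip exact:",
              np.allclose(image, np.fft.ifft2(spectrum).real))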

  19. A two-dimensional hydrodynamic model of the St. Clair-Detroit River waterway in the Great Lakes basin

    USGS Publications Warehouse

    Holtschlag, David J.; Koschik, John A.

    2002-01-01

    The St. Clair–Detroit River Waterway connects Lake Huron with Lake Erie in the Great Lakes basin to form part of the international boundary between the United States and Canada. A two-dimensional hydrodynamic model is developed to compute flow velocities and water levels as part of a source-water assessment of public water intakes. The model, which uses the generalized finite-element code RMA2, discretizes the waterway into a mesh formed by 13,783 quadratic elements defined by 42,936 nodes. Seven steady-state scenarios are used to calibrate the model by adjusting parameters associated with channel roughness in 25 material zones in sub-areas of the waterway. An inverse modeling code is used to systematically adjust model parameters and to determine their associated uncertainty by use of nonlinear regression. Calibration results show close agreement between simulated and expected flows in major channels and water levels at gaging stations. Sensitivity analyses describe the amount of information available to estimate individual model parameters, and quantify the utility of flow measurements at selected cross sections and water-level measurements at gaging stations. Further data collection, model calibration analysis, and grid refinements are planned to assess and enhance two-dimensional flow simulation capabilities describing the horizontal flow distributions in the St. Clair and Detroit Rivers and circulation patterns in Lake St. Clair.
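
    The calibration step, adjusting roughness parameters until simulated flows match gaged flows by nonlinear regression, can be sketched as a least-squares fit (a hypothetical stand-in for the hydrodynamic model and synthetic observations; the study itself used a dedicated inverse modeling code on the RMA2 mesh):

        import numpy as np
        from scipy.optimize import least_squares

        observed = np.array([1250.0, 980.0, 430.0])    # gaged flows (m^3/s), synthetic

        def simulate_flows(n):
            """Stand-in for a hydrodynamic run: simulated flows at three gaged
            sections for a vector of Manning's n values (hypothetical response)."""
            return np.array([1500.0 * np.exp(-8.0 * n[0]),
                             1200.0 * np.exp(-7.0 * n[1]),
                             600.0 * np.exp(-5.0 * (n[0] + n[1]))])

        fit = least_squares(lambda n: simulate_flows(n) - observed,
                            x0=[0.02, 0.02], bounds=(0.01, 0.1))
        print("calibrated roughness:", fit.x)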

  20. Space Communications Emulation Facility

    NASA Technical Reports Server (NTRS)

    Hill, Chante A.

    2004-01-01

    Establishing space communication between ground facilities and other satellites is a painstaking task that requires many precise calculations dealing with relay time, atmospheric conditions, and satellite positions, to name a few. The Space Communications Emulation Facility (SCEF) team here at NASA is developing a facility that will approximately emulate the conditions in space that impact space communication. The emulation facility is comprised of a 32-node distributed cluster of computers, each node representing a satellite or ground station. The objective of the satellites is to observe the topography of the Earth (water, vegetation, land, and ice) and relay this information back to the ground stations. Software originally designed by the University of Kansas, called the Emulation Manager, controls the interaction of the satellites and ground stations, as well as handling the recording of data. The Emulation Manager is installed on a Linux operating system, employing both Java and C++ code. The emulation scenarios are written in Extensible Markup Language (XML). XML documents are designed to store, carry, and exchange data. With XML documents, data can be exchanged between incompatible systems, which makes XML ideal for this project because Linux, Mac, and Windows operating systems are all used. Unfortunately, XML documents cannot display data like HTML documents. Therefore, the SCEF team uses an XML Schema Definition (XSD), or simply a schema, to describe the structure of an XML document. Schemas are very important because they have the capability to validate the correctness of data, define restrictions on data, define data formats, and convert data between different data types, among other things. At this time, in order for the Emulation Manager to open and run an XML emulation scenario file, the user must first establish a link between the schema file and the directory under which the XML scenario files are saved. This procedure takes place on the command line of the Linux operating system. Once this link has been established, the Emulation Manager validates all the XML files in that directory against the schema file before the actual scenario is run. Using sophisticated commercial software called the Satellite Tool Kit (STK), installed on the Linux machine, the Emulation Manager is able to display the data and graphics generated by the execution of an XML emulation scenario file. The Emulation Manager software is written in Java. Since the SCEF project is in the developmental stage, the source code for this software is being modified to better fit the requirements of the SCEF project. Some parameters for the emulation are hard-coded, set at fixed values. Members of the SCEF team are altering the code to allow the user to choose the values of these hard-coded parameters by inserting a toolbar onto the preexisting GUI.
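
    The validate-before-run step the Emulation Manager performs can be sketched with the lxml library (hypothetical file and directory names, not the SCEF code itself):

        from pathlib import Path
        from lxml import etree

        schema = etree.XMLSchema(etree.parse("scenario.xsd"))     # hypothetical schema file

        for xml_path in sorted(Path("scenarios").glob("*.xml")):  # hypothetical directory
            doc = etree.parse(str(xml_path))
            if schema.validate(doc):
                print(f"{xml_path}: valid")
            else:
                print(f"{xml_path}: INVALID - {schema.error_log.last_error}")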

  1. Device 2F112 (F-14A WST (Weapon System Trainers)) Instructor Console Review.

    DTIC Science & Technology

    1983-12-01

    Cockpit Section-Trainee Station, b. Instructor Operator Station (IOS), c. Computer System, d. Wide-Angle Visual System (WAVS), e. Auxiliary Systems. The...relationship of the three stations can be seen in Figure 1. The stations will be reviewed in greater detail in following sections. The computer system...d) Printer 2) TRAINEE AREA 3) HYDRAULIC POWER ROOM 4) ELEC. POWER/AIR COMPRESSORS 5) COMPUTER/PERIPHERAL AREA Figure 1. Device 2F112 general layout

  2. Real-time realizations of the Bayesian Infrasonic Source Localization Method

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Arrowsmith, S.; Hofstetter, A.; Nippress, A.

    2015-12-01

    The Bayesian Infrasonic Source Localization method (BISL), introduced by Modrak et al. (2010) and upgraded by Marcillo et al. (2014), is designed for accurate estimation of the origin of atmospheric events at local, regional, and global scales by seismic and infrasonic networks and arrays. BISL is based on probabilistic models of the source-station infrasonic signal propagation time, picking time, and azimuth estimate, merged with prior knowledge about the celerity distribution. It requires, at each hypothetical source location, integration of the product of the corresponding source-station likelihood functions multiplied by a prior probability density function of celerity over the multivariate parameter space. The present BISL realization is a generally time-consuming procedure based on numerical integration. The computational scheme proposed here simplifies the target function so that the integrals are taken exactly and are represented via standard functions. This makes the procedure much faster and realizable in real time without practical loss of accuracy. The procedure, implemented as Python/Fortran code, demonstrates high performance on a set of model and real data.
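
    A grid-based sketch conveys the general idea (hypothetical station geometry, arrival times, and a fixed celerity standing in for the celerity prior; the actual BISL formulation merges travel-time, picking-time, and azimuth likelihoods and, in the proposed scheme, evaluates the integrals in closed form):

        import numpy as np

        stations = np.array([[0.0, 0.0], [120.0, 10.0], [60.0, 90.0]])  # km (hypothetical)
        arrivals = np.array([300.0, 110.0, 180.0])                      # s after origin time

        x = np.linspace(-50.0, 200.0, 251)
        y = np.linspace(-50.0, 150.0, 201)
        X, Y = np.meshgrid(x, y)

        celerity, sigma = 0.30, 15.0     # km/s and arrival-time std in s (hypothetical)
        log_post = np.zeros_like(X)
        for (sx, sy), t_obs in zip(stations, arrivals):
            t_pred = np.hypot(X - sx, Y - sy) / celerity
            log_post += -0.5 * ((t_obs - t_pred) / sigma) ** 2          # Gaussian likelihood

        iy, ix = np.unravel_index(np.argmax(log_post), log_post.shape)
        print(f"MAP source estimate: x = {x[ix]:.1f} km, y = {y[iy]:.1f} km")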

  3. The advanced software development workstation project

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  4. Parallel workflow manager for non-parallel bioinformatic applications to solve large-scale biological problems on a supercomputer.

    PubMed

    Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas

    2016-04-01

    Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes due to systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods and a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new software package, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface (MPI) is used to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. The mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper .
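
    The master-worker pattern the abstract describes, one process handing subtasks to idle workers that launch unmodified executables, can be sketched with mpi4py (a Python sketch of the pattern, not the authors' C++ implementation):

        # Sketch of a master-worker task farm in the spirit of mpiWrapper.
        import subprocess
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        READY, TASK = 1, 2            # message tags
        STOP = "STOP"

        if rank == 0:                 # task management and communication
            tasks = [f"echo subtask {i}" for i in range(100)]   # hypothetical commands
            status = MPI.Status()
            for msg in tasks + [STOP] * (size - 1):
                comm.recv(source=MPI.ANY_SOURCE, tag=READY, status=status)
                comm.send(msg, dest=status.Get_source(), tag=TASK)
        else:                         # subtask execution
            while True:
                comm.send(rank, dest=0, tag=READY)              # report idle
                task = comm.recv(source=0, tag=TASK)
                if task == STOP:
                    break
                subprocess.run(task, shell=True)                # unmodified executable

    Launched with, e.g., mpiexec -n 16 python taskfarm.py, rank 0 manages the queue while the remaining ranks execute subtasks.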

  5. Modular space station, phase B extension. Information management advanced development. Volume 5: Software assembly

    NASA Technical Reports Server (NTRS)

    Gerber, C. R.

    1972-01-01

    The development of uniform computer program standards and conventions for the modular space station is discussed. The accomplishments analyzed are: (1) development of computer program specification hierarchy, (2) definition of computer program development plan, and (3) recommendations for utilization of all operating on-board space station related data processing facilities.

  6. Study of hypervelocity meteoroid impact on orbital space stations

    NASA Technical Reports Server (NTRS)

    Leimbach, K. R.; Prozan, R. J.

    1973-01-01

    Structural damage resulting from the hypervelocity impact of a meteorite on a spacecraft is discussed. Of particular interest is the backside spallation caused by such a collision. To treat this phenomenon, two numerical schemes were developed in the course of this study to compute the elastic-plastic flow fracture of a solid. The numerical schemes are a five-point finite difference scheme and a four-node finite element scheme. The four-node finite element scheme proved to be less sensitive to the type of boundary conditions and loadings. Although further development work is needed to improve the program's versatility (generalization of the network topology, secondary storage for large systems, improvement of the coding to reduce the run time, etc.), the basic framework is provided for a utilitarian computer program which may be used in a wide variety of situations. Analytic results showing the program output are given for several test cases.

  7. Aerodynamic Design and Computational Analysis of a Spacecraft Cabin Ventilation Fan

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue in a cost-effective way, early attention to fan design, selection, and installation has been recommended. Toward that end, NASA has begun to investigate the potential for small-fan noise reduction through improvements in fan aerodynamic design. Using tools and methodologies similar to those employed by the aircraft engine industry, most notably computational fluid dynamics (CFD) codes, the aerodynamic design of a new cabin ventilation fan has been developed, and its aerodynamic performance has been predicted and analyzed. The design, intended to serve as a baseline for future work, is discussed along with selected CFD results.

  8. Theoretical Thermal Evaluation of Energy Recovery Incinerators

    DTIC Science & Technology

    1985-12-01


  9. NOAA Weather Radio - SAME

    Science.gov Websites

    Information on NOAA Weather Radio (NWR) SAME: station search, coverage maps, outage reports, receiver information, and FIPS (Federal Information Processing Standards) code changes and/or SAME location code changes.

  10. Space station human productivity study. Volume 4: Issues

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The 305 Issues contained in this volume represent topics recommended for study in order to develop requirements in support of space station crew performance/productivity. The overall subject matter, space station elements affecting crew productivity, was organized into a coded subelement listing, which is included for the reader's reference. Each Issue is numbered according to the 5-digit topical coding scheme. The requirements column on each Issue page shows a cross-reference to the unresolved requirement statement(s). Because topical overlaps were frequently encountered, many initial Issues were consolidated. Apparent gaps, therefore, may be accounted for by an Issue described within a related subelement. A glossary of abbreviations used throughout the study documentation is also included.

  11. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Method of computing coverage. 80.771 Section 80... STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method of computing coverage. Compute the +17 dBu contour as follows: (a) Determine the effective antenna...

  12. Some key considerations in evolving a computer system and software engineering support environment for the space station program

    NASA Technical Reports Server (NTRS)

    Mckay, C. W.; Bown, R. L.

    1985-01-01

    The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts, an integration, verification, and validation host with test bed, and distributed targets. The requirement for the early establishment and use of an appropriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the Research and Development Productivity challenges presented by the space station computing system.

  13. Building a panel data set on fuel stations located in the Spanish regional areas of Madrid and Barcelona

    PubMed Central

    Balaguer, Jacint; Ripollés, Jordi

    2016-01-01

    The data described in this article were collected daily over the period June 10, 2010, to November 25, 2012, from the website of the Spanish Ministry of Industry, Energy and Tourism. The database includes information about fuel stations regarding their prices (both gross and net of taxes), brand, location (latitude and longitude), and postal code in the Spanish provinces of Madrid and Barcelona. Moreover, obtaining the postal codes has allowed us to select those stations that are operating within the metropolitan areas of Madrid and Barcelona. By considering those fuel stations that uninterruptedly provided prices during the entire period, the data can be especially useful to explore the dynamics of prices in fuel markets. This is the case of Balaguer and Ripollés (2016), “Asymmetric fuel price responses under heterogeneity” [1], who, taking into account the potential heterogeneity in the behaviour of fuel stations, used this statistical information to perform an analysis on asymmetric fuel price responses. PMID:26933671

  14. Building a panel data set on fuel stations located in the Spanish regional areas of Madrid and Barcelona.

    PubMed

    Balaguer, Jacint; Ripollés, Jordi

    2016-06-01

    The data described in this article were collected daily over the period June 10, 2010, to November 25, 2012, from the website of the Spanish Ministry of Industry, Energy and Tourism. The database includes information about fuel stations regarding their prices (both gross and net of taxes), brand, location (latitude and longitude), and postal code in the Spanish provinces of Madrid and Barcelona. Moreover, obtaining the postal codes has allowed us to select those stations that are operating within the metropolitan areas of Madrid and Barcelona. By considering those fuel stations that uninterruptedly provided prices during the entire period, the data can be especially useful to explore the dynamics of prices in fuel markets. This is the case of Balaguer and Ripollés (2016), "Asymmetric fuel price responses under heterogeneity" [1], who, taking into account the potential heterogeneity in the behaviour of fuel stations, used this statistical information to perform an analysis on asymmetric fuel price responses.

  15. Implementation of a 3D mixing layer code on parallel computers

    NASA Technical Reports Server (NTRS)

    Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.

    1995-01-01

    This paper summarizes our progress and experience in the development of a Computational Fluid Dynamics code on parallel computers to simulate three-dimensional spatially-developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers. The code was then converted for use on parallel computers using the conventional message-passing technique, although we have not yet been able to compile the code with the present version of the HPF compilers.

  16. Blasim: A computational tool to assess ice impact damage on engine blades

    NASA Astrophysics Data System (ADS)

    Reddy, E. S.; Abumeri, G. H.; Chamis, C. C.

    1993-04-01

    A portable computer code called BLASIM was developed at NASA LeRC to assess ice impact damage on aircraft engine blades. In addition to ice impact analyses, the code also contains static, dynamic, resonance margin, and supersonic flutter analysis capabilities. Solid, hollow, superhybrid, and composite blades are supported. An optional preprocessor (input generator) was also developed to interactively generate input for BLASIM. The blade geometry can be defined using a series of airfoils at discrete input stations or by a finite element grid. The code employs a coarse, fixed finite element mesh containing triangular plate finite elements to minimize program execution time. The ice piece is modeled as an equivalent spherical object that has a high velocity opposite that of the aircraft and parallel to the engine axis. For local impact damage assessment, the impact load is considered as a distributed force acting over a region around the impact point. The average radial strain of the finite elements along the leading edge is used as a measure of the local damage. To estimate damage at the blade root, the impact is treated as an impulse and a combined-stress failure criterion is employed. Parametric studies of local and root ice impact damage, and post-impact dynamics, are discussed for solid and composite blades.

  17. A combined experimental-modelling method for the detection and analysis of pollution in coastal zones

    NASA Astrophysics Data System (ADS)

    Limić, Nedzad; Valković, Vladivoj

    1996-04-01

    Pollution of coastal seas with toxic substances can be efficiently detected by examining toxic materials in sediment samples. These samples contain information on the overall pollution from surrounding sources such as yacht anchorages, nearby industries, sewage systems, etc. In an efficient analysis of pollution one must determine the contribution from each individual source. In this work it is demonstrated that a modelling method can be utilized for solving this latter problem. The modelling method is based on a unique interpretation of concentrations in sediments from all sampling stations. The proposed method is a synthesis consisting of the utilization of PIXE as an efficient method of pollution concentration determination and the code ANCOPOL (N. Limic and R. Benis, The computer code ANCOPOL, SimTel/msdos/geology, 1994 [1]) for the calculation of contributions from the main polluters. The efficiency and limits of the proposed method are demonstrated by discussing trace element concentrations in sediments of Punat Bay on the island of Krk in Croatia.

  18. Three Dimensional Viscous Flow Field in an Axial Flow Turbine Nozzle Passage

    NASA Technical Reports Server (NTRS)

    Ristic, D.; Lakshminarayana, B.

    1997-01-01

    The objective of this investigation is the experimental and computational study of the three dimensional viscous flow field in the nozzle passage of an axial flow turbine stage. The nozzle passage flow field has been measured using a two-sensor hot-wire probe at various axial and radial stations. In addition, two-component LDV measurements at one axial station (x/c_m = 0.56) were performed to measure the velocity field. Static pressure measurements and flow visualization, using a fluorescent oil technique, were also performed to obtain the location of transition and the endwall limiting streamlines. A three dimensional boundary layer code, with a simple intermittency transition model, was used to predict the viscous layers along the blade and endwall surfaces. The boundary layers on the blade surface were found to be very thin and mostly laminar, except on the suction surface downstream of 70% axial chord. A strong radial pressure gradient, especially close to the suction surface, induces strong cross-flow components in the trailing edge region of the blade. On the endwalls the boundary layers were much thicker, especially near the suction corner of the casing surface, caused by secondary flow. The secondary flow region near the suction-casing surface corner indicates the presence of the passage vortex detached from the blade surface. The corner vortex is found to be very weak. The presence of a closely spaced rotor downstream (20% of the nozzle vane chord) introduces unsteadiness in the blade passage. The measured instantaneous velocity signal was filtered using an FFT square window to remove the periodic unsteadiness introduced by the downstream rotor and fans. The filtering decreased the free stream turbulence level from 2.1% to 0.9% but had no influence on the computed turbulence length scale. The computation of the three dimensional boundary layers is found to be accurate on the nozzle passage blade surfaces, away from the endwalls and the secondary flow region. On the nozzle passage endwall surfaces the presence of strong pressure gradients and secondary flow limits the validity of the boundary layer code.

  19. Simultaneous Laser Ranging and Communication from an Earth-Based Satellite Laser Ranging Station to the Lunar Reconnaissance Orbiter in Lunar Orbit

    NASA Technical Reports Server (NTRS)

    Sun, Xiaoli; Skillman, David R.; Hoffman, Evan D.; Mao, Dandan; McGarry, Jan F.; Neumann, Gregory A.; McIntire, Leva; Zellar, Ronald S.; Davidson, Frederic M.; Fong, Wai H.

    2013-01-01

    We report a free-space laser communication experiment from the satellite laser ranging (SLR) station at NASA Goddard Space Flight Center (GSFC) to the Lunar Reconnaissance Orbiter (LRO) in lunar orbit through the onboard one-way Laser Ranging (LR) receiver. Pseudo-random data and sample image files were transmitted to LRO using a 4096-ary pulse position modulation (PPM) signal format. Reed-Solomon forward error correction codes were used to achieve error-free data transmission at a moderate coding overhead rate. The signal fading due to atmospheric effects was measured, and the coding gain could be estimated.
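
    In 4096-ary PPM each symbol carries 12 bits, encoded as the index of the single occupied slot out of 4096. A minimal encoder sketch (illustrative, not the flight implementation):

        import numpy as np

        def ppm_encode(bits, order=4096):
            """Map each group of log2(order) bits to a PPM slot index."""
            k = order.bit_length() - 1               # 12 bits per 4096-ary symbol
            groups = np.asarray(bits, dtype=int).reshape(-1, k)
            weights = 1 << np.arange(k - 1, -1, -1)  # MSB-first bit weights
            return groups @ weights                  # one slot index per symbol

        bits = np.random.default_rng(0).integers(0, 2, 120)   # 10 symbols of data
        print(ppm_encode(bits))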

  20. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  1. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  2. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  3. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  4. 40 CFR 194.23 - Models and computer codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  5. SISSY: An example of a multi-threaded, networked, object-oriented databased application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scipioni, B.; Liu, D.; Song, T.

    1993-05-01

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.

  6. Experimental flutter boundaries with unsteady pressure distributions for the NACA 0012 Benchmark Model

    NASA Technical Reports Server (NTRS)

    Rivera, Jose A., Jr.; Dansberry, Bryan E.; Farmer, Moses G.; Eckstrom, Clinton V.; Seidel, David A.; Bennett, Robert M.

    1991-01-01

    The Structural Dynamics Division at NASA Langley has started a wind tunnel activity referred to as the Benchmark Models Program. The objective is to acquire test data that will be useful for developing and evaluating aeroelastic-type Computational Fluid Dynamics codes currently in use or under development. The progress achieved in testing the first model in the Benchmark Models Program is described. Experimental flutter boundaries are presented for a rigid semispan model (NACA 0012 airfoil section) mounted on a flexible mount system. Also, steady and unsteady pressure measurements taken at the flutter condition are presented. The pressure data were acquired over the entire model chord located at the 60-percent-span station.

  7. Alternative Fuels Data Center: Propane Fueling Station Locations

    Science.gov Websites

    Find liquefied petroleum gas (propane) fueling stations near an address or ZIP code or along a route in the United States.

  8. Propagation Velocity of Solid Earth Tides

    NASA Astrophysics Data System (ADS)

    Pathak, S.

    2017-12-01

    One of the significant considerations in most geodetic investigations is to take into account the effect of solid Earth tides on station locations and the consequent impact on coordinate time series. In this research work, the propagation velocity of the solid Earth tide signal between Indian stations is computed. Mean daily coordinates for the stations were computed by applying the static precise point positioning technique for one day. The computed coordinates were used as input for computing the tidal displacements at the stations by the gravity method along three directions at 1-minute intervals for 24 hours. Baseline distances were then computed between four Indian stations. The propagation velocity of the solid Earth tide can be computed by studying the concurrent tidal effect at stations separated by a known baseline distance, together with the time the tide takes to travel from one station to the other. The propagation velocity helps in predicting the effect at any station when the effect at a known station for a specific time period is known. Thus, with knowledge of the propagation velocity, the spatial and temporal effects of solid Earth tides can be estimated with respect to a known station. Theoretically, the tides are generated by the positions of celestial bodies relative to the rotating Earth, so this study also examines the correlation of the propagation velocity with the Earth's rotation speed. The propagation velocity of solid Earth tides comes out to be in the range of 440-470 m/s, in good agreement with the Earth's rotation speed.
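
    The velocity estimate reduces to baseline distance divided by the time lag between the tidal signals at the two stations. A sketch with synthetic 1-minute displacement series (hypothetical baseline and delay; the lag is recovered by cross-correlation):

        import numpy as np

        dt = 60.0                                    # 1-minute sampling (s)
        baseline_m = 1.2e6                           # hypothetical inter-station distance (m)
        t = np.arange(0, 86400, dt)                  # one day
        true_lag = 45                                # samples (synthetic delay)
        station_a = np.sin(2 * np.pi * t / 44714.0)  # ~M2 tidal period in seconds
        station_b = np.roll(station_a, true_lag)     # delayed copy at the second station

        xcorr = np.correlate(station_b - station_b.mean(),
                             station_a - station_a.mean(), mode="full")
        lag = np.argmax(xcorr) - (len(t) - 1)        # lag in samples
        print(f"propagation velocity: {baseline_m / (lag * dt):.0f} m/s")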

  9. Computer-Assisted Laboratory Stations.

    ERIC Educational Resources Information Center

    Snyder, William J.; Hanyak, Michael E.

    1985-01-01

    Describes the advantages and features of computer-assisted laboratory stations for use in a chemical engineering program. Also describes a typical experiment at such a station: determining the response times of a solid state humidity sensor at various humidity conditions and developing an empirical model for the sensor. (JN)

  10. Estimating generalized skew of the log-Pearson Type III distribution for annual peak floods in Illinois

    USGS Publications Warehouse

    Oberg, Kevin A.; Mades, Dean M.

    1987-01-01

    Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map. (Peters-PTT)
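
    Station skew for a log-Pearson Type III fit is the sample skew of the log-transformed annual peaks; a minimal sketch (synthetic peak flows, not the 730-station data set):

        import numpy as np
        from scipy.stats import skew

        peaks = np.array([3200.0, 1800.0, 5400.0, 2500.0, 4100.0,
                          2900.0, 6100.0, 1500.0, 3800.0, 2200.0])  # synthetic annual peaks
        station_skew = skew(np.log10(peaks), bias=False)            # skew of the log10 peaks
        print(f"station skew: {station_skew:.2f}")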

  11. Computer Description of Black Hawk Helicopter

    DTIC Science & Technology

    1979-06-01

    Combinatorial Geometry Models; Black Hawk Helicopter; GIFT Computer Code; Geometric Description of Targets. ...description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code which generates ... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents ...

  12. Selected Streamflow Statistics for Streamgaging Stations in Delaware, 2003

    USGS Publications Warehouse

    Ries, Kernell G.

    2004-01-01

    Flow-duration and low-flow frequency statistics were calculated for 15 streamgaging stations in Delaware, in cooperation with the Delaware Geological Survey. The flow-duration statistics include the 1-, 2-, 5-, 10-, 20-, 30-, 40-, 50-, 60-, 70-, 80-, 90-, 95-, 98-, and 99-percent duration discharges. The low-flow frequency statistics include the average discharges for 1, 7, 14, 30, 60, 90, and 120 days that recur, on average, once in 1.01, 2, 5, 10, 20, 50, and 100 years. The statistics were computed using U.S. Geological Survey computer programs that can be downloaded from the World Wide Web at no cost. The computer programs automate standard U.S. Geological Survey methods for computing the statistics. Documentation is provided at the Web sites for the individual programs. The computed statistics are presented in tabular format on a separate page for each station, along with the station name, station number, the location, the period of record, and remarks.
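
    Flow-duration statistics are exceedance percentiles of the daily-flow record, so the d-percent duration discharge is the (100 - d)th percentile. A short sketch (synthetic flows, not the Delaware records):

        import numpy as np

        daily_flow = np.random.default_rng(7).lognormal(mean=3.0, sigma=1.0, size=3650)
        for d in (1, 5, 10, 50, 90, 95, 99):         # percent of time flow is exceeded
            q = np.percentile(daily_flow, 100 - d)   # d-percent duration discharge
            print(f"Q{d:02d} = {q:8.1f}")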

  13. Site Effects, Attenuation and Signal Duration in the 1356 Basel Earthquake Area (Southern Upper Rhine Graben)

    NASA Astrophysics Data System (ADS)

    Granet, M.; Boitel, G.

    2001-12-01

    A field experiment has been carried out in the epicentral area of the Basel (northern Switzerland) earthquake of 18 October 1356, the largest historical earthquake in central Europe, with the aim to better characterize the spatial variability of the amplitudes of the seismic waves due to the local geology. Such site-effect evaluations are needed in seismic engineering in order to establish effective building codes. To determine the site effects, we used a spectral ratio method, utilizing the data collected from a mobile network of 45 stations installed from March to August 2000. As the main result, we found resonant peak amplitudes at 3, 4 and 6 Hz, which are more pronounced when the seismic stations are located on the sediments. From the same data set, attenuation laws have been calculated. They show the importance of the geometrical attenuation in this region and the influence of the local geology on the amplitude of ground velocities. Finally, we notice that the velocities are more amplified in the lower part of the observed seismic signal frequency band. The computation of relations linking the duration of the signal to the magnitude, the distance and the local geology shows a good correlation between stations characterized by long-duration signals and those affected by site effects. As for ground velocities, the duration also becomes more significant at low frequencies. Finally, we computed the quality factor QP using the spectral ratio method. Unfortunately, the limited number of available data prevented us from obtaining a very detailed model. Nevertheless, QP shows very significant attenuation across the whole area, without large contrasts, and a decrease of the attenuation with increasing frequencies. To conclude, this newly collected data set from a dense array of 45 stations in this tectonically active and hazardous area shows large site effects associated with an increase of both the amplitudes and the duration of the signal, especially at low frequencies.

  14. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.
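
    The essence of variable coding and modulation is selecting, frame by frame, the highest-throughput modulation-and-coding pair the current link margin supports. A toy selector follows (spectral efficiencies and Es/N0 thresholds are illustrative values loosely patterned on DVB-S2, not the SCaN Testbed implementation):

        # Toy VCM selector: pick the highest-throughput modcod whose required
        # Es/N0 the current link supports (all threshold values illustrative).
        MODCODS = [  # (name, spectral efficiency in b/s/Hz, required Es/N0 in dB)
            ("QPSK 1/2",   0.99,  1.0),
            ("QPSK 3/4",   1.49,  4.0),
            ("8PSK 2/3",   1.98,  6.6),
            ("8PSK 5/6",   2.48,  9.4),
            ("16APSK 3/4", 2.97, 10.2),
        ]

        def select_modcod(esn0_db, margin_db=1.0):
            usable = [m for m in MODCODS if m[2] + margin_db <= esn0_db]
            return max(usable, key=lambda m: m[1]) if usable else None

        for snr_db in (0.5, 5.0, 12.0):
            print(snr_db, "->", select_modcod(snr_db))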

  15. View southeast of computer controlled energy monitoring system. System replaced ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View southeast of computer controlled energy monitoring system. System replaced strip chart recorders and other instruments under the direct observation of the load dispatcher. - Thirtieth Street Station, Load Dispatch Center, Thirtieth & Market Streets, Railroad Station, Amtrak (formerly Pennsylvania Railroad Station), Philadelphia, Philadelphia County, PA

  16. User manual for semi-circular compact range reflector code: Version 2

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1987-01-01

    A computer code has been developed at the Ohio State University ElectroScience Laboratory to analyze a semi-circular paraboloidal reflector with or without a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the reflector or its individual components at a given distance from the center of the paraboloid. The code computes the fields along a radial, horizontal, vertical or axial cut at that distance. Thus, it is very effective in computing the size of the sweet spot for a semi-circular compact range reflector. This report describes the operation of the code. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability as well as being samples of input/output sets.

  17. OFFSET - RAY TRACING OPTICAL ANALYSIS OF OFFSET SOLAR COLLECTOR FOR SPACE STATION SOLAR DYNAMIC POWER SYSTEM

    NASA Technical Reports Server (NTRS)

    Jefferies, K.

    1994-01-01

    OFFSET is a ray tracing computer code for optical analysis of a solar collector. The code models the flux distributions within the receiver cavity produced by reflections from the solar collector. It was developed to model the offset solar collector of the solar dynamic electric power system being developed for Space Station Freedom. OFFSET has been used to improve the understanding of the collector-receiver interface and to guide the efforts of NASA contractors also researching the optical components of the power system. The collector for Space Station Freedom consists of 19 hexagonal panels each containing 24 triangular, reflective facets. Current research is geared toward optimizing flux distribution inside the receiver via changes in collector design and receiver orientation. OFFSET offers many options for experimenting with the design of the system. The offset parabolic collector model configuration is determined by an input file of facet corner coordinates. The user may choose other configurations by changing this file, but to simulate collectors that have other than 19 groups of 24 triangular facets would require modification of the FORTRAN code. Each of the roughly 500 facets in the assembled collector may be independently aimed to smooth out, or tailor, the flux distribution on the receiver's wall. OFFSET simulates the effects of design changes such as in receiver aperture location, tilt angle, and collector facet contour. Unique features of OFFSET include: 1) equations developed to pseudo-randomly select ray originating sources on the Sun which appear evenly distributed and include solar limb darkening; 2) Cone-optics technique used to add surface specular error to the ray originating sources to determine the apparent ray sources of the reflected sun; 3) choice of facet reflective surface contour -- spherical, ideal parabolic, or toroidal; 4) Gaussian distributions of radial and tangential components of surface slope error added to the surface normals at the ten nodal points on each facet; and 5) color contour plots of receiver incident flux distribution generated by PATRAN processing of FORTRAN computer code output. OFFSET output includes a file of input data for confirmation, a PATRAN results file containing the values necessary to plot the flux distribution at the receiver surface, a PATRAN results file containing the intensity distribution on a 40 x 40 cm area of the receiver aperture plane, a data file containing calculated information on the system configuration, a file including the X-Y coordinates of the target points of each collector facet on the aperture opening, and twelve P/PLOT input data files to allow X-Y plotting of various results data. OFFSET is written in FORTRAN (70%) for the IBM VM operating system. The code contains PATRAN statements (12%) and P/PLOT statements (18%) for generating plots. Once the program has been run on VM (or an equivalent system), the PATRAN and P/PLOT files may be transferred to a DEC VAX (or equivalent system) with access to PATRAN for PATRAN post processing. OFFSET was written in 1988 and last updated in 1989. PATRAN is a registered trademark of PDA Engineering. IBM is a registered trademark of International Business Machines Corporation. DEC VAX is a registered trademark of Digital Equipment Corporation.
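
    Feature 1, pseudo-random selection of ray origins on the solar disk with limb darkening, can be sketched as rejection sampling (a hypothetical linear limb-darkening law and coefficient; the abstract does not give the code's actual equations):

        import numpy as np

        rng = np.random.default_rng(1)

        def sample_solar_disk(n, u=0.6):
            """Rejection-sample n ray origins on the unit solar disk with a
            linear limb-darkening law I(mu) = 1 - u*(1 - mu), mu = sqrt(1 - r^2)."""
            points = []
            while len(points) < n:
                x, y = rng.uniform(-1.0, 1.0, size=2)
                r2 = x * x + y * y
                if r2 > 1.0:
                    continue                         # outside the disk
                mu = np.sqrt(1.0 - r2)
                if rng.uniform() < 1.0 - u * (1.0 - mu):
                    points.append((x, y))
            return np.array(points)

        pts = sample_solar_disk(10000)
        print("mean radius of accepted rays:", float(np.hypot(pts[:, 0], pts[:, 1]).mean()))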

  18. Using Python to generate AHPS-based precipitation simulations over CONUS using Amazon distributed computing

    NASA Astrophysics Data System (ADS)

    Machalek, P.; Kim, S. M.; Berry, R. D.; Liang, A.; Small, T.; Brevdo, E.; Kuznetsova, A.

    2012-12-01

    We describe how the Climate Corporation uses Python and Clojure, a language implemented on top of Java, to generate climatological forecasts for precipitation based on the Advanced Hydrologic Prediction Service (AHPS) radar-based daily precipitation measurements. A 2-year-long forecast is generated on each of the ~650,000 CONUS land-based 4-km AHPS grids by constructing 10,000 ensembles sampled from a 30-year reconstructed AHPS history for each grid. The spatial and temporal correlations between neighboring AHPS grids and the sampling of the analogues are handled by Python. The parallelization across all 650,000 CONUS grids is further achieved by utilizing the MAP-REDUCE framework (http://code.google.com/edu/parallel/mapreduce-tutorial.html). Each full-scale computational run requires hundreds of nodes with up to 8 processors each on the Amazon Elastic MapReduce (http://aws.amazon.com/elasticmapreduce/) distributed computing service, resulting in 3-terabyte datasets. We further describe how we have put the simulation process into monthly production runs at the full scale of the 4-km AHPS grids and how the resulting terabyte-sized datasets are handled.
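
    The per-grid ensemble construction parallelizes naturally as a map over grid cells. A single-machine sketch with multiprocessing (hypothetical resampling logic standing in for the Elastic MapReduce job, and without the spatial/temporal correlation handling):

        import numpy as np
        from multiprocessing import Pool

        def simulate_grid(grid_id):
            """Build precipitation ensembles for one 4-km grid by resampling a
            reconstructed 30-year daily history (synthetic stand-in data)."""
            rng = np.random.default_rng(grid_id)
            history = rng.gamma(shape=0.3, scale=8.0, size=30 * 365)   # mm/day
            ensembles = rng.choice(history, size=(1000, 730))          # 2-year members
            return grid_id, float(ensembles.mean())

        if __name__ == "__main__":
            grid_ids = range(100)                    # stand-in for ~650,000 grids
            with Pool() as pool:
                for grid_id, mean_mm in pool.imap_unordered(simulate_grid, grid_ids):
                    pass                             # production code writes each result out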

  19. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation; it allows field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice, enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms, which severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  20. 3D-RTK Capability of Single Gnss Receivers

    NASA Astrophysics Data System (ADS)

    Stempfhuber, W.

    2013-08-01

    Small, aerial objects are now being utilised in many areas of civil object capture and monitoring. As a rule, the standard application of a simple GPS receiver with code solutions serves the 3D-positioning of the trajectories or recording positions. Without GPS correction information, these can be calculated at an accuracy of 10-20 metres. Corrected code solutions (DGPS) generally lie in the metre range. A precise 3D-positioning of the UAV (unmanned aerial vehicle) trajectories in the centimetre range provides significant improvements. In addition, the recording time of each sensor can be synchronised with the exact time stamp of the low-cost GNSS system. In recent years, an increasing number of works on positioning from L1 GPS raw data have been published. In these, the carrier phase measurements are analysed with established evaluation algorithms in post-processing to obtain centimetre-accurate positions or high-precision 3D trajectories [e.g. Schwieger and Gläser, 2005 or Korth and Hofmann, 2011]. Reference information from local reference stations or a reference network serves the purpose of carrier phase ambiguity resolution. Furthermore, there are many activities worldwide in the area of PPP (Precise Point Positioning) techniques; however, dual-frequency receivers are primarily used in this instance, and very long initialisation times must be scheduled. A research project on the subject of low-cost RTK GNSS for real-time applications was developed at the Beuth Hochschule für Technik Berlin University of Applied Sciences [Stempfhuber 2012]. The overall system developed for real-time applications with centimetre accuracy is modularly constructed and can be used for various applications (http://prof.beuthhochschule.de/stempfhuber/seite-publikation/). With hardware costing a few hundred euros and a total weight of 500-800 g (including the battery), this system is ideally suited for UAV applications. In addition, the GNSS data processed with the RTK method can be provided in the standardised NMEA format. Because the aerial objects experience reduced shadowing, external GNSS error sources such as multipath cause few problems. With L1 carrier phase analysis, the baseline computation must nevertheless remain limited to a range of a few kilometres; with distances of more than 5 kilometres between the reference station and the rover station, errors in the decimetre range arise. The overall modular system consists of a low-cost, single-frequency receiver (e.g. uBlox LEA4T or 6T receiver), an L1 antenna (e.g. the Trimble Bullet III), a purpose-built data logger including an integrated WLAN communication module for storage and securing of the raw data, as well as a power supply. Optimisation of the L1 antenna has shown that many problems relating to signal reception can be reduced in this way. A calibration of the choke-ring adaptors at various antenna calibration facilities results in good and homogeneous antenna parameters. In this setup, the real-time algorithm from the open-source project RTKLib [Takasu, 2010] generally runs on a small computer at the reference station. In this case, the data transfer from the L1 receiver to the PC is realisable through a serial cable. The rover station can transfer the raw data to the computing algorithm over a WLAN network or through a data radio. Of course, this computational algorithm can also be adapted to an integrated computing module for L1 carrier phase resolution.
The average time to first fix (TTFF) amounts to a few minutes, depending on the satellite constellation. Different test series in movement simulators and in moving objects have shown that a stable, fixed solution is achieved with a normal satellite constellation. A test series with a Microdrones quadrocopter was also conducted. Comparison of the RTK positions with those of a geodetic dual-frequency receiver shows differences in the millimetre range. In addition, reference systems (based on total stations) are available for the precise examination of the kinematically captured positioning [Eisenbeiss et al. 2009].
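
    Since the RTK solution is delivered in the standardised NMEA format, a consumer of the rover output mainly needs to parse GGA sentences and check the fix-quality flag. The sketch below is a generic NMEA 0183 GGA parser for illustration and is not part of RTKLib; checksum handling is omitted for brevity.

      def parse_gga(sentence):
          """Extract fix quality and position from an NMEA GGA sentence.
          Field positions follow the NMEA 0183 GGA definition.
          Quality 4 = RTK fixed, 5 = RTK float, 1 = plain GPS fix."""
          f = sentence.split(',')
          if not f[0].endswith('GGA') or f[6] == '':
              return None

          def dm_to_deg(dm, hemi):
              # ddmm.mmmm -> decimal degrees (dd + mm.mmmm/60)
              v = float(dm)
              deg = int(v // 100)
              d = deg + (v - 100 * deg) / 60.0
              return -d if hemi in ('S', 'W') else d

          return {
              'time':    f[1],
              'lat':     dm_to_deg(f[2], f[3]),
              'lon':     dm_to_deg(f[4], f[5]),
              'quality': int(f[6]),        # 4 = RTK fixed solution
              'n_sats':  int(f[7]),
          }

      # Example: a fixed RTK solution might look like
      # parse_gga("$GPGGA,123519,5230.1234,N,01323.5678,E,4,09,0.9,45.4,M,,M,,")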

  1. Plant Habitat Telemetry / Command Interface and E-MIST

    NASA Technical Reports Server (NTRS)

    Walker, Uriae M.

    2013-01-01

    Plant Habitat (PH) is an experiment to be taken to the International Space Station (ISS) in 2016. It is critical that ground support computers have the ability to uplink commands to control PH, and that ISS computers have the ability to downlink PH telemetry data to ground support. This necessitates communication software that can send, receive, and process PH-specific commands and telemetry. The objective of the Plant Habitat Telemetry/Command Interface is to provide this communication software and to couple it with an intuitive Graphical User Interface (GUI). Initial investigation of the project objective led to the decision that the code be written in C++ because of its compatibility with existing source code infrastructures and its robustness. Further investigation led to a determination that multiple Ethernet packet structures would need to be created to effectively transmit data. Setting a standard for packet structures would allow us to distinguish these packets, which would range from command-type packets to subcategories of telemetry packets. In order to handle this range of packet types, the conclusion was made to take an object-oriented programming approach, which complemented our decision to use the C++ programming language. In addition, extensive utilization of port programming concepts was required to implement the core functionality of the communication software. Also, a concrete understanding of packet-processing software was required in order to put all the components of ISS-to-Ground Support Equipment (GSE) communication together and complete the objective. A second project discussed in this paper is Exposing Microbes to the Stratosphere (E-MIST). This project exposes microbes to the stratosphere to observe how they are impacted by atmospheric effects. This paper focuses on the electrical and software expectations of the project, specifically drafting the printed circuit board and programming the on-board sensors. The Eagle Computer-Aided Drafting (CAD) software was used to draft the E-MIST circuit, which required several component libraries to be created. Coding the sensors and obtaining sensor data involved using the Arduino Uno development board and coding language, and properly wiring peripheral sensors to the microcontroller (the central control unit of the experiment).
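
    Although the project's interface was written in C++, the packet-standard idea described above can be sketched compactly in Python. The header layout, sync word, and type codes below are invented for illustration and are not the actual PH packet definitions.

      import struct

      # Hypothetical packet layout for illustration only. A fixed header
      # carries a sync word, a packet-type code (command vs. telemetry
      # subtypes), and a payload length, so the receiver can dispatch
      # on the type field.
      HEADER = struct.Struct('>HBH')   # sync (u16), type (u8), length (u16)
      SYNC = 0xCAFE
      CMD, TLM_HK, TLM_SCI = 0x01, 0x10, 0x11   # assumed type codes

      def build_packet(ptype, payload: bytes) -> bytes:
          return HEADER.pack(SYNC, ptype, len(payload)) + payload

      def parse_packet(frame: bytes):
          sync, ptype, length = HEADER.unpack_from(frame)
          if sync != SYNC or len(frame) < HEADER.size + length:
              raise ValueError('malformed packet')
          return ptype, frame[HEADER.size:HEADER.size + length]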

  2. Weekend Commercial Children's Television, 1975. A Study of Programming and Advertising to Children on Five Boston Stations.

    ERIC Educational Resources Information Center

    Barcus, F. Earle

    Some 25-1/2 hours of Boston commercial television for children were monitored on a Saturday and Sunday in April 1975. The monitoring covered three network affiliated stations and two independent UHF stations. Monitoring, coding, and editing provided much statistical data, which was analyzed to yield findings in the areas of distribution of…

  3. African Doppler Surveys (ADOS).

    DTIC Science & Technology

    1983-06-01

    Survey status and data status current to 31 March 1983 (mailing date: 11 April 1983). The tabulation lists, for each Doppler station, the country code, station name and ADOS number, approximate coordinates, whether the data were dispatched to the designated computation center, and the geodetic ties.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivkin, C.; Blake, C.; Burgess, R.

    This report explains the Regulations, Codes, and Standards (RCS) requirements for hydrogen dispensing stations in the State of California. The report shows the basic components of a hydrogen dispensing station in a simple schematic drawing; the permits and approvals that would typically be required for the construction and operation of a hydrogen dispensing station; and a basic permit that might be employed by an Authority Having Jurisdiction (AHJ).

  5. Automated Transfer Vehicle (ATV) Critical Safety Software Overview

    NASA Astrophysics Data System (ADS)

    Berthelier, D.

    2002-01-01

    The European Automated Transfer Vehicle (ATV) is an unmanned transportation system designed to dock to the International Space Station (ISS) and to contribute to the logistic servicing of the ISS. Concisely, ATV control is realized by a nominal flight control function (using computers, software, sensors, and actuators). In order to cover the extreme situations where this nominal chain cannot ensure a safe trajectory with respect to the ISS, and where unsafe free-drift trajectories can be encountered, a segregated proximity flight safety function is activated. This function relies notably on a segregated computer, the Monitoring and Safing Unit (MSU); in case of major ATV malfunction detection, ATV is then controlled by the MSU software. Therefore, this software is critical, because an MSU software failure could result in catastrophic consequences. This paper provides an overview both of this software's functions and of the software development and validation method, which is specific in view of its criticality. The first part of the paper briefly describes the proximity flight safety chain. The second part deals with the software functions. Indeed, the MSU software is in charge of monitoring the nominal computers and the ATV corridors, using its own navigation algorithms, and, if an abnormal situation is detected, it is in charge of ATV control during the Collision Avoidance Manoeuvre (CAM), consisting of an attitude-controlled braking boost, followed by a post-CAM manoeuvre: Sun-pointed ATV attitude control for up to 24 hours on a safe trajectory. Monitoring, navigation, and control algorithm principles are presented. The third part of this paper describes the development and validation process: algorithm functional studies, Ada coding and unit validations; algorithm Ada code integration and validation on a specific non-real-time MATLAB/SIMULINK simulator; and a global software functional engineering phase, architectural design, unit testing, integration, and validation on the target computer.
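
    The supervisory behaviour sketched in the second part (monitor, then CAM, then post-CAM Sun pointing) can be caricatured as a small mode machine. The sketch below is purely illustrative: the flags, mode names, and interface are invented, and the real MSU algorithms are safety-critical flight software developed under the process described in the third part.

      from enum import Enum, auto

      class Mode(Enum):
          MONITOR = auto()   # watch the nominal chain and approach corridor
          CAM = auto()       # attitude-controlled braking boost
          POST_CAM = auto()  # Sun-pointed attitude hold, up to 24 h

      def next_mode(mode, corridor_ok, nominal_ok, boost_done):
          """One decision cycle of a toy MSU-style supervisor. The flags
          and interface are invented for illustration only."""
          if mode is Mode.MONITOR and not (corridor_ok and nominal_ok):
              return Mode.CAM        # abnormal situation: take over control
          if mode is Mode.CAM and boost_done:
              return Mode.POST_CAM   # braking done: safe drift, Sun point
          return mode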

  6. Comparison of MELCOR and SCDAP/RELAP5 results for a low-pressure, short-term station blackout at Browns Ferry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbajo, J.J.

    1995-12-31

    This study compares results obtained with two U.S. Nuclear Regulatory Commission (NRC)-sponsored codes, MELCOR version 1.8.3 (1.8PQ) and SCDAP/RELAP5 Mod3.1 release C, for the same transient - a low-pressure, short-term station blackout accident at the Browns Ferry nuclear plant. This work is part of MELCOR assessment activities to compare core damage progression calculations of MELCOR against SCDAP/RELAP5 since the two codes model core damage progression very differently.

  7. User's manual for semi-circular compact range reflector code

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.; Burnside, Walter D.

    1986-01-01

    A computer code was developed to analyze a semi-circular paraboloidal reflector antenna with a rolled edge at the top and a skirt at the bottom. The code can be used to compute the total near field of the antenna or its individual components at a given distance from the center of the paraboloid. Thus, it is very effective in computing the size of the sweet spot for RCS or antenna measurements. The operation of the code is described. Various input and output statements are explained. Some results obtained using the computer code are presented to illustrate the code's capability and to serve as sample input/output sets.

  8. Highly fault-tolerant parallel computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spielman, D.A.

    We re-introduce the coded model of fault-tolerant computation in which the input and output of a computational device are treated as words in an error-correcting code. A computational device correctly computes a function in the coded model if its input and output, once decoded, are a valid input and output of the function. In the coded model, it is reasonable to hope to simulate all computational devices by devices whose size is greater by a constant factor but which are exponentially reliable even if each of their components can fail with some constant probability. We consider fine-grained parallel computations in which each processor has a constant probability of producing the wrong output at each time step. We show that any parallel computation that runs for time t on w processors can be performed reliably on a faulty machine in the coded model using w log^{O(1)} w processors and time t log^{O(1)} w. The failure probability of the computation will be at most t · exp(-w^{1/4}). The codes used to communicate with our fault-tolerant machines are generalized Reed-Solomon codes and can thus be encoded and decoded in O(n log^{O(1)} n) sequential time and are independent of the machine they are used to communicate with. We also show how coded computation can be used to self-correct many linear functions in parallel with arbitrarily small overhead.

  9. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  10. A generalized one-dimensional computer code for turbomachinery cooling passage flow calculations

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Roelke, Richard J.; Meitner, Peter L.

    1989-01-01

    A generalized one-dimensional computer code for analyzing the flow and heat transfer in turbomachinery cooling passages was developed. This code is capable of handling rotating cooling passages with turbulators, 180-degree turns, pin fins, finned passages, bypass flows, tip cap impingement flows, and flow branching. The code is an extension of a one-dimensional code developed by P. Meitner. In the subject code, correlations for both heat transfer coefficient and pressure loss computations were developed to model each of the above-mentioned types of coolant passages. The code has the capability of independently computing the friction factor and heat transfer coefficient on each side of a rectangular passage. Either the mass flow at the inlet to the channel or the exit plane pressure can be specified. For a specified inlet total temperature, inlet total pressure, and exit static pressure, the code computes the flow rates through the main branch and the subbranches, and the flow through the tip cap for impingement cooling, in addition to computing the coolant pressure, temperature, and heat transfer coefficient distribution in each coolant flow branch. Predictions from the subject code for both nonrotating and rotating passages agree well with experimental data. The code was used to analyze the cooling passage of a research cooled radial rotor.

  11. Heave and Roll Response of Free Floating Bodies of Cylindrical Shape

    DTIC Science & Technology

    1977-02-01


  12. Multiple channel optical data acquisition system

    DOEpatents

    Fasching, G.E.; Goff, D.R.

    1985-02-22

    A multiple channel optical data acquisition system is provided in which a plurality of remote sensors monitoring specific process variables are interrogated by means of a single optical fiber connecting the remote station/sensors to a base station. The remote station/sensors derive all power from light transmitted through the fiber from the base station. Each station/sensor is individually accessed by means of a light-modulated address code sent over the fiber. The remote station/sensors use a single light-emitting diode both to send and receive light signals to communicate with the base station and to provide power for the remote station. The system described can power at least 100 remote station/sensors over an optical fiber one mile in length.

  13. Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Naval Computer & Telecommunications Area Master Station, Eastern Pacific, Radio Transmitter Facility Lualualei, Marine Barracks, Intersection of Tower Drive & Morse Street, Makaha, Honolulu County, HI

  14. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †

    PubMed Central

    Murdani, Muhammad Harist; Hong, Bonghee

    2018-01-01

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
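
    The weighted-sum combination of centroid distance and intersecting road network can be illustrated with the following sketch; the normalisation constants and the weight are assumptions, not the paper's calibrated metric.

      def zip_proximity(centroid_km, road_links, max_km=50.0, w=0.7):
          """Weighted-sum dissimilarity between two ZIP codes, combining
          centroid distance with the intersecting road network. The
          normalisation constants and the weight w are invented for
          illustration; the paper calibrates its own metric.

          centroid_km: distance between the two ZIP centroids (km)
          road_links:  road segments crossing the shared boundary
          """
          dist_term = min(centroid_km / max_km, 1.0)  # scale to [0, 1]
          road_term = 1.0 / (1.0 + road_links)        # more roads -> closer
          return w * dist_term + (1.0 - w) * road_term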

  15. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.

    PubMed

    Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee

    2018-03-24

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.

  16. H2FIRST Reference Station Design Task: Project Deliverable 2-2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratt, Joseph; Terlip, Danny; Ainscough, Chris

    2015-04-20

    This report presents near-term station cost results and discusses cost trends of different station types. It compares various vehicle rollout scenarios and projects realistic near-term station utilization values using the station infrastructure rollout in California as an example. It describes near-term market demands and matches those to cost-effective station concepts. Finally, the report contains detailed designs for five selected stations, which include piping and instrumentation diagrams, bills of materials, and several site-specific layout studies that incorporate the setbacks required by NFPA 2, the National Fire Protection Association Hydrogen Technologies Code. This work identified those setbacks as a significant factor affecting the ability to site a hydrogen station, particularly liquid stations at existing gasoline stations. For all station types, utilization has a large influence on the financial viability of the station.

  17. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kWe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  18. "Hour of Code": Can It Change Students' Attitudes toward Programming?

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2016-01-01

    The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…

  19. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  20. Stuck on Screens: Patterns of Computer and Gaming Station Use in Youth Seen in a Psychiatric Clinic

    PubMed Central

    Baer, Susan; Bogusz, Elliot; Green, David A.

    2011-01-01

    Objective: Computer and gaming-station use has become entrenched in the culture of our youth. Parents of children with psychiatric disorders report concerns about overuse, but research in this area is limited. The goal of this study is to evaluate computer/gaming-station use in adolescents in a psychiatric clinic population and to examine the relationship between use and functional impairment. Method: 102 adolescents, ages 11–17, from out-patient psychiatric clinics participated. Amount of computer/gaming-station use, type of use (gaming or non-gaming), and presence of addictive features were ascertained along with emotional/functional impairment. Multivariate linear regression was used to examine correlations between patterns of use and impairment. Results: Mean screen time was 6.7±4.2 hrs/day. Presence of addictive features was positively correlated with emotional/functional impairment. Time spent on computer/gaming-station use was not correlated overall with impairment after controlling for addictive features, but non-gaming time was positively correlated with risky behavior in boys. Conclusions: Youth with psychiatric disorders are spending much of their leisure time on the computer/gaming-station, and a substantial subset shows addictive features of use, which are associated with impairment. Further research to develop measures and to evaluate risk is needed to identify the impact of this problem. PMID:21541096

  1. Guidelines for developing vectorizable computer programs

    NASA Technical Reports Server (NTRS)

    Miner, E. W.

    1982-01-01

    Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
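
    The core restructuring principle carries over to any array-oriented setting. The sketch below, a NumPy stand-in for the paper's FORTRAN examples, contrasts an element-by-element loop with the whole-array form that a vector machine can exploit; the smoothing stencil itself is invented for illustration.

      import numpy as np

      def smooth_scalar(u):
          """Element-by-element form: one value per trip through the loop."""
          v = u.copy()
          for i in range(1, len(u) - 1):
              v[i] = 0.25 * u[i - 1] + 0.5 * u[i] + 0.25 * u[i + 1]
          return v

      def smooth_vectorized(u):
          """Whole-array form: the same stencil as one array operation,
          the structure a vectorizing compiler (or NumPy) can exploit."""
          v = u.copy()
          v[1:-1] = 0.25 * u[:-2] + 0.5 * u[1:-1] + 0.25 * u[2:]
          return v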

  2. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months effort in the development of a user oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.

  3. Integrated dynamic analysis simulation of space stations with controllable solar arrays (supplemental data and analyses)

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    Space station and solar array data are presented, together with the analyses performed in support of the integrated dynamic analysis study. The analysis methods and the formulated digital simulation were developed. Control systems for space station attitude control and solar array orientation control include generic type control systems. These systems have been digitally coded and included in the simulation.

  4. Enhanced fault-tolerant quantum computing in d-level systems.

    PubMed

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  5. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.

  6. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

    The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.

  7. Nonuniform code concatenation for universal fault-tolerant quantum computing

    NASA Astrophysics Data System (ADS)

    Nikahd, Eesa; Sedighi, Mehdi; Saheb Zamani, Morteza

    2017-09-01

    Using transversal gates is a straightforward and efficient technique for fault-tolerant quantum computing. Since transversal gates alone cannot be computationally universal, they must be combined with other approaches such as magic state distillation, code switching, or code concatenation to achieve universality. In this paper we propose an alternative approach for universal fault-tolerant quantum computing, mainly based on the code concatenation approach proposed in [T. Jochym-O'Connor and R. Laflamme, Phys. Rev. Lett. 112, 010505 (2014), 10.1103/PhysRevLett.112.010505], but in a nonuniform fashion. The proposed approach is described based on nonuniform concatenation of the 7-qubit Steane code with the 15-qubit Reed-Muller code, as well as the 5-qubit code with the 15-qubit Reed-Muller code, which lead to two 49-qubit and 47-qubit codes, respectively. These codes can correct any arbitrary single physical error with the ability to perform a universal set of fault-tolerant gates, without using magic state distillation.

  8. Goldstone (GDSCC) administrative computing

    NASA Technical Reports Server (NTRS)

    Martin, H.

    1981-01-01

    The GDSCC Data Processing Unit provides various administrative computing services for Goldstone. These activities, including finance, manpower and station utilization, deep-space station scheduling, and engineering change order (ECO) control, are discussed.

  9. Computer Analysis of Electromagnetic Field Exposure Hazard for Space Station Astronauts during Extravehicular Activity

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Kelley, James S.; Panneton, Robert B.; Arndt, G. Dickey

    1995-01-01

    In order to estimate the RF radiation hazards to astronauts and electronics equipment due to various Space Station transmitters, the electric fields around the various Space Station antennas are computed using rigorous Computational Electromagnetics (CEM) techniques. The Method of Moments (MoM) was applied to the UHF and S-band low gain antennas. The Aperture Integration (AI) method and the Geometrical Theory of Diffraction (GTD) method were used to compute the electric field intensities for the S- and Ku-band high gain antennas. As a result of this study, the regions in which the electric fields exceed the specified exposure levels for the Extravehicular Mobility Unit (EMU) electronics equipment and the Extravehicular Activity (EVA) astronaut are identified for various Space Station transmitters.

  10. Feasibility of Acoustic Doppler Velocity Meters for the Production of Discharge Records from U.S. Geological Survey Streamflow-Gaging Stations

    USGS Publications Warehouse

    Morlock, Scott E.; Nguyen, Hieu T.; Ross, Jerry H.

    2002-01-01

    It is feasible to use acoustic Doppler velocity meters (ADVM's) installed at U.S. Geological Survey (USGS) streamflow-gaging stations to compute records of river discharge. ADVM's are small acoustic current meters that use the Doppler principle to measure water velocities in a two-dimensional plane. Records of river discharge can be computed from stage and ADVM velocity data using the 'index velocity' method: the ADVM-measured velocities are used as an estimator, or 'index', of the mean velocity in the channel. In evaluations of ADVM's for the computation of records of river discharge, the USGS installed ADVM's at three streamflow-gaging stations in Indiana: Kankakee River at Davis, Fall Creek at Millersville, and Iroquois River near Foresman. The ADVM evaluation study period was from June 1999 to February 2001. Discharge records were computed using ADVM data from each station. Discharge records also were computed using conventional stage-discharge methods of the USGS. The records produced from ADVM and conventional methods were compared with discharge record hydrographs and statistics. Overall, the records from the Kankakee River and Fall Creek stations compared closely. For the Iroquois River station, variable backwater was present and affected the comparison; because the ADVM record compensates for backwater, the ADVM record may be superior to the conventional record. For the three stations, the ADVM records were judged to be of a quality acceptable to USGS standards for publication, and near real-time ADVM-computed discharges are served on USGS real-time data World Wide Web pages.
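
    In its simplest form, the index velocity method reduces to two fitted relations, one giving channel area as a function of stage and one giving mean channel velocity as a function of the ADVM index velocity; discharge is their product. The sketch below uses linear placeholder ratings for illustration only; the USGS fits station-specific relations.

      import numpy as np

      def discharge_index_velocity(stage, index_v, area_coef, rating_coef):
          """Index-velocity discharge: Q = A(stage) * Vmean, with Vmean
          estimated from the ADVM 'index' velocity through a linear rating.
          Both ratings here are illustrative placeholders.

          stage:   gage height (ft)
          index_v: ADVM-measured index velocity (ft/s)
          """
          a0, a1 = area_coef      # channel area rating, A = a0 + a1*stage
          b0, b1 = rating_coef    # mean velocity rating vs. index velocity
          area = a0 + a1 * np.asarray(stage)
          v_mean = b0 + b1 * np.asarray(index_v)
          return area * v_mean    # discharge in ft^3/s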

  11. Green's function methods in heavy ion shielding

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Costen, Robert C.; Shinn, Judy L.; Badavi, Francis F.

    1993-01-01

    An analytic solution to the heavy ion transport in terms of Green's function is used to generate a highly efficient computer code for space applications. The efficiency of the computer code is accomplished by a nonperturbative technique extending Green's function over the solution domain. The computer code can also be applied to accelerator boundary conditions to allow code validation in laboratory experiments.

  12. A Comparative Study : Microprogrammed Vs Risc Architectures For Symbolic Processing

    NASA Astrophysics Data System (ADS)

    Heudin, J. C.; Metivier, C.; Demigny, D.; Maurin, T.; Zavidovique, B.; Devos, F.

    1987-05-01

    It is often claimed that conventional computers are not well suited for human-like tasks: Vision (Image Processing), Intelligence (Symbolic Processing), etc. In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists in a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design an instruction set well suited for symbolic processing, which reduces the semantic gap between the high-level language and the machine code. In this framework, the RISC concept provides a convenient approach to studying new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.

  13. Analytical modeling of operating characteristics of premixing-prevaporizing fuel-air mixing passages. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.

    1982-01-01

    A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.

  14. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  15. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  16. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  17. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  18. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Method of computing coverage. 80.771 Section 80.771 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method...

  19. Automated Meteor Fluxes with a Wide-Field Meteor Camera Network

    NASA Technical Reports Server (NTRS)

    Blaauw, R. C.; Campbell-Brown, M. D.; Cooke, W.; Weryk, R. J.; Gill, J.; Musci, R.

    2013-01-01

    Within NASA, the Meteoroid Environment Office (MEO) is charged to monitor the meteoroid environment in near-Earth space for the protection of satellites and spacecraft. The MEO has recently established a two-station system to calculate automated meteor fluxes in the millimeter-size range. The cameras each consist of a 17 mm focal length Schneider lens on a Watec 902H2 Ultimate CCD video camera, producing a 21.7 x 16.3 degree field of view. This configuration has a red-sensitive limiting meteor magnitude of about +5. The stations are located in the southeastern USA, 31.8 kilometers apart, and are aimed at a location 90 km above a point 50 km equidistant from each station, which optimizes the common volume. Both single-station and double-station fluxes are found, each having benefits; more meteors will be detected in a single camera than will be seen in both cameras, producing a better-determined flux, but double-station detections allow for unambiguous shower associations and permit speed/orbit determinations. Video from the cameras is fed into Linux computers running the ASGARD (All Sky and Guided Automatic Real-time Detection) software, created by Rob Weryk of the University of Western Ontario Meteor Physics Group. ASGARD performs the meteor detection/photometry and invokes the MILIG and MORB codes to determine the trajectory, speed, and orbit of the meteor. A subroutine in ASGARD allows for approximate shower identification in single-station meteors. The ASGARD output is used in routines to calculate the flux in units of #/sq km/hour. The flux algorithm employed here differs from others currently in use in that it does not assume a single height for all meteors observed in the common camera volume. In the MEO system, the volume is broken up into a set of height intervals, with the collecting areas determined by the radiant of the active shower or sporadic source. The flux per height interval is summed to obtain the total meteor flux. As ASGARD also computes the meteor mass from the photometry, a mass flux can also be calculated. Weather conditions in the southeastern United States are seldom ideal, which introduces the difficulty of a variable sky background. First, a weather algorithm indicates if sky conditions are clear enough to calculate fluxes, at which point a limiting magnitude algorithm is employed. The limiting magnitude algorithm performs a fit of stellar magnitudes vs. camera intensities. The stellar limiting magnitude is derived from this and easily converted to a limiting meteor magnitude for the active shower or sporadic source.
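
    The height-interval flux summation at the heart of the algorithm reduces to a short computation. The sketch below assumes the per-interval meteor counts and collecting areas are already in hand; in the MEO system these come from the camera overlap volume and the radiant of the active shower or sporadic source.

      def total_flux(counts, areas_km2, hours):
          """Meteor flux summed over height intervals: interval i
          contributes N_i / (A_i * T) meteors per sq km per hour, with
          A_i the collecting area of that interval. Inputs here are
          illustrative placeholders."""
          return sum(n / (a * hours) for n, a in zip(counts, areas_km2))

      # Example: three height intervals observed for 4 clear hours
      # total_flux([5, 9, 4], [120.0, 150.0, 110.0], 4.0)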

  20. Automated apparatus and method of generating native code for a stitching machine

    NASA Technical Reports Server (NTRS)

    Miller, Jeffrey L. (Inventor)

    2000-01-01

    A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
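
    The claimed logic amounts to a walk over the stitch points with a constraint test between consecutive points. The sketch below is a toy rendering of that loop; the instruction mnemonics and the constraint callback are placeholders, since the abstract does not specify the machine's native instruction set.

      def generate_stitch_code(points, constraint_between):
          """Walk the stitch points; when a constraint lies between the
          present and next point, emit a head-condition change before the
          stitch. The G/M mnemonics are invented placeholders."""
          program = []
          for present, nxt in zip(points, points[1:]):
              if constraint_between(present, nxt):
                  program.append(f'M50 ; change head condition before {nxt}')
              program.append(f'G01 X{nxt[0]:.3f} Y{nxt[1]:.3f} ; stitch')
          return program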

  1. Computer codes developed and under development at Lewis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1992-01-01

    The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.

  2. A local network integrated into a balloon-borne apparatus

    NASA Astrophysics Data System (ADS)

    Imori, Masatosi; Ueda, Ikuo; Shimamura, Kotaro; Maeno, Tadashi; Murata, Takahiro; Sasaki, Makoto; Matsunaga, Hiroyuki; Matsumoto, Hiroshi; Shikaze, Yoshiaki; Anraku, Kazuaki; Matsui, Nagataka; Yamagami, Takamasa

    A local network is incorporated into an apparatus for a balloon-borne experiment. The balloon-borne system implemented in the apparatus is composed of subsystems interconnected through a local network, which introduces a modular architecture into the system. The network decomposes the balloon-borne system into subsystems that are similarly structured from the point of view of keeping the system under the control of a ground station. Each subsystem is functionally self-contained and electrically independent. A computer is integrated into each subsystem, keeping the subsystem under control. An independent group of batteries, dedicated to a subsystem, supplies all of that subsystem's electricity. A subsystem can therefore be turned on and off independently of the other subsystems, so communication among the subsystems needs to be based on a protocol that guarantees the independence of the individual subsystems. The Omninet protocol is employed to network the subsystems. A ground station sends commands to the balloon-borne system. A command is received and executed at the system, and the results of the execution are returned to the ground station. Various commands are available so that the system borne on a balloon can be controlled and monitored remotely from the ground station. Each subsystem responds to a specific group of commands. A command is received by a transceiver subsystem and then transferred through the network to the subsystem to which the command is addressed. That subsystem executes the command and returns the results to the transceiver subsystem, where they are telemetered to the ground station. The network enhances the independence of the individual subsystems, which enables the programs of the individual subsystems to be coded independently. This independence facilitates development and debugging of programs, improving the quality of the system borne on a balloon.
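
    The command path described here, in which the ground station addresses one specific subsystem through the transceiver subsystem, amounts to a dispatch table keyed by subsystem, as in the illustrative sketch below; the subsystem IDs, command names, and telemetry strings are all invented.

      HANDLERS = {}

      def subsystem(sid):
          """Register a handler for one subsystem ID (invented IDs)."""
          def register(fn):
              HANDLERS[sid] = fn
              return fn
          return register

      @subsystem('PWR')
      def power_handler(cmd, arg):
          if cmd == 'STATUS':
              return 'PWR OK 28.1V'   # placeholder battery telemetry
          raise ValueError(f'unknown command {cmd}')

      def transceiver_dispatch(sid, cmd, arg=None):
          """Route a ground command to its subsystem; the returned string
          is what would be telemetered back to the ground station."""
          if sid not in HANDLERS:
              return f'ERR no such subsystem {sid}'
          try:
              return HANDLERS[sid](cmd, arg)
          except Exception as exc:
              return f'ERR {exc}'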

  3. Inverse identification of unknown finite-duration air pollutant release from a point source in urban environment

    NASA Astrophysics Data System (ADS)

    Kovalets, Ivan V.; Efthimiou, George C.; Andronopoulos, Spyros; Venetsanos, Alexander G.; Argyropoulos, Christos D.; Kakosimos, Konstantinos E.

    2018-05-01

    In this work, we present an inverse computational method for the identification of the location, start time, duration, and quantity of emitted substance of an unknown air pollution source of finite duration in an urban environment. We considered a problem of transient pollutant dispersion under stationary meteorological fields, which is a reasonable assumption for the assimilation of concentration measurements available within 1 h of the start of an incident. We optimized the calculation of the source-receptor function by developing a method that requires integrating only as many backward adjoint equations as there are measurement stations, which results in high numerical efficiency. The source parameters are computed by maximizing the correlation function of the simulated and observed concentrations. The method has been integrated into the CFD code ADREA-HF and tested successfully by performing a series of source inversion runs using the data of 200 individual realizations of puff releases previously generated in a wind tunnel experiment.
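
    Once the source-receptor function is available, the final inversion step is a search for the candidate source that maximizes the correlation between simulated and observed concentrations. The brute-force sketch below illustrates only that step: simulate() stands in for the adjoint-based source-receptor computation, and the explicit candidate enumeration is an assumption made here for clarity.

      import numpy as np

      def best_source(candidates, observed, simulate):
          """Pick the candidate source whose simulated station
          concentrations correlate best with the observations.
          simulate(c) is any callable returning the modelled
          concentrations at the stations for candidate c."""
          obs = np.asarray(observed, dtype=float)

          def score(c):
              sim = np.asarray(simulate(c), dtype=float)
              return np.corrcoef(sim, obs)[0, 1]   # Pearson correlation

          return max(candidates, key=score)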

  4. Low-flow analysis and selected flow statistics representative of 1930-2002 for streamflow-gaging stations in or near West Virginia

    USGS Publications Warehouse

    Wiley, Jeffrey B.

    2006-01-01

    Five time periods between 1930 and 2002 are identified as having distinct patterns of annual minimum daily mean flows (minimum flows). Average minimum flows increased around 1970 at many streamflow-gaging stations in West Virginia. Before 1930, however, there might have been a period of minimum flows greater than any period identified between 1930 and 2002. The effects of climate variability are probably the principal causes of the differences among the five time periods. Comparisons of selected streamflow statistics are made between values computed for the five identified time periods and values computed for the 1930-2002 interval for 15 streamflow-gaging stations. The average difference between statistics computed for the five time periods and the 1930-2002 interval decreases with increasing magnitude of the low-flow statistic. The greatest individual-station absolute difference was for the 7-day 10-year low flow computed for 1970-1979, which was 582.5 percent greater than the value computed for 1930-2002. The hydrologically based low flows show approximately equal or smaller absolute differences than the biologically based low flows. The average 1-day 3-year biologically based low flow (1B3) and 4-day 3-year biologically based low flow (4B3) are less than the average 1-day 10-year hydrologically based low flow (1Q10) and 7-day 10-year hydrologically based low flow (7Q10), respectively, and range between 28.5 percent less and 13.6 percent greater. Seasonally, the average difference between low-flow statistics computed for the five time periods and 1930-2002 is not consistent between magnitudes of low-flow statistics, and the greatest difference is for the summer (July 1-September 30) and fall (October 1-December 31) for the same time period as the greatest difference determined in the annual analysis. The greatest average difference between 1B3 and 4B3 compared to 1Q10 and 7Q10, respectively, is in the spring (April 1-June 30), ranging between 11.6 and 102.3 percent greater. Statistics computed for an individual station's record period may not represent the statistics computed for the period 1930 to 2002 because (1) station records are available predominantly after about 1970, when minimum flows were greater than the average between 1930 and 2002, and (2) some short-term station records fall mostly within dry periods, whereas others fall mostly within wet periods. A criterion-based sampling of the individual stations' record periods was taken to reduce the effects of statistics computed for the entire record periods not representing the statistics computed for 1930-2002. The criterion used to sample the entire record periods is based on a comparison between the regional minimum flows and the minimum flows at the stations. Criterion-based sampling of the available record periods was superior to record-extension techniques for this study because more stations were selected and the areal distribution of stations was more widespread. Principal component and correlation analyses of the minimum flows at 20 stations in or near West Virginia identify three regions of the State encompassing stations with similar patterns of minimum flows: the Lower Appalachian Plateaus, the Upper Appalachian Plateaus, and the Eastern Panhandle. All record periods of 10 years or greater between 1930 and 2002 where the average of the regional minimum flows is nearly equal to the average for 1930-2002 are determined to be representative of 1930-2002.
Selected statistics are presented for the longest representative record period that matches the record period for 77 stations in West Virginia and 40 stations near West Virginia. These statistics can be used to develop equations for estimating flow in ungaged stream locations.

  5. The development of fuel performance models at the European institute for transuranium elements

    NASA Astrophysics Data System (ADS)

    Lassmann, K.; Ronchi, C.; Small, G. J.

    1989-07-01

    The design and operational performance of fuel rods for nuclear power stations has been the subject of detailed experimental research for over thirty years. In the last two decades the continuous demands for greater economy in conjunction with more stringent safety criteria have led to an increasing reliance on computer simulations. Conditions within a fuel rod must be calculated both for normal operation and for proposed reactor faults. It has thus been necessary to build up a reliable, theoretical understanding of the intricate physical, mechanical and chemical processes occurring under a wide range of conditions to obtain a quantitative insight into the behaviour of the fuel. A prime requirement, which has also proved to be the most taxing, is to predict the conditions under which failure of the cladding might occur, particularly in fuel nearing the end of its useful life. In this paper the general requirements of a fuel performance code are discussed briefly and an account is given of the basic concepts of code construction. An overview is then given of recent progress at the European Institute for Transuranium Elements in the development of a fuel rod performance code for general application and of more detailed mechanistic models for fission product behaviour.

  6. 47 CFR 73.185 - Computation of interfering signal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Computation of interfering signal. 73.185... RADIO BROADCAST SERVICES AM Broadcast Stations § 73.185 Computation of interfering signal. (a) Measured... paragraphs (a) (1) or (2) of this section. (b) For skywave signals from stations operating on all channels...

  7. 47 CFR 73.185 - Computation of interfering signal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Computation of interfering signal. 73.185... RADIO BROADCAST SERVICES AM Broadcast Stations § 73.185 Computation of interfering signal. (a) Measured... paragraphs (a) (1) or (2) of this section. (b) For skywave signals from stations operating on all channels...

  8. 47 CFR 73.185 - Computation of interfering signal.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Computation of interfering signal. 73.185... RADIO BROADCAST SERVICES AM Broadcast Stations § 73.185 Computation of interfering signal. (a) Measured... paragraphs (a) (1) or (2) of this section. (b) For skywave signals from stations operating on all channels...

  9. 47 CFR 73.185 - Computation of interfering signal.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Computation of interfering signal. 73.185... RADIO BROADCAST SERVICES AM Broadcast Stations § 73.185 Computation of interfering signal. (a) Measured... paragraphs (a) (1) or (2) of this section. (b) For skywave signals from stations operating on all channels...

  10. 47 CFR 73.185 - Computation of interfering signal.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Computation of interfering signal. 73.185... RADIO BROADCAST SERVICES AM Broadcast Stations § 73.185 Computation of interfering signal. (a) Measured... paragraphs (a) (1) or (2) of this section. (b) For skywave signals from stations operating on all channels...

  11. Reactivity effects in VVER-1000 of the third unit of the Kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru

    2010-12-15

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  12. Reactivity effects in VVER-1000 of the third unit of the Kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    NASA Astrophysics Data System (ADS)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.

    2010-12-01

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  13. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  14. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    NASA Technical Reports Server (NTRS)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
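
    To make the paradigm contrast concrete, here is a minimal hypothetical sketch (in Python rather than the Fortran discussed above) of the OOP approach to a time-domain CEM kernel: the grid object owns its field arrays and update rule, instead of bare arrays being threaded through subroutines. This is an illustration of the design idea, not the paper's code.

      import numpy as np

      class Yee1D:
          """Minimal 1-D FDTD grid: fields and update rules live together
          in one object (hypothetical example of the OOP design style)."""
          def __init__(self, n_cells: int, courant: float = 0.5):
              self.ez = np.zeros(n_cells)      # electric field
              self.hy = np.zeros(n_cells - 1)  # magnetic field
              self.c = courant                 # normalized Courant number

          def step(self) -> None:
              """Advance the fields one time step (free space, 1-D Yee scheme)."""
              self.hy += self.c * np.diff(self.ez)
              self.ez[1:-1] += self.c * np.diff(self.hy)

          def inject(self, i: int, value: float) -> None:
              self.ez[i] += value              # soft source

      grid = Yee1D(200)
      for t in range(400):
          grid.inject(100, np.exp(-((t - 30) / 10) ** 2))  # Gaussian pulse
          grid.step()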

  15. IET. Coupling station (TAN620) and service room section and details. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Coupling station (TAN-620) and service room section and details. Interior electrical features inside coupling station. Cable terminal assembly for patch panel for plug. Ralph M. Parsons 902-4-ANP-620-E 401. Date: February 1954. Approved by INEEL Classification Office for public release. INEEL index code no. 035-0620-10-693-106958 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  16. Web Card - Clean Cities Plug-In Electric Vehicle Handbook for Public Charging Station Hosts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    A 2" x 3-1/4" web card which has a quick response code for accessing the PEV Handbook for Public Charging Station Hosts via a smart phone. The cards are intended to be handed out instead of the handbook.

  17. Computer Description of the Field Artillery Ammunition Supply Vehicle

    DTIC Science & Technology

    1983-04-01

    Keywords: Combinatorial Geometry (COM-GEOM); GIFT computer code; computer target description. Abstract (fragment): ... input to the GIFT computer code to generate target vulnerability data. ... Combinatorial Geometry (COM-GEOM) description. The "Geometric Information for Targets" (GIFT) computer code accepts the COM-GEOM description and ...

  18. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  19. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  20. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  1. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...

  2. Computer/gaming station use in youth: Correlations among use, addiction and functional impairment

    PubMed Central

    Baer, Susan; Saran, Kelly; Green, David A

    2012-01-01

    OBJECTIVE: Computer/gaming station use is ubiquitous in the lives of youth today. Overuse is a concern, but it remains unclear whether problems arise from addictive patterns of use or simply excessive time spent on use. The goal of the present study was to evaluate computer/gaming station use in youth and to examine the relationship between amounts of use, addictive features of use and functional impairment. METHOD: A total of 110 subjects (11 to 17 years of age) from local schools participated. Time spent on television, video gaming and non-gaming recreational computer activities was measured. Addictive features of computer/gaming station use were ascertained, along with emotional/behavioural functioning. Multiple linear regressions were used to understand how youth functioning varied with time of use and addictive features of use. RESULTS: Mean (± SD) total screen time was 4.5±2.4 h/day. Addictive features of use were consistently correlated with functional impairment across multiple measures and informants, whereas time of use, after controlling for addiction, was not. CONCLUSIONS: Youth are spending many hours each day in front of screens. In the absence of addictive features of computer/gaming station use, time spent is not correlated with problems; however, youth with addictive features of use show evidence of poor emotional/behavioural functioning. PMID:24082802

  3. Antenna pattern study, task 2

    NASA Technical Reports Server (NTRS)

    Harper, Warren

    1989-01-01

    Two electromagnetic scattering codes, NEC-BSC and ESP3, were delivered and installed on a NASA VAX computer for use by Marshall Space Flight Center antenna design personnel. The tasks were to update the existing codes and certain supplementary software, to install the codes on a computer that will be delivered to the customer, to provide capability for graphic display of the data computed by the codes, and to assist the customer in the solution of specific problems that demonstrate the use of the codes. With the exception of one code revision, all of these tasks were performed.

  4. Surface wave phase velocities from 2-D surface wave tomography studies in the Anatolian plate

    NASA Astrophysics Data System (ADS)

    Arif Kutlu, Yusuf; Erduran, Murat; Çakır, Özcan; Vinnik, Lev; Kosarev, Grigoriy; Oreshin, Sergey

    2014-05-01

    We study the Rayleigh and Love surface wave fundamental mode propagation beneath the Anatolian plate. To examine the inter-station phase velocities, a two-station method is used along with the Multiple Filter Technique (MFT) in the Computer Programs in Seismology (Herrmann and Ammon, 2004). The near-station waveform is deconvolved from the far-station waveform, removing the propagation effects between the source and the station. This method requires that the near and far stations are aligned with the epicentre on a great circle path. The azimuthal difference of the earthquake to the two stations and the azimuthal difference between the earthquake and the station are restricted to be smaller than 5°. We selected 3378 teleseismic events (Mw >= 5.7) recorded by 394 broadband local stations with high signal-to-noise ratio within the years 1999-2013. After correction for the instrument response, suitable seismogram pairs are analyzed with the two-station method, yielding a collection of phase velocity curves in various period ranges (mainly in the range 25-185 sec). Diffraction from lateral heterogeneities, multipathing, and interference of Rayleigh and Love waves can alter the dispersion measurements. In order to obtain quality measurements, we select only smooth portions of the phase velocity curves, remove outliers and average over many measurements. We discard average phase velocity curves suspected of suffering from phase-wrapping errors by comparing them with a reference Earth model (IASP91 by Kennett and Engdahl, 1991). The outlined analysis procedure yields 3035 Rayleigh and 1637 Love individual phase velocity curves. To obtain Rayleigh and Love wave travel times for a given region we performed 2-D tomographic inversion, for which the Fast Marching Surface Tomography (FMST) code developed by N. Rawlinson at the Australian National University was utilized. This software package is based on the multistage fast marching method by Rawlinson and Sambridge (2004a, 2004b). The azimuthal coverage of the respective two-station paths is adequate for analyzing the observed dispersion curves in terms of both azimuthal and radial anisotropy beneath the study region. This research is supported by a Joint Research Project of the Scientific and Research Council of Turkey (TUBİTAK Grant number 111Y190) and the Russian Federation for Basic Research (RFBR).
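
    A minimal sketch of the inter-station phase-velocity idea described above, under strong simplifying assumptions (equal-length records of the same event at two aligned stations, known inter-station distance, and a cycle count fixed by the caller, e.g. from a reference model): the study itself uses deconvolution and multiple filtering rather than the bare cross-spectrum shown here.

      import numpy as np

      def interstation_phase_velocity(near, far, dt, dist_km, n_cycles=0):
          """Cross-spectrum sketch: the phase lag accumulated between the two
          stations gives c(f) = 2*pi*f*dist / (phase + 2*pi*n_cycles), where
          n_cycles resolves the cycle ambiguity.  Records must share length
          and sampling interval dt."""
          spec = np.fft.rfft(far) * np.conj(np.fft.rfft(near))
          freqs = np.fft.rfftfreq(len(near), dt)
          phase = np.unwrap(np.angle(spec))          # inter-station phase lag
          with np.errstate(divide="ignore", invalid="ignore"):
              c = 2.0 * np.pi * freqs * dist_km / (phase + 2.0 * np.pi * n_cycles)
          return freqs, c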

  5. Cloud Compute for Global Climate Station Summaries

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; May, B.; Cogbill, P.

    2017-12-01

    Global Climate Station Summaries are simple indicators of observational normals which include climatic data summarizations and frequency distributions. These typically are statistical analyses of station data over 5-, 10-, 20-, 30-year or longer time periods. The summaries are computed from the global surface hourly dataset. This dataset, totaling over 500 gigabytes, comprises 40 different types of weather observations from 20,000 stations worldwide. NCEI and the U.S. Navy developed these value-added products in the form of hourly summaries from many of these observations. Enabling this compute functionality in the cloud is the focus of the project. An overview of the approach and the challenges associated with application transition to the cloud will be presented.
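
    A sketch of the kind of summary computation involved, with hypothetical file and column names: grouping a 30-year window of hourly observations by station and hour of day to produce simple climatological normals.

      import pandas as pd

      # Hypothetical layout: one row per hourly observation.
      obs = pd.read_csv("surface_hourly.csv", parse_dates=["timestamp"])
      period = obs[obs["timestamp"].dt.year.between(1991, 2020)]   # 30-year window
      summary = (period
                 .assign(hour=period["timestamp"].dt.hour)
                 .groupby(["station_id", "hour"])["air_temp"]
                 .agg(["mean", "std", "count"]))                   # hourly normals
      summary.to_csv("hourly_normals.csv")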

  6. Micrometeoroid and Orbital Debris Risk Assessment With Bumper 3

    NASA Technical Reports Server (NTRS)

    Hyde, J.; Bjorkman, M.; Christiansen, E.; Lear, D.

    2017-01-01

    The Bumper 3 computer code is the primary tool used by NASA for micrometeoroid and orbital debris (MMOD) risk analysis. Bumper 3 (and its predecessors) have been used to analyze a variety of manned and unmanned spacecraft. The code uses NASA's latest micrometeoroid (MEM-R2) and orbital debris (ORDEM 3.0) environment definition models and is updated frequently with ballistic limit equations that describe the hypervelocity impact performance of spacecraft materials. The Bumper 3 program uses these inputs along with a finite element representation of spacecraft geometry to provide a deterministic calculation of the expected number of failures. The Bumper 3 software is configuration controlled by the NASA/JSC Hypervelocity Impact Technology (HVIT) Group. This paper will demonstrate MMOD risk assessment techniques with Bumper 3 used by NASA's HVIT Group. The Permanent Multipurpose Module (PMM) was added to the International Space Station in 2011. A Bumper 3 MMOD risk assessment of this module will show techniques used to create the input model and assign the property IDs. The methodology used to optimize the MMOD shielding for minimum mass while still meeting structural penetration requirements will also be demonstrated.
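
    The deterministic expected-failures output that Bumper-class tools produce feeds a standard Poisson relation; a toy sketch with made-up element data:

      import math

      # Expected number of penetrating MMOD impacts, N, is accumulated per
      # finite element as flux(above the ballistic limit) * area * exposure
      # time; the probability of no penetration follows from Poisson statistics.
      # Hypothetical element data: (flux per m^2-yr above limit, area in m^2)
      elements = [(2.1e-6, 1.4), (5.0e-7, 2.2), (1.3e-6, 0.8)]
      years = 10.0
      N = sum(flux * area * years for flux, area in elements)
      p_no_penetration = math.exp(-N)       # Poisson zero-event probability
      print(f"expected failures N = {N:.3e}, PNP = {p_no_penetration:.6f}")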

  7. MELCOR simulations of the severe accident at Fukushima Daiichi Unit 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey; Gauntt, Randall; Kalinich, Donald

    In response to the accident at the Fukushima Daiichi nuclear power station in Japan, the U.S. Nuclear Regulatory Commission and U.S. Department of Energy agreed to jointly sponsor an accident reconstruction study as a means of assessing the severe accident modeling capability of the MELCOR code. Objectives of the project included reconstruction of the accident progressions using computer models and accident data, and validation of the MELCOR code and the Fukushima models against plant data. A MELCOR 2.1 model of the Fukushima Daiichi Unit 3 reactor is developed using plant-specific information and accident-specific boundary conditions, which involve considerable uncertainty due to the inherent nature of severe accidents. Publicly available thermal-hydraulic data and radioactivity release estimates have evolved significantly since the accidents. Such data are expected to continually change as the reactors are decommissioned and more measurements are performed. As a result, the MELCOR simulations in this work primarily use boundary conditions that are based on available plant data as of May 2012.

  8. Experimental unsteady pressures at flutter on the Supercritical Wing Benchmark Model

    NASA Technical Reports Server (NTRS)

    Dansberry, Bryan E.; Durham, Michael H.; Bennett, Robert M.; Rivera, Jose A.; Silva, Walter A.; Wieseman, Carol D.; Turnock, David L.

    1993-01-01

    This paper describes selected results from the flutter testing of the Supercritical Wing (SW) model. This model is a rigid semispan wing having a rectangular planform and a supercritical airfoil shape. The model was flutter tested in the Langley Transonic Dynamics Tunnel (TDT) as part of the Benchmark Models Program, a multi-year wind tunnel activity currently being conducted by the Structural Dynamics Division of NASA Langley Research Center. The primary objective of this program is to assist in the development and evaluation of aeroelastic computational fluid dynamics codes. The SW is the second of a series of three similar models which are designed to be flutter tested in the TDT on a flexible mount known as the Pitch and Plunge Apparatus. Data sets acquired with these models, including simultaneous unsteady surface pressures and model response data, are meant to be used for correlation with analytical codes. Presented in this report are experimental flutter boundaries and corresponding steady and unsteady pressure distribution data acquired over two model chords located at the 60 and 95 percent span stations.

  9. MELCOR simulations of the severe accident at Fukushima Daiichi Unit 3

    DOE PAGES

    Cardoni, Jeffrey; Gauntt, Randall; Kalinich, Donald; ...

    2014-05-01

    In response to the accident at the Fukushima Daiichi nuclear power station in Japan, the U.S. Nuclear Regulatory Commission and U.S. Department of Energy agreed to jointly sponsor an accident reconstruction study as a means of assessing the severe accident modeling capability of the MELCOR code. Objectives of the project included reconstruction of the accident progressions using computer models and accident data, and validation of the MELCOR code and the Fukushima models against plant data. A MELCOR 2.1 model of the Fukushima Daiichi Unit 3 reactor is developed using plant-specific information and accident-specific boundary conditions, which involve considerable uncertainty due to the inherent nature of severe accidents. Publicly available thermal-hydraulic data and radioactivity release estimates have evolved significantly since the accidents. Such data are expected to continually change as the reactors are decommissioned and more measurements are performed. As a result, the MELCOR simulations in this work primarily use boundary conditions that are based on available plant data as of May 2012.

  10. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...

  11. Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB

    NASA Technical Reports Server (NTRS)

    Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.

    2017-01-01

    Demonstrating speedup for parallel code on a multicore shared memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit the potential for improvement of serial code even for so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed, such that commercial finite element (FE) and multi-body-dynamic (MBD) codes attempt to minimize these computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches for implementing parallel code were investigated; only the single program multiple data (spmd) method using composite objects provided positive results. Parallel code speedup is demonstrated, but the goal of linear speedup through the addition of processors was not achieved due to PC architecture.
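
    A Python analogue of the embarrassingly parallel finite-difference Jacobian described above (illustrative only; the paper's implementation uses MATLAB's spmd with composite objects):

      import numpy as np
      from concurrent.futures import ProcessPoolExecutor
      from functools import partial

      def _column(f, x, h, j):
          """Forward-difference approximation of Jacobian column j."""
          xp = x.copy()
          xp[j] += h
          return (f(xp) - f(x)) / h

      def jacobian_parallel(f, x, h=1e-7, workers=4):
          """One task per column; f must be a picklable top-level function
          returning a vector (f(x) is recomputed per column for simplicity)."""
          with ProcessPoolExecutor(max_workers=workers) as pool:
              cols = list(pool.map(partial(_column, f, x, h), range(len(x))))
          return np.column_stack(cols)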

  12. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of the Systems Assessment Capability performs many supporting functions.

  13. Geoid undulation computations at laser tracking stations

    NASA Technical Reports Server (NTRS)

    Despotakis, Vasilios K.

    1987-01-01

    Geoid undulation computations were performed at 29 laser stations distributed around the world using a combination of terrestrial gravity data within a cap of radius 2 deg and a potential coefficient set up to degree 180. The traditional methods of Stokes' and Meissl's modification, together with the Molodenskii method and the modified Sjoberg method, were applied. From numerical tests based on global error assumptions regarding the terrestrial data and the geopotential set, it was concluded that the modified Sjoberg method is the most accurate and promising technique for geoid undulation computations. The numerical computations for the geoid undulations using all four methods resulted in agreement with the ellipsoidal minus orthometric value of the undulations on the order of 60 cm or better for most of the laser stations in the eastern United States, Australia, Japan, Bermuda, and Europe. A systematic discrepancy of about 2 meters for most of the western United States stations was detected and verified by using two relatively independent data sets. For oceanic laser stations in the western Atlantic and Pacific oceans that have no terrestrial data available, the adjusted GEOS-3 and SEASAT altimeter data were used for the computation of the geoid undulation in a collocation method.
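
    For orientation, the classical (unmodified) Stokes integral that underlies all four methods is N = R/(4*pi*gamma) * integral of dg * S(psi) over the sphere. A simplified numerical sketch over a gridded anomaly cap follows; the study's modified kernels, far-zone geopotential contribution, and careful unit handling are omitted, and the grid spacing is an assumption.

      import numpy as np

      def stokes_kernel(psi):
          """Classical Stokes function S(psi), psi in radians (psi > 0)."""
          s = np.sin(psi / 2.0)
          return (1.0 / s - 6.0 * s + 1.0 - 5.0 * np.cos(psi)
                  - 3.0 * np.cos(psi) * np.log(s + s * s))

      def geoid_undulation(lat0, lon0, lat, lon, dg, R=6371000.0, gamma=9.81):
          """N = R/(4*pi*gamma) * sum(dg * S(psi) * dA) over gridded anomalies.
          All angles in radians; dg in m/s^2; a 0.1-degree cell size is assumed
          for the solid-angle element, and the grid must exclude the
          computation point itself (so that psi > 0)."""
          cospsi = (np.sin(lat0) * np.sin(lat)
                    + np.cos(lat0) * np.cos(lat) * np.cos(lon - lon0))
          psi = np.arccos(np.clip(cospsi, -1.0, 1.0))
          dA = np.cos(lat) * np.deg2rad(0.1) ** 2    # cell solid angle
          return R / (4.0 * np.pi * gamma) * np.sum(dg * stokes_kernel(psi) * dA)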

  14. SDTM - SYSTEM DESIGN TRADEOFF MODEL FOR SPACE STATION FREEDOM RELEASE 1.1

    NASA Technical Reports Server (NTRS)

    Chamberlin, R. G.

    1994-01-01

    Although extensive knowledge of space station design exists, the information is widely dispersed. The Space Station Freedom Program (SSFP) needs policies and procedures that ensure the use of consistent design objectives throughout its organizational hierarchy. The System Design Tradeoff Model (SDTM) produces information that can be used for this purpose. SDTM is a mathematical model of a set of possible designs for Space Station Freedom. Using the SDTM program, one can find the particular design which provides specified amounts of resources to Freedom's users at the lowest total (or life cycle) cost. One can also compare alternative design concepts by changing the set of possible designs, while holding the specified user services constant, and then comparing costs. Finally, both costs and user services can be varied simultaneously when comparing different designs. SDTM selects its solution from a set of feasible designs. Feasibility constraints include safety considerations, minimum levels of resources required for station users, budget allocation requirements, time limitations, and Congressional mandates. The total, or life cycle, cost includes all of the U.S. costs of the station: design and development, purchase of hardware and software, assembly, and operations throughout its lifetime. The SDTM development team has identified, for a variety of possible space station designs, the subsystems that produce the resources to be modeled. The team has also developed formulas for the cross consumption of resources by other resources, as functions of the amounts of resources produced. SDTM can find the values of station resources, so that subsystem designers can choose new design concepts that further reduce the station's life cycle cost. The fundamental input to SDTM is a set of formulas that describe the subsystems which make up a reference design. Most of the formulas identify how the resources required by each subsystem depend upon the size of the subsystem. Some of the formulas describe how the subsystem costs depend on size. The formulas can be complicated and nonlinear (if nonlinearity is needed to describe how designs change with size). SDTM's outputs are amounts of resources, life-cycle costs, and marginal costs. SDTM will run on IBM PC/XTs, ATs, and 100% compatibles with 640K of RAM and at least 3Mb of fixed-disk storage. A printer which can print in 132-column mode is also required, and a mathematics co-processor chip is highly recommended. This code is written in Turbo C 2.0. However, since the developers used a modified version of the proprietary Vitamin C source code library, the complete source code is not available. The executable is provided, along with all non-proprietary source code. This program was developed in 1989.
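
    SDTM itself is nonlinear and partly proprietary, but the shape of the underlying problem -- choose subsystem sizes that deliver the required user resources at minimum life-cycle cost -- can be sketched as a small linear program with entirely hypothetical numbers:

      from scipy.optimize import linprog

      # Toy version of the SDTM problem shape (all numbers made up):
      # choose subsystem sizes x to minimize life-cycle cost c @ x while
      # delivering at least the required amounts of user resources.
      c = [4.0, 7.0, 3.0]                  # cost per unit size of 3 subsystems
      A_ub = [[-2.0, -1.0, 0.0],           # -(power delivered)  <= -required
              [0.0, -1.5, -1.0]]           # -(crew time freed)  <= -required
      b_ub = [-10.0, -6.0]
      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
      print(res.x, res.fun)                # lowest-cost sizing and its cost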

  15. 77 FR 26543 - UGI Storage Company; Notice of Request Under Blanket Authorization

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ... approximately 3,450 horsepower (hp) of gas fired compression at its existing Palmer Station in Tioga County... compressor units and one 690 hp unit at the Palmer Station located at the downstream terminus of the TL-94...-3-12; 8:45 am] BILLING CODE 6717-01-P ...

  16. 47 CFR 80.357 - Working frequencies for Morse code and data transmission.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... a receive only frequency by ship stations. It is used by U.S. Coast Guard coast stations for NB-DP....0 6285.0 8342.0 12422.0 16619.0 22242.0 25161.5 8343.5 12453.0 16650.0 22273.0 16681.0 W2 4187.5...

  17. 78 FR 9745 - Kewaunee Power Station; Application for Amendment to Facility Operating License

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... FURTHER INFORMATION CONTACT: Karl Feintuch, Project Manager, Office of Nuclear Reactor Regulation, U.S... Licensing, Office of Nuclear Reactor Regulation. [FR Doc. 2013-03037 Filed 2-8-13; 8:45 am] BILLING CODE... NUCLEAR REGULATORY COMMISSION [Docket No. 50-305; NRC-2013-0028] Kewaunee Power Station...

  18. Comparison of the 2-, 25-, and 100-year recurrence interval floods computed from observed data with the 1995 urban flood-frequency estimating equations for Georgia

    USGS Publications Warehouse

    Inman, Ernest J.

    1997-01-01

    Flood-frequency relations were computed for 28 urban stations for 2-, 25-, and 100-year recurrence-interval floods, and the results were compared to corresponding recurrence-interval floods computed from the estimating equations of a 1995 investigation. Two stations were excluded from further comparisons or analyses because neither had a significant flood during the period of observed record. The comparisons, based on Student's t-test statistics at the 0.05 level of significance, indicate that the mean residuals of the 25- and 100-year floods were negatively biased by 26.2 percent and 31.6 percent, respectively, at the 26 stations. However, the mean residuals of the 2-year floods were 2.5 percent lower than the mean of the 2-year floods computed from the equations and were not significantly biased. The reason for this negative bias is that the period of observed record at the 26 stations was a relatively dry period. At 25 of the 26 stations, the two highest simulated peaks used to develop the estimating equations occurred many years before the observed record began. However, no attempt was made to adjust the estimating equations, because higher peaks could occur after the period of observed record and an adjustment to the equations would cause an underestimation of design floods.
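
    A sketch of the bias test described above, with made-up residuals: a one-sample Student's t-test on the percent residuals at the 0.05 significance level.

      import numpy as np
      from scipy import stats

      # Percent residuals between floods computed from observed records and
      # from the estimating equations (values below are illustrative only).
      residuals_pct = np.array([-31.0, -18.5, -40.2, -22.8, -29.3, -35.1,
                                -12.9, -27.4, -19.6, -30.8])
      t_stat, p_value = stats.ttest_1samp(residuals_pct, popmean=0.0)
      biased = p_value < 0.05              # significant at the 0.05 level?
      print(f"mean {residuals_pct.mean():.1f}%  p={p_value:.4f}  biased={biased}")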

  19. Interesting viewpoints to those who will put Ada into practice

    NASA Technical Reports Server (NTRS)

    Carlsson, Arne

    1986-01-01

    Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed in an autonomous way or remotely with commands from the ground. In a manned Space Station some maintenance work can be performed by service people on board, but there are still many applications that require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Those aspects which have come out of the analysis of Ada characteristics, together with the experience of requirements for embedded on-board computers in space applications, are examined.

  20. 3D electrical conductivity tomography of volcanoes

    NASA Astrophysics Data System (ADS)

    Soueid Ahmed, A.; Revil, A.; Byrdina, S.; Coperey, A.; Gailler, L.; Grobbe, N.; Viveiros, F.; Silva, C.; Jougnot, D.; Ghorbani, A.; Hogg, C.; Kiyan, D.; Rath, V.; Heap, M. J.; Grandis, H.; Humaida, H.

    2018-05-01

    Electrical conductivity tomography is a well-established galvanometric method for imaging the subsurface electrical conductivity distribution. We characterize the conductivity distribution of a set of volcanic structures that are different in terms of activity and morphology. For that purpose, we developed a large-scale inversion code named ECT-3D aimed at handling complex topographical effects like those encountered in volcanic areas. In addition, ECT-3D offers the possibility of using as input data the two components of the electrical field recorded at independent stations. Without prior information, a Gauss-Newton method with roughness constraints is used to solve the inverse problem. The roughening operator used to impose constraints is computed on unstructured tetrahedral elements to map complex geometries. We first benchmark ECT-3D on two synthetic tests. A first test using the topography of Mt. St. Helens volcano (Washington, USA) demonstrates that we can successfully reconstruct the electrical conductivity field of an edifice marked by a strong topography and strong variations in the resistivity distribution. A second case study is used to demonstrate the versatility of the code in using the two components of the electrical field recorded on independent stations along the ground surface. Then, we apply our code to real data sets recorded at (i) a thermally active area of Yellowstone caldera (Wyoming, USA), (ii) a monogenetic dome on Furnas volcano (the Azores, Portugal), and (iii) the upper portion of the caldera of Kīlauea (Hawai'i, USA). The tomographies reveal some of the major structures of these volcanoes as well as identify alteration associated with high surface conductivities. We also review the petrophysics underlying the interpretation of the electrical conductivity of fresh and altered volcanic rocks and molten rocks to show that electrical conductivity tomography cannot be used as a stand-alone technique due to the non-uniqueness in interpreting electrical conductivity tomograms. That said, new experimental data provide evidence regarding the strong role of alteration in the vicinity of preferential fluid flow paths including magmatic conduits and hydrothermal vents.
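
    The inversion engine described above is a roughness-constrained Gauss-Newton scheme; a generic dense-matrix sketch of that iteration (not the ECT-3D source, which works on unstructured tetrahedral meshes) is:

      import numpy as np

      def gauss_newton_regularized(forward, jacobian, d_obs, m0, L, lam=1.0,
                                   n_iter=10):
          """At each step solve
             (J^T J + lam L^T L) dm = J^T (d - f(m)) - lam L^T L m,
          where L is a roughness (smoothing) operator on the model mesh."""
          m = m0.copy()
          for _ in range(n_iter):
              J = jacobian(m)                     # sensitivity matrix at m
              r = d_obs - forward(m)              # data residual
              A = J.T @ J + lam * (L.T @ L)
              b = J.T @ r - lam * (L.T @ (L @ m))
              m += np.linalg.solve(A, b)
          return m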

  1. Development of a model and computer code to describe solar grade silicon production processes

    NASA Technical Reports Server (NTRS)

    Gould, R. K.; Srivastava, R.

    1979-01-01

    Two computer codes were developed for describing flow reactors in which high purity, solar grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.

  2. Description of real-time Ada software implementation of a power system monitor for the Space Station Freedom PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Ludwig, Kimberly; Mackin, Michael; Wright, Theodore

    1991-01-01

    The Ada language software development to perform the electrical system monitoring functions for the NASA Lewis Research Center's Power Management and Distribution (PMAD) DC testbed is described. The results of the effort to implement this monitor are presented. The PMAD DC testbed is a reduced-scale prototype of the electrical power system to be used in the Space Station Freedom. The power is controlled by smart switches known as power control components (or switchgear). The power control components are currently coordinated by five Compaq 386/20e computers connected through an 802.4 local area network. One of these computers is designated as the control node, with the other four acting as subsidiary controllers. The subsidiary controllers are connected to the power control components with a Mil-Std-1553 network. An operator interface is supplied by adding a sixth computer. The power system monitor algorithm comprises several functions, including periodic data acquisition, data smoothing, system performance analysis, and status reporting. Data are collected from the switchgear sensors every 100 milliseconds, then passed through a 2 Hz digital filter. System performance analysis includes power interruption and overcurrent detection. The reporting mechanism notifies an operator of any abnormalities in the system. Once per second, the system monitor provides data to the control node for further processing, such as state estimation. The system monitor required a hardware time interrupt to activate the data acquisition function. The execution time of the code was optimized using an assembly language routine. The routine allows direct vectoring of the processor to Ada language procedures that perform periodic control activities. The advantages and side effects of this technique are summarized.
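
    A sketch of the monitor's smoothing-and-detection step, in Python for illustration (the testbed software is Ada, and the 30 A limit and filter order here are hypothetical):

      import numpy as np
      from scipy import signal

      FS = 10.0                                          # 100 ms sampling -> 10 Hz
      b, a = signal.butter(2, 2.0, fs=FS, btype="low")   # 2 Hz low-pass filter

      def monitor(current_samples, limit_amps=30.0):
          """Smooth switchgear current readings, then flag overcurrent samples."""
          smoothed = signal.lfilter(b, a, current_samples)
          trips = np.flatnonzero(smoothed > limit_amps)  # overcurrent indices
          return smoothed, trips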

  3. Glocalized New Age Spirituality: A Mental Map of the New Central Bus Station in Tel Aviv, Deciphered through Its Visual Codes and Based on Ethno-Visual Research

    ERIC Educational Resources Information Center

    Ben-Peshat, Malka; Sitton, Shoshana

    2011-01-01

    We present here the findings of an ethno-visual research study involving the creation of a mental map of images, artifacts and practices in Tel Aviv's New Central Bus Station. This huge and complex building, part bus station, part shopping mall, has become a stage for multicultural encounters and interactions among diverse communities of users.…

  4. Calendar Year 2002 Pollution Prevention Annual Data Summary (P2ADS)

    DTIC Science & Technology

    2003-07-01

    NAVAL WEAPONS STATION EARLE, COLTS NECK, NJ. Success Description: NWS Earle receives most of its hazardous waste from home ported ... offload operations must be done within very ... NAVAL WEAPONS STATION YORKTOWN, VA. Success Description: Aqueous weapons ... [Report fragment; appendix headers and contact-information residue omitted.]

  5. Evolving technologies for Space Station Freedom computer-based workstations

    NASA Technical Reports Server (NTRS)

    Jensen, Dean G.; Rudisill, Marianne

    1990-01-01

    Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.

  6. Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO

    NASA Technical Reports Server (NTRS)

    Stallworth, R.; Meyers, C. A.; Stinson, H. C.

    1989-01-01

    Results are presented from a comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two computer codes gave compatible, conservative results when the part-through-crack analysis solutions were compared against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.
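
    Both codes integrate fatigue crack-growth relations of the Paris-law family; a minimal sketch of that integration, with placeholder material and geometry constants, is:

      import numpy as np

      def paris_life(a0, a_crit, d_sigma, C=1e-12, m=3.0, Y=1.12, n=100000):
          """Cycles to grow a crack from a0 to a_crit (meters) under a stress
          range d_sigma (MPa), integrating da/dN = C*(dK)^m with
          dK = Y*d_sigma*sqrt(pi*a).  C, m, Y are placeholder constants."""
          a = np.linspace(a0, a_crit, n)
          dK = Y * d_sigma * np.sqrt(np.pi * a)       # stress-intensity range
          dN_da = 1.0 / (C * dK ** m)                 # cycles per unit growth
          return np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a))

      print(f"estimated life: {paris_life(0.001, 0.02, 120.0):.3e} cycles")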

  7. Computational Predictions of the Performance of Wright 'Bent End' Propellers

    NASA Technical Reports Server (NTRS)

    Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)

    2002-01-01

    Computational analyses of two 1911 Wright brothers 'Bent End' wooden propeller reproductions have been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a project on the propeller performance research of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG Navier-Stokes code is a CFD (Computational Fluid Dynamics) code based on the Navier-Stokes equations. It is mainly used to compute the lift coefficient and the drag coefficient at specified angles of attack at different radii. These calculated data are intermediate results of the computation and part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.
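
    The propeller coefficients named above follow the standard nondimensional definitions; a small sketch with hypothetical test values:

      # Standard propeller performance quantities (numbers are made up).
      rho = 1.225            # air density, kg/m^3
      n = 5.83               # rotational speed, rev/s (about 350 rpm)
      D = 2.6                # propeller diameter, m
      V = 10.0               # freestream velocity, m/s
      T, P = 310.0, 4200.0   # thrust (N) and shaft power (W)

      J = V / (n * D)                  # advance ratio
      CT = T / (rho * n**2 * D**4)     # thrust coefficient
      CP = P / (rho * n**3 * D**5)     # power coefficient
      eta = J * CT / CP                # propulsive efficiency
      print(f"J={J:.3f}  CT={CT:.4f}  CP={CP:.4f}  eta={eta:.3f}")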

  8. Williams works on computer in the U.S. Laboratory during Expedition 13

    NASA Image and Video Library

    2006-04-15

    ISS013-E-07975 (15 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  9. Williams uses computer in the U.S. Laboratory during Expedition 13

    NASA Image and Video Library

    2006-04-11

    ISS013-E-05853 (11 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  10. Fabrication and Evaluation of InSb CID Arrays

    DTIC Science & Technology

    1976-08-01

    [Distribution-list fragment:] Santa Barbara Research Center, 75 Coromar Drive, Goleta, California 93017; Stephen P. Emmons, Mail Stop 134, Texas Instruments ...; Attn: Code 2629; Attn: Code 2627; Defense Documentation Center, Bldg. 5, Cameron Station, Alexandria, Va. 22314.

  11. Proceduracy: Computer Code Writing in the Continuum of Literacy

    ERIC Educational Resources Information Center

    Vee, Annette

    2010-01-01

    This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…

  12. Computer Code Aids Design Of Wings

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Darden, Christine M.

    1993-01-01

    AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.

  13. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial, large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  14. APC: A New Code for Atmospheric Polarization Computations

    NASA Technical Reports Server (NTRS)

    Korkin, Sergey V.; Lyapustin, Alexei I.; Rozanov, Vladimir V.

    2014-01-01

    A new polarized radiative transfer code Atmospheric Polarization Computations (APC) is described. The code is based on separation of the diffuse light field into anisotropic and smooth (regular) parts. The anisotropic part is computed analytically. The smooth regular part is computed numerically using the discrete ordinates method. Vertical stratification of the atmosphere, common types of bidirectional surface reflection and scattering by spherical particles or spheroids are included. A particular consideration is given to computation of the bidirectional polarization distribution function (BPDF) of the waved ocean surface.

  15. Improving Station Performance by Building Isolation Walls in the Tunnel

    NASA Astrophysics Data System (ADS)

    Jia, Yan; Horn, Nikolaus; Leohardt, Roman

    2014-05-01

    Conrad Observatory is situated far away from roads and industrial areas on the Trafelberg in Lower Austria. At the end of the seismic tunnel, the main seismic instrument of the Observatory, with station code CONA, is located. This station is one of the most important seismic stations in the Austrian Seismic Network (network code OE). The seismic observatory consists of a 145 m long gallery and an underground laboratory building with several working areas. About 25 meters away from the station CONA, six temporary seismic stations were installed for research purposes. Two of them were installed with the same equipment as CONA, while the remaining four stations were set up with digitizers having lower noise and higher resolution (Q330HR) and sensors of the same type (STS-2). In order to prevent possible disturbances by air pressure and temperature fluctuation, three walls were built inside the tunnel. The first wall is located about 63 meters from the tunnel entrance, while a set of double walls with a distance of 1.5 meters is placed about 53 meters from the first isolation wall, between the station CONA and the six temporary stations. To assess the impact of the isolation walls on noise reduction and detection performance, investigations are conducted in two steps. The first study is carried out by comparing the noise level and detection performance between the station CONA behind the double walls and the stations in front of the double walls, to verify the noise isolation provided by the double walls. To evaluate the effect of the single wall, station noise level and detection performance were studied by comparing the results before and after the installation of the wall. Results and discussions will be presented. An additional experiment is conducted by filling insulation material inside the aluminium boxes of the sensors (above and around the sensors). This should help determine an optimal insulation of the sensors with respect to pressure and temperature fluctuations.

  16. Synthetic Flight Training System Study

    DTIC Science & Technology

    1983-12-23

    Distribution unlimited. Key words: visual systems; computer ... platforms, instructional features, computer hardware and software, student stations, etc. [DD Form 1473 and table-of-contents residue omitted; recoverable headings: Computational Systems; Visual Processing Systems; Instructor Stations.]

  17. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized. This includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  18. Regionalization by fuzzy expert system based approach optimized by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chavoshi, Sattar; Azmin Sulaiman, Wan Nor; Saghafian, Bahram; Bin Sulaiman, Md. Nasir; Manaf, Latifah Abd

    2013-04-01

    In recent years, soft computing methods have been increasingly used to model complex hydrologic processes. These methods can simulate real-life processes without prior knowledge of the exact relationship between their components. The principal aim of this paper is to perform hydrological regionalization based on soft computing concepts in the southern strip of the Caspian Sea basin, north of Iran. The basin, with an area of 42,400 sq. km, has been affected by severe floods in recent years that caused damage to human life and property. Although some 61 hydrometric stations and 31 weather stations with 44 years of observed data (1961-2005) are operated in the study area, previous flood studies in this region have been hampered by insufficient and/or unreliable observed rainfall-runoff records. In order to investigate the homogeneity (h) of catchments and overcome the incompatibility that may occur on boundaries of cluster groups, a fuzzy expert system (FES) approach is used which incorporates physical and climatic characteristics, as well as flood seasonality and geographic location. A genetic algorithm (GA) was employed to adjust the parameters of the FES and optimize the system. To achieve the objective, a MATLAB programming code was developed which considers a heterogeneity criterion of less than 1 (H < 1) as the satisfying criterion. The adopted approach was found superior to conventional hydrologic regionalization methods in the region because it employs a greater number of homogeneity parameters and produces lower values of the heterogeneity criterion.
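
    A toy sketch of the optimization loop described above -- a genetic algorithm adjusting region parameters to minimize a heterogeneity score. The scoring function here is a stand-in, not the Hosking-Wallis-style H criterion used in the study, and all names and settings are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)

      def heterogeneity(params, catchments):
          """Stand-in score: assign catchments to the nearest region centre
          and sum the within-region spread."""
          centres = params.reshape(-1, catchments.shape[1])
          d2 = ((catchments[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
          labels = d2.argmin(axis=1)
          return sum(catchments[labels == k].std(axis=0).mean()
                     for k in range(len(centres)) if (labels == k).any())

      def genetic_search(catchments, n_regions=3, pop=40, gens=100, sigma=0.1):
          """Minimal GA: keep the best half, refill by Gaussian mutation."""
          dim = n_regions * catchments.shape[1]
          P = rng.normal(size=(pop, dim))
          for _ in range(gens):
              fitness = np.array([heterogeneity(p, catchments) for p in P])
              winners = P[np.argsort(fitness)[: pop // 2]]
              P = np.vstack([winners,
                             winners + rng.normal(0, sigma, winners.shape)])
          return min(P, key=lambda p: heterogeneity(p, catchments))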

  19. Phantom torso experiment on the international space station; flight measurements and calculations

    NASA Astrophysics Data System (ADS)

    Atwell, W.; Semones, E.; Cucinotta, F.

    The Phantom Torso Experiment (PTE) first flew on the 10-day Space Shuttle mission STS-91 in June 1998 during a period near solar minimum. The PTE was re-flown on the International Space Station (ISS) Increment 2 mission from April-August 2001 during a period near solar maximum. The experiment was located with a suite of other radiation experiments in the US Lab module Human Research Facility (HRF) rack. The objective of the experiment was to measure space radiation exposures at several radiosensitive critical body organs (brain, thyroid, heart/lung, stomach and colon) and two locations on the surface (skin) of a modified Rando™ phantom. Prior to flight, active solid-state silicon dosimeters were located at the Rando™ critical body organ locations and passive dosimeters were placed at the two surface locations. Using a mathematically modified Computerized Anatomical Male (CAM) model, shielding distributions were generated for the five critical body organ and two skin locations. These shielding distributions were then combined with the ISS HRF rack shielding distribution to account for the total shielding "seen" by the PTE. Using the trapped proton and galactic cosmic radiation environment models and high-energy particle transport codes, absorbed dose, dose equivalent, and LET (linear energy transfer) values were computed for the seven dose point locations of interest. The results of these computations are compared with the actual flight measurements.

  20. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  1. Calculation of Water Drop Trajectories to and About Arbitrary Three-Dimensional Bodies in Potential Airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1980-01-01

    Calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Any subsonic, external, non-lifting flow can be accommodated; flow into, but not through, inlets also can be simulated. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Code descriptions include operating instructions, card inputs and printouts for example problems, and listing of the FORTRAN codes. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
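
    A compact sketch of the water-drop equation of motion such codes integrate, using a generic sphere-drag correlation (Schiller-Naumann) rather than the report's experimental drag relations; the flow_velocity callback stands in for the potential-flow solution:

      import numpy as np
      from scipy.integrate import solve_ivp

      def drop_trajectory(flow_velocity, d=1e-3, rho_w=1000.0, rho_a=1.225,
                          mu=1.8e-5, g=9.81, t_end=2.0):
          """Integrate drag plus gravity settling for a spherical drop of
          diameter d (m).  flow_velocity(x) returns the local air velocity."""
          m = rho_w * np.pi * d ** 3 / 6.0
          area = np.pi * d ** 2 / 4.0
          def rhs(t, y):
              x, v = y[:3], y[3:]
              vrel = flow_velocity(x) - v
              re = rho_a * np.linalg.norm(vrel) * d / mu
              cd = 24.0 / max(re, 1e-9) * (1.0 + 0.15 * re ** 0.687)
              drag = 0.5 * rho_a * cd * area * np.linalg.norm(vrel) * vrel
              return np.concatenate([v, drag / m + np.array([0.0, 0.0, -g])])
          y0 = np.zeros(6)                       # start at rest at the origin
          return solve_ivp(rhs, (0.0, t_end), y0, max_step=1e-2)

      sol = drop_trajectory(lambda x: np.array([20.0, 0.0, 0.0]))  # uniform flow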

  2. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  3. Williams uses laptop computer in the U.S. Laboratory taken during Expedition 13

    NASA Image and Video Library

    2006-06-22

    ISS013-E-40000 (22 June 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  4. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.

  5. Objective structured clinical examination "Death Certificate" station - Computer-based versus conventional exam format.

    PubMed

    Biolik, A; Heide, S; Lessig, R; Hachmann, V; Stoevesandt, D; Kellner, J; Jäschke, C; Watzke, S

    2018-04-01

    One option for improving the quality of medical post mortem examinations is intensified training of medical students, especially in countries where such a requirement exists regardless of the area of specialisation. For this reason, new teaching and learning methods on this topic have recently been introduced. These new approaches include e-learning modules or SkillsLab stations; one way to objectify the resultant learning outcomes is by means of the OSCE process. However, despite offering several advantages, this examination format also requires considerable resources, in particular with regard to medical examiners. For this reason, many clinical disciplines have already implemented computer-based OSCE examination formats. This study investigates whether the conventional exam format for the OSCE forensic "Death Certificate" station could be replaced with a computer-based approach in future. For this study, 123 students completed the OSCE "Death Certificate" station in both a computer-based and a conventional format, half starting with the computer-based format and the other half with the conventional approach in their OSCE rotation. Assignment of examination cases was random. The examination results for the two stations were compared, and both overall results and the individual items of the exam checklist were analysed by means of inferential statistics. Following statistical analysis of examination cases of varying difficulty levels and correction for the repeated-measures effect, the results of both examination formats appear to be comparable. Thus, in the descriptive item analysis, while there were some significant differences between the computer-based and conventional OSCE stations, these differences were not reflected in the overall results after a correction factor was applied (e.g., point deductions for assistance from the medical examiner were possible only at the conventional station). We therefore demonstrate that the computer-based OSCE "Death Certificate" station is a cost-efficient and standardised examination format that yields results comparable to those from a conventional-format exam. Moreover, the examination results also indicate the need to optimise both the test itself (adjusting the degree of difficulty of the case vignettes) and the corresponding instructional and learning methods (including, for example, the use of computer programmes to complete the death certificate in small-group formats in the SkillsLab). Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  6. Computation of Reacting Flows in Combustion Processes

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Chen, Kuo-Huey

    1997-01-01

    The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D, a program for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. Also, it is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code can be run on various computer platforms (supercomputers, workstations, and parallel processors), and its GUI (Graphical User Interface) provides a user-friendly tool for setting up and running the code.

  7. Hydrologic Observatory Data Telemetry Network in an Extreme Environment

    NASA Astrophysics Data System (ADS)

    Irving, K.; Kane, D.

    2007-12-01

    A network of hydrological research data stations on the North Slope of Alaska using radio telemetry to gather data in "near real time" will be described. The network consists of approximately 25 research stations, 10 repeater stations, and 3 Internet-connected base stations (though data is also collected at repeater stations and research stations may also function as repeaters). With this operational network, radio link redundancy is sufficient to reach any research station from any base station. The data network is driven in "pull" mode using software running on computers in Fairbanks, and emphasis is placed on reliably collecting and storing data as found on the remote data loggers. Work is underway to deploy dynamic routing software on the controlling computers, at which point the network will be capable of automatically working around problems which may include icing on antennas, satellite sun outages, animal damage, and many others.

  8. Human computer interface guide, revision A

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Human Computer Interface Guide, SSP 30540, is a reference document for the information systems within the Space Station Freedom Program (SSFP). The Human Computer Interface Guide (HCIG) provides guidelines for the design of computer software that affects human performance, specifically, the human-computer interface. This document contains an introduction and subparagraphs on SSFP computer systems, users, and tasks; guidelines for interactions between users and the SSFP computer systems; human factors evaluation and testing of the user interface system; and example specifications. The contents of this document are intended to be consistent with the tasks and products to be prepared by NASA Work Package Centers and SSFP participants as defined in SSP 30000, Space Station Program Definition and Requirements Document. The Human Computer Interface Guide shall be implemented on all new SSFP contractual and internal activities and shall be included in any existing contracts through contract changes. This document is under the control of the Space Station Control Board, and any changes or revisions will be approved by the deputy director.

  9. The RMI Space Weather and Navigation Systems (SWANS) Project

    NASA Astrophysics Data System (ADS)

    Warnant, Rene; Lejeune, Sandrine; Wautelet, Gilles; Spits, Justine; Stegen, Koen; Stankov, Stan

    The SWANS (Space Weather and Navigation Systems) research and development project (http://swans.meteo.be) is an initiative of the Royal Meteorological Institute (RMI) under the auspices of the Belgian Solar-Terrestrial Centre of Excellence (STCE). The RMI SWANS objectives are: research on space weather and its effects on GNSS applications; permanent monitoring of the local/regional geomagnetic and ionospheric activity; and development/operation of relevant nowcast, forecast, and alert services to help professional GNSS/GALILEO users in mitigating space weather effects. Several SWANS developments have already been implemented and are available for use. The K-LOGIC (Local Operational Geomagnetic Index K Calculation) system is a nowcast system based on a fully automated computer procedure for real-time digital magnetogram data acquisition, data screening, and calculation of the local geomagnetic K index. Simultaneously, the planetary Kp index is estimated from solar wind measurements, thus adding to the service reliability and providing forecast capabilities as well. A novel hybrid empirical model, based on these ground- and space-based observations, has been implemented for nowcasting and forecasting the geomagnetic index, issuing alerts whenever storm-level activity is indicated. A very important feature of the nowcast/forecast system is the strict control on the data input and processing, allowing for an immediate assessment of the output quality. The purpose of the LIEDR (Local Ionospheric Electron Density Reconstruction) system is to acquire and process data from simultaneous ground-based GNSS TEC and digital ionosonde measurements, and subsequently to deduce the vertical electron density distribution. A key module is the real-time estimation of the ionospheric slab thickness, offering additional information on the local ionospheric dynamics. The RTK (Real Time Kinematic) status mapping provides a quick look at the small-scale ionospheric effects on the RTK precision for several GPS stations in Belgium. The service assesses the effect of small-scale ionospheric irregularities by monitoring the high-frequency TEC rate of change at any given station. This assessment results in a (colour) code assigned to each station, ranging from "quiet" (green) to "extreme" (red) and referring to the local ionospheric conditions. Alerts via e-mail are sent to subscribed users when disturbed conditions are observed. SoDIPE (Software for Determining the Ionospheric Positioning Error) estimates the positioning error due to the ionospheric conditions alone (called the "ionospheric error") in high-precision positioning applications (RTK in particular). For each of the Belgian Active Geodetic Network (AGN) baselines, SoDIPE computes the ionospheric error and its median value (every 15 minutes). Again, a (colour) code is assigned to each baseline, ranging from "nominal" (green) to "extreme" (red) error level. Finally, all available baselines (drawn in the colour corresponding to the error level) are displayed on a map of Belgium. Future SWANS work will focus on regional ionospheric monitoring and developing various other nowcast and forecast services.
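
    The colour-code idea behind the RTK status and SoDIPE services amounts to a threshold classifier on an activity measure. A minimal sketch, in which the thresholds and units are invented placeholders rather than RMI's calibrated values:

      # Hedged sketch: bin a station's high-frequency TEC rate of change
      # into activity levels. Thresholds are illustrative only.
      LEVELS = [(1.0, "quiet", "green"), (3.0, "moderate", "yellow"),
                (6.0, "disturbed", "orange"), (float("inf"), "extreme", "red")]

      def ionosphere_status(tec_rate_tecu_per_min):
          for threshold, label, colour in LEVELS:
              if abs(tec_rate_tecu_per_min) < threshold:
                  return label, colour

      print(ionosphere_status(0.4))   # ('quiet', 'green')
      print(ionosphere_status(7.2))   # ('extreme', 'red')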

  10. Studies of Heat Transfer in Complex Internal Flows.

    DTIC Science & Technology

    1982-01-01


  11. Multi-GPU and multi-CPU accelerated FDTD scheme for vibroacoustic applications

    NASA Astrophysics Data System (ADS)

    Francés, J.; Otero, B.; Bleda, S.; Gallego, S.; Neipp, C.; Márquez, A.; Beléndez, A.

    2015-06-01

    The Finite-Difference Time-Domain (FDTD) method is applied to the analysis of vibroacoustic problems and to the study of the propagation of longitudinal and transversal waves in stratified media. The potential of the scheme and the relevance of each acceleration strategy for massive FDTD computations are demonstrated in this work. In this paper, we propose two new specific implementations of the two-dimensional FDTD scheme using multi-CPU and multi-GPU, respectively. In the first implementation, an open source message passing interface (OMPI) has been included in order to fully exploit the resources of a biprocessor station with two Intel Xeon processors. Moreover, in the CPU code version, the streaming SIMD extensions (SSE) and the advanced vector extensions (AVX) have been included, together with shared-memory approaches that take advantage of multi-core platforms. The second implementation, the multi-GPU code version, is based on the peer-to-peer communications available in CUDA on two GPUs (NVIDIA GTX 670). Subsequently, this paper presents an accurate analysis of the influence of the different code versions, including shared-memory approaches, vector instructions, and multi-processors (both CPU and GPU), and compares them in order to delimit the degree of improvement obtained with distributed solutions based on multi-CPU and multi-GPU. The performance of both approaches was analysed, and it is demonstrated that adding shared-memory schemes to CPU computing substantially improves the performance of vector instructions, enlarging the simulation sizes that use the CPU cache memory efficiently. In this case, GPU computing is roughly twice as fast as the fine-tuned CPU version for both one and two nodes. However, for massive computations, explicit vector instructions are not worthwhile, since memory bandwidth is the limiting factor and performance tends to match that of the sequential version with auto-vectorisation and the shared-memory approach. In this scenario, GPU computing is the best option, since it provides homogeneous behaviour. More specifically, the speedup of GPU computing reaches an upper limit of 12 for both one and two GPUs, whereas performance reaches peak values of 80 GFlops and 146 GFlops for one GPU and two GPUs, respectively. Finally, the method is applied to an earth crust profile in order to demonstrate the potential of our approach and the necessity of applying acceleration strategies in this type of application.
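
    For readers unfamiliar with the kernel being accelerated, a minimal two-dimensional FDTD-style update for the scalar acoustic case is sketched below. The paper's elastic, stratified-media scheme and its SSE/AVX/CUDA tuning are far richer; the grid size and constants here are arbitrary:

      # Minimal 2D acoustic FDTD leapfrog update (illustrative only).
      import numpy as np

      nx, ny, c, dx, dt = 200, 200, 343.0, 0.01, 1.0e-5   # grid and constants
      p = np.zeros((nx, ny))          # pressure at time step n
      pm = np.zeros((nx, ny))         # pressure at time step n-1
      p[nx // 2, ny // 2] = 1.0       # point source to start the field

      def fdtd_step(p, pm):
          # Five-point Laplacian plus second-order leapfrog in time.
          lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                 np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p) / dx**2
          return 2.0 * p - pm + (c * dt)**2 * lap, p

      for _ in range(100):
          p, pm = fdtd_step(p, pm)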

  12. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  13. Final report for the Tera Computer TTI CRADA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, G.S.; Pavlakos, C.; Silva, C.

    1997-01-01

    Tera Computer and Sandia National Laboratories have completed a CRADA, which examined the Tera Multi-Threaded Architecture (MTA) for use with large codes of importance to industry and DOE. The MTA is an innovative architecture that uses parallelism to mask latency between memories and processors. The physical implementation is a parallel computer with high cross-section bandwidth and GaAs processors designed by Tera, which support many small computation threads and fast, lightweight context switches between them. When any thread blocks while waiting for memory accesses to complete, another thread immediately begins execution so that high CPU utilization is maintained. The Tera MTA parallel computer has a single, global address space, which is appealing when porting existing applications to a parallel computer. This ease of porting is further enabled by compiler technology that helps break computations into parallel threads. DOE and Sandia National Laboratories were interested in working with Tera to further develop this computing concept. While Tera Computer would continue the hardware development and compiler research, Sandia National Laboratories would work with Tera to ensure that their compilers worked well with important Sandia codes, most particularly CTH, a shock physics code used for weapon safety computations. In addition to that important code, Sandia National Laboratories would complete research on a robotic path planning code, SANDROS, which is important in manufacturing applications, and would evaluate the MTA performance on this code. Finally, Sandia would work directly with Tera to develop 3D visualization codes, which would be appropriate for use with the MTA. Each of these tasks has been completed to the extent possible, given that Tera has just completed the MTA hardware. All of the CRADA work had to be done on simulators.

  14. Design of an MSAT-X mobile transceiver and related base and gateway stations

    NASA Technical Reports Server (NTRS)

    Fang, Russell J. F.; Bhaskar, Udaya; Hemmati, Farhad; Mackenthun, Kenneth M.; Shenoy, Ajit

    1987-01-01

    This paper summarizes the results of a design study of the mobile transceiver, base station, and gateway station for NASA's proposed Mobile Satellite Experiment (MSAT-X). Major ground segment system design issues such as frequency stability control, modulation method, linear predictive coding vocoder algorithm, and error control technique are addressed. The modular and flexible transceiver design is described in detail, including the core, RF/IF, modem, vocoder, forward error correction codec, amplitude-companded single sideband, and input/output modules, as well as the flexible interface. Designs for a three-carrier base station and a 10-carrier gateway station are also discussed, including the interface with the controllers and with the public-switched telephone networks at the gateway station. Functional specifications are given for the transceiver, the base station, and the gateway station.

  15. Design of an MSAT-X mobile transceiver and related base and gateway stations

    NASA Astrophysics Data System (ADS)

    Fang, Russell J. F.; Bhaskar, Udaya; Hemmati, Farhad; Mackenthun, Kenneth M.; Shenoy, Ajit

    This paper summarizes the results of a design study of the mobile transceiver, base station, and gateway station for NASA's proposed Mobile Satellite Experiment (MSAT-X). Major ground segment system design issues such as frequency stability control, modulation method, linear predictive coding vocoder algorithm, and error control technique are addressed. The modular and flexible transceiver design is described in detail, including the core, RF/IF, modem, vocoder, forward error correction codec, amplitude-companded single sideband, and input/output modules, as well as the flexible interface. Designs for a three-carrier base station and a 10-carrier gateway station are also discussed, including the interface with the controllers and with the public-switched telephone networks at the gateway station. Functional specifications are given for the transceiver, the base station, and the gateway station.

  16. Operations analysis (study 2.1). Program listing for the LOVES computer code

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1974-01-01

    A listing of the LOVES computer program is presented. The program is coded partially in SIMSCRIPT and FORTRAN. This version of LOVES is compatible with both the CDC 7600 and the UNIVAC 1108 computers. The code has been compiled, loaded, and executed successfully on the EXEC 8 system for the UNIVAC 1108.

  17. 47 CFR 27.73 - WCS, AMT, and Goldstone coordination requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 202-741-6030, or go to: http://www/archives.gov/federal_ register/code_of_federal_regulations/ibr... Aeronautics and Space Administration (NASA) within 145 kilometers of the Goldstone, CA earth station site (35... the NASA Goldstone, CA earth station or from an AMT site, operating in the 2305-2320 or 2345-2360 MHz...

  18. 47 CFR 27.73 - WCS, AMT, and Goldstone coordination requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 202-741-6030, or go to: http://www/archives.gov/federal_ register/code_of_federal_regulations/ibr... Aeronautics and Space Administration (NASA) within 145 kilometers of the Goldstone, CA earth station site (35... the NASA Goldstone, CA earth station or from an AMT site, operating in the 2305-2320 or 2345-2360 MHz...

  19. Analysis of the Length of Braille Texts in English Braille American Edition, the Nemeth Code, and Computer Braille Code versus the Unified English Braille Code

    ERIC Educational Resources Information Center

    Knowlton, Marie; Wetzel, Robin

    2006-01-01

    This study compared the length of text in English Braille American Edition, the Nemeth code, and the computer braille code with the Unified English Braille Code (UEBC)--also known as Unified English Braille (UEB). The findings indicate that differences in the length of text are dependent on the type of material that is transcribed and the grade…

  20. Single-jet gas cooling of in-beam foils or specimens: Prediction of the convective heat-transfer coefficient

    NASA Astrophysics Data System (ADS)

    Steyn, Gideon; Vermeulen, Christiaan

    2018-05-01

    An experiment was designed to study the effect of the jet direction on convective heat-transfer coefficients in single-jet gas cooling of a small heated surface, such as that typically induced by an accelerated ion beam on a thin foil or specimen. The hot spot was provided using a small electrically heated plate. Heat-transfer calculations were performed using simple empirical methods based on dimensional analysis as well as by means of an advanced computational fluid dynamics (CFD) code. The results provide an explanation for the observed turbulent cooling of a double-foil, Havar beam window with fast-flowing helium, located on a target station for radionuclide production with a 66 MeV proton beam at a cyclotron facility.
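
    The dimensional-analysis route mentioned above typically reduces to a Nusselt-number correlation of the form Nu = C Re^m Pr^n, from which the heat-transfer coefficient follows as h = Nu k / L. The sketch below uses generic placeholder constants and property values, not the paper's fitted jet-impingement results:

      # Empirical heat-transfer estimate via dimensional analysis.
      # C, m, n are generic placeholders, not the paper's fitted constants.
      def convective_h(velocity, length, k, nu, pr, C=0.26, m=0.6, n=0.33):
          re = velocity * length / nu       # Reynolds number
          nusselt = C * re**m * pr**n       # Nusselt number from correlation
          return nusselt * k / length       # h in W/(m^2 K)

      # Helium jet over a ~10 mm hot spot (illustrative property values):
      print(convective_h(velocity=100.0, length=0.01, k=0.152, nu=1.2e-4, pr=0.66))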

  1. Validation of the CALSPAN gross-motion-simulation code with actually occurring injury patterns in aircraft accidents.

    PubMed

    Ballo, J M; Dunne, M J; McMeekin, R R

    1978-01-01

    Digital simulation of aircraft-accident kinematics has heretofore been used almost exclusively as a design tool to explore structural load limits, precalculate decelerative forces at various cabin stations, and describe the effect of protective devices in the crash environment. In an effort to determine the value of digital computer simulation of fatal aircraft accidents, a fatality involving an ejection-system failure (out-of-envelope ejection) was modeled, and the injuries actually incurred were compared to those predicted; good agreement was found. The simulation of fatal aircraft accidents is advantageous because of a well-defined endpoint (death), lack of therapeutic intervention, and a static anatomic situation that can be minutely investigated. Such simulation techniques are a useful tool in the study of experimental trauma.

  2. A MATLAB based 3D modeling and inversion code for MT data

    NASA Astrophysics Data System (ADS)

    Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.

    2017-07-01

    The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: grid generator code and modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities the results of inversion for two complex models are presented.
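
    The module split described above (forward modeling, data functionals, sensitivity computations, regularization) maps naturally onto a regularized iterative model update. As a hedged sketch, one damped Gauss-Newton step is shown below; the function names and damping parameter are placeholders, not AP3DMT's actual interfaces:

      # One damped Gauss-Newton model update of the kind a modular
      # inversion performs (names are placeholders, not AP3DMT's API).
      import numpy as np

      def gauss_newton_update(m, d_obs, forward, jacobian, lam):
          J = jacobian(m)                        # sensitivity module
          r = d_obs - forward(m)                 # data-functional (misfit) module
          A = J.T @ J + lam * np.eye(m.size)     # Tikhonov regularization module
          return m + np.linalg.solve(A, J.T @ r)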

  3. Applications of automatic differentiation in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR, the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
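
    The chain-rule mechanics that ADIFOR applies to FORTRAN source can be seen in miniature with forward-mode dual numbers. The Python toy below illustrates only the principle, not ADIFOR's source transformation:

      # Forward-mode AD in miniature: each value carries its derivative,
      # and arithmetic applies the chain rule exactly (no differencing).
      class Dual:
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv
          def __add__(self, other):
              return Dual(self.value + other.value, self.deriv + other.deriv)
          def __mul__(self, other):
              return Dual(self.value * other.value,
                          self.value * other.deriv + self.deriv * other.value)

      x = Dual(3.0, 1.0)        # seed dx/dx = 1 for the independent variable
      y = x * x + x             # dependent variable y = x^2 + x
      print(y.value, y.deriv)   # 12.0 7.0 -> exact derivative 2x + 1 at x = 3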

  4. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal-cooled fast reactor systems under normal operation, during anticipated operational occurrences, and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and explains why a new-generation HYDRA-IBRAE/LM code is needed. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena are singled out that require a detailed analysis and the development of models for the system thermal-hydraulic code to describe correctly. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, the introduction of new models into it, and the enhancement of its usability. It is shown that the program of development and practical application of the code will make it possible in the near future to analyze the safety of prospective NPP projects at a qualitatively higher level.

  5. Performance assessment of KORAT-3D on the ANL IBM-SP computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.

    1999-09-01

    The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).

  6. Improving Aircraft Refueling Procedures at Naval Air Station Oceana

    DTIC Science & Technology

    2012-06-01

    We study aircraft refueling procedures at Naval Air Station (NAS) Oceana, VA, using aircraft waiting time for fuel as a measure of performance. We develop a computer-assisted discrete-event simulation of a single-server queue with general interarrival and service time distributions.
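
    A minimal discrete-event sketch of the queueing model involved is given below. It uses exponential interarrival and service times (M/M/1) so the result can be checked against the closed-form mean wait, whereas the thesis simulates general distributions; the rates are invented:

      # Toy single-server refueling queue, event by event (illustrative).
      import random

      def mm1_mean_wait(arrival_rate, service_rate, n_jobs=100_000, seed=1):
          random.seed(seed)
          t, server_free_at, total_wait = 0.0, 0.0, 0.0
          for _ in range(n_jobs):
              t += random.expovariate(arrival_rate)   # next aircraft arrives
              start = max(t, server_free_at)          # waits if the pump is busy
              total_wait += start - t
              server_free_at = start + random.expovariate(service_rate)
          return total_wait / n_jobs

      # M/M/1 theory gives Wq = lambda / (mu * (mu - lambda)) = 4.0 here:
      print(mm1_mean_wait(arrival_rate=0.8, service_rate=1.0))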

  7. Manned space station environmental control and life support system computer-aided technology assessment program

    NASA Technical Reports Server (NTRS)

    Hall, J. B., Jr.; Pickett, S. J.; Sage, K. H.

    1984-01-01

    A computer program for assessing manned space station environmental control and life support systems technology is described. The methodology, mission model parameters, evaluation criteria, and data base for 17 candidate technologies for providing metabolic oxygen and water to the crew are discussed. Examples are presented which demonstrate the capability of the program to evaluate candidate technology options for evolving space station requirements.

  8. JPRS Report, Soviet Union, Foreign Military Review, No. 8, August 1987

    DTIC Science & Technology

    1988-01-28


  9. Kononenko uses laptop computer in the SM Transfer Compartment

    NASA Image and Video Library

    2012-03-21

    ISS030-E-161167 (21 March 2012) --- Russian cosmonaut Oleg Kononenko, Expedition 30 flight engineer, uses a computer in the transfer compartment of the International Space Station's Zvezda Service Module. Russia's Zarya module is visible in the background.

  10. Weighted triangulation adjustment

    USGS Publications Warehouse

    Anderson, Walter L.

    1969-01-01

    The variation of coordinates method is employed to perform a weighted least squares adjustment of horizontal survey networks. Geodetic coordinates are required for each fixed and adjustable station. A preliminary inverse geodetic position computation is made for each observed line. Weights associated with each observed equation for direction, azimuth, and distance are applied in the formation of the normal equations in the least squares adjustment. The number of normal equations that may be solved is twice the number of new stations and less than 150. When the normal equations are solved, shifts are produced at adjustable stations. Previously computed correction factors are applied to the shifts and a most probable geodetic position is found for each adjustable station. Final azimuths and distances are computed. These may be written onto magnetic tape for subsequent computation of state plane or grid coordinates. Input consists of punch cards containing project identification, program options, and position and observation information. Results listed include preliminary and final positions, residuals, observation equations, solution of the normal equations showing magnitudes of shifts, and a plot of each adjusted and fixed station. During processing, data sets containing irrecoverable errors are rejected and the type of error is listed. The computer resumes processing of additional data sets. Other conditions cause warning-errors to be issued, and processing continues with the current data set.
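
    The core of the adjustment is a weighted least-squares solve of the normal equations. A minimal sketch, in which the design matrix, misclosures, and weights are placeholders for the observation equations the program forms:

      # Weighted least squares: solve (A^T W A) x = A^T W l for the shifts
      # applied to adjustable stations (inputs are illustrative placeholders).
      import numpy as np

      def weighted_adjustment(A, l, w):
          W = np.diag(w)                   # weights for direction/azimuth/distance
          N = A.T @ W @ A                  # normal-equation matrix
          u = A.T @ W @ l
          return np.linalg.solve(N, u)     # coordinate shifts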

  11. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can setup and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  12. AERODYNAMIC AND BLADING DESIGN OF MULTISTAGE AXIAL FLOW COMPRESSORS

    NASA Technical Reports Server (NTRS)

    Crouse, J. E.

    1994-01-01

    The axial-flow compressor is used for aircraft engines because it has distinct configuration and performance advantages over other compressor types. However, good potential performance is not easily obtained. The designer must be able to model the actual flows well enough to adequately predict aerodynamic performance. This computer program has been developed for computing the aerodynamic design of a multistage axial-flow compressor and, if desired, the associated blading geometry input for internal flow analysis. The aerodynamic solution gives velocity diagrams on selected streamlines of revolution at the blade row edges. The program yields aerodynamic and blading design results that can be directly used by flow and mechanical analysis codes. Two such codes are TSONIC, a blade-to-blade channel flow analysis code (COSMIC program LEW-10977), and MERIDL, a more detailed hub-to-shroud flow analysis code (COSMIC program LEW-12966). The aerodynamic and blading design program can reduce the time and effort required to obtain acceptable multistage axial-flow compressor configurations by generating good initial solutions and by being compatible with available analysis codes. The aerodynamic solution assumes steady, axisymmetric flow so that the problem is reduced to solving the two-dimensional flow field in the meridional plane. The streamline curvature method is used for the iterative aerodynamic solution at stations outside of the blade rows. If a blade design is desired, the blade elements are defined and stacked within the aerodynamic solution iteration. The blade element inlet and outlet angles are established by empirical incidence and deviation angles to the relative flow angles of the velocity diagrams. The blade element centerline is composed of two segments tangentially joined at a transition point. The local blade angle variation of each element can be specified as a fourth-degree polynomial function of path distance. Blade element thickness can also be specified with fourth-degree polynomial functions of path distance from the maximum thickness point. Input to the aerodynamic and blading design program includes the annulus profile, the overall compressor mass flow, the pressure ratio, and the rotative speed. A number of input parameters are also used to specify and control the blade row aerodynamics and geometry. The output from the aerodynamic solution has an overall blade row and compressor performance summary followed by blade element parameters for the individual blade rows. If desired, the blade coordinates in the streamwise direction for internal flow analysis codes and the coordinates on plane sections through blades for fabrication drawings may be stored and printed. The aerodynamic and blading design program for multistage axial-flow compressors is written in FORTRAN IV for batch execution and has been implemented on an IBM 360 series computer with a central memory requirement of approximately 470K of 8 bit bytes. This program was developed in 1981.

  13. Fast H.264/AVC FRExt intra coding using belief propagation.

    PubMed

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly overcomes the previous still image coding standards, like JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase of the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of the Intra coding with a small loss in the compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60 % of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits an accurate control of the computational complexity unlike other methods where the computational complexity depends upon the coded sequence.
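
    The pruning step can be sketched independently of the belief-propagation machinery: rank the intra prediction modes by estimated probability and keep the smallest set covering a chosen probability mass before running rate-distortion optimization. The probabilities below are placeholders, not estimates from the paper's procedure:

      # Keep the most probable intra prediction modes until a target
      # probability mass is covered; RD optimization then searches only these.
      def reduced_mode_set(mode_probs, mass=0.9):
          ranked = sorted(mode_probs.items(), key=lambda kv: kv[1], reverse=True)
          kept, covered = [], 0.0
          for mode, p in ranked:
              kept.append(mode)
              covered += p
              if covered >= mass:
                  break
          return kept

      probs = {"DC": 0.35, "vertical": 0.30, "horizontal": 0.20,
               "diag_down_left": 0.10, "diag_down_right": 0.05}
      print(reduced_mode_set(probs))   # first four modes cover 95% >= 90%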

  14. 2,445 Hours of Code: What I Learned from Facilitating Hour of Code Events in High School Libraries

    ERIC Educational Resources Information Center

    Colby, Jennifer

    2015-01-01

    This article describes a school librarian's experience with initiating an Hour of Code event for her school's student body. Hadi Partovi of Code.org conceived the Hour of Code "to get ten million students to try one hour of computer science" (Partovi, 2013a), which is implemented during Computer Science Education Week with a goal of…

  15. A Datacenter Backstage: The Knowledge that Supports the Brazilian Seismic Network

    NASA Astrophysics Data System (ADS)

    Calhau, J.; Assumpcao, M.; Collaço, B.; Bianchi, M.; Pirchiner, M.

    2015-12-01

    Historically, Brazilian seismology never had a clear strategic vision of how its data should be acquired, evaluated, stored, and shared. Without a data management plan, data (for any practical purpose) could be lost, resulting in non-uniform coverage that reduces any chance of local and international collaboration, i.e., data will never become scientific knowledge. Since 2009, a large effort by four different institutions has been establishing the new permanent Brazilian Seismographic Network (RSBR), mainly with resources from PETROBRAS, the Brazilian government oil company. Four FDSN sub-networks currently compose RSBR, with a total of 80 permanent stations: the BL and BR codes (BRASIS subnet), with 47 stations maintained by the University of Sao Paulo (USP) and the University of Brasilia (UnB), respectively; the NB code (RSISNE subnet), with 16 stations deployed by the University of Rio Grande do Norte (UFRN); and the ON code (RSIS subnet), with 18 stations operated by the National Observatory (ON) in Rio de Janeiro. Most stations transmit data in real time via satellite or cell-phone links. Each node acquires its own stations locally, and data are shared in real time using SeedLink. Archived data are distributed via ArcLink and/or FDSNWS services. All nodes use the SeisComP3 system for real-time processing and as a back-end. Open-source solutions like SeisComP3 require some homemade tools to be developed to help solve the most common daily problems of a data management center: local magnitude in the real-time earthquake processor, website plugins, a regional earthquake catalog, contributions to the ISC catalog, quality-control tools, data request tools, etc. The main data products and community activities include kml files, data availability plots, request charts, summer school courses, an Open Lab Day, and news interviews. Finally, a good effort was made to establish the BRASIS sub-network and the whole RSBR as a unified project that serves as a communication channel between individuals operating local networks.

  16. Numerical algorithm comparison for the accurate and efficient computation of high-incidence vortical flow

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.

    1991-01-01

    Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.

  17. User's Manual for FEMOM3DR. Version 1.0

    NASA Technical Reports Server (NTRS)

    Reddy, C. J.

    1998-01-01

    FEMoM3DR is a computer code written in FORTRAN 77 to compute the radiation characteristics of antennas on a 3D body using the combined Finite Element Method (FEM)/Method of Moments (MoM) technique. The code is written to handle different feeding structures such as coaxial line, rectangular waveguide, and circular waveguide. This code uses tetrahedral elements with vector edge basis functions for FEM and triangular elements with roof-top basis functions for MoM. By virtue of FEM, this code can handle arbitrarily shaped three-dimensional bodies with inhomogeneous lossy materials, and, owing to MoM, the computational domain can be terminated in any arbitrary shape. The User's Manual is written to acquaint the user with the operation of the code. The user is assumed to be familiar with the FORTRAN 77 language and the operating environment of the computers on which the code is intended to run.

  18. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGrail, B.P.; Mahoney, L.A.

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for evaluation of arid land disposal sites.

  19. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  20. Past, Present and Future Advanced ECLS Systems for Human Exploration of Space

    NASA Technical Reports Server (NTRS)

    Mitchell, Kenny

    2004-01-01

    This paper will review the historical record of NASA's regenerative life support systems flight hardware, with emphasis on the complexity of spiral development of technology as related to the International Space Station program. A brief summary of what constitutes ECLSS designs for human habitation will be included, with illustrations of the complex system/system integration issues. The new technology areas that need to be addressed in future Code T initiatives will be highlighted. The development status of the current regenerative ECLSS for Space Station will be provided for the Oxygen Generation System and the Water Recovery System. In addition, NASA is planning to augment the existing ISS capability with a new technology development effort by Code U/Code T for CO2 reduction (Sabatier reactor). This latest ISS spiral development activity will be highlighted in this paper.

  1. Performance analysis of three dimensional integral equation computations on a massively parallel computer. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Logan, Terry G.

    1994-01-01

    The purpose of this study is to investigate the performance of integral equation computations using the numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and the conventional Cray-YMP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes, along with those on the Cray-YMP with a single processor. The comparison of the performance indicates that the parallel CM-FORTRAN code nearly matches or outperforms the equivalent serial FORTRAN code for some cases.

  2. New Seismic Monitoring Station at Mohawk Ridge, Valles Caldera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Peter Morse

    Two new broadband digital seismic stations were installed in the Valles Caldera in 2011 and 2012. The first is located on the summit of Cerros del Abrigo (station code CDAB) and the second on the flanks of San Antonio Mountain (station code SAMT). Seismic monitoring stations in the caldera serve multiple purposes. These stations augment and expand the current coverage of the Los Alamos Seismic Network (LASN), which is operated to support seismic and volcanic hazards studies for LANL and northern New Mexico (Figure 1). They also provide unique continuous seismic data within the caldera that can be used for scientific studies of the caldera's substructure and for detection of very small seismic signals that may indicate changes in the current and evolving state of the remnant magma known to exist beneath the caldera. Since the installation of CDAB and SAMT, several very small earthquakes have already been detected near San Antonio Mountain just west of SAMT (Figure 2). These are the first events to be seen in that area. Caldera stations also improve the detection and epicenter determination quality for larger local earthquakes on the Pajarito Fault System east of the Preserve and the Nacimiento Uplift to the west. These larger earthquakes are a concern to LANL seismic hazards assessments, and seismic monitoring of the Los Alamos region, including the VCNP, is a DOE requirement. Currently, the next closest seismic stations to the caldera are on Pipeline Road (PPR) just west of Los Alamos and Peralta Ridge (PER) south of the caldera. There is no station coverage near the resurgent dome, Redondo Peak, in the center of the caldera. Filling this "hole" is the highest priority for the next new LASN station. We propose to install this station in 2018 on Mohawk Ridge just east of Redondito, in the same area already occupied by other scientific installations, such as the MCON flux tower operated by UNM.

  3. Computer Description of the M561 Utility Truck

    DTIC Science & Technology

    1984-10-01

    Vulnerability analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom) description of the M561 utility truck, which is used as input to the GIFT computer code to generate target vulnerability data in support of Spare Components Requirements for Combat (SPARC) sustainability predictions.

  4. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development, a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor applications. 47 refs., 94 figs., 6 tabs.

  5. The influence of commenting validity, placement, and style on perceptions of computer code trustworthiness: A heuristic-systematic processing approach.

    PubMed

    Alarcon, Gene M; Gamble, Rose F; Ryan, Tyler J; Walter, Charles; Jessup, Sarah A; Wood, David W; Capiola, August

    2018-07-01

    Computer programs are a ubiquitous part of modern society, yet little is known about the psychological processes that underlie reviewing code. We applied the heuristic-systematic model (HSM) to investigate the influence of computer code comments on perceptions of code trustworthiness. The study explored the influence of validity, placement, and style of comments in code on trustworthiness perceptions and time spent on code. Results indicated valid comments led to higher trust assessments and more time spent on the code. Properly placed comments led to lower trust assessments and had a marginal effect on time spent on code; however, the effect was no longer significant after controlling for effects of the source code. Low style comments led to marginally higher trustworthiness assessments, but high style comments led to longer time spent on the code. Several interactions were also found. Our findings suggest the relationship between code comments and perceptions of code trustworthiness is not as straightforward as previously thought. Additionally, the current paper extends the HSM to the programming literature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Adiabatic topological quantum computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  7. Adiabatic topological quantum computing

    DOE PAGES

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; ...

    2015-07-31

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  8. Lessons learned in creating spacecraft computer systems: Implications for using Ada (R) for the space station

    NASA Technical Reports Server (NTRS)

    Tomayko, James E.

    1986-01-01

    Twenty-five years of spacecraft onboard computer development have resulted in a better understanding of the requirements for effective, efficient, and fault-tolerant flight computer systems. Lessons from eight flight programs (Gemini, Apollo, Skylab, Shuttle, Mariner, Voyager, and Galileo) and three research programs (digital fly-by-wire, STAR, and the Unified Data System) are useful in projecting the computer hardware configuration of the Space Station and the ways in which the Ada programming language will enhance the development of the necessary software. The evolution of hardware technology, fault-protection methods, and software architectures used in space flight is reviewed to provide insight into the pending development of such items for the Space Station.

  9. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    NASA Astrophysics Data System (ADS)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute the auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate comparable speed with other clustering codes, and code accuracy compared to known and analytic results.
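
    The delete-one-subsample jackknife described above can be sketched in a few lines. The subsample labels here are arbitrary integers standing in for the code's HEALPix-derived sky regions:

      # Jackknife error: recompute the statistic with one subsample deleted
      # at a time (labels stand in for HEALPix-partitioned sky regions).
      import numpy as np

      def jackknife_error(values, labels, statistic=np.mean):
          labels = np.asarray(labels)
          estimates = np.array([statistic(values[labels != k])
                                for k in np.unique(labels)])
          n = estimates.size
          return np.sqrt((n - 1) / n * np.sum((estimates - estimates.mean())**2))

      rng = np.random.default_rng(0)
      vals = rng.normal(1.0, 0.1, size=1000)
      print(jackknife_error(vals, labels=np.arange(1000) % 20))  # ~0.003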

  10. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

    The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an on board computer, and the environment in which all are to operate.
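    The numerical core described, integrating a set of first-order differential equations while monitoring selected variables, can be sketched as below. The fixed-step RK4 scheme and the toy gimbal dynamics are assumptions; the original program's integrator and models are not specified here.

```python
# Sketch of a simulation driver: fixed-step RK4 over first-order ODEs
# with a monitoring hook for printed/plotted variables.
import numpy as np

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(f, y0, t_end, h, monitor):
    t, y = 0.0, np.asarray(y0, dtype=float)
    while t < t_end:
        y = rk4_step(f, t, y, h)
        t += h
        monitor(t, y)                  # record or plot selected variables

# Toy dynamics: one lightly damped gimbal axis as two first-order equations.
f = lambda t, y: np.array([y[1], -0.1 * y[1] - 4.0 * y[0]])
simulate(f, [1.0, 0.0], 10.0, 0.01, lambda t, y: None)
```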

  11. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environments, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which improves the burst-erasure protection capability by applying the convolution property to the tTN code and reduces computational complexity by abrogating the multi-level structure. Simulation results show that the cTN code provides better packet-loss protection with lower computational complexity than the tTN code.
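    The XOR-parity principle that tornado-style erasure codes build on can be shown in toy form. The sketch below is not the cTN construction from the paper; it only demonstrates recovering a single erased packet from one XOR parity packet.

```python
# Toy single-parity erasure code: any one missing packet is the XOR of the rest.
from functools import reduce

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(packets):
    return packets + [reduce(xor, packets)]          # append one parity packet

def recover_one_erasure(received, total):
    # received: {index: packet}, with exactly one index in range(total) missing
    missing = (set(range(total)) - set(received)).pop()
    received[missing] = reduce(xor, received.values())
    return [received[i] for i in range(total - 1)]   # drop the parity packet

data = [b"abcd", b"efgh", b"ijkl"]
coded = encode(data)
got = {i: p for i, p in enumerate(coded) if i != 1}  # packet 1 erased in transit
assert recover_one_erasure(got, len(coded)) == data
```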

  12. Multi-GNSS precise point positioning (MGPPP) using raw observations

    NASA Astrophysics Data System (ADS)

    Liu, Teng; Yuan, Yunbin; Zhang, Baocheng; Wang, Ningbo; Tan, Bingfeng; Chen, Yongchang

    2017-03-01

    A joint-processing model for multi-GNSS (GPS, GLONASS, BDS and GALILEO) precise point positioning (PPP) is proposed, in which raw code and phase observations are used. In the proposed model, inter-system biases (ISBs) and GLONASS code inter-frequency biases (IFBs) are carefully considered, and the GLONASS code IFBs are modeled as a linear function of frequency number. To obtain a full-rank function model, the unknowns are re-parameterized, and the estimable slant ionospheric delays and ISBs/IFBs are derived and estimated simultaneously. One month of data from April 2015, from 32 stations of the International GNSS Service (IGS) Multi-GNSS Experiment (MGEX) tracking network, was used to validate the proposed model. Preliminary results show that RMS values of the positioning errors (with respect to external double-difference solutions) for static/kinematic solutions (four systems) are 6.2 mm/2.1 cm (north), 6.0 mm/2.2 cm (east) and 9.3 mm/4.9 cm (up). One-day stabilities of the estimated ISBs, described by STD values, are 0.36 and 0.38 ns for GLONASS and BDS, respectively. Significant ISB jumps are identified between adjacent days for all stations, caused by the satellite clock datums differing between days and between systems. Unlike the ISBs, the estimated GLONASS code IFBs are quite stable for all stations, with an average STD of 0.04 ns over a month. A single-difference experiment on a short baseline shows that PPP ionospheric delays are more precise than traditional leveling ionospheric delays.
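    The paper's linear IFB model can be written as IFB(k) = a + b*k, where k is the GLONASS satellite frequency number. A least-squares fit of the two coefficients might look like the sketch below; the bias values are synthetic, not estimates from MGEX data.

```python
# Fit GLONASS code IFBs modeled as a linear function of frequency number k.
import numpy as np

k = np.arange(-7, 7)                                    # frequency numbers
rng = np.random.default_rng(1)
ifb_obs = 0.8 + 0.12 * k + rng.normal(0, 0.04, k.size)  # synthetic IFBs (ns)

A = np.column_stack([np.ones(k.size), k])               # design matrix [1, k]
(a, b), *_ = np.linalg.lstsq(A, ifb_obs, rcond=None)
print(f"intercept a = {a:.3f} ns, slope b = {b:.3f} ns per frequency number")
```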

  13. Description of real-time Ada software implementation of a power system monitor for the Space Station Freedom PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Ludwig, Kimberly; Mackin, Michael; Wright, Theodore

    1991-01-01

    The authors describe the Ada language software developed to perform the electrical power system monitoring functions for the NASA Lewis Research Center's Power Management and Distribution (PMAD) DC testbed. The results of the effort to implement this monitor are presented. The PMAD DC testbed is a reduced-scale prototype of the electric power system to be used in Space Station Freedom. The power is controlled by smart switches known as power control components (or switchgear). The power control components are currently coordinated by five Compaq 386/20e computers connected through an 802.4 local area network. The power system monitor algorithm comprises several functions, including periodic data acquisition, data smoothing, system performance analysis, and status reporting. Data are collected from the switchgear sensors every 100 ms, then passed through a 2-Hz digital filter. System performance analysis includes power interruption and overcurrent detection. The system monitor required a hardware timer interrupt to activate the data acquisition function. The execution time of the code was optimized by using an assembly language routine. The routine allows direct vectoring of the processor to Ada language procedures that perform periodic control activities.
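    The smoothing stage described above, 10-Hz samples (one every 100 ms) passed through a 2-Hz digital filter, can be sketched as follows. The Butterworth design, filter order, and overcurrent threshold are assumptions; the testbed's Ada implementation is not reproduced here.

```python
# 2-Hz low-pass filtering of 10-Hz samples, followed by a simple
# overcurrent check as in the monitor's performance-analysis step.
import numpy as np
from scipy.signal import butter, lfilter

fs = 10.0                                     # one sample per 100 ms
b, a = butter(2, 2.0, btype="low", fs=fs)     # assumed 2nd-order Butterworth

t = np.arange(0, 5, 1 / fs)
current = 10.0 + np.random.default_rng(2).normal(0, 0.5, t.size)  # noisy amps
smoothed = lfilter(b, a, current)

overcurrent_samples = np.flatnonzero(smoothed > 11.0)  # illustrative threshold
```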

  14. Three-dimensional turbopump flowfield analysis

    NASA Technical Reports Server (NTRS)

    Sharma, O. P.; Belford, K. A.; Ni, R. H.

    1992-01-01

    A program was conducted to develop a flow prediction method applicable to rocket turbopumps. The complex nature of a flowfield in turbopumps is described and examples of flowfields are discussed to illustrate that physics based models and analytical calculation procedures based on computational fluid dynamics (CFD) are needed to develop reliable design procedures for turbopumps. A CFD code developed at NASA ARC was used as the base code. The turbulence model and boundary conditions in the base code were modified, respectively, to: (1) compute transitional flows and account for extra rates of strain, e.g., rotation; and (2) compute surface heat transfer coefficients and allow computation through multistage turbomachines. Benchmark quality data from two and three-dimensional cascades were used to verify the code. The predictive capabilities of the present CFD code were demonstrated by computing the flow through a radial impeller and a multistage axial flow turbine. Results of the program indicate that the present code operated in a two-dimensional mode is a cost effective alternative to full three-dimensional calculations, and that it permits realistic predictions of unsteady loadings and losses for multistage machines.

  15. 47 CFR 11.51 - EAS code and Attention Signal Transmission requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Message (EOM) codes using the EAS Protocol. The Attention Signal must precede any emergency audio message... audio messages. No Attention Signal is required for EAS messages that do not contain audio programming... EAS messages in the main audio channel. All DAB stations shall also transmit EAS messages on all audio...

  16. 47 CFR 11.51 - EAS code and Attention Signal Transmission requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Message (EOM) codes using the EAS Protocol. The Attention Signal must precede any emergency audio message... audio messages. No Attention Signal is required for EAS messages that do not contain audio programming... EAS messages in the main audio channel. All DAB stations shall also transmit EAS messages on all audio...

  17. 47 CFR 11.51 - EAS code and Attention Signal Transmission requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Message (EOM) codes using the EAS Protocol. The Attention Signal must precede any emergency audio message... audio messages. No Attention Signal is required for EAS messages that do not contain audio programming... EAS messages in the main audio channel. All DAB stations shall also transmit EAS messages on all audio...

  18. High altitude chemically reacting gas particle mixtures. Volume 3: Computer code user's and applications manual. [rocket nozzle and orbital plume flow fields

    NASA Technical Reports Server (NTRS)

    Smith, S. D.

    1984-01-01

    A users manual for the RAMP2 computer code is provided. The RAMP2 code can be used to model the dominant phenomena which affect the prediction of liquid and solid rocket nozzle and orbital plume flow fields. The general structure and operation of RAMP2 are discussed. A user input/output guide for the modified TRAN72 computer code and the RAMP2F code is given. The application and use of the BLIMPJ module are considered. Sample problems involving the space shuttle main engine and motor are included.

  19. View northeast of a microchip based computer control system installed ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View northeast of a microchip-based computer control system installed in the early 1980s to replace Lamokin Tower, at center of photograph; panels 1 and 2 at right of photograph are part of the main supervisory board; panel 1 controlled Allen Lane sub-station #7; responsibility for this portion of the system was transferred to the Southeastern Pennsylvania Transportation Authority (SEPTA) in 1985; panel 2 at extreme right controls catenary switches in a coach storage yard adjacent to the station - Thirtieth Street Station, Power Director Center, Thirtieth & Market Streets in Amtrak Railroad Station, Philadelphia, Philadelphia County, PA

  20. Development of numerical methods for overset grids with applications for the integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    1995-01-01

    Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.

  1. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured-grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine was written for Navier-Stokes codes that use the new one-equation turbulence models. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  2. ISSYS: An integrated synergistic Synthesis System

    NASA Technical Reports Server (NTRS)

    Dovi, A. R.

    1980-01-01

    Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.

  3. Recent plant studies using Victoria 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BIXLER,NATHAN E.; GASSER,RONALD D.

    2000-03-08

    VICTORIA 2.0 is a mechanistic computer code designed to analyze fission product behavior within the reactor coolant system (RCS) during a severe nuclear reactor accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and the transport and deposition of these materials within the RCS and secondary circuits. These predictions account for the chemical and aerosol processes that affect radionuclide behavior. VICTORIA 2.0 was released in early 1999; a new version, VICTORIA 2.1, is now under development. The largest improvements in VICTORIA 2.1 are connected with the thermochemical database, which is being revised and expanded following the recommendations of a peer review. Three risk-significant severe accident sequences have recently been investigated using the VICTORIA 2.0 code. The focus here is on how various chemistry options affect the predictions. Additionally, the VICTORIA predictions are compared with ones made using the MELCOR code. The three sequences are a station blackout in a GE BWR and steam generator tube rupture (SGTR) and pump-seal LOCA sequences in a 3-loop Westinghouse PWR. These sequences cover a range of system pressures, from fully depressurized to full system pressure. The chief results of this study are the fission product fractions retained in the core, RCS, secondary, and containment, and the fractions released into the environment.

  4. EPIC Computer Cards

    NASA Image and Video Library

    2011-12-29

    ISS030-E-017776 (29 Dec. 2011) --- Working in chorus with the International Space Station team in Houston's Mission Control Center, this astronaut and his Expedition 30 crewmates on the station install a set of Enhanced Processor and Integrated Communications (EPIC) computer cards in one of seven primary computers onboard. The upgrade will allow more experiments to operate simultaneously, and prepare for the arrival of commercial cargo ships later this year.

  5. Work-related health disorders among Saudi computer users.

    PubMed

    Jomoah, Ibrahim M

    2014-01-01

    The present study was conducted to investigate the prevalence of musculoskeletal disorders and eye and vision complaints among the computer users of King Abdulaziz University (KAU), Saudi Arabian Airlines (SAUDIA), and Saudi Telecom Company (STC). Stratified random samples of the work stations and operators at each of the studied institutions were selected; the ergonomics of the work stations were assessed and the operators' health complaints were investigated. The average ergonomic score of the studied work stations at STC, KAU, and SAUDIA was 81.5%, 73.3%, and 70.3%, respectively. Most of the examined operators use computers daily for ≤ 7 hours, yet they had average incidences of general complaints (e.g., headache, body fatigue, and lack of concentration) and relatively high incidences of eye and vision complaints and musculoskeletal complaints. The incidences of the complaints have been found to increase with the (a) decrease in work station ergonomic score, (b) progress of age and duration of employment, (c) smoking, (d) use of computers, (e) lack of work satisfaction, and (f) history of operators' previous ailments. It has been recommended to improve the ergonomics of the work stations, set up training programs, and conduct preplacement and periodical examinations for operators.

  6. Work-Related Health Disorders among Saudi Computer Users

    PubMed Central

    Jomoah, Ibrahim M.

    2014-01-01

    The present study was conducted to investigate the prevalence of musculoskeletal disorders and eye and vision complaints among the computer users of King Abdulaziz University (KAU), Saudi Arabian Airlines (SAUDIA), and Saudi Telecom Company (STC). Stratified random samples of the work stations and operators at each of the studied institutions were selected; the ergonomics of the work stations were assessed and the operators' health complaints were investigated. The average ergonomic score of the studied work stations at STC, KAU, and SAUDIA was 81.5%, 73.3%, and 70.3%, respectively. Most of the examined operators use computers daily for ≤ 7 hours, yet they had average incidences of general complaints (e.g., headache, body fatigue, and lack of concentration) and relatively high incidences of eye and vision complaints and musculoskeletal complaints. The incidences of the complaints have been found to increase with the (a) decrease in work station ergonomic score, (b) progress of age and duration of employment, (c) smoking, (d) use of computers, (e) lack of work satisfaction, and (f) history of operators' previous ailments. It has been recommended to improve the ergonomics of the work stations, set up training programs, and conduct preplacement and periodical examinations for operators. PMID:25383379

  7. High-resolution seismicity catalog of Italian peninsula in the period 1981-2015

    NASA Astrophysics Data System (ADS)

    Michele, M.; Latorre, D.; Castello, B.; Di Stefano, R.; Chiaraluce, L.

    2017-12-01

    In order to provide an updated reference catalog of Italian seismicity, the absolute locations of the last 35 years (1981-2015) of seismic activity were computed with a three-dimensional VP and VS velocity model covering the whole Italian territory. The NonLinLoc code (Lomax et al., 2000), which is based on a probabilistic approach, was used to provide a complete and robust description of the uncertainties associated with the locations, corresponding to the hypocentral solutions with the highest probability density. Moreover, the code, which uses a finite-difference approximation of the eikonal equation (Podvin and Lecomte, 1991), can handle strongly contrasted velocity models in the arrival-time computation. To optimize the earthquake locations, we included station corrections in the inverse problem. For each year, the number of available earthquakes depends on both the network detection capability and the occurrence of major seismic sequences. The starting earthquake catalog was based on 2.6 million P and 1.9 million S arrival-time picks for 278,607 selected earthquakes, each recorded by at least 3 stations of the Italian seismic network. Compared to previous catalogs, which consisted of hypocentral locations retrieved with linearized location methods, the new catalog shows a marked improvement, as evidenced by the location parameters assessing the quality of the solutions (i.e., RMS, azimuthal gap, formal error on the horizontal and vertical components). In addition, we used the distance between the expected and the maximum-likelihood hypocenter locations to establish the unimodal (well-resolved location) or multimodal (poorly resolved location) character of the probability distribution. We used these parameters to classify the resulting locations into four classes (A, B, C and D) according to the combined goodness of the preceding parameters. The upper classes (A and B) include 65% of the relocated earthquakes, while the lowest class (D) includes only 7% of the seismicity. We present the new catalog, consisting of 272,847 events, showing some examples of earthquake locations related to the background seismicity as well as to small to large seismic sequences that occurred in Italy over the last 35 years.

  8. User's manual for a two-dimensional, ground-water flow code on the Octopus computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.

    1978-08-30

    A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.

  9. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

    Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.

  10. Agricultural Spraying

    NASA Technical Reports Server (NTRS)

    1986-01-01

    AGDISP, a computer code written for Langley by Continuum Dynamics, Inc., aids crop dusting airplanes in targeting pesticides. The code is commercially available and can be run on a personal computer by an inexperienced operator. Called SWA+H, it is used by the Forest Service, FAA, DuPont, etc. DuPont uses the code to "test" equipment on the computer using a laser system to measure particle characteristics of various spray compounds.

  11. The adaption and use of research codes for performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    1987-05-01

    Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.

  12. Topological color codes on Union Jack lattices: a stable implementation of the whole Clifford group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katzgraber, Helmut G.; Theoretische Physik, ETH Zurich, CH-8093 Zurich; Bombin, H.

    We study the error threshold of topological color codes on Union Jack lattices that allow for the full implementation of the whole Clifford group of quantum gates. After mapping the error-correction process onto a statistical mechanical random three-body Ising model on a Union Jack lattice, we compute its phase diagram in the temperature-disorder plane using Monte Carlo simulations. Surprisingly, topological color codes on Union Jack lattices have a similar error stability to color codes on triangular lattices, as well as to the Kitaev toric code. The enhanced computational capabilities of the topological color codes on Union Jack lattices with respect to triangular lattices and the toric code, combined with the inherent robustness of this implementation, show good prospects for future stable quantum computer implementations.

  13. Safe pill-dispensing.

    PubMed

    Testa, Massimiliano; Pollard, John

    2007-01-01

    Each patient is supplied with a smart-card containing a Radio Frequency IDentification (RFID) chip storing a unique identification code. The patient places the smart-card on a pill-dispenser unit containing an RFID reader. The RFID chip is read and the code sent to a base-station via a wireless Bluetooth link. A database containing both patient details and treatment information is queried at the base-station using the RFID code as the search key. The patient's treatment data (i.e., drug names, quantities, time, etc.) are retrieved and sent back to the pill-dispenser unit via Bluetooth. Appropriate quantities of the required medications are automatically dispensed, unless the patient has already taken his/her daily dose. Safe, confidential communication and operation are ensured.
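    A sketch of the dispensing decision is given below. The record layout, function names, and the in-memory dictionary standing in for the base-station database are illustrative stand-ins, not the authors' implementation; the RFID read and Bluetooth transport are outside the sketch.

```python
# Base-station lookup keyed by the smart-card's RFID code, enforcing
# one dispense per day. All names and fields are illustrative.
from datetime import date

TREATMENTS = {
    "04A1B2C3": {"drug": "metformin", "pills": 2, "last_dispensed": None},
}

def request_dispense(rfid_code, today=None):
    today = today or date.today()
    rec = TREATMENTS.get(rfid_code)
    if rec is None:
        return "unknown card"                    # not a registered patient
    if rec["last_dispensed"] == today:
        return "already dispensed today"         # daily-dose guard
    rec["last_dispensed"] = today
    return f"dispense {rec['pills']} x {rec['drug']}"

print(request_dispense("04A1B2C3"))   # dispense 2 x metformin
print(request_dispense("04A1B2C3"))   # already dispensed today
```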

  14. Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter

    DTIC Science & Technology

    2007-08-31

    Figure captions recoverable from the report: low-altitude and high-altitude fields produced by 10-kHz and 20-kHz sources, computed using the FD and TD codes (the agreement is excellent, validating the new FD code), and fields shown versus latitude for three different grid spacings.

  15. Dispel4py: An Open-Source Python library for Data-Intensive Seismology

    NASA Astrophysics Data System (ADS)

    Filgueira, Rosa; Krause, Amrey; Spinuso, Alessandro; Klampanos, Iraklis; Danecek, Peter; Atkinson, Malcolm

    2015-04-01

    Scientific workflows are a necessary tool for many scientific communities, as they enable easy composition and execution of applications on computing resources while scientists can focus on their research without being distracted by computation management. Nowadays, scientific communities (e.g., seismology) have access to a large variety of computing resources, and their computational problems are best addressed using parallel computing technology. However, successful use of these technologies requires a lot of additional machinery whose use is not straightforward for non-experts: different parallel frameworks (MPI, Storm, multiprocessing, etc.) must be used depending on the computing resources (local machines, grids, clouds, clusters) where applications are run. This implies that to achieve the best application performance, users usually have to change their codes depending on the features of the platform selected for running them. This work presents dispel4py, a new open-source Python library for describing abstract stream-based workflows for distributed data-intensive applications. Special care has been taken to give dispel4py the ability to map abstract workflows to different platforms dynamically at run time. Currently dispel4py has four mappings: Apache Storm, MPI, multi-threading and sequential. The main goal of dispel4py is to provide an easy-to-use tool for developing and testing workflows on local resources, using the sequential mode with a small dataset. Later, once a workflow is ready for long runs, it can be automatically executed on different parallel resources; dispel4py takes care of the underlying mappings by performing an efficient parallelisation. Processing Elements (PEs) represent the basic computational activities of any dispel4py workflow, such as a seismological algorithm or a data transformation process. To create a dispel4py workflow, users only have to write a few lines of code describing their PEs and how they are connected, using Python, which is widely supported on many platforms and popular in many scientific domains, such as the geosciences. Once a dispel4py workflow is written, a user only has to select which mapping to use, and everything else (parallelisation, distribution of data) is carried out by dispel4py at no extra cost to the user. Among dispel4py's features we would like to highlight the following: * The PEs are connected by streams rather than by writing to and reading from intermediate files, avoiding many IO operations. * The PEs can be stored in a registry, so different users can recombine PEs in many different workflows. * dispel4py has been enriched with a provenance mechanism to support runtime provenance analysis. We have adopted the W3C-PROV data model, which is accessible via a prototypal browser-based user interface and a web API. It supports users with the visualisation of graphical products and offers combined operations to access and download the data, which may be selectively stored at runtime in dedicated data archives. dispel4py has already been used by seismologists in the VERCE project to develop different seismic workflows. One of them is the Seismic Ambient Noise Cross-Correlation workflow, which preprocesses and cross-correlates traces from several stations. This workflow was first tested on a local machine using a small number of stations as input data. Later, it was executed on different parallel platforms (the SuperMUC cluster and the Terracorrelator machine), automatically scaling up by using the MPI and multiprocessing mappings with up to 1000 stations as input data. The results show that dispel4py achieves scalable performance with both mappings on the different parallel platforms tested.
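    The stream-connected PE style can be imitated with plain Python generators, as below. This is not the dispel4py API; it is only a minimal analogue of composing a producer, a transformation, and a pairwise-combination PE into one dataflow.

```python
# Generator-based analogue of stream-connected processing elements (PEs).
def read_traces(station_ids):
    for sid in station_ids:                 # producer PE
        yield {"station": sid, "trace": [0.1, 0.2, 0.3]}

def preprocess(stream):
    for item in stream:                     # transformation PE: demean traces
        m = sum(item["trace"]) / len(item["trace"])
        item["trace"] = [x - m for x in item["trace"]]
        yield item

def cross_correlate(stream):
    prev = None
    for item in stream:                     # pairwise combination PE
        if prev is not None:
            yield (prev["station"], item["station"])
        prev = item

# Composing the generators mirrors connecting PEs in a workflow graph.
for pair in cross_correlate(preprocess(read_traces(["ST01", "ST02", "ST03"]))):
    print("correlate", pair)
```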

  16. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  17. The Albuquerque Seismological Laboratory Data Quality Analyzer

    NASA Astrophysics Data System (ADS)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, and sensor coherence. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust the weighting of a station's grade, and plot metrics as a function of time. The website dynamically loads all station data from the PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data are stored at the level of one sensor channel per day. The database is populated by Seedscan, which reads and processes miniSEED data to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run as a scheduled task or on demand by way of a config file, and it computes the metrics specified in its configuration file. While many metrics are still in development, some are complete and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. As Seedscan is scheduled to run every night, data quality analysts can then use the website to diagnose changes in noise levels or other anomalous data. This allows errors to be corrected quickly and efficiently. The code is designed to be flexible for adding metrics and portable for use in other networks. We anticipate further development of the DQA by improving the existing web interface, adding more metrics, adding an interface to facilitate the verification of historic station metadata and performance, and adding an interface to allow better monitoring of data quality goals.
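    The grade aggregation idea, per-channel metric values combined into a station grade, reduces to a weighted mean, as in the sketch below. The metric names, 0-100 scaling, and weights are illustrative, not the DQA's actual configuration.

```python
# Weighted aggregation of per-station metric scores into a single grade.
metrics = {                     # metric -> (score scaled to 0-100, weight)
    "availability":         (99.2, 2.0),
    "timing_quality":       (95.0, 1.5),
    "gap_count_score":      (88.0, 1.0),
    "nlnm_deviation_score": (76.5, 1.0),
}

def station_grade(metrics):
    total_weight = sum(w for _, w in metrics.values())
    return sum(v * w for v, w in metrics.values()) / total_weight

print(f"station grade: {station_grade(metrics):.1f}")   # about 92 here
```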

  18. Computer Assisted Communication within the Classroom: Interactive Lecturing.

    ERIC Educational Resources Information Center

    Herr, Richard B.

    At the University of Delaware student-teacher communication within the classroom was enhanced through the implementation of a versatile, yet cost efficient, application of computer technology. A single microcomputer at a teacher's station controls a network of student keypad/display stations to provide individual channels of continuous…

  19. Malenchenko uses a computer in the SM during Joint Operations

    NASA Image and Video Library

    2008-03-21

    S123-E-008370 (21 March 2008) --- Cosmonaut Yuri I. Malenchenko, Expedition 16 flight engineer representing Russia's Federal Space Agency, uses a computer in the Zvezda Service Module of the International Space Station while Space Shuttle Endeavour (STS-123) is docked with the station.

  20. Analysis of airborne antenna systems using geometrical theory of diffraction and moment method computer codes

    NASA Technical Reports Server (NTRS)

    Hartenstein, Richard G., Jr.

    1985-01-01

    Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.

  1. THC-MP: High performance numerical simulation of reactive transport and multiphase flow in porous media

    NASA Astrophysics Data System (ADS)

    Wei, Xiaohui; Li, Weishan; Tian, Hailong; Li, Hongliang; Xu, Haixiao; Xu, Tianfu

    2015-07-01

    The numerical simulation of multiphase flow and reactive transport in porous media for complex subsurface problems is a computationally intensive application. To meet these increasing computational requirements, this paper presents a parallel computing method and architecture. Starting from TOUGHREACT, a well-established code for simulating subsurface multiphase flow and reactive transport problems, we developed THC-MP, a high-performance computing code for massively parallel computers that greatly extends the computational capability of the original code. The domain decomposition method was applied to the coupled numerical computing procedure in THC-MP. We designed the distributed data structures, implemented the data initialization and exchange between computing nodes, and built the core solving module on a hybrid parallel iterative and direct solver. Numerical accuracy of THC-MP was verified on a CO2 injection-induced reactive transport problem by comparing the results from the parallel code with those from sequential computing (the original code). Execution efficiency and code scalability were examined through field-scale carbon sequestration applications on a multicore cluster. The results demonstrate the enhanced performance achieved by THC-MP on parallel computing facilities.

  2. Code of Ethical Conduct for Computer-Using Educators: An ICCE Policy Statement.

    ERIC Educational Resources Information Center

    Computing Teacher, 1987

    1987-01-01

    Prepared by the International Council for Computers in Education's Ethics and Equity Committee, this code of ethics for educators using computers covers nine main areas: curriculum issues, issues relating to computer access, privacy/confidentiality issues, teacher-related issues, student issues, the community, school organizational issues,…

  3. Embedding Secure Coding Instruction into the IDE: Complementing Early and Intermediate CS Courses with ESIDE

    ERIC Educational Resources Information Center

    Whitney, Michael; Lipford, Heather Richter; Chu, Bill; Thomas, Tyler

    2018-01-01

    Many of the software security vulnerabilities that people face today can be remediated through secure coding practices. A critical step toward the practice of secure coding is ensuring that our computing students are educated on these practices. We argue that secure coding education needs to be included across a computing curriculum. We are…

  4. Calculation of water drop trajectories to and about arbitrary three-dimensional lifting and nonlifting bodies in potential airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1985-01-01

    Subsonic, external flow about nonlifting bodies, lifting bodies or combinations of lifting and nonlifting bodies is calculated by a modified version of the Hess lifting code. Trajectory calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Inlet flow can be accommodated, and high Mach number compressibility effects are corrected for approximately. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
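    The core of such a trajectory calculation is integrating the drop's equation of motion with gravity and an empirical drag law. The sketch below uses the Schiller-Naumann sphere-drag correlation and a uniform airstream as a placeholder for the report's potential-flow field; all constants and names are illustrative.

```python
# Integrate a water drop's motion under gravity and empirical drag.
import numpy as np

RHO_AIR, MU_AIR, RHO_W, G = 1.225, 1.81e-5, 1000.0, 9.81

def drag_coeff(re):
    # Schiller-Naumann sphere-drag correlation, valid up to Re ~ 800.
    return 24.0 / re * (1.0 + 0.15 * re**0.687) if re > 1e-9 else 0.0

def accel(v_drop, v_air, d):
    vrel = v_air - v_drop
    speed = np.linalg.norm(vrel)
    re = RHO_AIR * speed * d / MU_AIR
    mass = RHO_W * np.pi * d**3 / 6.0
    f_drag = 0.5 * RHO_AIR * drag_coeff(re) * (np.pi * d**2 / 4) * speed * vrel
    return f_drag / mass + np.array([0.0, -G])   # drag plus gravity settling

# Forward-Euler trajectory of a 100-micron drop in a uniform 50 m/s stream.
d, dt = 100e-6, 1e-4
x, v = np.zeros(2), np.zeros(2)
v_air = np.array([50.0, 0.0])
for _ in range(2000):
    v += accel(v, v_air, d) * dt
    x += v * dt
```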

  5. Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.

    DTIC Science & Technology

    1990-09-01

    Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that... of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve...

  6. A COTS-Based Replacement Strategy for Aging Avionics Computers

    DTIC Science & Technology

    2001-12-01

    Recoverable figure and diagram labels from the report: communication control unit; COTS microprocessor; real-time operating system; new native code; native code objects and threads; legacy functions; virtual component environment; context-switch thunk; add-in replacement.

  7. Modeling environmental bias and computing velocity field from data of Terra Nova Bay GPS network in Antarctica by means of a quasi-observation processing approach

    USGS Publications Warehouse

    Casula, Giuseppe; Dubbini, Marco; Galeandro, Angelo

    2007-01-01

    A semi-permanent GPS network of about 30 vertices has been installed at Terra Nova Bay (TNB) near Ross Sea in Antarctica. A permanent GPS station TNB1 based on an Ashtech Z-XII dual frequency P-code GPS receiver with ASH700936D_M Choke Ring Antenna has been mounted on a reinforced concrete pillar built on bedrock since October 1998 and has recorded continuously up to the present. The semi-permanent network has been routinely surveyed every summer using high quality dual frequency GPS receivers with 24 hour sessions at 15 sec rate; data, metadata and solutions will be available to the scientific community at (http://www.geodant.unimore.it). We present the results of a distributed session approach applied to processing GPS data of the TNB GPS network, and based on Gamit/Globk 10.2-3 GPS analysis software. The results are in good agreement with other authors' computations and with many of the theoretical models.

  8. GPS Monitor Station Upgrade Program at the Naval Research Laboratory

    NASA Technical Reports Server (NTRS)

    Galysh, Ivan J.; Craig, Dwin M.

    1996-01-01

    One of the measurements made by the Global Positioning System (GPS) monitor stations is the continuous pseudo-range of all passing GPS satellites. The pseudo-range contains GPS and monitor station clock errors as well as GPS satellite navigation errors. Currently, the time at a GPS monitor station is obtained from the GPS constellation and has an inherent inaccuracy as a result. Improved timing accuracy at the GPS monitor stations will improve GPS performance. The US Naval Research Laboratory (NRL) is developing hardware and software for the GPS monitor station upgrade program to improve the monitor station clock accuracy. This upgrade will provide a method, independent of the GPS satellite constellation, of measuring and correcting monitor station time against US Naval Observatory (USNO) time. The hardware consists of a high-performance atomic cesium frequency standard (CFS) and a computer, which is used to ensemble the CFS with the two CFSs currently located at the monitor station by means of a dual-mixer system. The dual-mixer system achieves phase measurements between the high-performance CFS and the existing monitor station CFSs to within 400 femtoseconds. Time transfer between USNO and a given monitor station is achieved via a two-way satellite time transfer modem. The computer at the monitor station disciplines the CFS based on a comparison with one pulse per second sent from the master site at USNO. The monitor station computer is also used to perform housekeeping functions, as well as recording the health status of all three CFSs. This information is sent to USNO through the time transfer modem. Laboratory time synchronization results in the sub-nanosecond range have been observed, as has the ability to maintain the monitor station CFS frequency to within 3.0 x 10^-14 of the master site at USNO.

  9. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open source code for massive parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes; however, no open source, parallel implementations are available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. The domain decomposition takes into account consistent boundary computation between tasks and includes periodic conditions. In addition, the code computes the neighbors list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
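    A serial sketch of the per-particle quantities PARAVT reports (natural-neighbor lists and Voronoi densities) can be written with scipy, which also wraps Qhull; the MPI domain decomposition and periodic boundaries of the actual code are not reproduced here.

```python
# Serial Voronoi neighbors and densities via scipy's Qhull wrappers.
import numpy as np
from scipy.spatial import ConvexHull, Voronoi

pts = np.random.default_rng(3).random((500, 3))
vor = Voronoi(pts)

# Natural neighbors: particles whose Voronoi cells share a ridge.
neighbors = {i: set() for i in range(len(pts))}
for i, j in vor.ridge_points:
    neighbors[i].add(j)
    neighbors[j].add(i)

def voronoi_density(i):
    region = vor.regions[vor.point_region[i]]
    if -1 in region or not region:       # unbounded cell at the boundary
        return np.nan
    volume = ConvexHull(vor.vertices[region]).volume
    return 1.0 / volume                  # one particle per cell
```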

  10. The STAGS computer code

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.

    1978-01-01

    Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.

  11. Holonomic surface codes for fault-tolerant quantum computation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Devitt, Simon J.; You, J. Q.; Nori, Franco

    2018-02-01

    Surface codes can protect quantum information stored in qubits from local errors as long as the per-operation error rate is below a certain threshold. Here we propose holonomic surface codes by harnessing the quantum holonomy of the system. In our scheme, the holonomic gates are built via auxiliary qubits rather than the auxiliary levels in multilevel systems used in conventional holonomic quantum computation. The key advantage of our approach is that the auxiliary qubits are in their ground state before and after each gate operation, so they are not involved in the operation cycles of surface codes. This provides an advantageous way to implement surface codes for fault-tolerant quantum computation.

  12. Comparison of two- and three-dimensional flow computations with laser anemometer measurements in a transonic compressor rotor

    NASA Technical Reports Server (NTRS)

    Chima, R. V.; Strazisar, A. J.

    1982-01-01

    Two- and three-dimensional inviscid solutions for the flow in a transonic axial compressor rotor at design speed are compared with probe and laser anemometer measurements at near-stall and maximum-flow operating points. Experimental details of the laser anemometer system and computational details of the two-dimensional axisymmetric code and three-dimensional Euler code are described. Comparisons are made between relative Mach number and flow angle contours, shock location, and shock strength. A procedure for using an efficient axisymmetric code to generate downstream pressure input for computationally expensive Euler codes is discussed. A film supplement shows the calculations for the two operating points with the time-marching Euler code.

  13. Burbank works on the EPIC in the Node 2

    NASA Image and Video Library

    2012-02-28

    ISS030-E-114433 (29 Feb. 2012) --- In the International Space Station's Destiny laboratory, NASA astronaut Dan Burbank, Expedition 30 commander, upgrades Multiplexer/Demultiplexer (MDM) computers and Portable Computer System (PCS) laptops and installs the Enhanced Processor & Integrated Communications (EPIC) hardware in the Payload 1 (PL-1) MDM.

  14. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and their interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons due to charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed computer code is applicable to both neutron and gamma sources. Hence, the discrimination of neutrons and gammas in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated by comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with results obtained from similar computer codes like SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
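    The final step, folding the ideal pulse height spectrum with the experimental resolution, is commonly done with a light-output-dependent Gaussian. The two-term FWHM model and parameter values below are assumptions for illustration, not NE-213 calibration constants.

```python
# Fold an ideal pulse height spectrum with a Gaussian whose width
# depends on light output L: FWHM(L) = sqrt(a^2 * L + b^2 * L^2).
import numpy as np

def broaden(edges, counts, a=0.1, b=0.05):
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    out = np.zeros_like(counts, dtype=float)
    for L, n in zip(centers, counts):
        sigma = np.sqrt(a**2 * L + b**2 * L**2) / 2.355  # FWHM -> sigma
        out += n * width * np.exp(-0.5 * ((centers - L) / sigma) ** 2) \
                 / (sigma * np.sqrt(2 * np.pi))
    return out

edges = np.linspace(0.01, 2.0, 200)        # light output (MeVee), avoids L = 0
ideal = np.zeros(199)
ideal[150] = 1e4                           # one mono-energetic peak
smeared = broaden(edges, ideal)
```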

  15. Demographics of emergency medical care at the Indianapolis 500 mile race (1983-1990).

    PubMed

    Bock, H C; Cordell, W H; Hawk, A C; Bowdish, G E

    1992-10-01

    The Indianapolis 500 Mile Race, the largest single-day, single-venue sporting event in the world, is attended by an estimated 400,000 people. Major illness and injury are treated at the Hanna Emergency Medical Center, the track hospital. Minor illness is treated at ten outlying aid stations. We describe the demographics of emergency medical care at the Hanna Emergency Medical Center in a descriptive study. Patient care data for patients treated at the medical center are first recorded on paper charts and then coded and transferred to computer. Data regarding patients treated at the medical center during eight consecutive races (1983-1990) were analyzed. Frequency of treatment and medical cardiac arrest rates were calculated. Aid station data and medical center records from nonrace days were not analyzed. The average number of patients treated per year at the track hospital was 139. The total number treated over the eight-year period was 1,113, yielding a treatment frequency of 0.35 per 1,000 attendees. Analysis showed that 16.2% of the proprietary treatment codes involved intoxication; 15.4%, lacerations (other than feet); 11.0%, pre-existing conditions; and 8.5%, heat illness. During the eight years, there were four medical cardiac arrests (an incidence of 0.0125 per 10,000 spectators), all resulting in death. A fifth spectator died after being struck by a wheel from a race car. There were no driver deaths on race day. Descriptive data regarding medical care of crowds may be useful to emergency specialists who must staff, order supplies, and plan treatment facilities for similar mass gatherings. It is evident from this and other mass-gathering studies that there is a need for consistency in nomenclature and data collection, which will allow more accurate comparisons of emergency medical care between venues.

  16. Preliminary PANSAT ground station software design and use of an expert system to analyze telemetry

    NASA Astrophysics Data System (ADS)

    Lawrence, Gregory W.

    1994-03-01

    The Petite Amateur Navy Satellite (PANSAT) is a communications satellite designed to be used by civilian amateur radio operators. A master ground station is being built at the Naval Postgraduate School. This computer system performs satellite commanding, displays telemetry, troubleshoots problems, and passes messages. The system also controls an open-loop tracking antenna. This paper concentrates on telemetry display, decoding, and interpretation through artificial intelligence (AI). The telemetry is displayed in an easily interpretable format, so that any user can understand the current health of the satellite and be cued to any problems and possible solutions. Only the master ground station has the ability to receive all telemetry and send commands to the spacecraft; civilian ham users do not have access to this information. The telemetry data are decommutated and analyzed before being displayed to the user, so that the raw data do not have to be interpreted by ground users. The analysis uses CLIPS embedded in the code and derives its inputs from telemetry decommutation. The program is an expert system using a forward-chaining set of rules based on the expected operation and parameters of the satellite. By building the rules during the construction and design of the satellite, the telemetry can be well understood and interpreted after the satellite is launched, when the designers may no longer be available to provide input.
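    The forward-chaining diagnosis can be pictured as a pass of condition-action rules over the decommutated telemetry, as in the simplified stand-in below. Parameter names, limits, and advice strings are invented, and CLIPS itself maintains a richer fact base than this single pass.

```python
# Single forward pass of condition-action rules over telemetry values.
telemetry = {"bus_voltage": 11.2, "battery_temp": 41.0, "tx_power": 2.0}

RULES = [
    (lambda t: t["bus_voltage"] < 11.5,
     "Low bus voltage: check solar array input and battery charge state."),
    (lambda t: t["battery_temp"] > 40.0,
     "Battery over-temperature: reduce charge rate."),
    (lambda t: t["tx_power"] < 1.0,
     "Transmitter output low: verify power amplifier."),
]

def diagnose(telemetry):
    # Fire every rule whose condition matches the current fact base.
    return [advice for condition, advice in RULES if condition(telemetry)]

for msg in diagnose(telemetry):
    print("ALERT:", msg)
```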

  17. Principal facts of gravity stations with gravity and magnetic profiles from the Southwest Nevada Test Site, Nye County, Nevada, as of January, 1982

    USGS Publications Warehouse

    Jansma, P.E.; Snyder, D.B.; Ponce, David A.

    1983-01-01

    Three gravity profiles and principal facts of 2,604 gravity stations in the southwest quadrant of the Nevada Test Site are documented in this data report. The residual gravity profiles show the gravity measurements and the smoothed curves derived from these points that were used in geophysical interpretations. The principal facts include station label, latitude, longitude, elevation, observed gravity value, and terrain correction for each station, as well as the derived complete Bouguer and isostatic anomalies, reduced at a density of 2.67 g/cm³. Accuracy codes, where available, further document the data.

  18. ACTS TDMA network control. [Advanced Communication Technology Satellite

    NASA Technical Reports Server (NTRS)

    Inukai, T.; Campanella, S. J.

    1984-01-01

    This paper presents basic network control concepts for the Advanced Communications Technology Satellite (ACTS) System. Two experimental systems, called the low-burst-rate and high-burst-rate systems, along with ACTS ground system features, are described. The network control issues addressed include frame structures, acquisition and synchronization procedures, coordinated station burst-time plan and satellite-time plan changes, on-board clock control based on ground drift measurements, rain fade control by means of adaptive forward-error-correction (FEC) coding and transmit power augmentation, and reassignment of channel capacities on demand. The NASA ground system, which includes a primary station, diversity station, and master control station, is also described.

  19. EAC: A program for the error analysis of STAGS results for plates

    NASA Technical Reports Server (NTRS)

    Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.

    1989-01-01

    A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example application of the code is presented, and instructions for its use on the Cyber and VAX machines are provided.

  20. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  1. On the error statistics of Viterbi decoding and the performance of concatenated codes

    NASA Technical Reports Server (NTRS)

    Miller, R. L.; Deutsch, L. J.; Butman, S. A.

    1981-01-01

    Computer simulation results are presented on the performance of convolutional codes of constraint lengths 7 and 10 concatenated with the (255, 223) Reed-Solomon code (a proposed NASA standard). These results indicate that as much as 0.8 dB can be gained by concatenating this Reed-Solomon code with a (10, 1/3) convolutional code, instead of the (7, 1/2) code currently used by the DSN. A mathematical model of Viterbi decoder burst-error statistics is developed and is validated through additional computer simulations.
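
    The outer (255, 223) Reed-Solomon code corrects up to t = 16 symbol errors per word. A minimal sketch of the resulting word-error probability, assuming independent 8-bit symbol errors at rate p (a simplification that the burst-error model developed in the paper is designed to refine):

        from math import comb

        def rs_word_error_prob(p_sym, n=255, k=223):
            """P(decoding failure) for RS(n, k) correcting t = (n-k)//2 symbol errors,
            assuming independent symbol errors with probability p_sym."""
            t = (n - k) // 2
            return sum(comb(n, i) * p_sym**i * (1 - p_sym)**(n - i)
                       for i in range(t + 1, n + 1))

        # Illustrative symbol error rates at the Viterbi decoder output
        for p in (0.01, 0.02, 0.05):
            print(f"p_sym = {p}: P(word error) = {rs_word_error_prob(p):.3e}")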

  2. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  3. DSN telemetry system performance with convolutionally coded data

    NASA Technical Reports Server (NTRS)

    Mulhall, B. D. L.; Benjauthrit, B.; Greenhall, C. A.; Kuma, D. M.; Lam, J. K.; Wong, J. S.; Urech, J.; Vit, L. D.

    1975-01-01

    The results obtained to date and the plans for future experiments for the DSN telemetry system are presented. The performance of the DSN telemetry system in decoding convolutionally coded data by both sequential and maximum likelihood techniques is being determined by testing at various deep space stations. The evaluation of performance models is also an objective of this activity.

  4. Utility of Emulation and Simulation Computer Modeling of Space Station Environmental Control and Life Support Systems

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined. The results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorbed (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers. This model was used in parallel with the hardware design and test of this CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system to demonstrate the capabilities to study subsystem integration. The third model is that of a Space Station total air revitalization system. The station configuration consists of a habitat module, a lab module, two crews, and four connecting nodes.

  5. Terminal Homing for Autonomous Underwater Vehicle Docking

    DTIC Science & Technology

    2016-06-01

    ...underwater domain, accurate navigation. Above the water, light and electromagnetic signals travel well through air and space, mediums that allow for a... The use of docking stations for autonomous underwater vehicles (AUV) provides the ability to keep a vehicle on...

  6. VS30, site amplifications and some comparisons: The Adapazari (Turkey) case

    NASA Astrophysics Data System (ADS)

    Ozcep, Tazegul; Ozcep, Ferhat; Ozel, Oguz

    The aim of this study was to investigate the role of VS30 in site amplifications in the Adapazari region, Turkey. To fulfil this aim, amplifications from VS30 measurements were compared with earthquake data for different soil types in the seismic design codes. The Adapazari area was selected as the study area, and the shear-wave velocity distribution was obtained by the multichannel analysis of surface waves (MASW) method at 100 sites for the top 50 m of soil. Aftershocks of the Mw 7.4 Izmit earthquake of 17 August 1999, with magnitudes between 4.0 and 5.6, were recorded at six stations installed in and around the Adapazari Basin, at Babalı, Şeker, Genç, Hastane, Toyota and Imar. These data were used to estimate site amplifications by the reference-station method. In addition, the fundamental periods of the station sites were estimated by the single-station method. Site classifications based on VS30 in the seismic design codes were compared with the fundamental periods and amplification values. It was found that site amplifications (from earthquake data) and relevant spectra (from VS30) are not in good agreement for soils in Adapazari (Turkey).
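
    VS30 itself is the travel-time-averaged shear-wave velocity over the top 30 m, VS30 = 30 / Σ(h_i/v_i). A minimal sketch of that computation from a layered profile such as MASW produces (the layer thicknesses and velocities below are invented):

        def vs30(thicknesses_m, velocities_ms):
            """Travel-time average over the top 30 m: VS30 = 30 / sum(h_i / v_i)."""
            depth, travel_time = 0.0, 0.0
            for h, v in zip(thicknesses_m, velocities_ms):
                use = min(h, 30.0 - depth)           # clip the profile at 30 m depth
                travel_time += use / v
                depth += use
                if depth >= 30.0:
                    break
            if depth < 30.0:                          # extend the last layer if the profile is shallow
                travel_time += (30.0 - depth) / velocities_ms[-1]
            return 30.0 / travel_time

        # Hypothetical soft-soil profile (thickness m, velocity m/s)
        print(round(vs30([4, 8, 10, 20], [140, 220, 330, 500]), 1))  # ~270 m/s, i.e. site class D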

  7. SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMillan, D.B.

    1960-06-01

    A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)

  8. Computer-assisted coding and clinical documentation: first things first.

    PubMed

    Tully, Melinda; Carmichael, Angela

    2012-10-01

    Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.

  9. Telemetry Data Collection from Oscar Satellite

    NASA Technical Reports Server (NTRS)

    Haddock, Paul C.; Horan, Stephen

    1998-01-01

    This paper discusses the design, configuration, and operation of a satellite station built for the Center for Space Telemetering and Telecommunications Laboratory in the Klipsch School of Electrical and Computer Engineering at New Mexico State University (NMSU). This satellite station consists of a computer-controlled antenna tracking system, a 2m/70cm transceiver, satellite tracking software, and a demodulator. The satellite station receives satellite telemetry, allows for voice communications, and will be used in future classes. Currently this satellite station is receiving telemetry from an amateur radio satellite, UoSAT-OSCAR-11. Amateur radio satellites are referred to as Orbiting Satellites Carrying Amateur Radio (OSCAR) satellites, as discussed in the next section.

  10. Implementation of Biogas Stations into Smart Heating and Cooling Network

    NASA Astrophysics Data System (ADS)

    Milčák, P.; Konvička, J.; Jasenská, M.

    2016-10-01

    The paper describes the implementation of a biogas station in the software environment for "Smart Heating and Cooling Networks". The aim of this project is the creation of a software tool for planning the operation and optimizing the supply of heat and cold in small regions. In this case, the biogas station represents a renewable energy source which, however, has operational specifics that need to be taken into account when creating an implementation project. For a specific biogas station, a detailed computational model was elaborated, parameterized in particular to optimize the total computational time.

  11. Assessment of RELAP5/MOD2 against a pressurizer spray valve inadvertent fully opening transient and recovery by natural circulation in Jose Cabrera Nuclear Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arroyo, R.; Rebollo, L.

    1993-06-01

    This document presents the comparison between the simulation results and the plant measurements of a real event that took place in the JOSE CABRERA nuclear power plant on August 30, 1984. The event was originated by the total, continuous and inadvertent opening of the pressurizer spray valve PCV-400A. JOSE CABRERA power plant is a single loop Westinghouse PWR belonging to UNION ELECTRICA FENOSA, S.A. (UNION FENOSA), a Spanish utility which participates in the International Code Assessment and Applications Program (ICAP) as a member of UNIDAD ELECTRICA, S.A. (UNESA). This is the second of its two contributions to the Program: the first one was an application case and this is an assessment one. The simulation has been performed using the RELAP5/MOD2 cycle 36.04 code, running on a CDC CYBER 180/830 computer under the NOS 2.5 operating system. The main phenomena have been calculated correctly, and some conclusions have been obtained about the 3D characteristics of the condensation due to the spray and its simulation with a 1D tool.

  12. Space station tracking requirements feasibility study, volume 2

    NASA Technical Reports Server (NTRS)

    Udalov, Sergei; Dodds, James

    1988-01-01

    The objective of this feasibility study is to determine analytically the accuracies of various sensors being considered as candidates for Space Station use. Specifically, studies were performed to determine whether or not the candidate sensors are capable of providing the required accuracy, or whether alternate sensor approaches should be investigated. Other topics related to operation in the Space Station environment were considered as directed by NASA-JSC. The following topics are addressed: (1) Space Station GPS; (2) Space Station Radar; (3) Docking Sensors; (4) Space Station Link Analysis; (5) Antenna Switching, Power Control, and AGC Functions for Multiple Access; (6) Multichannel Modems; (7) FTS/EVA Emergency Shutdown; (8) Space Station Information Systems Coding; (9) Wanderer Study; and (10) Optical Communications System Analysis. Brief overviews of the abovementioned topics are given. Wherever applicable, the appropriate appendices provide detailed technical analysis. The report is presented in two volumes. This is Volume 2, containing Appendices K through U.

  13. Space station tracking requirements feasibility study, volume 1

    NASA Technical Reports Server (NTRS)

    Udalov, Sergei; Dodds, James

    1988-01-01

    The objective of this feasibility study is to determine analytically the accuracies of various sensors being considered as candidates for Space Station use. Specifically, studies were performed to determine whether or not the candidate sensors are capable of providing the required accuracy, or whether alternate sensor approaches should be investigated. Other topics related to operation in the Space Station environment were considered as directed by NASA-JSC. The following topics are addressed: (1) Space Station GPS; (2) Space Station Radar; (3) Docking Sensors; (4) Space Station Link Analysis; (5) Antenna Switching, Power Control, and AGC Functions for Multiple Access; (6) Multichannel Modems; (7) FTS/EVA Emergency Shutdown; (8) Space Station Information Systems Coding; (9) Wanderer Study; and (10) Optical Communications System Analysis. Brief overviews of the abovementioned topics are given. Wherever applicable, the appropriate appendices provide detailed technical analysis. The report is presented in two volumes. This is Volume 1, containing the main body and Appendices A through J.

  14. Flux concentrations on solar dynamic components due to mispointing

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.

    1992-01-01

    Mispointing of the solar dynamic (SD) concentrator designed for use on Space Station Freedom (SSF) causes the optical axis of the concentrator to be nonparallel to the incoming rays from the Sun. This causes solar flux not to be focused into the aperture hole of the receiver and may position the flux on other SSF components. A Rocketdyne analysis has determined the thermal impact of off-axis radiation due to mispointing on elements of the SD module and photovoltaic (PV) arrays. The conclusion was that flux distributions on some of the radiator components, the two-axis gimbal rings, the truss, and the PV arrays could present problems. The OFFSET computer code was used at Lewis Research Center to further investigate these flux distributions incident on components. The Lewis study included distributions for a greater range of mispoint angles than the Rocketdyne study.

  15. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube with 128 megabytes of memory for a 32-node configuration with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
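
    The speedup quoted in such evaluations is conventionally measured against a single-node run; a trivial sketch of the two standard metrics (the timings are invented):

        def speedup_and_efficiency(t_serial, t_parallel, n_nodes):
            """Classic parallel performance measures: S = T1/Tp, E = S/N."""
            s = t_serial / t_parallel
            return s, s / n_nodes

        # Hypothetical timings for a 32-node hypercube run
        s, e = speedup_and_efficiency(t_serial=640.0, t_parallel=25.0, n_nodes=32)
        print(f"speedup {s:.1f}x, efficiency {e:.0%}")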

  16. Bistatic radar cross section of a perfectly conducting rhombus-shaped flat plate

    NASA Astrophysics Data System (ADS)

    Fenn, Alan J.

    1990-05-01

    The bistatic radar cross section of a perfectly conducting flat plate that has a rhombus shape (equilateral parallelogram) is investigated. The Ohio State University electromagnetic surface patch code (ESP version 4) is used to compute the theoretical bistatic radar cross section of a 35- x 27-in rhombus plate at 1.3 GHz over the bistatic angles 15 deg to 142 deg. The ESP-4 computer code is a method of moments FORTRAN-77 program which can analyze general configurations of plates and wires. This code has been installed and modified at Lincoln Laboratory on a SUN 3 computer network. Details of the code modifications are described. Comparisons of the method of moments simulations and measurements of the rhombus plate are made. It is shown that the ESP-4 computer code provides a high degree of accuracy in the calculation of copolarized and cross-polarized bistatic radar cross section patterns.

  17. ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.

  18. Navier-Stokes Simulation of Homogeneous Turbulence on the CYBER 205

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.; Rogallo, R. S.

    1984-01-01

    A computer code which solves the Navier-Stokes equations for three-dimensional, time-dependent, homogeneous turbulence has been written for the CYBER 205. The code has options for both 64-bit and 32-bit arithmetic. With 32-bit computation, mesh sizes up to 64³ are contained within core of a 2 million 64-bit word memory. Computer speed timing runs were made for various vector lengths up to 6144. With this code, speeds a little over 100 Mflops have been achieved on a 2-pipe CYBER 205. Several problems encountered in the coding are discussed.

  19. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E.

    1985-01-01

    The tether control law to retrieve the satellite was modified in order to have a smooth retrieval trajectory of the satellite that minimizes the thruster activation. The satellite thrusters were added to the rotational dynamics computer code and a preliminary control logic was implemented to simulate them during the retrieval maneuver. The high resolution computer code for modelling the three dimensional dynamics of untensioned tether, SLACK3, was made fully operative and a set of computer simulations of possible tether breakages was run. The distribution of the electric field around an electrodynamic tether in vacuo severed at some length from the shuttle was computed with a three dimensional electrodynamic computer code.

  20. Experimental and computational surface and flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.

    1990-01-01

    The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.

  1. Computer search for binary cyclic UEP codes of odd length up to 65

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Chi-Chang; Lin, Shu

    1990-01-01

    Using an exhaustive computation, the unequal error protection capabilities of all binary cyclic codes of odd length up to 65 that have minimum distances at least 3 are found. For those codes that can only have upper bounds on their unequal error protection capabilities computed, an analytic method developed by Dynkin and Togonidze (1976) is used to show that the upper bounds meet the exact unequal error protection capabilities.
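
    At its core, such an exhaustive computation enumerates the codewords generated by each divisor of x^n - 1 over GF(2). A minimal sketch of the underlying weight enumeration from a generator polynomial, shown for the small (7, 4) cyclic code with g(x) = 1 + x + x^3 rather than one of the length-65 codes of the paper, and without the unequal-error-protection (separation vector) bookkeeping:

        from itertools import product

        def cyclic_code_weights(g, n):
            """Weight distribution of the binary cyclic code of length n generated by g(x);
            g lists GF(2) coefficients, lowest degree first."""
            k = n - (len(g) - 1)
            weights = {}
            for msg in product([0, 1], repeat=k):        # all 2^k messages
                word = [0] * n
                for i, m in enumerate(msg):              # m(x) * g(x) over GF(2)
                    if m:
                        for j, gj in enumerate(g):
                            word[i + j] ^= gj
                w = sum(word)
                weights[w] = weights.get(w, 0) + 1
            return weights

        dist = cyclic_code_weights([1, 1, 0, 1], 7)      # (7, 4) Hamming code
        print(dist)                                      # {0: 1, 3: 7, 4: 7, 7: 1}
        print("d_min =", min(w for w in dist if w))      # 3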

  2. A Combinatorial Geometry Computer Description of the MEP-021A Generator Set

    DTIC Science & Technology

    1979-02-01

    Generator, Computer Description, Gasoline Generator, GIFT, MEP-021A. This... GIFT code is also stored on magnetic tape for future vulnerability analysis. ...the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack...

  3. Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.

    1972-01-01

    A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes on the CDC 6600 computer.

  4. Reconstruction of the effects of the 2004 Sumatra tsunami on the peculiar morphology of the Seychelles Islands: an application to the island of Praslin

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Tinti, S.; Pagnoni, G.; Gallazzi, S. C.; Armigliato, A.

    2009-12-01

    The Seychelles archipelago is located 1600 km east of the African coast, opposite Kenya. The 26 December 2004 Sumatra tsunami hit these islands, killing two people and causing huge damage to structures and facilities. The impact was more moderate than it could have been, because the highest waves arrived during the lowest tide cycle. The difference between low and high tide is about 1.4 meters, and this situation substantially limited the inundation inland. The maximum observed runups were no greater than 4 meters above sea level. All the Seychelles islands lie on a very shallow platform. This platform is set off from the surrounding sea bottom by a rapid change in bathymetry that takes the ocean depth from 2 km to 70-80 m over a very short horizontal distance. This peculiar morphology of the bathymetry has very interesting effects on the tsunami propagation: the platform is capable of modifying significantly the tsunami signal with respect to the surrounding open sea. The main island of the archipelago is Mahé, where the tsunami was recorded by the Pointe La Rue station, located at the end of the international airport on the east side of the island. Praslin is the second largest island of the Seychelles archipelago, and it was chosen as a benchmark for testing numerical models by the research teams involved in the EU-funded SCHEMA (Scenarios for Hazard-induced Emergencies Management) project. The Tsunami Research Team of the University of Bologna, Italy, is a partner in the project and here presents the results obtained for Praslin, computing the inundation maps for the 2004 case, based on the source model proposed by PMEL/NOAA (M=9.3, average slip 18 m, L=700 km, W=100-150 km). The propagation and inundation in the island of Praslin have been computed by means of the UBO-TSUFD code developed and maintained by the Tsunami Research Team of the University of Bologna. The code solves both linear and non-linear shallow water equations with a leap-frog algorithm over staggered nested grids. The high-resolution bathymetry and topography in the Praslin island area were provided by GSC Geosciences Consultant (Bagneux, France), coordinator of SCHEMA. The first goal of the study is to try to reproduce the signal recorded at the Pointe La Rue station in order to test the reliability of the numerical code. Moreover, the effects of the 2004 Sumatra tsunami on the island of Praslin are shown, providing detailed inundation maps and maximum elevation and velocity fields computed with a spatial resolution of 8 meters. Finally, an analysis of the effects of the Seychelles platform on the tsunami is shown and discussed.
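
    As a minimal illustration of the class of scheme described (not the UBO-TSUFD implementation itself), here is a 1-D linear shallow-water solver with a staggered-grid leap-frog update, uniform depth, reflective walls, and no nesting; every parameter below is invented:

        import numpy as np

        g, H = 9.81, 2000.0              # gravity (m/s^2), uniform ocean depth (m)
        L, N = 400e3, 400                # domain length (m), number of eta points
        dx = L / N
        dt = 0.5 * dx / np.sqrt(g * H)   # CFL-limited time step

        x = np.arange(N) * dx
        eta = np.exp(-((x - L / 2) / 20e3) ** 2)   # initial sea-surface hump (m)
        u = np.zeros(N + 1)              # velocities staggered between eta points; walls: u[0] = u[N] = 0

        for step in range(2000):
            # momentum: u_{i+1/2} <- u_{i+1/2} - g dt (eta_{i+1} - eta_i)/dx
            u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
            # continuity: eta_i <- eta_i - H dt (u_{i+1/2} - u_{i-1/2})/dx
            eta -= H * dt / dx * (u[1:] - u[:-1])

        print(float(eta.max()))          # amplitude after propagation and wall reflections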

  5. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code, MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the necessity to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. In order to do this, a forensic approach is being used in which available plant data and release timings are used to inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed by MELCOR and the MACCS software code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.

  6. Non-specific physical symptoms in relation to actual and perceived proximity to mobile phone base stations and powerlines

    PubMed Central

    2011-01-01

    Background Evidence about a possible causal relationship between non-specific physical symptoms (NSPS) and exposure to electromagnetic fields (EMF) emitted by sources such as mobile phone base stations (BS) and powerlines is insufficient. So far little epidemiological research has been published on the contribution of psychological components to the occurrence of EMF-related NSPS. The primary objective of the current study is to explore the relative importance of actual and perceived proximity to base stations and psychological components as determinants of NSPS, adjusting for demographic, residency and area characteristics. Methods Analysis was performed on data obtained in a cross-sectional study on environment and health in 2006 in the Netherlands. In the current study, 3611 adult respondents (response rate: 37%) in twenty-two Dutch residential areas completed a questionnaire. Self-reported instruments included a symptom checklist and assessment of environmental and psychological characteristics. The computation of the distance between household addresses and the locations of base stations and powerlines was based on geo-coding. Multilevel regression models were used to test the hypotheses regarding the determinants related to the occurrence of NSPS. Results After adjustment for demographic and residential characteristics, analyses yielded a number of statistically significant associations: increased report of NSPS was predominantly predicted by higher levels of self-reported environmental sensitivity; perceived proximity to base stations and powerlines, lower perceived control and increased avoidance (coping) behavior were also associated with NSPS. A trend towards a moderator effect of perceived environmental sensitivity on the relation between perceived proximity to BS and NSPS was verified (p = 0.055). There was no significant association between symptom occurrence and actual distance to BS or powerlines. Conclusions Perceived proximity to BS, psychological components and socio-demographic characteristics are associated with the report of symptomatology. Actual distance to the EMF source did not show up as a determinant of NSPS. PMID:21631930
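
    The multilevel models described are the kind fit by standard mixed-model routines. A hypothetical sketch with statsmodels; the file name and variable names are invented, not the study's actual data:

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical dataset: one row per respondent, 'area' identifies the residential area
        df = pd.read_csv("nsps_survey.csv")

        # Random intercept per area; fixed effects for perceived and actual proximity and sensitivity
        model = smf.mixedlm(
            "nsps_score ~ perceived_proximity_bs + env_sensitivity + actual_distance_bs_km",
            data=df,
            groups=df["area"],
        )
        print(model.fit().summary())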

  7. A computer program to determine the possible daily release window for sky target experiments

    NASA Technical Reports Server (NTRS)

    Michaud, N. H.

    1973-01-01

    A computer program is presented which is designed to determine the daily release window for sky target experiments. Factors considered in the program include: (1) target illumination by the sun at release time and during the tracking period; (2) look angle elevation above local horizon from each tracking station to the target; (3) solar depression angle from the local horizon of each tracking station during the experimental period after target release; (4) lunar depression angle from the local horizon of each tracking station during the experimental period after target release; and (5) total sky background brightness as seen from each tracking station while viewing the target. Program output is produced in both graphic and data form. Output data can be plotted for a single calendar month or year. The numerical values used to generate the plots are furnished to permit a more detailed review of the computed daily release windows.

  8. Computed tomographic atlas for the new international lymph node map for lung cancer: A radiation oncologist perspective.

    PubMed

    Lynch, Rod; Pitson, Graham; Ball, David; Claude, Line; Sarrut, David

    2013-01-01

    To develop a reproducible definition for each mediastinal lymph node station based on the new TNM classification for lung cancer, this paper proposes an atlas using the new international lymph node map used in the seventh edition of the TNM classification for lung cancer. Four radiation oncologists and 1 diagnostic radiologist were involved in the project to put forward a reproducible radiologic description for the lung lymph node stations. The International Association for the Study of Lung Cancer lymph node definitions for stations 1 to 11 have been described and illustrated on axial computed tomographic scan images using a certified radiotherapy planning system. This atlas will assist both diagnostic radiologists and radiation oncologists in accurately defining the lymph node stations on computed tomographic scan in patients diagnosed with lung cancer.

  9. Unaligned instruction relocation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.

    In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.

  10. Unaligned instruction relocation

    DOEpatents

    Bertolli, Carlo; O'Brien, John K.; Sallenave, Olivier H.; Sura, Zehra N.

    2018-01-23

    In one embodiment, a computer-implemented method includes receiving source code to be compiled into an executable file for an unaligned instruction set architecture (ISA). Aligned assembled code is generated, by a computer processor. The aligned assembled code complies with an aligned ISA and includes aligned processor code for a processor and aligned accelerator code for an accelerator. A first linking pass is performed on the aligned assembled code, including relocating a first relocation target in the aligned accelerator code that refers to a first object outside the aligned accelerator code. Unaligned assembled code is generated in accordance with the unaligned ISA and includes unaligned accelerator code for the accelerator and unaligned processor code for the processor. A second linking pass is performed on the unaligned assembled code, including relocating a second relocation target outside the unaligned accelerator code that refers to an object in the unaligned accelerator code.

  11. Computer algorithm for coding gain

    NASA Technical Reports Server (NTRS)

    Dodd, E. E.

    1974-01-01

    Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
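
    A minimal version of such an algorithm interpolates the Eb/N0 required to reach a target bit-error rate from tabulated decoder performance and subtracts it from the uncoded requirement. The coded performance points below are placeholders, not the data the original algorithm used:

        import math
        import numpy as np

        def required_ebn0(ebn0_db, ber, target_ber):
            """Interpolate the Eb/N0 (dB) needed to reach target_ber, on a log-BER scale."""
            logb = np.log10(np.asarray(ber))
            order = np.argsort(logb)                  # np.interp needs ascending x
            return float(np.interp(math.log10(target_ber),
                                   logb[order], np.asarray(ebn0_db)[order]))

        def uncoded_bpsk_ber(ebn0_db):
            return 0.5 * math.erfc(math.sqrt(10 ** (ebn0_db / 10)))

        target = 1e-5
        grid = np.arange(0.0, 12.0, 0.25)
        uncoded_req = required_ebn0(grid, [uncoded_bpsk_ber(x) for x in grid], target)

        # Placeholder soft-decision Viterbi performance points (Eb/N0 dB, BER)
        coded_req = required_ebn0([2.0, 3.0, 4.0, 5.0], [1e-2, 1e-3, 2e-5, 3e-7], target)

        print(f"coding gain at BER {target:g}: {uncoded_req - coded_req:.2f} dB")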

  12. 19. YAZOO BACKWATER PUMPING STATION MODEL, YAZOO RIVER BASIN. ELECTRONICS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. YAZOO BACKWATER PUMPING STATION MODEL, YAZOO RIVER BASIN. ELECTRONICS ENGINEER AT DATA COLLECTION COMPUTER ROOM. - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS

  13. System and method for transferring telemetry data between a ground station and a control center

    NASA Technical Reports Server (NTRS)

    Ray, Timothy J. (Inventor); Ly, Vuong T. (Inventor)

    2012-01-01

    Disclosed herein are systems, computer-implemented methods, and tangible computer-readable media for coordinating communications between a ground station, a control center, and a spacecraft. The method receives a call to a simple, unified application programmer interface implementing communications protocols related to outer space, when instruction relates to receiving a command at the control center for the ground station generate an abstract message by agreeing upon a format for each type of abstract message with the ground station and using a set of message definitions to configure the command in the agreed upon format, encode the abstract message to generate an encoded message, and transfer the encoded message to the ground station, and perform similar actions when the instruction relates to receiving a second command as a second encoded message at the ground station from the control center and when the determined instruction type relates to transmitting information to the control center.

  14. A review on the inter-frequency biases of GLONASS carrier-phase data

    NASA Astrophysics Data System (ADS)

    Geng, Jianghui; Zhao, Qile; Shi, Chuang; Liu, Jingnan

    2017-03-01

    GLONASS ambiguity resolution (AR) between inhomogeneous stations requires correction of inter-frequency phase biases (IFPBs) (a "station" here is an integral ensemble of a receiver, an antenna, firmware, etc.). It has been elucidated that IFPBs as a linear function of channel numbers are not physical in nature, but actually originate in differential code-phase biases (DCPBs). Although IFPBs have been prevalently recognized, an unanswered question is whether IFPBs and DCPBs are equivalent in enabling GLONASS AR. Besides, general strategies for DCPB estimation across a large network of heterogeneous stations are still under investigation within the GNSS community, such as whether one DCPB per receiver type (rather than per individual station) suffices, as tentatively suggested by the IGS (International GNSS Service), and what accuracy we are able to and ought to achieve for DCPB products. In this study, we review the concept of DCPBs and point out that IFPBs are only approximate derivations from DCPBs, and are potentially problematic if carrier-phase hardware biases differ by up to several millimeters across frequency channels. We further stress the station- and observable-specific properties of DCPBs, which cannot be ignored as is conventionally done. With 212 days of data from 200 European stations, we estimated DCPBs per station by resolving ionosphere-free ambiguities of ~5.3 cm wavelength, and compared them to presumed truth benchmarks computed directly with L1 and L2 data on ultra-short baselines. On average, the accuracy of our DCPB products is around 0.7 ns in RMS. From these uncertainty estimates, we could unambiguously confirm that DCPBs can differ substantially, by up to 30 ns, among receivers of identical types and by over 10 ns across different observables. In contrast, a DCPB error of more than 6 ns will decrease the fixing rate of ionosphere-free ambiguities by over 20%, due to their smallest frequency spacing and highest sensitivity to DCPB errors. Therefore, we suggest that (1) the rigorous DCPB model should be implemented instead of the classic, but inaccurate, IFPB model; (2) DCPBs of sub-ns accuracy can be achieved over a large network by efficiently resolving ionosphere-free ambiguities; (3) DCPBs should be estimated and applied on account of their station- and observable-specific properties, especially for ambiguities of short wavelengths.
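
    The linear channel-number dependence can be made concrete. With GLONASS L1 FDMA frequencies f_k = 1602 MHz + k × 0.5625 MHz, a differential code-phase bias δ (in time units) shows up in the carrier phase as δ·f_k cycles, linear in k; the 5 ns bias below is an arbitrary example, not a measured value:

        F0, DF = 1602.0e6, 0.5625e6     # GLONASS L1 base frequency and channel spacing (Hz)

        def phase_bias_cycles(dcpb_s, k):
            """Carrier-phase bias (cycles) induced on channel k by a differential code-phase bias."""
            return dcpb_s * (F0 + k * DF)

        dcpb = 5e-9                     # 5 ns differential code-phase bias (illustrative)
        for k in (-7, 0, 6):
            print(f"channel {k:+d}: {phase_bias_cycles(dcpb, k):.4f} cycles")
        # The between-channel slope -- the classical 'IFPB rate' -- is dcpb * DF:
        print(f"slope: {dcpb * DF * 1e3:.3f} millicycles per channel")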

  15. SCaN Network Ground Station Receiver Performance for Future Service Support

    NASA Technical Reports Server (NTRS)

    Estabrook, Polly; Lee, Dennis; Cheng, Michael; Lau, Chi-Wung

    2012-01-01

    Objectives: Examine the impact of providing the newly standardized CCSDS Low Density Parity Check (LDPC) codes in the SCaN return data service on the SCaN SN and DSN ground station receivers: SN current receiver: Integrated Receiver (IR); DSN current receiver: Downlink Telemetry and Tracking (DTT) Receiver; early Commercial-Off-The-Shelf (COTS) prototype of the SN User Service Subsystem Component Replacement (USS CR) Narrow Band Receiver. Motivate discussion of general issues of ground station hardware design to enable simple and cheap modifications for support of future services.

  16. A computer system for the storage and retrieval of gravity data, Kingdom of Saudi Arabia

    USGS Publications Warehouse

    Godson, Richard H.; Andreasen, Gordon H.

    1974-01-01

    A computer system has been developed for the systematic storage and retrieval of gravity data. All pertinent facts relating to gravity station measurements and computed Bouguer values may be retrieved either by project name or by geographical coordinates. Features of the system include visual display in the form of printer listings of gravity data and printer plots of station locations. The retrieved data format interfaces with the format of GEOPAC, a system of computer programs designed for the analysis of geophysical data.

  17. Anderson uses laptop computer in the U.S. Laboratory during Joint Operations

    NASA Image and Video Library

    2007-06-13

    S117-E-07134 (12 June 2007) --- Astronaut Clayton Anderson, Expedition 15 flight engineer, uses a computer near the Microgravity Science Glovebox (MSG) in the Destiny laboratory of the International Space Station while Space Shuttle Atlantis (STS-117) was docked with the station. Astronaut Sunita Williams, flight engineer, is at right.

  18. A Computerized Weather Station for the Apple IIe.

    ERIC Educational Resources Information Center

    Lorson, Mark V.

    Predicting weather conditions is a topic of interest for students who want to make plans for outside activities. This paper discusses the development of an inexpensive computer-interfaced classroom weather station using an Apple IIe computer that provides the viewer with up to the minute digital readings of inside and outside temperature,…

  19. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
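
    As a concrete instance of the metamodeling idea, a second-order response surface can be fit by ordinary least squares to a handful of runs of an expensive code. Here the "expensive analysis" is a stand-in function; everything in this sketch is illustrative:

        import numpy as np

        def expensive_analysis(x1, x2):
            """Stand-in for a detailed analysis code (assumed for illustration)."""
            return np.sin(x1) + 0.5 * x2**2 + 0.3 * x1 * x2

        # Design of experiments: a small factorial grid of sample points
        x1, x2 = (a.ravel() for a in np.meshgrid(np.linspace(-1, 1, 4), np.linspace(-1, 1, 4)))
        y = expensive_analysis(x1, x2)

        # Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        def metamodel(a, b):
            return beta @ np.array([1.0, a, b, a**2, b**2, a * b])

        print(metamodel(0.3, -0.4), expensive_analysis(0.3, -0.4))   # cheap surrogate vs. truth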

  20. Time coded distribution via broadcasting stations

    NASA Technical Reports Server (NTRS)

    Leschiutta, S.; Pettiti, V.; Detoma, E.

    1979-01-01

    The distribution of standard time signals via AM and FM broadcasting stations offers the distinct advantages of wide area coverage and the use of inexpensive receivers, but the signals are radiated a limited number of times per day, are not usually available during the night, and no full and automatic synchronization of a remote clock is possible. As an attempt to overcome some of these problems, a time coded signal with complete date information is diffused by the IEN via the national broadcasting networks in Italy. These signals are radiated by some 120 AM and about 3000 FM and TV transmitters around the country. In this way, a time ordered system with an accuracy of a couple of milliseconds is easily achieved.

  1. Near-station terrain corrections for gravity data by a surface-integral technique

    USGS Publications Warehouse

    Gettings, M.E.

    1982-01-01

    A new method of computing gravity terrain corrections by use of a digitizer and digital computer can result in substantial savings in the time and manual labor required to perform such corrections by conventional manual ring-chart techniques. The method is typically applied to estimate terrain effects for topography near the station, for example within 3 km of the station, although it has been used successfully to a radius of 15 km to estimate corrections in areas where topographic mapping is poor. Points (about 20) that define topographic maxima, minima, and changes in the slope gradient are picked on the topographic map, within the desired radius of correction about the station. Particular attention must be paid to the area immediately surrounding the station to ensure a good topographic representation. The horizontal and vertical coordinates of these points are entered into the computer, usually by means of a digitizer. The computer then fits a multiquadric surface to the input points to form an analytic representation of the surface. By means of the divergence theorem, the gravity effect of an interior closed solid can be expressed as a surface integral, and the terrain correction is calculated by numerical evaluation of the integral over the surfaces of a cylinder, the vertical sides of which are at the correction radius about the station, the flat bottom surface at the topographic minimum, and the upper surface given by the multiquadric equation. The method has been tested with favorable results against models for which an exact result is available and against manually computed field-station locations in areas of rugged topography. By increasing the number of points defining the topographic surface, any desired degree of accuracy can be obtained. The method is more objective than manual ring-chart techniques because no average compartment elevations need be estimated.
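
    The multiquadric surface named here is Hardy's: z(x, y) = Σ_j c_j √((x−x_j)² + (y−y_j)² + δ²), with the coefficients c_j found by solving the interpolation system at the digitized points. A minimal fitting sketch; the sample points and the shape parameter δ are invented:

        import numpy as np

        def fit_multiquadric(pts, z, delta=100.0):
            """Solve A c = z with A_ij = sqrt(|p_i - p_j|^2 + delta^2)."""
            d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
            return np.linalg.solve(np.sqrt(d2 + delta**2), z)

        def eval_multiquadric(pts, c, xy, delta=100.0):
            d2 = ((xy[None, :] - pts) ** 2).sum(-1)
            return float(np.sqrt(d2 + delta**2) @ c)

        # ~20 digitized points defining maxima, minima and slope breaks (invented, in meters)
        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 3000, (20, 2))
        elev = 1500 + 300 * np.sin(pts[:, 0] / 800) + 0.1 * pts[:, 1]

        c = fit_multiquadric(pts, elev)
        print(eval_multiquadric(pts, c, np.array([1500.0, 1500.0])))  # interpolated elevation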

  2. Improved Iterative Decoding of Network-Channel Codes for Multiple-Access Relay Channel.

    PubMed

    Majumder, Saikat; Verma, Shrish

    2015-01-01

    Cooperative communication using relay nodes is one of the most effective means of exploiting space diversity for low-cost nodes in a wireless network. In cooperative communication, users, besides communicating their own information, also relay the information of other users. In this paper we investigate a scheme where cooperation is achieved using a common relay node which performs network coding to provide space diversity for two information nodes transmitting to a base station. We propose a scheme which uses a Reed-Solomon error-correcting code for encoding the information bits at the user nodes and a convolutional code as the network code, instead of XOR-based network coding. Based on this encoder, we propose iterative soft decoding of the joint network-channel code by treating it as a concatenated Reed-Solomon convolutional code. Simulation results show significant improvement in performance compared to an existing scheme based on compound codes.

  3. Design and optimization of a portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processor Units (GPUs), exploiting aggressive data-parallelism and delivering higher performances for streaming computing applications. In this scenario, code portability (and performance portability) become necessary for easy maintainability of applications; this is very relevant in scientific computing where code changes are very frequent, making it tedious and prone to error to keep different code versions aligned. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance-portability can be reached.

  4. Development of a thermal and structural analysis procedure for cooled radial turbines

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Deanna, Russell G.

    1988-01-01

    A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi-three-dimensional code computes the external free-stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous one-dimensional internal flow code solving the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.

  5. COMPUTATION OF GLOBAL PHOTOCHEMISTRY WITH SMVGEAR II (R823186)

    EPA Science Inventory

    A computer model was developed to simulate global gas-phase photochemistry. The model solves chemical equations with SMVGEAR II, a sparse-matrix, vectorized Gear-type code. To obtain SMVGEAR II, the original SMVGEAR code was modified to allow computation of different sets of chem...

  6. Automation of Precise Time Reference Stations (PTRS)

    NASA Astrophysics Data System (ADS)

    Wheeler, P. J.

    1985-04-01

    The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile mini-computer controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.

  7. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Overview and summary

    NASA Technical Reports Server (NTRS)

    1989-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned Marshall Space Flight Center (MSFC) Payload Training Complex (PTC) required to meet this need will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs. This study was performed August 1988 to October 1989. Thus, the results are based on the SSFP August 1989 baseline, i.e., pre-Langley configuration/budget review (C/BR) baseline. Some terms, e.g., combined trainer, are being redefined. An overview of the study activities and a summary of study results are given here.

  8. Computational strategies for three-dimensional flow simulations on distributed computer systems. Ph.D. Thesis Semiannual Status Report, 15 Aug. 1993 - 15 Feb. 1994

    NASA Technical Reports Server (NTRS)

    Weed, Richard Allen; Sankar, L. N.

    1994-01-01

    An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has led to research to develop procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.

  9. Computerized systems analysis and optimization of aircraft engine performance, weight, and life cycle costs

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1979-01-01

    The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance, and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.
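
    The on-design thermodynamic bookkeeping a cycle code like NNEP performs can be illustrated in miniature by an ideal Brayton cycle calculation; constant specific heat and ideal components are assumed, and the numbers are illustrative, not NNEP output:

        def ideal_brayton(pr, t1_k, t3_k, gamma=1.4, cp=1004.5):
            """Ideal-gas, ideal-component Brayton cycle: specific work (J/kg) and thermal efficiency."""
            tau = pr ** ((gamma - 1) / gamma)    # isentropic temperature ratio
            t2 = t1_k * tau                      # compressor exit temperature
            t4 = t3_k / tau                      # turbine exit temperature
            w_net = cp * ((t3_k - t4) - (t2 - t1_k))
            return w_net, 1 - 1 / tau

        w, eta = ideal_brayton(pr=25.0, t1_k=288.15, t3_k=1600.0)
        print(f"specific work {w / 1000:.0f} kJ/kg, thermal efficiency {eta:.1%}")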

  10. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.
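
    The core of any ACM controller is a threshold rule: pick the highest-throughput modulation-and-coding pair whose required SNR, plus a fade margin, the measured link still clears. A toy selector with illustrative DVB-S2-like thresholds (not values from the experiment):

        # (name, spectral efficiency in bits/s/Hz, required Es/N0 in dB) -- illustrative values
        MODCODS = [
            ("QPSK 1/2",   0.99,  1.0),
            ("QPSK 3/4",   1.49,  4.0),
            ("8PSK 3/4",   2.23,  7.9),
            ("16APSK 3/4", 2.97, 10.2),
        ]

        def select_modcod(esn0_db, margin_db=1.0):
            """Highest-throughput MODCOD whose threshold + margin fits the measured Es/N0."""
            feasible = [m for m in MODCODS if m[2] + margin_db <= esn0_db]
            return max(feasible, key=lambda m: m[1]) if feasible else None

        for snr in (0.5, 5.5, 12.0):   # e.g. deep fade, partial shadowing, clear line of sight
            print(snr, "->", select_modcod(snr))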

  11. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Briones, Janette C.; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.

  12. Performance of an optical relay satellite using Reed-Solomon coding over a cascaded optical PPM and BPSK channel

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Naderi, F.

    1982-01-01

    The nature of the optical/microwave interface aboard the relay satellite is considered. To allow for maximum system flexibility without overburdening either the optical or RF channel, demodulating the optical signal on board the relay satellite but leaving the optical channel decoding to be performed at the ground station is examined. The occurrence of erasures in the optical channel is treated. A hard decision on the erasure (i.e., the relay selecting a symbol at random in case of erasure occurrence) seriously degrades the performance of the overall system. Coding the erasure occurrences at the relay and transmitting this information via an extra bit to the ground station, where it can be used by the decoder, is suggested. Many examples with varying bit/photon energy efficiency and for the noisy and noiseless optical channel are considered. It is shown that coding the erasure occurrences dramatically improves the performance of the cascaded channel relative to the case of a hard decision on the erasure by the relay.
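
    The benefit of forwarding the erasure flag follows from the Reed-Solomon decoding condition 2*errors + erasures <= n - k, whereas a random guess converts an erasure into an error with probability (M-1)/M. A small Monte Carlo sketch under assumed channel parameters makes the gap concrete:

        # Compare RS decoding with forwarded erasure flags vs. random-guess hard decision.
        # Channel model and parameters are illustrative assumptions, not the paper's.
        import random

        n, k, M = 255, 223, 256       # RS(255,223) over GF(256)
        p_erase = 0.06                # per-symbol optical-channel erasure probability
        trials = 20000
        random.seed(1)

        ok_flag = ok_hard = 0
        for _ in range(trials):
            erasures = sum(random.random() < p_erase for _ in range(n))
            # Strategy 1: erasure positions forwarded; decode if erasures <= n - k
            ok_flag += erasures <= n - k
            # Strategy 2: relay guesses a symbol; a guess is wrong w.p. (M-1)/M,
            # and each wrong guess counts as a full error (2x decoding cost)
            errors = sum(random.random() < (M - 1) / M for _ in range(erasures))
            ok_hard += 2 * errors <= n - k

        print(f"P(decode) with erasure flag : {ok_flag / trials:.4f}")
        print(f"P(decode) with hard decision: {ok_hard / trials:.4f}")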

  13. Low-flow characteristics of Indiana streams

    USGS Publications Warehouse

    Fowler, K.K.; Wilson, J.T.

    1996-01-01

    Knowledge of low-flow characteristics of streams is essential for management of water resources. Low-flow characteristics are presented for 229 continuous-record, streamflow-gaging stations and 285 partial-record stations in Indiana. Low-flow-frequency characteristics were computed for 210 continuous-record stations that had at least 10 years of record, and flow-duration curves were computed for all continuous-record stations. Low-flow-frequency and flow-duration analyses are based on available streamflow records through September 1993. Selected low-flow-frequency curves were computed for annual low flows and seasonal low flows. The four seasons are represented by the 3-month groups of March-May, June-August, September-November, and December-February. The 7-day, 10-year and the 7-day, 2-year low flows were estimated for 285 partial-record stations, which are ungaged sites where streamflow measurements were made at base flow. The same low-flow characteristics were estimated for 19 continuous-record stations where less than 10 years of record were available. Precipitation and geology directly influence the streams in Indiana. Streams in the northern, glaciated part of the State tend to have higher sustained base flows than those in the nonglaciated southern part. Flow at several of the continuous-record gaging stations is affected by some form of regulation or diversion. Low-flow characteristics for continuous-record stations at which flow is affected by regulation are determined using the period of record affected by regulation; natural flows prior to regulation are not used.
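
    As a rough illustration of how a 7-day, 10-year (7Q10) low flow is derived from a continuous record, the sketch below takes annual minima of the 7-day moving-average flow and fits a log-Pearson Type III distribution; the USGS procedure behind this report involves additional screening (climatic years, regulation, zero flows) not shown, and the synthetic record is a stand-in for gaged data:

        # 7Q10 low-flow sketch: annual minimum 7-day mean flow, log-Pearson III fit.
        # 'daily_cfs' is an assumed pandas Series of daily discharge indexed by date.
        import numpy as np
        import pandas as pd
        from scipy import stats

        rng = np.random.default_rng(0)    # synthetic stand-in for a gaged record
        idx = pd.date_range("1964-10-01", "1993-09-30", freq="D")
        daily_cfs = pd.Series(np.exp(rng.normal(3.0, 0.8, len(idx))), index=idx)

        q7 = daily_cfs.rolling(7).mean()              # 7-day moving average
        annual_min = q7.groupby(q7.index.year).min().dropna()

        logs = np.log10(annual_min.values)
        skew = stats.skew(logs, bias=False)
        # 10-year recurrence = nonexceedance probability 0.1 on the low-flow curve
        q7_10 = 10 ** stats.pearson3.ppf(0.1, skew, loc=logs.mean(),
                                         scale=logs.std(ddof=1))
        print(f"estimated 7Q10 = {q7_10:.1f} cfs")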

  14. A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle

    DTIC Science & Technology

    1984-12-01

    program requires as input the M9 target descriptions as processed by the Geometric Information for Targets (GIFT) computer code. The first step is...model of the target. This COM-GEOM target description is used as input to the Geometric Information For Targets (GIFT) computer code. Among other...things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit

  15. Space Station needs, attributes and architectural options study. Volume 7-4A: Data book, architecture, technology and programmatics, part A

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Various parameters of the orbital space station are discussed. The space station environment, data management system, communication and tracking, environmental control, and life support system are considered. Specific topics reviewed include crew work stations, restraint systems, stowage, computer hardware, and expert systems.

  16. A single-station empirical model for TEC over the Antarctic Peninsula using GPS-TEC data

    NASA Astrophysics Data System (ADS)

    Feng, Jiandi; Wang, Zhengtao; Jiang, Weiping; Zhao, Zhenzhen; Zhang, Bingbing

    2017-02-01

    Compared with regional or global total electron content (TEC) empirical models, single-station TEC empirical models may exhibit higher accuracy in describing TEC spatial and temporal variations for a single station. In this paper, a new single-station empirical TEC model, called SSM-month, for the O'Higgins Station in the Antarctic Peninsula is proposed using Global Positioning System (GPS)-TEC data from 01 January 2004 to 30 June 2015. The diurnal variation of TEC at the O'Higgins Station may have changing features in different months, sometimes even in opposite forms, because of ionospheric phenomena such as the Mid-latitude Summer Nighttime Anomaly (MSNA). To avoid the influence of different diurnal variations, the concept of monthly modeling is proposed in this study. The SSM-month model, which is established by month (comprising 12 submodels that correspond to the 12 months), can effectively describe the diurnal variation of TEC in different months. Each submodel of the SSM-month model exhibits good agreement with the GPS-TEC input data. Overall, the SSM-month model fits the input data with a bias of 0.03 TECU (total electron content unit, 1 TECU = 10^16 el m^-2) and a standard deviation of 2.78 TECU. Owing to the monthly modeling method, the model can effectively describe the MSNA phenomenon without implementing any modeling correction. TEC data derived from Center for Orbit Determination in Europe global ionosphere maps (CODE GIMs), International Reference Ionosphere 2012 (IRI2012), and NeQuick are compared with the SSM-month model in 2001 and 2015-2016. Results show that the SSM-month model exhibits good consistency with CODE GIMs, better than that of IRI2012 and NeQuick, at the O'Higgins Station on the test days.
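
    The monthly-modeling idea can be sketched as a least-squares fit of the diurnal curve with a small Fourier basis in local time, one fit per month. The fragment below uses hypothetical hourly GPS-TEC samples; the published SSM-month basis functions may differ:

        # Fit one month's diurnal TEC variation with a truncated Fourier series.
        # 'hours' and 'tec' stand in for GPS-TEC observations at a single station.
        import numpy as np

        rng = np.random.default_rng(2)
        hours = rng.uniform(0, 24, 500)                      # local-time sample epochs
        tec = 10 + 4 * np.sin(2 * np.pi * (hours - 14) / 24) + rng.normal(0, 1, 500)

        def design(h, n_harm=3):
            """Columns: 1, cos(k*w*h), sin(k*w*h) for k = 1..n_harm, w = 2*pi/24."""
            w = 2 * np.pi / 24
            cols = [np.ones_like(h)]
            for k in range(1, n_harm + 1):
                cols += [np.cos(k * w * h), np.sin(k * w * h)]
            return np.column_stack(cols)

        coef, *_ = np.linalg.lstsq(design(hours), tec, rcond=None)
        resid = tec - design(hours) @ coef
        print(f"fit bias = {resid.mean():.2f} TECU, std = {resid.std(ddof=1):.2f} TECU")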

  17. A qualitative phenomenological study: Enhanced, risk-based FAA oversight on part 145 maintenance practices

    NASA Astrophysics Data System (ADS)

    Sheehan, Bryan G.

    The purpose of this qualitative phenomenological study was to examine the phenomenon of enhanced, risk-based Federal Aviation Administration (FAA) oversight of Part 145 repair stations that performed aircraft maintenance for Part 121 air carriers between 2007 and 2014 in Oklahoma. Specifically, this research was used to explore what operational changes have occurred in the domestic Part 145 repair station industry, such as variations in management or hiring practices, training, recordkeeping and technical data, inventory and aircraft parts supply-chain logistics, equipment, and facilities. After interviews with 12 managers from Part 145 repair stations in Oklahoma, six major theme codes emerged from the data: quality of oversight before 2007, quality of oversight after 2007, advantages of oversight, disadvantages of oversight, status quo of oversight, and process improvement. Within those six major theme codes, 17 subthemes emerged that were used to explain the phenomenon of enhanced oversight in the Part 145 repair station industry. Forty-two percent of the participants indicated that a weak FAA oversight system has hindered the continuous process improvement programs in their repair stations. Some of them were financially burdened after hiring additional full-time quality assurance inspectors specifically to manage enhanced FAA oversight. Nevertheless, the participants of the study indicated that the FAA must apply its surveillance on a more standardized and consistent basis. They want to see this standardization in how FAA inspectors interpret regulations and practice the same quality of oversight for all repair stations, particularly those that are repeat violators and fail to comply with federal aviation regulations. They believed that when the FAA enforces standardization on a consistent basis, repair stations can become more efficient and safer in the performance of their scope of work for the U.S. commercial air transportation industry.

  18. Concordia CCD - A Geoscope station in continental Antarctica

    NASA Astrophysics Data System (ADS)

    Maggi, A.; Lévêque, J.; Thoré, J.; Bes de Berc, M.; Bernard, A.; Danesi, S.; Morelli, A.; Delladio, A.; Sorrentino, D.; Stutzmann, E.; Geoscope Team

    2010-12-01

    Concordia (Dome C, Antarctica) has had a permanent seismic station since 2005. It is run by EOST and INGV in collaboration with the French and Italian polar institutes (IPEV and PNRA). It is installed in an ice vault at 12 m depth, 1 km from the permanent scientific base at Concordia. The temperature in the vault is a constant -55°C. The data quality at the station has improved continuously since its installation. In 2007, the station was declared at the ISC as an open station with station code CCD (ConCorDia), with data available upon request. It is only the second permanent station on the Antarctic continent, after South Pole. In 2010, CCD was included in the Geoscope network. Data from CCD starting in 2007 are now freely available from the Geoscope Data Center and IRIS. We present an analysis of the data quality at CCD and describe the technical difficulties of operating an observatory-quality seismic station in the extreme environmental conditions of continental Antarctica.

  19. Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.

    1999-01-01

    A micromechanics-based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. The code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber that had been chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and good agreement is achieved.
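
    W-CEMCAN's micromechanics are considerably more elaborate, but the flavor of predicting ply properties from constituent properties and volume fractions can be shown with simple rule-of-mixtures bounds (illustrative values, not the Sylramic/BN/SiC constituent data):

        # Rule-of-mixtures sketch: ply-level modulus from constituent properties.
        # Values are illustrative assumptions, not measured constituent data.
        E_f, E_m = 380e9, 300e9          # fiber and matrix moduli, Pa (assumed)
        v_f = 0.35                       # fiber volume fraction

        E_longitudinal = v_f * E_f + (1 - v_f) * E_m          # Voigt (parallel) bound
        E_transverse = 1.0 / (v_f / E_f + (1 - v_f) / E_m)    # Reuss (series) bound
        print(f"E_L = {E_longitudinal/1e9:.0f} GPa, E_T = {E_transverse/1e9:.0f} GPa")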

  20. Centralized vs. decentralized nursing stations: effects on nurses' functional use of space and work environment.

    PubMed

    Zborowsky, Terri; Bunker-Hellmich, Lou; Morelli, Agneta; O'Neill, Mike

    2010-01-01

    Evidence-based findings of the effects of nursing station design on nurses' work environment and work behavior are essential to improve conditions and increase retention among these fundamental members of the healthcare delivery team. The purpose of this exploratory study was to investigate how nursing station design (i.e., centralized and decentralized nursing station layouts) affected nurses' use of space, patient visibility, noise levels, and perceptions of the work environment. Advances in information technology have enabled nurses to move away from traditional centralized paper-charting stations to smaller decentralized work stations and charting substations located closer to, or inside of, patient rooms. Improved understanding of the trade-offs presented by centralized and decentralized nursing station design has the potential to provide useful information for future nursing station layouts. This information will be critical for understanding the nurse environment "fit." The study used an exploratory design with both qualitative and quantitative methods. Qualitative data regarding the effects of nursing station design on nurses' health and work environment were gathered by means of focus group interviews. Quantitative data-gathering techniques included place- and person-centered space use observations, patient visibility assessments, sound level measurements, and an online questionnaire regarding perceptions of the work environment. Nurses on all units were observed most frequently performing telephone, computer, and administrative duties. Time spent using telephones, computers, and performing other administrative duties was significantly higher in the centralized nursing stations. Consultations with medical staff and social interactions were significantly less frequent in decentralized nursing stations. There were no indications that either centralized or decentralized nursing station designs resulted in superior visibility. Sound levels measured in all nursing stations exceeded recommended levels during all shifts. No significant differences were identified in nurses' perceptions of work control-demand-support in centralized and decentralized nursing station designs. The "hybrid" nursing design model in which decentralized nursing stations are coupled with centralized meeting rooms for consultation between staff members may strike a balance between the increase in computer duties and the ongoing need for communication and consultation that addresses the conflicting demands of technology and direct patient care.

  1. Fault tolerant computing: A preamble for assuring viability of large computer systems

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1977-01-01

    The need for fault-tolerant computing is addressed from the viewpoints of (1) why it is needed, (2) how to apply it in the current state of technology, and (3) what it means in the context of the Phoenix computer system and other related systems. To this end, the value of concurrent error detection and correction is described. User protection, program retry, and repair are among the factors considered. The technology of algebraic codes to protect memory systems and arithmetic codes to protect arithmetic operations is discussed.
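
    As a concrete instance of the algebraic memory-protection codes mentioned, the sketch below implements a textbook Hamming(7,4) single-error-correcting code (an illustration of the principle, not the Phoenix system's actual scheme):

        # Hamming(7,4): encode 4 data bits, corrupt one bit, locate and correct it.
        import numpy as np

        G = np.array([[1,0,0,0,1,1,0],    # generator matrix (data bits then parity)
                      [0,1,0,0,1,0,1],
                      [0,0,1,0,0,1,1],
                      [0,0,0,1,1,1,1]])
        H = np.array([[1,1,0,1,1,0,0],    # parity-check matrix, H @ G.T = 0 (mod 2)
                      [1,0,1,1,0,1,0],
                      [0,1,1,1,0,0,1]])

        data = np.array([1, 0, 1, 1])
        codeword = data @ G % 2

        received = codeword.copy()
        received[5] ^= 1                  # single-bit memory fault

        syndrome = H @ received % 2       # nonzero syndrome pinpoints the bad bit
        if syndrome.any():
            bad = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
            received[bad] ^= 1            # correct in place
            print("corrected bit", bad, "-> data", received[:4])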

  2. Review of variations in Mw < 7 earthquake motions on position and TEC (Mw = 6.5 Aegean Sea earthquake sample)

    NASA Astrophysics Data System (ADS)

    Yildirim, Omer; Inyurt, Samed; Mekik, Cetin

    2016-02-01

    Turkey is a country located in the middle latitude zone, where tectonic activity is intensive. Recently, an earthquake of magnitude 6.5 Mw occurred offshore in the Aegean Sea on 24 May 2014 at 09:25 UTC and lasted about 40 s. The earthquake was also felt in Greece, Romania, and Bulgaria in addition to Turkey. In recent years, ionospheric anomaly detection studies related to seismicity have been carried out using total electron content (TEC) computed from global navigation satellite system (GNSS) signal delays, and several interesting findings have been published. In this study, both TEC and positional variations were examined separately following this moderate-size earthquake in the Aegean Sea. The correlation of the ionospheric variation with the positional variation was also investigated. For this purpose, a total of 15 stations was used, including four continuously operating reference stations in Turkey (CORS-TR) located in the seismic zone (AYVL, CANA, IPSA, and YENC), as well as international GNSS service (IGS) and European reference frame permanent network (EPN) stations. The ionospheric and positional variations of the AYVL, CANA, IPSA, and YENC stations were examined using Bernese v5.0 software. When the precise point positioning TEC (PPP-TEC) values were examined, it was observed that 3 days before the earthquake, at 08:00 and 10:00 UTC, the TEC values at the four stations located in Turkey were approximately 4 TECU (total electron content units) above the upper-limit TEC value. At the same stations, on the day before the earthquake at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The global ionosphere model TEC (GIM-TEC) values published by the Centre for Orbit Determination in Europe (CODE) were also examined. Three days before the earthquake, at all stations, the TEC values between 08:00 and 10:00 UTC were approximately 2 TECU above the upper-limit TEC value; 1 day before the earthquake, at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 4 TECU below the lower-limit TEC value. Using the same 15 stations, positional variations before and after the earthquake were also investigated for the AYVL, CANA, IPSA, and YENC stations. The analysis revealed positional displacements at the CANA station, the station nearest the earthquake's epicentre: displacements of 10 and 3 cm, respectively, were observed before and after the earthquake.
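
    Upper- and lower-limit TEC bounds of the kind referred to above are commonly derived from a sliding-window mean and standard deviation; the sketch below applies that idea to synthetic detrended residuals (window length and the multiplier k are assumptions, not necessarily the authors' settings):

        # Sliding-window anomaly test on (already detrended) TEC residuals:
        # flag epochs falling outside mean +/- k*sigma of the trailing window.
        import numpy as np

        rng = np.random.default_rng(3)
        resid = rng.normal(0.0, 1.0, 360)   # synthetic detrended TEC residuals, TECU
        resid[300] += 5.0                   # injected anomaly for the test

        WINDOW, K = 180, 3.0                # window length and k are assumptions
        for i in range(WINDOW, len(resid)):
            ref = resid[i - WINDOW:i]
            lo = ref.mean() - K * ref.std(ddof=1)
            hi = ref.mean() + K * ref.std(ddof=1)
            if not lo <= resid[i] <= hi:
                print(f"epoch {i}: {resid[i]:+.1f} TECU outside [{lo:+.1f}, {hi:+.1f}]")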

  3. The Advanced Software Development and Commercialization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallopoulos, E.; Canfield, T.R.; Minkoff, M.

    1990-09-01

    This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities. These are COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used for nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.

  4. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  5. Poetry in Programs: A Brief Examination of Software Aesthetics, Including Observations on the History of Programming Styles and Speculations on Post-object Programming

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    This viewgraph presentation provides samples of computer code which have characteristics of poetic verse, and addresses the theoretical underpinnings of artistic coding, as well as how computer language influences software style, and the possible style of future coding.

  6. Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing

    NASA Astrophysics Data System (ADS)

    Salamone, Joseph A., III

    Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion caused by molecular relaxation of oxygen and nitrogen. The newly derived model equation was implemented numerically in a computer code. The code was numerically validated against a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code's calculations. The code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The code was in good agreement with both the numerical and experimental validation cases. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept, and the resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.

  7. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code, but this could require extensive investment, such as rewriting in modern languages and adopting new data constructs, which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.

  8. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  9. Force user's manual: A portable, parallel FORTRAN

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.; Benten, Muhammad S.; Arenstorf, Norbert S.; Ramanan, Aruna V.

    1990-01-01

    The use of Force, a parallel, portable FORTRAN for shared-memory parallel computers, is described. Force simplifies writing code for parallel computers and, once the parallel code is written, it is easily ported to computers on which Force is installed. Although Force is nearly the same for all computers, specific details are included for the Cray-2, Cray Y-MP, Convex 220, Flex/32, Encore, Sequent, and Alliant computers on which it is installed.

  10. A computer-based specification methodology

    NASA Technical Reports Server (NTRS)

    Munck, Robert G.

    1986-01-01

    Standard practices for creating and using system specifications are inadequate for large, advanced-technology systems. A need exists to break away from paper documents in favor of documents that are stored in computers and which are read and otherwise used with the help of computers. An SADT-based system, running on the proposed Space Station data management network, could be a powerful tool for doing much of the required technical work of the Station, including creating and operating the network itself.

  11. Monte Carlo simulation of Ising models by multispin coding on a vector computer

    NASA Astrophysics Data System (ADS)

    Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus

    1984-11-01

    Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
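
    The bit-packing idea behind multispin coding can be sketched directly: store one spin per bit so that 64 independent replicas update in a single bitwise operation. The fragment below performs a zero-temperature checkerboard sweep (flip when at least three of four neighbours are anti-aligned) and omits the Boltzmann acceptance logic of the full Rebbi algorithm:

        # Multispin coding sketch: 64 Ising replicas packed into the bits of uint64.
        # Zero-temperature dynamics only; full Metropolis needs bitwise adder logic.
        import numpy as np

        rng = np.random.default_rng(7)
        L = 32
        spins = rng.integers(0, 2**64 - 1, size=(L, L), dtype=np.uint64, endpoint=True)

        ii, jj = np.indices((L, L))
        ones, zero = np.uint64(~np.uint64(0)), np.uint64(0)
        for sweep in range(10):
            for parity in (0, 1):                  # checkerboard sublattices
                nbrs = (np.roll(spins, 1, 0), np.roll(spins, -1, 0),
                        np.roll(spins, 1, 1), np.roll(spins, -1, 1))
                a = [spins ^ x for x in nbrs]      # bit=1 where neighbour anti-aligned
                # majority-of-4 >= 3: OR of the AND of every 3-subset
                flip = ((a[0] & a[1] & a[2]) | (a[0] & a[1] & a[3])
                        | (a[0] & a[2] & a[3]) | (a[1] & a[2] & a[3]))
                mask = np.where((ii + jj) % 2 == parity, ones, zero)
                spins ^= flip & mask               # flip those spins, all replicas at once
            up = np.unpackbits(spins.view(np.uint8)).sum()
            print(f"sweep {sweep}: up fraction = {up / (L * L * 64):.3f}")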

  12. Evolutionary space station fluids management strategies

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Results are summarized for an 11-month study to define fluid storage and handling strategies and requirements for various specific mission case studies and their associated design impacts on the Space Station. A variety of fluid users require a variety of fluids and use rates, and the cryogenic propellants required for NASA's STV, Planetary, and Code Z missions are enormous. The storage methods must accommodate fluids ranging from a high-pressure gas or supercritical-state fluid to a sub-cooled liquid (and superfluid helium). These requirements begin in the year 1994, reach a maximum of nearly 1800 metric tons in the year 2004, and trail off to the year 2018, as currently planned. It is conceivable that the cryogenic propellant needs for the STV and/or Lunar mission models will be met by LTCSF LH2/LO2 tanksets attached to the SS truss structure. Concepts and corresponding transfer and delivery operations have been presented for STV propellant provisioning from the SS. A growth orbit maneuvering vehicle (OMV) and associated servicing capability will be required to move tanksets from delivery launch vehicles to the SS or co-orbiting platforms. Also, appropriate changes to the software used for OMV operation are necessary to allow for the combined operation of the growth OMV. To support fluid management activities at the Space Station for the experimental payloads and propellant provisioning, truss structure space must be allocated for fluid carriers and propellant tanksets, and substantial beam strengthening may be required. The Station must have two Mobile Remote Manipulator Systems (MRMS) and the growth OMV to support propellant handling operations for the STV at the SS. Propellant needs for the Planetary Initiatives and Code Z mission models will most likely be provided by co-orbiting propellant platform(s). Space Station impacts for Code Z mission fluid management activities will be minimal.

  13. Thrust chamber performance using Navier-Stokes solution. [space shuttle main engine viscous nozzle calculation

    NASA Technical Reports Server (NTRS)

    Chan, J. S.; Freeman, J. A.

    1984-01-01

    The viscous, axisymmetric flow in the thrust chamber of the space shuttle main engine (SSME) was computed on the CRAY 205 computer using the general interpolants method (GIM) code. Results show that Navier-Stokes codes can be used for these flows to study trends and viscous effects as well as to determine flow patterns, but further research and development is needed before they can be used as production tools for nozzle performance calculations. The GIM formulation, numerical scheme, and computer code are described. The actual SSME nozzle computation, showing grid points, flow contours, and flow parameter plots, is discussed. The computer system and run times/costs are detailed.

  14. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
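
    The classical trigger behind automatic detectors of this kind is the short-term-average/long-term-average (STA/LTA) ratio; the minimal sketch below runs one on a synthetic trace (window lengths and threshold are assumptions, and the USGS system's actual logic is not reproduced):

        # STA/LTA event detector sketch on a synthetic seismogram (100 samples/s).
        import numpy as np

        rng = np.random.default_rng(5)
        fs = 100
        trace = rng.normal(0, 1, 60 * fs)               # one minute of background noise
        trace[3000:3400] += 8 * rng.normal(0, 1, 400)   # synthetic "event" burst

        def sta_lta(x, nsta, nlta):
            """Ratio of short-term to long-term average of x**2 at each sample."""
            csum = np.cumsum(x.astype(float) ** 2)
            sta = (csum[nsta:] - csum[:-nsta]) / nsta
            lta = (csum[nlta:] - csum[:-nlta]) / nlta
            n = min(len(sta), len(lta))
            return sta[-n:] / lta[-n:]                  # align both to the trace end

        ratio = sta_lta(trace, nsta=fs // 2, nlta=10 * fs)
        trigger = np.nonzero(ratio > 4.0)[0]            # threshold of 4 is an assumption
        if trigger.size:
            print(f"trigger on at sample ~{trigger[0] + 10 * fs}")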

  15. Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Beggs, John H.

    1991-01-01

    Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three-dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. FDTD was thereby extended to more complicated materials. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far-zone time-domain results in two dimensions, and the capability to model nonlinear materials was incorporated into FDTD and validated.
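
    For orientation, the heart of any FDTD code is the leapfrog Yee update. The minimal one-dimensional vacuum sketch below shows the update pattern that dispersive formulations, such as those reported here, extend with a recursive-convolution term in the E-field update:

        # 1D vacuum FDTD (Yee leapfrog) sketch; dispersive media would add a
        # recursive convolution accumulator to the E-field update.
        import numpy as np

        nz, nt = 400, 800
        ez = np.zeros(nz)                 # electric field
        hy = np.zeros(nz - 1)             # magnetic field, staggered half a cell
        c = 0.5                           # Courant number dt*c0/dz <= 1 for stability

        for n in range(nt):
            hy += c * (ez[1:] - ez[:-1])              # H update (half time step)
            ez[1:-1] += c * (hy[1:] - hy[:-1])        # E update (next half step)
            ez[50] += np.exp(-((n - 60) / 20.0) ** 2) # soft Gaussian source

        print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")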

  16. Multitasking the code ARC3D. [for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Barton, John T.; Hsiung, Christopher C.

    1986-01-01

    The CRAY multitasking system was developed in order to utilize all four processors and sharply reduce the wall clock run time. This paper describes the techniques used to modify the computational fluid dynamics code ARC3D for this run and analyzes the achieved speedup. The ARC3D code solves either the Euler or thin-layer N-S equations using an implicit approximate factorization scheme. Results indicate that multitask processing can be used to achieve wall clock speedup factors of over three times, depending on the nature of the program code being used. Multitasking appears to be particularly advantageous for large-memory problems running on multiple CPU computers.

  17. Addressing the challenges of standalone multi-core simulations in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Ocaya, R. O.; Terblans, J. J.

    2017-07-01

    Computational modelling in materials science involves mathematical abstractions of force fields between particles with the aim to postulate, develop and understand materials by simulation. The aggregated pairwise interactions of the material's particles lead to a deduction of its macroscopic behaviours. For practically meaningful macroscopic scales, a large amount of data is generated, leading to long execution times. Simulation times of hours, days or weeks for moderately sized problems are not uncommon. The reduction of simulation times, improved result accuracy and the associated software and hardware engineering challenges are the main motivations for much of the ongoing research in the computational sciences. This contribution is concerned mainly with simulations that can be done on a "standalone" computer using Message Passing Interface (MPI) parallel code running on hardware platforms with wide specifications, such as single/multi-processor, multi-core machines with minimal reconfiguration for upward scaling of computational power. The widely available, documented and standardized MPI library provides this functionality through the MPI_Comm_size(), MPI_Comm_rank() and MPI_Reduce() functions. A survey of the literature shows that relatively little has been written on the efficient extraction of the inherent computational power in a cluster. In this work, we discuss the main avenues available to tap into this extra power without compromising computational accuracy. We also present methods to overcome the high inertia encountered in single-node computational molecular dynamics. We begin by surveying the current state of the art and discuss what it takes to achieve parallelism, efficiency and enhanced computational accuracy through program threads and message passing interfaces. Several code illustrations are given. The pros and cons of writing raw code as opposed to using heuristic, third-party code are also discussed, as is the growing trend towards graphical processor units and virtual computing clouds for high-performance computing. Finally, we present comparative results of vacancy formation energy calculations using our own parallelized standalone code, called Verlet-Stormer velocity (VSV), operating on 30,000 copper atoms. The code is based on the Sutton-Chen implementation of the Finnis-Sinclair pairwise embedded atom potential. A link to the code is also given.
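
    The three MPI calls named above map directly onto mpi4py. As a minimal work-sharing sketch (pair_energy() is a hypothetical stand-in for the Sutton-Chen evaluation, not the VSV code), each rank sums its share of pairwise energies and a reduce collects the total on rank 0:

        # Minimal MPI work-sharing sketch (run: mpiexec -n 4 python energy.py).
        # pair_energy() is a hypothetical stand-in for the Sutton-Chen evaluation.
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()            # MPI_Comm_rank
        size = comm.Get_size()            # MPI_Comm_size

        rng = np.random.default_rng(42)   # same seed so all ranks see the same atoms
        pos = rng.uniform(0, 10.0, (1000, 3))

        def pair_energy(r):
            return r**-12 - r**-6         # toy pair potential, not Sutton-Chen

        local = 0.0
        for i in range(rank, len(pos), size):      # round-robin split of outer loop
            r = np.linalg.norm(pos[i+1:] - pos[i], axis=1)
            local += pair_energy(r).sum()

        total = comm.reduce(local, op=MPI.SUM, root=0)   # MPI_Reduce
        if rank == 0:
            print(f"total potential energy: {total:.4f}")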

  18. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) in simulation host computer concepts is presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  19. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    DTIC Science & Technology

    2008-01-01

    complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...

  20. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
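
    The generator mechanizes what a hand-written adjoint does; the principle can be shown with a toy reverse-mode differentiation of a two-variable scalar function (a generic illustration, not the CFL3D tooling):

        # Tiny reverse-mode automatic differentiation: record a tape in evaluation
        # order, then sweep it backwards to accumulate adjoints (illustration only).
        import math

        TAPE = []                          # global tape of Vars in evaluation order

        class Var:
            def __init__(self, value, parents=()):
                self.value, self.parents, self.adj = value, parents, 0.0
                TAPE.append(self)
            def __mul__(self, o):
                return Var(self.value * o.value, [(self, o.value), (o, self.value)])
            def __add__(self, o):
                return Var(self.value + o.value, [(self, 1.0), (o, 1.0)])

        def sin(x):
            return Var(math.sin(x.value), [(x, math.cos(x.value))])

        def backward(out):
            out.adj = 1.0                  # seed d(out)/d(out)
            for node in reversed(TAPE):    # reverse sweep: each adjoint is complete
                for parent, local_grad in node.parents:
                    parent.adj += node.adj * local_grad

        x, y = Var(1.2), Var(0.7)
        f = x * y + sin(x)                 # f(x, y) = x*y + sin(x)
        backward(f)
        print(f"df/dx = {x.adj:.4f}  (exact {y.value + math.cos(x.value):.4f})")
        print(f"df/dy = {y.adj:.4f}  (exact {x.value:.4f})")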

  1. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources, including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  2. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate, and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validation of the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable the design of more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  3. Development of a 3-D upwind PNS code for chemically reacting hypersonic flowfields

    NASA Technical Reports Server (NTRS)

    Tannehill, J. C.; Wadawadigi, G.

    1992-01-01

    Two new parabolized Navier-Stokes (PNS) codes were developed to compute the three-dimensional, viscous, chemically reacting flow of air around hypersonic vehicles such as the National Aero-Space Plane (NASP). The first code (TONIC) solves the gas dynamic and species conservation equations in a fully coupled manner using an implicit, approximately-factored, central-difference algorithm. This code was upgraded to include shock fitting and the capability of computing the flow around complex body shapes. The revised TONIC code was validated by computing the chemically reacting (M∞ = 25.3) flow around a 10 deg half-angle cone at various angles of attack and the Ames All-Body model at 0 deg angle of attack. The results of these calculations were in good agreement with the results from the UPS code. One of the major drawbacks of the TONIC code is that the central-differencing of fluxes across interior flowfield discontinuities tends to introduce errors into the solution in the form of local flow property oscillations. The second code (UPS), originally developed for a perfect gas, has been extended to permit perfect gas, equilibrium air, or nonequilibrium air computations. The code solves the PNS equations using a finite-volume, upwind TVD method based on Roe's approximate Riemann solver that was modified to account for real gas effects. The dissipation term associated with this algorithm is sufficiently adaptive to flow conditions that, even when attempting to capture very strong shock waves, no additional smoothing is required. For nonequilibrium calculations, the code solves the fluid dynamic and species continuity equations in a loosely coupled manner. This code was used to calculate the hypersonic, laminar flow of chemically reacting air over cones at various angles of attack. In addition, the flow around the McDonnell Douglas generic option blended-wing-body was computed and comparisons were made between the perfect gas, equilibrium air, and nonequilibrium air results.

  4. Linear chirp phase perturbing approach for finding binary phased codes

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Binary phased codes have many applications in communication and radar systems. These applications require binary phased codes to have low sidelobes in order to reduce interference and false detection. Barker codes satisfy these requirements and have the lowest maximum sidelobes. However, Barker codes have very limited code lengths (equal to or less than 13), while many applications, including low-probability-of-intercept radar and spread-spectrum communication, require much greater code lengths. The conventional techniques for finding binary phased codes in the literature include exhaustive search, neural networks, and evolutionary methods, and they all require very expensive computation for large code lengths. These techniques are therefore limited to finding binary phased codes with small code lengths (less than 100). In this paper, by analyzing Barker code, linear chirp, and P3 phases, we propose a new approach to finding binary codes. Experiments show that the proposed method is able to find long, low-sidelobe binary phased codes (code length >500) at reasonable computational cost.
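
    The sidelobe property is easy to check numerically: the aperiodic autocorrelation of the length-13 Barker code peaks at 13 with maximum sidelobe magnitude 1, as the sketch below verifies:

        # Verify the sidelobe level of the length-13 Barker code.
        import numpy as np

        barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])
        acf = np.correlate(barker13, barker13, mode="full")
        peak = acf[len(barker13) - 1]                 # zero-lag value = code length
        sidelobes = np.delete(acf, len(barker13) - 1)
        print(f"peak = {peak}, max |sidelobe| = {np.abs(sidelobes).max()}")  # 13, 1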

  5. A study of the dynamics of rotating space stations with elastically connected counterweight and attached flexible appendages. Volume 1: Theory

    NASA Technical Reports Server (NTRS)

    Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.

    1973-01-01

    The formulation of a mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was conducted. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.

  6. RIP-REMOTE INTERACTIVE PARTICLE-TRACER

    NASA Technical Reports Server (NTRS)

    Rogers, S. E.

    1994-01-01

    Remote Interactive Particle-tracing (RIP) is a distributed-graphics program which computes particle traces for computational fluid dynamics (CFD) solution data sets. A particle trace is a line which shows the path a massless particle in a fluid will take; it is a visual image of where the fluid is going. The program is able to compute and display particle traces at a speed of about one trace per second because it runs on two machines concurrently. The data used by the program is contained in two files. The solution file contains data on density, momentum and energy quantities of a flow field at discrete points in three-dimensional space, while the grid file contains the physical coordinates of each of the discrete points. RIP requires two computers. A local graphics workstation interfaces with the user for program control and graphics manipulation, and a remote machine interfaces with the solution data set and performs time-intensive computations. The program utilizes two machines in a distributed mode for two reasons. First, the data to be used by the program is usually generated on the supercomputer. RIP avoids having to convert and transfer the data, eliminating any memory limitations of the local machine. Second, as computing the particle traces can be computationally expensive, RIP utilizes the power of the supercomputer for this task. Although the remote site code was developed on a CRAY, it is possible to port this to any supercomputer class machine with a UNIX-like operating system. Integration of a velocity field from a starting physical location produces the particle trace. The remote machine computes the particle traces using the particle-tracing subroutines from PLOT3D/AMES, a CFD post-processing graphics program available from COSMIC (ARC-12779). These routines use a second-order predictor-corrector method to integrate the velocity field. Then the remote program sends graphics tokens to the local machine via a remote-graphics library. The local machine interprets the graphics tokens and draws the particle traces. The program is menu driven. RIP is implemented on the Silicon Graphics IRIS 3000 (local workstation) with an IRIX operating system and on the CRAY2 (remote station) with a UNICOS 1.0 or 2.0 operating system. The IRIS 4D can be used in place of the IRIS 3000. The program is written in C (67%) and FORTRAN 77 (43%) and has an IRIS memory requirement of 4 MB. The remote and local stations must use the same user ID. PLOT3D/AMES unformatted data sets are required for the remote machine. The program was developed in 1988.
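
    The predictor-corrector integration described above amounts to Heun's method; the sketch below traces a particle through an analytic rotating velocity field (an illustrative stand-in for the PLOT3D grid and solution files):

        # Second-order predictor-corrector (Heun) particle trace through an
        # analytic solid-body-rotation velocity field (stand-in for CFD data).
        import numpy as np

        def velocity(p):
            x, y = p
            return np.array([-y, x])       # circular streamlines about the origin

        def trace(p0, dt=0.05, steps=200):
            path = [np.asarray(p0, float)]
            for _ in range(steps):
                p = path[-1]
                v1 = velocity(p)           # predictor: Euler step
                v2 = velocity(p + dt * v1) # corrector: velocity at predicted point
                path.append(p + 0.5 * dt * (v1 + v2))
            return np.array(path)

        path = trace([1.0, 0.0])
        r = np.linalg.norm(path, axis=1)   # exact trace stays on the unit circle
        print(f"radius drift over {len(path)-1} steps: {abs(r[-1] - 1):.2e}")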

  7. Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Bartels, Robert E.

    2002-01-01

    A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.
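
    Transforming impulse responses into state-space form is commonly done with the Eigensystem Realization Algorithm; a minimal single-input/single-output sketch of that construction (an assumption about the method, since the abstract does not name the identification algorithm) is shown below:

        # Eigensystem Realization Algorithm (ERA) sketch: Markov parameters
        # (discrete impulse response) -> state-space (A, B, C, D), SISO case.
        import numpy as np

        def era(h, order, m=20):
            """h[k] = impulse response at step k (h[0] is the feedthrough D)."""
            H0 = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])
            H1 = np.array([[h[i + j + 2] for j in range(m)] for i in range(m)])
            U, s, Vt = np.linalg.svd(H0)
            U, s, Vt = U[:, :order], s[:order], Vt[:order]
            sq = np.diag(1 / np.sqrt(s))
            A = sq @ U.T @ H1 @ Vt.T @ sq
            B = (np.diag(np.sqrt(s)) @ Vt)[:, 0]
            C = (U @ np.diag(np.sqrt(s)))[0, :]
            return A, B, C, h[0]

        # synthetic "truth" system generates the Markov parameters
        A0 = np.array([[0.9, 0.2], [-0.2, 0.9]])
        B0 = np.array([1.0, 0.0])
        C0 = np.array([0.0, 1.0])
        h = [0.0] + [C0 @ np.linalg.matrix_power(A0, k) @ B0 for k in range(50)]

        A, B, C, D = era(h, order=2)
        h_rom = [D] + [C @ np.linalg.matrix_power(A, k) @ B for k in range(50)]
        err = max(abs(a - b) for a, b in zip(h, h_rom))
        print(f"max impulse-response mismatch: {err:.2e}")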

  8. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  9. Spatio-temporal changes of seismic anisotropy in seismogenic zones

    NASA Astrophysics Data System (ADS)

    Saade, M.; Montagner, J.; Roux, P.; Paul, C.; Brenguier, F.; Enescu, B.; Shiomi, K.

    2013-12-01

    Seismic anisotropy plays a key role in the study of stress and strain fields in the earth. Potential temporal change of seismic anisotropy can be interpreted as change of the orientation of cracks in seismogenic zones and thus change of the stress field. Such temporal changes have been observed in seismogenic zones before and after earthquakes (Durand et al., 2011) but are still not well understood. In this study, from a numerical point of view, we investigate the variations of the polarization of surface waves in anisotropic media. These variations are related to the elastic properties of the medium, in particular to anisotropy. The technique used is based on the calculation of the whole cross-correlation tensor (CCT) of ambient seismic noise. If the sources are randomly distributed in a homogeneous medium, this allows us to reconstruct the Green's tensor between two stations continuously and to monitor the region through its fluctuations. The temporal change of the Green's cross-correlation tensor therefore enables the monitoring of stress and strain fields. This technique is applied to synthetic seismograms computed in a transversely isotropic medium with a horizontal symmetry axis (hereafter referred to as an HTI medium) using the RegSEM code (Cupillard et al., 2012), which is based on the spectral element method. We designed an experiment to investigate the influence of anisotropy on the CCT. In a homogeneous, isotropic medium the off-diagonal terms of the Green's tensor are null. The CCT is computed between each pair of stations and then rotated in order to approximate the Green's tensor by minimizing the off-diagonal components. This procedure permits the calculation of the polarization angle of quasi-Rayleigh and quasi-Love waves, and the observation of the azimuthal variation of their polarization. The results show that even a small variation of the azimuth of seismic anisotropy with respect to a certain pair of stations can induce, in some cases, a large variation in the horizontal polarization of surface waves along the direction of this pair of stations. It depends on the relative azimuth angle between the pair of stations and the direction of anisotropy, on the amplitude of anisotropy, and on the frequency band of the signal. It is therefore now possible to explain the large, rapid and very localized variations of surface-wave horizontal polarization observed by Durand et al. (2011) during the Parkfield earthquake of 2004. Furthermore, some preliminary results on the seismic anisotropy change caused by the June 13, 2008 Iwate-Miyagi Nairiku earthquake (Mw = 6.9) will be presented.
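
    The rotation step described above can be sketched as a one-parameter search: rotate the horizontal components of the correlation tensor and keep the angle that minimizes the off-diagonal energy. Synthetic pulses stand in for real noise correlations:

        # Rotate the horizontal cross-correlation tensor to minimize off-diagonal
        # energy; synthetic pulses stand in for real noise-correlation waveforms.
        import numpy as np

        t = np.arange(512)
        radial = np.sin(2 * np.pi * t / 64) * np.exp(-t / 200.0)
        transverse = 0.5 * np.cos(2 * np.pi * t / 80) * np.exp(-t / 200.0)

        def rot(a):
            return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

        true_az = np.deg2rad(35.0)      # sensor frame rotated 35 deg from path frame
        zeros = np.zeros_like(radial)
        C_path = np.array([[radial, zeros], [zeros, transverse]])  # diagonal in path frame
        C_obs = np.einsum("ij,jkt,lk->ilt", rot(true_az), C_path, rot(true_az))

        def offdiag_energy(a):
            C = np.einsum("ij,jkt,lk->ilt", rot(-a), C_obs, rot(-a))
            return np.sum(C[0, 1] ** 2 + C[1, 0] ** 2)

        angles = np.deg2rad(np.arange(0.0, 90.0, 0.5))   # 90 deg ambiguity is inherent
        best = angles[np.argmin([offdiag_energy(a) for a in angles])]
        print(f"recovered rotation: {np.rad2deg(best):.1f} deg (true 35.0)")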

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather-sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  11. Progressive fracture of fiber composites

    NASA Technical Reports Server (NTRS)

    Irvin, T. B.; Ginty, C. A.

    1983-01-01

    Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized, including the Real Time Ultrasonic C-Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.

  12. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  13. Design geometry and design/off-design performance computer codes for compressors and turbines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1995-01-01

    This report summarizes some NASA Lewis (i.e., government-owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses, and losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.
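
    As a toy illustration of the kind of velocity-diagram energy computation performed fore and aft of a blade row in such meanline codes (the numbers and names below are illustrative, not from the codes themselves):

      def euler_specific_work(U, V_theta_in, V_theta_out):
          """Euler turbomachinery equation: w = U * (V_theta_in - V_theta_out)
          for a turbine rotor at constant mean radius, in J/kg."""
          return U * (V_theta_in - V_theta_out)

      # Toy stage: blade speed 340 m/s, swirl 520 m/s in, -80 m/s out
      w = euler_specific_work(340.0, 520.0, -80.0)
      print(f"stage specific work ~ {w / 1e3:.0f} kJ/kg")  # ~204 kJ/kg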

  14. PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.

    2017-12-01

    The Peta-op SupErcomputing Unconventional System (PerSEUS) project aims to explore the use of ultra-low-power, mixed-signal, unconventional computational elements developed by Johns Hopkins University (JHU) for high-performance scientific computing (HPC), and to demonstrate that capability on both fluid and particle plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE), and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code and a UCLA general-purpose relativistic Particle-In-Cell (PIC) code.

  15. Multiple grid problems on concurrent-processing computers

    NASA Technical Reports Server (NTRS)

    Eberhardt, D. S.; Baganoff, D.

    1986-01-01

    Three computer codes that make use of concurrent-processing computer architectures in computational fluid dynamics (CFD) were studied. The three parallel codes were tested on a two-processor multiple-instruction/multiple-data (MIMD) facility at NASA Ames Research Center and are suggested for efficient parallel computations. The first code is a well-known program that makes use of the Beam and Warming implicit, approximate-factored algorithm. This study demonstrates the parallelism found in that well-known scheme; speedups exceeding 1.9 were achieved on the two-processor MIMD test facility. The second code studied makes use of an embedded grid scheme for solving problems having complex geometries; the particular application considered an airfoil/flap geometry in an incompressible flow. The scheme eliminates some of the inherent difficulties found in adapting approximate-factorization techniques to MIMD machines and allows the use of chaotic relaxation and asynchronous iteration techniques. The third code studied is an application of overset grids to a supersonic blunt-body problem. The code addresses the difficulties encountered when using embedded grids on a compressible, and therefore nonlinear, problem. The complex numerical boundary system associated with overset grids is discussed and several boundary schemes are suggested, of which a scheme based on the method of characteristics achieved the best results.
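
    The chaotic-relaxation idea mentioned above can be sketched as threads sweeping their own grid blocks with no barrier between sweeps (a toy Laplace solver, not the authors' code):

      import threading
      import numpy as np

      def chaotic_laplace(u, n_sweeps=500, n_threads=2):
          """Relax interior points of u toward Laplace's equation; each thread
          sweeps its own block of rows without synchronizing between sweeps,
          so neighbors may be read at arbitrary iteration counts (chaotic order)."""
          rows = np.array_split(range(1, u.shape[0] - 1), n_threads)

          def worker(block):
              for _ in range(n_sweeps):
                  for i in block:
                      u[i, 1:-1] = 0.25 * (u[i - 1, 1:-1] + u[i + 1, 1:-1]
                                           + u[i, :-2] + u[i, 2:])

          threads = [threading.Thread(target=worker, args=(b,)) for b in rows]
          for t in threads: t.start()
          for t in threads: t.join()
          return u

      u = np.zeros((64, 64)); u[0, :] = 1.0   # hot top-wall boundary condition
      chaotic_laplace(u)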

  16. Interior view to the south of computer work stations in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view to the south of computer work stations in front of elevated work area 1570 on the left and elevated glassed-in work area 1870 on the right - Over-the-Horizon Backscatter Radar Network, Mountain Home Air Force Operations Building, on Desert Street at 9th Avenue, Mountain Home Air Force Base, Mountain Home, Elmore County, ID

  17. Computer evaluation of existing and proposed fire lookouts

    Treesearch

    Romain M. Mees

    1976-01-01

    A computer simulation model has been developed for evaluating the fire detection capabilities of existing and proposed lookout stations. The model uses the coordinate locations of fires and lookouts, tower elevation, and topographic data to judge the placement of stations and to determine where a fire can be seen. The model was tested by comparing it with manual detection on a...

  18. Binary weight distributions of some Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Arnold, S.

    1992-01-01

    The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered, and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
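
    The MacWilliams step itself is mechanical; here is a minimal sketch for the binary case, checked against the (7,4) Hamming code rather than the RS codes of the paper:

      from math import comb

      def macwilliams_dual(A, n, k):
          """Binary MacWilliams transform: weight distribution B of the dual
          code from the distribution A of an (n, k) code, via Krawtchouk
          polynomials K_j(i) = sum_s (-1)^s C(i, s) C(n - i, j - s)."""
          K = lambda j, i: sum((-1) ** s * comb(i, s) * comb(n - i, j - s)
                               for s in range(j + 1))
          return [sum(A[i] * K(j, i) for i in range(n + 1)) // 2 ** k
                  for j in range(n + 1)]

      # Check: the dual of the (7,4) Hamming code is the (7,3) simplex code
      A_hamming = [1, 0, 0, 7, 7, 0, 0, 1]
      print(macwilliams_dual(A_hamming, 7, 4))  # [1, 0, 0, 0, 7, 0, 0, 0]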

  19. A high temperature fatigue life prediction computer code based on the total strain version of StrainRange Partitioning (SRP)

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Saltsman, James F.

    1993-01-01

    A recently developed high-temperature fatigue life prediction computer code is presented and an example of its usage is given. The code is based on the Total Strain version of Strainrange Partitioning (TS-SRP). Included in the code are procedures for characterizing the creep-fatigue durability behavior of an alloy according to TS-SRP guidelines and for predicting cyclic life for complex cycle types under both isothermal and thermomechanical conditions. A reasonably extensive materials properties database is included with the code.

  20. Turbomachinery Heat Transfer and Loss Modeling for 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Ameri, Ali

    2005-01-01

    This report focuses on making use of NASA Glenn's on-site computational facilities to develop, validate, and apply models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes, enhancing the capability to compute heat transfer and losses in turbomachinery.

  1. Real-time computer treatment of THz passive device images with the high image quality

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate a real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not restricted to one passive THz device: it can be applied to other such devices and to active THz imaging systems as well. We applied the code to computer processing of images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that processing images produced by different companies usually requires different spatial filters. The current version of the code processes more than one image per second for a THz image with more than 5000 pixels and 24-bit number representation; processing a single THz image produces about 20 output images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel algorithms for image processing. We developed original spatial filters that allow one to see objects smaller than 2 cm in imagery produced by passive THz devices that captured images of objects hidden under opaque clothes. For images with high noise we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate, and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects, and it is a very promising solution for the security problem.
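
    The abstract does not disclose the specific spatial filters; as a generic illustration of the kind of spatial filtering involved, here is a minimal numpy unsharp-mask sketch (the filter choice and parameters are assumptions, not the authors'):

      import numpy as np

      def box_blur(img, k=5):
          """Separable k x k box blur via 1-D convolutions (edge-padded, k odd)."""
          pad = k // 2
          padded = np.pad(img.astype(float), pad, mode="edge")
          kernel = np.ones(k) / k
          out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, padded)
          return np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, out)

      def unsharp_mask(img, k=5, amount=1.5):
          """Sharpen by adding back the high-frequency residual above a blur."""
          low = box_blur(img, k)
          return np.clip(img + amount * (img - low), 0, img.max())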

  2. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    NASA Technical Reports Server (NTRS)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade-study tool that provides modeling and simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A provides a graphical and command-driven interface: the user constructs a model by placing equipment components in a graphical layout of the system hardware, then connects the components via flow streams and defines their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set and the simulation run on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluating the simulation results over time. Additionally, users can control the simulation and extract information at various times during the run (e.g., adjust equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A Version 5.0 runs under the VAX VMS(Trademark) environment and utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  3. Fingerprinting Communication and Computation on HPC Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean

    2010-06-02

    How do we identify what is actually running on high-performance computing systems? Names of binaries, dynamic libraries loaded, or other elements in a submission to a batch queue can give clues, but binary names can be changed, and libraries provide limited insight and resolution on the code being run. In this paper, we present a method for "fingerprinting" code running on HPC machines using elements of communication and computation. We then discuss how that fingerprint can be used to determine whether the code is consistent with certain other types of codes, with what a user usually runs, or with what the user requested an allocation to do. In some cases, our techniques enable us to fingerprint HPC codes using runtime MPI data with a high degree of accuracy.
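
    The paper's actual features are not given here; as an illustrative sketch only (all names hypothetical), a fingerprint could be a normalized vector of MPI call counts compared against a reference by cosine similarity:

      import numpy as np

      # Hypothetical feature order: counts of selected MPI operations per run
      FEATURES = ["send", "recv", "bcast", "allreduce", "barrier", "wait"]

      def fingerprint(counts):
          """Normalize raw MPI call counts into a unit-length feature vector."""
          v = np.array([counts.get(f, 0) for f in FEATURES], dtype=float)
          n = np.linalg.norm(v)
          return v / n if n else v

      def similarity(fp_a, fp_b):
          """Cosine similarity between two fingerprints (1.0 = identical mix)."""
          return float(fp_a @ fp_b)

      run = fingerprint({"send": 5200, "recv": 5200, "allreduce": 120})
      ref = fingerprint({"send": 5000, "recv": 5000, "allreduce": 100})
      print(similarity(run, ref))  # near 1.0 -> consistent with the reference code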

  4. Practices in source code sharing in astrophysics

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly

    2013-02-01

    While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.

  5. Development of V/STOL methodology based on a higher order panel method

    NASA Technical Reports Server (NTRS)

    Bhateley, I. C.; Howell, G. A.; Mann, H. W.

    1983-01-01

    The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential-flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential-flow code. The PAN AIR aerodynamic code, a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive support for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, requiring no changes to the basic code and permitting easy replacement of updated modules.

  6. Lattice surgery on the Raussendorf lattice

    NASA Astrophysics Data System (ADS)

    Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco

    2018-07-01

    Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that eases physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable over braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.

  7. 40 CFR 1033.110 - Emission diagnostics-general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...

  8. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are becoming ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to handle the entire process effectively and efficiently. This would help maximize the time dedicated to answering research questions, and minimize the time and expense spent on data processing, quality control, and station management. Cross-sharing stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software, and a web service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; a field microcomputer calculates final, fully corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; final fluxes, radiation, weather, and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated, time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; and the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions. Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite as currently utilized by two large flux networks in China (National Academy of Sciences and Agricultural Academy of Sciences) and by smaller networks with stations in the USA, Germany, Ireland, Malaysia, and other locations around the globe.

  9. KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, workers check over the Italian-built Node 2, a future element of the International Space Station.

    NASA Image and Video Library

    2004-02-03

    KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, workers check over the Italian-built Node 2, a future element of the International Space Station. The second of three Station connecting modules, the Node 2 attaches to the end of the U.S. Lab and provides attach locations for several other elements. Kopra is currently assigned technical duties in the Space Station Branch of the Astronaut Office, where his primary focus involves the testing of crew interfaces for two future ISS modules as well as the implementation of support computers and operational Local Area Network on ISS. Node 2 is scheduled to launch on mission STS-120, Station assembly flight 10A.

  10. Airfoil Vibration Dampers program

    NASA Technical Reports Server (NTRS)

    Cook, Robert M.

    1991-01-01

    The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.

  11. Computer optimization of reactor-thermoelectric space power systems

    NASA Technical Reports Server (NTRS)

    Maag, W. L.; Finnegan, P. M.; Fishbach, L. H.

    1973-01-01

    A computer simulation and optimization code that has been developed for nuclear space power systems is described. The results of using this code to analyze two reactor-thermoelectric systems are presented.

  12. RMT focal plane sensitivity to seismic network geometry and faulting style

    USGS Publications Warehouse

    Johnson, Kendra L.; Hayes, Gavin; Herrmann, Robert B.; Benz, Harley M.; McNamara, Daniel E.; Bergman, Eric A.

    2016-01-01

    Modern tectonic studies often use regional moment tensors (RMTs) to interpret the seismotectonic framework of an earthquake or earthquake sequence; however, despite extensive use, little existing work addresses RMT parameter uncertainty. Here, we quantify how network geometry and faulting style affect RMT sensitivity. We examine how data-model fits change with fault plane geometry (strike and dip) for varying station configurations. We calculate the relative data fit for incrementally varying geometries about a best-fitting solution, applying our workflow to real and synthetic seismograms for both real and hypothetical station distributions and earthquakes. Initially, we conduct purely observational tests, computing RMTs from synthetic seismograms for hypothetical earthquakes and a series of well-behaved network geometries. We then incorporate real data and station distributions from the International Maule Aftershock Deployment (IMAD), which recorded aftershocks of the 2010 MW 8.8 Maule earthquake, and a set of regional stations capturing the ongoing earthquake sequence in Oklahoma and southern Kansas. We consider RMTs computed under three scenarios: (1) real seismic records selected for high data quality; (2) synthetic seismic records with noise computed for the observed source-station pairings and (3) synthetic seismic records with noise computed for all possible station-source pairings. To assess RMT sensitivity for each test, we observe the ‘fit falloff’, which portrays how relative fit changes when strike or dip varies incrementally; we then derive the ranges of acceptable strikes and dips by identifying the span of solutions with relative fits larger than 90 per cent of the best fit. For the azimuthally incomplete IMAD network, Scenario 3 best constrains fault geometry, with average ranges of 45° and 31° for strike and dip, respectively. In Oklahoma, Scenario 3 best constrains fault dip with an average range of 46°; however, strike is best constrained by Scenario 1, with a range of 26°. We draw two main conclusions from this study. (1) Station distribution impacts our ability to constrain RMTs using waveform time-series; however, in some tectonic settings, faulting style also plays a significant role and (2) increasing station density and data quantity (both the number of stations and the number of individual channels) does not necessarily improve RMT constraint. These results may be useful when organizing future seismic deployments (e.g. by concentrating stations in alignment with anticipated nodal planes), and in computing RMTs, either by guiding a more rigorous data selection process for input data or informing variable weighting among the selected data (e.g. by eliminating the transverse component when strike-slip mechanisms are expected).
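
    As a minimal sketch of the 'fit falloff' measurement described above (the misfit function and 90 per cent threshold logic are generic stand-ins, not the authors' code):

      import numpy as np

      def fit_falloff(fit_fn, best_strike, best_dip, step=1.0):
          """Scan strike and dip about a best-fitting solution and return the
          spans (in degrees) over which relative fit stays above 90% of the best.
          Strike wrap-around at 360 degrees is ignored for brevity."""
          strikes = np.arange(0.0, 360.0, step)
          dips = np.arange(0.0, 90.0 + step, step)
          best = fit_fn(best_strike, best_dip)
          ok_strike = [s for s in strikes if fit_fn(s, best_dip) >= 0.9 * best]
          ok_dip = [d for d in dips if fit_fn(best_strike, d) >= 0.9 * best]
          return len(ok_strike) * step, len(ok_dip) * step

      # Toy fit surface peaking at strike=40, dip=60 (stand-in for waveform fit)
      toy = lambda s, d: np.exp(-((s - 40) / 25) ** 2 - ((d - 60) / 15) ** 2)
      print(fit_falloff(toy, 40, 60))  # acceptable strike/dip spans in degrees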

  13. Nuclear Test Depth Determination with Synthetic Modelling: Global Analysis from PNEs to DPRK-2016

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Stachnik, Joshua; Baker, Ben; Epiphansky, Alexey; Bobrov, Dmitry

    2016-04-01

    Seismic event depth determination is critical for the event screening process at the International Data Centre (IDC), CTBTO. A thorough determination of event depth can mostly be achieved only through additional special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have their depth constrained to the surface, making the depth screening criterion inapplicable; it may further result in a heavier workload to manually distinguish between subsurface and deeper crustal events. Since the shape of the first few seconds of signal from very shallow events is very sensitive to the depth phases, cross-correlation between observed and theoretical seismograms can provide a basis for event depth estimation, and hence an expansion of the screening process. We applied this approach mostly to events at teleseismic and, in part, regional distances. The approach was found efficient for the seismic event screening process, with certain caveats related mostly to poorly defined source and receiver crustal models, which can shift the depth estimate. An adjustable teleseismic attenuation model (t*) was used for the synthetics, since this characteristic is not known for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with the hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to account for complex source topography. The software prototype is designed to be used for Expert Technical Analysis at the IDC; its design effectively reuses the NDC-in-a-Box code and can be comfortably utilized by NDC users. The package uses Geotool as a front end for data retrieval and pre-processing. After the event database is compiled, control is passed to the driver software, which runs the external processing and plotting toolboxes, controls the final stage, and produces the final result. The modules are mostly Python-coded, with C-coded synthetics (Raysynth3D complex-topography regional synthetics) and FORTRAN-coded synthetics from the CPS330 software package by Robert Herrmann of Saint Louis University. An extension of this single-station depth determination method is under development; it uses joint information from all stations participating in processing and is based on simultaneous depth and moment tensor determination for both short- and long-period seismic phases. A novel approach recently developed for microseismic event location, utilizing only phase waveform information, was migrated to the global scale; it should provide faster computation, as it does not require intensive synthetic modelling, and may be of benefit when processing noisy signals. A consistent depth estimate for all recent nuclear tests was produced using the vast number of IMS stations (primary and auxiliary) involved in processing.
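
    A minimal sketch of the depth scan implied above: cross-correlate the observed first seconds of signal against depth-indexed synthetics and pick the depth with the highest correlation (generation of the synthetics, e.g. via hudson96, is assumed to happen elsewhere; names are illustrative):

      import numpy as np

      def best_depth(observed, synthetics_by_depth):
          """Return the candidate depth whose synthetic best matches the data.

          observed           : 1-D waveform (first few seconds, depth-phase window)
          synthetics_by_depth: dict {depth_km: 1-D synthetic of the same length}
          """
          obs = (observed - observed.mean()) / observed.std()
          scores = {}
          for depth, syn in synthetics_by_depth.items():
              s = (syn - syn.mean()) / syn.std()
              # Peak of normalized cross-correlation over all lags
              xc = np.correlate(obs, s, mode="full") / len(obs)
              scores[depth] = xc.max()
          return max(scores, key=scores.get), scores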

  14. A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.

    1994-01-01

    Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids, and control the rotordynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven-year effort was established in 1990 by NASA's Office of Aeronautics, Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes, and analyses for seals. The effort will provide NASA and the U.S. aerospace industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered, and stepped seals. Relevant computational features of the code include stationary/rotating coordinates, cylindrical and general Body Fitted Coordinate (BFC) systems, high-order differencing schemes, a colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, a numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.

  15. Command History for 1989.

    DTIC Science & Technology

    1990-09-01

    [Garbled scan fragment: the indexed excerpt is a personnel roster listing names, GS/GM grades, and organizational codes (e.g., Code 12, Code 21, Code 23); no abstract text is recoverable.]

  16. Ascent Aerodynamic Pressure Distributions on WB001

    NASA Technical Reports Server (NTRS)

    Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.

    1996-01-01

    To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamics (CFD) codes at several flow conditions; the results were compared code to code and against the aerodynamic database, as well as against available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual designs. These CFD results have also been provided to the structures group for wing loading analysis.

  17. User's guide for vectorized code EQUIL for calculating equilibrium chemistry on Control Data STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Graves, R. A., Jr.; Weilmuenster, K. J.

    1980-01-01

    A vectorized code, EQUIL, was developed for calculating the equilibrium chemistry of a reacting gas mixture on the Control Data STAR-100 computer. The code provides species mole fractions, mass fractions, and thermodynamic and transport properties of the mixture for a given temperature, pressure, and set of elemental mass fractions. The code is set up for the system of elements comprising electrons, H, He, C, O, and N. In all, 24 chemical species are included.
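
    EQUIL handles a 24-species system; as a much-reduced illustration of the underlying equilibrium calculation, a single diatomic dissociation A2 <-> 2A (with a hypothetical equilibrium constant Kp) can be solved for composition in closed form:

      import math

      def dissociation_fraction(Kp, p):
          """Degree of dissociation alpha for A2 <-> 2A at pressure p (atm),
          from Kp = 4 alpha^2 p / (1 - alpha^2)."""
          return math.sqrt(Kp / (Kp + 4.0 * p))

      def mole_fractions(Kp, p):
          """Equilibrium mole fractions of A and A2 at pressure p."""
          a = dissociation_fraction(Kp, p)
          return {"A": 2 * a / (1 + a), "A2": (1 - a) / (1 + a)}

      print(mole_fractions(Kp=0.5, p=1.0))  # hypothetical Kp, 1 atm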

  18. Computer code for charge-exchange plasma propagation

    NASA Technical Reports Server (NTRS)

    Robinson, R. S.; Kaufman, H. R.

    1981-01-01

    The propagation of the charge-exchange plasma from an electrostatic ion thruster is crucial in determining the interaction of that plasma with the associated spacecraft. A model that describes this plasma and its propagation is presented, together with a computer code based on this model. The structure and calling sequence of the code, named PLASIM, are described. An explanation of the program's input and output is included, together with samples of both. The code is written in ANSI Standard FORTRAN.

  19. Self-Scheduling Parallel Methods for Multiple Serial Codes with Application to WOPWOP

    NASA Technical Reports Server (NTRS)

    Long, Lyle N.; Brentner, Kenneth S.

    2000-01-01

    This paper presents a scheme for efficiently running a large number of serial jobs on parallel computers. Two examples are given of computer programs that run relatively quickly, but often they must be run numerous times to obtain all the results needed. It is very common in science and engineering to have codes that are not massive computing challenges in themselves, but due to the number of instances that must be run, they do become large-scale computing problems. The two examples given here represent common problems in aerospace engineering: aerodynamic panel methods and aeroacoustic integral methods. The first example simply solves many systems of linear equations. This is representative of an aerodynamic panel code where someone would like to solve for numerous angles of attack. The complete code for this first example is included in the appendix so that it can be readily used by others as a template. The second example is an aeroacoustics code (WOPWOP) that solves the Ffowcs Williams Hawkings equation to predict the far-field sound due to rotating blades. In this example, one quite often needs to compute the sound at numerous observer locations, hence parallelization is utilized to automate the noise computation for a large number of observers.
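
    A minimal self-scheduling sketch in Python (the paper's codes are serial programs driven in parallel; this only illustrates the idea of idle workers pulling the next serial task, here one linear solve per angle of attack, with hypothetical inputs):

      import numpy as np
      from multiprocessing import Pool

      N = 200  # panel count (illustrative)
      rng = np.random.default_rng(0)
      A = rng.standard_normal((N, N)) + N * np.eye(N)  # well-conditioned matrix

      def solve_case(alpha_deg):
          """One serial job: solve the panel system for one angle of attack."""
          b = np.sin(np.deg2rad(alpha_deg)) * np.ones(N)  # stand-in RHS
          return alpha_deg, np.linalg.solve(A, b)

      if __name__ == "__main__":
          angles = range(-10, 21)  # many instances of a quick serial code
          with Pool() as pool:
              # imap_unordered hands each idle worker the next pending case
              for alpha, x in pool.imap_unordered(solve_case, angles):
                  print(f"alpha={alpha:+3d} deg  solution norm={np.linalg.norm(x):.3f}")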

  20. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...
