Sample records for station verification analysis

  1. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. These requirements are found to lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  2. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Verification of earth station antenna... Verification of earth station antenna performance standards. (a)(1) Except for applications for 20/30 GHz earth... the antenna manufacturer on representative equipment in representative configurations, and the test...

  3. Software technology testbed softpanel prototype

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; analysis of the simulation architecture.

  4. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Quality of software not only is vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. The defense of management decisions can be greatly strengthened by combining engineering judgment with statistical analysis. Unlike hardware, software exhibits no wearout, and its redundancies are costly, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives, and methods to estimate the expected number of fixes required, are also presented.
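
    The stopping criterion the abstract describes can be sketched with a simple software reliability growth model. The Goel-Okumoto model and the parameter values below are illustrative assumptions, not the paper's actual statistical model.

```python
import math

def expected_failures(a, b, t):
    """Mean cumulative failures observed by test time t under a
    Goel-Okumoto NHPP model: m(t) = a * (1 - exp(-b * t))."""
    return a * (1.0 - math.exp(-b * t))

def remaining_failures(a, b, t):
    """Expected number of latent failures (fixes still required) after t."""
    return a - expected_failures(a, b, t)

def stop_time(a, b, target_remaining):
    """Smallest test time at which expected remaining failures fall to
    the reliability objective: solve a * exp(-b * t) = target."""
    return math.log(a / target_remaining) / b

# Illustrative parameters (not from the paper): a = 120 expected total
# failures, detection rate b = 0.05 per test-week.
t_stop = stop_time(120.0, 0.05, 5.0)  # stop testing when < 5 failures remain latent
```

    Fitting `a` and `b` to the observed failure history (e.g. by maximum likelihood) would give the quantitative reliability measure the abstract refers to.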

  5. Energy consumption analysis for the Mars deep space station

    NASA Technical Reports Server (NTRS)

    Hayes, N. V.

    1982-01-01

    Results of the energy consumption analysis at the Mars deep space station are presented. It is shown that the major energy consumers are the 64-meter antenna building and the operations support building. Verification of the antenna's energy consumption is highly dependent on an accurate knowledge of the tracking operations. The importance of a regular maintenance schedule for the watt-hour meters installed at the station is indicated.

  6. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Verification of earth station antenna... Verification of earth station antenna performance standards. (a)(1) All applications for transmitting earth... pursuant to § 2.902 of this chapter from the manufacturer of each antenna that the results of a series of...

  7. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 2 2012-10-01 2012-10-01 false Verification of earth station antenna... Verification of earth station antenna performance standards. (a)(1) All applications for transmitting earth... pursuant to § 2.902 of this chapter from the manufacturer of each antenna that the results of a series of...

  8. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 2 2013-10-01 2013-10-01 false Verification of earth station antenna... Verification of earth station antenna performance standards. (a)(1) All applications for transmitting earth... pursuant to § 2.902 of this chapter from the manufacturer of each antenna that the results of a series of...

  9. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous stability challenges, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of a large, complex distributed power system is not practical due to its size and complexity, so computer modeling has been used extensively to develop hardware specifications and to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs needed to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, characterizes various operating scenarios, and identifies those with the potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system, give a brief description of DoE, and provide examples applying DoE to analysis and verification of the ISS power system.
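
    As a rough illustration of how DoE trims the run count, the sketch below builds a two-level full factorial design and estimates main effects from a toy "stability margin" response. The factors and response function are hypothetical, not taken from the ISS analysis.

```python
from itertools import product

def full_factorial(n_factors):
    """All 2**n runs of a two-level design, factors coded -1 / +1."""
    return list(product((-1, 1), repeat=n_factors))

def main_effects(design, response):
    """Estimated change in the response as each factor moves -1 -> +1
    (difference of the response means over the two factor levels)."""
    half = len(design) / 2
    effects = []
    for j in range(len(design[0])):
        hi = sum(y for x, y in zip(design, response) if x[j] == +1)
        lo = sum(y for x, y in zip(design, response) if x[j] == -1)
        effects.append((hi - lo) / half)
    return effects

# Hypothetical screening run: a toy "stability margin" dominated by
# factor 0 (say, source output impedance), plus a small factor-2 term.
design = full_factorial(3)
margin = [3.0 * x[0] + 0.5 * x[2] + 1.0 for x in design]
effects = main_effects(design, margin)  # factor 0 dominates
```

    In practice a fractional factorial would replace the full design, so hundreds of converter parameters can be screened with far fewer simulation runs than exhaustive enumeration would need.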

  10. Energy consumption analysis of the Venus Deep Space Station (DSS-13)

    NASA Technical Reports Server (NTRS)

    Hayes, N. V.

    1983-01-01

    This report continues the energy consumption analysis and verification study of the tracking stations of the Goldstone Deep Space Communications Complex, and presents an audit of the Venus Deep Space Station (DSS 13). Because of the non-continuous radio astronomy research and development operations at the station, estimates of energy usage were employed in the energy consumption simulation of both the 9-meter and 26-meter antenna buildings. A 17.9% decrease in station energy consumption occurred over the 1979-1981 study period. A comparison of the ECP computer simulations and the station's main watt-hour meter readings showed good agreement.

  11. International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned

    NASA Technical Reports Server (NTRS)

    Iovine, John

    2011-01-01

    The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.

  12. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focused on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  13. Statistical analysis of NWP rainfall data from Poland

    NASA Astrophysics Data System (ADS)

    Starosta, Katarzyna; Linkowska, Joanna

    2010-05-01

    The goal of this work is to summarize the latest results of precipitation verification in Poland. At IMGW, COSMO_PL version 4.0 has been running with the following configuration: 14 km horizontal grid spacing, initial times at 00 UTC and 12 UTC, and a 72 h forecast range. The model fields were verified against Polish SYNOP stations using a new verification tool. For accumulated precipitation, the contingency-table indices FBI, POD, FAR, and ETS are calculated. This paper presents a comparison of monthly and seasonal verification of 6 h, 12 h, and 24 h accumulated precipitation in 2009. From February 2010, the model will run at IMGW with 7 km grid spacing, and verification results for the two model resolutions will be shown.
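
    The contingency-table scores named in the abstract have standard definitions, which a short sketch makes concrete (the sample counts below are invented):

```python
def verification_scores(hits, false_alarms, misses, correct_negatives):
    """Contingency-table precipitation scores: frequency bias (FBI),
    probability of detection (POD), false alarm ratio (FAR), and
    equitable threat score (ETS)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    a_random = (a + b) * (a + c) / n  # hits expected by chance
    return {
        "FBI": (a + b) / (a + c),
        "POD": a / (a + c),
        "FAR": b / (a + b),
        "ETS": (a - a_random) / (a + b + c - a_random),
    }

# Invented counts for one month of 24 h accumulations over a station set.
scores = verification_scores(hits=42, false_alarms=14, misses=8,
                             correct_negatives=936)
```

    FBI > 1 indicates the model predicts the event more often than it is observed; ETS discounts hits that random forecasts would also score.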

  14. Propagation Characteristics of International Space Station Wireless Local Area Network

    NASA Technical Reports Server (NTRS)

    Sham, Catherine C.; Hwu, Shian U.; Loh, Yin-Chung

    2005-01-01

    This paper describes the application of the Uniform Geometrical Theory of Diffraction (UTD) for Space Station Wireless Local Area Networks (WLANs) indoor propagation characteristics analysis. The verification results indicate good correlation between UTD computed and measured signal strength. It is observed that the propagation characteristics are quite different in the Space Station modules as compared with those in the typical indoor WLANs environment, such as an office building. The existing indoor propagation models are not readily applicable to the Space Station module environment. The Space Station modules can be regarded as oversized imperfect waveguides. Two distinct propagation regions separated by a breakpoint exist. The propagation exhibits the guided wave characteristics. The propagation loss in the Space Station, thus, is much smaller than that in the typical office building. The path loss model developed in this paper is applicable for Space Station WLAN RF coverage and link performance analysis.
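
    The breakpoint behavior described above is commonly captured with a two-segment log-distance model. The sketch below uses illustrative parameters (a guided-region exponent n1 < 2, i.e. less loss than free space, and a steeper n2 beyond the breakpoint); these are assumptions, not the paper's fitted coefficients.

```python
import math

def dual_slope_path_loss(d, pl_d0=40.0, d0=1.0, n1=1.8, n2=3.5, d_bp=10.0):
    """Dual-slope log-distance path loss in dB at distance d (meters).

    Below the breakpoint d_bp the module behaves like an oversized
    waveguide (exponent n1 < 2); beyond it the exponent n2 applies.
    pl_d0 is the measured loss at reference distance d0.
    """
    if d <= d_bp:
        return pl_d0 + 10.0 * n1 * math.log10(d / d0)
    return (pl_d0 + 10.0 * n1 * math.log10(d_bp / d0)
            + 10.0 * n2 * math.log10(d / d_bp))
```

    The two segments meet at the breakpoint, so predicted coverage is continuous across the guided and non-guided regions.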

  15. A study of the dynamics of rotating space stations with elastically connected counterweight and attached flexible appendages. Volume 1: Theory

    NASA Technical Reports Server (NTRS)

    Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.

    1973-01-01

    The formulation of a mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was conducted. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.

  16. Space Station Food System

    NASA Technical Reports Server (NTRS)

    Thurmond, Beverly A.; Gillan, Douglas J.; Perchonok, Michele G.; Marcus, Beth A.; Bourland, Charles T.

    1986-01-01

    A team of engineers and food scientists from NASA, the aerospace industry, food companies, and academia are defining the Space Station Food System. The team identified the system requirements based on an analysis of past and current space food systems, food systems from isolated environment communities that resemble Space Station, and the projected Space Station parameters. The team is resolving conflicts among requirements through the use of trade-off analyses. The requirements will give rise to a set of specifications which, in turn, will be used to produce concepts. Concept verification will include testing of prototypes, both in 1-g and microgravity. The end-item specification provides an overall guide for assembling a functional food system for Space Station.

  17. Space Station Furnace Facility. Volume 2: Appendix 1: Contract End Item specification (CEI), part 1

    NASA Technical Reports Server (NTRS)

    Seabrook, Craig

    1992-01-01

    This specification establishes the performance, design, development, and verification requirements for the Space Station Furnace Facility (SSFF) Core. The SSFF Core and its interfaces are defined; requirements for SSFF Core performance, design, and construction are specified; and the verification requirements are established.

  18. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBSs using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.
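
    The consistency and completeness checks mentioned for rule-based KBSs can be illustrated with a toy forward-chaining checker. The rule and fact names below are invented, and this is a sketch of the general idea, not the tools developed by the contractors named above.

```python
def check_rules(rules, facts):
    """Toy consistency/completeness check over propositional rules.

    rules: list of (premises, conclusion) pairs; '!x' denotes not-x.
    Returns (conflicts, dangling): pairs {p, !p} derivable from the
    same facts (inconsistency), and premises that no fact or rule can
    ever establish (incompleteness)."""
    def negate(p):
        return p[1:] if p.startswith("!") else "!" + p

    # Forward-chain to a fixed point from the given facts.
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True

    conflicts = {frozenset((p, negate(p))) for p in derived if negate(p) in derived}
    supplied = set(facts) | {c for _, c in rules}
    dangling = {p for premises, _ in rules for p in premises} - supplied
    return conflicts, dangling

# Hypothetical FDIR-style rules: two rules contradict each other, and
# 'gyro_drift' can never be established by any fact or rule.
rules = [(("thruster_fault",), "isolate_thruster"),
         (("thruster_fault",), "!isolate_thruster"),
         (("gyro_drift",), "recalibrate")]
conflicts, dangling = check_rules(rules, {"thruster_fault"})
```

    Real checkers reason over richer representations, but the same two failure classes (contradictory conclusions, unreachable premises) are what they report.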

  19. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Philip M.; Thoresen, Souzan; Granahan, John; Matty, Chris

    2010-01-01

    The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic changeout, including the analyzer (ORU 02) and the verification gas assembly (ORU 08). The longest lasting ORU 02 was recently replaced after a record service length of 1033 days. The comparatively high performance duration may be attributable to a reduced inlet flow rate into the analyzer, resulting in increased ion pump lifetime; however, there may be other factors as well. A recent schedule slip for delivery of replacement verification gas led to a demonstration that the calibration interval could be extended on a short-term basis. An analysis of ORU 08 performance characteristics indicates that it is possible to temporarily extend the calibration interval from 6 weeks to 12 weeks if necessary.

  20. An unattended verification station for UF6 cylinders: Field trial findings

    NASA Astrophysics Data System (ADS)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  1. Second Generation International Space Station (ISS) Total Organic Carbon Analyzer (TOCA) Verification Testing and On-Orbit Performance Results

    NASA Technical Reports Server (NTRS)

    Bentley, Nicole L.; Thomas, Evan A.; VanWie, Michael; Morrison, Chad; Stinson, Richard G.

    2010-01-01

    The Total Organic Carbon Analyzer (TOCA) is designed to autonomously determine recovered water quality as a function of TOC. The current TOCA has been on the International Space Station since November 2008. Functional checkout and operations revealed complex operating considerations. Specifically, failure of the hydrogen catalyst led to the development of an innovative oxidation analysis method. This method reduces the activation time and limits the hydrogen produced during analysis, while retaining the ability to indicate TOC concentrations within 25% accuracy. Subsequent testing and comparison to archived samples returned from the Station and tested on the ground yield high confidence in this method and in the quality of the recovered water.

  2. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    Arabidopsis thaliana plants are seen inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1 prior to harvest of half the plants. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in NASA Kennedy Space Center's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  3. Internal seismological stations for monitoring a comprehensive test ban treaty

    NASA Astrophysics Data System (ADS)

    Dahlman, O.; Israelson, H.

    1980-06-01

    Verification of compliance with a Comprehensive Test Ban on nuclear explosions is expected to be carried out by a seismological verification system of some fifty globally distributed teleseismic stations designed to monitor underground explosions at large distances (beyond 2000 km). This study attempts to assess the various technical purposes that internal stations might serve in relation to a global network of seismological stations. The assessment is based on estimates of the detection capabilities of hypothetical networks of internal stations. Estimates pertaining to currently used detection techniques (P waves) indicate that a limited number (fewer than 30) of such stations would not significantly improve the detection capability that a global network of stations would have throughout the territories of the US and the USSR. Recently available and not yet fully analyzed data indicate, however, that very high detection capabilities might be obtained in certain regions.

  4. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, harvests half the Arabidopsis thaliana plants inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  5. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE PAGES

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...

    2017-08-26

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  6. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  7. Space station data system analysis/architecture study. Task 2: Options development, DR-5. Volume 3: Programmatic options

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Task 2 in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make design/programmatic decisions. This volume identifies the preferred options in the programmatic category and characterizes these options with respect to performance attributes, constraints, costs, and risks. The programmatic category includes methods used to administrate/manage the development, operation and maintenance of the SSDS. The specific areas discussed include standardization/commonality; systems management; and systems development, including hardware procurement, software development and system integration, test and verification.

  8. International Space Station External Contamination Status

    NASA Technical Reports Server (NTRS)

    Mikatarian, Ron; Soares, Carlos

    2000-01-01

    Presentation slides examine external contamination requirements; International Space Station (ISS) external contamination sources; ISS external contamination sensitive surfaces; external contamination control; external contamination control for pre-launch verification; flight experiments and observations; the Space Shuttle Orbiter waste water dump, materials outgassing, and active vacuum vents; an example of a molecular column density profile; modeling and analysis tools; sources of outgassing-induced contamination analyzed to date; quiescent sources; observations on optical degradation due to induced external contamination in LEO; examples of typical contaminant and depth profiles; and the status of the ISS system, material outgassing, thruster plumes, and optical degradation.

  9. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, opens the door to the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1 for a test harvest of half of the Arabidopsis thaliana plants growing within. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  10. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, places Arabidopsis thaliana plants harvested from the Advanced Plant Habitat (APH) Flight Unit No. 1 into a Mini ColdBag that quickly freezes the plants. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  11. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, places Arabidopsis thaliana plants harvested from the Advanced Plant Habitat (APH) Flight Unit No. 1 into an Ultra-low Freezer chilled to -150 degrees Celsius. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  12. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    NASA Technical Reports Server (NTRS)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special-purpose real-time hardware, and associated generalized software systems that will permit instrument system analysts, design engineers, and instrument scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special-purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  13. The PLAID graphics analysis impact on the space program

    NASA Technical Reports Server (NTRS)

    Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.

    1994-01-01

    An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of a project might depend on the complete verification of a particular stage. Currently, several software packages at JSC provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize ongoing studies in kinematics, lighting, and EVA activities, and discuss various applications in the mission planning of current Space Shuttle flights and the assembly sequence of Space Station Freedom, with emphasis on the redesign effort.

  14. Space station WP-04 power system preliminary analysis and design document, volume 3

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Rocketdyne plans to generate a system-level specification for the Space Station Electric Power System (EPS) in order to facilitate the usage, accountability, and tracking of overall system-level requirements. The origins and status of the verification planning effort are traced, and an overview of the Space Station program interactions is provided. The work-package-level interfaces between the EPS and the other Space Station work packages are outlined. A trade study was performed to determine the peaking split between the photovoltaic (PV) and solar dynamic (SD) power sources, and specifically to compare the inherent total peaking capability with proportionally shared peaking. In order to determine EPS cost drivers for the previous submittal of DRO2, the life cycle cost (LCC) model was run to identify the more significant costs and the factors contributing to them.

  15. International Space Station Atmosphere Control and Supply, Atmosphere Revitalization, and Water Recovery and Management Subsystem - Verification for Node 1

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 ACS, AR, and WRM design and detailed Element Verification methodologies utilized during the Qualification phase for Node 1.

  16. Baseline and Verification Tests of the Electric Vehicle Associates’ Current Fare Station Wagon.

    DTIC Science & Technology

    1983-01-01

    Final test report for the Electric Vehicle Associates' Current Fare station wagon, covering testing conducted from 27 March 1980 to 6 November 1981. The electric and hybrid vehicle test was conducted by the U.S. Army Mobility Equipment Research and Development Command. The report includes coast-down test results and electric and hybrid vehicle verification procedures.

  17. Advanced Plant Habitat Test Harvest

    NASA Image and Video Library

    2017-08-24

    John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, uses a FluorPen to measure the chlorophyll fluorescence of Arabidopsis thaliana plants inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1. Half the plants were then harvested. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.

  18. Space station structures and dynamics test program

    NASA Technical Reports Server (NTRS)

    Moore, Carleton J.; Townsend, John S.; Ivey, Edward W.

    1987-01-01

    The design, construction, and operation of a low-Earth orbit space station poses unique challenges for development and implementation of new technology. The technology arises from the special requirement that the station be built and constructed to function in a weightless environment, where static loads are minimal and secondary to system dynamics and control problems. One specific challenge confronting NASA is the development of a dynamics test program for: (1) defining space station design requirements, and (2) identifying the characterizing phenomena affecting the station's design and development. A general definition of the space station dynamic test program, as proposed by MSFC, forms the subject of this report. The test proposal is a comprehensive structural dynamics program to be launched in support of the space station. The test program will help to define the key issues and/or problems inherent to large space structure analysis, design, and testing. Development of a parametric data base and verification of the math models and analytical analysis tools necessary for engineering support of the station's design, construction, and operation provide the impetus for the dynamics test program. The philosophy is to integrate dynamics into the design phase through extensive ground testing and analytical ground simulations of generic systems, prototype elements, and subassemblies. On-orbit testing of the station will also be used to define its capability.

  19. Updated one-dimensional hydraulic model of the Kootenai River, Idaho-A supplement to Scientific Investigations Report 2005-5110

    USGS Publications Warehouse

    Czuba, Christiana R.; Barton, Gary J.

    2011-01-01

    The Kootenai Tribe of Idaho, in cooperation with local, State, Federal, and Canadian agency co-managers and scientists, is assessing the feasibility of a Kootenai River habitat restoration project in Boundary County, Idaho. The restoration project is focused on recovery of the endangered Kootenai River white sturgeon (Acipenser transmontanus) population, and simultaneously targets habitat-based recovery of other native river biota. River restoration is a complex undertaking that requires a thorough understanding of the river and floodplain landscape prior to restoration efforts. To assist in evaluating the feasibility of this endeavor, the U.S. Geological Survey developed an updated one-dimensional hydraulic model of the Kootenai River in Idaho between river miles (RMs) 105.6 and 171.9 to characterize the current hydraulic conditions. A previously calibrated model of the study area, based on channel geometry data collected during 2002 and 2003, was the basis for this updated model. New high-resolution bathymetric surveys conducted in the study reach between RMs 138 and 161.4 provided additional detail of channel morphology. A light detection and ranging (LIDAR) survey was flown in the Kootenai River valley in 2005 between RMs 105.6 and 159.5 to characterize the floodplain topography. Six temporary gaging stations installed in 2006-08 between RMs 154.1 and 161.2, combined with five permanent gaging stations in the study reach, provided discharge and water-surface elevations for model calibration and verification. Measured discharges ranging from about 4,800 to 63,000 cubic feet per second (ft3/s) were simulated for calibration events, and calibrated water-surface elevations ranged from about 1,745 to 1,820 feet (ft) throughout the extent of the model. Calibration was considered acceptable when the simulated and measured water-surface elevations at gaging stations differed by less than ±0.15 ft.
Model verification consisted of simulating 10 additional events with measured discharges ranging from about 4,900 to 52,000 ft3/s, and comparing simulated and measured water-surface elevations at gaging stations. Average water-surface-elevation error in the verification simulations was 0.05 ft, with the error ranging from -1.17 to 0.94 ft over the range of events and gaging stations. Additional verification included a graphical comparison of measured average velocities, ranging from 1.0 to 6.2 feet per second, to simulated velocities at four sites within the study reach for measured discharges ranging from about 7,400 to 46,600 ft3/s. The availability of high-resolution bathymetric and LIDAR data, along with the additional gaging stations in the study reach, allowed for more detail to be added to the model and a more thorough calibration, sensitivity, and verification analysis to be conducted. Model resolution and performance are most improved between RMs 140 and 160, which includes the 18.3-mile reach of the Kootenai River white sturgeon critical habitat.
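    The ±0.15 ft calibration criterion described above is straightforward to express as a check over per-station elevation errors. A minimal sketch; the station labels and elevations below are illustrative values, not data from the Kootenai River model:

```python
# Accept a simulated water-surface elevation when it differs from the
# measured value at a gaging station by less than +/-0.15 ft, as in the
# calibration criterion described in the record above.

TOLERANCE_FT = 0.15

def calibration_report(measured, simulated):
    """Return per-station (error, accepted) pairs for one simulated event."""
    report = {}
    for station, obs in measured.items():
        error = simulated[station] - obs
        report[station] = (round(error, 3), abs(error) < TOLERANCE_FT)
    return report

# illustrative elevations in feet at two hypothetical station locations
measured = {"RM 154.1": 1768.42, "RM 161.2": 1771.05}
simulated = {"RM 154.1": 1768.51, "RM 161.2": 1771.30}

report = calibration_report(measured, simulated)
# RM 154.1: error 0.09 ft -> accepted; RM 161.2: error 0.25 ft -> rejected
```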

  20. Manned Mars mission accommodation: Sprint mission

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Kaszubowski, Martin J.; Ayers, J. Kirk; Llewellyn, Charles P.; Weidman, Deene J.; Meredith, Barry D.

    1988-01-01

    The results of a study conducted at NASA Langley Research Center (LaRC) to assess the impacts on the Phase 2 Space Station of accommodating a manned mission to Mars are documented. In addition, several candidate transportation-node configurations are presented to accommodate the assembly and verification of the Mars mission vehicles. This study includes an identification of a life science research program that would need to be completed, on-orbit, prior to mission departure and an assessment of the orbital technology development and demonstration program needed to accomplish the mission. Also included are an analysis of the configuration mass properties and a preliminary analysis of the Space Station control-system sizing that would be required to control the station. Results of the study indicate the Phase 2 Space Station can support a manned mission to Mars with the addition of a supporting infrastructure that includes a propellant depot, an assembly hangar, and a heavy-lift launch vehicle to support the large launch requirements.

  1. Modal Test/Analysis Correlation of Space Station Structures Using Nonlinear Sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlation. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.

  2. Modal test/analysis correlation of Space Station structures using nonlinear sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlations. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.
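    The nonlinear sensitivities in the two records above rest on the classical eigenvalue-derivative relation: for a mass-normalized mode phi (phi^T M phi = 1), d(lambda)/dp = phi^T (dK/dp - lambda dM/dp) phi. A minimal numerical sketch of that relation, using illustrative 3-DOF matrices rather than any Space Station FEM:

```python
import numpy as np

# Eigenvalue sensitivity for K(p) = K0 + p*K1 with constant mass M = I,
# so dK/dp = K1 and dM/dp = 0. The analytic derivative phi^T K1 phi is
# checked against a central finite difference. Matrices are illustrative.

K0 = np.array([[ 2.0, -1.0,  0.0],
               [-1.0,  2.0, -1.0],
               [ 0.0, -1.0,  2.0]])
K1 = np.diag([1.0, 0.5, 0.25])   # dK/dp
M = np.eye(3)                     # constant mass matrix

def lowest_eig(p):
    vals, vecs = np.linalg.eigh(K0 + p * K1)
    return vals[0], vecs[:, 0]

p0 = 0.3
lam, phi = lowest_eig(p0)         # phi is mass-normalized since M = I
analytic = phi @ K1 @ phi         # d(lambda)/dp from the sensitivity formula

h = 1e-6                          # finite-difference cross-check
numeric = (lowest_eig(p0 + h)[0] - lowest_eig(p0 - h)[0]) / (2 * h)
```

In an updating scheme like the one described, this derivative would be recomputed (or perturbation-corrected) as the optimizer moves the design variables, avoiding a full eigensolution per iteration.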

  3. International Space Station Environmental Control and Life Support System: Verification for the Pressurized Mating Adapters

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Pressurized Mating Adapters (PMAs) Environmental Control and Life Support (ECLS) System is comprised of three subsystems: Atmosphere Control and Supply (ACS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). PMA 1 and PMA 2 flew to ISS on Flight 2A and PMA 3 flew to ISS on Flight 3A. This paper provides a summary of the PMAs ECLS design and the detailed Element Verification methodologies utilized during the Qualification phase for the PMAs.

  4. International Space Station Temperature and Humidity Control Subsystem Verification for Node 1

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2007-01-01

    The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 THC subsystem design. The paper will also provide a discussion of the detailed Element Verification methodologies for nominal operation of the Node 1 THC subsystem operations utilized during the Qualification phase.

  5. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to other spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order-of-magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.
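    As a rough illustration of the stability-margin analysis a tool like DSAT automates, one can sweep an open-loop frequency response and read off the gain margin at the phase crossover. The plant below is a textbook third-order example, not a DSAT or ISS model:

```python
import numpy as np

# Sweep L(jw) for L(s) = 1 / (s (s+1) (s+2)) and estimate the gain margin
# at the -180 degree phase crossover. For this plant the crossover is at
# w = sqrt(2) rad/s and the gain margin is 6 (about 15.6 dB).

def L(w):
    s = 1j * w
    return 1.0 / (s * (s + 1.0) * (s + 2.0))

w = np.logspace(-2, 2, 100001)       # frequency grid, rad/s
phase = np.unwrap(np.angle(L(w)))    # continuous phase, starts near -90 deg

idx = np.argmin(np.abs(phase + np.pi))   # nearest grid point to -180 deg
w_pc = w[idx]                            # estimated phase-crossover frequency
gain_margin = 1.0 / np.abs(L(w_pc))      # how much loop gain until instability
```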

  6. Validation and verification of a virtual environment for training naval submarine officers

    NASA Astrophysics Data System (ADS)

    Zeltzer, David L.; Pioch, Nicholas J.

    1996-04-01

    A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation and the derivation of application requirements are briefly described. This is followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.

  7. Space station MSFC-DPD-235/DR no. CM-03 specification, modular space station project, Part 1 CEI

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Contract engineering item specifications for the modular space station are presented. These specifications resulted from the development and allocations of requirements which are concise statements of performance or constraints on performance. Specifications contain requirements for functional performance and for the verification of design solutions.

  8. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    Hagerty, J. J.

    1981-01-01

    The cell preparation station was installed in its new enclosure. Operation verification tests were performed. The detailed layout drawings of the automated lamination station were produced and construction began. All major and most minor components were delivered by vendors. The station framework was built and assembly of components begun.

  9. Adaption of space station technology for lunar operations

    NASA Technical Reports Server (NTRS)

    Garvey, J. M.

    1992-01-01

    Space Station Freedom technology will have the potential for numerous applications in an early lunar base program. The benefits of utilizing station technology in such a fashion include reduced development and facility costs for lunar base systems, shorter schedules, and verification of such technology through space station experience. This paper presents an assessment of opportunities for using station technology in a lunar base program, particularly in the lander/ascent vehicles and surface modules.

  10. Environmental Technology Verification Report for Applikon MARGA Semi-Continuous Ambient Air Monitoring System

    EPA Science Inventory

    The verification test was conducted over a period of 30 days (October 1 to October 31, 2008) and involved the continuous operation of duplicate semi-continuous monitoring technologies at the Burdens Creek Air Monitoring Site, an existing ambient-air monitoring station located near...

  11. Environmental control and life support system analysis tools for the Space Station era

    NASA Technical Reports Server (NTRS)

    Blakely, R. L.; Rowell, L. F.

    1984-01-01

    This paper describes the concept of a developing emulation, simulation, sizing, and technology assessment program (ESSTAP) which can be used effectively for the various functional disciplines (structures, power, ECLSS, etc.) beginning with the initial system selection and conceptual design processes and continuing on through the mission operation and growth phases of the Space Station for the purpose of minimizing overall program costs. It discusses the basic requirements for these tools, as currently envisioned for the Environmental Control and Life Support System (ECLSS), identifying their intended and potential uses and applications, and presents examples and the status of several representative tools. The development and applications of a Space Station Atmospheric Revitalization Subsystem (ARS) demonstration model to be used for concept verification are also discussed.

  12. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    NASA Technical Reports Server (NTRS)

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  13. Development of a Response Surface Thermal Model for Orion Mated to the International Space Station

    NASA Technical Reports Server (NTRS)

    Miller, Stephen W.; Meier, Eric J.

    2010-01-01

    A study was performed to determine if a Design of Experiments (DOE)/Response Surface Methodology (RSM) could be applied to on-orbit thermal analysis and produce a set of Response Surface Equations (RSEs) that accurately predict vehicle temperatures. The study used an integrated thermal model of the International Space Station and the Orion outer-mold-line model. Five separate factors were identified for study: yaw, pitch, roll, beta angle, and the environmental parameters. Twenty external Orion temperatures were selected as the responses. A DOE case matrix of 110 runs was developed. The data from these cases were analyzed to produce an RSE for each of the temperature responses. The initial agreement between the engineering data and the RSE predictions was encouraging, although many RSEs had large uncertainties on their predictions. Fourteen verification cases were developed to test the predictive power of the RSEs. The verification showed mixed results, with some RSEs predicting temperatures that matched the engineering data within the uncertainty bands, while others had very large errors. While this study does not irrefutably prove that the DOE/RSM approach can be applied to on-orbit thermal analysis, it does demonstrate that the technique has the potential to predict temperatures. Additional work is needed to better identify the cases needed to produce the RSEs.
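    The DOE/RSM workflow summarized above amounts to fitting a low-order polynomial to case data and then predicting at verification points outside the design matrix. A minimal sketch with a synthetic two-factor "temperature" response; the real study fit RSEs over five factors from 110 thermal cases:

```python
import numpy as np

# Fit a full quadratic response surface T(beta, yaw) by least squares and
# predict at an unseen verification point. The temperature function and the
# factor ranges below are invented for illustration.

rng = np.random.default_rng(0)

def truth(beta, yaw):
    # hypothetical external surface temperature (deg C) vs. beta angle, yaw
    return -40.0 + 0.5 * beta - 0.002 * beta**2 + 0.1 * yaw

beta = rng.uniform(-75, 75, 40)   # 40 "DOE cases" over the factor ranges
yaw = rng.uniform(-15, 15, 40)
T = truth(beta, yaw)

# design matrix for a full quadratic model in two factors
X = np.column_stack([np.ones_like(beta), beta, yaw,
                     beta**2, yaw**2, beta * yaw])
coef, *_ = np.linalg.lstsq(X, T, rcond=None)

def rse(b, y):
    return np.array([1.0, b, y, b**2, y**2, b * y]) @ coef

pred = rse(30.0, 5.0)   # verification point not in the design matrix
```

Because the synthetic response is exactly quadratic, the RSE reproduces it; in the actual study the fit quality (and hence the verification error) varied by response, as the abstract notes.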

  14. Precise and Scalable Static Program Analysis of NASA Flight Software

    NASA Technical Reports Server (NTRS)

    Brat, G.; Venet, A.

    2005-01-01

    Recent NASA mission failures (e.g., Mars Polar Lander and Mars Climate Orbiter) illustrate the importance of having an efficient verification and validation process for such systems. One software error, as simple as it may be, can cause the loss of an expensive mission or lead to budget overruns and crunched schedules. Unfortunately, traditional verification methods cannot guarantee the absence of errors in software systems. Therefore, we have developed the CGS static program analysis tool, which can exhaustively analyze large C programs. CGS analyzes the source code and identifies statements in which arrays are accessed out of bounds or pointers are used outside the memory region they should address. This paper gives a high-level description of CGS and its theoretical foundations. It also reports on the use of CGS on real NASA software systems used in Mars missions (from Mars Pathfinder to the Mars Exploration Rover) and on the International Space Station.
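    The out-of-bounds property CGS checks can be illustrated, very roughly, with interval reasoning: track an index as a range of possible values and flag any array access whose range is not contained within the valid bounds. This toy sketch illustrates the idea only; it is not CGS:

```python
# Toy interval analysis: an abstract index is a range [lo, hi] of every
# value it may take; an array access is safe only if the whole range fits
# inside [0, len-1].

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def add(self, k):
        # abstract counterpart of "index + k"
        return Interval(self.lo + k, self.hi + k)

def check_access(index, array_len):
    """True if every concrete index in the interval is in bounds."""
    return 0 <= index.lo and index.hi <= array_len - 1

# "for (i = 0; i < 10; i++) buf[i] = ...;" with char buf[10]: safe
i = Interval(0, 9)
safe = check_access(i, 10)            # in bounds for every i

# "buf[i + 1]" in the same loop: potential overrun, since i + 1 may be 10
overrun = check_access(i.add(1), 10)
```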

  15. The Ionizing Radiation Environment on the International Space Station: Performance vs. Expectations for Avionics and Materials

    NASA Technical Reports Server (NTRS)

    Koontz, Steven L.; Boeder, Paul A.; Pankop, Courtney; Reddell, Brandon

    2005-01-01

    The role of structural shielding mass in the design, verification, and in-flight performance of International Space Station (ISS), in both the natural and induced orbital ionizing radiation (IR) environments, is reported.

  16. KSC-04pd1975

    NASA Image and Video Library

    2004-09-22

    KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, workers conduct a post-delivery verification test on a Control Moment Gyro (CMG) that is scheduled to fly on mission STS-114. The CMG will replace another on the International Space Station, which will require a spacewalk.

  17. KSC-04pd1977

    NASA Image and Video Library

    2004-09-22

    KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, workers conduct a post-delivery verification test on a Control Moment Gyro (CMG) that is scheduled to fly on mission STS-114. The CMG will replace another on the International Space Station, which will require a spacewalk.

  18. KSC-04pd1976

    NASA Image and Video Library

    2004-09-22

    KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, workers conduct a post-delivery verification test on a Control Moment Gyro (CMG) that is scheduled to fly on mission STS-114. The CMG will replace another on the International Space Station, which will require a spacewalk.

  19. Six-man, self-contained carbon dioxide concentrator subsystem for Space Station Prototype (SSP) application

    NASA Technical Reports Server (NTRS)

    Kostell, G. D.; Schubert, F. H.; Shumar, J. W.; Hallick, T. M.; Jensen, F. C.

    1974-01-01

    A six-man, self-contained electrochemical carbon dioxide concentrating subsystem for Space Station Prototype use was successfully designed, fabricated, and tested. A test program covering shakedown testing, design verification testing, and acceptance testing was successfully completed.

  20. The role of the real-time simulation facility, SIMFAC, in the design, development and performance verification of the Shuttle Remote Manipulator System (SRMS) with man-in-the-loop

    NASA Technical Reports Server (NTRS)

    Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.

    1980-01-01

    The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flight-like crew station patterned after the orbiter aft flight deck with all necessary man-machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those supporting SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.

  1. Acoustic emissions verification testing of International Space Station experiment racks at the NASA Glenn Research Center Acoustical Testing Laboratory

    NASA Astrophysics Data System (ADS)

    Akers, James C.; Passe, Paul J.; Cooper, Beth A.

    2005-09-01

    The Acoustical Testing Laboratory (ATL) at the NASA John H. Glenn Research Center (GRC) in Cleveland, OH, provides acoustic emission testing and noise control engineering services for a variety of specialized customers, particularly developers of equipment and science experiments manifested for NASA's manned space missions. The ATL's primary customer has been the Fluids and Combustion Facility (FCF), a multirack microgravity research facility being developed at GRC for the USA Laboratory Module of the International Space Station (ISS). Since opening in September 2000, ATL has conducted acoustic emission testing of components, subassemblies, and partially populated FCF engineering model racks. The culmination of this effort has been the acoustic emission verification tests on the FCF Combustion Integrated Rack (CIR) and Fluids Integrated Rack (FIR), employing a procedure that incorporates ISO 11201 (``Acoustics-Noise emitted by machinery and equipment-Measurement of emission sound pressure levels at a work station and at other specified positions-Engineering method in an essentially free field over a reflecting plane''). This paper will provide an overview of the test methodology, software, and hardware developed to perform the acoustic emission verification tests on the CIR and FIR flight racks and lessons learned from these tests.
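    Emission sound pressure levels of the kind measured in these tests are pressure ratios on a decibel scale: Lp = 20 log10(p_rms / p0) with the standard reference pressure p0 = 20 micropascals. A minimal sketch with an illustrative measured pressure, not CIR/FIR data:

```python
import math

# Convert an RMS sound pressure measured at a work station into a level
# in dB re 20 uPa, the quantity reported by engineering-method emission
# measurements such as those in ISO 11201. Values are illustrative.

P_REF = 20e-6  # Pa, standard reference pressure for airborne sound

def spl_db(p_rms):
    return 20.0 * math.log10(p_rms / P_REF)

p_rms = 2e-3                 # Pa, hypothetical measured RMS pressure
level = spl_db(p_rms)        # 20*log10(2e-3 / 20e-6) = 40 dB
```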

  2. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.

  3. Design and testing of the Space Station Freedom Propellant Tank Assembly

    NASA Technical Reports Server (NTRS)

    Dudley, D. D.; Thonet, T. A.; Goforth, A. M.

    1992-01-01

    Propellant storage and management functions for the Propulsion Module of the U.S. Space Station Freedom are provided by the Propellant Tank Assembly (PTA). The PTA consists of a surface-tension type propellant acquisition device contained within a welded titanium pressure vessel. The PTA design concept was selected with high reliability and low program risk as primary goals in order to meet stringent NASA structural, expulsion, fracture control and reliability requirements. The PTA design makes use of Shuttle Orbital Maneuvering System and Peacekeeper Propellant Storage Assembly design and analysis techniques. This paper summarizes the PTA design solution and discusses the underlying detailed analyses. In addition, design verification and qualification test activities are discussed.

  4. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  5. NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios

    NASA Astrophysics Data System (ADS)

    Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.

    2012-04-01

    For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signatures. The IMS data are collected, processed into analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centres (NDC) giving technical advice on CTBT verification to their governments. NDC Preparedness Exercises (NPE) are performed regularly to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of NPE 2010 was on the radionuclide component and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections, calculated beforehand in secret by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze the waveform signals of promising candidate events. The study shows one possible solution path for NPE 2010 as performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields provided by the IDC were evaluated in a logical approach to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable.
Of the seismic events considered in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event, at the Black Thunder Mine, Wyoming, on 23 October at 21:15 UTC, showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. An infrasonic one-station localization algorithm yielded a localization comparable in precision to the teleseismic localization. However, analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as the source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, an additional confirmation that the event was correctly identified. The event analysis shown for NPE 2010 serves as a successful example of data fusion between radionuclide detection supported by ATM, seismological methodology, and infrasound signal processing.
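
    The xenon isotopic-ratio dating mentioned above rests on a simple idea: the activity ratio of a short-lived to a longer-lived xenon isotope (e.g. 135Xe/133Xe) decays at a known rate, so a measured ratio plus an assumed ratio at release time fixes the elapsed time. A simplified sketch, with half-lives from standard nuclear data tables and hypothetical ratio values (not the NPE 2010 numbers):

```python
import math

# Half-lives in days, from standard nuclear data tables.
T_XE133 = 5.243          # Xe-133
T_XE135 = 9.14 / 24.0    # Xe-135

lam133 = math.log(2) / T_XE133
lam135 = math.log(2) / T_XE135

def age_from_ratio(r_measured, r_initial):
    """Days since release, inferred from decay of the Xe-135/Xe-133 activity ratio."""
    return math.log(r_initial / r_measured) / (lam135 - lam133)

# Hypothetical ratios for illustration only.
print(round(age_from_ratio(r_measured=0.05, r_initial=1.5), 2))  # ~2 days since release
```

    In practice the initial ratio depends on the assumed source (reactor release vs. explosion), which is exactly why isotopic ratios also help discriminate source types.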

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PERFORMANCE TEST RESULTS FOR THE A AND A ENVIRONMENTAL SEALS' SEAL ASSIST SYSTEM (SAS), PHASE I--TECHNOLOGY VERIFICATION REPORT

    EPA Science Inventory

    The report presents results of tests determining the efficacy of A&A Environmental Seals, Inc.'s Seal Assist System (SAS) in preventing natural gas that leaks from compressor rod packing at compressor stations from escaping into the atmosphere. The SAS consists of an Emission Containment Glan...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: PHASE 1-ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®

    EPA Science Inventory

    Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...

  8. Status of DSMT research program

    NASA Technical Reports Server (NTRS)

    Mcgowan, Paul E.; Javeed, Mehzad; Edighoffer, Harold H.

    1991-01-01

    The status of the Dynamic Scale Model Technology (DSMT) research program is presented. DSMT is developing scale model technology for large space structures as part of the Control Structure Interaction (CSI) program at NASA Langley Research Center (LaRC). Under DSMT, a hybrid-scale structural dynamics model of Space Station Freedom was developed. Space Station Freedom was selected as the focus structure for DSMT since the station represents the first opportunity to obtain flight data on a complex, three-dimensional space structure. An overview of DSMT is included, covering the development of the space station scale model and the resulting hardware. Scaling technology was developed for this model to achieve a ground test article that existing test facilities can accommodate while employing realistically scaled hardware. The model was designed and fabricated by the Lockheed Missiles and Space Co. and is assembled at LaRC for dynamic testing. Results from ground tests and analyses of the various model components are presented, along with plans for future subassembly and mated model tests. Finally, utilization of the scale model for enhancing analysis verification of the full-scale space station is considered.
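
    For context on what "scaled" means here: under simple replica (fully geometric) scaling with the same material, lengths scale by a factor s, natural frequencies by 1/s, and masses by s cubed. DSMT actually used a hybrid scaling scheme, so the relations below are only the reference case it departs from; the numbers are hypothetical:

```python
# Replica (fully geometric) scaling of a structure, same material:
# lengths scale by s, natural frequencies by 1/s, masses by s**3.
def scale_model(full_scale_hz, full_scale_kg, s):
    """Predicted model frequency (Hz) and mass (kg) at geometric scale s < 1."""
    return full_scale_hz / s, full_scale_kg * s ** 3

# Hypothetical full-scale values: 0.5 Hz fundamental mode, 200-tonne structure,
# built as a 1/5-scale model.
f_model, m_model = scale_model(full_scale_hz=0.5, full_scale_kg=200_000.0, s=0.2)
print(f_model, round(m_model, 1))  # model frequency rises, mass drops sharply
```

    The steep s**3 mass reduction is what lets a large station model fit into existing ground test facilities.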

  9. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

    Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system had a high data availability of 90% within the reporting period. A daily screening process yielded 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events associated with a plausible source. The remaining 64 cases can be explained consistently by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and good agreement since certification of the system.
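
    The screening step that separates a tentative identification from a background fluctuation is typically a decision-threshold test. The abstract does not specify the method, but Currie's decision threshold L_C is the standard choice in low-level gamma spectrometry; a minimal sketch with hypothetical counts:

```python
import math

def currie_decision_threshold(background_counts, k=1.645):
    """Currie decision threshold L_C (counts) for a paired background measurement.

    A net count above L_C is flagged as a tentative identification; below it,
    the result is consistent with background fluctuation (k=1.645 gives ~5%
    false-positive probability)."""
    return k * math.sqrt(2.0 * background_counts)

# Hypothetical spectrum region of interest with 400 background counts:
b = 400
lc = currie_decision_threshold(b)
net = 35             # hypothetical net counts in the region
print(net > lc)      # False: consistent with background, no identification
```

    This mirrors the abstract's finding: most flagged cases turn out to be explainable by detector background and counting statistics.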

  10. Space station prototype Sabatier reactor design verification testing

    NASA Technical Reports Server (NTRS)

    Cusick, R. J.

    1974-01-01

    A six-man, flight prototype carbon dioxide reduction subsystem for the SSP ETC/LSS (Space Station Prototype Environmental/Thermal Control and Life Support System) was developed and fabricated for the NASA Johnson Space Center between February 1971 and October 1973. Component design verification testing was conducted on the Sabatier reactor, covering design and off-design conditions, as part of this development program. The reactor was designed to convert a minimum of 98 percent of the hydrogen to water and methane for both six-man and two-man reactant flow conditions. Important design features of the reactor and the test conditions are described. Reactor test results are presented that show the design goals were achieved and off-design performance was stable.
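
    The underlying chemistry is the Sabatier reaction, CO2 + 4 H2 -> CH4 + 2 H2O. A stoichiometry sketch showing what the stated 98% hydrogen conversion implies for product masses; the daily CO2 load is a hypothetical round number, not a figure from the paper:

```python
# Sabatier reaction: CO2 + 4 H2 -> CH4 + 2 H2O
# Molar masses in kg/kmol.
M_CO2, M_H2O, M_CH4 = 44.01, 18.015, 16.04

def sabatier_products(m_co2_kg, h2_conversion=0.98):
    """Masses (kg) of water and methane produced from m_co2_kg of CO2,
    assuming a stoichiometric H2 feed, so reacted CO2 tracks reacted H2."""
    kmol_co2_reacted = m_co2_kg / M_CO2 * h2_conversion
    return 2 * kmol_co2_reacted * M_H2O, kmol_co2_reacted * M_CH4

# Hypothetical six-crew load of ~1 kg CO2 per person per day:
water, methane = sabatier_products(6.0)
print(round(water, 2), round(methane, 2))  # kg of recoverable water, kg of vented methane
```

    The recovered water is the payoff for a closed-loop life support system; the methane is a waste stream to be vented or further processed.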

  11. Preliminary report on the Black Thunder, Wyoming CTBT R and D experiment quicklook report: LLNL input from regional stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P.E.; Glenn, L.A.

    This report presents a preliminary summary of the data recorded at three regional seismic stations from surface blasting at the Black Thunder Coal Mine in northeast Wyoming. The regional stations are part of a larger effort that includes many more seismic stations in the immediate vicinity of the mine. The overall purpose of this effort is to characterize the source function and propagation characteristics of large, typical surface mine blasts. A detailed study of source and propagation features of conventional surface blasts is a prerequisite to attempts at discriminating this type of blasting activity from other sources of seismic events. The Black Thunder seismic experiment is a joint verification effort to determine seismic source and path effects that result from very large, but routine, ripple-fired surface mining blasts. Studies of the data collected will be for the purpose of understanding how the near-field and regional seismic waveforms from these surface mining blasts are similar to, and different from, point-shot explosions and explosions at greater depth. The Black Hills Station is a Designated Seismic Station that was constructed for temporary occupancy by Former Soviet Union seismic verification scientists in accordance with the Threshold Test Ban Treaty protocol.

  12. Information management system study results. Volume 1: IMS study results

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The information management system (IMS) special emphasis task was performed as an adjunct to the modular space station study, with the objective of providing extended depth of analysis and design in selected key areas of the information management system. Specific objectives included: (1) in-depth studies of IMS requirements and design approaches; (2) design and fabrication of breadboard hardware for demonstration and verification of design concepts; (3) provision of a technological base to identify potential design problems and influence long-range planning; (4) development of hardware and techniques to permit long-duration, low-cost, manned space operations; (5) support of SR&T areas where techniques or equipment are considered inadequate; and (6) an overall understanding of the IMS as an integrated component of the space station.

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: ADI INTERNATIONAL INC. ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®; PHASE II

    EPA Science Inventory

    Verification testing of the ADI International Inc. Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8, 2003 through May 28,...

  14. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 2

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Results of a Space Station Data System Analysis/Architecture Study for the Goddard Space Flight Center are presented. This study, which emphasized a system engineering design for a complete, end-to-end data system, was divided into six tasks: (1) functional requirements definition; (2) options development; (3) trade studies; (4) system definitions; (5) program plan; and (6) study maintenance. The task interrelationships and documentation flow are described. Volume 2 is devoted to Task 3: trade studies. Trade studies were carried out in the following areas: (1) software development test and integration capability; (2) fault-tolerant computing; (3) space-qualified computers; (4) distributed data base management system; (5) system integration test and verification; (6) crew workstations; (7) mass storage; (8) command and resource management; and (9) space communications. Results are presented for each area.

  15. Space station ECLSS integration analysis: Simplified General Cluster Systems Model, ECLS System Assessment Program enhancements

    NASA Technical Reports Server (NTRS)

    Ferguson, R. E.

    1985-01-01

    Verification of the data base of the ECLS System Assessment Program (ESAP) is documented, along with changes made to enhance the flexibility of the water recovery subsystem simulations. All changes made to data base values are described, as are the software enhancements performed. The refined model documented herein constitutes the submittal of the General Cluster Systems Model. A source listing of the current version of ESAP is provided in Appendix A.

  16. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly- and system-level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of tests performed on breadboard model hardware, together with the analyses completed to date, have been evaluated and used to plan the design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  17. Data Rescue for precipitation station network in Slovak Republic

    NASA Astrophysics Data System (ADS)

    Fasko, Pavel; Bochníček, Oliver; Švec, Marek; Paľušová, Zuzana; Markovič, Ladislav

    2016-04-01

    Transparency of archive catalogues is an important prerequisite for data rescue, supporting further activities such as digitization and homogenization. Visualization of the continuity of time series from precipitation stations (approximately 1250 stations) is currently under way in the Slovak Republic, covering the full period of observation (meteorological stations gradually began operating in Slovakia during the second half of the 19th century). The visualization is combined with activities such as verification and accessibility of the data listed in the archive catalogue, station localization according to the historical yearbooks, conversion of coordinates into x-JTSK and y-JTSK, and assignment to hydrological catchments. Clustering the precipitation stations of a given hydrological catchment on a map and visualizing the duration of their records (line graph) will allow corresponding precipitation stations to be assigned efficiently for the extension of time series. This process should be followed by break or trend detection and homogenization. The risks and problems encountered in verifying records from archive catalogues, digitizing and repairing them, and the methods of visualization are presented in the poster. While searching the historical and often short time series, we recognized the particular importance of stations located at middle and higher altitudes. They might be used to replace the fictive points quoted up to now in the construction of precipitation maps. Supplementing and extending the time series of individual stations will make it possible to follow changes in precipitation totals over a given period, as well as areal totals for individual catchments in various time periods, of value mainly to hydrologists and agro-climatologists.

  18. ISS Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Laible, Michael R.

    2011-01-01

    The microgravity performance assessment of the International Space Station (ISS) comprises quasi-steady, structural dynamic, and vibro-acoustic analyses of the ISS assembly-complete vehicle configuration. The Boeing Houston (BHOU) Loads and Dynamics Team is responsible for verifying compliance with the ISS System Specification (SSP 41000) and USOS Segment Specification (SSP 41162) microgravity requirements. To verify the ISS environment, a series of accelerometers are on board to monitor the current environment. This paper summarizes the results of the analysis performed for the Verification Analysis Cycle (VAC)-Assembly Complete (AC) and compares them to on-orbit acceleration values currently being reported. The analysis includes the predicted maximum and average environment on board the ISS during multiple activity scenarios.
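
    The abstract does not say how the on-orbit accelerometer data are reduced, but a common figure of merit for comparing a measured vibratory environment against a requirement is the root-mean-square acceleration over a time window. A minimal sketch with hypothetical samples (not ISS data):

```python
import math

def rms_accel(samples_ug):
    """Root-mean-square of accelerometer samples (micro-g) over a window,
    a simple figure of merit against a vibratory microgravity limit."""
    return math.sqrt(sum(a * a for a in samples_ug) / len(samples_ug))

# Hypothetical micro-g samples from one window:
window = [10.0, -12.0, 8.0, -9.0, 11.0]
print(round(rms_accel(window), 2))
```

    Real assessments work per one-third-octave frequency band rather than on a raw time window, but the band-limited computation reduces to the same RMS once the signal is filtered.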

  19. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

    Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders: the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval, and dispute protocols, analyzing their requirements, advantages, and disadvantages in relation to security requirements.
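
    The AR construction and checking described above can be sketched in a few lines. This is a structural illustration only: the record layout and field names are assumptions, and an HMAC (shared-key) signature from the standard library stands in for the asymmetric signature the real protocol computes with the TCPA's private key:

```python
import hashlib
import hmac

TCPA_KEY = b"demo-key"   # stand-in: the real protocol uses the TCPA's private key

def make_ar(biometric_hash: bytes, doc_hash: bytes, valid_until: int) -> bytes:
    """Authentication record: biometric hash + validity timestamp + document hash,
    followed by a signature over those fields (HMAC-SHA256 as a stdlib stand-in)."""
    payload = biometric_hash + valid_until.to_bytes(8, "big") + doc_hash
    sig = hmac.new(TCPA_KEY, payload, hashlib.sha256).digest()
    return payload + sig

def verify_ar(ar: bytes, now: int) -> str:
    """Verification Station check: signature first, then the validity timestamp."""
    payload, sig = ar[:-32], ar[-32:]
    expected = hmac.new(TCPA_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        return "invalid"
    valid_until = int.from_bytes(payload[32:40], "big")
    return "expired (re-enroll)" if now > valid_until else "ok"

bio = hashlib.sha256(b"handwritten signature features").digest()
doc = hashlib.sha256(b"ID card data").digest()
ar = make_ar(bio, doc, valid_until=1_700_000_000)
print(verify_ar(ar, now=1_600_000_000))   # ok
print(verify_ar(ar, now=1_800_000_000))   # expired (re-enroll)
```

    The "expired" path is what forces the periodic re-enrollment that keeps the stored reference in step with the aging biometric.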

  20. Completing and sustaining IMS network for the CTBT Verification Regime

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.

    2015-12-01

    The CTBT International Monitoring System is to comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations have environmental, logistical, and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable, state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic, and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing its investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. To protect that investment and enhance the sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for on-site inspection purposes, optimization of beta-gamma detectors for xenon detection, assessment and improvement of the efficiency of wind noise reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.

  1. The role of science in treaty verification.

    PubMed

    Gavron, Avigdor

    2005-01-01

    Technologically advanced nations are currently applying more science to treaty verification than ever before. Satellites gather a multitude of information relating to proliferation concerns using thermal imaging analysis, nuclear radiation measurements, and optical and radio frequency signal detection. Ground stations gather complementary signals such as seismic events and radioactive emissions. Export controls in many countries attempt to intercept materials and technical means that could be used for nuclear proliferation. Nevertheless, we have witnessed a plethora of nuclear proliferation episodes that went undetected (or were belatedly detected) by these technologies: the Indian nuclear tests in 1998, the Libyan nuclear buildup, the Iranian enrichment program, and the North Korean nuclear weapons program are some prime examples. In this talk, we discuss some of the technologies used for proliferation detection. In particular, we note some of the issues relating to nuclear materials control agreements that epitomize political difficulties as they impact the implementation of science and technology.

  2. EVA Design, Verification, and On-Orbit Operations Support Using Worksite Analysis

    NASA Technical Reports Server (NTRS)

    Hagale, Thomas J.; Price, Larry R.

    2000-01-01

    The International Space Station (ISS) design is a very large and complex orbiting structure with thousands of Extravehicular Activity (EVA) worksites. These worksites are used to assemble and maintain the ISS. The challenge facing EVA designers was how to design, verify, and operationally support such a large number of worksites within cost and schedule. This has been solved through the practical use of computer aided design (CAD) graphical techniques that have been developed and used with a high degree of success over the past decade. The EVA design process allows analysts to work concurrently with hardware designers so that EVA equipment can be incorporated and structures configured to allow for EVA access and manipulation. Compliance with EVA requirements is strictly enforced during the design process. These techniques and procedures, coupled with neutral buoyancy underwater testing, have proven most valuable in the development, verification, and on-orbit support of planned or contingency EVA worksites.

  3. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  4. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  5. Remaining Sites Verification Package for the 100-F-26:15 Miscellaneous Pipelines Associated with the 132-F-6, 1608-F Waste Water Pumping Station, Waste Site Reclassification Form 2007-031

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. M. Dittmer

    2008-03-18

    The 100-F-26:15 waste site consisted of the remnant portions of underground process effluent and floor drain pipelines that originated at the 105-F Reactor. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.

  6. Life support and internal thermal control system design for the Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Humphries, R.; Mitchell, K.; Reuter, J.; Carrasquillo, R.; Beverly, B.

    1991-01-01

    A review of the Space Station Freedom Environmental Control and Life Support System (ECLSS) and Internal Thermal Control System (ITCS) designs, including recent changes resulting from an activity to restructure the program, is provided. The development of the Space Station Freedom ECLSS from its original form through the restructured configuration is considered, and the selection of regenerative subsystems for oxygen and water reclamation is addressed. A survey of the present ground development and verification program is given.

  7. Payload specialist station study. Part 2: CEI specifications (part 1). [space shuttles

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The performance, design, and verification specifications are established for the multifunction display system (MFDS) to be located at the payload station in the shuttle orbiter aft flight deck. The system provides the display units (with video, alphanumeric, and graphics capabilities), the associated electronics units, and the keyboards in support of the payload dedicated controls and displays concept.

  8. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger pause during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  9. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Julie Payette closes a container, part of the equipment to be carried on the SPACEHAB and mission STS-96. She and other crew members Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Valery Tokarev of Russia are at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  10. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Posing on the platform next to the SPACEHAB Logistics Double Module in the SPACEHAB Facility are the STS-96 crew (from left) Mission Specialists Dan Barry, Tamara Jernigan, Valery Tokarev of Russia, and Julie Payette; Pilot Rick Husband; Mission Specialist Ellen Ochoa; and Commander Kent Rominger. The crew is at KSC for a payload Interface Verification Test for their upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  11. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger smile for the camera during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

During a payload Interface Verification Test (IVT) for the upcoming mission to the International Space Station, Chris Jaskolka of Boeing points out a piece of equipment in the SPACEHAB module to STS-96 Commander Kent Rominger, Mission Specialist Ellen Ochoa and Pilot Rick Husband. Other crew members visiting KSC for the IVT are Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialists Dan Barry and Tamara Jernigan discuss procedures during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  14. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

During a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station, STS-96 Mission Specialists Julie Payette, Dan Barry, and Valery Tokarev of Russia look at a Sequential Shunt Unit in the SPACEHAB Facility. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  15. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (left to right) Mission Specialists Valery Tokarev, Julie Payette (holding a lithium hydroxide canister) and Dan Barry. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  16. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks over equipment during a payload Interface Verification Test for the upcoming mission to the International Space Station. From left are Commander Kent Rominger, Mission Specialists Tamara Jernigan and Valery Tokarev of Russia, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Julie Payette (backs to the camera). They are listening to Chris Jaskolka of Boeing talk about the equipment. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  17. International Space Station Environmental Control and Life Support Emergency Response Verification for Node 1

    NASA Technical Reports Server (NTRS)

    Williams, David E.

    2008-01-01

The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System comprises five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the Node 1 Emergency Response capability, which includes nominal and off-nominal FDS operation, off-nominal ACS operation, and off-nominal THC operation. These subsystems help the crew respond to an emergency cabin depressurization, a toxic spill, or a fire. The paper also discusses the detailed Node 1 ECLS Element Verification methodologies used during the Qualification phase for operation of the Node 1 Emergency Response hardware.

  18. An IBM PC-based math model for space station solar array simulation

    NASA Technical Reports Server (NTRS)

    Emanuel, E. M.

    1986-01-01

    This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual solar array on orbit electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse to sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
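The report's exact model equations are not reproduced in this summary; as an illustration of the kind of linear dc solar cell model it describes, a common piecewise-linear I-V approximation built from four of the five listed input parameters can be sketched as (names and segment choice are mine):

```python
def linear_cell_current(v, i_sc, v_oc, v_mp, i_mp):
    """Illustrative piecewise-linear solar cell I-V curve (a sketch, not the
    report's exact formulation).

    Segment 1: from the short-circuit point (0, i_sc) to the
               maximum-power point (v_mp, i_mp).
    Segment 2: from the maximum-power point down to the
               open-circuit point (v_oc, 0).
    """
    if v <= v_mp:
        return i_sc + (i_mp - i_sc) * (v / v_mp)
    return i_mp * (v_oc - v) / (v_oc - v_mp)

# Array-level current would scale with the number of parallel strings and
# voltage with the number of series cells; eclipse transients and shadowing
# are ignored here, matching the report's steady-state (full-sun) scope.
```

The fifth parameter, orbit inclination, would enter through the incident-sun geometry rather than the cell curve itself.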

  19. Seismological investigation of the National Data Centre Preparedness Exercise 2013

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Hartmann, Gernot; Ross, J. Ole; Ceranna, Lars

    2015-04-01

The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all nuclear explosions conducted on Earth, whether underground, underwater, or in the atmosphere. The verification regime of the CTBT is designed to detect any treaty violation. While data from the International Monitoring System (IMS) are collected, processed, and technically analyzed at the International Data Centre (IDC) of the CTBT Organization, National Data Centres (NDCs) of the member states provide interpretation and advice to their governments concerning suspicious detections. The NDC Preparedness Exercises (NPE) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of CTBT verification technologies. These exercises help evaluate, for example, the effectiveness of analysis procedures applied at NDCs and the quality, completeness, and usefulness of IDC products. The exercise trigger of NPE2013 is a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward Atmospheric Transport Modelling based on a fictitious release. For the waveform event, the date (4 Sept. 2013) is given and the region is communicated in a map showing the fictitious state of "Frisia" on the North Sea coast in Central Europe. The potential connection between the waveform and radionuclide evidence remains unclear to exercise participants. The verification task was to identify the waveform event and to investigate potential sources of the radionuclide findings. The final question was whether the findings are CTBT-relevant and justify a request for an On-Site Inspection (OSI) in "Frisia". The seismic event was not included in the Reviewed Event Bulletin (REB) of the IDC. The available detections from the closest seismic IMS stations led to an epicenter accuracy of about 24 km, which is not sufficient to specify the 1000 km² inspection area in case of an OSI. Using data from local stations and adjusted velocity models, the epicenter accuracy could be improved to less than 2 km, which demonstrates the crucial role of national technical means for verification tasks. The seismic NPE2013 event could be identified as induced by natural gas production in the source region; similar waveforms and spectral characteristics comparable to a set of events in the same region are clear indications. The scenario of a possible treaty violation at the location of the seismic NPE2013 event could thus be disproved.

  20. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall, however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates, further emphasizing the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance but, when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. The verification results presented here and in Part 2 should establish a reasonable benchmark from which model users and developers may pursue ongoing eta model verification strategies.
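The abstract above separates systematic (bias) from nonsystematic error in point forecasts. A minimal sketch of that decomposition for paired forecasts and observations (function and variable names are mine, not the paper's):

```python
import math

def verify_point_forecasts(fcst, obs):
    """Decompose point-forecast error into bias, RMSE, and a nonsystematic part.

    The nonsystematic component is the spread of the errors about the bias,
    so that rmse**2 == bias**2 + nonsys**2 (a standard decomposition, used
    here as an illustration of the kind of statistics the study computes).
    """
    n = len(fcst)
    errors = [f - o for f, o in zip(fcst, obs)]
    bias = sum(errors) / n                                  # systematic error
    rmse = math.sqrt(sum(e * e for e in errors) / n)        # total error
    nonsys = math.sqrt(max(rmse ** 2 - bias ** 2, 0.0))     # random component
    return bias, rmse, nonsys
```

A forecast set whose errors are all identical has purely systematic error (nonsys of zero), while errors that cancel on average are purely nonsystematic.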

  1. A comparative verification of high resolution precipitation forecasts using model output statistics

    NASA Astrophysics Data System (ADS)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution mesoscale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both the double-penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis, and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, as a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution), and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the three post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the three deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they perform only similarly to or somewhat better than precipitation forecasts from the two lower-resolution models, at least in the Netherlands.
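The fitted equations and predictors of the study are not reproduced in this abstract; as an illustration of the extended-logistic-regression idea (the threshold enters as a predictor, so probability curves for different thresholds cannot cross) and of the Brier score used to compare the models, a sketch with made-up coefficients:

```python
import math

def elr_exceedance(x, q, a=-0.5, b=1.2, c=-0.8):
    """Extended logistic regression sketch: P(precip >= q mm) from a spatial
    precipitation predictor x and threshold q. Coefficients a, b, c are
    illustrative placeholders, not fitted values from the study; including
    sqrt(q) as a predictor makes the probability monotone in q."""
    z = a + b * x + c * math.sqrt(q)
    return 1.0 / (1.0 + math.exp(-z))

def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

With a negative threshold coefficient, the exceedance probability decreases smoothly as the threshold grows, which is the non-crossing property that makes ELR attractive for multi-threshold precipitation calibration.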

  2. Acceptance test procedure for the L-070 project mechanical equipment and instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1996-04-19

This document contains the acceptance test procedure for the mechanical equipment and instrumentation installed per the L-070 Project. The specific systems to be tested are the pump controls for the 3906 Lift Station and the 350-A Lift Station. In addition, verification that signals are being received by the 300 Area Treated Effluent Disposal Facility control system is also performed.

  3. 78 FR 70499 - An Inquiry Into the Commission's Policies and Rules Regarding AM Radio Service Directional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... Antenna Performance Verification AGENCY: Federal Communications Commission. ACTION: Final rule; correction... as follows: Subpart BB--Disturbance of AM Broadcast Station Antenna Patterns * * * * * Federal...

  4. Ideas for a pattern-oriented approach towards a VERA analysis ensemble

    NASA Astrophysics Data System (ADS)

    Gorgas, T.; Dorninger, M.

    2010-09-01

For many applications in meteorology, and especially for verification purposes, it is important to have information about the uncertainties of observation and analysis data. A high quality of these "reference data" is an absolute necessity, as the uncertainties are reflected in verification measures. The VERA (Vienna Enhanced Resolution Analysis) scheme includes a sophisticated quality control tool which corrects observational data and provides an estimate of the observation uncertainty. It is crucial for meteorologically and physically reliable analysis fields. VERA is based on a variational principle and does not need any first-guess fields. It is therefore NWP-model independent and can also be used as an unbiased reference for real-time model verification. For downscaling purposes VERA uses a priori knowledge of small-scale physical processes over complex terrain, the so-called "fingerprint technique", which transfers information from data-rich to data-sparse regions. The enhanced joint D-PHASE and COPS data set forms the data base for the analysis ensemble study. For the WWRP projects D-PHASE and COPS, a joint activity was started to collect GTS and non-GTS data from the national and regional meteorological services in Central Europe for 2007. Data from more than 11,000 stations are available for high-resolution analyses. Using random numbers as perturbations for ensemble experiments is a common approach in meteorology. In most implementations, such as NWP-model ensemble systems, the focus lies on error growth and propagation on the spatial and temporal scale. When defining errors in analysis fields, we have to consider that analyses are not time dependent and that no perturbation method aimed at temporal evolution is possible. Further, the method applied should respect the two major sources of analysis errors: observation errors and analysis (interpolation) errors. With the concept of an analysis ensemble we hope to get a more detailed view of both sources of analysis errors. For the computation of the VERA ensemble members, a sample of Gaussian random perturbations is produced for each station and parameter. The standard deviation of the perturbations is based on the correction proposals of the VERA QC scheme, which provides "natural" limits for the ensemble. In order to put more emphasis on the weather situation, we aim to integrate the main synoptic field structures as weighting factors for the perturbations. Two well-established approaches are used to define these main field structures: principal component analysis and a 2D discrete wavelet transform. Results of tests concerning the implementation of this pattern-supported analysis ensemble system, and a comparison of the different approaches, are given in the presentation.
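The per-station perturbation step described above can be sketched minimally as follows; the function name, the use of the QC correction proposal as the Gaussian standard deviation, and the seeding are illustrative assumptions, not the VERA implementation:

```python
import random

def perturb_observations(obs, qc_corrections, n_members, seed=0):
    """Build analysis-ensemble inputs: for each ensemble member, add Gaussian
    noise to every station observation, with the noise standard deviation set
    by the magnitude of that station's QC correction proposal (illustrative).

    obs            -- list of station observation values for one parameter
    qc_corrections -- QC correction proposals, one per station
    n_members      -- number of ensemble members to generate
    """
    rng = random.Random(seed)  # reproducible for the sketch
    members = []
    for _ in range(n_members):
        members.append([o + rng.gauss(0.0, abs(c))
                        for o, c in zip(obs, qc_corrections)])
    return members
```

A station whose QC scheme proposed no correction stays unperturbed in every member, so the ensemble spread naturally concentrates where the observations are most suspect.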

  5. Pressure measurements in a low-density nozzle plume for code verification

    NASA Technical Reports Server (NTRS)

    Penko, Paul F.; Boyd, Iain D.; Meissner, Dana L.; Dewitt, Kenneth J.

    1991-01-01

Measurements of Pitot pressure were made in the exit plane and plume of a low-density nitrogen nozzle flow. Two numerical computer codes were used to analyze the flow: one based on continuum theory using the explicit MacCormack method, and the other on kinetic theory using the direct simulation Monte Carlo (DSMC) method. The continuum analysis was carried to the nozzle exit plane and the results were compared to the measurements. The DSMC analysis was extended into the plume of the nozzle flow and the results were compared with measurements at the exit plane and at axial stations 12, 24, and 36 mm into the near-field plume. Two experimental apparatus were used that differed in design and gave slightly different profiles of pressure measurements. The DSMC method compared well with the measurements from each apparatus at all axial stations and provided a more accurate prediction of the flow than the continuum method, verifying the validity of DSMC for such calculations.

  6. Precision Cleaning and Verification Processes Used at Marshall Space Flight Center for Critical Hardware Applications

    NASA Technical Reports Server (NTRS)

    Caruso, Salvadore V.; Cox, Jack A.; McGee, Kathleen A.

    1999-01-01

This presentation discusses Marshall Space Flight Center operations and responsibilities, which include propulsion, microgravity experiments, the International Space Station, space transportation systems, and advanced vehicle research.

  7. International interface design for Space Station Freedom - Challenges and solutions

    NASA Technical Reports Server (NTRS)

    Mayo, Richard E.; Bolton, Gordon R.; Laurini, Daniele

    1988-01-01

    The definition of interfaces for the International Space Station is discussed, with a focus on negotiations between NASA and ESA. The program organization and division of responsibilities for the Space Station are outlined; the basic features of physical and functional interfaces are described; and particular attention is given to the interface management and documentation procedures, architectural control elements, interface implementation and verification, and examples of Columbus interface solutions (including mechanical, ECLSS, thermal-control, electrical, data-management, standardized user, and software interfaces). Diagrams, drawings, graphs, and tables listing interface types are provided.

  8. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    NASA Technical Reports Server (NTRS)

    1985-01-01

The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating systems, software configuration management, and the software development environment facility.

  9. Fault management for the Space Station Freedom control center

    NASA Technical Reports Server (NTRS)

    Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet

    1992-01-01

This paper describes model-based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and amenability to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool, and an automated process working with that tool, can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
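As a toy sketch of the digraph idea (not NASA's evaluation tool): failure modes and symptoms are nodes, failure propagation is a directed edge, and a node is a candidate root cause when every observed fault indication is reachable from it. The node names below are invented for illustration.

```python
def candidate_sources(digraph, fault_indications):
    """Model-based fault isolation over a failure-propagation digraph.

    digraph           -- dict mapping each node to the nodes its failure
                         propagates to (adjacency list)
    fault_indications -- set of observed symptom nodes
    Returns the nodes from which every indication is reachable.
    """
    def reachable(start):
        # depth-first search; a node is considered to "reach" itself
        seen, stack = {start}, [start]
        while stack:
            n = stack.pop()
            for m in digraph.get(n, ()):
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        return seen

    return [n for n in digraph if fault_indications <= reachable(n)]
```

Multiple simultaneous indications narrow the candidate set quickly: a symptom pattern that only one failure mode can produce isolates that mode uniquely.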

  10. Real-time automated failure analysis for on-orbit operations

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system which is to provide real-time failure analysis support to controllers at the NASA Johnson Space Center Control Center Complex (CCC) for both Space Station and Space Shuttle on-orbit operations is described. The system employs monitored systems' models of failure behavior and model evaluation algorithms which are domain-independent. These failure models are viewed as a stepping stone to more robust algorithms operating over models of intended function. The described system is designed to meet two sets of requirements. It must provide a useful failure analysis capability enhancement to the mission controller. It must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification, and validation. The underlying technology and how it may be used to support operations is also discussed.

  11. Structural Verification of the First Orbital Wonder of the World - The Structural Testing and Analysis of the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond

    2012-01-01

The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas Towers, and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification, and assembly effort make it a truly global success story. With its on-orbit life projected to run from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been fully summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be comprehensive in summary form, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA), and the Japanese Aerospace Exploration Agency (JAXA). Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or autonomously. The primary structure of each element was assembled and verified by teams of responsible structural engineers within and among their respective agencies and agency contractors.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

In the SPACEHAB Facility, the STS-96 crew looks at equipment as part of a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Mission Specialist Ellen Ochoa (behind the opened storage cover), Commander Kent Rominger, Pilot Rick Husband (holding a lithium hydroxide canister) and Mission Specialists Dan Barry, Valery Tokarev of Russia and Julie Payette. In the background is TTI interpreter Valentina Maydell. The other crew member at KSC for the IVT is Mission Specialist Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 crew members look over equipment during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Khristal Parker, with Boeing; Mission Specialist Dan Barry, Pilot Rick Husband, Mission Specialist Tamara Jernigan, and at the far right, Mission Specialist Julie Payette. An unidentified worker is in the background. Also at KSC for the IVT are Commander Kent Rominger and Mission Specialists Ellen Ochoa and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  14. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

In the SPACEHAB Facility, (left to right) STS-96 Pilot Rick Husband and Mission Specialists Julie Payette and Ellen Ochoa work the straps on the Sequential Shunt Unit (SSU) in front of them. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for its upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  15. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev (in foreground) of the Russian Space Agency closes a container, part of the equipment that will be in the SPACEHAB module on mission STS-96. Behind Tokarev are Pilot Rick Husband (left) and Mission Specialist Dan Barry (right). Other crew members at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station are Commander Kent Rominger and Mission Specialists Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  16. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (kneeling) STS-96 Mission Specialists Julie Payette and Ellen Ochoa, Pilot Rick Husband, and (standing at right) Mission Specialist Dan Barry. At the left is James Behling, with Boeing, explaining some of the equipment that will be on board STS-96. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  17. Detection, Location, and Characterization of Hydroacoustic Signals Using Seafloor Cable Networks Offshore Japan

    NASA Astrophysics Data System (ADS)

    Suyehiro, K.; Sugioka, H.; Watanabe, T.

    2008-12-01

    The hydroacoustic monitoring component of the International Monitoring System (IMS) for CTBT (Comprehensive Nuclear-Test-Ban Treaty) verification utilizes six hydrophone stations and five seismic stations (called T-phase stations) for worldwide detection. Conspicuous signals of natural origin include those from earthquakes, volcanic eruptions, and whale calls; artificial sources include non-nuclear explosions and airgun shots. It is important for the IMS to detect and locate hydroacoustic events with sufficient accuracy, correctly characterize the signals, and identify the source. Since a number of seafloor cable networks are operated offshore of the Japanese islands, mostly facing the Pacific Ocean, to monitor regional seismicity, the data from these stations (pressure and seismic sensors) may be utilized to increase the capability of the IMS. We use these data to compare selected event parameters with those obtained by the IMS. In particular, several unconventional acoustic signals occurred in the western Pacific from 2007 to the present, which were also captured by IMS hydrophones across the Pacific. These anomalous examples, along with dynamite shots used for seismic crustal-structure studies and other natural sources, will be presented to help improve the IMS verification capabilities for the detection, location, and characterization of anomalous signals.
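    Locating a hydroacoustic event from arrival times at several stations amounts to finding the source position (and an origin time, which can be eliminated by demeaning) that best fits the observed travel times at an assumed sound speed of roughly 1.48 km/s in the SOFAR channel. A minimal grid-search sketch with a synthetic geometry (the station layout and grid are invented for illustration, not IMS data):

    ```python
    import math

    C = 1.48  # nominal SOFAR-channel sound speed, km/s

    def locate(stations, arrivals, grid=range(0, 201, 2)):
        """Grid-search the source (x, y), in km, minimizing arrival-time misfit.

        The unknown origin time is eliminated by demeaning the observed and
        predicted arrival times before comparing them.
        """
        mean_obs = sum(arrivals) / len(arrivals)
        best, best_cost = None, float("inf")
        for x in grid:
            for y in grid:
                pred = [math.hypot(x - sx, y - sy) / C for sx, sy in stations]
                mean_pred = sum(pred) / len(pred)
                cost = sum((o - mean_obs - (p - mean_pred)) ** 2
                           for o, p in zip(arrivals, pred))
                if cost < best_cost:
                    best, best_cost = (x, y), cost
        return best

    # Synthetic test: three stations, a known source, noise-free arrivals.
    stations = [(0.0, 0.0), (200.0, 0.0), (0.0, 200.0)]
    true_src = (120.0, 80.0)
    arrivals = [math.hypot(true_src[0] - sx, true_src[1] - sy) / C
                for sx, sy in stations]
    est = locate(stations, arrivals)
    print(est)
    ```

    With noise-free arrivals the misfit vanishes at the true source, so the grid search recovers it to within the grid spacing; real IMS processing is far more elaborate (bathymetry, blockage, waveform features), and this only illustrates the geometric core of the problem.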

  18. Lymph node pick up by separate stations: Option or necessity.

    PubMed

    Morgagni, Paolo; Nanni, Oriana; Carretta, Elisa; Altini, Mattia; Saragoni, Luca; Falcini, Fabio; Garcea, Domenico

    2015-05-27

    To evaluate whether lymph node pick-up by separate stations could serve as an indicator that patients received appropriate surgical treatment. One thousand two hundred and three consecutive gastric cancer patients who underwent radical resection in 7 general hospitals, and for whom no information was available on the extent of lymphatic dissection, were included in this retrospective study. Patients were divided into 2 groups: group A, where the stomach specimen was directly formalin-fixed and sent to the pathologist, and group B, where lymph nodes were picked up after surgery and fixed by separate stations. Sixty-two point three percent of group A patients had < 16 retrieved lymph nodes compared to 19.4% of group B (P < 0.0001). Group B (separate stations) patients had significantly higher survival than those in group A [46.1 mo (95%CI: 36.5-56.0) vs 27.7 mo (95%CI: 21.3-31.9); P = 0.0001], independently of T or N stage. In multivariate analysis, group A also showed a higher risk of death than group B (HR = 1.24; 95%CI: 1.05-1.46). Separate lymphatic station dissection increases the number of retrieved nodes, leads to better tumor staging, and permits verification of the surgical dissection. The number of dissected stations could potentially be used as an index to evaluate the quality of treatment received.
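    The headline comparison (62.3% of group A versus 19.4% of group B with fewer than 16 retrieved nodes, P < 0.0001) is the kind of result a two-proportion z-test produces. A minimal sketch; the group sizes below are hypothetical, since the abstract reports only the 1203-patient total:

    ```python
    import math

    def two_proportion_z(x1, n1, x2, n2):
        """Two-sided two-proportion z-test; returns (z, p_value)."""
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
        return z, p_value

    # Hypothetical split of the 1203 patients (the abstract does not give
    # the sizes of groups A and B).
    n_a, n_b = 600, 603
    x_a = round(0.623 * n_a)  # group A patients with < 16 retrieved nodes
    x_b = round(0.194 * n_b)  # group B patients with < 16 retrieved nodes

    z, p = two_proportion_z(x_a, n_a, x_b, n_b)
    print(f"z = {z:.1f}, p = {p:.1e}")
    ```

    With any plausible split of the cohort the difference is many standard errors wide, consistent with the reported P < 0.0001; the survival comparison in the abstract would instead use log-rank and Cox-model machinery.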

  19. First International Conference on Ada (R) Programming Language Applications for the NASA Space Station, volume 1

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L. (Editor)

    1986-01-01

    Topics discussed include: test and verification; environment issues; distributed Ada issues; life cycle issues; Ada in Europe; management/training issues; common Ada interface set; and run time issues.

  20. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  1. Advanced Distributed Measurements and Data Processing at the Vibro-Acoustic Test Facility, GRC Space Power Facility, Sandusky, Ohio - an Architecture and an Example

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.; Evans, Richard K.

    2009-01-01

    A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design utilizing fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.

  2. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by the National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, the HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  3. Ground Systems Development Environment (GSDE) interface requirements analysis: Operations scenarios

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Phillips, John

    1991-01-01

    This report is a preliminary assessment of the functional and data interface requirements for the link between the GSDE GS/SPF (Amdahl) and the Space Station Control Center (SSCC) and Space Station Training Facility (SSTF) Integration, Verification, and Test Environments (IVTE's). These interfaces will be involved in ground software development of both the control center and the simulation and training systems. Our understanding of the configuration management (CM) interface and the expected functional characteristics of the Amdahl-IVTE interface is described. A set of assumptions and questions that need to be considered and resolved in order to complete the interface functional and data requirements definitions is presented. The report includes a listing of information items defined to describe software configuration items in the GSDE CM system, as well as listings of standard reports of CM information and of CM-related tools in the GSDE.

  4. Real-time on-line space research laboratory environment monitoring with off-line trend and prediction analysis

    NASA Astrophysics Data System (ADS)

    Jules, Kenol; Lin, Paul P.

    2007-06-01

    With the International Space Station currently operational, a significant amount of acceleration data is being down-linked, processed, and analyzed daily on the ground on a continuous basis for the space station reduced gravity environment characterization, the vehicle design requirements verification, and science data collection. To help understand the impact of the unique spacecraft environment on the science data, an artificial intelligence monitoring system was developed, which detects in near real time any change in the reduced gravity environment that could affect the on-going experiments. Using a dynamic graphical display, the monitoring system allows science teams, at any time and any location, to see the active vibration disturbances, such as pumps, fans, compressors, crew exercise, re-boost, and extra-vehicular activities, that might impact the reduced gravity environment the experiments are exposed to. The monitoring system can detect both known and unknown vibratory disturbance activities. It can also perform trend analysis and prediction by analyzing past data over many increments (an increment usually lasts 6 months) collected onboard the station for selected disturbances. This feature can be used to monitor the health of onboard mechanical systems to detect and prevent potential system failures. The monitoring system has two operating modes: online and offline. Both near real-time on-line vibratory disturbance detection and off-line detection and trend analysis are discussed in this paper.
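    The abstract does not describe the detection algorithm itself; as a hedged illustration of one way a vibratory-disturbance detector can work, the sketch below flags windows of an acceleration trace whose RMS exceeds a multiple of a quiescent baseline (window size, threshold, and signal levels are all invented for the example):

    ```python
    import math
    import random

    def rms(window):
        return math.sqrt(sum(x * x for x in window) / len(window))

    def detect_disturbances(samples, win=64, threshold=3.0):
        """Return start indices of windows whose RMS exceeds threshold x baseline.

        The first window is assumed quiescent and supplies the baseline RMS.
        """
        baseline = rms(samples[:win])
        return [start for start in range(0, len(samples) - win + 1, win)
                if rms(samples[start:start + win]) > threshold * baseline]

    # Synthetic acceleration trace: quiet background with a burst (e.g. a pump
    # spinning up) injected mid-record.
    random.seed(1)
    quiet = [random.gauss(0.0, 1e-4) for _ in range(512)]
    burst = [random.gauss(0.0, 5e-3) for _ in range(128)]
    trace = quiet[:256] + burst + quiet[256:]
    hits = detect_disturbances(trace)
    print(hits)
    ```

    A production system would work per frequency band and match detections against known disturbance signatures (crew exercise, re-boost, and so on); this sketch shows only the thresholding core.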

  5. Verifying the operational set-up of a radionuclide air-monitoring station.

    PubMed

    Werzi, R; Padoani, F

    2007-05-01

    A worldwide radionuclide network of 80 stations, part of the International Monitoring System, was designed to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty. After installation, the stations are certified to comply with the minimum requirements laid down by the Preparatory Commission of the Comprehensive Nuclear-Test-Ban Treaty Organization. Among the several certification tests carried out at each station, the verification of the radionuclide activity concentrations is a crucial one and is based on an independent testing of the airflow rate measurement system and of the gamma detector system, as well as on the assessment of the samples collected during parallel sampling and measured at radionuclide laboratories.

  6. Space Station personal hygiene study

    NASA Technical Reports Server (NTRS)

    Prejean, Stephen E.; Booher, Cletis R.

    1986-01-01

    A personal hygiene system is currently under development for Space Station application that will provide capabilities equivalent to those found on earth. This paper addresses the study approach for specifying both primary and contingency personal hygiene systems and provisions for specified growth. Topics covered are system definition and subsystem descriptions. Subsystem interfaces are explored to determine which concurrent NASA study efforts must be monitored during future design phases to stay up-to-date on critical Space Station parameters. A design concept for a three (3) compartment personal hygiene facility is included as a baseline for planned test and verification activities.

  7. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... frequency band of interest and submitted to the Commission. (1) Co-polarized patterns in the elevation plane... than 3 meters in diameter and antennas on simple (manual) drive mounts that are operated at a fixed...

  8. Knowledge based system verification and validation as related to automation of space station subsystems: Rationale for a knowledge based system lifecycle

    NASA Technical Reports Server (NTRS)

    Richardson, Keith; Wong, Carla

    1988-01-01

    The role of verification and validation (V and V) in software has been to support and strengthen the software lifecycle and to ensure that the resultant code meets the standards of the requirements documents. Knowledge Based System (KBS) V and V should serve the same role, but the KBS lifecycle is ill-defined. The rationale for a simple form of the KBS lifecycle is presented, including accommodation of certain critical differences between KBS and conventional software development.

  9. Engineering within the assembly, verification, and integration (AIV) process in ALMA

    NASA Astrophysics Data System (ADS)

    Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene

    2010-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000 m in the Chilean Atacama desert. As part of the ALMA construction phase the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation, and verifies that functional and performance requirements are met. This paper describes the aspects related to the AIV Engineering team, its role within the 4-station AIV process, the different phases the group underwent, lessons learned, and potential areas for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment, and on the first ALMA system installations. With the first antennas arriving on site, the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise, and transition towards a more routine production process. Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered directly in this paper. It is believed that both continuous improvement and the clear definition of the AIV 4-station model were key factors in bringing the antennas into a state that is well enough characterized to smoothly start commissioning activities.

  10. International Space Station Increment-6/8 Microgravity Environment Summary Report November 2002 to April 2004

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; Reckart, Timothy

    2006-01-01

    This summary report presents the analysis results of some of the processed acceleration data measured aboard the International Space Station during the period of November 2002 to April 2004. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-6/8. However, not all of the activities during that period were analyzed, in order to keep the size of the report manageable. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments that require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification as well as in support of the International Space Station support cadre. The International Space Station Increment-6/8 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: (1) the Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low-frequency sensor (up to 1 Hz) used to characterize the quasi-steady environment for payloads and vehicle, and the High Resolution Accelerometer Package, used to characterize the vibratory environment up to 100 Hz; and (2) the Space Acceleration Measurement System, which measures vibratory acceleration data in the range of 0.01 to 400 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-6/8 from November 2002 to April 2004.
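    The frequency ranges quoted above determine which accelerometer system serves a given analysis band; a small sketch of that lookup (sensor names are abbreviated, and the lower bounds of the first two systems are assumed to extend to DC since the abstract quotes only upper limits):

    ```python
    # Frequency coverage (Hz) of the accelerometer systems named in the report.
    SENSORS = {
        "OARE (quasi-steady)": (0.0, 1.0),
        "HiRAP (vibratory)": (0.0, 100.0),
        "SAMS (vibratory)": (0.01, 400.0),
    }

    def sensors_covering(f_lo, f_hi):
        """Return the sensors whose band fully contains [f_lo, f_hi]."""
        return [name for name, (lo, hi) in SENSORS.items()
                if lo <= f_lo and f_hi <= hi]

    print(sensors_covering(0.0, 0.5))     # quasi-steady band
    print(sensors_covering(50.0, 200.0))  # only SAMS reaches 200 Hz
    ```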

  11. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, a quality technician checks the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  12. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, quality technicians check the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  13. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, quality technicians check components of the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  14. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, quality technicians check the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  15. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, a quality technician checks the control panel on hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  16. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, James Behling, with Boeing, talks about equipment for mission STS-96 during a payload Interface Verification Test (IVT). Watching are (from left) Mission Specialists Ellen Ochoa, Julie Payette and Dan Barry, and Pilot Rick Husband. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  17. International Space Station Bus Regulation With NASA Glenn Research Center Flywheel Energy Storage System Development Unit

    NASA Technical Reports Server (NTRS)

    Kascak, Peter E.; Kenny, Barbara H.; Dever, Timothy P.; Santiago, Walter; Jansen, Ralph H.

    2001-01-01

    An experimental flywheel energy storage system is described. This system is being used to develop a flywheel based replacement for the batteries on the International Space Station (ISS). Motor control algorithms which allow the flywheel to interface with a simplified model of the ISS power bus, and function similarly to the existing ISS battery system, are described. Results of controller experimental verification on a 300 W-hr flywheel are presented.
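    The control algorithms themselves are not given in the abstract; as a hedged sketch of the general idea — a flywheel converter sourcing current to hold a DC bus at its setpoint, draining stored energy as it does so — here is a toy proportional regulator. Only the 300 W-hr stored-energy figure comes from the text; every other parameter is invented:

    ```python
    def simulate_bus(v_set=120.0, steps=2000, dt=1e-3):
        """Toy DC bus regulated by a flywheel: proportional control on voltage.

        Returns (final bus voltage in V, remaining flywheel energy in J).
        """
        C = 0.05                 # bus capacitance, F (illustrative)
        kp = 50.0                # proportional gain, A/V (illustrative)
        i_load = 10.0            # constant load current, A (illustrative)
        v = v_set
        e_fly = 300.0 * 3600.0   # stored energy, J (the 300 W-hr test unit)
        for _ in range(steps):
            i_fly = kp * (v_set - v)               # discharge when the bus sags
            i_fly = max(-50.0, min(i_fly, 50.0))   # converter current limit
            v += dt * (i_fly - i_load) / C         # bus capacitor dynamics
            e_fly -= dt * v * i_fly                # energy drawn from the flywheel
        return v, e_fly

    v, e = simulate_bus()
    print(f"bus voltage {v:.2f} V, flywheel energy {e:.0f} J")
    ```

    At steady state the bus settles slightly below the setpoint (by i_load/kp volts), the usual droop of a purely proportional regulator; the actual system would also manage charge mode and flywheel speed limits.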

  18. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (from left) STS-96 Mission Specialist Julie Payette, Pilot Rick Husband and Mission Specialist Ellen Ochoa learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  19. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  20. Verification and implementation of microburst day potential index (MDPI) and wind INDEX (WINDEX) forecasting tools at Cape Canaveral Air Station

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark

    1996-01-01

    This report details the research, development, utility, verification, and transition work on wet-microburst forecasting and detection that the Applied Meteorology Unit (AMU) performed in support of ground and launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS). The unforecasted wind event of 33.5 m/s (65 knots) at the Shuttle Landing Facility on 16 August 1994 raised the issue of wet-microburst detection and forecasting. The AMU researched and analyzed the downburst wind event and determined it was a wet microburst. A program was developed for operational use on the Meteorological Interactive Data Display System (MIDDS) weather system to analyze, compute, and display equivalent potential temperature (theta-e) profiles, the microburst day potential index (MDPI), and the wind index (WINDEX) maximum wind gust value. Key microburst nowcasting signatures using the WSR-88D data were highlighted. Verification of the data sets indicated that the MDPI has good potential for alerting the duty forecaster to the potential for wet microbursts, and that the WINDEX values computed from the hourly surface data show a useful trend for the maximum gust potential. WINDEX should help fill the temporal gap between the MDPI from the last Cape Canaveral rawinsonde and the nowcasting radar data tools.
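    The MDPI compares boundary-layer equivalent potential temperature with the driest air aloft. A sketch of one common formulation — treat the layer choices and the 30 K scaling as assumptions here, since the report's operational definition is not reproduced in the abstract:

    ```python
    def mdpi(profile, ct=30.0):
        """Microburst Day Potential Index from a theta-e profile.

        profile: list of (pressure_mb, theta_e_K), surface level first.
        Computed here (an assumption, not the report's exact form) as the max
        theta-e in the lowest 150 mb minus the min theta-e aloft, over CT = 30 K.
        MDPI >= 1 suggests wet-microburst potential.
        """
        p_sfc = profile[0][0]
        low = [te for p, te in profile if p >= p_sfc - 150.0]
        aloft = [te for p, te in profile if p < p_sfc - 150.0]
        return (max(low) - min(aloft)) / ct

    # Illustrative summer sounding: moist boundary layer under dry mid-levels.
    sounding = [(1013, 355), (950, 352), (850, 340),
                (700, 325), (600, 318), (500, 330)]
    print(round(mdpi(sounding), 2))
    ```

    The large theta-e deficit aloft drives the index above 1 for this invented sounding; operationally the index would be computed from the Cape Canaveral rawinsonde, as the abstract describes.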

  1. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  2. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    NASA Technical Reports Server (NTRS)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparisons can validate the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN stations.
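
    The verification loop described above, comparing simulated consumption with meter readings and then predicted savings with realized savings, can be sketched in a few lines. This is an illustrative sketch, not ECPVER itself; the function names are hypothetical:

```python
def percent_deviation(simulated, metered):
    """Relative difference between simulated and metered consumption (%)."""
    return 100.0 * (simulated - metered) / metered

def verified_savings(baseline_metered, post_metered, predicted_savings):
    """Actual savings from a conservation measure (baseline minus
    post-retrofit metered use) and the surplus versus prediction."""
    actual = baseline_metered - post_metered
    return actual, actual - predicted_savings
```

    A validated model keeps `percent_deviation` small across seasons; the savings comparison then isolates the effect of the conservation change itself.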

  4. An Objective Verification of the North American Mesoscale Model for Kennedy Space Center and Cape Canaveral Air Force Station

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The 45th Weather Squadron (45 WS) Launch Weather Officers use the 12-km resolution North American Mesoscale (NAM) model (MesoNAM) text and graphical product forecasts extensively to support launch weather operations. However, the actual performance of the model at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) has not been measured objectively. In order to have tangible evidence of model performance, the 45 WS tasked the Applied Meteorology Unit to conduct a detailed statistical analysis of model output compared to observed values. The model products are provided to the 45 WS by ACTA, Inc. and include hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12, and 18 UTC. The objective analysis compared the MesoNAM forecast winds, temperature, and dew point, as well as the changes in these parameters over time, to the observed values from the sensors in the KSC/CCAFS wind tower network. Objective statistics will give the forecasters knowledge of the model's strengths and weaknesses, which will result in improved forecasts for operations.
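
    The record does not enumerate the statistics used, but an objective comparison of forecast against observed values of this kind typically reduces to accuracy measures such as bias (mean error) and root-mean-square error, computed per parameter and forecast hour. A hedged sketch:

```python
import math

def bias(forecast, observed):
    """Mean error: positive values mean the model forecasts too high."""
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    """Root-mean-square error: overall magnitude of forecast misses."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed))
                     / len(forecast))
```

    Bias reveals systematic drift (e.g., a persistent warm dew-point forecast), while RMSE captures scatter that bias alone hides.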

  5. The Evolution of the NASA Commercial Crew Program Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy C.

    2016-01-01

    In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been determining that a Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is being developed to use the available resources efficiently in executing the verifications.

  6. Lunar base mission technology issues and orbital demonstration requirements on space station

    NASA Technical Reports Server (NTRS)

    Llewellyn, Charles P.; Weidman, Deene J.

    1992-01-01

    The International Space Station has been the object of considerable design, redesign, and alteration since it was originally proposed in early 1984. In the intervening years the station has slowly evolved to a specific design that was thoroughly reviewed by a large agency-wide Critical Evaluation Task Force (CETF). As space station designs continue to evolve, studies must be conducted to determine the suitability of the current design for some of the primary purposes for which the station will be used. This paper concentrates on the technology requirements and issues, the on-orbit demonstration and verification program, and the space station focused support required prior to the establishment of a permanently manned lunar base as identified in the National Commission on Space report. Technology issues associated with the on-orbit assembly and processing of the lunar vehicle flight elements are also discussed.

  7. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive, and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. Using AA technology to augment system management functions requires a development model consisting of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques, and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.

  8. Some key considerations in evolving a computer system and software engineering support environment for the space station program

    NASA Technical Reports Server (NTRS)

    Mckay, C. W.; Bown, R. L.

    1985-01-01

    The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts, an integration, verification, and validation host with test bed, and distributed targets. The requirement for the early establishment and use of an appropriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the Research and Development Productivity challenges presented by the space station computing system.

  9. International Space Station Increment-2 Microgravity Environment Summary Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy

    2002-01-01

    This summary report presents the results of some of the processed acceleration data collected aboard the International Space Station during the period of May to August 2001, the Increment-2 phase of the station. Two accelerometer systems were used to measure the acceleration levels during activities that took place during the Increment-2 segment. However, not all of the activities were analyzed for this report due to time constraints and a lack of precise information regarding some payload operations and other station activities. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of vehicle microgravity requirements verification. The International Space Station Increment-2 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: 1) The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), used to characterize the quasi-steady environment for payloads and the vehicle, and the High Resolution Accelerometer Package, used to characterize the vibratory environment up to 100 Hz.
2) The Space Acceleration Measurement System, a high frequency sensor, which measures vibratory acceleration data in the range of 0.01 to 300 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-2 from May to August 20, 2001.

  10. Study of the performance of the infrasound station IS48TN during the period 2011-2017

    NASA Astrophysics Data System (ADS)

    Mejri, Chourouk

    2017-04-01

    The infrasound station IS48, in Kesra, Tunisia, is part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty. IS48 is managed and maintained by the Tunisian National Data Centre (NDC). Its favourable location in the middle of the Mediterranean region allows it to record a variety of interesting detections, and several events have been recorded and identified. Since 2010, however, the station's records have become noisy due to a leak in the pipe array, despite efforts to resolve the issue through pressure tests and maintenance. To address this issue, the Provisional Technical Secretariat (PTS) decided to upgrade the wind noise reduction system (WNRS) at IS48TN.

  11. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, LED plant growth lights are being checked out on the hardware for the Advanced Plant Habitat (APH) flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for in-orbit APH activities on the space station. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  12. Expert system verification and validation survey, delivery 4

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  13. Expert system verification and validation survey. Delivery 2: Survey results

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of the series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  14. Expert system verification and validation survey. Delivery 5: Revised

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  15. Expert system verification and validation survey. Delivery 3: Recommendations

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  16. ETV REPORT: REMOVAL OF ARSENIC IN DRINKING WATER - PALL CORPORATION MICROZA. MICROFILTRATION SYSTEM

    EPA Science Inventory

    Verification testing of the Pall Corporation Microza. Microfiltration System for arsenic removal was conducted at the Oakland County Drain Commissioner (OCDC) Plum Creek Development well station located in Oakland County, Michigan from August 19 through October 8, 2004. The sourc...

  17. Dust devil signatures in infrasound records of the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Christie, Douglas

    2015-03-01

    We explore whether dust devils have a recognizable signature in infrasound array records, since several Comprehensive Nuclear-Test-Ban Treaty verification stations conducting continuous measurements with microbarometers are in desert areas which see dust devils. The passage of dust devils (and other boundary layer vortices, whether dust laden or not) causes a local temporary drop in pressure: the high-pass time domain filtering in microbarometers results in a "heartbeat" signature, which we observe at the Warramunga station in Australia. We also observe a ~50 min pseudoperiodicity in the occurrence of these signatures and some higher-frequency infrasound. Dust devils do not significantly degrade the treaty verification capability. The pipe arrays for spatial averaging used in infrasound monitoring degrade the detection efficiency of small devils, but the long observation time may allow a useful census of large vortices, and thus, the high-sensitivity infrasonic array data from the monitoring network can be useful in studying columnar vortices in the lower atmosphere.
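
    The "heartbeat" arises because high-pass filtering turns a smooth pressure dip into a biphasic down-then-up pulse. A toy illustration using a first-order high-pass filter on a Gaussian dip; the filter constant is arbitrary and not the microbarometer's actual response:

```python
import math

def highpass(x, alpha=0.9):
    """First-order high-pass filter: y[i] = alpha * (y[i-1] + x[i] - x[i-1])."""
    y = [0.0]
    for i in range(1, len(x)):
        y.append(alpha * (y[-1] + x[i] - x[i - 1]))
    return y

# Gaussian pressure dip centered at sample 50, as from a passing vortex.
dip = [-math.exp(-((t - 50) / 10.0) ** 2) for t in range(100)]
pulse = highpass(dip)  # swings negative as the core approaches, positive after
```

    The filtered trace dips as the vortex core approaches and rebounds as it recedes, producing the two-lobed signature described above.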

  18. The Analysis of North Korea's Nuclear Tests by Turkish National Data Center

    NASA Astrophysics Data System (ADS)

    Semin, K.; Meral Ozel, N.; Destici, T. C.; Necmioglu, O.; Kocak, S.

    2013-12-01

    The Democratic People's Republic of Korea (DPRK) announced the conduct of a third underground nuclear test on 12 February 2013 in the northeastern part of the country, at the same site as the previous tests conducted in 2006 and 2009. The latest test is the best-detected nuclear event recorded by the global seismic networks, and magnitude estimates show that each new test increased in size compared with the previous one. As the Turkish NDC (National Data Center), we analyzed the 2013 and 2009 nuclear tests using seismic data from International Monitoring System (IMS) stations obtained through the International Data Centre (IDC) in Vienna. Discrimination analysis was performed based on the mb:Ms magnitude ratio and spectral analysis. We also applied array-based waveform cross-correlation to show the similarity of the nuclear tests and to obtain precise arrival-time measurements for relative location estimates, and performed basic infrasound analysis using two IMS infrasound stations for the 2013 event. Seismic analyses were performed using software such as Geotool, EP (Event Processor from NORSAR), and Seismic Analysis Code (SAC); the infrasound data were analyzed using PMCC from CEA (France). The IMS network operates under the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), whose verification system is under continuous development, making use of state-of-the-art technologies and methodologies.
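
    Waveform cross-correlation, as used above, slides one trace against another and picks the lag with the highest normalized correlation; the peak value measures event similarity and the lag gives a precise relative arrival time. A plain-Python sketch of the idea, not the Geotool or NORSAR implementation:

```python
import math

def norm_xcorr(a, b):
    """Normalized correlation coefficient of two equal-length traces."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [v - ma for v in a]
    db = [v - mb for v in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0

def best_lag(a, b, max_lag):
    """Lag (in samples) maximizing the correlation of the overlapping parts."""
    def corr_at(k):
        # Shift a by k against b and correlate the overlap.
        x, y = (a[k:], b) if k >= 0 else (a, b[-k:])
        n = min(len(x), len(y))
        return norm_xcorr(x[:n], y[:n])
    return max(range(-max_lag, max_lag + 1), key=corr_at)
```

    For near-identical sources such as repeated tests at one site, the correlation peak approaches 1 and the lag resolves arrival-time differences finer than routine phase picks.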

  19. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (left) and Commander Kent Rominger (second from right) listen to Lynn Ashby (far right), with JSC, talking about the SPACEHAB equipment in front of them during a payload Interface Verification Test (IVT). In the background behind Tokarev is TTI interpreter Valentina Maydell. Other STS-96 crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Dan Barry, Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  20. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (second from left) and Commander Kent Rominger learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. At the far left looking on is TTI interpreter Valentina Maydell. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Julie Payette. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  1. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Tamara Jernigan checks over instructions while Mission Specialist Dan Barry looks up from the Sequential Shunt Unit (SSU) in front of him to other equipment Lynn Ashby (right), with Johnson Space Center, is pointing at. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  2. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Pilot Rick Husband and Mission Specialist Ellen Ochoa (on the left) and Mission Specialist Julie Payette (on the far right) listen to Khristal Parker (second from right), with Boeing, explain the equipment in front of them. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The Sequential Shunt Unit (SSU) is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  3. Regional maximum rainfall analysis using L-moments at the Titicaca Lake drainage, Peru

    NASA Astrophysics Data System (ADS)

    Fernández-Palomino, Carlos Antonio; Lavado-Casimiro, Waldo Sven

    2017-08-01

    The present study investigates the application of the index-flood, L-moments-based regional frequency analysis procedure (RFA-LM) to the annual maximum 24-h rainfall (AM) of 33 rainfall gauge stations (RGs) to estimate rainfall quantiles at the Titicaca Lake drainage (TL). The study region was chosen because it is characterised by frequent floods that affect agricultural production and infrastructure. First, detailed quality analyses and verification of the RFA-LM assumptions were conducted. For this purpose, different tests for outlier verification, homogeneity, stationarity, and serial independence were employed. Application of the RFA-LM procedure then allowed us to consider the TL as a single, hydrologically homogeneous region in terms of its maximum rainfall frequency. That is, this region can be modelled by a generalised normal (GNO) distribution, chosen according to the Z test for goodness of fit, the L-moment (LM) ratio diagram, and an additional evaluation of the precision of the regional growth curve. Because of the low density of RGs in the TL, it was important to produce maps of the AM design quantiles estimated using RFA-LM. Therefore, the ordinary kriging (OK) interpolation technique was used. These maps will be a useful tool for hydrologists in the region to determine the different AM quantiles at any point of interest.
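
    RFA-LM begins with sample L-moments, estimated from the ordered data via Hosking's unbiased probability-weighted moments. A sketch of that first step only; the regional averaging, heterogeneity tests, and GNO fitting are omitted:

```python
def l_moments(sample):
    """First two sample L-moments plus L-skewness (t3), computed from
    the unbiased probability-weighted moments b0, b1, b2."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1 = b0                        # L-location (mean)
    l2 = 2.0 * b1 - b0             # L-scale
    l3 = 6.0 * b2 - 6.0 * b1 + b0
    return l1, l2, l3 / l2         # t3 = L-skewness ratio
```

    A symmetric sample gives t3 = 0, while positively skewed rainfall maxima give t3 > 0; the at-site ratios are then compared on the LM ratio diagram mentioned above.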

  4. International challenge to predict the impact of radioxenon releases from medical isotope production on a comprehensive nuclear test ban treaty sampling station.

    PubMed

    Eslinger, Paul W; Bowyer, Ted W; Achim, Pascal; Chai, Tianfeng; Deconninck, Benoit; Freeman, Katie; Generoso, Sylvia; Hayes, Philip; Heidmann, Verena; Hoffman, Ian; Kijima, Yuichi; Krysta, Monika; Malo, Alain; Maurer, Christian; Ngan, Fantine; Robins, Peter; Ross, J Ole; Saunier, Olivier; Schlosser, Clemens; Schöppner, Michael; Schrom, Brian T; Seibert, Petra; Stein, Ariel F; Ungar, Kurt; Yi, Jing

    2016-06-01

    The International Monitoring System (IMS) is part of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). At entry-into-force, half of the 80 radionuclide stations will be able to measure concentrations of several radioactive xenon isotopes produced in nuclear explosions, and the full network may be populated with xenon monitoring afterward. An understanding of natural and man-made radionuclide backgrounds can be used in accordance with the provisions of the treaty (such as the event screening criteria in Annex 2 to the Protocol of the Treaty) for the effective implementation of the verification regime. Fission-based production of 99Mo for medical purposes also generates nuisance radioxenon isotopes that are usually vented to the atmosphere. One way to account for the effect that emissions from medical isotope production have on radionuclide samples from the IMS is to use stack monitoring data, where available, together with atmospheric transport modeling. Recently, individuals from seven nations participated in a challenge exercise that used atmospheric transport modeling to predict the time history of 133Xe concentration measurements at the IMS radionuclide station in Germany using stack monitoring data from a medical isotope production facility in Belgium. Participants received only the stack monitoring data and used the atmospheric transport model and meteorological data of their choice. Some of the models predicted the highest measured concentrations quite well. A model comparison rank and ensemble analysis suggests that combining multiple models may provide more accurate predicted concentrations than any single model. None of the submissions based only on the stack monitoring data predicted the small measured concentrations very well.
Modeling of sources at other nuclear facilities with smaller releases than medical isotope production facilities may be important in understanding how to discriminate those releases from the releases of a nuclear explosion.
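
    The ensemble finding above, that a combination of models can outperform any single one, is often realized with something as simple as a per-sample mean across the model predictions. An illustrative sketch, not the exercise's actual ranking code:

```python
def ensemble_mean(predictions):
    """Mean predicted concentration across models at each sampling time.

    predictions -- list of per-model time series, all the same length.
    """
    return [sum(vals) / len(vals) for vals in zip(*predictions)]
```

    Averaging tends to cancel the independent errors of individual transport models while preserving the signal they agree on.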

  5. 78 FR 59029 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-25

    ... Broadcast Station Antenna Patterns. Form No.: Not applicable. Type of Review: Revision of a currently... Rules Regarding AM Radio Service Directional Antenna Performance Verification, MM Docket No. 93-177, FCC... functions as the antenna. Consequently, a nearby tower may become an unintended part of the AM antenna...

  6. Spring-Based Helmet System Support Prototype to Address Aircrew Neck Strain

    DTIC Science & Technology

    2014-06-01

    Helicopter Squadron stationed at CFB Borden ALSE Personnel Flight Engineers Pilots 4.6 Discussion of Verification Results 4.6.1 Reduce the mass on the...the participant in the pilot's posture. Figure 8. A simulation of Flight Engineers' postures during landing and low-flying maneuvers. Figure 9

  7. International Space Station Increment-3 Microgravity Environment Summary Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric; McPherson, Kevin; Reckart, Timothy; Grodsinsky, Carlos

    2002-01-01

    This summary report presents the results of some of the processed acceleration data measured aboard the International Space Station during the period of August to December 2001. Two accelerometer systems were used to measure the acceleration levels for the activities that took place during Increment-3. However, not all of the activities were analyzed for this report due to time constraints and a lack of precise timeline information regarding some payload operations and station activities. The National Aeronautics and Space Administration sponsors the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System to support microgravity science experiments which require microgravity acceleration measurements. On April 19, 2001, both the Microgravity Acceleration Measurement System and the Space Acceleration Measurement System units were launched on STS-100 from the Kennedy Space Center for installation on the International Space Station. The Microgravity Acceleration Measurement System unit was flown to the station in support of science experiments requiring quasi-steady acceleration measurements, while the Space Acceleration Measurement System unit was flown to support experiments requiring vibratory acceleration measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification. The International Space Station Increment-3 reduced gravity environment analysis presented in this report uses acceleration data collected by both sets of accelerometer systems: (1) The Microgravity Acceleration Measurement System, which consists of two sensors: the Orbital Acceleration Research Experiment Sensor Subsystem, a low frequency range sensor (up to 1 Hz), used to characterize the quasi-steady environment for payloads and vehicle, and the High Resolution Accelerometer Package, used to characterize the vibratory environment up to 100 Hz.
(2) The Space Acceleration Measurement System, a high frequency sensor, which measures vibratory acceleration data in the range of 0.01 to 400 Hz. This summary report presents analysis of some selected quasi-steady and vibratory activities measured by these accelerometers during Increment-3 from August to December 2001.

  8. Improved orbiter waste collection system study

    NASA Technical Reports Server (NTRS)

    Bastin, P. H.

    1984-01-01

    Design concepts for improved fecal waste collection both on the space shuttle orbiter and as a precursor for the space station are discussed. Inflight usage problems associated with the existing orbiter waste collection subsystem are considered. A basis was sought for the selection of an optimum waste collection system concept which may ultimately result in the development of an orbiter flight test article for concept verification and subsequent production of new flight hardware. Two concepts were selected for the orbiter and are shown in detail. Additionally, one concept selected for application to the space station is presented.

  9. Pre-Launch Risk Reduction Activities Conducted at KSC for the International Space Station

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, Paul

    2011-01-01

    In the development of any large scale space-based multi-piece assembly effort, planning must include provisions for testing and verification; not only of the individual pieces but also of the pieces together. Without such testing on the ground, the risk to cost, schedule and technical performance increases substantially. This paper will review the efforts undertaken by the International Space Station (ISS), including the International Partners, during the pre-launch phase, primarily at KSC, to reduce the risks associated with the on-orbit assembly and operation of the ISS.

  10. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Cougar, Tamara; Ulrich, BettyLynn

    2017-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies in the LAB MCA are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Finally, the Node 3 MCA is being brought to an operational configuration.

  11. Verification and Implementation of Operations Safety Controls for Flight Missions

    NASA Technical Reports Server (NTRS)

    Jones, Cheryl L.; Smalls, James R.; Carrier, Alicia S.

    2010-01-01

    Approximately eleven years ago, the International Space Station launched its first module from Russia, the Functional Cargo Block (FGB). Safety and Mission Assurance (S&MA) Operations (Ops) engineers played an integral part in that endeavor by executing strict flight product verification as well as continuously staffing S&MA's console in the Mission Evaluation Room (MER) for that flight mission. How were these engineers able to conduct such a complicated task? They did so through product verification, which consisted of ensuring that safety requirements were adequately contained in all flight products that affected crew safety. S&MA Ops engineers apply both systems engineering and project management principles in order to gain an appropriate level of technical knowledge to perform thorough reviews covering the affected subsystem(s). They also ensured that mission priorities were carried out with great attention to detail and success.

  12. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  13. Interacting with a security system: The Argus user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behrin, E.; Davis, G.E.

    1993-12-31

    In the mid-1980s the Lawrence Livermore National Laboratory (LLNL) developed the Argus Security System. Key requirements were to eliminate the telephone as a verification device for opening and closing alarm stations and to allow need-to-know access through local enrollment at alarm stations. Resulting from these requirements was an LLNL-designed user interface called the Remote Access Panel (RAP). The Argus RAP interacts with Argus field processors to allow secure station mode changes and local station enrollment, provides user direction and response, and assists station maintenance personnel. It consists of a tamper-detecting housing containing a badge reader, a keypad with sight screen, special-purpose push buttons, and a liquid-crystal display. This paper discusses Argus system concepts, RAP design, functional characteristics, and physical configurations. The paper also describes the RAP's use in access-control booths, its integration with biometrics, and its operation for multi-person-rule stations and compartmented facilities.

  14. Expert system verification and validation study. Phase 2: Requirements identification. Delivery 1: Updated survey report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of Expert Systems.

  15. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  16. Verification of voltage/frequency requirement for emergency diesel generator in nuclear power plant using dynamic modeling

    NASA Astrophysics Data System (ADS)

    Hur, Jin-Suk; Roh, Myung-Sub

    2014-02-01

    One major cause of plant shutdown is the loss of electrical power. This study examines the coping actions against station blackout, including the emergency diesel generator and the sequential loading of safety systems, and uses a modeling tool to ensure that the emergency diesel generator meets its requirements, especially the voltage and frequency criteria. The paper also considers changes to the sequencing time and load capacity, but only for finding the electrical design margin; any revision of the load list must be verified with a safety analysis. From this study, it is found that a new load calculation is a key factor in EDG localization and in increasing in-house capability.

  17. FAST Mast Structural Response to Axial Loading: Modeling and Verification

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Elliott, Kenny B.; Templeton, Justin D.; Song, Kyongchan; Rayburn, Jeffery T.

    2012-01-01

    The International Space Station's solar array wing mast shadowing problem is the focus of this paper. A building-block approach to modeling and analysis is pursued for the primary structural components of the solar array wing mast structure. Starting with an ANSYS (Registered Trademark) finite element model, a verified MSC.Nastran (Trademark) model is established for a single longeron. This finite element model translation requires the conversion of several modeling and analysis features so that the two structural analysis tools produce comparable results for the single-longeron configuration. The model is then reconciled using test data. The resulting MSC.Nastran (Trademark) model is then extended to a single-bay configuration and verified using single-bay test data. Conversion of the MSC.Nastran (Trademark) single-bay model to Abaqus (Trademark) is also performed to simulate the elastic-plastic longeron buckling response of the single bay prior to folding.

  18. An Objective Verification of the North American Mesoscale Model for Kennedy Space Center and Cape Canaveral Air Force Station

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) use the 12-km resolution North American Mesoscale (NAM) model (MesoNAM) text and graphical product forecasts extensively to support launch weather operations. However, the actual performance of the model at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) has not been measured objectively. In order to have tangible evidence of model performance, the 45 WS tasked the Applied Meteorology Unit (AMU; Bauman et al., 2004) to conduct a detailed statistical analysis of model output compared to observed values. The model products are provided to the 45 WS by ACTA, Inc. and include hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The objective analysis compared the MesoNAM forecast winds, temperature (T) and dew point (Td), as well as the changes in these parameters over time, to the observed values from the sensors in the KSC/CCAFS wind tower network shown in Table 1. These objective statistics give the forecasters knowledge of the model's strengths and weaknesses, which will result in improved forecasts for operations.
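    The two workhorse statistics in an objective point-forecast verification of this kind are bias (mean error) and root-mean-square error. A minimal sketch of both, using hypothetical forecast/observation pairs rather than actual MesoNAM or wind-tower data:

```python
import math

def bias(forecast, observed):
    """Mean error: positive means the model over-forecasts on average."""
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    """Root-mean-square error: overall magnitude of forecast error."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(forecast))

# Hypothetical hourly 2 m temperatures (deg C): model forecast vs. tower observation
fcst = [24.1, 25.3, 26.8, 27.0, 26.2]
obs = [23.5, 25.0, 26.1, 27.4, 26.0]

print(round(bias(fcst, obs), 2))
print(round(rmse(fcst, obs), 2))
```

    Stratifying these statistics by initialization time and forecast hour, as the AMU analysis does, shows where in the forecast cycle the model's errors grow.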

  19. Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program

    NASA Technical Reports Server (NTRS)

    Manobianco, John; Nutter, Paul

    1997-01-01

    The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events, such as thunderstorms, sea breezes, and cold fronts, are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically consistent, thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model forecasts a majority of cold frontal passages through east central Florida to within ±1 hour of the observed frontal passage.
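    Event verification like the cold-front result above reduces to a timing tally: a forecast passage counts as a hit if it falls within the tolerance of the observed passage. A small sketch with hypothetical passage times (hours UTC), not the AMU's actual cases:

```python
def hit_rate(forecast_times, observed_times, tolerance_h=1.0):
    """Fraction of forecast event times falling within tolerance_h of observed."""
    hits = sum(1 for f, o in zip(forecast_times, observed_times)
               if abs(f - o) <= tolerance_h)
    return hits / len(forecast_times)

fcst = [14.0, 3.5, 21.0, 9.0]  # model-predicted frontal passage times
obs = [14.5, 5.0, 20.5, 9.8]   # observed frontal passage times
print(hit_rate(fcst, obs))     # 3 of 4 within 1 h -> 0.75
```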

  20. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification of all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: the Hybrid Enrichment Verification Array (HEVA) and the Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans.
    A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community's understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.

  1. Influence of the Redundant Verification and the Non-Redundant Verification on the Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.

    2016-12-01

    In groundwater studies, hydraulic tomography (HT) is used to estimate the heterogeneous spatial distribution of hydraulic properties. Many researchers have applied HT to field-site pumping tests to invert for this distribution, demonstrating that most field-site aquifers have heterogeneous hydrogeological parameter fields. Huang et al. [2011] used a non-redundant verification analysis, in which the pumping wells change location while the observation wells remain fixed for both the inverse and forward analyses, to show the feasibility of estimating the heterogeneous hydraulic-property distribution of a field-site aquifer with a steady-state model. Prior studies, however, considered only this steady-state, non-redundant case; the various combinations of fixed or changed pumping-well locations with fixed observation wells (redundant verification) or changed observation wells (non-redundant verification) had not been explored for their influence on the HT method. In this study, both the redundant and non-redundant verification methods were carried out for the forward analysis to examine their influence on hydraulic tomography under transient conditions. The methods were applied to an actual case at the NYUST campus site to demonstrate the effectiveness of hydraulic tomography and to confirm the feasibility of the inverse and forward analyses. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward

  2. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures, along with a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria is indicated by matrices, with verification discussion, analysis, and enclosed test results.

  3. 32 CFR Table 1 to Part 855 - Purpose of Use/Verification/Approval Authority/Fees

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... change of station, etc.) or for private, non revenue flights Social security number in block 1 on DD Form... of a uniformed service member Identification card (DD Form 1173) number or social security number... Form 1173) number or social security number, identification card expiration date, sponsor's retirement...

  4. 32 CFR Table 1 to Part 855 - Purpose of Use/Verification/Approval Authority/Fees

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... change of station, etc.) or for private, non revenue flights Social security number in block 1 on DD Form... of a uniformed service member Identification card (DD Form 1173) number or social security number... Form 1173) number or social security number, identification card expiration date, sponsor's retirement...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, C. LEE COOK DIV., DOVER CORP., STATIC PAC SYSTEM, PHASE I REPORT

    EPA Science Inventory

    The Static Pac was verified at a natural gas compressor station operated by ANR Pipeline Company. The test was carried out on two engines (8-cylinder, 2000 hp), each with two reciprocating compressors operating in parallel (4 in. rods). The evaluation focused on two shutdown proc...

  6. Software architecture standard for simulation virtual machine, version 2.0

    NASA Technical Reports Server (NTRS)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  7. Development of a Rational Modeling Approach for the Design, and Optimization of the Multifiltration Unit. Volume 1

    NASA Technical Reports Server (NTRS)

    Hand, David W.; Crittenden, John C.; Ali, Anisa N.; Bulloch, John L.; Hokanson, David R.; Parrem, David L.

    1996-01-01

    This thesis includes the development and verification of an adsorption model for analysis and optimization of the adsorption processes within the International Space Station multifiltration beds. The fixed bed adsorption model includes multicomponent equilibrium and both external and intraparticle mass transfer resistances. Single solute isotherm parameters were used in the multicomponent equilibrium description to predict the competitive adsorption interactions occurring during the adsorption process. The multicomponent equilibrium description used the Fictive Component Analysis to describe adsorption in unknown background matrices. Multicomponent isotherms were used to validate the multicomponent equilibrium description. Column studies were used to develop and validate external and intraparticle mass transfer parameter correlations for compounds of interest. The fixed bed model was verified using a shower and handwash ersatz water which served as a surrogate to the actual shower and handwash wastewater.

  8. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Structural analysis and certification of the collector system are presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  9. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing, and thermal-vacuum testing.

  10. Microgravity Acceleration Measurement System (MAMS) Flight Configuration Verification and Status

    NASA Technical Reports Server (NTRS)

    Wagar, William

    2000-01-01

    The Microgravity Acceleration Measurement System (MAMS) is a precision spaceflight instrument designed to measure and characterize the microgravity environment existing in the US Lab Module of the International Space Station. Both vibratory and quasi-steady triaxial acceleration data are acquired and provided to an Ethernet data link. The MAMS Double Mid-Deck Locker (DMDL) EXPRESS Rack payload meets all the ISS IDD and ICD interface requirements as discussed in the paper which also presents flight configuration illustrations. The overall MAMS sensor and data acquisition performance and verification data are presented in addition to a discussion of the Command and Data Handling features implemented via the ISS, downlink and the GRC Telescience Center displays.

  11. International Space Station Payload Operations Integration Center (POIC) Overview

    NASA Technical Reports Server (NTRS)

    Ijames, Gayleen N.

    2012-01-01

    Objectives and goals: Maintain and operate the POIC and support integrated Space Station command and control functions. Provide software and hardware systems to support ISS payloads and the Shuttle for the POIF cadre, payload developers, and International Partners. Provide design, development, independent verification & validation, configuration, operational product/system deliveries, and maintenance of those systems for telemetry, commanding, database, and planning. Provide a backup control center for MCC-H in case of shutdown. Provide certified personnel and systems to support 24x7 facility operations per the ISS Program Payloads CoFR Implementation Plan (SSP 52054) and the MSFC Payload Operations CoFR Implementation Plan (POIF-1006).

  12. Verification of BWR Turbine Skyshine Dose with the MCNP5 Code Based on an Experiment Made at SHIMANE Nuclear Power Station

    NASA Astrophysics Data System (ADS)

    Tayama, Ryuichi; Wakasugi, Kenichi; Kawanaka, Ikunori; Kadota, Yoshinobu; Murakami, Yasuhiro

    We measured the skyshine dose from turbine buildings at Shimane Nuclear Power Station Unit 1 (NS-1) and Unit 2 (NS-2), and then compared it with the dose calculated with the Monte Carlo transport code MCNP5. The skyshine dose values calculated with the MCNP5 code agreed with the experimental data within a factor of 2.8, when the roof of the turbine building was precisely modeled. We concluded that our MCNP5 calculation was valid for BWR turbine skyshine dose evaluation.
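    A "within a factor of N" claim like the one above is usually computed as the largest multiplicative discrepancy over all calculation/measurement pairs, counting both over- and under-prediction. A sketch with hypothetical dose values, not the Shimane measurements:

```python
def agreement_factor(calculated, measured):
    """Largest multiplicative discrepancy between paired calculated and
    measured values: the max of C/M and M/C over all pairs."""
    return max(max(c / m, m / c) for c, m in zip(calculated, measured))

# Hypothetical skyshine dose-rate pairs (uSv/h) at several monitoring points
calc = [0.42, 1.10, 0.07, 0.95]
meas = [0.30, 0.80, 0.19, 1.00]
print(agreement_factor(calc, meas))  # largest ratio here is 0.19/0.07, about 2.71
```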

  13. A Review of International Space Station Habitable Element Equipment Offgassing Characteristics

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.

    2010-01-01

    Crewed spacecraft trace contaminant control employs both passive and active methods to achieve acceptable cabin atmospheric quality. Passive methods include carefully selecting materials of construction, employing clean manufacturing practices, and minimizing systems and payload operational impacts to the cabin environment. Materials selection and manufacturing processes constitute the first level of equipment offgassing control. An element-level equipment offgassing test provides preflight verification that passive controls have been successful. Offgassing test results from multiple International Space Station (ISS) habitable elements and cargo vehicles are summarized, and implications for active contamination control equipment design are discussed.

  14. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  15. Circuitbot

    DTIC Science & Technology

    2016-03-01

    constraints problem. Game rules described valid moves allowing the player to generate a memory graph, performing improved C program verification. Subject terms: Formal Verification, Static Analysis, Abstract Interpretation, Pointer Analysis, Fixpoint Iteration.

  16. Strategic planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Griner, Carolyn S.

    1990-01-01

    The concept for utilization and operations planning for the International Space Station Freedom was developed in a NASA Space Station Operations Task Force in 1986. Since that time the concept has been further refined to definitize the process and products required to integrate the needs of the international user community with the operational capabilities of the Station in its evolving configuration. The keystone to the process is the development of individual plans by the partners, with the parameters and formats common to the degree that electronic communications techniques can be effectively utilized, while maintaining the proper level and location of configuration control. The integration, evaluation, and verification of the integrated plan, called the Consolidated Operations and Utilization Plan (COUP), is being tested in a multilateral environment to prove out the parameters, interfaces, and process details necessary to produce the first COUP for Space Station in 1991. This paper will describe the concept, process, and the status of the multilateral test case.

  17. Shuttle to space station transfer of the materials exposure facility

    NASA Technical Reports Server (NTRS)

    Shannon, David T., Jr.; Klich, Phillip J.

    1995-01-01

    The Materials Exposure Facility (MEF) is being proposed by LaRC as the first long-term space materials exposure facility with real-time interaction with materials experiments in actual conditions of orbital space flight. The MEF is proposed as a Space Station external payload dedicated to technology advancement in spacecraft materials and coatings research. This paper will define a set of potential logistics for removing the MEF from the Shuttle cargo bay and the process required for transferring the MEF to a specific external payload site on Space Station Freedom (SSF). The SSF UF-2 configuration is used for this study. The kinematics and ability to successfully perform the appropriate MEF maneuvers required were verified. During completion of this work, the Space Station was redesigned and the International Space Station Alpha (ISSA) configuration evolved. The transfer procedure for SSF was valid for ISSA; however, a verification of kinematics and clearances was essential. Also, SSF and ISSA robotic interfaces with the Orbiter were different.

  18. Regression equations for estimating flood flows for the 2-, 10-, 25-, 50-, 100-, and 500-Year recurrence intervals in Connecticut

    USGS Publications Warehouse

    Ahearn, Elizabeth A.

    2004-01-01

    Multiple linear-regression equations were developed to estimate the magnitudes of floods in Connecticut for recurrence intervals ranging from 2 to 500 years. The equations can be used for nonurban, unregulated stream sites in Connecticut with drainage areas ranging from about 2 to 715 square miles. Flood-frequency data and hydrologic characteristics from 70 streamflow-gaging stations and the upstream drainage basins were used to develop the equations. The hydrologic characteristics?drainage area, mean basin elevation, and 24-hour rainfall?are used in the equations to estimate the magnitude of floods. Average standard errors of prediction for the equations are 31.8, 32.7, 34.4, 35.9, 37.6 and 45.0 percent for the 2-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals, respectively. Simplified equations using only one hydrologic characteristic?drainage area?also were developed. The regression analysis is based on generalized least-squares regression techniques. Observed flows (log-Pearson Type III analysis of the annual maximum flows) from five streamflow-gaging stations in urban basins in Connecticut were compared to flows estimated from national three-parameter and seven-parameter urban regression equations. The comparison shows that the three- and seven- parameter equations used in conjunction with the new statewide equations generally provide reasonable estimates of flood flows for urban sites in Connecticut, although a national urban flood-frequency study indicated that the three-parameter equations significantly underestimated flood flows in many regions of the country. Verification of the accuracy of the three-parameter or seven-parameter national regression equations using new data from Connecticut stations was beyond the scope of this study. A technique for calculating flood flows at streamflow-gaging stations using a weighted average also is described. 
Two estimates of flood flows (one based on the log-Pearson Type III analyses of the annual maximum flows at the gaging station, the other from the regression equation) are weighted together based on the years of record at the gaging station and the equivalent years of record determined from the regression. Weighted averages of flood flows for the 2-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals are tabulated for the 70 streamflow-gaging stations used in the regression analysis. Generally, weighted averages give the most accurate estimate of flood flows at gaging stations. An evaluation of Connecticut's streamflow-gaging network was performed to determine whether the spatial coverage and range of geographic and hydrologic conditions are adequately represented for transferring flood characteristics from gaged to ungaged sites. Fifty-one of 54 stations in the current (2004) network support one or more flood needs of federal, state, and local agencies. Twenty-five of 54 stations in the current network are considered high-priority stations by the U.S. Geological Survey because of their contribution to the long-term understanding of floods and their application to regional flood analysis. Enhancements to the network to improve overall effectiveness for regionalization can be made by increasing the spatial coverage of gaging stations, establishing stations in regions of the state that are not well represented, and adding stations in basins with drainage-area sizes not represented. Additionally, the usefulness of the network for characterizing floods can be maintained and improved by continuing operation at the current stations, because flood flows can be more accurately estimated at stations with continuous, long-term record.
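
    The weighting technique described in this abstract can be sketched in a few lines. This is a minimal illustration assuming the standard log-space weighting by years of record; the function name and all flow values are hypothetical.

```python
import math

def weighted_flood_estimate(q_gage, n_years, q_regr, eq_years):
    """Weight a gaging-station (log-Pearson Type III) flood estimate with a
    regression estimate, using years of record at the station and the
    equivalent years of record of the regression as weights in log space."""
    log_q = (n_years * math.log10(q_gage) + eq_years * math.log10(q_regr)) \
            / (n_years + eq_years)
    return 10.0 ** log_q

# Hypothetical 100-year flood estimates (cubic feet per second):
# 40 years of gaged record vs. an equivalent 10 years for the regression.
q_weighted = weighted_flood_estimate(q_gage=5000.0, n_years=40,
                                     q_regr=4000.0, eq_years=10)
```

    With four times the record length behind it, the weighted value falls much closer to the station estimate than to the regression estimate, which matches the abstract's point that weighted averages are generally the most accurate at gaged sites.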

  19. Verification of the WFAS Lightning Efficiency Map

    Treesearch

    Paul Sopko; Don Latham; Isaac Grenfell

    2007-01-01

    A Lightning Ignition Efficiency map was added to the suite of daily maps offered by the Wildland Fire Assessment System (WFAS) in 1999. This map computes a lightning probability of ignition (POI) based on the estimated fuel type, fuel depth, and 100-hour fuel moisture interpolated from the Remote Automated Weather Station (RAWS) network. An attempt to verify the...

  20. 32 CFR Table 1 to Part 855 - Purpose of Use/Verification/Approval Authority/Fees

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... change of station, etc.) or for private, non revenue flights Social security number in block 1 on DD Form... other available aircraft for official Government business travel Supervisor's endorsement in block 4 of... be listed in block 4 of the DD Form 2401. An official government document, such as an SF 1169, US...

  1. 32 CFR Table 1 to Part 855 - Purpose of Use/Verification/Approval Authority/Fees

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... change of station, etc.) or for private, non revenue flights Social security number in block 1 on DD Form... other available aircraft for official Government business travel Supervisor's endorsement in block 4 of... be listed in block 4 of the DD Form 2401. An official government document, such as an SF 1169, US...

  2. 32 CFR Table 1 to Part 855 - Purpose of Use/Verification/Approval Authority/Fees

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... change of station, etc.) or for private, non revenue flights Social security number in block 1 on DD Form... other available aircraft for official Government business travel Supervisor's endorsement in block 4 of... be listed in block 4 of the DD Form 2401. An official government document, such as an SF 1169, US...

  3. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  4. Source Characterization of Underground Explosions from Combined Regional Moment Tensor and First-Motion Analysis

    DOE PAGES

    Chiang, Andrea; Dreger, Douglas S.; Ford, Sean R.; ...

    2014-07-08

    In this study, we investigate the 14 September 1988 U.S.–Soviet Joint Verification Experiment nuclear test at the Semipalatinsk test site in eastern Kazakhstan and two nuclear explosions conducted less than 10 years later at the Chinese Lop Nor test site. These events were very sparsely recorded by stations located within 1600 km, and in each case only three or four stations were available in the regional distance range. We utilized a regional-distance seismic waveform method, fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides a unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We demonstrate through a series of jackknife tests and sensitivity analyses that the source type of the explosions is well constrained. One event, a 1996 Lop Nor shaft explosion, displays large Love waves and possibly reversed Rayleigh waves at one station, indicative of a large F-factor. We show that the combination of long-period waveforms and P-wave first motions is able to discriminate this event as explosion-like and distinct from earthquakes and collapses. We further demonstrate the behavior of network sensitivity solutions for models of tectonic release and spall-based tensile damage over a range of F-factors and K-factors.
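
    The jackknife tests mentioned in this abstract are, in essence, leave-one-station-out re-inversions. A generic sketch of the procedure follows; the toy estimator and per-station values are purely illustrative, not the authors' moment tensor inversion.

```python
def jackknife(estimator, samples):
    """Leave-one-out jackknife: re-run the estimator with each sample
    (here, each station) removed, to gauge how strongly the solution
    depends on any single observation."""
    estimates = [estimator(samples[:i] + samples[i + 1:])
                 for i in range(len(samples))]
    mean = sum(estimates) / len(estimates)
    # Jackknife variance estimate of the statistic
    var = (len(estimates) - 1) / len(estimates) \
        * sum((e - mean) ** 2 for e in estimates)
    return mean, var, estimates

# Toy estimator (mean of per-station values) over four "stations":
jk_mean, jk_var, leave_one_out = jackknife(lambda xs: sum(xs) / len(xs),
                                           [1.0, 2.0, 3.0, 4.0])
```

    A source-type solution is considered well constrained when none of the leave-one-out estimates moves far from the all-station solution.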

  5. Source Characterization of Underground Explosions from Combined Regional Moment Tensor and First-Motion Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Andrea; Dreger, Douglas S.; Ford, Sean R.

    In this study, we investigate the 14 September 1988 U.S.–Soviet Joint Verification Experiment nuclear test at the Semipalatinsk test site in eastern Kazakhstan and two nuclear explosions conducted less than 10 years later at the Chinese Lop Nor test site. These events were very sparsely recorded by stations located within 1600 km, and in each case only three or four stations were available in the regional distance range. We utilized a regional-distance seismic waveform method, fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides a unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We demonstrate through a series of jackknife tests and sensitivity analyses that the source type of the explosions is well constrained. One event, a 1996 Lop Nor shaft explosion, displays large Love waves and possibly reversed Rayleigh waves at one station, indicative of a large F-factor. We show that the combination of long-period waveforms and P-wave first motions is able to discriminate this event as explosion-like and distinct from earthquakes and collapses. We further demonstrate the behavior of network sensitivity solutions for models of tectonic release and spall-based tensile damage over a range of F-factors and K-factors.

  6. Challenges in Regional CTBT Monitoring: The Experience So Far From Vienna

    NASA Astrophysics Data System (ADS)

    Bratt, S. R.

    2001-05-01

    The verification system being established to monitor the CTBT will include an International Monitoring System (IMS) network of 321 seismic, hydroacoustic, infrasound and radionuclide stations, transmitting digital data to the International Data Centre (IDC) in Vienna, Austria over a Global Communications Infrastructure (GCI). The IDC started in February 2000 to disseminate a wide range of products based on automatic processing and interactive analysis of data from about 90 stations from the four IMS technologies. The number of events in the seismo-acoustic Reviewed Event Bulletins (REB) was 18,218 for the year 2000, with the daily number ranging from 30 to 360. Over 300 users from almost 50 Member States are now receiving an average of 18,000 data and product deliveries per month from the IDC. As the IMS network expands (40-60 new stations are scheduled to start transmitting data this year) and as GCI communications links bring increasing volumes of new data into Vienna (70 new GCI sites are currently in preparation), the monitoring capability of the IMS and IDC has the potential to improve significantly. To realize this potential, the IDC must continue to improve its capacity to exploit regional seismic data from events defined by few stations with large azimuthal gaps. During 2000, 25% of the events in the REB were defined by five or fewer stations, 48% were defined by at least one regional phase, 24% were defined by at least three, and 34% had gaps in azimuthal coverage of more than 180 degrees. The fraction of regional, sparsely detected events will only increase as new, sensitive stations come on-line and the detection threshold drops. This will be offset, to some extent, because stations within the denser network that detect near-threshold events will be at closer distances, on average.
Thus to address the challenges of regional monitoring, the IDC must integrate "tuned" station and network processing parameters for new stations; enhanced and/or new methods for estimating location, depth and uncertainty bounds; and validated, regionally-calibrated travel times, event characterization parameters and screening criteria. A new IDC program to fund research to calibrate regional seismic travel paths seeks to address, in cooperation with other national efforts, one item on this list. More effective use of the full waveform data and cross-technology synergies must be explored. All of this work must be integrated into modular software systems that can be maintained and improved over time. To motivate these regional monitoring challenges and possible improvements, the experience from the IDC will be presented via a series of illustrative, sample events. Challenges in the technical and policy arenas must be addressed as well. IMS data must first be available at the IDC before they can be analyzed. The encouraging experience to date is that the availability of data arriving via the GCI is significantly higher (~95%) than the availability (~70%) from the same stations prior to GCI installation, when they were transmitting data via other routes. Within the IDC, trade-offs must be considered between the desired levels of product quality and timeliness, and the investment in personnel and system development to support the levels sought. Another high-priority objective is to develop a policy for providing data and products to scientific and disaster alert organizations. It is clear that broader exploitation of these rich and unique assets could be of great, mutual benefit, and is, perhaps, a necessity for the CTBT verification system to achieve its potential.
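
    The azimuthal-gap statistic quoted above (34% of events had gaps of more than 180 degrees) is the largest angle between consecutive event-to-station azimuths. A minimal sketch of the computation:

```python
def azimuthal_gap(azimuths_deg):
    """Largest angular gap (degrees) between consecutive station azimuths
    as seen from the event; gaps above 180 degrees indicate one-sided
    station coverage and a poorly constrained location."""
    az = sorted(a % 360.0 for a in azimuths_deg)
    gaps = [b - a for a, b in zip(az, az[1:])]
    gaps.append(360.0 - az[-1] + az[0])  # wrap-around gap through north
    return max(gaps)

# Three stations clustered to the northeast of an event:
gap = azimuthal_gap([10.0, 45.0, 80.0])  # → 290.0
```

    An event recorded only by such a one-sided cluster has a large location uncertainty perpendicular to the station azimuths, which is why these events are the hard cases for the IDC.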

  7. Real Time On-line Space Research Laboratory Environment Monitoring with Off-line Trend and Prediction Analysis

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Lin, Paul P.

    2006-01-01

    One of the responsibilities of the NASA Glenn Principal Investigator Microgravity Services is to support NASA sponsored investigators in the area of reduced-gravity acceleration data analysis, interpretation and the monitoring of the reduced-gravity environment on-board various carriers. With the International Space Station currently operational, a significant amount of acceleration data is being down-linked and processed on the ground for both the space station onboard environment characterization (and verification) and scientific experiments. Therefore, to help principal investigator teams monitor the acceleration level on-board the International Space Station and avoid undesirable impacts on their experiments, the NASA Glenn Principal Investigator Microgravity Services developed an artificial intelligence monitoring system, which detects in near real time any change in the environment that could affect onboard experiments. The main objective of the monitoring system is to help research teams identify the vibratory disturbances that are active at any instant of time onboard the International Space Station and that might impact the environment in which their experiment is being conducted. The monitoring system allows any space research scientist, at any location and at any time, to see the current acceleration level on-board the Space Station via the World Wide Web. From the NASA Glenn Exploration Systems Division web site, research scientists can see in near real time the active disturbances, such as pumps, fans, compressors, crew exercise, re-boost, extra-vehicular activity, etc., and decide whether to continue or stop their experiments (or note such activity for later correlation with science results) based on the g-level associated with that specific event.
A dynamic graphical display accessible via the World Wide Web shows the status of all the vibratory disturbance activities with their degree of confidence as well as their g-level contribution to the environment. The system can detect both known and unknown vibratory disturbance activities. It can also perform trend analysis and prediction by analyzing past data over many Increments of the space station for selected disturbance activities. This feature can be used to monitor the health of onboard mechanical systems to detect and prevent potential system failure, as well as by research scientists during their science results analysis. Examples of both real time on-line vibratory disturbance detection and off-line trend analysis are presented in this paper. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.
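
    Learning Vector Quantization, one of the soft computing techniques listed above, classifies a measured vibration signature by its nearest labeled prototype. The following is a minimal sketch of the standard LVQ1 update rule only; the prototypes, class labels, feature vectors and learning rate are illustrative, not the system's actual configuration.

```python
def lvq1_step(prototypes, labels, x, y, lr=0.1):
    """One LVQ1 update: find the prototype nearest to sample x, then pull
    it toward x if its class label matches y, or push it away otherwise."""
    dists = [sum((p - xi) ** 2 for p, xi in zip(proto, x))
             for proto in prototypes]
    i = dists.index(min(dists))
    sign = 1.0 if labels[i] == y else -1.0
    prototypes[i] = [p + sign * lr * (xi - p)
                     for p, xi in zip(prototypes[i], x)]
    return i

# Hypothetical 2-D feature vectors for two disturbance classes:
prototypes = [[0.0, 0.0], [1.0, 1.0]]
labels = ["quiet", "pump"]
winner = lvq1_step(prototypes, labels, [0.9, 0.9], "pump")
```

    Iterating this step over labeled training samples gradually positions the prototypes so that nearest-prototype lookup classifies new acceleration signatures.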

  8. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, cranes help offload the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane. The third of three for the International Space Station, the module will be moved on a transporter to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  9. Detection capability of the IMS seismic network based on ambient seismic noise measurements

    NASA Astrophysics Data System (ADS)

    Gaebler, Peter J.; Ceranna, Lars

    2016-04-01

    All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit for a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body-wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold can be observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary seismic IMS network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability; in particular, it improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and the percentage of operational IMS stations.
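
    The threshold-monitoring idea can be illustrated with a toy calculation, assuming the standard body-wave magnitude relation mb = log10(A/T) + Q(distance, depth) and requiring detections at some minimum number of stations. All noise amplitudes, periods and Q values below are hypothetical placeholders, not IMS values.

```python
import math

def station_threshold(noise_amp_nm, period_s, q_correction, snr_min=3.0):
    """Smallest body-wave magnitude detectable at one station:
    mb = log10(A/T) + Q, with A the minimum detectable amplitude
    (snr_min times the ambient noise amplitude)."""
    return math.log10(snr_min * noise_amp_nm / period_s) + q_correction

def network_threshold(stations, k=3):
    """Magnitude above which at least k stations would detect the event."""
    thresholds = sorted(station_threshold(a, t, q) for a, t, q in stations)
    return thresholds[k - 1]

# Hypothetical (noise amplitude [nm], period [s], distance correction Q):
stations = [(0.5, 1.0, 3.4), (1.0, 1.0, 3.2), (0.3, 1.0, 3.6), (2.0, 1.0, 3.0)]
mb_min = network_threshold(stations, k=3)
```

    Quieter stations and smaller distance corrections lower the network threshold, which is why adding auxiliary stations and PKP arrivals improves the global detection capability in the study above.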

  10. A collection of articles on S/X-band experiment zero delay ranging tests, volume 1

    NASA Technical Reports Server (NTRS)

    Otoshi, T. Y. (Editor)

    1975-01-01

    Articles are presented which are concerned with the development of special test equipment and a dual-frequency zero delay device (ZDD) that were required for range tests and the measurement of ground station delays for the Mariner-Venus-Mercury 1973 S/X-band experiment. Test data obtained at DSS 14 after installation of the ZDD on the 64-m antenna are given. It is shown that large variations of range were observed as a function of antenna elevation angle and were sensitive to antenna location. A ranging calibration configuration that was subsequently developed and a technique for determining the appropriate Z-correction are described. Zero delay test data at DSS 14 during the Mariner 10 Venus-Mercury-Encounter periods (1974 days 12-150) are presented. Theoretical analysis and experimental verification of the effects of multipath and of discontinuities on range delay measurements are included. A movable subreflector technique and the multipath theory were used to isolate principal multipath errors on the 64-m antenna and to enable a more accurate determination of the actual ground station range delay.

  11. Radioxenon detections in the CTBT international monitoring system likely related to the announced nuclear test in North Korea on February 12, 2013.

    PubMed

    Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G

    2014-02-01

    Observations made in April 2013 of the radioxenon isotopes (133)Xe and (131m)Xe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of measured data with calculated isotopic ratios, as well as analysis using atmospheric transport modeling, indicates that the xenon measured was likely created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The (131m)Xe source terms for each release were calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory for a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material used in the test. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
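
    The isotopic-ratio dating behind this kind of analysis rests on the two isotopes' different half-lives (about 5.25 days for 133Xe and 11.84 days for 131m Xe, standard nuclear data). A sketch of how the activity ratio evolves; the initial ratio below is hypothetical, not the study's source term.

```python
import math

def ratio_133_to_131m(r0, t_days, t_half_133=5.25, t_half_131m=11.84):
    """133Xe decays faster than 131mXe, so their activity ratio falls
    exponentially with time; a measured ratio therefore constrains the
    elapsed time since the release."""
    lam = math.log(2.0)
    return r0 * math.exp(-lam * t_days * (1.0 / t_half_133 - 1.0 / t_half_131m))

# Example: over 50 days, a starting activity ratio of 1 decays to a few percent.
r_50d = ratio_133_to_131m(1.0, 50.0)
```

    Comparing a measured ratio against curves like this for different release times is what allows the authors to place the release 7-8 weeks after the test.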

  12. Data Acquisition System Architecture and Capabilities At NASA GRC Plum Brook Station's Space Environment Test Facilities

    NASA Technical Reports Server (NTRS)

    Evans, Richard K.; Hill, Gerald M.

    2012-01-01

    Very large space environment test facilities present unique engineering challenges in the design of facility data systems. Data systems of this scale must be versatile enough to meet the wide range of data acquisition and measurement requirements from a diverse set of customers and test programs, but also must minimize design changes to maintain reliability and serviceability. This paper presents an overview of the common architecture and capabilities of the facility data acquisition systems available at two of the world's largest space environment test facilities located at the NASA Glenn Research Center's Plum Brook Station in Sandusky, Ohio; namely, the Space Propulsion Research Facility (commonly known as the B-2 facility) and the Space Power Facility (SPF). The common architecture of the data systems is presented along with details on system scalability and efficient measurement systems analysis and verification. The architecture highlights a modular design, which utilizes fully-remotely managed components, enabling the data systems to be highly configurable and support multiple test locations with a wide-range of measurement types and very large system channel counts.

  13. Data Acquisition System Architecture and Capabilities at NASA GRC Plum Brook Station's Space Environment Test Facilities

    NASA Technical Reports Server (NTRS)

    Evans, Richard K.; Hill, Gerald M.

    2014-01-01

    Very large space environment test facilities present unique engineering challenges in the design of facility data systems. Data systems of this scale must be versatile enough to meet the wide range of data acquisition and measurement requirements from a diverse set of customers and test programs, but also must minimize design changes to maintain reliability and serviceability. This paper presents an overview of the common architecture and capabilities of the facility data acquisition systems available at two of the world's largest space environment test facilities located at the NASA Glenn Research Center's Plum Brook Station in Sandusky, Ohio; namely, the Space Propulsion Research Facility (commonly known as the B-2 facility) and the Space Power Facility (SPF). The common architecture of the data systems is presented along with details on system scalability and efficient measurement systems analysis and verification. The architecture highlights a modular design, which utilizes fully-remotely managed components, enabling the data systems to be highly configurable and support multiple test locations with a wide-range of measurement types and very large system channel counts.

  14. The modeling of attraction characteristics regarding passenger flow in urban rail transit network based on field theory

    PubMed Central

    Jia, Limin

    2017-01-01

    Aimed at the complicated problem of attraction characteristics of passenger flow in an urban rail transit network, the concept of a passenger flow gravity field is proposed in this paper. We establish computation methods for field strength and potential energy to reveal the potential attraction relationships among stations from the perspective of the collection and distribution of passenger flow and the topology of the network. For the field strength computation, an optimum-path concept is proposed to define the betweenness centrality parameter. For the potential energy computation, the composite Simpson's rule is applied to evaluate the integral. Taking Beijing Subway Line No. 10 as a practical example, a simulation and verification analysis is conducted, with the following results. First, the larger the field strength between two stations, the stronger the passenger flow attraction and the greater the probability that the largest sectional passenger flow forms there. Second, passenger flow volume and circulation capacity are greatest between two zones of high potential energy. PMID:28863175
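
    The composite (compound) Simpson's rule used for the potential-energy integral is a standard quadrature; a minimal sketch, independent of the paper's particular integrand:

```python
def composite_simpson(f, a, b, n):
    """Composite Simpson's rule on n subintervals (n must be even):
    integral ≈ h/3 * [f(a) + 4*(odd nodes) + 2*(even nodes) + f(b)]."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4.0 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2.0 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3.0

# Simpson's rule is exact for cubics: the integral of x**3 over [0, 2] is 4.
area = composite_simpson(lambda x: x ** 3, 0.0, 2.0, n=4)
```

    The rule's fourth-order accuracy makes it a common choice when the integrand (here, the field-strength function along a path) is smooth but has no closed-form antiderivative.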

  15. Radiosonde and satellite observations of topographic flow off the Norwegian coast

    NASA Astrophysics Data System (ADS)

    Rugaard Furevik, Birgitte; Dagestad, Knut-Frode; Olafsson, Haraldur

    2015-04-01

    Winds in Norway are strongly affected by the complex topography, and in some areas the average wind speed in the fjords may exceed that on the coast. Such effects are revealed through a statistical analysis of wind speeds derived from ~8500 Synthetic Aperture Radar (SAR) scenes covering the Norwegian coast. We have compared the results with modelled winds from the operational atmosphere model at MET (horizontal grid spacing of 2.5 km) and 3 years of measurements from "M/S Trollfjord", a ferry traversing a 2400 km coastal route between the cities of Bergen and Kirkenes. The analysis reveals many coastal details of the wind field not observed from the meteorological station network of Norway. The data set proves useful for verification of offshore winds in the model. High temporal resolution radiosonde winds from two locations are used to analyse the topographic effects.

  16. Architectural design for space tourism

    NASA Astrophysics Data System (ADS)

    Martinez, Vera

    2009-01-01

    The paper describes the main issues in designing an appropriately planned habitat for tourists in space. A study and analysis of the environments of existing space stations (ISS, Mir, Skylab) delineates positive and negative aspects of their architectural design. The features of architectural design for tourist needs are analyzed, and their suitability for space habitat design is verified. A space tourism environment must offer a high degree of comfort and suggest correct behavior to the tourists, for the single person as well as for the group. Two main aspects of architectural planning will be needed: the design of the private sphere and the design of the public sphere. To define the appearance of the environment, attention should be paid to main elements such as the materiality of the surfaces used, the main shapes of areas, and the degree of flexibility and adaptability of the environment to specific needs.

  17. The relationship between extreme precipitation events and landslides distributions in 2009 in Lower Austria

    NASA Astrophysics Data System (ADS)

    Katzensteiner, H.; Bell, R.; Petschko, H.; Glade, T.

    2012-04-01

    The prediction and forecasting of widespread landsliding for a given triggering event is an open research question. Numerous studies have tried to link spatial rainfall and landslide distributions. This study focuses on analysing the relationship between intensive precipitation and rainfall-triggered shallow landslides in the year 2009 in Lower Austria. Landslide distributions were obtained from the building ground register, which is maintained by the Geological Survey of Lower Austria and contains detailed information on landslides registered through damage reports. Spatially distributed rainfall estimates were extracted from the INCA (Integrated Nowcasting through Comprehensive Analysis) precipitation analysis, a combination of station data interpolation and radar data at a spatial resolution of 1 km developed by the Central Institute for Meteorology and Geodynamics (ZAMG), Vienna, Austria. The importance of the data source is shown by comparing rainfall data based on reference gauges, spatial interpolation and the INCA analysis for a certain storm period. INCA precipitation data can detect precipitating cells that do not hit a station but might trigger a landslide, which is an advantage over using reference stations for the definition of rainfall thresholds. Empirical thresholds at regional scale were determined based on rainfall intensity and duration in the year 2009 and the landslide information. These thresholds depend on the criteria that separate landslide-triggering from non-triggering precipitation events; different approaches for defining thresholds alter the shape of the threshold as well. A preliminary threshold I = 8.8263 * D^(-0.672) for extreme rainfall events in summer in Lower Austria was defined. Verification of the threshold with similar events from other years, as well as further analyses based on a larger landslide database, is in progress.
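
    The reported intensity-duration threshold can be wrapped in a small helper for classifying rainfall events. The abstract does not state the units of I and D, so mm/h and hours below are assumptions for illustration only.

```python
def threshold_intensity(duration_h):
    """Intensity-duration threshold from the abstract:
    I = 8.8263 * D**(-0.672)  (units assumed: I in mm/h, D in hours)."""
    return 8.8263 * duration_h ** -0.672

def exceeds_threshold(intensity_mm_h, duration_h):
    """True if a rainfall event lies above the landslide-triggering line."""
    return intensity_mm_h >= threshold_intensity(duration_h)

# A short, intense burst vs. a long drizzle (hypothetical events):
short_burst = exceeds_threshold(10.0, 1.0)   # 10 mm/h for 1 h
long_drizzle = exceeds_threshold(1.0, 10.0)  # 1 mm/h for 10 h
```

    The negative exponent means the tolerable intensity falls as duration grows, which is the usual shape of empirical landslide-triggering thresholds.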

  18. Spacelab, Spacehab, and Space Station Freedom payload interface projects

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    1992-01-01

    Contributions were made to several projects. Howard Nguyen was assisted in developing the Space Station RPS (Rack Power Supply). The RPS is a computer controlled power supply that helps test equipment used for experiments before the equipment is installed on Space Station Freedom. Ron Bennett of General Electric Government Services was assisted in the design and analysis of the Standard Interface Rack Controller hardware and software. An analysis was made of the GPIB (General Purpose Interface Bus), looking for any potential problems while transmitting data across the bus, such as the interaction of the bus controller with a data talker and its listeners. An analysis was made of GPIB bus communications in general, including any negative impact the bus may have on transmitting data back to Earth. A study was made of transmitting digital data back to Earth over a video channel. A report was written about the study and a revised version of the report will be submitted for publication. Work was started on the design of a PC/AT compatible circuit board that will combine digital data with a video signal. Another PC/AT compatible circuit board is being designed to recover the digital data from the video signal. A proposal was submitted to support the continued development of the interface boards after the author returns to Memphis State University in the fall. A study was also made of storing circuit board design software and data on the hard disk server of a LAN (Local Area Network) that connects several IBM style PCs. A report was written that makes several recommendations. A preliminary design review was started of the AIVS (Automatic Interface Verification System). The summer was over before any significant contribution could be made to this project.

  19. Engine condition monitoring: CF6 family 60's through the 80's

    NASA Technical Reports Server (NTRS)

    Kent, H. J.; Dienger, G.

    1981-01-01

    The on-condition program is described in terms of its effectiveness as a maintenance tool, both at the line station and at home base, through the early detection of engine faults and erroneous instrumentation signals and through the verification of engine health. The system encompasses all known methods, from manual procedures to the fully automated airborne integrated data system.

  20. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  1. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  2. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  3. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  4. Selected hydrologic and climatologic data from the Prairie Dog Creek basin, southeastern Montana, water year 1980

    USGS Publications Warehouse

    Cary, L.E.; Johnson, J.D.

    1982-01-01

    Hydrologic and climatologic data are being collected in a 25-square-mile (65-square-kilometer) basin in southeastern Montana to provide a base for development, calibration, and verification of a precipitation-runoff model. The study area and data-collection stations within the area are shown on a map. A summary of data collected at each station during the second year, beginning in October 1979, is provided in tables. The data include precipitation, snow depth and water content, air temperature, relative humidity, wind speed and direction, solar radiation, soil temperature and moisture, stream discharge, chemical analyses of water, and suspended sediment. (USGS)

  5. Selected hydrologic and climatologic data from the Prairie Dog Creek Basin, southeastern Montana, water year 1979

    USGS Publications Warehouse

    Cary, Lawrence E.; Johnson, Joel D.

    1981-01-01

    Hydrologic and climatologic data are being collected in a 19-square-mile (49-square-kilometer) basin in southeastern Montana to provide a base for development, calibration, and verification of a precipitation-runoff model. The study area and data-collection stations within the area are shown on a map. A summary of data collected at each station during the first year, beginning in October 1978, is provided in tables. The data include precipitation, snow depth and water content, air temperature, relative humidity, wind run, solar radiation, soil temperature and moisture, stream discharge, chemical analyses of water, and suspended sediment. (USGS)

  6. International Space Station Major Constituent Analyzer On-orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  7. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detector with a remote-control interface was studied, and an automatic verification system for it was developed. By using Extensible Markup Language (XML) to build the protocol instruction set and the data-analysis method database, the system software achieves a configurable design and accommodates the diversity of unpublished device interfaces and protocols. A dynamic error-compensation method, using a signal generator cascaded with a fixed attenuator, replaces the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results from the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor demanded of test personnel.
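    The XML-based instruction set described above can be illustrated with a minimal sketch. The schema, device model, and commands below are invented for illustration (not the system's actual protocol database): each instrument's commands are stored as XML templates and resolved at run time, so a new device protocol is added as data rather than code.

```python
# Hypothetical sketch: an XML instruction set mapping device models to
# command templates, resolved at run time with Python's stdlib ElementTree.
# All names and commands here are assumptions for illustration.
import xml.etree.ElementTree as ET

PROTOCOLS = ET.fromstring("""
<protocols>
  <device model="UT-350">
    <command name="set_gain">GAIN {db:.1f}</command>
    <command name="read_echo">ECHO?</command>
  </device>
</protocols>
""")

def build_command(model, name, **params):
    """Look up a device's command template and fill in its parameters."""
    dev = PROTOCOLS.find(f"device[@model='{model}']")
    cmd = dev.find(f"command[@name='{name}']")
    return cmd.text.format(**params)

print(build_command("UT-350", "set_gain", db=42.5))
```

    Supporting a new instrument then means adding another `<device>` element rather than changing the verification software itself.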

  8. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  9. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus "Beluga" air cargo plane, The Super Transporter, lands at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  10. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus "Beluga" air cargo plane, The Super Transporter, arrives at KSC's Shuttle Landing Facility from the factory of Alenia Aerospazio in Turin, Italy. Its cargo is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  11. The NASA Commercial Crew Program (CCP) Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy

    2016-01-01

    In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been how to determine that a commercial provider's transportation system complies with Programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is being developed to efficiently use the available resources to execute the verifications. This paper describes the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics covered include a short history of the CCP; the development of the Programmatic mission assurance requirements; the current safety review process; and a description of the RBA process and its products, ending with a description of the Shared Assurance Model.

  12. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
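    For readers unfamiliar with the MMS baseline that EVA is compared against, a minimal sketch of the idea follows. The toy operator (a 1D second derivative, with the manufactured solution u(x) = sin x) and the function names are illustrative assumptions, not taken from the paper; real MMS verification targets the full governing equations of the code under test.

```python
# Sketch of Method of Manufactured Solutions (MMS)-style code verification:
# choose an exact solution, evaluate the discrete operator against it, and
# confirm that the error converges at the scheme's formal order.
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed order of accuracy from errors at two grid spacings."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

def max_error(n):
    """Max error of a central-difference u'' for u(x) = sin(x) on [0, pi]."""
    h = math.pi / n
    err = 0.0
    for i in range(1, n):
        x = i * h
        approx = (math.sin(x - h) - 2.0 * math.sin(x) + math.sin(x + h)) / h**2
        err = max(err, abs(approx - (-math.sin(x))))  # exact u'' is -sin(x)
    return err

e1, e2 = max_error(64), max_error(128)
p = observed_order(e1, e2)  # should approach 2 for this second-order scheme
print(round(p, 2))
```

    EVA, as the abstract notes, achieves the same kind of order-of-accuracy evidence without adding MMS-style source terms to the code.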

  13. Verification of a Viscous Computational Aeroacoustics Code Using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  14. Wall-to-wall Landsat TM classifications for Georgia in support of SAFIS using FIA plots for training and verification

    Treesearch

    William H. Cooke; Andrew J. Hartsell

    2000-01-01

    Wall-to-wall Landsat TM classification efforts in Georgia require field validation. Validation using FIA data was tested by developing a new crown modeling procedure. A methodology is under development at the Southern Research Station to model crown diameter using Forest Health Monitoring data. These models are used to simulate the proportion of tree crowns that...

  15. Verification of Design and Construction Techniques for Gaillard Island Dredged Material Disposal Area, Mobile, Alabama.

    DTIC Science & Technology

    1986-08-01

    the dike centerline. Small pen-lite flashlight bulbs and dry cell batteries were taped to the bamboo poles so the spill and haul barges could work at... [The remainder of this excerpt is unrecoverable figure residue: dike cross-section profiles for Stations 30+00 and 35+00.]

  16. Study of plate-fin heat exchanger and cold plate for the active thermal control system of Space Station

    NASA Technical Reports Server (NTRS)

    Chyu, Ming-C.

    1992-01-01

    Plate-fin heat exchangers will be employed in the Active Thermal Control System of Space Station Freedom. During ground testing of prototypic heat exchangers, certain anomalous behaviors were observed. A diagnosis was conducted to determine the cause of the observed behaviors, including a scrutiny of temperature, pressure, and flow-rate test data, together with verification calculations based on those data and on additional data collected during the ambient and thermal/vacuum tests in which the author participated. The test data of a plate-fin cold plate were also analyzed. Recommendations were made regarding further tests to provide more useful information on cold-plate performance.

  17. International Space Station United States Laboratory Module Water Recovery Management Subsystem Verification from Flight 5A to Stage ULF2

    NASA Technical Reports Server (NTRS)

    Williams, David E.; Labuda, Laura

    2009-01-01

    The International Space Station (ISS) Environmental Control and Life Support (ECLS) system comprises seven subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), Vacuum System (VS), Water Recovery and Management (WRM), and Waste Management (WM). This paper provides a summary of the nominal operation of the United States (U.S.) Laboratory Module WRM design and details the verification methodologies utilized during the Qualification phase of the U.S. Laboratory Module prior to launch, as well as the Qualification of all of the modification kits added to it from Flight 5A up to and including Stage ULF2.

  18. The environment workbench: A design tool for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Rankin, Thomas V.; Wilcox, Katherine G.; Roche, James C.

    1991-01-01

    The environment workbench (EWB) is being developed for NASA by S-CUBED to provide a standard tool that can be used by the Space Station Freedom (SSF) design and user community for requirements verification. The desktop tool will predict and analyze the interactions of SSF with its natural and self-generated environments. A brief review of the EWB design and capabilities is presented. Calculations using a prototype EWB of the on-orbit floating potentials and contaminant environment of SSF are also presented. Both the positive and negative grounding configurations for the solar arrays are examined to demonstrate the capability of the EWB to provide quick estimates of environments, interactions, and system effects.

  19. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Thoresen, Souzan; Wiedemann, Rachel; Matty, Chris

    2015-01-01

    The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. Improvements to ion pump operation and ion source tuning have improved lifetime performance of the current ORU 02 design. The most recent ORU 02 analyzer assemblies, as well as ORU 08, have operated nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance and logistical support. Monitoring several key parameters provides the capacity to monitor ORU health and properly anticipate end of life.

  20. Use of Human Modeling Simulation Software in the Task Analysis of the Environmental Control and Life Support System Component Installation Procedures

    NASA Technical Reports Server (NTRS)

    Estes, Samantha; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    Virtual reality and simulation applications are becoming widespread in human task analysis. These programs have many benefits for the Human Factors Engineering field. Not only do creating and using virtual environments for human engineering analyses save money and time, this approach also promotes user experimentation and provides increased quality of analyses. This paper explains the human engineering task analysis performed on the Environmental Control and Life Support System (ECLSS) space station rack and its Distillation Assembly (DA) subsystem using EAI's human modeling simulation software, Jack. When installed on the International Space Station (ISS), ECLSS will provide the life and environment support needed to adequately sustain crew life. The DA is an Orbital Replaceable Unit (ORU) that provides means of wastewater (primarily urine from flight crew and experimental animals) reclamation. Jack was used to create a model of the weightless environment of the ISS Node 3, where the ECLSS is housed. Computer aided drawings of the ECLSS rack and DA system were also brought into the environment. Anthropometric models of a 95th percentile male and 5th percentile female were used to examine the human interfaces encountered during various ECLSS and DA tasks. The results of the task analyses were used in suggesting modifications to hardware and crew task procedures to improve accessibility, conserve crew time, and add convenience for the crew. This paper will address some of those suggested modifications and the method of presenting final analyses for requirements verification.

  1. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus "Beluga" air cargo plane, The Super Transporter, taxis onto the parking apron at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  2. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, workers in cherry pickers (right) help guide offloading of the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus '''Beluga''' air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  3. Space station dynamics, attitude control and momentum management

    NASA Technical Reports Server (NTRS)

    Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi

    1989-01-01

    The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development and functional verification of GN and C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS) and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.

  4. KSC01pp0234

    NASA Image and Video Library

    2001-02-01

    An Airbus “Beluga” air cargo plane, The Super Transporter, taxis onto the parking apron at KSC’s Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency’s Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  5. Space Station Freedom avionics technology

    NASA Technical Reports Server (NTRS)

    Edwards, A.

    1990-01-01

    The Space Station Freedom Program (SSFP) encompasses the design, development, test, evaluation, verification, launch, assembly, and operation and utilization of a set of spacecraft in low Earth orbit (LEO) and their supporting facilities. The spacecraft set includes: the Space Station Manned Base (SSMB); a European Space Agency (ESA)-provided Man-Tended Free Flyer (MTFF) at an inclination of 28.5 degrees and a nominal altitude of 410 km; a USA-provided Polar Orbiting Platform (POP); and an ESA-provided POP, in sun-synchronous, near-polar orbits at a nominal altitude of 822 km. The SSMB will be assembled using the National Space Transportation System (NSTS). The POPs and the MTFF will be launched by Expendable Launch Vehicles (ELVs): a Titan 4 for the US POP and an Ariane for the ESA POP and MTFF. The US POP will for the most part use derivatives of systems flown on unmanned LEO spacecraft. The SSMB portion of the overall program is presented.

  6. Deformation and Flexibility Equations for ARIS Umbilicals Idealized as Planar Elastica

    NASA Technical Reports Server (NTRS)

    Hampton, R. David; Leamy, Michael J.; Bryant, Paul J.; Quraishi, Naveed

    2005-01-01

    The International Space Station relies on the active rack isolation system (ARIS) as the central component of an integrated, stationwide strategy to isolate microgravity space-science experiments. ARIS uses electromechanical actuators to isolate an International Standard Payload Rack from disturbances due to the motion of the Space Station. Disturbances to microgravity experiments on ARIS-isolated racks are transmitted primarily via the ARIS power and vacuum umbilicals. Experimental tests indicate that these umbilicals resonate at frequencies outside the ARIS controller's bandwidth at levels of potential concern for certain microgravity experiments. Reduction in the umbilical resonant frequencies could help to address this issue. This work documents the development and verification of equations for the in-plane deflections and flexibilities of an idealized umbilical (thin, flexible, inextensible, cantilever beam) under end-point, in-plane loading (inclined force and moment). The effect of gravity is neglected due to the on-orbit application. The analysis assumes an initially curved (not necessarily circular), cantilevered umbilical with uniform cross-section, which undergoes large deflections with no plastic deformation, such that the umbilical slope changes monotonically. The treatment is applicable to the ARIS power and vacuum umbilicals under the indicated assumptions.
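    Under the stated assumptions (inextensible, uniform, initially curved cantilever in the plane), the governing relations of such an elastica are typically written as below. This is a hedged sketch of the standard Euler-Bernoulli/elastica formulation, not the paper's exact notation or equations.

```latex
% theta(s): slope at arclength s; theta_0(s): unstressed (initially curved)
% shape; EI: bending stiffness; M(s): internal moment from an end force F
% inclined at angle alpha and an end moment M_L applied at the tip (x_L, y_L).
\begin{align*}
  EI\left(\frac{d\theta}{ds} - \frac{d\theta_0}{ds}\right) &= M(s),\\
  M(s) &= M_L + F\bigl[(x_L - x(s))\sin\alpha - (y_L - y(s))\cos\alpha\bigr],\\
  \frac{dx}{ds} = \cos\theta, \qquad \frac{dy}{ds} &= \sin\theta .
\end{align*}
```

    The flexibility expressions of the paper follow from integrating these relations along the umbilical's arclength.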

  7. Verification and Validation of the Coastal Modeling System. Report 2: CMS-Wave

    DTIC Science & Technology

    2011-12-01

    Figure 44. Offshore bathymetry showing NDBC and CDIP buoy locations. Figure 45. CMS-Wave modeling domain...the four measurement stations. During the same time intervals, offshore wave information was available from a Coastal Data Information Program (CDIP) buoy...were conducted with a grid of 236 × 398 cells with variable cell spacing of 30 to 200 m (see Figure 28). Directional wave spectra from CDIP 036 served

  8. The Seismic component of the IBERARRAY: Placing constraints on the Lithosphere and Mantle.

    NASA Astrophysics Data System (ADS)

    Carbonell, R.; Diaz, J.; Villaseñor, A.; Gallart, J.; Morales, J.; Pazos, A.; Cordoba, D.; Pulgar, J.; Garcia-Lobon, J.; Harnafi, M.

    2008-12-01

    TOPOIBERIA is a multidisciplinary, large-scale research project which aims to study the links between deep and surface processes within the Iberian Peninsula. One of its main experimental components is the deployment of the IBERARRAY seismic network. This is a dense array (60x60 km) of new-generation dataloggers equipped with broad-band seismometers, which will cover Iberia and northern Morocco in three successive deployments, each lasting about 18 months. The first leg, deployed since late 2007, covers the southern part of Iberia (35 stations) and northern Morocco (20 stations). Two data centers have been established, one at the CSIC-Institute of Earth Sciences (CSIC-Barcelona) and a second at the Geological and Mining Institute (IGME-Madrid); the data follow a standard flow from recovery to archival. The field teams recover the recorded hard disks in the field and send data and metadata to a processing center, where raw data are collected and stored and quality-control checking is performed. This includes a systematic inspection of the experimental parameters (battery charge, thermal insulation, time adjustments, geophone leveling, etc.), visual verification of the seismic waveforms, and analysis, using power spectral density (PSD), of the noise level of each station. All this information is disseminated among the research teams involved in the project using a dedicated website, and the continuous seismic data are made accessible through FTP and CWQ servers. Some of the nodes of the theoretical network are covered by permanent stations of the national broad-band network (IGN) or other networks operating in the region (IAG-UGR, ROA). Data from those stations will also be integrated into the Iberarray database. The Iberarray network will provide a large database of both waveforms and catalogued events, with an unprecedented resolution. Earthquake data at local, regional, and teleseismic scales will be analyzed using different methodologies. The first result would be an increase in the accuracy of the location of regional seismicity and the determination of focal mechanisms. Special emphasis will be placed on seismic tomographic techniques using travel times and waveforms of P and S arrivals at different scales, on surface waves using dispersion measurements, and on studies dealing with background/environmental noise. In addition, receiver-function analysis for seismic imaging of deep lithospheric features and splitting analysis of shear-wave arrivals will also be developed.
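    The PSD-based station noise check described above can be sketched as follows. The naive periodogram, synthetic signal, frequency band, and threshold are all assumptions for illustration; production QC of this kind works on instrument-corrected data with Welch-averaged spectra (e.g. McNamara-Buland PSD/PDF analysis).

```python
# Illustrative sketch of a PSD-based station noise-level check: estimate a
# power spectrum of a waveform segment and flag the station if mean power in
# a band of interest exceeds a threshold. Stdlib-only, O(n^2) DFT for brevity.
import cmath, math, random

def periodogram(x, fs):
    """Naive one-sided periodogram: (frequency, power) pairs up to Nyquist."""
    n = len(x)
    out = []
    for k in range(n // 2 + 1):
        s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        out.append((k * fs / n, (abs(s) ** 2) / (fs * n)))
    return out

random.seed(0)
fs = 20.0                                             # samples/s (assumed)
sig = [random.gauss(0.0, 1.0) for _ in range(256)]    # synthetic noise record
spec = periodogram(sig, fs)
band = [power for f, power in spec if 0.5 <= f <= 5.0]  # band of interest
noisy = (sum(band) / len(band)) > 10.0                  # arbitrary QC threshold
print(noisy)
```

    A station repeatedly flagged in this way would trigger a field check of the parameters the abstract lists (battery charge, insulation, leveling, timing).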

  9. Physics Verification Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
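    The "solution or calculation verification" step can be illustrated with a small sketch: estimating the discretization error of a computed quantity via Richardson extrapolation from three systematically refined grids. The sample values below are synthetic stand-ins with manufactured second-order behavior, not outputs of any ASC code.

```python
# Sketch of solution (calculation) verification by Richardson extrapolation:
# from results on three grids refined by a constant ratio r, recover the
# observed order of accuracy and an error-corrected estimate of the quantity.
import math

def richardson(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order p and extrapolated value from three grid levels."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    return p, f_exact

# Synthetic second-order behavior: f(h) = 1 + 0.5*h**2, grids h = 0.04, 0.02, 0.01
p, f_est = richardson(1.0 + 0.5 * 0.04**2, 1.0 + 0.5 * 0.02**2, 1.0 + 0.5 * 0.01**2)
print(round(p, 2), round(f_est, 6))
```

    The gap between `f_est` and the finest-grid value is one common basis for the discretization-uncertainty estimates the project maintains.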

  10. EMC: Verification

    Science.gov Websites

    Verification pages for NCEP operational models (NAM, GFS, RAP, HRRR, HIRESW, SREF mean, and international global models) against HPC analyses; precipitation skill scores, 1995-present, for NAM, GFS, NAM CONUS nest, and international models; EMC forecast verification statistics for NAM; real-time verification of NCEP operational models against observations.
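    Skill scores like the precipitation scores referenced above are typically computed from a 2x2 contingency table of forecast versus observed events. A generic sketch of one common choice, the equitable threat score (Gilbert skill score), follows; the sample counts are invented, and this is not claimed to be EMC's exact formulation.

```python
# Equitable threat score (ETS): threat score corrected for hits expected by
# chance. Inputs are the four cells of a forecast/observation contingency table.
def equitable_threat_score(hits, false_alarms, misses, correct_negatives):
    total = hits + false_alarms + misses + correct_negatives
    hits_random = (hits + false_alarms) * (hits + misses) / total  # chance hits
    return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

ets = equitable_threat_score(hits=50, false_alarms=20, misses=30,
                             correct_negatives=900)
print(round(ets, 3))
```

    ETS ranges from -1/3 to 1, with 0 meaning no skill beyond chance, which makes it convenient for the long model-to-model comparisons these pages track.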

  11. Detection, location, and characterization of hydroacoustic signals using seafloor cable networks offshore Japan (Invited)

    NASA Astrophysics Data System (ADS)

    Sugioka, H.; Suyehiro, K.; Shinohara, M.

    2009-12-01

    Hydroacoustic monitoring by the International Monitoring System (IMS) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification system utilizes hydrophone stations and seismic stations called T-phase stations for worldwide detection. Signals of natural origin include those from earthquakes, submarine volcanic eruptions, and whale calls; artificial sources include non-nuclear explosions and air-gun shots. It is important for the IMS to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify the source. As a number of seafloor cable networks are operated offshore the Japanese islands, basically facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure gauges, hydrophones, and seismic sensors) may be utilized to verify and increase the capability of the IMS. We use these data to compare selected event parameters with those reported for the Pacific in the period from 2004 to the present. These anomalous examples, along with dynamite shots used for seismic crustal-structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for detection, location, and characterization of anomalous signals. A seafloor cable network composed of three hydrophones and six seismometers, together with a temporary dense seismic array, detected and located a hydroacoustic event offshore the Japanese islands on 12 March 2008 that had been reported by the IMS. We detected not only the hydroacoustic waves reverberating between the sea surface and the sea bottom but also the associated seismic waves traveling through the crust. The determined source of the seismic waves is almost coincident with that of the hydroacoustic waves, suggesting that the seismic waves were converted very close to the origin of the hydroacoustic source. On 16 March 2009 we also detected signals very similar to those associated with the event of 12 March 2008.

  12. The Global Detection Capability of the IMS Seismic Network in 2013 Inferred from Ambient Seismic Noise Measurements

    NASA Astrophysics Data System (ADS)

    Gaebler, P. J.; Ceranna, L.

    2016-12-01

    All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone and anywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body-wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network, although a clear latitudinal dependence of the detection threshold can be observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary IMS network results in a slight improvement of the global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in the average global detection capability; in particular, it improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and the percentage of operational IMS stations.
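The threshold-monitoring idea described above can be sketched numerically: each station's detection threshold at a given grid point combines its noise-equivalent magnitude with a distance correction term, and the network threshold is the magnitude at which a required number of stations can detect the event. The functions, signal-to-noise margin, and numbers below are illustrative assumptions, not the values used in the study.

```python
# Hypothetical sketch of seismic threshold monitoring: for one source
# grid point, each station's detection threshold is the smallest body-
# wave magnitude whose predicted amplitude exceeds the station's ambient
# noise level.  Noise magnitudes and distance corrections are inputs.

def station_threshold(noise_mb, q_correction, snr_margin=0.5):
    """Smallest mb detectable at one station (all terms in magnitude units)."""
    return noise_mb + q_correction + snr_margin

def network_threshold(station_thresholds, n_required=3):
    """Network threshold at a grid point: the magnitude at which at least
    `n_required` stations detect the event (n-th smallest threshold)."""
    return sorted(station_thresholds)[n_required - 1]

# Toy example: five stations observing one grid point.
noise = [3.1, 2.8, 3.4, 2.9, 3.0]     # noise-equivalent mb per station
q = [0.4, 0.6, 0.2, 0.9, 0.5]         # distance correction per station
thresholds = [station_threshold(n_, q_) for n_, q_ in zip(noise, q)]
print(round(network_threshold(thresholds), 2))   # -> 4.0
```

Mapping this over a global grid, with per-station corrections depending on epicentral distance, yields the kind of threshold maps the abstract describes.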

  13. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal source-location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, runs in real time, and aims at uncovering source scenarios that are compatible with the measurements. Furthermore, it has to deal with source dilution ratios that are orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in environmental monitoring, emergency response and the verification of other international agreements and treaties.
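The source-receptor matrix idea can be illustrated with a toy inversion. Under the linearity assumption, the measured concentrations c relate to a gridded source field s through a sensitivity matrix M (c = Ms), and source scenarios compatible with the measurements can be sought by constrained least squares. All matrices and magnitudes below are synthetic placeholders, not output of a real transport model.

```python
import numpy as np
from scipy.optimize import nnls

# Toy source-receptor matrix (SRM) inversion: row i of M holds the
# adjoint-model sensitivity of sample i to a unit release in each
# source cell, so measurements satisfy c = M @ s.  Locating compatible
# sources is then a least-squares problem with non-negative releases.

rng = np.random.default_rng(0)

n_samples, n_cells = 6, 4
M = rng.uniform(0.0, 1e-3, size=(n_samples, n_cells))  # sensitivities

s_true = np.array([0.0, 5e6, 0.0, 0.0])   # release only in cell 1 [Bq]
c = M @ s_true                            # simulated measurements

s_est, residual = nnls(M, c)              # non-negative least squares
print(int(np.argmax(s_est)))              # -> 1 (the true source cell)
```

In the real system the "cells" span space and time, M is built from backward model runs for every sample, and many source scenarios may fit the data, which is why the abstract speaks of uncovering compatible scenarios rather than a unique source.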

  14. Illumination-tolerant face verification of low-bit-rate JPEG2000 wavelet images with advanced correlation filters for handheld devices

    NASA Astrophysics Data System (ADS)

    Wijaya, Surya Li; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-02-01

    Face recognition on mobile devices, such as personal digital assistants and cell phones, is a big challenge owing to the limited computational resources available to run verifications on the devices themselves. One approach is to transmit the captured face images by use of the cell-phone connection and to run the verification on a remote station. However, owing to limitations in communication bandwidth, it may be necessary to transmit a compressed version of the image. We propose using the image compression standard JPEG2000, which is a wavelet-based compression engine used to compress the face images to low bit rates suitable for transmission over low-bandwidth communication channels. At the receiver end, the face images are reconstructed with a JPEG2000 decoder and are fed into the verification engine. We explore how advanced correlation filters, such as the minimum average correlation energy filter [Appl. Opt. 26, 3633 (1987)] and its variants, perform by using face images captured under different illumination conditions and encoded with different bit rates under the JPEG2000 wavelet-encoding standard. We evaluate the performance of these filters by using illumination variations from the Carnegie Mellon University's Pose, Illumination, and Expression (PIE) face database. We also demonstrate the tolerance of these filters to noisy versions of images with illumination variations.
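The minimum average correlation energy (MACE) filter referenced above has a closed-form frequency-domain solution, h = D^-1 X (X^+ D^-1 X)^-1 c, where the columns of X are the FFTs of the training images (vectorized), D is the diagonal average power spectrum, and c holds the desired correlation-peak values. A minimal sketch with synthetic stand-in images (not PIE faces):

```python
import numpy as np

# Minimal MACE filter synthesis in the frequency domain.  The filter
# minimizes average correlation-plane energy subject to unit peaks on
# the training images: h = D^-1 X (X^+ D^-1 X)^-1 c.

def mace_filter(train_images):
    X = np.stack([np.fft.fft2(im).ravel() for im in train_images], axis=1)
    d = np.mean(np.abs(X) ** 2, axis=1) + 1e-12   # diag(D), regularized
    Dinv_X = X / d[:, None]
    c = np.ones(X.shape[1], dtype=complex)        # desired peak values
    coeffs = np.linalg.solve(X.conj().T @ Dinv_X, c)
    return Dinv_X @ coeffs                        # filter, frequency domain

rng = np.random.default_rng(1)
faces = [rng.standard_normal((8, 8)) for _ in range(3)]
h = mace_filter(faces)

# By construction the filter yields unit correlation peaks on its
# training images: X^+ h = c.
peaks = [np.vdot(np.fft.fft2(im).ravel(), h) for im in faces]
print(np.allclose(peaks, 1.0))                    # -> True
```

Verification then thresholds the peak sharpness of a probe image's correlation output; the variants cited in the abstract trade off this constraint against noise tolerance.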

  15. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification that assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" To ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts," which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement-verification best practices.

  16. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... analysis. Any establishment that does not have a HACCP plan because a hazard analysis has revealed no food.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  17. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... analysis. Any establishment that does not have a HACCP plan because a hazard analysis has revealed no food.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  18. Verification of 1921 peak discharge at Skagit River near Concrete, Washington, using 2003 peak-discharge data

    USGS Publications Warehouse

    Mastin, M.C.; Kresch, D.L.

    2005-01-01

    The 1921 peak discharge at Skagit River near Concrete, Washington (U.S. Geological Survey streamflow-gaging station 12194000), was verified using peak-discharge data from the flood of October 21, 2003, the largest flood since 1921. This peak discharge is critical to determining other high discharges at the gaging station and to reliably estimating the 100-year flood, the primary design flood being used in a current flood study of the Skagit River basin. The four largest annual peak discharges of record (1897, 1909, 1917, and 1921) were used to determine the 100-year flood discharge at Skagit River near Concrete. The peak discharge on December 13, 1921, was determined by James E. Stewart of the U.S. Geological Survey using a slope-area measurement and a contracted-opening measurement. An extended stage-discharge rating curve based on the 1921 peak discharge was used to determine the peak discharges of the three other large floods. Any inaccuracy in the 1921 peak discharge also would affect the accuracies of the three other largest peak discharges. The peak discharge of the 1921 flood was recalculated using the cross sections and high-water marks surveyed after the 1921 flood in conjunction with a new estimate of the channel roughness coefficient (n value) based on an n-verification analysis of the peak discharge of the October 21, 2003, flood. The n value used by Stewart for his slope-area measurement of the 1921 flood was 0.033, and the corresponding calculated peak discharge was 240,000 cubic feet per second (ft3/s). Determination of a single definitive water-surface profile for use in the n-verification analysis was precluded because of considerable variation in elevations of surveyed high-water marks from the flood on October 21, 2003. Therefore, n values were determined for two separate water-surface profiles thought to bracket a plausible range of water-surface slopes defined by high-water marks. 
The n value determined using the flattest plausible slope was 0.024 and the corresponding recalculated discharge of the 1921 slope-area measurement was 266,000 ft3/s. The n value determined using the steepest plausible slope was 0.032 and the corresponding recalculated discharge of the 1921 slope-area measurement was 215,000 ft3/s. The two recalculated discharges were 10.8 percent greater than (flattest slope) and 10.4 percent less than (steepest slope) the 1921 peak discharge of 240,000 ft3/s. The 1921 peak discharge was not revised because the average of the two recalculated discharges (240,500 ft3/s) is only 0.2 percent greater than the 1921 peak discharge.
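The sensitivity of a slope-area discharge estimate to the roughness coefficient follows directly from Manning's equation, in which Q scales as 1/n. A sketch in U.S. customary units, with hypothetical channel geometry rather than the surveyed 1921 cross sections (the recalculated discharges above also reflect the revised water-surface slopes, not the n change alone):

```python
# Manning's equation for a slope-area discharge estimate,
# Q = (1.49 / n) * A * R^(2/3) * S^(1/2), in U.S. customary units.
# The geometry values below are illustrative placeholders.

def manning_discharge(n, area_ft2, hyd_radius_ft, slope):
    """Discharge in ft^3/s from roughness n, area A, hydraulic radius R,
    and water-surface slope S."""
    return (1.49 / n) * area_ft2 * hyd_radius_ft ** (2.0 / 3.0) * slope ** 0.5

A, R, S = 12000.0, 20.0, 0.0005        # hypothetical channel geometry
q_033 = manning_discharge(0.033, A, R, S)
q_024 = manning_discharge(0.024, A, R, S)

# Q is proportional to 1/n, so for fixed geometry and slope the ratio of
# the two estimates is simply 0.024/0.033.
print(round(q_033 / q_024, 4))         # -> 0.7273
```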

  19. Status report on the establishment of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Campus, P.; Bell, M.; Hoffmann, T. L.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.

    2002-11-01

    The infrasound component of the International Monitoring System (IMS) for Comprehensive Nuclear-Test-Ban Treaty verification aims for global detection and localization of low-frequency sound waves originating from atmospheric nuclear explosions. The infrasound network will consist of 60 array stations, distributed as evenly as possible over the globe to assure at least two-station detection capability for 1-kton explosions at any point on earth. This network will be larger and more sensitive than any other previously operated infrasound network. As of today, 85% of the site surveys for IMS infrasound stations have been completed, 25% of the stations have been installed, and 8% of the installations have been certified and are transmitting high-quality continuous data to the International Data Center in Vienna. By the end of 2002, 20% of the infrasound network is expected to be certified and operating in post-certification mode. This presentation will discuss the current status and progress made in the site survey, installation, and certification programs for IMS infrasound stations. A review will be presented of the challenges and difficulties encountered in these programs, together with practical solutions to these problems.

  20. Meteor localization via statistical analysis of spatially temporal fluctuations in image sequences

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír.; Klimt, Martin; Šihlík, Jan; Fliegel, Karel

    2015-09-01

    Meteor detection is one of the most important procedures in astronomical imaging. A meteor's path through Earth's atmosphere is traditionally reconstructed from a double-station video observation system generating 2D image sequences. However, atmospheric turbulence and other factors cause spatio-temporal fluctuations of the image background, which make localization of the meteor path more difficult. Our approach is based on nonlinear preprocessing of the image intensity using the Box-Cox transform, with the logarithmic transform as its particular case. The transformed image sequences are then differentiated along the discrete coordinates to obtain a statistical description of the sky background fluctuations, which can be modeled by a multivariate normal distribution. After verification and hypothesis testing, we use the statistical model for outlier detection. While isolated outlier points are ignored, a compact cluster of outliers indicates the presence of a meteoroid after ignition.
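A hedged sketch of this pipeline, with a synthetic image sequence, an assumed Box-Cox lambda of zero (the logarithmic case), and a simple global-threshold outlier rule standing in for the full multivariate hypothesis-testing step:

```python
import numpy as np

# Sketch of the described pipeline: Box-Cox transform of pixel
# intensities, temporal differencing, then outlier detection against a
# normal background model of the differenced sequence.  Frame data and
# the threshold are synthetic stand-ins.

def box_cox(x, lam):
    """Box-Cox transform; the lam -> 0 limit is the log transform."""
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

rng = np.random.default_rng(2)
frames = rng.uniform(10.0, 20.0, size=(50, 16, 16))   # image sequence
frames[30, 8, 8] = 200.0                              # bright transient

t = box_cox(frames, lam=0.0)          # log-intensity
d = np.diff(t, axis=0)                # temporal differences

# Background statistics of the differenced sequence, then standardized
# residuals; large residuals flag candidate meteor pixels.
mu, sigma = d.mean(), d.std()
z = np.abs((d - mu) / sigma)

outliers = np.argwhere(z > 6.0)
hits = sorted({(int(r), int(c)) for _, r, c in outliers})
print(hits)                           # -> [(8, 8)]
```

A real detector would fit the full multivariate normal model, test it, and require a compact cluster of flagged pixels rather than accepting isolated ones.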

  1. Command system output bit verification

    NASA Technical Reports Server (NTRS)

    Odd, C. W.; Abbate, S. F.

    1981-01-01

    An automatic test was developed to test the ability of the deep space station (DSS) command subsystem and exciter to generate and radiate, from the exciter, the correct idle bit sequence for a given flight project or to store and radiate received command data elements and files without alteration. This test, called the command system output bit verification test, is an extension of the command system performance test (SPT) and can be selected as an SPT option. The test compares the bit stream radiated from the DSS exciter with reference sequences generated by the SPT software program. The command subsystem and exciter are verified when the bit stream and reference sequences are identical. It is a key element of the acceptance testing conducted on the command processor assembly (CPA) operational program (DMC-0584-OP-G) prior to its transfer from development to operations.
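The core of such a test is a bit-for-bit comparison of the radiated stream against an independently generated reference sequence. A minimal sketch; the idle-pattern generator below is an illustrative stand-in, not the actual DSS idle sequence or the SPT software:

```python
# Sketch of output-bit verification: the bit stream captured from the
# exciter is compared bit-for-bit with a reference sequence generated
# independently; verification passes only if they are identical.

def reference_idle_bits(pattern, n_bits):
    """Repeat a project-specific idle pattern out to n_bits."""
    return [pattern[i % len(pattern)] for i in range(n_bits)]

def verify_bits(radiated, reference):
    """Return (ok, index_of_first_mismatch_or_None)."""
    if len(radiated) != len(reference):
        return False, min(len(radiated), len(reference))
    for i, (a, b) in enumerate(zip(radiated, reference)):
        if a != b:
            return False, i
    return True, None

ref = reference_idle_bits([1, 0, 1, 1], 12)
print(verify_bits(ref.copy(), ref))    # -> (True, None)

bad = ref.copy()
bad[5] ^= 1                            # inject a single bit error
print(verify_bits(bad, ref))           # -> (False, 5)
```

Reporting the index of the first mismatch, rather than just pass/fail, is what makes such a test useful for diagnosing where the command chain corrupted the stream.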

  2. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2004-01-01

    The assembly and operation of the ISS has generated significant challenges that have ultimately impacted the resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies of alternative options for enhancing research. However, the approach, content, level of analysis, and resulting outputs of these studies vary owing to many factors, complicating the Program Manager's job of selecting the best option. The program therefore requested that a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consists of a systematic methodology and a decision-support toolset. The framework provides a quantifiable and repeatable means of ranking research-enhancing options for the complex, multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.

  3. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  4. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH VErification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of networks and devices with S&D Patterns, relevant complementary approaches exist and can be used.

  5. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  6. Temperature-based modeling of reference evapotranspiration using several artificial intelligence models: application of different modeling scenarios

    NASA Astrophysics Data System (ADS)

    Sanikhani, Hadi; Kisi, Ozgur; Maroufpoor, Eisa; Yaseen, Zaher Mundher

    2018-02-01

    The establishment of an accurate computational model for predicting the reference evapotranspiration (ET0) process is highly essential for several agricultural and hydrological applications, especially for rural water resource systems, water use allocation, utilization and demand assessments, and the management of irrigation systems. In this research, six artificial intelligence (AI) models were investigated for modeling ET0 using a small number of climatic inputs derived from the minimum and maximum air temperatures and the extraterrestrial radiation. The investigated models were multilayer perceptron (MLP), generalized regression neural networks (GRNN), radial basis neural networks (RBNN), integrated adaptive neuro-fuzzy inference systems with grid partitioning and subtractive clustering (ANFIS-GP and ANFIS-SC), and gene expression programming (GEP). The monthly time-scale data set was collected at the Antalya and Isparta stations, which are located in the Mediterranean Region of Turkey. The Hargreaves-Samani (HS) equation and its calibrated version (CHS) were used to perform a verification analysis of the established AI models. The validation accuracy was assessed with multiple quantitative metrics, including the root mean squared error (RMSE), mean absolute error (MAE), correlation coefficient (R2), coefficient of residual mass (CRM), and Nash-Sutcliffe efficiency coefficient (NS). The results of the conducted models were highly practical and reliable for the investigated case studies. At the Antalya station, the GEP and GRNN models performed better than the other investigated models, while the RBNN and ANFIS-SC models performed best at the Isparta station. Except for the MLP model, all the investigated models achieved better accuracy than the HS and CHS empirical models when applied in a cross-station scenario, in which the ET0 of one station is predicted using the input data of a nearby station. The performance of the CHS model in modeling ET0 was better in all cases than that of the original HS.
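The Hargreaves-Samani baseline used for comparison estimates ET0 from temperature extremes and extraterrestrial radiation only, ET0 = 0.0023 Ra (Tmean + 17.8) sqrt(Tmax - Tmin), with Ra in equivalent evaporation (mm/day); the calibrated CHS variant refits the leading coefficient to local data. A sketch with hypothetical monthly-mean inputs:

```python
import math

# Hargreaves-Samani reference evapotranspiration (mm/day).  The 0.0023
# coefficient is the standard value; a calibrated (CHS) run would pass a
# locally fitted coefficient instead.

def hargreaves_samani(tmax_c, tmin_c, ra_mm_day, coeff=0.0023):
    tmean = 0.5 * (tmax_c + tmin_c)
    return coeff * ra_mm_day * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c)

# Hypothetical Mediterranean-summer monthly means (not station data).
et0 = hargreaves_samani(tmax_c=34.0, tmin_c=22.0, ra_mm_day=16.5)
print(round(et0, 2))                   # -> 6.02
```

Because only Tmax, Tmin, and Ra appear, the AI models in the study compete against HS on exactly the same minimal inputs.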

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, UNIVERSITY OF TENNESSEE RESEARCH CORPORATION, SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  8. STS-55 German Payload Specialist Schlegel manipulates ROTEX controls in SL-D2

    NASA Image and Video Library

    1993-05-06

    STS055-106-100 (26 April-6 May 1993) --- Hans Schlegel, wearing special glasses, works at the Robotics Experiment (ROTEX) workstation in the science module aboard the Earth-orbiting Space Shuttle Columbia. Schlegel was one of two payload specialists representing the German Aerospace Research Establishment (DLR) on the 10-day Spacelab D-2 mission. ROTEX is a robotic arm that operates within an enclosed workcell in rack 6 of the Spacelab module and uses teleoperation from both an onboard station located nearby in rack 4 and from a station on the ground. The device uses teleprogramming and artificial intelligence to look at the design, verification and operation of advanced autonomous systems for use in future applications.

  9. KSC-01pp0244

    NASA Image and Video Library

    2001-02-03

    The lid is off the shipping container with the Multi-Purpose Logistics Module Donatello inside. It sits on a transporter inside the Space Station Processing Facility. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  10. KSC-01pp0245

    NASA Image and Video Library

    2001-02-03

    Workers in the Space Station Processing Facility attach an overhead crane to the Multi-Purpose Logistics Module Donatello to lift it out of the shipping container. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  11. KSC-01pp0246

    NASA Image and Video Library

    2001-02-03

    In the Space Station Processing Facility, workers help guide the overhead crane as it lifts the Multi-Purpose Logistics Module Donatello out of the shipping container. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  12. KSC-01pp0247

    NASA Image and Video Library

    2001-02-03

    In the Space Station Processing Facility, workers help guide the Multi-Purpose Logistics Module Donatello as it moves the length of the SSPF toward a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  13. KSC-01pp0248

    NASA Image and Video Library

    2001-02-03

    In the Space Station Processing Facility, workers wait for the Multi-Purpose Logistics Module Donatello, suspended by an overhead crane, to move onto a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  14. Wavelet Analysis of Turbulent Spots and Other Coherent Structures in Unsteady Transition

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques

    1998-01-01

    This is a secondary analysis of a portion of the Halstead data. The hot-film traces from an embedded stage of a low pressure turbine have been extensively analyzed by Halstead et al. In this project, wavelet analysis is used to develop the quantitative characterization of individual coherent structures in terms of size, amplitude, phase, convection speed, etc., as well as phase-averaged time scales. The purposes of the study are (1) to extract information about turbulent time scales for comparison with unsteady model results (e.g. k/epsilon). Phase-averaged maps of dominant time scales will be presented; and (2) to evaluate any differences between wake-induced and natural spots that might affect model performance. Preliminary results, subject to verification with data at higher frequency resolution, indicate that spot properties are independent of their phase relative to the wake footprints: therefore requirements for the physical content of models are kept relatively simple. Incidentally, we also observed that spot substructures can be traced over several stations; further study will examine their possible impact.

  15. Minimum accommodation for aerobrake assembly, phase 2

    NASA Technical Reports Server (NTRS)

    Katzberg, Stephen J.; Haynes, Davy A.; Tutterow, Robin D.; Watson, Judith J.; Russell, James W.

    1994-01-01

    A multi-element study was done to assess the practicality of a Space Station Freedom-based aerobrake system for the Space Exploration Initiative. The study was organized into six parts related to structure, aerodynamics, robotics and assembly, thermal protection system, inspection, and verification, all tied together by an integration study. The integration activity managed the broad issues related to meeting mission requirements. This report is a summary of the issues addressed by the integration team.

  16. Space station System Engineering and Integration (SE and I). Volume 2: Study results

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A summary of significant study results that are products of the Phase B conceptual design task are contained. Major elements are addressed. Study results applicable to each major element or area of design are summarized and included where appropriate. Areas addressed include: system engineering and integration; customer accommodations; test and program verification; product assurance; conceptual design; operations and planning; technical and management information system (TMIS); and advanced development.

  17. Evaluation of the 29-km Eta Model. Part I: Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Manobianco, John; Nutter, Paul

    1998-01-01

    A subjective evaluation of the National Centers for Environmental Prediction 29-km (meso-) eta model during the 1996 warm (May-August) and cool (October-January) seasons is described. The overall evaluation assessed the utility of the model for operational weather forecasting by the U.S. Air Force 45th Weather Squadron, National Weather Service (NWS) Spaceflight Meteorology Group (SMG) and NWS Office in Melbourne, FL.

  18. Development of Standard Station Interface for Comprehensive Nuclear-Test-Ban Treaty Organisation Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Dricker, I. G.; Friberg, P.; Hellman, S.

    2001-12-01

    Under contract with the CTBTO, Instrumental Software Technologies, Inc. (ISTI) has designed and developed a Standard Station Interface (SSI): a set of executable programs and application programming interface libraries for the acquisition, authentication, archiving and telemetry of seismic and infrasound data for stations of the CTBTO nuclear monitoring network. SSI (written in C) is fully supported under both the Solaris and Linux operating systems and will be shipped with fully documented source code. SSI consists of several interconnected modules. The Digitizer Interface Module maintains a near-real-time data flow between multiple digitizers and the SSI. The Disk Buffer Module is responsible for local data archival. The Station Key Management Module is a low-level tool for data authentication and verification of incoming signatures. The Data Transmission Module supports packetized near-real-time data transmission from the primary CTBTO stations to the designated Data Center. The AutoDRM module allows transport of signed seismic and infrasound data via electronic mail (auxiliary station mode). The Command Interface Module is used to pass remote commands to the digitizers and other modules of SSI. A station operator has access to state-of-health information and waveforms via the Operator Interface Module. The modular design of SSI will allow painless extension of the software system within and outside the boundaries of the CTBTO station requirements. An alpha version of SSI is currently undergoing extensive testing in the lab and on site.

  19. A Deep Space Orbit Determination Software: Overview and Event Prediction Capability

    NASA Astrophysics Data System (ADS)

    Kim, Youngkwang; Park, Sang-Young; Lee, Eunji; Kim, Minsik

    2017-06-01

    This paper presents an overview of deep space orbit determination software (DSODS), as well as validation and verification results on its event prediction capabilities. DSODS was developed in the MATLAB object-oriented programming environment to support the Korea Pathfinder Lunar Orbiter (KPLO) mission. DSODS has three major capabilities: celestial event prediction for spacecraft, orbit determination with deep space network (DSN) tracking data, and DSN tracking data simulation. To achieve its functionality requirements, DSODS consists of four modules: orbit propagation (OP), event prediction (EP), data simulation (DS), and orbit determination (OD) modules. This paper explains the highest-level data flows between modules in the event prediction, orbit determination, and tracking data simulation processes. Furthermore, to address the event prediction capability of DSODS, this paper introduces the OP and EP modules. The role of the OP module is to handle time and coordinate system conversions, to propagate spacecraft trajectories, and to handle the ephemerides of spacecraft and celestial bodies. Currently, the OP module utilizes the General Mission Analysis Tool (GMAT) as a third-party software component for high-fidelity deep space propagation, as well as time and coordinate system conversions. The role of the EP module is to predict celestial events, including eclipses and ground station visibilities, and this paper presents the functionality requirements of the EP module. The validation and verification results show that, for most cases, event prediction errors were less than 10 ms when compared with flight-proven mission analysis tools such as GMAT and Systems Tool Kit (STK). Thus, we conclude that DSODS is capable of predicting events for the KPLO in real mission applications.
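    Ground-station visibility of the kind the EP module predicts reduces to a geometric test: the spacecraft is visible when its elevation above the station's local horizon exceeds a mask angle. The following is a minimal illustrative sketch (not DSODS code), assuming a spherical Earth, ECEF position vectors in kilometers, and a hypothetical 5-degree elevation mask:

```python
import math

def elevation_deg(station_ecef, sc_ecef):
    """Elevation of the spacecraft above the station's local horizon,
    for a spherical Earth. Both positions are ECEF vectors in km."""
    rng = [s - g for s, g in zip(sc_ecef, station_ecef)]   # station -> spacecraft
    dot = sum(r * g for r, g in zip(rng, station_ecef))
    nr = math.sqrt(sum(r * r for r in rng))
    ng = math.sqrt(sum(g * g for g in station_ecef))
    # Angle from the local zenith, converted to elevation above the horizon
    return 90.0 - math.degrees(math.acos(dot / (nr * ng)))

def visible(station_ecef, sc_ecef, mask_deg=5.0):
    """True when the spacecraft is above the station's elevation mask."""
    return elevation_deg(station_ecef, sc_ecef) > mask_deg
```

    Flight-quality tools such as GMAT and STK additionally model the oblate Earth, station terrain masks, and atmospheric refraction, which shift low-elevation rise and set times.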

  20. Alternative Nonvolatile Residue Analysis with Contaminant Identification Project

    NASA Technical Reports Server (NTRS)

    Loftin, Kathleen (Compiler); Summerfield, Burton (Compiler); Thompson, Karen (Compiler); Mullenix, Pamela (Compiler); Zeitlin, Nancy (Compiler)

    2015-01-01

    Cleanliness verification is required in numerous industries, including spaceflight ground support, electronics, medical and aerospace. Current cleanliness verification requirements at KSC rely on solvents that are environmentally unfriendly. The goal of this project is to produce an alternative cleanliness verification technique that is both environmentally friendly and more cost effective.

  1. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  2. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. Software verification tools can be applied to enhance the capabilities, effectiveness, and ease of use of the test environment. The tool set includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus on expanding the set of software test tools and on a cost-effectiveness assessment.

  3. Joint NASA/EPA AVIRIS Analysis in the Chesapeake Bay Region: Plans and Initial Results

    NASA Technical Reports Server (NTRS)

    Johnson, Lee; Stokely, Peter; Lobitz, Brad; Shelton, Gary

    1998-01-01

    NASA's Ames Research Center is performing an AVIRIS demonstration project in conjunction with the U.S. Environmental Protection Agency (Region 3). NASA and EPA scientists have jointly defined a Study Area in eastern Virginia to include portions of the Chesapeake Bay, southern Delmarva Peninsula, and the mouths of the York and James Rivers. Several environmental issues have been identified for study. These include, by priority: 1) water constituent analysis in the Chesapeake Bay, 2) mapping of submerged aquatic vegetation in the Bay, 3) detection of vegetation stress related to Superfund sites at the Yorktown Naval Weapons Station, and 4) wetland species analysis in the York River vicinity. In support of this project, three lines of AVIRIS data were collected during the Wallops Island deployment on 17 August 1997. The remote sensing payload included AVIRIS, the MODIS Airborne Simulator and an RC-10 color infrared film camera. The AVIRIS data were delivered to Ames from the JPL AVIRIS Data Facility on 29 September 1997. Quicklook images indicate nominal data acquisition, and at the current time an atmospheric correction is being applied. Water constituent analysis of the Bay is our highest priority based on EPA interest and available collateral data, both from the surface and from other remote sensing instruments. Constituents of interest include suspended sediments, chlorophyll-a and accessory pigments. Analysis steps will include: verification of data quality, location of study sites in imagery, incorporation of relevant field data from EPA and other Chesapeake Bay cooperators, processing of imagery to show phenomena of interest, and verification of results with cooperators. By 1st quarter CY98 we plan to circulate initial results to NASA and EPA management for review. In the longer term we will finalize documentation, prepare results for publication, and complete any needed technology transfer to EPA remote sensing personnel.

  4. Assessment of wind energy potential in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, Katarzyna; Linkowska, Joanna; Mazur, Andrzej

    2014-05-01

    The aim of the presentation is to show the suitability of using numerical model wind speed forecasts for wind power industry applications in Poland. In accordance with the guidelines of the European Union, the consumption of wind energy in Poland is rapidly increasing. According to the report of the Energy Regulatory Office from 30 March 2013, the installed capacity of wind power in Poland was 2807 MW from 765 wind power stations. Wind energy is strongly dependent on meteorological conditions. Based on climatological wind speed data, potential energy zones within the area of Poland have been developed (H. Lorenc). They are the first criterion for assessing the location of a wind farm. However, for exact monitoring of a given wind farm location, prognostic data from numerical model forecasts are necessary. For practical interpretation and further post-processing, verification of the model data is very important. The Polish Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI) runs the operational model COSMO (Consortium for Small-scale Modelling, version 4.8) using two nested domains at horizontal resolutions of 7 km and 2.8 km. The model produces 36-hour and 78-hour forecasts from 00 UTC, for the 2.8 km and 7 km domain resolutions respectively. Numerical forecasts were compared with observations from 60 SYNOP and 3 TEMP stations in Poland, using VERSUS2 (Unified System Verification Survey 2) and R. For every zone a set of statistical indices (ME, MAE, RMSE) was calculated. Forecast errors for aerological profiles are shown for the Polish TEMP stations at Wrocław, Legionowo and Łeba. The current studies are connected with COST Action ES1002 WIRE (Weather Intelligence for Renewable Energies).
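    The statistical indices mentioned (ME, MAE, RMSE) are standard point-verification scores computed over paired forecast/observation values. A minimal sketch of the computation (illustrative only, not the VERSUS2 or R code used in the study):

```python
import math

def verification_indices(forecasts, observations):
    """ME, MAE and RMSE between paired forecast and observed wind speeds."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    n = len(errors)
    me = sum(errors) / n                              # mean error (bias)
    mae = sum(abs(e) for e in errors) / n             # mean absolute error
    rmse = math.sqrt(sum(e * e for e in errors) / n)  # root-mean-square error
    return me, mae, rmse
```

    ME isolates systematic bias (it can be zero even for poor forecasts, since positive and negative errors cancel), while MAE and RMSE measure overall error magnitude, with RMSE penalizing large misses more heavily.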

  5. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  6. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis is reported that was performed during the shuttle payload interface verification equipment study. It describes: (1) the background and intent of the study; (2) study approach and philosophy covering all facets of shuttle payload/cargo integration; (3)shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  7. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Verification and validation. 120.11 Section 120.11...

  8. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification and validation. 120.11 Section 120.11...

  9. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification and validation. 120.11 Section 120.11...

  10. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11...

  11. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Verification and validation. 120.11 Section 120.11...

  12. Assimilation of radar quantitative precipitation estimations in the Canadian Precipitation Analysis (CaPA)

    NASA Astrophysics Data System (ADS)

    Fortin, Vincent; Roy, Guy; Donaldson, Norman; Mahidjiba, Ahmed

    2015-12-01

    The Canadian Precipitation Analysis (CaPA) is a data analysis system used operationally at the Canadian Meteorological Center (CMC) since April 2011 to produce gridded 6-h and 24-h precipitation accumulations in near real-time on a regular grid covering all of North America. The current resolution of the product is 10 km. Due to the low density of the observational network in most of Canada, the system relies on a background field provided by the Regional Deterministic Prediction System (RDPS) of Environment Canada, which is a short-term weather forecasting system for North America. For this reason, the North American configuration of CaPA is known as the Regional Deterministic Precipitation Analysis (RDPA). Early in the development of the CaPA system, weather radar reflectivity was identified as a very promising additional data source for the precipitation analysis, but necessary quality control procedures and bias-correction algorithms were lacking for the radar data. After three years of development and testing, a new version of the CaPA-RDPA system was implemented in November 2014 at CMC. This version is able to assimilate radar quantitative precipitation estimates (QPEs) from all 31 operational Canadian weather radars. The radar QPE is used as an observation source and not as a background field, and is subject to a strict quality control procedure, like any other observation source. The November 2014 upgrade to CaPA-RDPA was implemented at the same time as an upgrade to the RDPS system, which brought minor changes to the skill and bias of CaPA-RDPA. This paper uses the frequency bias indicator (FBI), the equitable threat score (ETS) and the departure from the partial mean (DPM) in order to assess the improvements to CaPA-RDPA brought by the assimilation of radar QPE.
Verification focuses on the 6-h accumulations, and is done against a network of 65 synoptic stations (approximately two stations per radar) that were withheld from the station data assimilated by CaPA-RDPA. It is shown that the ETS and the DPM scores are both improved for precipitation events between 0.2 mm and 25 mm per 6-h, and that the FBI is unchanged.
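    The FBI and ETS scores used in this kind of verification are computed from a 2x2 contingency table of forecast versus observed precipitation exceedances of a threshold. A minimal sketch (illustrative, not the CaPA verification code):

```python
def fbi_ets(hits, false_alarms, misses, correct_negatives):
    """Frequency bias indicator (FBI) and equitable threat score (ETS)
    from a 2x2 contingency table of events above a precipitation threshold.

    FBI > 1 means the event is forecast more often than observed.
    ETS discounts hits expected by chance; 1 is perfect, <= 0 is no skill."""
    n = hits + false_alarms + misses + correct_negatives
    fbi = (hits + false_alarms) / (hits + misses)
    hits_random = (hits + false_alarms) * (hits + misses) / n  # chance hits
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return fbi, ets
```

    For example, 50 hits, 20 false alarms, 10 misses and 920 correct negatives give an FBI of about 1.17 (slight over-forecasting) and an ETS of about 0.60.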

  13. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    PubMed

    To Duc, Khanh

    2017-11-18

    Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to a distorted conclusion on diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A Shiny web application for verification bias correction in ROC surface estimation has also been developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/.
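    For a continuous marker and three ordered classes, the volume under the ROC surface (VUS) has a simple empirical estimator: the proportion of correctly ordered triples, one marker value drawn from each class. A minimal sketch of that estimator under full verification (not the bcROCsurface implementation, which additionally corrects for missing disease status):

```python
from itertools import product

def empirical_vus(non_diseased, intermediate, diseased):
    """Empirical volume under the ROC surface: the fraction of triples
    (one marker value per class) that are correctly ordered x < y < z.
    A useless marker gives ~1/6; a perfect one gives 1.0."""
    concordant = sum(
        1 for x, y, z in product(non_diseased, intermediate, diseased)
        if x < y < z
    )
    total = len(non_diseased) * len(intermediate) * len(diseased)
    return concordant / total
```

    This brute-force version is O(n^3) in the class sizes and is meant only to show the definition; practical implementations use rank-based formulas.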

  14. EMC: Air Quality Forecast Home page

    Science.gov Websites

    Operational verification products for the NAM and the National Air Quality Forecast Capability (NAQFC): NAM meteorology error time series, EMC NAM spatial maps, Real Time Mesoscale Analysis, precipitation verification; CMAQ ozone & PM error time series, AOD error time series, HYSPLIT smoke forecasts vs. GASP satellite, and dust and smoke error time series.

  15. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
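    For a single study, the classical correction for partial verification under the assumption that verification depends only on the index test result is the Begg and Greenes adjustment, which recovers sensitivity and specificity from the disease rates observed in the verified subset and the test-result frequencies in all subjects; the Bayesian meta-analytic method described above generalizes this idea across studies. A minimal sketch of the single-study version:

```python
def begg_greenes(n_pos, n_neg, v_pos, d_pos, v_neg, d_neg):
    """Verification-bias-corrected sensitivity and specificity
    (Begg & Greenes, 1983), assuming verification depends only on the
    index test result.

    n_pos, n_neg: all subjects testing positive / negative;
    v_pos, v_neg: how many of each group were verified;
    d_pos, d_neg: verified subjects found diseased in each group."""
    p_d_pos = d_pos / v_pos  # P(D+ | T+), estimated from the verified subset
    p_d_neg = d_neg / v_neg  # P(D+ | T-)
    se = p_d_pos * n_pos / (p_d_pos * n_pos + p_d_neg * n_neg)
    sp = (1 - p_d_neg) * n_neg / ((1 - p_d_pos) * n_pos + (1 - p_d_neg) * n_neg)
    return se, sp
```

    When every subject is verified, the formulas reduce to the naive estimates; with selective verification of test-positives, the naive sensitivity is inflated and this correction pulls it back down.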

  16. International Space Station (ISS) External Thermal Control System (ETCS) Loop A Pump Module (PM) Jettison Options Assessment

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Dwyer Cianciolo, Alicia; Shidner, Jeremy D.; Powell, Richard W.

    2014-01-01

    On December 11, 2013, the International Space Station (ISS) experienced a failure of the External Thermal Control System (ETCS) Loop A Pump Module (PM). To minimize the number of extravehicular activities (EVA) required to replace the PM, jettisoning the faulty pump was evaluated. The objective of this study was to independently evaluate the jettison options considered by the ISS Trajectory Operations Officer (TOPO) and to provide recommendations for safe jettison of the ETCS Loop A PM. The simulation selected to evaluate the TOPO options was the NASA Engineering and Safety Center's (NESC) version of Program to Optimize Simulated Trajectories II (POST2) developed to support another NESC assessment. The objective of the jettison analysis was twofold: (1) to independently verify TOPO posigrade and retrograde jettison results, and (2) to determine jettison guidelines based on additional sensitivity, trade study, and Monte Carlo (MC) analysis that would prevent PM recontact. Recontact in this study designates a propagated PM trajectory that comes within 500 m of the ISS propagated trajectory. An additional simulation using Systems Tool Kit (STK) was run for independent verification of the POST2 simulation results. Ultimately, the ISS Program removed the PM jettison option from consideration. However, prior to the Program decision, the retrograde jettison option remained part of the EVA contingency plan. The jettison analysis showed that, in addition to separation velocity/direction and atmospheric conditions, the key variable determining the time to recontact the ISS is the ballistic number (BN) difference between the jettisoned object and the ISS.
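    The sensitivity to ballistic number comes from differential drag: two objects on the same orbit with different BN = m / (Cd * A) decelerate at different rates, and their along-track separation grows from that acceleration difference. A rough order-of-magnitude sketch with illustrative constants (not the POST2 setup):

```python
def drag_accel(rho, v, bn):
    """Drag deceleration a = 0.5 * rho * v^2 / BN, with BN = m / (Cd * A)
    in kg/m^2, rho in kg/m^3, v in m/s."""
    return 0.5 * rho * v * v / bn

def separation_after(t, rho, v, bn_a, bn_b):
    """Crude along-track separation (m) after t seconds from a constant
    differential drag acceleration: s = 0.5 * |a_A - a_B| * t^2."""
    da = abs(drag_accel(rho, v, bn_a) - drag_accel(rho, v, bn_b))
    return 0.5 * da * t * t
```

    This kinematic estimate ignores the coupling of drag into orbital motion (the lower-BN object also descends and speeds up along-track), which is why the actual recontact analysis propagates both full trajectories with POST2 and STK rather than using a closed-form drift rate.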

  17. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  18. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  19. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  20. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  1. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  2. Spacecraft servicing demonstration plan

    NASA Technical Reports Server (NTRS)

    Bergonz, F. H.; Bulboaca, M. A.; Derocher, W. L., Jr.

    1984-01-01

    A preliminary spacecraft servicing demonstration plan is prepared which leads to a fully verified operational on-orbit servicing system based on the module exchange, refueling, and resupply technologies. The resulting system can be applied at the space station, in low Earth orbit with an orbital maneuvering vehicle (OMV), or be carried with an OMV to geosynchronous orbit by an orbital transfer vehicle. The three-phase plan includes ground demonstrations, cargo bay demonstrations, and free flight verifications. The plan emphasizes the exchange of multimission modular spacecraft (MMS) modules, which involves space-repairable satellites. Three servicer mechanism configurations are the engineering test unit, a protoflight quality unit, and two fully operational units that have been qualified and documented for use in free flight verification activity. The plan balances costs and risks by overlapping study phases, utilizing existing equipment for ground demonstrations, maximizing use of existing MMS equipment, and renting a spacecraft bus.

  3. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1992-01-01

    ALEPS, which is being developed to provide the SSF program with a computer system to automate logistics resupply/return cargo load planning and verification, is presented. ALEPS will make it possible to simultaneously optimize both the resupply flight load plan and the return flight reload plan for any of the logistics carriers. In the verification mode ALEPS will support the carrier's flight readiness reviews and control proper execution of the approved plans. It will also support the SSF inventory management system by providing electronic block updates to the inventory database on the cargo arriving at or departing the station aboard a logistics carrier. A prototype drawer packing algorithm is described which is capable of generating solutions for 3D packing of cargo items into a logistics carrier storage accommodation. It is concluded that ALEPS will provide the capability to generate and modify optimized loading plans for the logistics elements fleet.
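    The abstract does not detail the prototype drawer packing algorithm. As a rough illustration of the greedy family such load planners often start from, here is a hypothetical first-fit-decreasing assignment by volume alone; a real ALEPS-style planner must also respect 3D item geometry, mass limits, and center-of-gravity constraints:

```python
def first_fit_decreasing(item_volumes, drawer_capacity):
    """Greedy first-fit-decreasing packing of cargo items into drawers,
    by volume only. Returns (item index -> drawer index, drawers used)."""
    drawers = []  # remaining capacity per opened drawer
    plan = {}
    # Place largest items first; each goes in the first drawer that fits.
    for idx, vol in sorted(enumerate(item_volumes), key=lambda p: -p[1]):
        for d, remaining in enumerate(drawers):
            if vol <= remaining:
                drawers[d] -= vol
                plan[idx] = d
                break
        else:
            drawers.append(drawer_capacity - vol)  # open a new drawer
            plan[idx] = len(drawers) - 1
    return plan, len(drawers)
```

    First-fit-decreasing is a standard bin-packing heuristic: it is not optimal in general, but it never uses more than roughly 11/9 of the optimal number of bins.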

  4. Antenna testing for the Inmarsat 2 ground control system

    NASA Astrophysics Data System (ADS)

    Ashton, C.

    1992-02-01

    This article describes how the antennas of the Inmarsat 2 TT&C and IOT ground stations were tested and calibrated. It explains the main test methods used, giving the theory behind the tests and indicates some of the practical difficulties encountered during testing. Techniques described include the use of radio stars, boresight antennas and satellite verification testing using Intelsat and Inmarsat satellites. Parameters tested include gain, G/T (figure of merit), sidelobe patterns, cross polar discrimination and isolation.
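    The radio-star method mentioned here rests on a Y-factor measurement: with the antenna pointed at a star of known flux density S versus cold sky, the noise power ratio Y gives G/T = 8*pi*k*(Y-1) / (S*lambda^2). A minimal sketch of the conversion (illustrative only; it omits the atmospheric, beam-broadening and polarization correction factors applied in real IOT campaigns):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K
C = 299792458.0           # m/s

def g_over_t_db(y_factor, flux_jansky, freq_hz):
    """Earth-station figure of merit G/T in dB/K from a radio-star
    Y-factor measurement: G/T = 8*pi*k*(Y-1) / (S * lambda^2)."""
    wavelength = C / freq_hz
    flux = flux_jansky * 1e-26  # Jy -> W m^-2 Hz^-1
    g_t = 8 * math.pi * BOLTZMANN * (y_factor - 1) / (flux * wavelength ** 2)
    return 10 * math.log10(g_t)
```

    For example, a Y-factor of 2 (3 dB) on a 1000 Jy source at 4 GHz corresponds to a raw G/T near 37.9 dB/K before corrections; the radio-star technique is attractive because it yields G/T directly, without separate gain and noise-temperature measurements.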

  5. Performance Testing of a Trace Contaminant Control Subassembly for the International Space Station

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Curtis, R. E.; Alexandre, K. L.; Ruggiero, L. L.; Shtessel, N.

    1998-01-01

    As part of the International Space Station (ISS) Trace Contaminant Control Subassembly (TCCS) development, a performance test has been conducted to provide reference data for flight verification analyses. This test, which used the U.S. Habitation Module (U.S. Hab) TCCS as the test article, was designed to add to the existing database on TCCS performance. Included in this database are results obtained during ISS development testing; testing of functionally similar TCCS prototype units; and bench scale testing of activated charcoal, oxidation catalyst, and granular lithium hydroxide (LiOH). The present database has served as the basis for the development and validation of a computerized TCCS process simulation model. This model serves as the primary means for verifying the ISS TCCS performance. In order to mitigate risk associated with this verification approach, the U.S. Hab TCCS performance test provides an additional set of data which serve to anchor both the process model and previously-obtained development test data to flight hardware performance. The following discussion provides relevant background followed by a summary of the test hardware, objectives, requirements, and facilities. Facility and test article performance during the test is summarized, test results are presented, and the TCCS's performance relative to past test experience is discussed. Performance predictions made with the TCCS process model are compared with the U.S. Hab TCCS test results to demonstrate its validation.

  6. Design and Verification of Space Station EVA-Operated Truss Attachment System

    NASA Technical Reports Server (NTRS)

    Katell, Gabriel

    2001-01-01

    This paper describes the design and verification of a system used to attach two segments of the International Space Station (ISS). This system was first used in space to mate the P6 and Z1 trusses together in December 2000, through a combination of robotic and extravehicular tasks. Features that provided capture, coarse alignment, and fine alignment during the berthing process are described. Attachment of this high value hardware was critical to the ISS's sequential assembly, necessitating the inclusion of backup design and operational features. Astronauts checked for the proper performance of the alignment and bolting features during on-orbit operations. During berthing, the system accommodates truss-to-truss relative displacements that are caused by manufacturing tolerances and on-orbit thermal gradients. After bolt installation, the truss interface becomes statically determinate with respect to in-plane shear loads and isolates attach bolts from bending moments. The approach used to estimate relative displacements and the means of accommodating them is explained. Confidence in system performance was achieved through a cost-effective collection of tests and analyses, including thermal, structural, vibration, misalignment, contact dynamics, underwater simulation, and full-scale functional testing. Design considerations that have potential application to other mechanisms include accommodating variations of friction coefficients in the on-orbit joints, wrench torque tolerances, joint preload, moving element clearances at temperature extremes, and bolt-nut torque reaction.

  7. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  8. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  9. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  10. Development of a prototype two-phase thermal bus system for Space Station

    NASA Technical Reports Server (NTRS)

    Myron, D. L.; Parish, R. C.

    1987-01-01

    This paper describes the basic elements of a pumped two-phase ammonia thermal control system designed for microgravity environments, the development of the concept into a Space Station flight design, and design details of the prototype to be ground-tested in the Johnson Space Center (JSC) Thermal Test Bed. The basic system concept is one of forced-flow heat transport through interface heat exchangers with anhydrous ammonia being pumped by a device expressly designed for two-phase fluid management in reduced gravity. Control of saturation conditions, and thus system interface temperatures, is accomplished with a single central pressure regulating valve. Flow control and liquid inventory are controlled by passive, nonelectromechanical devices. Use of these simple control elements results in minimal computer controls and high system reliability. Building on the basic system concept, a brief overview of a potential Space Station flight design is given. Primary verification of the system concept will involve testing at JSC of a 25-kW ground test article currently in fabrication.

  11. SCORPI and SCORPI-T: Neurophysiological experiments on animals in space

    NASA Astrophysics Data System (ADS)

    Serafini, L.; Ramacciotti, T.; Vigano, W.; Donati, A.; Porciani, M.; Zolesi, V.; Schulze-Varnholt, D.; Manieri, P.; El-Din Sallam, A.; Schmah, M.; Horn, E. R.

    2005-08-01

    The study of physiological adaptation to long-term space flights, with special consideration of the internal clock systems of scorpions, is the goal of the SCORPI and SCORPI-T experiments. SCORPI was selected for flight on the International Space Station (ISS) and will be mounted in the European facility BIOLAB, the ESA laboratory designed to support biological experiments on micro-organisms, cells, tissue cultures, small plants and small invertebrates. The SCORPI-T experiment, performed on the Russian FOTON-M2 satellite in May-June 2005, represents an important precursor for the success of the SCORPI experiment on BIOLAB. This paper outlines the main features of the hardware designed and developed to allow the analysis of critical aspects of experiment execution and the verification of experiment objectives. The capabilities of the hardware developed for SCORPI and SCORPI-T show its potential for use in future experiments of a similar type in space.

  12. Feasibility analysis of marine ecological on-line integrated monitoring system

    NASA Astrophysics Data System (ADS)

    Chu, D. Z.; Cao, X.; Zhang, S. W.; Wu, N.; Ma, R.; Zhang, L.; Cao, L.

    2017-08-01

    In-situ water quality sensors are susceptible to biofouling, seawater corrosion, and wave impact damage, and the scattered distribution of many individual sensors makes maintenance inconvenient. This paper proposes a highly integrated marine ecological on-line monitoring system that can be deployed inside a monitoring station. Sensors are grouped systematically, with similar sensors connected in series and the groups run in parallel. The system composition and workflow are described, along with key issues in the system design and their corresponding solutions. Using water quality multi-parameters and five nutrient salts as verification indices, a comparison experiment between in-situ and system data was carried out. The results showed good data consistency for the nutrient salts, pH, and salinity. Temperature and dissolved oxygen followed consistent trends but deviated in magnitude, while turbidity fluctuated greatly and chlorophyll followed a similar pattern. Based on these observations, three directions for system optimization are proposed.
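
    The in-situ vs. system comparison described above boils down to checking trend consistency and systematic deviation between paired sensor readings. A minimal sketch of that kind of check, using made-up numbers rather than the paper's data:

```python
import numpy as np

# Illustrative consistency check of the kind described above: compare paired
# in-situ and station readings with Pearson correlation and mean deviation.
# The readings below are invented for illustration; they are not the paper's data.
in_situ = np.array([8.1, 8.0, 8.2, 8.1, 8.3, 8.2])  # e.g. pH from an in-situ sensor
station = np.array([8.0, 8.0, 8.2, 8.2, 8.3, 8.1])  # same parameter via the station

r = np.corrcoef(in_situ, station)[0, 1]  # trend consistency
bias = np.mean(station - in_situ)        # systematic deviation
print(f"Pearson r = {r:.3f}, mean bias = {bias:+.3f}")
```

    A high correlation with a small bias corresponds to the "trend consistent, data deviated" pattern the abstract reports for temperature and dissolved oxygen.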

  13. Proven, long-life hydrogen/oxygen thrust chambers for space station propulsion

    NASA Technical Reports Server (NTRS)

    Richter, G. P.; Price, H. G.

    1986-01-01

    The development of the manned space station has necessitated the development of technology related to an onboard auxiliary propulsion system (APS) required to provide for various space station attitude control, orbit positioning, and docking maneuvers. A key component of this onboard APS is the thrust chamber design. To develop the required thrust chamber technology to support the Space Station Program, the NASA Lewis Research Center has sponsored development programs under contracts with Aerojet TechSystems Company and with Bell Aerospace Textron Division of Textron, Inc. During the NASA Lewis sponsored program with Aerojet TechSystems, a 25-lbf hydrogen/oxygen thruster has been developed and proven as a viable candidate to meet the needs of the Space Station Program. Likewise, during the development program with Bell Aerospace, a 50-lbf hydrogen/oxygen thrust chamber has been developed and has demonstrated reliable, long-life expectancy at anticipated space station operating conditions. Both these thrust chambers were based on design criteria developed in previous thruster programs and successfully verified in experimental test programs. Extensive thermal analyses and models were used to design the thrusters to achieve total impulse goals of 2 × 10^6 lbf-sec. Test data for each thruster will be compared to the analytical predictions for the performance and heat transfer characteristics. Also, the results of thrust chamber life verification tests will be presented.
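
    The total-impulse life goal quoted above directly implies a cumulative firing-time requirement for each chamber, since total impulse is thrust integrated over time. A back-of-envelope sketch (the thrust levels and impulse goal come from the abstract; the rest is arithmetic):

```python
# Cumulative firing time implied by a total-impulse goal: I_total = F * t.
def required_firing_time_s(total_impulse_lbf_s: float, thrust_lbf: float) -> float:
    """Return the cumulative firing time (s) needed to reach the impulse goal."""
    return total_impulse_lbf_s / thrust_lbf

TOTAL_IMPULSE = 2e6  # lbf-sec, the life goal cited in the abstract

for thrust in (25.0, 50.0):  # the Aerojet and Bell chamber thrust levels
    t = required_firing_time_s(TOTAL_IMPULSE, thrust)
    print(f"{thrust:.0f}-lbf thruster: {t:,.0f} s (~{t / 3600:.0f} h) cumulative firing")
```

    This is why the abstract emphasizes long-life verification: the smaller chamber must accumulate on the order of tens of hours of firing to meet the goal.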

  14. INDEPENDENT VERIFICATION SURVEY REPORT FOR ZONE 1 OF THE EAST TENNESSEE TECHNOLOGY PARK IN OAK RIDGE, TENNESSEE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, David A.

    2012-08-16

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. IV activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  15. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    The current philosophy of the GSFC regarding environmental verification of Shuttle payloads is reviewed. In the structures area, increased emphasis will be placed on the use of analysis for design verification, with selective testing performed as necessary. Furthermore, as a result of recent cost optimization analysis, the multitier test program will presumably give way to a comprehensive test program at the major payload subassembly level after adequate workmanship at the component level has been verified. In the thermal vacuum area, thought is being given to modifying the approaches used for conventional spacecraft.

  16. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  17. Defining the ecological hydrology of Taiwan Rivers using multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Wu, Tzu-Ching; Tsai, Wen-Ping; Herricks, Edwin E.

    2009-09-01

    The identification and verification of ecohydrologic flow indicators has found new support as the importance of ecological flow regimes is recognized in modern water resources management, particularly in river restoration and reservoir management. An ecohydrologic indicator system reflecting the unique characteristics of Taiwan's water resources and hydrology has been developed, the Taiwan ecohydrological indicator system (TEIS). A major challenge for the water resources community is using the TEIS to provide environmental flow rules that improve existing water resources management. This paper examines data from the extensive network of flow monitoring stations in Taiwan using TEIS statistics to define and refine environmental flow options in Taiwan. Multivariate statistical methods were used to examine TEIS statistics for 102 stations representing the geographic and land use diversity of Taiwan. The Pearson correlation coefficient showed high multicollinearity between the TEIS statistics. Watersheds were separated into upper- and lower-watershed locations. An analysis of variance indicated significant differences between upstream (more natural) and downstream (more developed) locations in the same basin, with hydrologic indicator redundancy in flow change and magnitude statistics. Issues of multicollinearity were examined using a Principal Component Analysis (PCA), with the first three components related to general flow and high/low flow statistics, frequency and time statistics, and quantity statistics. These principal components explain about 85% of the total variation. A major conclusion is that managers must be aware of differences among basins, as well as differences within basins, that will require careful selection of management procedures to achieve needed flow regimes.
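
    The PCA step described above can be sketched as follows. This is an illustrative reconstruction on synthetic data (the TEIS station-by-indicator matrix is not reproduced here), showing how the cumulative variance explained by the leading components is computed:

```python
import numpy as np

# Illustrative PCA via SVD on a synthetic station-by-indicator matrix.
# 102 stations x 8 hypothetical flow indicators; the correlated columns
# mimic the multicollinearity the abstract reports between TEIS statistics.
rng = np.random.default_rng(0)
X = rng.normal(size=(102, 8))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=102)  # induce multicollinearity
X[:, 2] = X[:, 0] - 0.2 * rng.normal(size=102)

Xc = X - X.mean(axis=0)                      # center before PCA
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
var_ratio = s**2 / np.sum(s**2)              # variance explained per component
cumulative = np.cumsum(var_ratio)
print("cumulative variance explained:", np.round(cumulative, 3))
```

    In the study's real data, the first three components of this curve account for about 85% of the total variation.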

  18. STS-55 German Payload Specialist Schlegel manipulates ROTEX controls in SL-D2

    NASA Technical Reports Server (NTRS)

    1993-01-01

    STS-55 German Payload Specialist 2 Hans Schlegel, wearing goggles (eye glasses) and positioned in front of Spacelab Deutsche 2 (SL-D2) Rack 4 System Rack controls, operates Robotics Technology Experiment (ROTEX) arm. ROTEX is a robotic arm that operates within an enclosed workcell in Rack 6 (partially visible in the foreground) and uses teleoperation from both an onboard station located nearby in Rack 4 and from a station on the ground. The device uses teleprogramming and artificial intelligence to look at the design, verification and operation of advanced autonomous systems for use in future applications. Schlegel represents the German Aerospace Research Establishment (DLR). SL-D2, a German-managed payload, is aboard Columbia, Orbiter Vehicle (OV) 102, for this science research mission.

  19. International Space Station Major Constituent Analyzer On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Thoresen, Souzan; Granahan, John; Matty, Chris

    2012-01-01

    The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic changeout, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. Over the past two years, two ORU 02 analyzer assemblies have operated nominally while two others have experienced premature on-orbit failures. These failures, as well as the nominal performances, demonstrate that ORU 02 performance remains a key determinant of MCA performance and logistical support. Monitoring several key parameters maximizes the ability to track ORU health and to properly anticipate end of life. Improvements to ion pump operation and ion source tuning are expected to improve lifetime performance of the current ORU 02 design.

  20. Free-free and fixed base modal survey tests of the Space Station Common Module Prototype

    NASA Technical Reports Server (NTRS)

    Driskill, T. C.; Anderson, J. B.; Coleman, A. D.

    1992-01-01

    This paper describes the testing aspects and the problems encountered during the free-free and fixed base modal surveys completed on the original Space Station Common Module Prototype (CMP). The CMP is a 40-ft long by 14.5-ft diameter 'waffle-grid' cylinder built by the Boeing Company and housed at the Marshall Space Flight Center (MSFC) near Huntsville, AL. The CMP modal survey tests were conducted at MSFC by the Dynamics Test Branch. The free-free modal survey tests (June '90 to Sept. '90) included interface verification tests (IFVT), often referred to as impedance measurements, mass-additive testing and linearity studies. The fixed base modal survey tests (Feb. '91 to April '91), including linearity studies, were conducted in a fixture designed to constrain the CMP in 7 total degrees-of-freedom at five trunnion interfaces (two primary, two secondary, and the keel). The fixture also incorporated an airbag off-load system designed to alleviate the non-linear effects of friction in the primary and secondary trunnion interfaces. Numerous test configurations were performed with the objective of providing a modal data base for evaluating the various testing methodologies to verify dynamic finite element models used for input to coupled load analysis.

  1. Application of Decision Tree on Collision Avoidance System Design and Verification for Quadcopter

    NASA Astrophysics Data System (ADS)

    Chen, C.-W.; Hsieh, P.-H.; Lai, W.-H.

    2017-08-01

    The purpose of this research is to build a collision avoidance system for quadcopters based on a decision tree algorithm. When the ultrasonic range finder judges that the distance to an obstacle lies within the collision avoidance interval, authority over altitude control of the UAV passes from the operator to the system. From prior experience operating quadcopters, an appropriate pitch angle can be obtained. The UAS implements three motions to avoid collisions: Case 1, the initial slow avoidance stage; Case 2, the slow avoidance stage; and Case 3, the rapid avoidance stage. Training data from the collision avoidance tests are transmitted to the ground station via a wireless transmission module for further analysis. The complete decision tree algorithm, the data transmission, and the ground station were verified in flight tests, in which the quadcopter implemented avoidance motions in real time and moved away from obstacles steadily. Within the avoidance area, the collision avoidance system has higher authority than the operator and carries out the avoidance process. The quadcopter successfully flew away from obstacles at 1.92 meters per second, and the minimum distance between the quadcopter and the obstacle was 1.05 meters.
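
    The staged avoidance logic described above amounts to a shallow decision tree over the measured obstacle distance. A hedged sketch follows; the interval thresholds are assumed for illustration only (the one number taken from the abstract is the ~1.05 m minimum observed separation):

```python
# Hypothetical sketch of the three-stage avoidance decision tree. The
# thresholds (meters) are illustrative assumptions, not the paper's values.
def avoidance_stage(distance_m: float) -> str:
    if distance_m > 3.0:
        return "operator control"        # outside the collision avoidance interval
    elif distance_m > 2.0:
        return "initial slow avoidance"  # Case 1
    elif distance_m > 1.05:
        return "slow avoidance"          # Case 2
    else:
        return "rapid avoidance"         # Case 3: near the ~1.05 m minimum separation

for d in (4.0, 2.5, 1.5, 0.9):
    print(f"{d:.2f} m -> {avoidance_stage(d)}")
```

    In the flight system, the selected stage would map to a pitch-angle command, and the system holds authority over the operator whenever the distance is inside the avoidance interval.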

  2. Detection and interpretation of seismoacoustic events at German infrasound stations

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.

  3. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
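
    The reachability-tree construction underlying this analysis can be illustrated on a plain (untimed) Petri net; the paper's CS-class technique additionally stamps each state class with clock information, which this simplified sketch omits:

```python
from collections import deque

# Minimal reachability analysis for a plain Petri net (no timing/clocks).
# Each transition maps a place to (tokens consumed, tokens produced).
transitions = {
    "t1": {"p1": (1, 0), "p2": (0, 1)},
    "t2": {"p2": (1, 0), "p3": (0, 1)},
}

def enabled(marking, trans):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= need for p, (need, _) in trans.items())

def fire(marking, trans):
    """Fire a transition: consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, (need, prod) in trans.items():
        m[p] = m.get(p, 0) - need + prod
    return m

def reachable(initial):
    """Breadth-first construction of the set of reachable markings."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for trans in transitions.values():
            if enabled(m, trans):
                nxt = fire(m, trans)
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen

markings = reachable({"p1": 1, "p2": 0, "p3": 0})
print(len(markings), "reachable markings")
```

    A CS-class-based tree would attach clock stamps to each node of this search, which is what lets end-to-end delays be read directly off the tree.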

  4. Ionizing Radiation Environment on the International Space Station: Performance vs. Expectations for Avionics and Material

    NASA Technical Reports Server (NTRS)

    Koontz, Steven L.; Boeder, Paul A.; Pankop, Courtney; Reddell, Brandon

    2005-01-01

    The role of structural shielding mass in the design, verification, and in-flight performance of the International Space Station (ISS), in both the natural and induced orbital ionizing radiation (IR) environments, is reported. Detailed consideration of the effects of both the natural and induced ionizing radiation environment during ISS design, development, and flight operations has produced a safe, efficient manned space platform that is largely immune to deleterious effects of the LEO ionizing radiation environment. The assumption of a small shielding mass for purposes of design and verification has been shown to be a valid worst-case approximation approach to design for reliability, though the predicted dependences of single event effects (SEE) on latitude, longitude, SEP events, and spacecraft structural shielding mass are not observed. The Figure of Merit (FOM) method overpredicts the rate for median shielding masses of about 10 g/cm^2 by only a factor of 3, while the Scott Effective Flux Approach (SEFA) method overestimated by about one order of magnitude, as expected. The Integral Rectangular Parallelepiped (IRPP), SEFA, and FOM methods for estimating on-orbit single event upset (SEU) rates all utilize some version of the CREME-96 treatment of energetic particle interaction with structural shielding, which has been shown to underestimate the production of secondary particles in heavily shielded manned spacecraft. The need for more work directed to development of a practical understanding of secondary particle production in massive structural shielding for SEE design and verification is indicated.
In contrast, total dose estimates using CAD-based shielding mass distribution functions and the Shieldose code provided a reasonably accurate estimate of accumulated dose in grays internal to the ISS pressurized elements, albeit as a result of using worst-on-worst case assumptions (500 km altitude x 2) that compensate for ignoring both GCR and secondary particle production in massive structural shielding.

  5. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  6. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime.
To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  7. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    Since the verification and test programme represents a significant part of the spacecraft development life cycle in terms of cost and time, discussions within a project very often aim to optimize the verification campaign by deleting or limiting some testing activities. Increasing market pressure to reduce project schedule and cost drives a dialectic process within project teams, involving programme management and the design authorities, to optimize the verification and testing programme. The paper presents the Alenia Spazio experience in this context, drawn from real project life across different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 Part 2 "Verification" and ECSS-E-10 Part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The model philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. For thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory, and others, involving more recurrent designs, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted.
    Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of the Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted (Fig. 1: model philosophy, verification and test programme definition). The verification of the project requirements is planned through a combination of suitable verification methods (in particular analysis and test) at the different verification levels (from system down to equipment), in the proper verification stages (e.g. qualification and acceptance).

  8. Electric motorcycle charging station powered by solar energy

    NASA Astrophysics Data System (ADS)

    Siriwattanapong, Akarawat; Chantharasenawong, Chawin

    2018-01-01

    This research proposes a design and verification of an off-grid photovoltaic system (PVS) for an electric motorcycle charging station to be located at King Mongkut's University of Technology Thonburi, Bangkok, Thailand. The system is designed to work independently (off-grid) and must be able to fully charge the batteries of a typical passenger electric motorcycle every evening. A 1,000 W Toyotron electric motorcycle is chosen for this study. It carries five units of 12.8 V, 20 Ah batteries in series; hence its maximum energy requirement per day is 1,200 Wh. An assessment of solar irradiation data and the Generation Factor in Bangkok, Thailand suggests a charging system consisting of one 500 W PV panel, an MPPT charge controller, a 48 V, 150 Ah battery, a 1,000 W DC-to-AC inverter, and other safety devices such as fuses and breakers. An experiment is conducted to verify the viability of the off-grid PVS charging station by collecting total daily energy generation data in the rainy season and in winter. The data suggest that the designed off-grid solar charging station for electric motorcycles is able to supply sufficient energy for daily charging requirements.
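
    The sizing logic described above can be checked with simple arithmetic. In this sketch the daily load is the 1,200 Wh figure from the abstract, while the Generation Factor and loss fraction are assumed illustrative values, not the paper's:

```python
# Back-of-envelope off-grid PV sizing check.
DAILY_LOAD_WH = 1200.0      # max daily energy requirement (from the abstract)
GENERATION_FACTOR_H = 3.5   # assumed peak-sun-equivalent hours/day for Bangkok
SYSTEM_LOSS = 0.30          # assumed combined wiring/charging/inverter losses

# Panel rating needed so that one day's generation covers one day's load.
required_panel_w = DAILY_LOAD_WH / (GENERATION_FACTOR_H * (1 - SYSTEM_LOSS))
print(f"required panel rating: ~{required_panel_w:.0f} W "
      f"(the design uses one 500 W panel)")
```

    Under these assumed values, the required rating comes out just below the 500 W panel selected in the design, which is consistent with the reported experimental result that the station supplies sufficient daily energy.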

  9. KSC-99pp0208

    NASA Image and Video Library

    1999-02-11

    KENNEDY SPACE CENTER, FLA. -- In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (left to right) Mission Specialists Valery Tokarev, Julie Payette (holding a lithium hydroxide canister) and Dan Barry. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 at about 9:32 a.m.

  10. Verification and Validation of the Coastal Modeling System. Report 1: Summary Report

    DTIC Science & Technology

    2011-12-01

    Information Program (CDIP) Buoy 036 in a water depth of 40 m (relative to Mean Tide Level, MTL) and from the National Data Buoy Center (NDBC) Buoy... August to 14 September 2005, offshore wave data from CDIP Buoy 098, the ocean surface wind from NDBC Buoy 51001, and water level data from NOAA station... buoy at 26-m depth was maintained by CDIP (Buoy 430), and data are available online at http://cdip.ucsd.edu. The wind measurements are available

  11. Space station definition and preliminary design, WP-01. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Lenda, J. A.

    1987-01-01

    System activities are summarized and an overview of the system level engineering tasks performed are provided. Areas discussed include requirements, system test and verification, the advanced development plan, customer accommodations, software, growth, productivity, operations, product assurance and metrication. The hardware element study results are summarized. Overviews of recommended configurations are provided for the core module, the USL, the logistics elements, the propulsion subsystems, reboost, vehicle accommodations, and the smart front end. A brief overview is provided for costing activities.

  12. AJ26 rocket engine test

    NASA Image and Video Library

    2010-11-10

    Fire and steam signal a successful test firing of Orbital Sciences Corporation's Aerojet AJ26 rocket engine at John C. Stennis Space Center. AJ26 engines will be used to power Orbital's Taurus II space vehicle on commercial cargo flights to the International Space Station. On Nov. 10, operators at Stennis' E-1 Test Stand conducted a 10-second test fire of the engine, the first of a series of three verification tests. Orbital has partnered with NASA to provide eight missions to the ISS by 2015.

  13. Large project experiences with object-oriented methods and reuse

    NASA Technical Reports Server (NTRS)

    Wessale, William; Reifer, Donald J.; Weller, David

    1992-01-01

    The SSVTF (Space Station Verification and Training Facility) project is completing the Preliminary Design Review of a large software development using object-oriented methods and systematic reuse. An incremental development lifecycle was tailored to provide early feedback and guidance on methods and products, with repeated attention to reuse. Object-oriented methods were formally taught and supported by realistic examples. Reuse was readily accepted and planned by the developers. Schedule and budget issues were handled by agreements and work sharing arranged by the developers.

  14. NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Carter, David; Wetzel, Scott

    2000-01-01

    The NASA SLR Operational Center is responsible for: 1) NASA SLR network control, sustaining engineering, and logistics; 2) ILRS mission operations; and 3) ILRS and NASA SLR data operations. NASA SLR network control and sustaining engineering tasks include technical support, daily system performance monitoring, system scheduling, operator training, station status reporting, system relocation, logistics and support of the ILRS Networks and Engineering Working Group. These activities ensure the NASA SLR systems are meeting ILRS and NASA mission support requirements. ILRS mission operations tasks include mission planning, mission analysis, mission coordination, development of mission support plans, and support of the ILRS Missions Working Group. These activities ensure that new mission and campaign requirements are coordinated with the ILRS. Global Normal Point (NP) data, NASA SLR Full-Rate (FR) data, and satellite predictions are managed as part of data operations. Part of this operation includes supporting the ILRS Data Formats and Procedures Working Group. Global NP data operations consist of receipt, format and data integrity verification, archiving and merging. This activity culminates in the daily electronic transmission of NP files to the CDDIS. Currently, all of these functions are automated. However, to ensure the timely and accurate flow of data, regular monitoring and maintenance of the operational software systems, computer systems and computer networking are performed. Tracking statistics between the stations and the data centers are compared periodically to eliminate lost data. Future activities in this area include sub-daily (i.e., hourly) NP data management, more stringent data integrity tests, and automatic station notification of format and data integrity issues.

  15. Space Shuttle Range Safety Command Destruct System Analysis and Verification. Phase 1. Destruct System Analysis and Verification

    DTIC Science & Technology

    1981-03-01

    overcome the shortcomings of this system. A phase III study develops the breakup model of the Space Shuttle cluster at various times into flight. The... (remainder of the snippet is table-of-contents residue: Rocket Model; Combustion Chamber Operation; Results)

  16. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  17. Fire safety experiments on MIR Orbital Station

    NASA Technical Reports Server (NTRS)

    Egorov, S. D.; Belayev, A. YU.; Klimin, L. P.; Voiteshonok, V. S.; Ivanov, A. V.; Semenov, A. V.; Zaitsev, E. N.; Balashov, E. V.; Andreeva, T. V.

    1995-01-01

    The process of heterogeneous combustion of most materials under zero-g without forced motion of air is practically impossible. However, ventilation is required to support astronauts' life and cool equipment. The presence of ventilation flows in station compartments can cause a fire in the event of accidental ignition. An additional, but exceedingly important parameter of the fire risk of solid materials under zero-g is the minimum air flow velocity at which the extinction of materials occurs. Therefore, the conception of fire safety can be based on temporarily lowering the intensity of ventilation and even turning it off. The information on the limiting conditions of combustion under natural conditions is needed from both scientific and practical points of view. It will enable us to judge the reliability of results of ground-based investigations and develop a conception of fire safety of inhabited sealed compartments of space stations, to be provided by means of nontraditional and highly effective methods without either employing large quantities of fire-extinguishing compounds or imposing hard restrictions on the use of polymers. In this connection, an experimental installation was created to study the process of heterogeneous combustion of solid non-metals and to determine the conditions of its extinction under microgravity. This installation was delivered to the orbital station 'Mir', and the cosmonauts Viktorenko and Kondakova performed initial experiments on it in late 1994. The experimental installation consists of a combustion chamber with an electrical system for ignition of samples, a device for cleaning air of combustion products, an air suction unit, air pipes and a control panel. The whole experiment is controlled by telemetry and recorded with two video cameras located at two different places. Besides the picture, parameters are recorded to determine the velocity of the air flow incoming to the samples, the time points of switching the devices on/off, etc. 
The combustion chamber temperature is also controlled. The main objectives of experiments of this series were as follows: (1) verification of the reliability of the installation in orbital flight; (2) verification of the experimental procedure; and (3) investigation of combustion of two types of materials under microgravity at various velocities of the incoming air flow.

  18. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    PubMed

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, selecting the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (p<0.01), with a corresponding significant decrease in US-based vendors (71.9% in 2013 and 65% in 2014). Most vendors did little to prevent youth access in either year, with 67.6% in 2013 and 63.2% in 2014 employing no age verification or relying exclusively on strategies that cannot effectively verify age. Effective age verification strategies such as online age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, p<0.01) or age verification at delivery (6.4% in 2013 and 8.1% in 2014) were rarely advertised on IEV websites. Nearly all vendors advertised accepting credit cards, and about three-quarters advertised shipping via the United States Postal Service, similar to the internet cigarette industry prior to federal bans. The number of IEVs grew sharply from 2013 to 2014, with poor age verification practices. New and expanded regulations for online e-cigarette sales are needed, including strict age and identity verification requirements. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
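
    The year-over-year comparisons reported above (e.g. 74.7% vs 64.3% online-only vendors, p<0.01) are consistent with a pooled two-proportion z-test. A quick illustrative sketch, with counts reconstructed from the reported percentages and sample sizes (the study's exact counts may differ slightly):

```python
from math import sqrt, erfc

def two_proportion_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test (two-sided), as commonly used to
    compare a prevalence between two survey waves."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))     # pooled standard error
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))               # z and two-sided p-value

# Counts reconstructed from the abstract: 74.7% of 281 (2013), 64.3% of 283 (2014).
z, p = two_proportion_ztest(210, 281, 182, 283)
```

    With these reconstructed counts the test gives z ≈ 2.7 and a two-sided p-value below 0.01, matching the significance level reported in the abstract.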

  19. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
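
    The Oral Messages algorithm analyzed above is easy to state operationally. The following is a rough illustrative simulation of OM(m), not Rushby's formal specification; the faulty-node behavior (sending alternating values) is one arbitrary choice among the behaviors a Byzantine node may exhibit:

```python
from collections import Counter

def om(commander, lieutenants, value, m, faulty):
    """Simulate the recursive Oral Messages algorithm OM(m).
    Returns a dict mapping each lieutenant to its decided value."""
    # Round 1: the commander sends a value to every lieutenant.
    sent = {}
    for i, lt in enumerate(lieutenants):
        if commander in faulty:
            # A faulty commander sends inconsistent (alternating) values.
            sent[lt] = "ATTACK" if i % 2 == 0 else "RETREAT"
        else:
            sent[lt] = value
    if m == 0:
        return sent
    # Recursive step: each lieutenant relays its value via OM(m-1),
    # then every lieutenant takes the majority of all values received.
    decided = {}
    for lt in lieutenants:
        received = {lt: sent[lt]}  # value obtained directly from the commander
        for src in lieutenants:
            if src == lt:
                continue
            sub = om(src, [x for x in lieutenants if x != src],
                     sent[src], m - 1, faulty)
            received[src] = sub[lt]
        decided[lt] = Counter(received.values()).most_common(1)[0][0]
    return decided
```

    With four nodes and m = 1 the sketch exhibits both interactive-consistency conditions: loyal lieutenants follow a loyal commander's order even with one traitorous lieutenant, and they agree with each other even when the commander itself is faulty.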

  20. CTBT on-site inspections

    NASA Astrophysics Data System (ADS)

    Zucca, J. J.

    2014-05-01

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS, along with technical monitoring data from CTBT member countries as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to become a deterrent to anyone considering conducting a nuclear explosion in violation of the Treaty.

  1. Independent calculation of monitor units for VMAT and SPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xin; Bush, Karl; Ding, Aiping

    Purpose: Dose and monitor units (MUs) represent two important facets of a radiation therapy treatment. In current practice, verification of a treatment plan is commonly done in the dose domain, in which a phantom measurement or forward dose calculation is performed to examine the dosimetric accuracy and the MU settings of a given treatment plan. While it is desirable to verify the MU settings directly, a computational framework for obtaining the MU values from a known dose distribution has yet to be developed. This work presents a strategy to independently calculate the MUs from a given dose distribution of volumetric modulated arc therapy (VMAT) and station parameter optimized radiation therapy (SPORT). Methods: The dose at a point can be expressed as a sum of contributions from all the station points (or control points). This relationship forms the basis of the proposed MU verification technique. To proceed, the authors first obtain the matrix elements which characterize the dosimetric contribution of the involved station points by computing the doses at a series of voxels, typically on the prescription surface of the VMAT/SPORT treatment plan, with unit MU setting for all the station points. An in-house Monte Carlo (MC) software is used for the dose matrix calculation. The MUs of the station points are then derived by minimizing the least-squares difference between doses computed by the treatment planning system (TPS) and those of the MC for the selected set of voxels on the prescription surface. The technique is applied to 16 clinical cases with a variety of energies, disease sites, and TPS dose calculation algorithms. Results: For all plans except the lung cases with large tissue density inhomogeneity, the independently computed MUs agree with those of the TPS to within 2.7% for all the station points. 
    In the dose domain, no significant difference between the MC and Eclipse Anisotropic Analytical Algorithm (AAA) dose distributions is found in terms of isodose contours, dose profiles, gamma index, and dose volume histogram (DVH) for these cases. For the lung cases, the MC-calculated MUs differ significantly from those of the treatment plan computed using AAA. However, the discrepancies are reduced to within 3% when the TPS dose calculation algorithm is switched to a transport equation-based technique (Acuros™). Comparison in the dose domain between the MC and Eclipse AAA/Acuros calculations yields conclusions consistent with the MU calculation. Conclusions: A computational framework relating the MU and dose domains has been established. The framework not only enables direct verification of the MU values of the involved station points of a VMAT plan in the MU domain, but also provides a much-needed mechanism to adaptively modify the MU values of the station points in accordance with a specific change in the dose domain.
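
    The least-squares step in the abstract can be sketched compactly. This is an illustrative reconstruction, not the authors' code: the function and variable names are hypothetical, and a real dose-influence matrix would come from a Monte Carlo engine rather than the hand-made stand-in used below:

```python
import numpy as np

def verify_mus(dose_matrix, planned_dose, tps_mus, tol=0.027):
    """Recover station-point MUs from a dose distribution and compare them
    with the TPS values (illustrative sketch; names are hypothetical).

    dose_matrix[i, j] -- dose at voxel i per unit MU of station point j
                         (e.g. precomputed by an independent MC calculation)
    planned_dose[i]   -- TPS dose at voxel i on the prescription surface
    tol               -- relative tolerance (2.7% echoes the reported agreement)
    """
    # Independent MUs: minimize ||A m - d||^2 over the MU vector m.
    mus, *_ = np.linalg.lstsq(dose_matrix, planned_dose, rcond=None)
    rel_err = np.abs(mus - tps_mus) / np.maximum(np.abs(tps_mus), 1e-12)
    return mus, bool(np.all(rel_err <= tol))
```

    For a consistent system (dose computed from known MUs with a full-column-rank matrix), the least-squares solve recovers the MUs up to floating-point error; the clinically interesting case is when the recovered and planned MUs disagree.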

  2. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
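
    A standard ingredient of such verification problems is checking the observed order of accuracy against the theoretical order of the discretization. A minimal sketch of the usual Richardson-style estimate, assuming solutions on three grids with a constant refinement ratio:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three systematically
    refined grids (spacing shrinks by a constant ratio r each level).
    For a scheme with discretization error ~ C*h**p, successive solution
    differences shrink by r**p, so p = log(ratio) / log(r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
```

    For example, a second-order scheme with error C·h² on grids h = 4, 2, 1 produces solution differences in the ratio 4:1, giving an observed order of exactly 2; a mismatch between observed and theoretical order flags a coding or convergence problem.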

  3. INF and IAEA: A comparative analysis of verification strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  4. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  5. Improved performance comparisons of radioxenon systems for low level releases in nuclear explosion monitoring.

    PubMed

    Haas, Derek A; Eslinger, Paul W; Bowyer, Theodore W; Cameron, Ian M; Hayes, James C; Lowrey, Justin D; Miley, Harry S

    2017-11-01

    The Comprehensive Nuclear-Test-Ban Treaty bans all nuclear tests and mandates development of verification measures to detect treaty violations. One verification measure is detection of radioactive xenon isotopes produced in the fission of actinides. The International Monitoring System (IMS) currently deploys automated radioxenon systems that can detect four radioxenon isotopes. Radioxenon systems with lower detection limits are currently in development. Historically, the sensitivity of radioxenon systems was measured by the minimum detectable concentration for each isotope. In this paper we analyze the response of radioxenon systems using rigorous metrics in conjunction with hypothetical representative releases indicative of an underground nuclear explosion instead of using only minimum detectable concentrations. Our analyses incorporate the impact of potential spectral interferences on detection limits and the importance of measuring isotopic ratios of the relevant radioxenon isotopes in order to improve discrimination from background sources, particularly for low-level releases. To provide a sufficient data set for analysis, hypothetical representative releases are simulated every day from the same location for an entire year. The performance of three types of samplers is evaluated assuming they are located at 15 IMS radionuclide stations in the region of the release point. The performance of two IMS-deployed samplers and a next-generation system is compared with proposed metrics for detection and discrimination using representative releases from the nuclear test site used by the Democratic People's Republic of Korea. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Contribution of the infrasound technology to characterize large scale atmospheric disturbances and impact on infrasound monitoring

    NASA Astrophysics Data System (ADS)

    Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter

    2016-04-01

    The International Monitoring System (IMS) developed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound such as extreme events (e.g. meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g. explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scales and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by the current model predictions; however, accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves generating infrasound partial reflections and modifications of the infrasound waveguide, (ii) convection from thunderstorms and mountain waves generating gravity waves, (iii) stratospheric warming events which yield wind inversions in the stratosphere, and (iv) planetary waves which control the global atmospheric circulation. Improved knowledge of these disturbances and assimilation in future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.

  7. Space robotics--DLR's telerobotic concepts, lightweight arms and articulated hands.

    PubMed

    Hirzinger, G; Brunner, B; Landzettel, K; Sporer, N; Butterfass, J; Schedl, M

    2003-01-01

    The paper briefly outlines DLR's experience with real space robot missions (ROTEX and ETS VII). It then discusses forthcoming projects, e.g., free-flying systems in low or geostationary orbit and robot systems around the space station ISS, where the telerobotic system MARCO might represent a common baseline. Finally, it describes our efforts in developing a new generation of "mechatronic" ultra-lightweight arms with multifingered hands. The third arm generation is operable now (approaching present-day technical limits). In a similar way, DLR's four-fingered hand II was a big step towards higher reliability and yet better performance. Artificial robonauts for space are a central goal now for the Europeans as well as for NASA, and the first verification tests of DLR's joint components are supposed to fly on the space station as early as the end of '93.

  8. 78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...

  10. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    NASA Astrophysics Data System (ADS)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes as foreseen by the Mission Increment planning, but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. The safety verification covers, on the one hand, the on-orbit configuration itself, including the hardware and software products, and, on the other hand, the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. 
A major challenge is implementing a safety process that is rigorous enough to provide reliable verification of on-board safety, yet flexible enough for manned space operations with scientific objectives. Since launch, a number of lessons learned from COLUMBUS operations have already been incorporated, especially in the IEHA, improving the flexibility of on-board operations without degradation of safety.

  11. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
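
    The TMR principle itself is simple even though verifying its insertion in a synthesized netlist is not. A toy sketch of the majority voter and a brute-force single-fault check (illustrative only; real TMR verification operates on FPGA netlists, not Python functions):

```python
import random

def voter(a, b, c):
    """Bitwise majority voter: each output bit takes the value held by
    at least two of the three replicas."""
    return (a & b) | (b & c) | (a & c)

def check_single_fault_masking(module, n_bits=8, trials=200):
    """Check that any single-bit upset in one replica's output is masked
    by the voter -- the defining property correct TMR insertion preserves."""
    rng = random.Random(0)
    for _ in range(trials):
        x = rng.getrandbits(n_bits)
        good = module(x)
        for bit in range(n_bits):
            bad = good ^ (1 << bit)  # single-event upset in one replica
            assert voter(bad, good, good) == good
            assert voter(good, bad, good) == good
            assert voter(good, good, bad) == good
    return True
```

    Note that this only exercises the voter; as the abstract emphasizes, confirming that a real design still contains three independent replicas (e.g. that a synthesis optimizer has not merged them, which would defeat the voter) requires formal netlist analysis.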

  12. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  13. Long-term observations of tropospheric ozone: GAW Measurement Guidelines

    NASA Astrophysics Data System (ADS)

    Tarasova, Oksana; Galbally, Ian E.; Schultz, Martin G.

    2013-04-01

    The Global Atmosphere Watch (GAW) Programme of the World Meteorological Organization (WMO) coordinates long-term observations of the chemical composition and physical properties of the atmosphere which are relevant for understanding of atmospheric chemistry and climate change. Atmospheric observations of reactive gases (tropospheric ozone, carbon monoxide, volatile organic compounds and nitrogen oxides) coordinated by the GAW Programme complement local and regional scale air quality monitoring efforts. As part of the GAW quality assurance (QA) system detailed measurement guidelines for atmospheric trace species are developed by international expert teams at irregular intervals. The most recent report focuses on continuous in-situ measurements of ozone in the troposphere, performed in particular at continental or island sites with altitudes ranging from sea level to mountain tops. Data Quality Objectives (DQOs) are defined for different applications of the data (e.g. trend analysis and verification of global model forecasts). These DQOs include a thorough discussion of the tolerable level of measurement uncertainty and data completeness. The guidelines present the best practices and practical arrangements adopted by the GAW Programme in order to enable the GAW station network to approach or achieve the defined tropospheric ozone DQOs. The document includes information on the selection of station and measurement locations, required skills and training of staff, recommendations on the measurement technique and the necessary equipment to perform highest quality measurements, rules for conducting the measurements, preparing the data and archiving them, and more. Much emphasis is given to discussions about how to ensure the quality of the data through tracing calibrations back to primary standards, proper calibration and data analysis, etc. 
In the GAW Programme the QA system is implemented through Central Facilities (Central Calibration Laboratories, World and Regional Calibration Centers and World Data Centers), Scientific Advisory Groups and the GAW Training and Education Center. These bodies support primary standards, provide calibration and data archiving facilities, coordinate comparison campaigns, perform station audits, and provide documentation and training of personnel.

  14. Being prepared to verify the CTBT-Atmospheric Transport modeling and radionuclide analysis at the Austrian National Data Centre during the NDC Preparedness Exercise 2009

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Schraick, Irene

    2010-05-01

    An explosion in the Kara-Zhyra mine in Eastern Kazakhstan on 28 November 2009 around 07:20 UTC was recorded by both the CTBTO seismic and infrasound networks. This event triggered a world-wide preparedness exercise among the CTBTO National Data Centres. Within an hour after the event was selected by the German NDC, a computer program developed by NDC Austria, based on weather forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) and from the U.S. National Centers for Environmental Prediction (NCEP), was started to analyse which radionuclide stations of the CTBTO International Monitoring System (IMS) would potentially be affected by the release from a nuclear explosion at this place in the course of the following 3-10 days. These calculations were updated daily to consider the observed state of the atmosphere instead of the predicted one. Based on these calculations, automated and reviewed radionuclide reports from the potentially affected stations as produced by the CTBTO International Data Centre (IDC) were examined. An additional analysis of interesting spectra was provided by the Seibersdorf Laboratories. Based on all the results coming in, no evidence whatsoever was found that the explosion in Kazakhstan was nuclear. This is in accordance with ground-truth information saying that the event was caused by the detonation of more than 53 tons of explosives as part of mining operations. A number of conclusions can be drawn from this exercise. First, the international, bilateral as well as national mechanisms and procedures in place for such an event worked smoothly. Second, the products and services from the CTBTO IDC proved to be very useful in assisting the member states in their verification efforts. Last but not least, issues with the availability of data from IMS radionuclide stations do remain.

  15. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  16. Goddard high resolution spectrograph science verification and data analysis

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The data analysis performed in support of the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described. Detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data on an equally spaced image grid.

  17. Deductive Evaluation: Formal Code Analysis With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
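    The flavor of property discovery described above can be illustrated with a small sketch that guesses a closed-form polynomial invariant from concrete loop evaluations and then cross-checks it. This is only a toy stand-in for the PVS-based deductive framework in the paper; the names `run_loop` and `infer_invariant` are hypothetical.

```python
from fractions import Fraction
from math import comb

def run_loop(n):
    """The concrete loop under analysis: s += i for i in 0..n-1."""
    s = 0
    for i in range(n):
        s += i
    return s

def infer_invariant(f, degree_bound=4):
    """Guess a closed-form polynomial for f(k) from concrete evaluations
    using Newton's forward-difference formula (exact rational arithmetic),
    and return the candidate as a callable."""
    samples = [Fraction(f(k)) for k in range(degree_bound + 2)]
    # Leading entries of successive difference rows are the Newton coefficients.
    coeffs, row = [], samples
    while row:
        coeffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return lambda k: sum(c * comb(k, j) for j, c in enumerate(coeffs))

inv = infer_invariant(run_loop)
# Cross-check the candidate invariant against fresh evaluations.
assert all(inv(k) == run_loop(k) for k in range(20))
```

    For this loop the inferred expression is k(k-1)/2, the kind of inferred value expression the abstract refers to; a deductive system would additionally prove it by induction rather than sample it.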

  18. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure so that computer programs for dynamic structural analysis can be used. The second method utilizes modal-coupling techniques for experimental verification: only spacecraft components are vibrated, and the modes and frequencies of the complete vehicle are deduced from the results obtained in the component tests.
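    For the direct numerical approach, the core computation is the generalized eigenproblem det(K - w²M) = 0. A minimal sketch for a two-degree-of-freedom spring-mass chain, a hypothetical stand-in for a finite element shuttle model (which would have thousands of degrees of freedom and require a numerical eigensolver):

```python
import math

def two_dof_modes(m1, m2, k1, k2):
    """Natural frequencies (rad/s) of the chain ground -k1- m1 -k2- m2.
    det(K - w^2 M) = 0 reduces to a quadratic in lam = w^2:
    (m1*m2) lam^2 - (m2(k1+k2) + m1 k2) lam + k1 k2 = 0."""
    a = m1 * m2
    b = -(m2 * (k1 + k2) + m1 * k2)
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(lam) for lam in lams]

w1, w2 = two_dof_modes(1.0, 1.0, 1.0, 1.0)
```

    For unit masses and stiffnesses the two frequencies come out as the golden-ratio pair, a standard textbook check on the characteristic polynomial.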

  19. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  20. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. To achieve this, we use the mutually-shared region between the location claimant and the verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.
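    The geometric idea behind a mutually-shared region can be sketched as follows. This toy omits the cryptographic and protocol details of MSRLV entirely; the function names and the plausibility rule are illustrative assumptions only.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def in_shared_region(p, claimed_pos, verifier_pos, radio_range):
    """True if point p lies within radio range of BOTH the claimed
    location and the verifier: the 'mutually-shared region'."""
    return (dist(p, claimed_pos) <= radio_range
            and dist(p, verifier_pos) <= radio_range)

def verify_claim(claimed_pos, verifier_pos, radio_range, witnesses_heard):
    """Toy check: a claim is plausible only if claimant and verifier are
    in radio range of each other and every witness the claimant reports
    hearing actually lies inside the shared region."""
    if dist(claimed_pos, verifier_pos) > radio_range:
        return False
    return all(in_shared_region(w, claimed_pos, verifier_pos, radio_range)
               for w in witnesses_heard)

# An honest claim versus a spoofed far-away claim.
ok = verify_claim((0.0, 0.0), (5.0, 0.0), 10.0, [(2.5, 1.0)])
spoofed = verify_claim((100.0, 0.0), (5.0, 0.0), 10.0, [(2.5, 1.0)])
```

    The point of the sketch is only that membership in the shared region is a purely local geometric test, which is what lets the real scheme avoid explicit involvement of many sensors.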

  1. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier, or a trusted third party, which cause additional communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. To achieve this, we use the mutually-shared region between the location claimant and the verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  2. 2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter

    DTIC Science & Technology

    2007-08-23

    Systems Trade-Off Analysis and Optimization; Verification and Validation; On-Board Diagnostics and Self-Healing; Security and Anti-Tampering; Rapid...verification; Safety and reliability analysis of flight and mission critical systems; On-Board Diagnostics and Self-Healing; Model-based monitoring and...self-healing; On-board diagnostics and self-healing; Autonomic computing; Network intrusion detection and prevention; Anti-Tampering and Trust

  3. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the KSC Shuttle Landing Facility, the Italian Space Agency's Multi-Purpose Logistics Module Donatello begins rolling out of the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  4. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the KSC Shuttle Landing Facility, an Airbus "Beluga" air cargo plane opens to reveal its cargo, the Italian Space Agency's Multi-Purpose Logistics Module Donatello, from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  5. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the KSC Shuttle Landing Facility, the Italian Space Agency's Multi-Purpose Logistics Module Donatello rolls out of the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  6. The development of test beds to support the definition and evolution of the Space Station Freedom power system

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Frye, Robert J.; Phillips, Rudy L.

    1991-01-01

    Since the beginning of the Space Station Freedom Program (SSFP), the Lewis Research Center (LeRC) and the Rocketdyne Division of Rockwell International have had extensive efforts underway to develop test beds to support the definition of the detailed electrical power system design. Because of the extensive redirections that have taken place in the Space Station Freedom Program in the past several years, the test bed effort was forced to accommodate a large number of changes. A short history of these program changes and their impact on the LeRC test beds is presented to understand how the current test bed configuration has evolved. The current test objectives and the development approach for the current DC Test Bed are discussed. A description of the test bed configuration, along with its power and controller hardware and its software components, is presented. Next, the uses of the test bed during the mature design and verification phase of SSFP are examined. Finally, the uses of the test bed in operation and evolution of the SSF are addressed.

  7. The development of test beds to support the definition and evolution of the Space Station Freedom power system

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Frye, Robert J.; Phillips, Rudy L.

    1991-01-01

    Since the beginning of the Space Station Freedom Program (SSFP), the NASA Lewis Research Center (LeRC) and the Rocketdyne Division of Rockwell International have had extensive efforts underway to develop testbeds to support the definition of the detailed electrical power system design. Because of the extensive redirections that have taken place in the Space Station Freedom Program in the past several years, the test bed effort was forced to accommodate a large number of changes. A short history of these program changes and their impact on the LeRC test beds is presented to understand how the current test bed configuration has evolved. The current test objectives and the development approach for the current DC test bed are discussed. A description of the test bed configuration, along with its power and controller hardware and its software components, is presented. Next, the uses of the test bed during the mature design and verification phase of SSFP are examined. Finally, the uses of the test bed in the operation and evolution of the SSF are addressed.

  8. Structural technology challenges for evolutionary growth of Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Doiron, Harold H.

    1990-01-01

    A proposed evolutionary growth scenario for Space Station Freedom was defined recently by a NASA task force created to study requirements for a Human Exploration Initiative. The study was an initial response to President Bush's July 20, 1989 proposal to begin a long range program of human exploration of space including a permanently manned lunar base and a manned mission to Mars. This growth scenario evolves Freedom into a critical transportation node to support lunar and Mars missions. The growth scenario begins with the Assembly Complete configuration and adds structure, power, and facilities to support a Lunar Transfer Vehicle (LTV) verification flight. Evolutionary growth continues to support expendable, then reusable LTV operations, and finally, LTV and Mars Transfer Vehicle (MTV) operations. The significant structural growth and additional operations creating new loading conditions will present new technological and structural design challenges in addition to the considerable technology requirements of the baseline Space Station Freedom program. Several structural design and technology issues of the baseline program are reviewed and related technology development required by the growth scenario is identified.

  9. Evaluation of Preproduction Hardware Components for IMS Station Upgrades to Reduce Manufacturers Development Time

    NASA Astrophysics Data System (ADS)

    Hart, Darren; Pearce, Nathan; Starovoit, Yuri; Guralp, Cansun

    2014-05-01

    Since the Comprehensive Nuclear-Test-Ban Treaty was opened for signature in 1996, nearly 80% of the network has been certified as operational, and those stations are sending data to the International Data Centre (IDC) in Vienna. Several International Monitoring System (IMS) monitoring facilities have been in operation for close to 15 years, and several certified stations are facing equipment obsolescence issues. The search for engineering solutions to replace obsolete hardware components is guided by two primary goals: 1) be compliant with IMS minimum technical requirements and 2) be able to be integrated with the existing system. To reduce the development and verification time necessary to address obsolescence in equipment, the PTS has requested the preproduction testing of the recently revised Guralp CMG-DM24AM digitizer. Performing preproduction testing has helped in identifying issues, which Guralp Systems has resolved. In our poster, we will review the reasons for the digitizer updates, present results of the preproduction testing of the Guralp digitizer, and comment on the value this process has provided to the IMS operation.

  10. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
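    The temporal-logic queries such a verification module issues reduce, in the simplest case, to reachability questions over the qualitative state graph. Below is a minimal sketch of checking an EF property ("the property holds in some reachable state") by breadth-first search, on an invented toy model rather than the GNA/NUSMV machinery itself.

```python
from collections import deque

def check_ef(transitions, start, prop):
    """Model-check EF prop on an explicit state graph by BFS: return True
    iff some state reachable from `start` satisfies `prop`."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        if prop(s):
            return True
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return False

# Hypothetical qualitative model: states are (geneA_level, geneB_level).
g = {(0, 0): [(1, 0)], (1, 0): [(1, 1)], (1, 1): [(0, 1)], (0, 1): [(0, 0)]}
reachable = check_ef(g, (0, 0), lambda s: s == (0, 1))
```

    Real model checkers handle far richer temporal operators (nesting, fairness, until), but reachability conveys why an explicit verification back end scales better than eyeballing simulation traces.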

  11. New Perspectives of "old" Data Sources: the Dataset of Long-Term Research Watersheds in the Former Soviet Union for the Task of Hydrological Models Development, Verification and Comparison

    NASA Astrophysics Data System (ADS)

    Lebedeva, L.; Semenova, O.

    2013-12-01

    Lack of detailed process-oriented observational data is often cited as one of the major obstacles to further advances in hydrological process understanding and to the development of deterministic models that do not rely on calibration. New sources of hydrological information (satellites, radars, etc.) hold promise for the future but cannot at present completely replace conventional and experimental observations. Long-term, data-rich research catchments remain a valuable, if not the only, source of information for the development, verification, regionalization and comparison of different hydrological and environmental models. In the former Soviet Union, a set of more than 20 such basins was operated under a single observational program from the 1930s-1950s until the 1990s. These research basins, so-called water-balance stations, covered all main climatic and landscape zones, such as taiga, forest-steppe, steppe, desert, mountains and permafrost regions. Each station conducted a broad range of standard, special and experimental hydrometeorological field studies, including spatially distributed meteorological observations, soil and snow state variables, measurements of groundwater levels, hydrochemistry, evapotranspiration, and discharges in several, often nested, slope- and small-scale watersheds. The data were accompanied by descriptions of observational techniques and landscapes, allowing natural conditions to be linked with dominant hydrological processes. Each station is representative of a larger area, and the results of local studies can be transferred to other basins in similar conditions. Until recently the data existed only in hard copy, in Russian, and they are therefore not yet sufficiently explored. We are currently digitizing the main part of the observational and supporting materials and making them available for any scientific purpose via the website http://hydrograph-model.ru/.
We invite the hydrological community to use these data for comprehensive intercomparison studies of our models and their modules, in order to reject inadequate algorithms and advance process understanding and modeling efforts in different environments.

  12. Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures

    NASA Technical Reports Server (NTRS)

    Hartle, M. S.; Mcknight, R. L.; Huang, H.; Holt, R.

    1992-01-01

    Described here are the accomplishments of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis and tailoring of graded composite structures. The capabilities developed over the course of the program are the analyzer module and the tailoring module for the modeling of graded materials. Highlighted accomplishments for the past year include the addition of a buckling analysis capability, the addition of mode shape slope calculation for flutter analysis, verification of the analysis modules using simulated components, and verification of the tailoring module.

  13. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. 
    Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve the scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition the program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. 
    The dependence analyses that facilitate the generation of the impact summaries could, we believe, be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the KLEE Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
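    The partition-then-check idea can be illustrated concretely. The sketch below replaces symbolic execution and the SMT decision procedure with exhaustive enumeration over a small input domain; the two program versions and the impact predicate are invented for illustration only.

```python
def v1(x):
    """Original version."""
    if x < 0:
        return -x
    return x

def v2(x):
    """Refactored version: only the negative branch was edited."""
    if x < 0:
        return abs(x)
    return x

def impacted(x):
    """Change-impact predicate: diffing shows only the x < 0 branch
    changed, so only those inputs can expose a behavioral difference."""
    return x < 0

# Bounded equivalence check restricted to the impacted behaviors:
# unimpacted inputs are provably identical and need not be re-checked.
domain = range(-100, 101)
assert all(v1(x) == v2(x) for x in domain if impacted(x))
```

    The payoff in the real technique is the same as here: the equivalence query covers only the impacted summaries, not the whole program, which is what shrinks the decision-procedure workload.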

  14. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  15. Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.

    2010-01-01

    The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting-edge launch technologies and state-of-the-art design and development. In support of the vehicle's design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model: a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM model can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.

  16. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  17. KSC-99pd0209

    NASA Image and Video Library

    1999-02-11

    KENNEDY SPACE CENTER, FLA. -- In the SPACEHAB Facility, the STS-96 crew looks at equipment as part of a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Mission Specialist Ellen Ochoa (behind the opened storage cover), Commander Kent Rominger, Pilot Rick Husband (holding a lithium hydroxide canister) and Mission Specialists Dan Barry, Valery Tokarev of Russia and Julie Payette. In the background is TTI interpreter Valentina Maydell. The other crew member at KSC for the IVT is Mission Specialist Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform spacewalking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 at about 9:32 a.m.

  18. A space station Structures and Assembly Verification Experiment, SAVE

    NASA Technical Reports Server (NTRS)

    Russell, R. A.; Raney, J. P.; Deryder, L. J.

    1986-01-01

    The Space Station structure has been baselined to be a 5 M (16.4 ft) erectable truss. This structure will provide the overall framework to attach laboratory modules and other systems, subsystems and utilities. The assembly of this structure represents a formidable EVA challenge. To validate this capability the Space Station Structures/Dynamics Technical Integration Panel (TIP) met to develop the necessary data for an integrated STS structures flight experiment. As a result of this meeting, the Langley Research Center initiated a joint Langley/Boeing Aerospace Company study which supported the structures/dynamics TIP in developing the preliminary definition and design of a 5 M erectable space station truss and the resources required for a proposed flight experiment. The purpose of the study was to: (1) devise methods of truss assembly by astronauts; (2) define a specific test matrix for dynamic characterization; (3) identify instrumentation and data system requirements; (4) determine the power, propulsion and control requirements for the truss on-orbit for 3 years; (5) study the packaging of the experiment in the orbiter cargo bay; (6) prepare a preliminary cost estimate and schedule for the experiment; and (7) provide a list of potential follow-on experiments using the structure as a free flyer. The results of this three month study are presented.

  19. GPS Estimates of Integrated Precipitable Water Aid Weather Forecasters

    NASA Technical Reports Server (NTRS)

    Moore, Angelyn W.; Gutman, Seth I.; Holub, Kirk; Bock, Yehuda; Danielson, David; Laber, Jayme; Small, Ivory

    2013-01-01

Global Positioning System (GPS) meteorology provides enhanced-density, low-latency (30-min resolution), integrated precipitable water (IPW) estimates to NOAA NWS (National Oceanic and Atmospheric Administration National Weather Service) Weather Forecast Offices (WFOs) to provide improved model and satellite data verification capability and more accurate forecasts of extreme weather such as flooding. An early activity of this project was to increase the number of stations contributing to the NOAA Earth System Research Laboratory (ESRL) GPS meteorology observing network in Southern California by about 27 stations. Following this, the Los Angeles/Oxnard and San Diego WFOs began using the enhanced GPS-based IPW measurements provided by ESRL in the 2012 and 2013 monsoon seasons. Forecasters found GPS IPW to be an effective tool in evaluating model performance, and in monitoring monsoon development between weather model runs for improved flood forecasting. GPS stations are multi-purpose, and routine processing for position solutions also yields estimates of tropospheric zenith delays, which can be converted into mm-accuracy PWV (precipitable water vapor) using in situ pressure and temperature measurements, the basis for GPS meteorology. NOAA ESRL has implemented this concept with a nationwide distribution of more than 300 "GPSMet" stations providing IPW estimates at sub-hourly resolution currently used in operational weather models in the U.S.
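The delay-to-PWV conversion mentioned above can be sketched in a few lines. This is an illustrative implementation using the standard Saastamoinen hydrostatic delay and Bevis-style refractivity constants; the function names and default site parameters are assumptions for illustration, not ESRL's actual processing.

```python
import math

def zenith_hydrostatic_delay(p_hpa, lat_deg, h_km):
    """Saastamoinen zenith hydrostatic delay (m) from surface pressure."""
    f = 1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg)) - 0.00028 * h_km
    return 0.0022768 * p_hpa / f

def pwv_from_ztd(ztd_m, p_hpa, t_surface_k, lat_deg=34.0, h_km=0.1):
    """Convert a GPS zenith total delay (m) to precipitable water vapor (mm)."""
    zwd = ztd_m - zenith_hydrostatic_delay(p_hpa, lat_deg, h_km)
    tm = 70.2 + 0.72 * t_surface_k           # Bevis mean-temperature proxy (K)
    k2p, k3 = 0.221, 3739.0                  # refractivity constants, K/Pa and K^2/Pa
    rho_w, r_v = 1000.0, 461.5               # water density, vapor gas constant (SI)
    pi_factor = 1e6 / (rho_w * r_v * (k3 / tm + k2p))   # dimensionless, ~0.15
    return pi_factor * zwd * 1000.0          # m -> mm

# A typical mid-latitude value: ZTD ~2.40 m at sea-level pressure
pwv = pwv_from_ztd(2.40, 1013.0, 290.0)
```

For a 2.40 m total delay at 1013 hPa this yields a PWV in the mid-teens of millimeters, which is the expected order of magnitude for a temperate site.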

  20. Analysis of potential errors in real-time streamflow data and methods of data verification by digital computer

    USGS Publications Warehouse

    Lystrom, David J.

    1972-01-01

    Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
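The substitution of a simulated discharge value for missing or erroneous readings described above can be sketched as a simple threshold check. This is a minimal illustration in the spirit of the basic statistical comparison, not the USGS routines themselves; the function name and the 25 % tolerance are assumptions.

```python
def verify_streamflow(observed, simulated, tolerance=0.25):
    """Flag real-time discharge readings that deviate from a simulated
    estimate by more than `tolerance` (as a fraction), and substitute the
    simulated value for flagged or missing readings."""
    verified, flags = [], []
    for obs, sim in zip(observed, simulated):
        bad = obs is None or (sim > 0 and abs(obs - sim) / sim > tolerance)
        verified.append(sim if bad else obs)
        flags.append(bad)
    return verified, flags

obs = [102.0, 95.0, None, 250.0, 101.0]   # cfs; None = missing telemetry
sim = [100.0, 98.0, 99.0, 100.0, 100.0]   # watershed-model estimates
vals, flags = verify_streamflow(obs, sim)
```

Here the missing reading and the 250.0 outlier (a 150 % departure) are both flagged and replaced by the simulated values, while readings within tolerance pass through unchanged.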

  1. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    PubMed

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm2, which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.
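A minimal sketch of combining spectra with principal component analysis, in the spirit of the approach described. The synthetic spectra, peak positions, and noise level below are invented for illustration; the paper's actual SERS data and preprocessing differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for baseline-corrected SERS spectra: two compound
# classes with Gaussian peaks at different Raman shifts, plus noise.
shifts = np.linspace(400.0, 1800.0, 200)

def peak(center):
    return np.exp(-0.5 * ((shifts - center) / 15.0) ** 2)

class_a = peak(1000.0) + peak(1350.0)
class_b = peak(800.0) + peak(1600.0)
spectra = np.vstack(
    [class_a + 0.05 * rng.standard_normal(200) for _ in range(10)]
    + [class_b + 0.05 * rng.standard_normal(200) for _ in range(10)]
)

# PCA via SVD of the mean-centered spectra; the leading principal
# component captures the between-class peak differences.
centered = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T  # projection onto the first two components
```

Plotting the first two score columns would show the two compound classes falling on opposite sides of the PC1 axis, which is how PCA supports identification of the recovered species.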

  2. Chemical Analysis Results for Potable Water from ISS Expeditions 21 to 25

    NASA Technical Reports Server (NTRS)

    Straub, John E., II; Plumlee, Debrah K.; Schultz, John R.; McCoy, J. Torin

    2010-01-01

    The Johnson Space Center Water and Food Analytical Laboratory (WAFAL) performed detailed ground-based analyses of archival water samples for verification of the chemical quality of the International Space Station (ISS) potable water supplies for Expeditions 21 to 25. Over a 14-month period, the Space Shuttle visited the ISS on five occasions to complete construction and deliver supplies. The onboard supplies of potable water available for consumption by the Expeditions 21 to 25 crews consisted of Russian ground-supplied potable water, Russian potable water regenerated from humidity condensate, and US potable water recovered from urine distillate and condensate. Chemical archival water samples that were collected with U.S. hardware during Expeditions 21 to 25 were returned on Shuttle flights STS-129 (ULF3), STS-130 (20A), STS-131 (19A), STS-132 (ULF4) and STS-133 (ULF5), as well as on Soyuz flights 19-22. This paper reports the analytical results for the returned archival water samples and evaluates their compliance with ISS water quality standards. The WAFAL also received and analyzed aliquots of some Russian potable water samples collected in-flight and pre-flight samples of Rodnik potable water delivered to the Station on the Russian Progress vehicle during Expeditions 21 to 25. These additional analytical results are also reported and discussed in this paper.

  3. Lessons Learned from Optical Payload for Lasercomm Science (OPALS) Mission Operations

    NASA Technical Reports Server (NTRS)

    Sindiy, Oleg V.; Abrahamson, Matthew J.; Biswas, Abhijit; Wright, Malcolm W.; Padams, Jordan H.; Konyha, Alexander L.

    2015-01-01

    This paper provides an overview of Optical Payload for Lasercomm Science (OPALS) activities and lessons learned during mission operations. Activities described cover the periods of commissioning, prime, and extended mission operations, during which primary and secondary mission objectives were achieved for demonstrating space-to-ground optical communications. Lessons learned cover Mission Operations System topics in areas of: architecture verification and validation, staffing, mission support area, workstations, workstation tools, interfaces with support services, supporting ground stations, team training, procedures, flight software upgrades, post-processing tools, and public outreach.

  4. Feasibility and systems definition study for Microwave Multi-Application Payload (MMAP)

    NASA Technical Reports Server (NTRS)

    Horton, J. B.; Allen, C. C.; Massaro, M. J.; Zemany, J. L.; Murrell, J. W.; Stanhouse, R. W.; Condon, G. P.; Stone, R. F.; Swana, J.; Afifi, M.

    1977-01-01

Work completed on three Shuttle/Spacelab experiments is examined: the Adaptive Multibeam Phased Array Antenna (AMPA) Experiment, the Electromagnetic Environment Experiment (EEE) and the Millimeter Wave Communications Experiment (MWCE). Results included the definition of operating modes, sequences of operation, radii of operation about several ground stations, signal formats, footprints of typical orbits and preliminary definitions of ground and user terminals. Conceptual hardware designs, Spacelab interfaces, data handling methods, and experiment testing and verification studies were included. The MWCE-MOD I was defined conceptually for a steerable high-gain antenna.

  5. The use of computer graphic simulation in the development of on-orbit tele-robotic systems

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Hinman, Elaine

    1987-01-01

    This paper describes the use of computer graphic simulation techniques to resolve critical design and operational issues for robotic systems used for on-orbit operations. These issues are robot motion control, robot path-planning/verification, and robot dynamics. The major design issues in developing effective telerobotic systems are discussed, and the use of ROBOSIM, a NASA-developed computer graphic simulation tool, to address these issues is presented. Simulation plans for the Space Station and the Orbital Maneuvering Vehicle are presented and discussed.

  6. KSC-02pd1946

    NASA Image and Video Library

    2002-12-17

KENNEDY SPACE CENTER, FLA. - An Orbital Sciences L-1011 aircraft arrives at the Cape Canaveral Air Force Station Skid Strip. Attached underneath the aircraft is the Pegasus XL Expendable Launch Vehicle, which will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  7. KSC-02pd1951

    NASA Image and Video Library

    2002-12-17

KENNEDY SPACE CENTER, FLA. -- Workers at the Cape Canaveral Air Force Station Skid Strip stand next to the Pegasus XL Expendable Launch Vehicle underneath the Orbital Sciences L-1011 aircraft. The Pegasus will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  8. Near-real-time Estimation and Forecast of Total Precipitable Water in Europe

    NASA Astrophysics Data System (ADS)

    Bartholy, J.; Kern, A.; Barcza, Z.; Pongracz, R.; Ihasz, I.; Kovacs, R.; Ferencz, C.

    2013-12-01

Information about the amount and spatial distribution of atmospheric water vapor (or total precipitable water) is essential for understanding weather and the environment, including the greenhouse effect, the climate system with its feedbacks, and the hydrological cycle. Numerical weather prediction (NWP) models need accurate estimates of water vapor content to provide realistic forecasts, including the representation of clouds and precipitation. In the present study we introduce our research activity for the estimation and forecast of atmospheric water vapor in Central Europe using both observations and models. The Eötvös Loránd University (Hungary) has operated a polar-orbiting satellite receiving station in Budapest since 2002. This station receives Earth observation data from polar orbiting satellites, including the MODerate resolution Imaging Spectroradiometer (MODIS) Direct Broadcast (DB) data stream from the satellites Terra and Aqua. The received DB MODIS data are automatically processed using freely distributed software packages. Using the IMAPP Level 2 software, total precipitable water is calculated operationally using two different methods. The quality of the TPW estimates is a crucial question for further application of the results, thus validation of the remotely sensed total precipitable water fields is presented using radiosonde data. In a current research project in Hungary we aim to compare different estimates of atmospheric water vapor content. Within the frame of the project we use an NWP model (DBCRAS; Direct Broadcast CIMSS Regional Assimilation System numerical weather prediction software developed by the University of Wisconsin, Madison) to forecast TPW. DBCRAS uses near-real-time Level 2 products from the MODIS data processing chain. 
From the wide range of derived Level 2 products, the MODIS TPW parameter found within the so-called mod07 results (Atmospheric Profiles Product) and the cloud top pressure and cloud effective emissivity parameters from the so-called mod06 results (Cloud Product) are assimilated twice a day (at 00 and 12 UTC) by DBCRAS. DBCRAS creates 72-hour weather forecasts with 48 km horizontal resolution. DBCRAS has been operational at the University since 2009, which means that by now sufficient data are available for the verification of the model. In the present study, verification results for the DBCRAS total precipitable water forecasts are presented based on analysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF). Numerical indices are calculated to quantify the performance of DBCRAS. During a limited time period DBCRAS was also run without assimilating MODIS products, which makes it possible to quantify the effect of assimilating MODIS physical products on the quality of the forecasts. For this limited time period, verification indices are compared to decide whether MODIS data improve forecast quality or not.

  9. Evaluation of verification and testing tools for FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Smith, K. A.

    1980-01-01

    Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, are described. Both systems were found to be effective and complementary, and are recommended for use in testing FORTRAN programs.

  10. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this on the operation of the scheme and on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. 
Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

11. Analysis of Signals from a Unique Ground-Truth Infrasound Source Observed at IMS Station IS26 in Southern Germany

    NASA Astrophysics Data System (ADS)

    Koch, Karl

    2010-05-01

Quantitative modeling of infrasound signals, and the development and verification of the corresponding atmospheric propagation models, require the use of well-calibrated sources. Numerous sources have been detected by the currently installed network of about 40 of the final 60 IMS infrasound stations. Besides non-nuclear explosions such as mining and quarry blasts and atmospheric phenomena like auroras, these sources include meteorites, volcanic eruptions and supersonic aircraft including re-entering spacecraft and rocket launches. All these sources of infrasound have one feature in common, in that their source parameters are not precisely known and the quantitative interpretation of the corresponding signals is therefore somewhat ambiguous. A well-calibrated source has been identified that produces repeated infrasound signals at the IMS infrasound station IS26 in the Bavarian forest. The source results from propulsion tests of the ARIANE-5 rocket's main engine at a testing facility near Heilbronn, southern Germany. The test facility is at a range of 320 km and a backazimuth of ~280° from IS26. Ground-truth information was obtained for nearly 100 tests conducted in a 5-year period. Review of the available data for IS26 revealed that at least 28 of these tests show signals above the background noise level. These signals are verified based on the consistency of various signal parameters, e.g., arrival times, durations, and estimates of propagation characteristics (backazimuth, apparent velocity). Signal levels observed are a factor of 2-8 above the noise and reach values of up to 250 mPa for peak amplitudes, and a factor of 2-3 less for RMS measurements. Furthermore, only tests conducted during the months from October to April produce observable signals, indicating a significant change in infrasound propagation conditions between summer and winter months.

  12. What would dense atmospheric observation networks bring to the quantification of city CO2 emissions?

    NASA Astrophysics Data System (ADS)

    Wu, Lin; Broquet, Grégoire; Ciais, Philippe; Bellassen, Valentin; Vogel, Felix; Chevallier, Frédéric; Xueref-Remy, Irène; Wang, Yilong

    2016-06-01

Cities currently cover only a very small portion (<3 %) of the world's land surface, yet they directly release to the atmosphere about 44 % of global energy-related CO2 and are associated with 71-76 % of CO2 emissions from global final energy use. Although many cities have set voluntary climate plans, their CO2 emissions are not evaluated by the monitoring, reporting, and verification (MRV) procedures that play a key role for market- or policy-based mitigation actions. Here we analyze the potential of a monitoring tool that could support the development of such procedures at the city scale. It is based on an atmospheric inversion method that exploits inventory data and continuous atmospheric CO2 concentration measurements from a network of stations within and around cities to estimate city CO2 emissions. This monitoring tool is configured for the quantification of the total and sectoral CO2 emissions in the Paris metropolitan area (~12 million inhabitants and 11.4 TgC emitted in 2010) during the month of January 2011. Its performance is evaluated in terms of uncertainty reduction based on observing system simulation experiments (OSSEs). They are analyzed as a function of the number of sampling sites (measuring at 25 m a.g.l.) and as a function of the network design. The instruments presently used to measure CO2 concentrations at research stations are expensive (typically ~EUR 50k per sensor), which has limited the few current pilot city networks to around 10 sites. Larger theoretical networks are studied here to assess the potential benefit of hypothetical operational lower-cost sensors. The setup of our inversion system is based on a number of diagnostics and assumptions from previous city-scale inversion experiences with real data. We find that, given our assumptions underlying the configuration of the OSSEs, with only 10 stations the inversion significantly reduces the uncertainty in the total city CO2 emission during 1 month, by ~42 %. 
It can be further reduced by extending the network, e.g., from 10 to 70 stations, which is promising for MRV applications in the Paris metropolitan area. With 70 stations, the uncertainties in the inverted emissions are reduced significantly over those obtained using 10 stations: by 32 % for commercial and residential buildings, by 33 % for road transport, by 18 % for the production of energy by power plants, and by 31 % for total emissions. These results indicate that such a high number of stations would likely be required for the monitoring of sectoral emissions in Paris using this observation-model framework. They demonstrate the high potential for atmospheric inversions to contribute to the monitoring and/or verification of city CO2 emissions (baseline) and CO2 emission reductions (commitments), and the advantage that could be brought by the current development of lower-cost medium-precision (LCMP) sensors.
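The relationship between network size and uncertainty reduction reported above can be illustrated with a scalar Bayesian-inversion analogue. The parameter values below are tuned only so that 10 stations give roughly the 42 % reduction quoted; they are illustrative assumptions, not the study's actual error statistics.

```python
import math

def uncertainty_reduction(n_stations, sigma_prior=1.0, h=1.0, sigma_obs=2.25):
    """Fractional reduction of the prior uncertainty on a scalar emission
    budget after assimilating n independent observations, each with
    sensitivity h and error sigma_obs (idealized scalar analogue of an OSSE).
    All default values are illustrative, not taken from the study."""
    # Posterior variance from the standard Bayesian update:
    # 1/var_post = 1/var_prior + n * h^2 / var_obs
    var_post = 1.0 / (1.0 / sigma_prior**2 + n_stations * (h / sigma_obs) ** 2)
    return 1.0 - math.sqrt(var_post) / sigma_prior

for n in (10, 70):
    print(n, round(uncertainty_reduction(n), 2))
```

The scalar model reproduces the qualitative behavior: uncertainty reduction grows with network size but with diminishing returns, since each added station contributes an ever-smaller share of the total information.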

  13. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  14. Development and verification of hardware for life science experiments in the Japanese Experiment Module "Kibo" on the International Space Station.

    PubMed

    Ishioka, Noriaki; Suzuki, Hiromi; Asashima, Makoto; Kamisaka, Seiichiro; Mogami, Yoshihiro; Ochiai, Toshimasa; Aizawa-Yano, Sachiko; Higashibata, Akira; Ando, Noboru; Nagase, Mutsumu; Ogawa, Shigeyuki; Shimazu, Toru; Fukui, Keiji; Fujimoto, Nobuyoshi

    2004-03-01

The Japan Aerospace Exploration Agency (JAXA) has developed a cell biology experiment facility (CBEF) and a clean bench (CB) as common hardware in which life science experiments in the Japanese Experiment Module (JEM, known as "Kibo") of the International Space Station (ISS) can be performed. The CBEF, a CO2 incubator with a turntable that provides variable gravity levels, is the basic hardware required to carry out biological experiments using microorganisms, cells, tissues, small animals, plants, etc. The CB provides a closed aseptic operation area for life science and biotechnology experiments in Kibo. A phase contrast and fluorescence microscope is installed inside the CB. The biological experiment units (BEU) are designed to run individual experiments using the CBEF and the CB. A plant experiment unit (PEU) and two cell experiment units (CEU type 1 and type 2) for the BEU have been developed.

  15. Interface Management for a NASA Flight Project Using Model-Based Systems Engineering (MBSE)

    NASA Technical Reports Server (NTRS)

    Vipavetz, Kevin; Shull, Thomas A.; Infeld, Samatha; Price, Jim

    2016-01-01

    The goal of interface management is to identify, define, control, and verify interfaces; ensure compatibility; provide an efficient system development; be on time and within budget; while meeting stakeholder requirements. This paper will present a successful seven-step approach to interface management used in several NASA flight projects. The seven-step approach using Model Based Systems Engineering will be illustrated by interface examples from the Materials International Space Station Experiment-X (MISSE-X) project. The MISSE-X was being developed as an International Space Station (ISS) external platform for space environmental studies, designed to advance the technology readiness of materials and devices critical for future space exploration. Emphasis will be given to best practices covering key areas such as interface definition, writing good interface requirements, utilizing interface working groups, developing and controlling interface documents, handling interface agreements, the use of shadow documents, the importance of interface requirement ownership, interface verification, and product transition.

  16. Mature data transport and command management services for the Space Station

    NASA Technical Reports Server (NTRS)

    Carper, R. D.

    1986-01-01

    The duplex space/ground/space data services for the Space Station are described. The need to separate the uplink data service functions from the command functions is discussed. Command management is a process shared by an operation control center and a command management system and consists of four functions: (1) uplink data communications, (2) management of the on-board computer, (3) flight resource allocation and management, and (4) real command management. The new data service capabilities provided by microprocessors, ground and flight nodes, and closed loop and open loop capabilities are studied. The need for and functions of a flight resource allocation management service are examined. The system is designed so only users can access the system; the problems encountered with open loop uplink access are analyzed. The procedures for delivery of operational, verification, computer, and surveillance and monitoring data directly to users are reviewed.

  17. Knowledge-based assistance in costing the space station DMS

    NASA Technical Reports Server (NTRS)

    Henson, Troy; Rone, Kyle

    1988-01-01

    The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.

  18. Verification of the NWP models operated at ICM, Poland

    NASA Astrophysics Data System (ADS)

    Melonek, Malgorzata

    2010-05-01

The Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM) started its activity in the field of NWP in May 1997. Since then, numerical weather forecasts covering Central Europe have been routinely published on our publicly available website. The first NWP model used at ICM was the hydrostatic Unified Model developed by the UK Meteorological Office, a mesoscale version with a horizontal resolution of 17 km and 31 vertical levels. At present, two non-hydrostatic NWP models are running in a quasi-operational regime. The main new UM model, with 4 km horizontal resolution, 38 vertical levels and a forecast range of 48 hours, is run four times a day. The second, the COAMPS model (Coupled Ocean/Atmosphere Mesoscale Prediction System) developed by the US Naval Research Laboratory, configured with three nested grids (with corresponding resolutions of 39 km, 13 km and 4.3 km, and 30 vertical levels), is run twice a day (for 00 and 12 UTC). The second grid covers Central Europe and has a forecast range of 84 hours. Results of both NWP models, i.e., COAMPS computed on the 13 km mesh and UM, are verified against observations from the Polish synoptic stations. Verification uses surface observations and nearest-grid-point forecasts. The following meteorological elements are verified: air temperature at 2 m, mean sea level pressure, wind speed and wind direction at 10 m, and 12-hour accumulated precipitation. Different statistical indices are presented. For continuous variables, the Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) are computed in 6-hour intervals. For precipitation, contingency tables for different thresholds are computed and some of the verification scores, such as FBI, ETS, POD and FAR, are graphically presented. The verification sample covers nearly one year.
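The categorical scores named in the abstract (FBI, ETS, POD, FAR) all derive from a standard 2x2 contingency table of forecast versus observed events. A small sketch with hypothetical counts and the standard definitions:

```python
def categorical_scores(hits, false_alarms, misses, correct_negatives):
    """Standard 2x2 contingency-table scores used in precipitation
    verification. Counts: hits (a), false alarms (b), misses (c),
    correct negatives (d)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    a_random = (a + b) * (a + c) / n          # hits expected by chance
    return {
        "FBI": (a + b) / (a + c),             # frequency bias index
        "POD": a / (a + c),                   # probability of detection
        "FAR": b / (a + b),                   # false alarm ratio
        "ETS": (a - a_random) / (a + b + c - a_random),  # equitable threat score
    }

# Hypothetical counts for one precipitation threshold
scores = categorical_scores(hits=30, false_alarms=10, misses=20, correct_negatives=140)
print(scores)  # FBI 0.8, POD 0.6, FAR 0.25, ETS 0.4
```

An FBI below 1 indicates the event is forecast less often than observed, while the ETS discounts hits that would occur by chance, which is why it is preferred for comparing rare-event thresholds.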

  19. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  20. Hardware Design Improvements to the Major Constituent Analyzer

    NASA Technical Reports Server (NTRS)

    Combs, Scott; Schwietert, Daniel; Anaya, Marcial; DeWolf, Shannon; Merrill, Dave; Gardner, Ben D.; Thoresen, Souzan; Granahan, John; Belcher, Paul; Matty, Chris

    2011-01-01

    The Major Constituent Analyzer (MCA) onboard the International Space Station (ISS) is designed to monitor the major constituents of the ISS's internal atmosphere. This mass spectrometer based system is an integral part of the Environmental Control and Life Support System (ECLSS) and is a primary tool for the management of ISS atmosphere composition. As a part of NASA Change Request CR10773A, several alterations to the hardware have been made to accommodate improved MCA logistics. First, the ORU 08 verification gas assembly has been modified to allow the verification gas cylinder to be installed on orbit. The verification gas is an essential MCA consumable that requires periodic replenishment. Designing the cylinder for subassembly transport reduces the size and weight of the maintained item for launch. The redesign of the ORU 08 assembly includes a redesigned housing, cylinder mounting apparatus, and pneumatic connection. The second hardware change is a redesigned wiring harness for the ORU 02 analyzer. The ORU 02 electrical connector interface was damaged in a previous on-orbit installation, and this necessitated the development of a temporary fix while a more permanent solution was developed. The new wiring harness design includes flexible cable as well as indexing fasteners and guide-pins, and provides better accessibility during the on-orbit maintenance operation. This presentation will describe the hardware improvements being implemented for MCA as well as the expected improvement to logistics and maintenance.

  1. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15,000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems than previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.

  2. Trace Contaminant Control for the International Space Station's Node 1- Analysis, Design, and Verification

    NASA Technical Reports Server (NTRS)

    Perry, J. L.

    2017-01-01

    Trace chemical contaminant generation inside crewed spacecraft cabins is a technical and medical problem that must be continuously evaluated. Although passive control through materials selection and active control by adsorption and catalytic oxidation devices are employed during normal operations of a spacecraft, contaminant buildup can still become a problem. Buildup is particularly troublesome during the period between the final closure of a spacecraft during ground processing and the time that a crewmember enters for the first time during the mission. Typically, the elapsed time between preflight closure and first entry on orbit for spacecraft such as Spacelab modules was 30 days. During that time, the active contamination control systems are not activated, and contaminants can potentially build up to levels which exceed the spacecraft maximum allowable concentrations (SMACs) specified by NASA toxicology experts. To prevent excessively high contamination levels at crew entry, the Spacelab active contamination control system was operated for 53 hours just before launch.

  3. Chemical Analysis Results for Potable Water from ISS Expeditions 21 Through 25

    NASA Technical Reports Server (NTRS)

    Straub, John E., II; Plumlee, Debrah K.; Schultz, John R.; McCoy, J. Torin

    2011-01-01

    The Johnson Space Center Water and Food Analytical Laboratory (WAFAL) performed detailed ground-based analyses of archival water samples for verification of the chemical quality of the International Space Station (ISS) potable water supplies for Expeditions 21 through 25. Over a 14-month period the Space Shuttle visited the ISS on four occasions to complete construction and deliver supplies. The onboard supplies of potable water available for consumption by the Expeditions 21 to 25 crews consisted of Russian ground-supplied potable water, Russian potable water regenerated from humidity condensate, and US potable water recovered from urine distillate and condensate. Chemical archival water samples that were collected with U.S. hardware during Expeditions 21 to 25 were returned on Shuttle flights STS-129 (ULF3), STS-130 (20A), STS-131 (19A), and STS-132 (ULF4), as well as on Soyuz flights 19-23. This paper reports the analytical results for these returned potable water archival samples and their compliance with ISS water quality standards.

  4. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation; it allows field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds, which severely limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing, which could be utilized in the final design as verification of the design optimized by the deterministic method.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ANEST IWATA CORPORATION W400-LV SPRAY GUN

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, the pollution prevention capabilities of a high transfer efficiency liquid spray gun was tested. This ...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SHARPE MANUFACTURING TITANIUM T1-CG SPRAY GUN

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, the pollution prevention capabilities of a high transfer efficiency liquid spray gun was tested. This ...

  7. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...

  8. 10 CFR 300.11 - Independent verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...

  9. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G. P.

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  10. Real-Time Kennedy Space Center and Cape Canaveral Air Force Station High-Resolution Model Implementation and Verification

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn; Watson, Leela R.

    2015-01-01

    NASA's Launch Services Program, Ground Systems Development and Operations, Space Launch System and other programs at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) use the daily and weekly weather forecasts issued by the 45th Weather Squadron (45 WS) as decision tools for their day-to-day and launch operations on the Eastern Range (ER). Examples include determining if they need to limit activities such as vehicle transport to the launch pad, protect people, structures or exposed launch vehicles given a threat of severe weather, or reschedule other critical operations. The 45 WS uses numerical weather prediction models as a guide for these weather forecasts, particularly the Air Force Weather Agency (AFWA) 1.67 km Weather Research and Forecasting (WRF) model. Considering the 45 WS forecasters' and Launch Weather Officers' (LWO) extensive use of the AFWA model, the 45 WS proposed a task at the September 2013 Applied Meteorology Unit (AMU) Tasking Meeting requesting the AMU verify this model. Due to the lack of archived model data available from AFWA, verification is not yet possible. Instead, the AMU proposed to implement and verify the performance of an ER version of the high-resolution WRF Environmental Modeling System (EMS) model configured by the AMU (Watson 2013) in real time. Implementing a real-time version of the ER WRF-EMS would generate a larger database of model output than in the previous AMU task for determining model performance, and allows the AMU more control over and access to the model output archive. The tasking group agreed to this proposal; therefore the AMU implemented the WRF-EMS model on the second of two NASA AMU modeling clusters. The AMU also calculated verification statistics to determine model performance compared to observational data. 
Finally, the AMU made the model output available on the AMU Advanced Weather Interactive Processing System II (AWIPS II) servers, which allows the 45 WS and AMU staff to customize the model output display on the AMU and Range Weather Operations (RWO) AWIPS II client computers and conduct real-time subjective analyses.
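    Model verification statistics of the kind described above typically start from simple paired error measures between forecasts and station observations. A minimal sketch in Python (hypothetical temperature values; not the AMU's actual statistics suite):

```python
import math

def verification_stats(forecasts, observations):
    """Compute simple point-verification statistics (bias and RMSE)
    for paired model forecasts and station observations."""
    if len(forecasts) != len(observations):
        raise ValueError("forecast/observation lists must be paired")
    errors = [f - o for f, o in zip(forecasts, observations)]
    bias = sum(errors) / len(errors)                 # mean error
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Hypothetical 2 m temperature forecasts vs. tower observations (deg C)
fcst = [28.1, 29.4, 30.2, 31.0, 30.5]
obs  = [27.8, 29.9, 30.0, 31.4, 30.1]
bias, rmse = verification_stats(fcst, obs)
```

In an operational setting these statistics would be stratified by forecast lead time, season, and variable before being interpreted.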

  11. Recent literature on structural modeling, identification, and analysis

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.

    1990-01-01

    The literature on the mathematical modeling of large space structures is first reviewed, with attention given to continuum models, model order reduction, substructuring, and computational techniques. System identification and mode verification are then discussed with reference to the verification of mathematical models of large space structures. In connection with analysis, the paper surveys recent research on eigensolvers and dynamic response solvers for large-order finite-element-based models.

  12. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is assured through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models, which are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), a computer-based specification language and theorem-proving assistant.
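    The exhaustive state exploration contrasted with testing above can be illustrated on a toy discrete model: enumerate every reachable state and check a safety invariant plus absence of deadlock. A sketch in Python (a hypothetical two-aircraft ring model, not the SATS ConOps itself, and explicit-state search rather than PVS theorem proving):

```python
from collections import deque

# Toy model: two aircraft on a ring of N slots; each may hold or advance
# one slot, but the (hypothetical) protocol forbids entering an occupied slot.
N = 6

def successors(state):
    a, b = state
    succs = []
    for da in (0, 1):
        for db in (0, 1):
            na, nb = (a + da) % N, (b + db) % N
            if na != nb:            # protocol rule: no shared slot
                succs.append((na, nb))
    return succs

def explore(initial):
    """Breadth-first exhaustive exploration: check the safety invariant
    (never co-located) and absence of deadlock in every reachable state."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        assert s[0] != s[1], "safety violated"
        succs = successors(s)
        assert succs, "deadlock"
        for t in succs:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

reachable = explore((0, 3))   # all 30 non-colliding configurations
```

A theorem prover such as PVS establishes the same kind of property symbolically, for unbounded or continuous state spaces where enumeration is impossible.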

  13. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low-wavenumber errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
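    Examining error spectra as described above amounts to decomposing a forecast-error field into variance per wavenumber. A one-dimensional sketch in Python using NumPy (synthetic data; the study's actual global, multi-level diagnostics are not reproduced):

```python
import numpy as np

def error_spectrum(forecast, truth):
    """Decompose the forecast error of a 1-D periodic field into
    variance per wavenumber via a discrete Fourier transform."""
    err = np.asarray(forecast) - np.asarray(truth)
    n = err.size
    coeffs = np.fft.rfft(err - err.mean())
    # Normalize so the per-wavenumber values sum to the total variance
    spec = (np.abs(coeffs) ** 2) / n ** 2
    spec[1:] *= 2.0                      # fold in negative wavenumbers
    if n % 2 == 0:
        spec[-1] /= 2.0                  # Nyquist term is not doubled
    return spec

# Hypothetical error field: a pure wavenumber-3 error on a 64-point circle
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
spec = error_spectrum(np.sin(3 * x), np.zeros(64))
```

For a sin wave of unit amplitude the variance is 0.5, and the spectrum above places all of it at wavenumber 3, consistent with Parseval's theorem.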

  14. Consistent Structural Integrity and Efficient Certification with Analysis. Volume 3: Appendices of Verification and Validation Examples, Correlation Factors, and Failure Criteria

    DTIC Science & Technology

    2005-05-01

    [Appendix table-of-contents excerpt: verification of the BondJo bonded-joint analysis code against homogeneous isotropic and orthotropic examples from the Delale and Erdogan publication (six examples), including adhesive stress comparisons between BondJo, an ANSYS solid-model FEA, and Delale and Erdogan plate theory.]

  15. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  16. Mechanical verification of a schematic Byzantine clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Shankar, Natarajan

    1991-01-01

    Schneider generalizes a number of protocols for Byzantine fault tolerant clock synchronization and presents a uniform proof for their correctness. The authors present a machine checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.
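    Schneider's schematic protocol is parameterized by an averaging function; the egocentric mean used in Lamport and Melliar-Smith's Interactive Convergence Algorithm averages a process's observed clock differences after zeroing any reading that exceeds a bound (i.e., distrusting that clock). A minimal sketch in Python (illustrative values only; the verified EHDM development is of course far more rigorous):

```python
def egocentric_mean(deltas, big_delta):
    """Fault-tolerant 'egocentric mean': average the observed clock
    differences, treating any difference whose magnitude exceeds the
    bound big_delta as 0, so a faulty clock cannot skew the correction."""
    clipped = [d if abs(d) <= big_delta else 0.0 for d in deltas]
    return sum(clipped) / len(clipped)

# Hypothetical readings: differences to four peers, one of them faulty
readings = [0.2, -0.1, 0.3, 9.7]   # 9.7 exceeds the bound and is zeroed
adj = egocentric_mean(readings, big_delta=1.0)
```

The correctness argument verified mechanically is that this function satisfies Schneider's conditions (e.g., translation invariance and bounded deviation), not merely that it computes an average.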

  17. Developing a NASA strategy for the verification of large space telescope observatories

    NASA Astrophysics Data System (ADS)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  18. Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network

    NASA Astrophysics Data System (ADS)

    Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey

    2010-05-01

    The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications, it is typically composed of 4 to 9 sensors, with 1 to 3 km aperture geometry. At the end of 2000 only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors at an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. The categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system. 
Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications like monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with seismic and hydroacoustic technologies. The arrival phases detected on the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies has only recently (in early 2010) become part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed better understanding of infrasound signals and identification of a growing data set of ground-truth sources. These infragenic sources originate from natural or man-made sources. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain associated waves. Only a small fraction of events meet the event definition criteria considering the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes and explosive volcanic eruptions.
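    The PMCC detection idea described above rests on measuring how well signals correlate across the sensors of an array. A toy illustration in Python (synthetic signals; the real PMCC algorithm works progressively over sliding time windows, frequency bands, and sensor sub-groups, which this sketch omits):

```python
import numpy as np

def array_consistency(signals):
    """Mean pairwise zero-lag correlation across array sensors: a crude
    stand-in for the PMCC-style consistency test, high for a coherent
    incoming wavefront and near zero for uncorrelated noise."""
    n = len(signals)
    corrs = []
    for i in range(n):
        for j in range(i + 1, n):
            a = signals[i] - signals[i].mean()
            b = signals[j] - signals[j].mean()
            corrs.append(float(a @ b / np.sqrt((a @ a) * (b @ b))))
    return sum(corrs) / len(corrs)

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
wave = np.sin(2 * np.pi * 0.8 * t)        # common low-frequency signal
sensors = [wave + 0.1 * rng.standard_normal(t.size) for _ in range(4)]
score = array_consistency(sensors)         # high for the coherent signal
noise_score = array_consistency(
    [rng.standard_normal(t.size) for _ in range(4)])
```

A detector would also require the implied time delays across sensors to be mutually consistent before declaring a detection, which is where back-azimuth and trace velocity estimates come from.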

  19. Efficient Certificate Verification for Vehicle-to-Grid Communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akkaya, Kemal; Saputro, Nico; Tonyali, Samet

    While public charging stations are typically used for Electric Vehicle (EV) charging, home microgrids that may act as private charging stations are also expected to be used for meeting the increased EV charging demands in the future. Such home microgrids can be accessible through their smart meters, which makes advanced metering infrastructure (AMI) a viable alternative for vehicle-to-grid (V2G) communications. However, to ensure secure V2G communications using public keys, smart meters will need to maintain certificate revocation lists (CRLs) not just for the AMI network but also for the large number of EVs that may interact with them. For resource-constrained smart meters, this will increase the storage requirements and introduce additional overhead in terms of delay and CRL maintenance. To eliminate this burden, we propose keeping merely the non-revoked certificates that belong to EVs usually driven within the vicinity of that particular microgrid. The motivation comes from the fact that it is inefficient to distribute and store a large CRL that has revocation information about all EVs in the whole system, as most of these EVs will never come to the geographic vicinity of that home microgrid. The approach ensures that any status changes of these certificates are communicated to the smart meters. We implemented the proposed approach in a realistic V2G communication scenario by using an IEEE 802.11s mesh as the underlying AMI infrastructure in the ns-3 simulator. The results confirmed that the proposed approach significantly reduces the certificate verification time and the storage requirements on smart meters.
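    The localized-CRL idea can be caricatured as a small per-meter cache of certificate statuses for nearby EVs. A minimal sketch with hypothetical names and no actual cryptography (the real scheme involves certificate validation and AMI status-change messaging, neither of which is modeled here):

```python
class LocalCertCache:
    """Sketch of a smart meter keeping revocation status only for EV
    certificates seen in its vicinity, instead of a full system-wide CRL.
    Names and fields are illustrative, not the authors' implementation."""

    def __init__(self):
        self.status = {}            # cert_id -> True if non-revoked

    def observe(self, cert_id):
        # First encounter: record as valid until a revocation arrives
        self.status.setdefault(cert_id, True)

    def revoke(self, cert_id):
        # Status-change message pushed from the utility / CA
        if cert_id in self.status:
            self.status[cert_id] = False

    def is_acceptable(self, cert_id):
        # Unknown certs would need a full online check (not modeled)
        return self.status.get(cert_id, False)

cache = LocalCertCache()
cache.observe("EV-42")
cache.revoke("EV-42")
cache.observe("EV-7")
```

The storage saving comes from `status` holding only locally observed EVs rather than every revoked certificate in the system.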

  20. An overview of 2016 WISE Urban Summer Observation Campaign (WUSOC 2016) in the Seoul metropolitan area of South Korea

    NASA Astrophysics Data System (ADS)

    Jung, Jae-Won; Kim, Sang-Woo; Shim, Jae-Kwan; Kwak, Kyung-Hwan

    2017-04-01

    The Weather Information Service Engine (WISE), a project launched by the Korea Meteorological Administration (KMA), aims to operate an urban meteorological observation network from 2012 to 2019 and to test and operate application weather services (e.g., flash flood, road weather, city ecology, city microclimate, dispersion of hazardous substances) in 2019 through the development of the Advanced Storm-scale Analysis Prediction System (ASAPS) for the production of a storm-scale hazardous weather monitoring and prediction system. The WISE institute has completed construction of 31 urban meteorological observation sites in the Seoul metropolitan area and has built a real-time test operation and verification system by improving the ASAPS, which produces 1 km, 6-hour forecast information based on the 5 km forecast information of KMA. Field measurements of the 2016 WISE Urban Summer Observation Campaign (WUSOC 2016) were conducted in the Seoul metropolitan area of South Korea from August 22 to October 14, 2016. Involving over 70 researchers from more than 12 environmental and atmospheric science research groups in South Korea, WUSOC 2016 focused on special observations around the Seoul metropolitan area: severe rain storm observations using a mobile observation car and radiosondes, wind profile observations using wind Doppler lidar and radiosondes, etc. WUSOC 2016 aimed at data quality control, accuracy verification, usability checks, and quality improvement of ASAPS at the observation stations constructed in WISE. In addition, we intend to contribute to the activation of urban fusion weather research and risk weather research through joint observation and data sharing.

  1. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
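    The shape-averaging approach above builds on Procrustes alignment, which registers one point set onto another by translation and rotation. A minimal non-Bayesian sketch using NumPy (hypothetical 'precipitation object' coordinates; the paper's full Bayesian treatment, scaling, and credible sets are omitted):

```python
import numpy as np

def procrustes_align(A, B):
    """Ordinary Procrustes alignment: rigidly register point set B onto
    A by removing centroids and solving for the optimal rotation via the
    SVD of the cross-covariance matrix."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    A_c = A - A.mean(axis=0)
    B_c = B - B.mean(axis=0)
    U, _, Vt = np.linalg.svd(B_c.T @ A_c)
    R = U @ Vt
    if np.linalg.det(R) < 0:         # exclude reflections
        U[:, -1] *= -1
        R = U @ Vt
    return B_c @ R + A.mean(axis=0)

# Hypothetical object outline and a rotated, shifted copy of it
A = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [0.0, 1.0]])
theta = np.deg2rad(30)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = A @ Q.T + np.array([5.0, -3.0])
aligned = procrustes_align(A, B)     # recovers A up to numerical error
```

Averaging several ensemble members after such alignment is what lets the mean forecast retain the morphology of the precipitation structures instead of smearing them out.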

  2. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
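    The quantitative, statistics-based validation advocated above can be sketched as a simple acceptance test on model error with a confidence interval. A Python illustration (hypothetical load values and a normal approximation; not the paper's exact statistical procedure):

```python
from statistics import NormalDist, mean, stdev

def validate_model(predicted, measured, tolerance):
    """Quantitative (rather than visual) model validation: test whether
    the mean prediction error is within a tolerance, and report a 95%
    confidence interval under a normal approximation."""
    errors = [p - m for p, m in zip(predicted, measured)]
    n = len(errors)
    mu, se = mean(errors), stdev(errors) / n ** 0.5
    z = NormalDist().inv_cdf(0.975)          # ~1.96
    ci = (mu - z * se, mu + z * se)
    passed = abs(mu) + z * se <= tolerance   # conservative acceptance
    return mu, ci, passed

# Hypothetical load (MW) from a load model vs. field measurements
pred = [10.2, 11.0, 9.8, 10.5, 10.1, 10.4]
meas = [10.0, 10.8, 10.0, 10.6, 10.2, 10.3]
mu, ci, passed = validate_model(pred, meas, tolerance=0.5)
```

The same error statistics, broken out by operating condition, are what a calibration step would minimize when tuning load model parameters.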

  3. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  4. Aqueous cleaning and verification processes for precision cleaning of small parts

    NASA Technical Reports Server (NTRS)

    Allen, Gale J.; Fishell, Kenneth A.

    1995-01-01

    The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.

  5. A Spatio-Temporal Analysis of the Relationship Between Near-Surface Air Temperature and Satellite Land Surface Temperatures Using 17 Years of Data from the ATSR Series

    NASA Astrophysics Data System (ADS)

    Ghent, D.; Good, E.; Bulgin, C.; Remedios, J. J.

    2017-12-01

    Surface temperatures (ST) over land have traditionally been measured at weather stations. There are many parts of the globe with very few stations, e.g. across much of Africa, leading to gaps in ST datasets, affecting our understanding of how ST is changing, and the impacts of extreme events. Satellites can provide global ST data, but these observations represent how hot the land ST (LST; including the uppermost parts of e.g. trees, buildings) is to touch, whereas stations measure the air temperature just above the surface (T2m). Satellite LST data may only be available in cloud-free conditions and data records are frequently <10-15 years in length. Consequently, satellite LST data have not yet featured widely in climate studies. In this study, the relationship between clear-sky satellite LST and all-sky T2m is characterised in space and time using >17 years of data. The analysis uses a new monthly LST climate data record (CDR) based on the Along-Track Scanning Radiometer (ATSR) series, which has been produced within the European Space Agency GlobTemperature project. The results demonstrate the dependency of the global LST-T2m differences on location, land cover, vegetation and elevation. LSTnight (~10 pm local solar time) is found to be closely coupled with minimum T2m (Tmin), with the two temperatures generally consistent to within ±5 °C (global median LSTnight - Tmin = 1.8 °C, interquartile range = 3.8 °C). The LSTday (~10 am local time) versus maximum T2m (Tmax) variability is higher because LST is strongly influenced by insolation and surface regime (global median LSTday - Tmax = -0.1 °C, interquartile range = 8.1 °C). Correlations for both temperature pairs are typically >0.9 outside of the tropics. A crucial aspect of this study is a comparison between the monthly global anomaly time series of LST and CRUTEM4 T2m. The time series agree remarkably well, with a correlation of 0.9 and 90% of the CDR anomalies falling within the T2m 95% confidence limits (see figure).
This analysis provides independent verification of the 1995-2012 T2m anomaly time series, suggesting that LST can provide a complementary perspective on global ST change. The results presented give justification for increasing use of satellite LST data in climate and weather science, both as an independent variable, and to augment T2m data acquired at weather stations.
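
    As a rough illustration, summary statistics of the kind quoted above (median difference, interquartile range, correlation) can be computed as in the sketch below; the arrays are synthetic stand-ins, not the ATSR CDR or CRUTEM4 data:

```python
# Sketch: summarizing paired night-time LST and station Tmin values.
# The data are synthetic placeholders generated to resemble the study's
# reported statistics; they are not the real CDR or station records.
import numpy as np

rng = np.random.default_rng(0)
t_min = rng.normal(10.0, 8.0, 200)             # station minimum T2m (deg C)
lst_night = t_min + rng.normal(1.8, 2.0, 200)  # clear-sky night LST (deg C)

diff = lst_night - t_min
median_diff = np.median(diff)
q25, q75 = np.percentile(diff, [25, 75])
iqr = q75 - q25
r = np.corrcoef(lst_night, t_min)[0, 1]
print(f"median = {median_diff:.1f} C, IQR = {iqr:.1f} C, r = {r:.2f}")
```

    The same recipe applies to the day-time pair (LSTday vs. Tmax), where the spread is expected to be wider.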

  6. Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure

    DTIC Science & Technology

    2016-05-09

    Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure Amanda S. Appel,† John H. McDonough,‡ Joseph D...feasible. In this study, hair was evaluated as a long-term repository of nerve agent hydrolysis products. Pinacolyl methylphosphonic acid (PMPA...hydrolysis product of soman) and isopropyl methylphosphonic acid (IMPA; hydrolysis product of sarin) were extracted from hair samples with N,N

  7. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.
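
    The egocentric-mean step at the heart of the interactive convergence algorithm can be sketched as below; the error bound and clock readings are illustrative, and the sketch omits the rounds, drift, and read-error modeling handled in the formal treatment:

```python
# Interactive convergence, single resynchronization step (sketch): each clock
# averages all readings, but replaces any reading that differs from its own
# by more than DELTA with its own value, which bounds a faulty clock's pull.
DELTA = 0.5  # illustrative error bound, not a constraint from the paper

def corrected_clock(own, readings, delta=DELTA):
    adjusted = [r if abs(r - own) <= delta else own for r in readings]
    return sum(adjusted) / len(adjusted)

clocks = [10.00, 10.10, 10.05, 12.00]  # the last clock is faulty
new = [corrected_clock(c, clocks) for c in clocks]
```

    After one step the spread among the non-faulty clocks shrinks, which is the convergence property the verification establishes under explicit assumptions on the number of faults.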

  8. An Ecosystem Model for the Simulation of Physical and Biological Oceanic Processes-IDAPAK User's Guide and Applications

    NASA Technical Reports Server (NTRS)

    McClain, Charles R.; Arrigo, Kevin; Murtugudde, Ragu; Signorini, Sergio R.; Tai, King-Sheng

    1998-01-01

    This TM describes the development, testing, and application of a 4-component (phytoplankton, zooplankton, nitrate, and ammonium) ecosystem model capable of simulating oceanic biological processes. It also reports and documents an in-house software package (Interactive Data Analysis Package - IDAPAK) for interactive data analysis of geophysical fields, including those related to the forcing, verification, and analysis of the ecosystem model. Two regions were studied in the Pacific: the Warm Pool (WP) in the Equatorial Pacific (165 deg. E at the equator) and at Ocean Weather Station P (OWS P) in the Northeast Pacific (50 deg. N, 145 deg. W). The WP results clearly indicate that the upwelling at 100 meters correlates well with surface blooms. The upwelling events in late 1987 and 1990 produced dramatic increases in the surface layer values of all 4 ecosystem components, whereas the spring-summer deep mixing events do not appear to elicit a significant response in any of the ecosystem quantities. The OWS P results show that the simulated monthly profiles of temperature, annual cycles of solar irradiance, and 0- to 50-m integrated nitrate accurately reproduce observed values. Annual primary production is 190 gC/m(exp 2)/yr, which is consistent with recent observations but is much greater than earlier estimates.
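
    A minimal sketch of a 4-component ecosystem model of this kind is given below; the functional forms and rate constants are invented placeholders rather than the TM's equations, chosen so that total nitrogen is conserved:

```python
# Sketch of a 4-component (phytoplankton P, zooplankton Z, nitrate N,
# ammonium A) ecosystem model stepped with forward Euler. Functional forms
# and rate constants are invented placeholders, chosen so that total
# nitrogen P + Z + N + A is conserved; they are not the TM's equations.
def step(P, Z, N, A, dt=0.1):
    growth = 0.6 * P * (N + A) / (0.5 + N + A)    # nutrient-limited uptake
    grazing = 0.4 * Z * P / (1.0 + P)             # zooplankton grazing on P
    p_mort, z_mort = 0.05 * P, 0.05 * Z           # linear mortality
    remin = 0.1 * A                               # nitrification: A -> N
    frac_A = A / (N + A) if (N + A) > 0 else 0.0  # share of uptake from A
    dP = growth - grazing - p_mort
    dZ = 0.3 * grazing - z_mort                   # 30 % grazing efficiency
    dN = -growth * (1.0 - frac_A) + remin
    dA = -growth * frac_A + p_mort + z_mort + 0.7 * grazing - remin
    return P + dt * dP, Z + dt * dZ, N + dt * dN, A + dt * dA

state = (0.2, 0.1, 5.0, 0.5)  # initial P, Z, N, A (arbitrary units)
for _ in range(100):
    state = step(*state)
```

    A useful sanity check on any such formulation is that the four tendencies sum to zero, so total nitrogen stays constant under the integration.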

  9. IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.

    PubMed

    Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis

    2018-04-01

    Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
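
    The per-dispense gravimetric check such a system enables might look like the following sketch; the tolerance and dispense masses are illustrative assumptions, not the authors' settings:

```python
# Sketch of the gravimetric check an IoT-connected balance enables: compare
# the measured weight gain against the expected dispense mass and raise an
# alert when the error exceeds a tolerance. The 5 % tolerance and the
# example masses are illustrative assumptions, not the authors' values.
def check_dispense(expected_mg, measured_mg, tol_pct=5.0):
    error_pct = 100.0 * (measured_mg - expected_mg) / expected_mg
    return {"error_pct": round(error_pct, 2), "alert": abs(error_pct) > tol_pct}

result = check_dispense(expected_mg=25.0, measured_mg=23.4)  # under-dispense
```

    In a deployment, the alert flag would drive the real-time notifications described above, and each record would be logged for historical analysis.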

  10. Surface Landing Site Weather Analysis for NASA's Constellation Program

    NASA Technical Reports Server (NTRS)

    Altino, Karen M.; Burns, K. L.

    2008-01-01

    Weather information is an important asset for NASA's Constellation Program in developing the next generation space transportation system to fly to the International Space Station, the Moon and, eventually, to Mars. Weather conditions can affect vehicle safety and performance during multiple mission phases ranging from pre-launch ground processing of the Ares vehicles to landing and recovery operations, including all potential abort scenarios. Meteorological analysis is an important contributor, not only to the development and verification of system design requirements but also to mission planning and active ground operations. Of particular interest are the surface weather conditions at both nominal and abort landing sites for the manned Orion capsule. Weather parameters such as wind, rain, and fog all play critical roles in the safe landing of the vehicle and subsequent crew and vehicle recovery. The Marshall Space Flight Center (MSFC) Natural Environments Branch has been tasked by the Constellation Program with defining the natural environments at potential landing zones. This paper will describe the methodology used for data collection and quality control, detail the types of analyses performed, and provide a sample of the results that can be obtained.

  11. Evaluation of MPLM Design and Mission 6A Coupled Loads Analyses

    NASA Technical Reports Server (NTRS)

    Bookout, Paul S.; Ricks, Ed

    1999-01-01

    During the development of a space shuttle payload, several coupled loads analyses (CLAs) are usually performed: preliminary design, critical design, final design, and verification loads analysis (VLA). A final design CLA is the last analysis conducted prior to model delivery to the shuttle program for the VLA. The finite element models used in the final design CLA and the VLA are test verified dynamic math models. Mission 6A is the first of many flights of the Multi-Purpose Logistics Module (MPLM). The MPLM was developed by Alenia Spazio S.p.A. (an Italian aerospace company) and houses the International Standard Payload Racks (ISPR) for transportation to the space station in the shuttle. Marshall Space Flight Center (MSFC), the payload integrator of the MPLM for Mission 6A, performed the final design CLA using the M6.OZC shuttle data for liftoff and landing conditions using the proper shuttle cargo manifest. Alenia performed the preliminary and critical design CLAs for the development of the MPLM. However, these CLAs did not use the current Mission 6A cargo manifest. An evaluation of the preliminary and critical design performed by Alenia and the final design performed by MSFC is presented.

  12. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    NASA Astrophysics Data System (ADS)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results for FLOWNET/TRUMP, a combined thermal-hydraulic and heat conduction analysis code. The code has been used in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for analyzing the flow distribution among fuel block coolant channels, determining thermal boundary conditions for fuel block stress analysis, and estimating fuel temperature in the event of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature frontier technologies. The code was verified by comparing its analytical results with experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks.
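
    The flow-distribution problem mentioned above can be illustrated with a simple parallel-channel model, assuming a quadratic pressure-drop law per channel; the resistances and total flow are invented for illustration:

```python
# Sketch of the flow-split problem such a code solves for parallel coolant
# channels: all channels see the same pressure drop dP, each obeying an
# assumed quadratic law dP = k * Q**2, so flow divides as 1/sqrt(k).
# The resistances and total flow below are invented, not HTTR data.
import math

def split_flow(q_total, resistances):
    inv_sqrt = [1.0 / math.sqrt(k) for k in resistances]
    dp = (q_total / sum(inv_sqrt)) ** 2          # common pressure drop
    flows = [math.sqrt(dp / k) for k in resistances]
    return flows, dp

flows, dp = split_flow(10.0, [1.0, 1.0, 4.0])    # third channel is restrictive
```

    A blocked channel corresponds to a very large resistance, starving that channel of flow, which is exactly the accident scenario the code is used to evaluate.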

  13. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--CAPSTONE 30KW MICROTURBINE SYSTEM

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system was evaluated based on the Capstone 30kW Microturbine developed by Cain Ind...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--MARTIN MACHINERY INTERNAL COMBUSTION ENGINE

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system designed by Martin Machinery was evaluated. This paper provides test result...

  16. Finding the Bio in Biobased Products: Electrophoretic Identification of Wheat Proteins in Processed Products

    USDA-ARS?s Scientific Manuscript database

    Verification of the bio-content in bio-based or green products identifies genuine products, exposes counterfeit copies, supports or refutes content claims and ensures consumer confidence. When the bio-content includes protein, elemental nitrogen analysis is insufficient for verification since non-pr...

  17. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  18. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  19. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  20. Application of the precipitation-runoff modeling system to the Ah- shi-sle-pah Wash watershed, San Juan County, New Mexico

    USGS Publications Warehouse

    Hejl, H.R.

    1989-01-01

    The precipitation-runoff modeling system was applied to the 8.21 sq-mi drainage area of the Ah-shi-sle-pah Wash watershed in northwestern New Mexico. The calibration periods were May to September of 1981 and 1982, and the verification period was May to September 1983. Twelve storms were available for calibration and 8 storms were available for verification. For calibration A (hydraulic conductivity estimated from onsite data and other storm-mode parameters optimized), the computed standard error of estimate was 50% for runoff volumes and 72% for peak discharges. Calibration B included hydraulic conductivity in the optimization, which reduced the standard error of estimate to 28% for runoff volumes and 50% for peak discharges. Optimized values for hydraulic conductivity resulted in reductions from 1.00 to 0.26 in/h and 0.20 to 0.03 in/h for the two general soil groups in the calibrations. Simulated runoff volumes using 7 of the 8 storms occurring during the verification period had a standard error of estimate of 40% for verification A and 38% for verification B. Simulated peak discharge had a standard error of estimate of 120% for verification A and 56% for verification B. Including the eighth storm, which had a relatively small magnitude, in the verification analysis more than doubled the standard errors of estimate for volumes and peaks. (USGS)
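
    One common way to compute a percent standard error of estimate of this kind is sketched below; the exact definition used by the study may differ, and the flow values are made up:

```python
# Sketch of a percent standard error of estimate for simulated vs. observed
# values; this is one common definition, and the runoff values are invented,
# not data from the study.
import math

def se_percent(observed, simulated):
    resid = [(s - o) / o for o, s in zip(observed, simulated)]
    return 100.0 * math.sqrt(sum(r * r for r in resid) / (len(resid) - 1))

obs = [1.2, 0.8, 2.5, 1.6, 0.5]   # observed runoff volumes (invented)
sim = [1.0, 0.9, 2.1, 1.9, 0.6]   # simulated runoff volumes (invented)
se = se_percent(obs, sim)
```

    Because each residual is scaled by the observed value, a single small-magnitude event with a large relative miss can dominate the statistic, which is consistent with the eighth storm's outsized effect noted above.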

  1. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.
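
    A requirement of the kind described can be illustrated as a simple trace monitor; the limits and trace values are invented, and a model checker such as CoCoSim would prove the property over all model behaviors rather than test a single trace:

```python
# Sketch of a typical turbofan safety/performance requirement, written here
# as a monitor over one simulated trace: fan speed must never exceed a limit,
# and thrust must settle within a tolerance of the command. The limit,
# tolerance, and trace values are invented, not C-MAPSS40k parameters.
OVERSPEED_LIMIT = 4100.0   # fan speed limit, rpm (invented)
SETTLE_TOL = 0.02          # final thrust within 2 % of command (invented)

def check_trace(fan_speed, thrust, thrust_cmd):
    no_overspeed = all(n <= OVERSPEED_LIMIT for n in fan_speed)
    settled = abs(thrust[-1] - thrust_cmd) <= SETTLE_TOL * thrust_cmd
    return no_overspeed and settled

ok = check_trace(
    fan_speed=[3800.0, 3950.0, 4050.0, 4020.0],
    thrust=[20000.0, 23500.0, 24800.0, 24950.0],
    thrust_cmd=25000.0,
)
```

    Runtime monitoring of this sort can only refute a requirement on traces it sees; the value of a verification tool is proving the requirement holds for every reachable behavior of the model.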

  2. The National Data Center Preparedness Exercise 2009 - First Results

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Bönnemann, Christian; Ceranna, Lars; Wotawa, Gerhard

    2010-05-01

    The NDC preparedness initiative was initiated by eight signatory states. It now has a history of more than two years, with two successful exercises and subsequent fruitful discussions during the NDC Evaluation Workshops of the CTBTO. The first exercise was carried out in 2007 (NPE07). The objectives of and the idea behind this exercise are described in CTBTO working paper CTBT/WGB-28/DE-IT/1. The exercise simulates a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. This exercise should help to evaluate the effectiveness of analysis procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the NPE is a measure of the readiness of the NDCs to fulfil their duties with regard to CTBT verification: making treaty-compliance judgments about whether events are natural or artificial and chemical or nuclear. NPE09 started on 1 October 2009, 00:00 UTC. Going beyond the previous exercises, three technologies (seismology, infrasound, and radionuclide monitoring) were taken into account, leading to tentative mock events generated by strong explosions in open-pit mines. Consequently, the first event that fulfilled all previously defined criteria was close to the Kara-Zhyra mine in Eastern Kazakhstan and occurred on 28 November 2009 at 07:20:31 UTC. It generated seismic signals as well as infrasound signals at the closest IMS stations. Forward atmospheric transport modelling indicated that a sufficient number of radionuclide stations were also affected to enable the application of a negative testing scenario. First results of the seismo-acoustic analysis of the NPE09 event are presented along with details on the event selection process.

  3. Apollo experience report: Guidance and control systems. Engineering simulation program

    NASA Technical Reports Server (NTRS)

    Gilbert, D. W.

    1973-01-01

    The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.

  4. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
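
    A toy illustration of where verification conditions come from is given below; the backward substitution shown is a drastic simplification for a single variable and straight-line code, not the paper's labeled calculus:

```python
# Toy illustration of VC construction in the Hoare approach: the weakest
# precondition of a straight-line program (assignments only) is computed by
# backward textual substitution, wp(x := e, Q) = Q[e/x]. Naive string
# replacement is only safe in this single-variable toy; a real calculus
# substitutes over parsed terms.
def wp_assign(postcond, var, expr):
    return postcond.replace(var, f"({expr})")

def wp_seq(postcond, stmts):
    for var, expr in reversed(stmts):            # work backwards from Q
        postcond = wp_assign(postcond, var, expr)
    return postcond

# VC for {P} x := x + 1; x := x * 2 {x > 0} is: P implies the wp below.
vc = wp_seq("x > 0", [("x", "x + 1"), ("x", "x * 2")])
```

    The paper's contribution is to attach labels during exactly this kind of rule application, so the provenance of each VC can later be rendered as a natural language explanation.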

  5. Space Station Facility government estimating

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1993-01-01

    This new, unique Cost Engineering Report introduces the 800-page C-100 government estimate for the Space Station Processing Facility (SSPF) and the Volume IV Aerospace Construction Price Book. At the January 23, 1991, bid opening for the SSPF, the government cost estimate was right on target: the low bid, from prime contractor Metric, Inc., was 1.2 percent below the government estimate. This project contains many different and complex systems. Volume IV is a summary of the costs associated with construction, activation, and Ground Support Equipment (GSE) design, estimating, fabrication, installation, testing, termination, and verification for this project. Included are 13 reasons the government estimate was so accurate; an abstract of bids for the 8 bidders and the government estimate, with additive alternates and special labor and materials; budget comparisons and system summaries; and comments on the energy credit from the local electrical utility. This report adds another project to our continuing study of 'How Does the Low Bidder Get Low and Make Money?', which was started in 1967 and first published in the 1973 AACE Transactions with 18 ways the low bidders get low. The accuracy of this estimate demonstrates the benefits of our Kennedy Space Center (KSC) teamwork efforts and KSC Cost Engineer Tools, which are contributing toward our Space Station goals.

  6. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, workers in cherry pickers (left and right) help direct the offloading of the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus Beluga air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  7. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, cranes are poised to help offload the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus Beluga air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  8. Methods for estimating tributary streamflow in the Chattahoochee River basin between Buford Dam and Franklin, Georgia

    USGS Publications Warehouse

    Stamey, Timothy C.

    1998-01-01

    Simple and reliable methods for estimating hourly streamflow are needed for the calibration and verification of a Chattahoochee River basin model between Buford Dam and Franklin, Ga. The river basin model is being developed by the Georgia Department of Natural Resources, Environmental Protection Division, as part of their Chattahoochee River Modeling Project. Concurrent streamflow data collected at 19 continuous-record and 31 partial-record streamflow stations were used in ordinary least-squares linear regression analyses to define estimating equations, and in verifying drainage-area prorations. The resulting regression or drainage-area ratio estimating equations were used to compute hourly streamflow at the partial-record stations. The coefficients of determination (r-squared values) for the regression estimating equations ranged from 0.90 to 0.99. Observed and estimated hourly and daily streamflow data were computed for May 1, 1995, through October 31, 1995. Comparisons of observed and estimated daily streamflow data for 12 continuous-record tributary stations that had streamflow data available for all or part of that period indicate that the mean error of estimate for daily streamflow was about 25 percent.
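
    The two estimators described can be sketched as follows; all flows and drainage areas are invented numbers, not values from the report:

```python
# Sketch of the two estimators for ungauged or partial-record sites: an
# ordinary least-squares fit of site flows against a concurrent index
# station, and a drainage-area proration. All flows and areas are invented.
import numpy as np

index_q = np.array([10.0, 14.0, 20.0, 26.0, 35.0])   # index-station flows
site_q = np.array([4.2, 5.9, 8.1, 10.6, 14.0])       # concurrent site flows

slope, intercept = np.polyfit(index_q, site_q, 1)    # estimating equation
regression_est = slope * 18.0 + intercept            # site flow at index flow 18

site_area, index_area = 12.0, 30.0                   # drainage areas (invented)
ratio_est = (site_area / index_area) * 18.0          # drainage-area proration
```

    The report's r-squared values of 0.90 to 0.99 indicate that, for these basins, the regression form of the estimator explains nearly all of the concurrent-flow variance.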

  9. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  10. Proceedings of the IDA (Institute for Defense Analyses) Workshop on Formal Specification and Verification of Ada (Trade Name) (2nd) Held in Alexandria, Virginia on July 23-25, 1985.

    DTIC Science & Technology

    1985-11-01


  11. Test bed design for evaluating the Space Station ECLSS Water Recovery System

    NASA Technical Reports Server (NTRS)

    Ezell, Timothy G.; Long, David A.

    1990-01-01

    The design of the Phase III Environmental Control and Life Support System (ECLSS) Water Recovery System (WRS) test bed is in progress at the Marshall Space Flight Center (MSFC), building 4755, in Huntsville, Alabama. The overall design for the ECLSS WRS test bed will be discussed. Described within this paper are the design, fabrication, placement, and testing of the supporting facility which will provide the test bed for the ECLSS subsystems. Topics to be included are sterilization system design, component selection, microbial design considerations, and verification of test bed design prior to initiating WRS testing.

  12. Experimental verification of concrete resistance against effect of low pH

    NASA Astrophysics Data System (ADS)

    Dobias, D.; Rehacek, S.; Pokorny, P.; Citek, D.; Kolisko, J.

    2018-03-01

    In the introductory part of this article, the principles of concrete degradation by organic acids are discussed; these acids occur particularly in silage and haylage troughs, in biogas stations, on concrete floors and grates in the vicinity of drinking basins with an addition of formic acid, and also in fermenters and slurry reservoirs. In the experimental part, the first results of monitoring the resistance of a concrete with a styrene-acrylate-based sealing admixture against the effect of low pH are presented. Additional accompanying tests on the tested concretes are also reported.

  13. Integration Testing of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Honeycutt, Timothy; Sowards, Stephanie

    2008-01-01

    Based on the previous successes of Multi-Element Integration Testing (MEIT) for the International Space Station Program, these types of integrated tests have also been planned for the Constellation Program: MEIT: (1) CEV to ISS (emulated); (2) CEV to Lunar Lander/EDS (emulated); (3) future Lunar Surface Systems and Mars missions. Finite Element Integration Test (FEIT): (1) CEV/CLV; (2) Lunar Lander/EDS/CaLV. Integrated Verification Tests (IVT): (1) performed as a subset of the FEITs during the flight tests, and then performed for every flight after Full Operational Capability (FOC) has been obtained with the flight and ground systems.

  14. KSC-02pd1949

    NASA Image and Video Library

    2002-12-17

    KENNEDY SPACE CENTER, FLA. -- Workers at the Cape Canaveral Air Force Station Skid Strip get ready to remove the Pegasus XL Expendable Launch Vehicle attached underneath the Orbital Sciences L-1011 aircraft. The Pegasus will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.

  15. Feasibility and systems definition study for microwave multi-application payload (MMAP)

    NASA Technical Reports Server (NTRS)

    Horton, J. B.; Allen, C. C.; Massaro, M. J.; Zemany, J. L.; Murrell, J. W.; Stanhouse, R. W.; Condon, G. P.; Stone, R. F.

    1977-01-01

    Three Shuttle/Spacelab experiments were studied: the adaptive multibeam phased array antenna (AMPA) experiment, the electromagnetic environment experiment (EEE), and the millimeter wave communications experiment (MWCE). Work on the AMPA experiment was completed. Results include the definition of operating modes, the sequence of operation, radii of operation about several ground stations, the signal format, footprints of typical orbits, and a preliminary definition of ground and user terminals. Definition of the MOD I EEE included conceptual hardware designs, Spacelab interfaces, preliminary data handling methods, experiment tests and verification, and EMC studies. The MWCE was defined conceptually around a steerable high-gain antenna.

  16. New climatic classification of Nepal

    NASA Astrophysics Data System (ADS)

    Karki, Ramchandra; Talchabhadel, Rocky; Aalto, Juha; Baidya, Saraju Kumar

    2016-08-01

    Although it is evident that Nepal has an extremely wide range of climates within a short latitudinal distance, there is a lack of comprehensive research in this field. Climatic zoning in a topographically complex country like Nepal has important implications for scientific station network design and climate model verification, as well as for studies examining the effects of climate change in terms of shifting climatic boundaries and vegetation in highly sensitive environments. This study presents a new high-resolution climate map of Nepal on the basis of long-term (1981-2010) monthly precipitation data for 240 stations and mean air temperature data for 74 stations, using the original and a modified Köppen-Geiger climate classification system. Climatic variables used in the Köppen-Geiger system were calculated (i) at each station and (ii) interpolated to 1-km spatial resolution using kriging that accounted for latitude, longitude, and elevation. The original Köppen-Geiger scheme could not identify all five types of climate (including tropical) observed in Nepal. Hence, the original scheme was slightly modified by changing the boundary value of the coldest-month mean air temperature from 18 °C to 14.5 °C in order to delineate the realistic climatic conditions of Nepal. With this modification, all five types of climate (including tropical) were identified. The most common dominant type of climate for Nepal is temperate with dry winter and hot summer (Cwa).
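    A minimal sketch of the threshold modification described above. This is an illustrative simplification, not the authors' code: only the major class boundaries are shown, and the arid (B) rules and climate subtypes of the full Köppen-Geiger scheme are omitted.

    ```python
    def koppen_major_class(monthly_temp_c, tropical_threshold=14.5):
        """Assign a major Koppen-Geiger class from 12 monthly mean temperatures.

        tropical_threshold=14.5 reflects the modification described for Nepal;
        the original Koppen-Geiger scheme uses 18 C for the coldest-month
        boundary. Aridity (B) rules are omitted for brevity.
        """
        t_min = min(monthly_temp_c)
        t_max = max(monthly_temp_c)
        if t_min >= tropical_threshold:
            return "A"   # tropical: coldest month above the boundary value
        if t_max < 10:
            return "E"   # polar: warmest month below 10 C
        if t_min > -3:
            return "C"   # temperate: mild winter
        return "D"       # cold/continental: severe winter
    ```

    A station whose coldest month sits between 14.5 °C and 18 °C is classified temperate (C) under the original boundary but tropical (A) under the modified one, which is exactly the effect the modification targets.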

  17. Human Factors Analysis and Layout Guideline Development for the Canadian Surface Combatant (CSC) Project

    DTIC Science & Technology

    2013-04-01

    The aim of this project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on the analysis, design, and verification processes that should be used in the development of effective room layouts for RCN ships. The primary focus was the design of the CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel.

  18. Antenna modeling considerations for accurate SAR calculations in human phantoms in close proximity to GSM cellular base station antennas.

    PubMed

    van Wyk, Marnus J; Bingle, Marianne; Meyer, Frans J C

    2005-09-01

    International bodies such as the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the Institute of Electrical and Electronics Engineers (IEEE) make provision for human exposure assessment based on SAR calculations (or measurements) and basic restrictions. In the case of base station exposure, this is mostly applicable to occupational exposure scenarios in the very near field of these antennas, where the conservative reference level criteria could be unnecessarily restrictive. This study presents a variety of critical aspects that need to be considered when calculating SAR in a human body close to a mobile phone base station antenna. A hybrid FEM/MoM technique is proposed as a suitable numerical method to obtain accurate results. The verification of the FEM/MoM implementation has been presented in a previous publication; the focus of this study is an investigation into the detail that must be included in a numerical model of the antenna to accurately represent the real-world scenario. This is accomplished by comparing numerical results to measurements for a generic GSM base station antenna and appropriate, representative canonical and human phantoms. The results show that it is critical to take the disturbance effect of the human phantom (a large conductive body) on the base station antenna into account when the antenna-phantom spacing is less than 300 mm. For these small spacings, the antenna structure must be modeled in detail. The conclusion is that it is feasible to calculate, using the proposed techniques and methodology, accurate occupational compliance zones around base station antennas based on a SAR profile and basic restriction guidelines. (c) 2005 Wiley-Liss, Inc.

  19. A Synoptic Weather Typing Approach and Its application to Assess Climate Change Impacts on Extreme Weather Events at Local Scale in South-Central Canada

    NASA Astrophysics Data System (ADS)

    Shouquan Cheng, Chad; Li, Qian; Li, Guilong

    2010-05-01

    The synoptic weather typing approach has become popular in evaluating the impacts of climate change on a variety of environmental problems. One of the reasons is its ability to categorize a complex set of meteorological variables as a coherent index, which can facilitate analyses of local climate change impacts. The weather typing method has been successfully applied at Environment Canada in several research projects to analyze climatic change impacts on a number of extreme weather events, such as freezing rain, heavy rainfall, high-/low-flow events, air pollution, and human health. These studies comprise three major parts: (1) historical simulation modeling to verify the extreme weather events, (2) statistical downscaling to provide station-scale future hourly/daily climate data, and (3) projections of changes in frequency and intensity of future extreme weather events in this century. To achieve these goals, in addition to synoptic weather typing, modeling conceptualizations in meteorology and hydrology and a number of linear/nonlinear regression techniques were applied. Furthermore, a formal model result verification process has been built into each of the three parts of the projects. The results of the verification, based on historical observations of the outcome variables predicted by the models, showed very good agreement. The modeled results from these projects found that the frequency and intensity of future extreme weather events are projected to significantly increase under a changing climate in this century. This talk will introduce these research projects and outline the modeling exercise and result verification process. The major findings on future projections from the studies will be summarized in the presentation as well. One of the major conclusions from the studies is that the procedures (including synoptic weather typing) used in the studies are useful for climate change impact analysis of future extreme weather events.
The implied significant increases in the frequency and intensity of future extreme weather events would be worth considering when revising engineering infrastructure design standards and developing adaptation strategies and policies.

  20. Merging Infrasound and Electromagnetic Signals as a Means for Nuclear Explosion Detection

    NASA Astrophysics Data System (ADS)

    Ashkenazy, Joseph; Lipshtat, Azi; Kesar, Amit S.; Pistinner, Shlomo; Ben Horin, Yochai

    2016-04-01

    The infrasound monitoring network of the CTBT consists of 60 stations. These stations are capable of detecting atmospheric events and may provide an approximate location within a time scale of a few hours. However, the nature of these events cannot be deduced from the infrasound signal. More than two decades ago it was proposed to use the electromagnetic pulse (EMP) as a means of discriminating nuclear explosions from other atmospheric events. An EMP is a unique signature of a nuclear explosion and is not produced by chemical ones. Nevertheless, it was decided to exclude the EMP technology from the official CTBT verification regime, mainly because of the risk of a high false alarm rate due to lightning electromagnetic pulses [1]. Here we present a method of integrating the information retrieved from the infrasound system with the EMP signal, which enables us to discriminate between lightning discharges and nuclear explosions. Furthermore, we show how spectral and other characteristics of the electromagnetic signal emitted by a nuclear explosion are distinguished from those of a lightning discharge. We estimate the false alarm probability of detecting a lightning discharge from a given area of the infrasound event and identifying it as the signature of a nuclear explosion. We show that this probability is very low and conclude that the combination of infrasound monitoring and EMP spectral analysis may produce a reliable method for identifying nuclear explosions. [1] R. Johnson, Unfinished Business: The Negotiation of the CTBT and the End of Nuclear Testing, United Nations Institute for Disarmament Research, 2009.
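    The false-alarm estimate described above can be sketched with a toy model. This is not the paper's actual formulation: it assumes lightning flashes arrive as a Poisson process over the infrasound localization area and time window, and that the spectral discriminator misclassifies each flash independently, so that misclassified flashes form a thinned Poisson process.

    ```python
    import math

    def false_alarm_probability(flash_rate_km2_hr, area_km2, window_hr, p_confusion):
        """Probability that at least one lightning discharge inside the
        infrasound localization region and time window is also misclassified
        as a nuclear EMP by the spectral discriminator.

        Assumptions (illustrative only): Poisson flash arrivals at the given
        areal rate; independent per-flash confusion probability p_confusion.
        """
        expected_flashes = flash_rate_km2_hr * area_km2 * window_hr
        # P(no misclassified flash) = exp(-lambda * p) under Poisson thinning
        return 1.0 - math.exp(-expected_flashes * p_confusion)
    ```

    For a small expected number of misclassified flashes, the result is approximately the product of the expected flash count and the per-flash confusion probability, which is why a selective spectral discriminator drives the combined false alarm rate down.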

  1. Nitrogen Dioxide Total Column Over Terra Nova Bay Station - Antarctica - During 2001

    NASA Astrophysics Data System (ADS)

    Bortoli, D.; Ravegnani, F.; Giovanelli, G.; Petritoli, A.; Kostadinov, I.

    GASCOD (Gas Analyzer Spectrometer Correlating Optical Differences), installed at the Italian Antarctic Station of Terra Nova Bay (TNB) - 74.69S, 164.12E - since 1995, carried out a full dataset of zenith scattered light measurements for the year 2001. The application of the DOAS methodology to the collected data yielded, as final results, the slant column values for nitrogen dioxide. The seasonal variation shows a maximum in the summer and is in good agreement with the results obtained by other authors. The data analysis is performed using different parameters, such as the potential vorticity (PV) at 500 K and the atmospheric temperatures at the same level. After verification of the linear dependency between the NO2 slant column values and the temperature of the NO2 cross section utilized in the DOAS algorithm, the actual stratospheric temperatures (from ECMWF) over TNB are applied to the results. The sensible changes in the nitrogen dioxide slant column values make it possible to highlight the good match between the NO2 AM/PM ratio and the potential vorticity at 500 K. The NO2 slant column values follow the variations of the stratospheric temperature mainly during the spring season, when the lowest temperatures are observed and the ozone-hole phenomena mainly occur. ACKNOWLEDGMENTS: The author Daniele Bortoli was financially supported by the "Subprograma Ciência e Tecnologia do Terceiro Quadro Comunitário de Apoio". The National Program for Antarctic Research (PNRA) supported this research.

  2. Atmospheric inversion for cost effective quantification of city CO2 emissions

    NASA Astrophysics Data System (ADS)

    Wu, L.; Broquet, G.; Ciais, P.; Bellassen, V.; Vogel, F.; Chevallier, F.; Xueref-Remy, I.; Wang, Y.

    2015-11-01

    Cities, currently covering only a very small portion (< 3 %) of the world's land surface, directly release to the atmosphere about 44 % of global energy-related CO2, and are associated with 71-76 % of CO2 emissions from global final energy use. Although many cities have set voluntary climate plans, their CO2 emissions are not evaluated by Monitoring, Reporting and Verification (MRV) procedures that play a key role for market- or policy-based mitigation actions. Here we propose a monitoring tool that could support the development of such procedures at the city scale. It is based on an atmospheric inversion method that exploits inventory data and continuous atmospheric CO2 concentration measurements from a network of stations within and around cities to estimate city CO2 emissions. We examine the cost-effectiveness and the performance of such a tool. The instruments presently used to measure CO2 concentrations at research stations are expensive. However, cheaper sensors are currently developed and should be useable for the monitoring of CO2 emissions from a megacity in the near-term. Our assessment of the inversion method is thus based on the use of several types of hypothetical networks, with a range of numbers of sensors sampling at 25 m a.g.l. The study case for this assessment is the monitoring of the emissions of the Paris metropolitan area (~ 12 million inhabitants and 11.4 Tg C emitted in 2010) during the month of January 2011. The performance of the inversion is evaluated in terms of uncertainties in the estimates of total and sectoral CO2 emissions. These uncertainties are compared to a notional ambitious target to diagnose annual total city emissions with an uncertainty of 5 % (2-sigma). 
We find that, with only 10 stations, which is the typical size of the current pilot networks deployed in some cities, the uncertainty in the 1-month total city CO2 emissions is significantly reduced by the inversion, by ~ 42 %, but still corresponds to an annual uncertainty two times larger than the 5 % target. By extending the network from 10 to 70 stations, the inversion can meet this requirement. As for major sectoral CO2 emissions, the uncertainties in the inverted emissions using 70 stations are reduced significantly relative to those obtained using 10 stations: by 32 % for commercial and residential buildings, by 33 % for road transport, and by 18 % for the production of energy by power plants. With 70 stations, the uncertainties from the inversion become 15 % (2-sigma, annual) for dispersed building emissions, and 18 % for emissions from road transport and energy production. The inversion performance could be further improved by optimal design of station locations and/or by assimilating additional atmospheric measurements of species that are co-emitted with CO2 by fossil fuel combustion processes with a specific signature from each sector, such as carbon monoxide (CO). Atmospheric inversions based on continuous CO2 measurements from a large number of cheap sensors can thus deliver a valuable quantification tool for the monitoring and/or the verification of city CO2 emissions (baseline) and CO2 emission reductions (commitments).
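    The station-count effect described above can be illustrated with a scalar analogue of a linear Gaussian inversion. This is a sketch only: the actual inversion is multivariate with correlated transport errors, and the sensitivities and error magnitudes below are made-up inputs, not the study's configuration.

    ```python
    def posterior_sigma(prior_sigma, obs_sigma, sensitivities):
        """Posterior uncertainty on a single scalar emission estimate from a
        linear Gaussian inversion with independent observations:

            sigma_post^2 = (sigma_prior^-2 + sum_i h_i^2 / sigma_obs^2)^-1

        where h_i is the sensitivity of station i to the emission term.
        """
        precision = prior_sigma ** -2 + sum(h * h for h in sensitivities) / obs_sigma ** 2
        return precision ** -0.5
    ```

    Each added station contributes another term to the posterior precision, so the posterior uncertainty shrinks monotonically as the network grows from 10 to 70 stations, mirroring the qualitative behavior reported above.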

  3. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on existing experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  4. Weak lensing magnification in the Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshifts. An extensive analysis of the systematic effects, using new methods based on simulations, is performed, including a Monte Carlo sampling of the selection function of the survey.

  5. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency, and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for the node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be applied to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users' data privacy in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and the integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.
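    The binary-tree node calculation at the heart of such schemes can be sketched with a conventional hash tree (Merkle tree). Note the substitution: the paper computes node values with a spatiotemporal chaos map, whereas this sketch uses SHA-256, purely to show the tree structure a verifier would recompute and compare against a stored root.

    ```python
    import hashlib

    def merkle_root(blocks):
        """Root of a binary hash tree over data blocks (bytes). Any change to
        any block changes the root, which is what an integrity audit checks."""
        level = [hashlib.sha256(b).digest() for b in blocks]
        if not level:
            return hashlib.sha256(b"").digest()
        while len(level) > 1:
            if len(level) % 2:            # duplicate the last node on odd levels
                level.append(level[-1])
            level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                     for i in range(0, len(level), 2)]
        return level[0]
    ```

    A verifier who holds only the root can detect tampering with any block; per-copy roots extend the same idea to the multiple-replica setting the paper addresses.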

  6. A preliminary study on the use of FX-Glycine gel and an in-house optical cone beam CT readout for IMRT and RapidArc verification

    NASA Astrophysics Data System (ADS)

    Ravindran, Paul B.; Ebenezer, Suman Babu S.; Winfred, Michael Raj; Amalan, S.

    2017-05-01

    The radiochromic FX gel with optical CT readout has been investigated by several authors and has shown promising results for 3D dosimetry. One of the applications of gel dosimeters is their use in 3D dose verification for IMRT and RapidArc quality assurance. Though polymer gel has been used successfully for clinical dose verification, the use of FX gel for clinical dose verification with optical cone beam CT needs further validation. In this work, we have used FX gel and an in-house optical readout system for gamma analysis between the dose matrices of the measured dose distribution and a treatment planning system (TPS) calculated dose distribution for a few test cases.
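    Gamma analysis compares each measured dose point against all calculated points under combined dose-difference and distance-to-agreement criteria. A minimal 1-D global-gamma sketch follows (3 %/3 mm defaults); clinical tools work in 3-D, interpolate the calculated grid, and apply low-dose thresholds, all omitted here.

    ```python
    import math

    def gamma_index_1d(measured, calculated, positions, dose_tol=0.03, dist_tol_mm=3.0):
        """Per-point 1-D gamma index between a measured and a TPS-calculated
        dose profile sampled at the same positions (mm). A point passes the
        test if its gamma value is <= 1."""
        d_max = max(calculated)            # global normalization dose
        gammas = []
        for x_m, d_m in zip(positions, measured):
            g = min(
                math.sqrt(((x_m - x_c) / dist_tol_mm) ** 2
                          + ((d_m - d_c) / (dose_tol * d_max)) ** 2)
                for x_c, d_c in zip(positions, calculated)
            )
            gammas.append(g)
        return gammas
    ```

    Identical profiles give gamma = 0 everywhere; a uniform 1 % dose offset stays well inside a 3 % tolerance, so all points still pass.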

  7. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.

  8. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  9. International Space Station Increment-2 Quick Look Report

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Hrovat, Kenneth; Kelly, Eric

    2001-01-01

    The objective of this quick look report is to disseminate the International Space Station (ISS) Increment-2 reduced gravity environment preliminary analysis in a timely manner to the microgravity scientific community. This report is a quick look at the processed acceleration data collected by the Microgravity Acceleration Measurement System (MAMS) during the period of May 3 to June 8, 2001. The report is by no means an exhaustive examination of all the relevant activities that occurred during this time span, for two reasons. First, the time span covered is rather short, and the MAMS was not active throughout it, which precludes a detailed characterization. Second, as the name of the report implies, it is a quick look at the acceleration data. Consequently, a more comprehensive report, the ISS Increment-2 report, will be published following the conclusion of the Increment-2 tour of duty. NASA sponsors the MAMS and the Space Acceleration Measurement System (SAMS) to support microgravity science experiments that require microgravity acceleration measurements. On April 19, 2001, both the MAMS and the SAMS units were launched on STS-100 from the Kennedy Space Center for installation on the ISS. The MAMS unit was flown to the station in support of science experiments requiring quasi-steady acceleration data measurements, while the SAMS unit was flown to support experiments requiring vibratory acceleration data measurement. Both acceleration systems are also used in support of the vehicle microgravity requirements verification. The ISS reduced gravity environment analysis presented in this report uses mostly the MAMS acceleration data measurements (the Increment-2 report will cover both systems). The MAMS has two sensors.
The MAMS Orbital Acceleration Research Experiment Sensor Subsystem, which is a low frequency range sensor (up to 1 Hz), is used to characterize the quasi-steady environment for payloads and vehicle. The MAMS High Resolution Acceleration Package is used to characterize the ISS vibratory environment up to 100 Hz. This quick look report presents some selected quasi-steady and vibratory activities recorded by the MAMS during the ongoing ISS Increment-2 tour of duty.

  10. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

    Electric propulsion vehicle systems envelop a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification, the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors which may impact options for system verification.

  11. Time trend of injection drug errors before and after implementation of bar-code verification system.

    PubMed

    Sakushima, Ken; Umeki, Reona; Endoh, Akira; Ito, Yoichi M; Nasuhara, Yasuyuki

    2015-01-01

    Bar-code technology, used for verification of patients and their medication, could prevent medication errors in clinical practice. A retrospective analysis of electronically stored medical error reports was conducted in a university hospital. The number of reported medication errors for injected drugs, including wrong-drug administration and administration to the wrong patient, was compared before and after implementation of the bar-code verification system for inpatient care. A total of 2867 error reports associated with injection drugs were extracted. Wrong-patient errors decreased significantly after implementation of the bar-code verification system (17.4/year vs. 4.5/year, p < 0.05), although wrong-drug errors did not decrease sufficiently (24.2/year vs. 20.3/year). The source of medication errors due to wrong drugs was drug preparation in hospital wards. Bar-code medication administration is effective for prevention of wrong-patient errors. However, ordinary bar-code verification systems are limited in their ability to prevent incorrect drug preparation in hospital wards.
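    The before/after comparison above reduces to simple rate arithmetic, sketched here with the reported yearly rates (17.4 → 4.5 wrong-patient errors/year; 24.2 → 20.3 wrong-drug errors/year). The observation-period lengths are illustrative; the study's significance test is not reproduced.

    ```python
    def yearly_rate(error_count, years_observed):
        """Mean reported errors per year over an observation period."""
        return error_count / years_observed

    def relative_reduction(rate_before, rate_after):
        """Fractional drop in the yearly error rate after an intervention."""
        return (rate_before - rate_after) / rate_before
    ```

    Applied to the reported figures, wrong-patient errors fell by roughly 74 %, while wrong-drug errors fell by only about 16 %, consistent with the conclusion that bar-code verification addresses patient identification far better than drug preparation.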

  12. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large in size so that they address the same basic problems associated with the design, fabrication, assembly, and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterizations, and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  13. Resolution verification targets for airborne and spaceborne imaging systems at the Stennis Space Center

    NASA Astrophysics Data System (ADS)

    McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye

    1997-06-01

    The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.

  14. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  15. The International Space Station Urine Monitoring System (UMS)

    NASA Technical Reports Server (NTRS)

    Feeback, Daniel L.; Cibuzar, Branelle R.; Milstead, Jeffery R.; Pietrzyk, Robert A.; Clark, Mark S.F.

    2009-01-01

    A device capable of making in-flight volume measurements of single-void urine samples, the Urine Monitoring System (UMS), was developed and flown on seven U.S. Space Shuttle missions. This device provided volume data for each urine void from multiple crewmembers and allowed samples of each to be taken and returned to Earth for post-flight analysis. There were a number of design flaws in the original instrument, including liquid carry-over producing invalid "actual" micturition volumes and cross-contamination between successive users from residual urine in "dead spots". Additionally, high- or low-volume voids could not be accurately measured, the on-orbit calibration and nominal use sequence was time-intensive, and the unit had to be returned and disassembled to retrieve the volume data. These problems have been resolved in a new version, the International Space Station (ISS) UMS, which has been designed to provide real-time in-flight volume data with accuracy and precision equivalent to measurements made on Earth and the ability to provide urine samples that are unadulterated by the device. Originally conceived to interface with a U.S.-built Waste Collection System (WCS), the unit has now been modified to interface with the Russian-supplied Sanitary Hygiene Device (ASY). The ISS UMS provides significant advantages over the current method of collecting urine samples into Urine Collection Devices (UCDs), from which samples are removed and returned to Earth for analyses. A significant future advantage of the UMS is that it can provide an interface to analytical instrumentation for real-time measurement of urine bioanalytes, allowing monitoring of crewmember health status during flight and medical interventions based on the results of these measurements. Currently, the ISS UMS is scheduled to launch along with Node-3 on STS-130 (20A) in December 2009.
UMS will be installed and scientific/functional verification completed prior to placing the instrument into operation. Samples collected during the verification sequence will be returned for analyses on STS-131 (19A) currently scheduled for launch in March 2010. The presence of a UMS on ISS will provide the capability to conduct additional collaborative human life science investigations among the ISS International Partners.

  16. International challenge to predict the impact of radioxenon releases from medical isotope production on a comprehensive nuclear test ban treaty sampling station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Bowyer, Ted W.; Achim, Pascal

    The International Monitoring System (IMS) is part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). At entry-into-force, half of the 80 radionuclide stations will be able to measure concentrations of several radioactive xenon isotopes produced in nuclear explosions, and the full network may be populated with xenon monitoring afterward (Bowyer et al., 2013). Fission-based production of 99Mo for medical purposes also releases radioxenon isotopes to the atmosphere (Saey, 2009). One way to mitigate the effect of emissions from medical isotope production is the use of stack monitoring data, if available, so that the effect of radioactive xenon emissions could be subtracted from the effect of a presumed nuclear explosion when detected at an IMS station location. To date, no studies have addressed the impact that the time resolution or data accuracy of stack monitoring data has on predicted concentrations at an IMS station location. Recently, participants from seven nations used atmospheric transport modeling to predict the time history of 133Xe concentration measurements at an IMS station in Germany using stack monitoring data from a medical isotope production facility in Belgium. Participants received only stack monitoring data and used the atmospheric transport model and meteorological data of their choice. Some of the models predicted the highest measured concentrations quite well (a high composite statistical model comparison rank or a small mean square error with respect to the measured values). The results suggest that release data with a 15-min time spacing work best. The model comparison rank and ensemble analysis suggest that combining multiple models may provide more accurate predicted concentrations than any single model. Further research is needed to identify optimal methods for selecting ensemble members, and those methods may depend on the specific transport problem.
None of the submissions based only on the stack monitoring data predicted the small measured concentrations very well. The one submission that best predicted small concentrations also included releases from nuclear power plants. Modeling of sources at other nuclear facilities with smaller releases than medical isotope production facilities may be important in discriminating those releases from the releases of a nuclear explosion.
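    The model-comparison and ensemble analysis described above, ranking submissions by mean squared error against measured concentrations and averaging them into a multi-model ensemble, can be sketched as follows. The concentrations and model outputs here are hypothetical, not the exercise's data:

```python
import statistics

def mse(predicted, measured):
    """Mean squared error between predicted and measured concentration series."""
    return sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(measured)

# Hypothetical 133Xe concentrations (mBq/m^3) at the IMS station.
measured = [1.2, 3.4, 0.8, 2.1]
submissions = {
    "model_A": [1.0, 3.0, 1.1, 2.4],
    "model_B": [2.0, 4.5, 0.2, 1.0],
    "model_C": [1.3, 3.6, 0.7, 2.0],
}

# Rank models by MSE (lower is better).
ranked = sorted(submissions, key=lambda name: mse(submissions[name], measured))

# Simple multi-model ensemble: the mean prediction at each time step.
ensemble = [statistics.mean(vals) for vals in zip(*submissions.values())]

print(ranked[0])  # best single model by MSE
print(mse(ensemble, measured))
```

    In this toy example the ensemble mean scores better than the worst single model, which is the qualitative behavior the exercise reports; real ensemble-member selection is an open question, as the abstract notes.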

  17. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository for validated program values.

  18. On verifying a high-level design. [cost and error analysis]

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  19. Verification Assessment of Flow Boundary Conditions for CFD Analysis of Supersonic Inlet Flows

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2002-01-01

    Boundary conditions for subsonic inflow, bleed, and subsonic outflow as implemented into the WIND CFD code are assessed with respect to verification for steady and unsteady flows associated with supersonic inlets. Verification procedures include grid convergence studies and comparisons to analytical data. The objective is to examine errors, limitations, capabilities, and behavior of the boundary conditions. Computational studies were performed on configurations derived from a "parameterized" supersonic inlet. These include steady supersonic flows with normal and oblique shocks, steady subsonic flow in a diffuser, and unsteady flow with the propagation and reflection of an acoustic disturbance.
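    Grid convergence studies of the kind used in this verification assessment are commonly quantified by the observed order of accuracy computed from solutions on three systematically refined grids, together with a Richardson-extrapolated estimate of the grid-independent value. The sketch below uses hypothetical values of a generic solution functional; it does not model the WIND code or the study's actual configurations:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three grid solutions with
    constant refinement ratio r (standard Richardson-extrapolation form)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Hypothetical values of a solution functional (e.g., total pressure recovery)
# on coarse, medium, and fine grids with refinement ratio r = 2.
f3, f2, f1 = 0.9200, 0.9290, 0.9312  # coarse, medium, fine
p = observed_order(f3, f2, f1, r=2.0)

# Richardson-extrapolated estimate of the grid-independent value.
f_exact = f1 + (f1 - f2) / (2.0 ** p - 1.0)
print(round(p, 2), round(f_exact, 4))
```

    An observed order close to the scheme's formal order is the usual evidence that the boundary-condition implementation is converging as designed.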

  20. Verification of relationship model between Korean new elderly class's recovery resilience and productive aging.

    PubMed

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-12-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model of the direct path from recovery resilience to productive aging showed acceptable fit.
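    For a single-predictor path such as recovery resilience → productive aging, the standardized path coefficient reduces to the Pearson correlation of the two scores. The sketch below illustrates that computation with hypothetical composite scores; it is not the study's data, which were analyzed with AMOS structural equation modeling:

```python
import statistics

def standardized_coefficient(x, y):
    """For a single-predictor linear model, the standardized regression
    (path) coefficient beta equals the Pearson correlation of x and y."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sx, sy = statistics.stdev(x), statistics.stdev(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (sx * sy)

# Hypothetical resilience and productive-aging composite scores.
resilience = [3.1, 4.0, 2.5, 3.8, 4.5, 2.9]
aging = [3.0, 4.2, 2.4, 3.9, 4.4, 3.1]
beta = standardized_coefficient(resilience, aging)
print(round(beta, 3))
```

    A beta near 1, as the study reports (β=0.975), indicates the two constructs move almost in lockstep in the fitted model.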

  1. Verification of relationship model between Korean new elderly class’s recovery resilience and productive aging

    PubMed Central

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-01-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class’s recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model of the direct path from recovery resilience to productive aging showed acceptable fit. PMID:26730383

  2. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
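    The random-projection transform analyzed in this paper can be sketched as follows. The dimensions, seeds, and 1/sqrt(k) scaling here are illustrative choices (standard for Gaussian random projection), not the paper's exact configuration; changing the projection seed models issuing a "changed" template:

```python
import random
import math

def random_projection_matrix(k, d, seed=0):
    """k x d matrix of i.i.d. Gaussian entries, scaled by 1/sqrt(k) so that
    pairwise Euclidean distances are approximately preserved in expectation."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)] for _ in range(k)]

def project(R, x):
    return [sum(r_i * x_i for r_i, x_i in zip(row, x)) for row in R]

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

# Two hypothetical biometric feature vectors (d = 64), projected to k = 32.
rng = random.Random(42)
x = [rng.gauss(0, 1) for _ in range(64)]
y = [rng.gauss(0, 1) for _ in range(64)]
R = random_projection_matrix(32, 64, seed=7)

# Distance between templates is approximately preserved after projection,
# which is the similarity-preserving property the paper analyzes.
print(dist(x, y), dist(project(R, x), project(R, y)))
```

    Reissuing a template with a new random matrix (a different seed) yields a template uncorrelated with the old one, which is the changeability property; the paper's vector-translation method further improves this.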

  3. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    NASA Technical Reports Server (NTRS)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.

  4. A time-lapse gravity survey of the Coso geothermal field, China Lake Naval Air Weapons Station, California

    USGS Publications Warehouse

    Phelps, Geoffrey; Cronkite-Ratcliff, Collin; Blake, Kelly

    2018-04-19

    We have conducted a gravity survey of the Coso geothermal field to continue the time-lapse gravity study of the area initiated in 1991. In this report, we outline a method of processing the gravity data that minimizes the random errors and instrument bias introduced into the data by the Scintrex CG-5 relative gravimeters that were used. After processing, the standard deviation of the data was estimated to be ±13 microGals. These data reveal that the negative gravity anomaly over the Coso geothermal field, centered on gravity station CER1, is continuing to increase in magnitude over time. Preliminary modeling indicates that water-table drawdown at the location of CER1 is between 65 and 326 meters over the last two decades. We note, however, that several assumptions on which the model results depend, such as constant elevation and free-water level over the study period, still require verification.
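    The link between a gravity decrease and water-table drawdown can be illustrated with the infinite Bouguer slab approximation, dg = 2*pi*G*rho*phi*dh, where phi is porosity. The values below are hypothetical, and the report's own preliminary modeling may use a different geometry; note how strongly the inferred drawdown depends on the assumed porosity:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0     # density of water, kg/m^3

def drawdown_from_gravity(delta_g_microgal, porosity):
    """Water-table drawdown (m) implied by a gravity decrease, using the
    infinite Bouguer slab approximation: dg = 2*pi*G*rho*phi*dh."""
    delta_g = delta_g_microgal * 1e-8          # microGal -> m/s^2
    return delta_g / (2.0 * math.pi * G * RHO_WATER * porosity)

# A hypothetical 100 microGal decrease evaluated at porosities of 10% and 2%
# brackets a wide range of drawdown estimates.
print(round(drawdown_from_gravity(100.0, 0.10), 1))  # ~ 23.8 m
print(round(drawdown_from_gravity(100.0, 0.02), 1))  # ~ 119.2 m
```

    This porosity sensitivity is one reason the report's 65-326 m drawdown range is so wide, alongside the unverified assumptions about elevation and free-water level.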

  5. Verification of International Space Station Component Leak Rates by Helium Accumulation Method

    NASA Technical Reports Server (NTRS)

    Underwood, Steve D.; Smith, Sherry L.

    2003-01-01

    Discovery of leakage on several International Space Station U.S. Laboratory Module ammonia system quick disconnects (QDs) led to the need for a process to quantify total leakage without removing the QDs from the system. An innovative solution was proposed allowing quantitative leak rate measurement at ambient external pressure without QD removal. The method utilizes a helium mass spectrometer configured in the detector probe mode to determine helium leak rates inside a containment hood installed on the test component. The method was validated through extensive developmental testing. Test results showed the method was viable, accurate and repeatable for a wide range of leak rates. The accumulation method has been accepted by NASA and is currently being used by Boeing Huntsville, Boeing Kennedy Space Center and Boeing Johnson Space Center to test welds and valves and will be used by Alenia to test the Cupola. The method has been used in place of more expensive vacuum chamber testing which requires removing the test component from the system.
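    The accumulation method's leak-rate calculation can be sketched in its simplest form: at ambient pressure, the leak rate equals the hood volume times the rate of rise of the helium fraction inside the containment hood. The numbers below are hypothetical, not from the validation testing:

```python
def accumulation_leak_rate(hood_volume_cc, delta_concentration_ppm, delta_t_s):
    """Helium leak rate (std cc/s) inferred from the rise in helium
    concentration inside a containment hood of known volume at ambient
    pressure, as measured by a mass spectrometer in detector-probe mode."""
    delta_fraction = delta_concentration_ppm * 1e-6   # ppm -> volume fraction
    return hood_volume_cc * delta_fraction / delta_t_s

# Hypothetical test: 500 cc hood, concentration rises 20 ppm in 600 s.
rate = accumulation_leak_rate(500.0, 20.0, 600.0)
print(f"{rate:.2e} std cc/s")
```

    In practice the background helium concentration and hood in-leakage would also have to be characterized; the sketch assumes both are negligible.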

  6. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. We then analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
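    The gamma analysis used to compare reconstructed and planned dose distributions can be sketched as follows. This is a simplified 1-D illustration of the global gamma index with 3%/3mm criteria, using hypothetical dose profiles; it is not the authors' implementation, which operates on 2-D distributions:

```python
import math

def gamma_index(ref_points, eval_points, dose_tol, dist_tol):
    """Simplified global gamma index: for each reference point (x, dose),
    the minimum over evaluated points of
    sqrt((dose_diff/dose_tol)^2 + (dist_diff/dist_tol)^2)."""
    gammas = []
    for xr, dr in ref_points:
        g = min(math.sqrt(((de - dr) / dose_tol) ** 2 + ((xe - xr) / dist_tol) ** 2)
                for xe, de in eval_points)
        gammas.append(g)
    return gammas

def passing_rate(gammas):
    """Percentage of reference points with gamma <= 1 (a pass)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

# Hypothetical 1-D dose profiles (position in mm, dose in %), 3%/3mm criteria.
reference = [(0, 100.0), (3, 98.0), (6, 90.0), (9, 70.0)]
evaluated = [(0, 101.5), (3, 97.0), (6, 93.5), (9, 71.0)]
gammas = gamma_index(reference, evaluated, dose_tol=3.0, dist_tol=3.0)
print(passing_rate(gammas))
```

    A real implementation would interpolate the evaluated distribution between grid points and typically restrict the analysis to points above a low-dose threshold.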

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: UTC FUEL CELLS' PC25C POWER PLANT - GAS PROCESSING UNIT PERFORMANCE FOR ANAEROBIC DIGESTER GAS

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system based on the UTC Fuel Cell's PC25C Fuel Cell Power Plant was evaluated. The...

  8. 26 CFR 1.404(a)-2 - Information to be furnished by employer claiming deductions; taxable years ending before December...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and the vesting conditions, (v) The medium of funding (e. g., self-insured, unit purchase group... source and application in sufficient detail to permit ready analysis and verification thereof, and, in... verification of the reasonableness thereof. (9) A statement of the contributions paid under the plan for the...

  9. Fabrication and verification testing of ETM 30 cm diameter ion thrusters

    NASA Technical Reports Server (NTRS)

    Collett, C.

    1977-01-01

    Engineering model designs and acceptance tests are described for the 800 and 900 series 30 cm electron bombardment thrusters. Modifications to the test console for a 1000 hr verification test were made. The 10,000 hr endurance test of the S/N 701 thruster is described, and post-test analysis results are included.

  10. 78 FR 69602 - Foreign Supplier Verification Programs for Importers of Food for Humans and Animals; Extension of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Food for... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 1 [Docket No. FDA-2011-N-0143] RIN 0910-AG64 Foreign Supplier Verification Programs for Importers of Food for Humans and...

  11. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  12. Integrated Formal Analysis of Timed-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Timed-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.
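    A compression function of the kind used in clock synchronization combines the clock deviations measured from several synchronization masters into one fault-tolerant correction. The sketch below shows the general scheme, discarding extremes and taking the midpoint of what remains, with hypothetical deviation values; TTE's actual compression function is defined in the SAE AS6802 standard and differs in detail:

```python
def compress(deviations, f=1):
    """Fault-tolerant compression of measured clock deviations (microseconds):
    discard the f largest and f smallest values, then take the midpoint of
    the remaining extremes, so up to f faulty clocks cannot skew the result."""
    s = sorted(deviations)
    clipped = s[f:len(s) - f] if len(s) > 2 * f else s
    return (clipped[0] + clipped[-1]) / 2.0

# Five synchronization-master deviations, one faulty outlier (+50 us).
print(compress([-2.0, -1.0, 0.0, 1.0, 50.0], f=1))
```

    The design question the paper examines is precisely how such a function behaves for different numbers of inputs and faults, which is where the suboptimal choice was found.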

  13. SU-E-J-32: Calypso(R) and Laser-Based Localization Systems Comparison for Left-Sided Breast Cancer Patients Using Deep Inspiration Breath Hold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, S; Kaurin, D; Sweeney, L

    2014-06-01

    Purpose: Our institution uses a manual laser-based system for primary localization and verification during radiation treatment of left-sided breast cancer patients using deep inspiration breath hold (DIBH). This primary system was compared with sternum-placed Calypso(R) beacons (Varian Medical Systems, CA). Only intact breast patients are considered for this analysis. Methods: During computed tomography (CT) simulation, patients have a BB and Calypso(R) surface beacons positioned sternally and marked for free-breathing and DIBH CTs. During dosimetry planning, the BB longitudinal displacement between the free-breathing and DIBH CTs determines the laser mark (BH mark) location. Calypso(R) beacon locations from the DIBH CT are entered at the Tracking Station. During Linac simulation and treatment, patients inhale until the cross-hair and/or lasers coincide with the BH mark, which can be seen using our high-quality cameras (Pelco, CA). Daily Calypso(R) displacement values (difference from the DIBH-CT-based plan) are recorded. The displacement mean and standard deviation were calculated for each patient (77 patients, 1845 sessions). An aggregate mean and standard deviation were calculated, weighted by the number of patient fractions. Some patients were shifted based on MV ports. A second data set was calculated with Calypso(R) values corrected by these shifts. Results: Mean displacement values indicate agreement within 1±3mm, with improvement for the shifted data (Table). Conclusion: Both the unshifted and shifted data sets show the Calypso(R) system coincides with the laser system within 1±3mm, demonstrating that either localization/verification system will result in similar clinical outcomes. Displacement value uncertainty is unilaterally reduced when shifts are taken into account.
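    The fraction-weighted aggregate statistics described above can be sketched as follows. The per-patient values are hypothetical, and the pooled-variance formula shown is one standard way to combine per-patient means and standard deviations, not necessarily the authors' exact procedure:

```python
import math

def aggregate_stats(patient_stats):
    """Aggregate per-patient displacement means and standard deviations,
    weighted by each patient's number of treatment fractions.
    patient_stats: list of (n_fractions, mean_mm, std_mm)."""
    total_n = sum(n for n, _, _ in patient_stats)
    mean = sum(n * m for n, m, _ in patient_stats) / total_n
    # Pooled variance: within-patient variance plus between-patient spread.
    var = sum(n * (s ** 2 + (m - mean) ** 2) for n, m, s in patient_stats) / total_n
    return mean, math.sqrt(var)

# Three hypothetical patients: (fractions, mean displacement mm, std mm).
stats = [(25, 0.8, 2.9), (20, 1.2, 3.1), (30, 0.9, 2.8)]
mean, std = aggregate_stats(stats)
print(round(mean, 2), round(std, 2))
```

    An aggregate of roughly 1±3 mm, as in this toy example, matches the magnitude of agreement the authors report between the Calypso(R) and laser systems.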

  14. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision 
Diagrams; and Data-flow based Model Analysis.

  15. NASA Lighting Research, Test, & Analysis

    NASA Technical Reports Server (NTRS)

    Clark, Toni

    2015-01-01

    The Habitability and Human Factors Branch at Johnson Space Center in Houston, TX, provides technical guidance for the development of spaceflight lighting requirements, verification of light system performance, analysis of integrated environmental lighting systems, and research of lighting-related human performance issues. The Habitability & Human Factors Lighting Team maintains two physical facilities that are integrated to provide support. The Lighting Environment Test Facility (LETF) provides a controlled darkroom environment for physical verification of lighting systems with photometric and spectrographic measurement systems. The Graphics Research & Analysis Facility (GRAF) maintains the capability for computer-based analysis of operational lighting environments. The combined capabilities of the Lighting Team at Johnson Space Center have been used for a wide range of lighting-related issues.

  16. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the SCR method to testing consistency properties of a partial model of requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintaining fidelity between multiple representations of the same requirements (as they evolve) is still a problem and deserves further study.

  17. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  18. An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.

    2009-12-01

    For several years GFZ Potsdam has been operating a global earthquake monitoring system. Since the beginning of 2008, this system has also been used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS): the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project and capable of acquiring, archiving, and processing real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore not only real-time waveform data but also processing results are routed to the attached warning centers through GFZ. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network - GEVN) in Europe was substantially extended by adding many stations from Western European countries, optimizing the station distribution. In parallel to data collection over the Internet, a GFZ VSAT hub for secured data collection from the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and the first data links were established through this backbone. For the Southeast Asia region, a VSAT hub was established in Jakarta in 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by email and SMS.
Manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy, since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has been shown to work well as a first step in starting a tsunami warning process. However, for a secured assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers all have a reliable 24/7 service.

  19. Fecal Coliform Model Verification Sampling Plan, Winter 2004. Addendum to the Fecal Coliform Total Maximum Daily Load Study Plan for Sinclair and Dyes Inlets

    DTIC Science & Technology

    2004-02-19

    [Extraction fragment: a station-location table for fecal coliform sampling, listing sites such as Port Orchard Blvd (Port Orchard), a Naval Station site near McDonalds (PSNS), Blackjack Creek behind KFC, and Olney Creek; one monitored outfall is a 36-inch-diameter smooth-bore concrete storm water pipe adjacent to McDonalds, near the south side of the drive-through.]

  20. KSC-99pc26

    NASA Image and Video Library

    1999-01-07

    Loral workers at Astrotech, Titusville, Fla., perform an illumination test for circuitry verification on the solar panel of the GOES-L weather satellite. The satellite is to be launched from Cape Canaveral Air Station aboard an Atlas II rocket in late March. The GOES-L is the fourth of a new advanced series of geostationary weather satellites for the National Oceanic and Atmospheric Administration. It is a three-axis inertially stabilized spacecraft that will provide pictures and perform atmospheric sounding at the same time. Once launched, the satellite, to be designated GOES-11, will undergo checkout and provide backup capabilities for the existing, aging GOES East weather satellite.

Top