Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
Guidelines for mission integration, a summary report
NASA Technical Reports Server (NTRS)
1979-01-01
Guidelines are presented for instrument/experiment developers concerning hardware design, flight verification, and operations and mission implementation requirements. Interface requirements between the STS and instruments/experiments are defined. Interface constraints and design guidelines are presented along with integrated payload requirements for Spacelab Missions 1, 2, and 3. Interim data are suggested for use during hardware development until more detailed information is developed when a complete mission and an integrated payload system are defined. Safety requirements, flight verification requirements, and operations procedures are defined.
Instrument Systems Analysis and Verification Facility (ISAVF) users guide
NASA Technical Reports Server (NTRS)
Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.
1985-01-01
The ISAVF facility is primarily an interconnected system of computers, special-purpose real-time hardware, and associated generalized software systems that will permit instrument system analysts, design engineers, and instrument scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special-purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.
The development, verification, and comparison of LC-MS libraries for two manufacturers' instruments, together with a verified protocol, are discussed. The LC-MS library protocol was verified through an inter-laboratory study that involved Federal, State, and private laboratories. ...
Design and development of the 2m resolution camera for ROCSAT-2
NASA Astrophysics Data System (ADS)
Uguen, Gilbert; Luquet, Philippe; Chassat, François
2017-11-01
EADS-Astrium has recently completed the development of a 2m-resolution camera, called RSI (Remote Sensing Instrument), for the small satellite ROCSAT-2, the second component of the long-term space program of the Republic of China. The National Space Program Office of Taiwan selected EADS-Astrium as the Prime Contractor for the development of the spacecraft, including the bus and the main instrument, RSI. The main challenges for the RSI development were to introduce innovative technologies in order to meet the high performance requirements while achieving the design simplicity necessary for the mission (low mass, low power), and to adopt a development and verification approach compatible with the very tight development schedule. This paper describes the instrument design together with the development and verification logic that were implemented to successfully meet these objectives.
Software for imaging phase-shift interference microscope
NASA Astrophysics Data System (ADS)
Malinovski, I.; França, R. S.; Couceiro, I. B.
2018-03-01
In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). The instrument is, by principle of operation, an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colour as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different methodology of phase evaluation is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
NASA Astrophysics Data System (ADS)
Wernham, Denny; Ciapponi, Alessandra; Riede, Wolfgang; Allenspacher, Paul; Era, Fabio; D'Ottavi, Alessandro; Thibault, Dominique
2016-12-01
The Aladin instrument will fly on the European Space Agency's ADM-Aeolus satellite. The instrument is a Doppler wind LIDAR, primarily designed to measure global wind profiles to improve the accuracy of numerical weather prediction models. At the heart of the instrument is a frequency-stabilized 355 nm laser which will emit approximately 100 mJ of energy in the form of 20 ns pulses with a fluence around 1 J/cm2. The pulse repetition frequency is 50 Hz, meaning that Aladin will eventually have to accumulate about 5 Gshots over its planned 3-year lifetime in orbit. Due to anomalies that have occurred on previous spaceborne lasers, as well as a number of failures that we have observed in previous tests, an extensive development and verification campaign was undertaken in order to ensure that the Aladin instrument is robust enough to survive the mission. In this paper, we shall report the logic and the results of this verification campaign.
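The ~5 Gshot figure follows directly from the repetition rate and the planned lifetime; a quick sanity check (the Julian-year length is the only assumption here):

```python
# Rough check of the accumulated shot count quoted for Aladin:
# a 50 Hz pulse repetition frequency sustained over a 3-year lifetime.
SECONDS_PER_YEAR = 365.25 * 86400  # Julian year, in seconds
prf_hz = 50.0
lifetime_years = 3.0

total_shots = prf_hz * lifetime_years * SECONDS_PER_YEAR
print(f"{total_shots:.2e}")  # on the order of 5e9 shots, i.e. ~5 Gshots
```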
GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES
This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...
NASA Astrophysics Data System (ADS)
Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.
2015-11-01
High-range measuring equipment such as laser trackers needs large-dimension calibrated reference artifacts for its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points, and of the reference lengths derived from them, relies on the concept of the indexed metrology platform and on high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. The measuring instrument together with the indexed metrology platform remains still, while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied in this work to a laser tracker. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker prove the suitability of the virtual distances methodology for calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for the definition of reference distances in these procedures.
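The core idea reduces to a small geometric sketch: known indexing rotations of the platform turn one physical point into many virtual points, and the distances between those virtual points serve as reference lengths. The coordinates and the single-axis rotation below are illustrative assumptions, not the authors' actual platform model:

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the platform's indexing axis (row-major)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(R, p):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) for i in range(3))

# A single physical point measured by the instrument, expressed in the
# platform frame (hypothetical coordinates, in mm).
p = (1200.0, 300.0, 150.0)

# Six known 60-degree indexing positions generate six virtual points
# from the one physical point.
virtual_points = [apply(rot_z(math.radians(60 * k)), p) for k in range(6)]

# Any pair of virtual points defines a reference distance that never has
# to be materialized in a physical gauge.
reference_distances = [math.dist(virtual_points[i], virtual_points[j])
                       for i in range(6) for j in range(i + 1, 6)]
print(len(reference_distances))  # 15 candidate reference lengths
```

In the real procedure the platform orientation is known with high accuracy from its kinematic design; here the rotations are taken as exact for illustration.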
Instrumentation Technology. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Squires, Sheila S.
This report provides results of Phase I of a project that researched the occupational area of instrumentation technology, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train instrumentation technicians. Section 1 contains general information: purpose of…
40 CFR 1066.130 - Measurement instrument calibrations and verifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 40 (Protection of Environment), Air Pollution Controls, Vehicle-Testing Procedures; Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications; § 1066.130 Measurement instrument calibrations and verifications. ...
NASA Technical Reports Server (NTRS)
Powell, John D.
2003-01-01
This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.
NASA Technical Reports Server (NTRS)
Homan, D. J.
1977-01-01
A computer program written to calculate the proximity aerodynamic force and moment coefficients of the Orbiter/Shuttle Carrier Aircraft (SCA) vehicles based on flight instrumentation is described. The ground reduced aerodynamic coefficients and instrumentation errors (GRACIE) program was developed as a tool to aid in flight test verification of the Orbiter/SCA separation aerodynamic data base. The program calculates the force and moment coefficients of each vehicle in proximity to the other, using the load measurement system data, flight instrumentation data and the vehicle mass properties. The uncertainty in each coefficient is determined, based on the quoted instrumentation accuracies. A subroutine manipulates the Orbiter/747 Carrier Separation Aerodynamic Data Book to calculate a comparable set of predicted coefficients for comparison to the calculated flight test data.
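The abstract's step of determining "the uncertainty in each coefficient ... based on the quoted instrumentation accuracies" is commonly done by root-sum-square propagation of independent errors. The sketch below assumes a multiplicative coefficient form and hypothetical numbers; it is an illustration of that standard technique, not the GRACIE algorithm itself:

```python
import math

def coefficient_uncertainty(value, inputs):
    """
    Root-sum-square propagation of quoted instrumentation accuracies into a
    derived coefficient, assuming independent errors and a coefficient that
    is a product/quotient of the inputs (a common simplification).

    inputs: list of (measured_value, quoted_accuracy) pairs entering the
    coefficient multiplicatively (e.g. load, dynamic pressure, area).
    """
    rel_var = sum((acc / val) ** 2 for val, acc in inputs)
    return abs(value) * math.sqrt(rel_var)

# Hypothetical example: a force coefficient from a load measurement,
# dynamic pressure, and reference area, each with a quoted accuracy.
load, q, area = 5000.0, 2400.0, 250.0        # N, Pa, m^2 (illustrative)
cf = load / (q * area)
u = coefficient_uncertainty(cf, [(load, 25.0), (q, 12.0), (area, 0.5)])
print(cf, u)
```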
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Ryan M.; Suter, Jonathan D.; Jones, Anthony M.
2014-09-12
This report documents FY14 efforts for two instrumentation subtasks under storage and transportation. These instrumentation tasks relate to developing effective nondestructive evaluation (NDE) methods and techniques to (1) verify the integrity of metal canisters for the storage of used nuclear fuel (UNF) and to (2) verify the integrity of dry storage cask internals.
NASA Technical Reports Server (NTRS)
Keeley, J. T.
1976-01-01
Guidelines and general requirements applicable to the development of instrument flight hardware intended for use on the GSFC Shuttle Scientific Payloads Program are given. Criteria, guidelines, and an organized approach to specifying the appropriate level of requirements for each instrument are included, permitting development at minimum cost while still assuring crew safety. It is recognized that the instruments for these payloads will encompass wide ranges of complexity, cost, development risk, and safety hazards. The flexibility required to adapt the controls, documentation, and verification requirements to the specific instrument is provided.
DEMONSTRATION AND QUALITY ASSURANCE PROJECT ...
A demonstration of field portable/mobile technologies for measuring trace elements in soil and sediments was conducted under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation (SITE) Program. The demonstration took place from January 24 to 28, 2005, at the Kennedy Athletic, Recreational and Social Park at Kennedy Space Center on Merritt Island, Florida. The purpose of the demonstration was to verify the performance of various instruments that employ X-ray fluorescence (XRF) measurement technologies for the determination of 13 toxic elements in a variety of soil and sediment samples. Instruments from the following technology developers were demonstrated:
o Innov-X Systems, Inc.
o NITON LLC (2 instruments)
o Oxford Instruments Portable Division (formerly Metorex, Inc.)
o Oxford Instruments Analytical
o Rigaku, Inc.
o RONTEC USA Inc.
o Xcalibur XRF Services Inc. (Division of Elvatech Ltd.)
This demonstration plan describes the procedures that will be used to verify the performance and cost of the XRF instruments provided by these technology developers. The plan incorporates the quality assurance and quality control elements needed to generate data of sufficient quality to perform this verification. A separate innovative technology verification report (ITVR) will be prepared for each instrument. The objective of this program is to promote the acceptance and use of innovative field technologies by providing well-documented perfor...
The OCMA-350 Oil Content Analyzer(OCMA-350) developed by Horiba Instruments Incorporated (Horiba), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Huen...
A digital flight control system verification laboratory
NASA Technical Reports Server (NTRS)
De Feo, P.; Saib, S.
1982-01-01
A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. To enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility that analyzes a representative testbed of DFCS software. Future investigations will focus on increasing the number of software test tools and on assessing cost effectiveness.
Review of measurement instruments in clinical and research ethics, 1999–2003
Redman, B K
2006-01-01
Every field of practice has the responsibility to evaluate its outcomes and to test its theories. Evidence of the underdevelopment of measurement instruments in bioethics suggests that attending to strengthening existing instruments and developing new ones will facilitate the interpretation of accumulating bodies of research as well as the making of clinical judgements. A review of 65 instruments reported in the published literature showed 10 with even a minimal level of psychometric data. Two newly developed instruments provide examples of the full use of psychometric and ethical theory. Bioethicists use a wide range of methods for knowledge development and verification; each method should meet stringent standards of quality. PMID:16507659
Application of Lightweight Formal Methods to Software Security
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt
2005-01-01
Formal specification and verification of security has proven a challenging task. No single method has proven feasible. Instead, an integrated approach that combines several formal techniques can increase confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), is described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property-Based Tester (PBT) uses TASPEC and a Test Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.
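Property-based testing in the abstract's sense checks that stated security properties hold on the executing program. A minimal stdlib-only illustration of the style is below; the sanitizer and the property are toy stand-ins chosen for this sketch, unrelated to TASPEC or the TEM:

```python
import random
import string

def sanitize(s: str) -> str:
    """Toy function under test: strip characters that could escape a
    shell-quoted context (illustrative stand-in for real code under test)."""
    return "".join(ch for ch in s if ch not in ";|&`$")

def holds_no_metacharacters(out: str) -> bool:
    """Security property: sanitized output contains no shell metacharacters."""
    return not any(ch in out for ch in ";|&`$")

# Minimal property-based driver: generate random inputs, check the property
# on every output of the function under test.
random.seed(0)
alphabet = string.ascii_letters + string.digits + ";|&`$ -_./"
for _ in range(10_000):
    s = "".join(random.choice(alphabet) for _ in range(random.randrange(0, 40)))
    assert holds_no_metacharacters(sanitize(s)), f"property violated for {s!r}"
print("property held on 10000 random inputs")
```

Real property-based testers instrument the program and monitor property formulas during execution; the loop above only conveys the generate-and-check idea.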
Environmental Technology Verification Report for Instrumentation Northwest, Inc., Aquistar® TempHion Smart Sensor and Datalogger Nitrate-specific Ion-selective Electrode for Groundwater Remediation Monitoring
NASA Astrophysics Data System (ADS)
Valenziano, L.; Gregorio, A.; Butler, R. C.; Amiaux, J.; Bonoli, C.; Bortoletto, F.; Burigana, C.; Corcione, L.; Ealet, A.; Frailis, M.; Jahnke, K.; Ligori, S.; Maiorano, E.; Morgante, G.; Nicastro, L.; Pasian, F.; Riva, M.; Scaramella, R.; Schiavone, F.; Tavagnacco, D.; Toledo-Moreo, R.; Trifoglio, M.; Zacchei, A.; Zerbi, F. M.; Maciaszek, T.
2012-09-01
Euclid is a future ESA mission, mainly devoted to cosmology. Like WMAP and Planck, it is a survey mission, to be launched in 2019 and injected into an orbit far from the Earth, for a nominal lifetime of 7 years. Euclid has two instruments on board, the Visible Imager (VIS) and the Near-Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active thermal control, and a high-performance Data Processing Unit, and requires periodic in-flight calibrations and monitoring of instrument parameters. To fully exploit the capability of the NISP, careful control of systematic effects is required. From previous experiments, we have built the concept of an integrated instrument development and verification approach, in which scientific, instrument, and ground-segment expertise interact strongly from the early phases of the project. In particular, we discuss the tight integration of test and calibration activities with the Ground Segment, starting from early pre-launch verification activities. We report here the expertise acquired by the Euclid team in previous missions, citing the literature only for detailed reference, and indicate how it is applied in the Euclid mission framework.
Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar
2014-01-01
During the last years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue to carry out these types of procedures. PMID:24451458
Integration and verification testing of the Large Synoptic Survey Telescope camera
NASA Astrophysics Data System (ADS)
Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.
2016-08-01
We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Laboratory (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5-degree-diameter field of view. LSST Camera integration and test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for integration and test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the anticipated technical challenges.
Computer aided statistical process control for on-line instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meils, D.E.
1995-01-01
On-line chemical process instrumentation historically has been used for trending. Recent technological advances have improved the accuracy and reliability of on-line instrumentation. However, little attention has been given to validating and verifying it. This paper presents two practical approaches for validating instrument performance by comparing an on-line instrument's response to that of either a portable instrument or a bench instrument. Because comparing the performance of two instruments requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.
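The comparison described, an on-line instrument checked against a portable or bench reference, is often reduced to a control chart on the paired differences. A minimal sketch with the standard library follows; the mean ± 3σ scheme and the readings are illustrative assumptions, not Lab Stats Pack's actual method:

```python
import statistics

def comparison_limits(diffs, k=3.0):
    """
    Control-chart-style limits for the difference between an on-line
    instrument and a reference instrument, from historical paired readings.
    Simple mean +/- k*sigma limits (illustrative choice).
    """
    center = statistics.mean(diffs)
    sigma = statistics.stdev(diffs)
    return center - k * sigma, center, center + k * sigma

def in_control(new_diff, limits):
    lo, _, hi = limits
    return lo <= new_diff <= hi

# Paired readings: (on-line, reference), hypothetical values.
pairs = [(10.2, 10.0), (9.9, 10.1), (10.4, 10.2), (10.0, 9.9),
         (10.1, 10.0), (9.8, 10.0), (10.3, 10.1), (10.0, 10.2)]
diffs = [a - b for a, b in pairs]
limits = comparison_limits(diffs)
print(in_control(10.8 - 10.0, limits))  # a large new difference flags drift
```

A new paired reading whose difference falls outside the limits signals that the on-line instrument should be recalibrated or investigated.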
Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...
2014-01-01
This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
Hubble Space Telescope Fine Guidance Sensors Instrument Handbook, version 4.0
NASA Technical Reports Server (NTRS)
Holfeltz, S. T. (Editor)
1994-01-01
This is a revised version of the Hubble Space Telescope Fine Guidance Sensor Instrument Handbook. The main goal of this edition is to help the potential General Observer (GO) learn how to use the Fine Guidance Sensors (FGS's) most efficiently. First, the actual performance of the FGS's as scientific instruments is reviewed. Next, each of the available operating modes of the FGS's is reviewed in turn. The status and findings of pertinent calibrations, including Orbital Verification, Science Verification, and Instrument Scientist Calibrations, are included, as well as the relevant data reduction software.
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...
Berens, Angelique M; Harbison, Richard Alex; Li, Yangming; Bly, Randall A; Aghdasi, Nava; Ferreira, Manuel; Hannaford, Blake; Moe, Kris S
2017-08-01
To develop a method to measure intraoperative surgical instrument motion. This model will be applicable to the study of surgical instrument kinematics, including surgical training, skill verification, and the development of surgical warning systems that detect aberrant instrument motion that may result in patient injury. We developed an algorithm to automate derivation of surgical instrument kinematics in an endoscopic endonasal skull base surgery model. Surgical instrument motion was recorded during a cadaveric endoscopic transnasal approach to the pituitary using a navigation system modified to record intraoperative time-stamped Euclidean coordinates and Euler angles. Microdebrider tip coordinates and angles were referenced to the cadaver's preoperative computed tomography scan, allowing us to assess surgical instrument kinematics over time. A representative cadaveric endoscopic endonasal approach to the pituitary was performed to demonstrate the feasibility of our algorithm for deriving surgical instrument kinematics. Technical feasibility of automatically measuring intraoperative surgical instrument motion and deriving kinematics measurements was demonstrated using standard navigation equipment.
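Deriving kinematics from time-stamped coordinates amounts to numerical differentiation of the tracked tip position. A minimal sketch follows; the central-difference scheme and the sample track are illustrative assumptions, not the authors' algorithm:

```python
import math

def finite_difference_speeds(samples):
    """
    Derive instrument-tip speed from time-stamped Euclidean coordinates
    using central differences over the neighbouring samples.

    samples: list of (t, (x, y, z)) with t in seconds, coordinates in mm.
    Returns a list of (t, speed_mm_per_s) for the interior samples.
    """
    speeds = []
    for i in range(1, len(samples) - 1):
        (t0, p0), (t2, p2) = samples[i - 1], samples[i + 1]
        speeds.append((samples[i][0], math.dist(p0, p2) / (t2 - t0)))
    return speeds

# Hypothetical tracked tip positions (e.g. a navigation system at 10 Hz).
track = [(0.0, (0.0, 0.0, 0.0)),
         (0.1, (1.0, 0.0, 0.0)),
         (0.2, (2.0, 0.0, 0.0)),
         (0.3, (3.0, 0.0, 0.0))]
print(finite_difference_speeds(track))  # constant 10 mm/s along x
```

Higher derivatives (acceleration, jerk) and angular rates from the recorded Euler angles follow the same differencing pattern.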
ALADIN: the first european lidar in space
NASA Astrophysics Data System (ADS)
Morançais, Didier; Fabre, Frédéric; Schillinger, Marc; Barthès, Jean-Claude; Endemann, Martin; Culoma, Alain; Durand, Yannig
2017-11-01
The Atmospheric LAser Doppler INstrument (ALADIN) is the payload of the ESA's ADM-Aeolus mission, which aims at measuring wind profiles as required by climatology and meteorology users. ALADIN belongs to a new class of Earth Observation payloads and will be the first European lidar in space. The instrument comprises a diode-pumped high-energy Nd:YAG laser and a direct detection receiver operating on aerosol and molecular backscatter signals in parallel. In addition to the Proto-Flight Model (PFM), two instrument models are developed: a Pre-development Model (PDM) and an Opto-Structure-Thermal Model (OSTM). The flight instrument design and the industrial team have been finalised and the major equipment is now under development. This paper describes the instrument design and performance as well as the development and verification approach. The main results obtained during the PDM programme are also reported. The ALADIN instrument is developed under prime contractorship of EADS Astrium SAS with a consortium of thirty European companies.
DOT National Transportation Integrated Search
2016-08-01
The primary objectives of this research include: performing static and dynamic load tests on newly instrumented test piles to better understand the set-up mechanism for individual soil layers, verifying or recalibrating previously developed empir...
Calibration of a Computer Based Instrumentation for Flight Research
NASA Technical Reports Server (NTRS)
Forsyth, T. J.; Reynolds, R. S. (Technical Monitor)
1997-01-01
NASA Ames Research Center has been investigating a Differential Global Positioning System (DGPS) for future use as a Category II/III landing system. The DGPS navigation system was developed and installed on a B200 King Air aircraft. Instrumentation that is not calibrated and verified as a total operating system can have errors or fail to work correctly. Systems need to be checked for cross talk and verified to work together accurately. It is imperative that the instrumentation and computer do not affect the aircraft avionics and instrumentation needed for aircraft operation. This paper discusses calibration and verification principles for a computer-based airborne instrumentation system.
First Cryo-Vacuum Test of the JWST Integrated Science Instrument Module
NASA Astrophysics Data System (ADS)
Kimble, Randy A.; Antonille, S. R.; Balzano, V.; Comber, B. J.; Davila, P. S.; Drury, M. D.; Glasse, A.; Glazer, S. D.; Lundquist, R.; Mann, S. D.; McGuffey, D. B.; Novo-Gradac, K. J.; Penanen, K.; Ramey, D. D.; Sullivan, J.; Van Campen, J.; Vila, M. B.
2014-01-01
The integration and test program for the Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope (JWST) calls for three cryo-vacuum tests of the ISIM hardware. The first is a risk-reduction test aimed at checking out the test hardware and procedures; this will be followed by two formal verification tests that will bracket other key aspects of the environmental test program (e.g. vibration and acoustics, EMI/EMC). The first of these cryo-vacuum tests, the risk-reduction test, was executed at NASA's Goddard Space Flight Center starting in late August 2013. Flight hardware under test included two (of the eventual four) flight instruments, the Mid-Infrared Instrument (MIRI) and the Fine Guidance Sensor/Near-Infrared Imager and Slitless Spectrograph (FGS/NIRISS), mounted to the ISIM structure, as well as the ISIM Electronics Compartment (IEC). The instruments were cooled to their flight operating temperatures (~40 K for FGS/NIRISS, ~6 K for MIRI) and optically tested against a cryo-certified telescope simulator. Key goals for the risk-reduction test included: 1) demonstration of controlled cooldown and warmup, stable control at operating temperature, and measurement of heat loads, 2) operation of the science instruments with ISIM electronics systems at temperature, 3) health trending of the science instruments against instrument-level test results, 4) measurement of the pupil positions and six-degree-of-freedom alignment of the science instruments against the simulated telescope focal surface, 5) detailed optical characterization of the NIRISS instrument, 6) verification of the signal-to-noise performance of the MIRI, and 7) exercise of the Onboard Script System that will be used to operate the instruments in flight. In addition, the execution of the test is expected to yield invaluable logistical experience - development and execution of procedures, communications, analysis of results - that will greatly benefit the subsequent verification tests.
At the time of this submission, the hardware had reached operating temperature and was partway through the cryo test program. We report here on the test configuration, the overall process, and the results obtained to date.
NASA Technical Reports Server (NTRS)
Cleveland, Paul E.; Parrish, Keith A.
2005-01-01
A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale of the observatory, which allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission-enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large-scale observatory features, which make passive cooling viable, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, combined with a mission thermal concept having little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly.
After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.
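The independent-model cross-check described in this abstract can be sketched in a few lines: two analysis teams produce temperature predictions for the same thermal nodes, and agreement within the cryogenic design margin builds confidence in both models. The node names, temperatures, and 2 K margin below are illustrative assumptions, not JWST data.

```python
# Hypothetical sketch of an independent thermal-model cross-check: flag any
# node where two independently built models disagree by more than the margin.

def cross_check(model_a, model_b, margin_k):
    """Return the nodes where the two models disagree by more than margin_k."""
    discrepant = []
    for node in model_a:
        if abs(model_a[node] - model_b[node]) > margin_k:
            discrepant.append(node)
    return discrepant

# Illustrative predictions (kelvin) from two hypothetical analysis teams.
team_a = {"sunshield_layer5": 48.2, "isim_radiator": 36.1, "miri_stage": 6.3}
team_b = {"sunshield_layer5": 49.0, "isim_radiator": 36.4, "miri_stage": 6.2}

print(cross_check(team_a, team_b, margin_k=2.0))  # agrees within margin -> []
```

A tighter margin would flag the sunshield node, prompting the teams to reconcile modeling assumptions before test.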
ERIC Educational Resources Information Center
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien
2016-01-01
The purpose of this study was to develop a computer-based measure of elementary students' science talk and to report students' benchmarks. The development procedure had three steps: defining the framework of the test, collecting and identifying key reference sets of science talk, and developing and verifying the science talk instrument. The…
Input apparatus for dynamic signature verification systems
EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.
1978-01-01
The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.
NASA Technical Reports Server (NTRS)
Feinberg, L.; Wilson, M.
1993-01-01
To correct for the spherical aberration in the Hubble Space Telescope primary mirror, five anamorphic aspheric mirrors representing correction for three scientific instruments have been fabricated as part of the development of the corrective-optics space telescope axial-replacement instrument (COSTAR). During the acceptance tests of these mirrors at the vendor, a quick and simple method for verifying the asphere surface figure was developed. The technique has been used on three of the aspheres relating to the three instrument prescriptions. Results indicate that the three aspheres are correct to the limited accuracy expected of this test.
Laser light scattering instrument advanced technology development
NASA Technical Reports Server (NTRS)
Wallace, J. F.
1993-01-01
The objective of this advanced technology development (ATD) project has been to provide sturdy, miniaturized laser light scattering (LLS) instrumentation for use in microgravity experiments. To do this, we assessed user requirements, explored the capabilities of existing and prospective laser light scattering hardware, and both coordinated and participated in the hardware and software advances needed for a flight hardware instrument. We have successfully breadboarded and evaluated an engineering version of a single-angle glove-box instrument which uses solid state detectors and lasers, along with fiber optics, for beam delivery and detection. Additionally, we have provided the specifications and written verification procedures necessary for procuring a miniature multi-angle LLS instrument which will be used by the flight hardware project which resulted from this work and from this project's interaction with the laser light scattering community.
NASA Astrophysics Data System (ADS)
Rausch, Peter; Verpoort, Sven; Wittrock, Ulrich
2017-11-01
Concepts for future large space telescopes require an active optics system to mitigate aberrations caused by thermal deformation and gravitational release. Such a system would allow on-site correction of wave-front errors and ease the requirements for thermal and gravitational stability of the optical train. In the course of the ESA project "Development of Adaptive Deformable Mirrors for Space Instruments" we have developed a unimorph deformable mirror designed to correct for low-order aberrations and dedicated to be used in space environment. We briefly report on design and manufacturing of the deformable mirror and present results from performance verifications and environmental testing.
NASA Astrophysics Data System (ADS)
Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.
2013-05-01
Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly, long before the real hardware is available. Even when real hardware is available, verifying software fault-tolerance mechanisms can be difficult, since faulty situations must be brought about systematically and artificially, which can be impossible on real hardware. To solve this problem the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This makes it possible to run the exact same target binary software as runs on the physical system in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.
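The core idea of fault injection on a virtual platform can be illustrated with a toy harness: the same target logic runs against a simulated memory, and the harness deterministically flips a bit on a chosen access to check that the fault-detection path fires. The memory model, parity scheme, and values below are invented for illustration and are not the SRG implementation.

```python
# Toy fault-injection harness: corrupt one memory read deterministically and
# verify that a simple parity check detects the single-bit upset.

class FaultyMemory:
    def __init__(self, data, inject_at=None, bit=0):
        self.data = list(data)
        self.inject_at = inject_at   # address whose read is corrupted
        self.bit = bit

    def read(self, addr):
        word = self.data[addr]
        if addr == self.inject_at:
            word ^= (1 << self.bit)  # deterministic single-bit upset
        return word

def checked_read(mem, addr, parity):
    """Return (value, fault_detected) using a per-word parity recorded at load."""
    word = mem.read(addr)
    detected = (bin(word).count("1") % 2) != parity[addr]
    return word, detected

mem = FaultyMemory([0b1010, 0b1100], inject_at=1, bit=0)
parity = [0, 0]  # even parity recorded for each word at load time
print(checked_read(mem, 0, parity))  # clean read: fault_detected is False
print(checked_read(mem, 1, parity))  # injected read: fault_detected is True
```

Because the injection point is chosen by the harness rather than by chance, the same campaign can be replayed exactly, which is the property that makes virtual-platform testing deterministic.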
Structural Safety of a Hubble Space Telescope Science Instrument
NASA Technical Reports Server (NTRS)
Lou, M. C.; Brent, D. N.
1993-01-01
This paper gives an overview of safety requirements related to structural design and verification of payloads to be launched and/or retrieved by the Space Shuttle. To demonstrate the general approach used to implement these requirements in the development of a typical Shuttle payload, the Wide Field/Planetary Camera II, a second-generation science instrument currently being developed by the Jet Propulsion Laboratory (JPL) for the Hubble Space Telescope, is used as an example. In addition to verification of strength and dynamic characteristics, special emphasis is placed upon the fracture control implementation process, including parts classification and fracture control acceptability.
Workgroup for Hydraulic Laboratory Testing and Verification of Hydroacoustic Instrumentation
Fulford, Janice M.; Armstrong, Brandy N.; Thibodeaux, Kirk G.
2015-01-01
An international workgroup was recently formed for hydraulic laboratory testing and verification of hydroacoustic instrumentation used for water velocity measurements. The activities of the workgroup have included one face-to-face meeting, conference calls, and an inter-laboratory exchange of two acoustic meters among participating laboratories. Good agreement was found among four laboratories at higher tow speeds and poorer agreement at the lowest tow speed.
On-ground tests of the NISP infrared spectrometer instrument for Euclid
NASA Astrophysics Data System (ADS)
Jomni, Cyril; Ealet, Anne; Gillard, William; Prieto, Éric; Grupp, Frank U.
2017-09-01
Euclid is an ESA mission dedicated to understanding the acceleration of the expansion of the Universe. The mission will measure hundreds of millions of galaxies in spectrophotometry and photometry in the near infrared with a spectrophotometer called NISP. This instrument will be assembled and tested in Marseille. To prepare the on-ground test plan and develop the test procedure, we have used simulated PSF images based on a Zemax optical design of the instrument. We have developed the analysis tools that will be further used to build the verification procedure. We present here the method and analysis results for adjusting the focus of the instrument. In particular, we show that, because of the sampling of the PSF, a dithering strategy must be adopted, which will constrain the development of the test plan.
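A common way to adjust focus from through-focus PSF measurements, in the spirit of the analysis described above, is to sample a sharpness metric at several focus positions and refine the best position by three-point parabolic interpolation around the discrete minimum. The positions and metric values below are simulated, not NISP test data.

```python
# Three-point parabolic refinement of best focus from a through-focus scan.

def refine_focus(z, m):
    """Parabolic interpolation around the discrete best-focus sample
    (uniform spacing assumed; smaller metric = sharper PSF)."""
    i = m.index(min(m))
    denom = m[i - 1] - 2 * m[i] + m[i + 1]
    step = z[i + 1] - z[i]
    return z[i] + 0.5 * step * (m[i - 1] - m[i + 1]) / denom

z = [-0.2, -0.1, 0.0, 0.1, 0.2]               # focus positions (mm, simulated)
m = [0.0529, 0.0169, 0.0009, 0.0049, 0.0289]  # PSF width metric at each position
print(round(refine_focus(z, m), 4))           # best focus lies between samples
```

When the PSF is undersampled, each metric value must itself be built from dithered exposures, which is why the abstract ties the dithering strategy to the focus procedure.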
ETV works in partnership with recognized standards and testing organizations and stakeholder groups consisting of regulators, buyers, and vendor organizations, with the full participation of individual technology developers. The program evaluates the performance of innovative
NASA Technical Reports Server (NTRS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.;
2016-01-01
NASA's James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full field, cryogenic JWST telescope simulator. SI performance, including alignment and wave front error, were evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.
2016-09-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
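A small sketch can illustrate the limits-and-constraints checking that a test-planning tool like the one described above must perform: before an observation script is released, each commanded value is checked against configuration-controlled hardware limits. The limit names and values here are invented, not ISIM/OSIM parameters.

```python
# Hypothetical pre-release limit check for planned observation commands.

def check_limits(commands, limits):
    """Return the (name, value) pairs that violate their (lo, hi) limit."""
    violations = []
    for name, value in commands.items():
        lo, hi = limits[name]
        if not (lo <= value <= hi):
            violations.append((name, value))
    return violations

# Illustrative configuration-controlled limits and one planned observation.
limits = {"fsm_tilt_arcsec": (-30.0, 30.0), "detector_temp_k": (35.0, 42.0)}
planned = {"fsm_tilt_arcsec": 12.0, "detector_temp_k": 44.5}
print(check_limits(planned, limits))  # the over-temperature command is flagged
```

Keeping the limits table under configuration control, separate from the scripts, is what allows rapid re-planning without re-auditing every observation by hand.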
PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory
NASA Astrophysics Data System (ADS)
Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.
2018-02-01
PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto, a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks and can also be equipped with narrowband filters. The instrument was delivered to the observatory in October 2014 and was commissioned at both telescopes between November 2014 and June 2015. Science verification at the 2.2 m telescope was carried out during the second semester of 2015, and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests, and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was done at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.
NASA Astrophysics Data System (ADS)
Misnasanti; Dien, C. A.; Azizah, F.
2018-03-01
This study is aimed to describe Lesson Study (LS) activity and its roles in the development of mathematics learning instruments based on Learning Trajectory (LT). This study is a narrative study of teacher’s experiences in joining LS activity. Data collecting in this study will use three methods such as observation, documentations, and deep interview. The collected data will be analyzed with Milles and Huberman’s model that consists of reduction, display, and verification. The study result shows that through LS activity, teachers know more about how students think. Teachers also can revise their mathematics learning instrument in the form of lesson plan. It means that LS activity is important to make a better learning instruments and focus on how student learn not on how teacher teach.
Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program
NASA Technical Reports Server (NTRS)
Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby
2017-01-01
Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a tabletop exercise held at Sandia National Laboratories, Albuquerque, on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP), presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but it requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, and one the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
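The pre-load idea behind the template-based ZKP approach can be shown with a toy model: detectors are preloaded with the complement of the reference warhead's expected counts, so a matching object drives every channel to the same known total and the inspector learns only pass/fail, never the underlying count pattern. All numbers below are illustrative; real implementations must also handle counting statistics, which this sketch ignores.

```python
# Toy complement-preload check in the spirit of template-based ZKP verification.

def preload(reference_counts, total):
    """Preloads chosen so that reference + preload = total in every channel."""
    return [total - c for c in reference_counts]

def inspect(object_counts, preloads, total, tolerance=0):
    """Pass only if every channel reaches the agreed uniform total."""
    final = [p + c for p, c in zip(preloads, object_counts)]
    return all(abs(f - total) <= tolerance for f in final)

reference = [40, 25, 10]          # counts the accepted reference item produces
pre = preload(reference, total=100)

print(inspect([40, 25, 10], pre, 100))  # genuine item -> True
print(inspect([40, 5, 30], pre, 100))   # different internals -> False
```

Note how the pass/fail output reveals nothing about `reference` itself; the trust problem the panel identified lies in authenticating the reference item used to build the preloads.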
Automated verification of flight software. User's manual
NASA Technical Reports Server (NTRS)
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Shiue, J.; Oneill, P.; Wang, J.; Fuchs, J.; Owe, M.
1984-01-01
The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three-beam push-broom L-band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras, and an onboard navigational instrument. Ten flights were made over agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.
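The time-series similarity mentioned above is typically quantified with a correlation coefficient; a minimal sketch follows, with invented sample values standing in for the microwave and evaporation series.

```python
# Pearson correlation between two co-registered time series (pure Python).

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Illustrative values only: soil-moisture-like fractions vs. evaporation rates.
microwave = [0.32, 0.28, 0.25, 0.21, 0.30, 0.27]
evaporation = [4.1, 3.6, 3.2, 2.8, 3.9, 3.5]
print(round(pearson(microwave, evaporation), 3))  # close to 1 for similar series
```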
A risk analysis approach applied to field surveillance in utility meters in legal metrology
NASA Astrophysics Data System (ADS)
Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.
2018-03-01
Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies subject to self-verification in Brazil. They play a major role in the economy, since electricity, gas, and water are the main inputs to industries in their production processes. To optimize the resources allocated to control these devices, the present study applied a risk analysis in order to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.
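A risk analysis of this kind typically scores each manufacturer as likelihood of nonconformity times impact and directs surveillance effort at the highest scores. The sketch below uses that generic scheme with invented names and scores; it is not the scoring model of the study.

```python
# Illustrative risk ranking: score = nonconformity likelihood x economic impact.

def rank_by_risk(manufacturers):
    """Return (name, score) pairs sorted from highest to lowest risk."""
    scored = [(name, p * impact) for name, (p, impact) in manufacturers.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)

candidates = {
    "meter_maker_a": (0.10, 8),   # (nonconformity rate, impact on a 1-10 scale)
    "meter_maker_b": (0.40, 6),
    "meter_maker_c": (0.05, 9),
}
for name, score in rank_by_risk(candidates):
    print(name, round(score, 2))  # highest-risk manufacturer printed first
```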
Development of a Hampton University Program for Novel Breast Cancer Imaging and Therapy Research
2013-04-01
intracavitary brachytherapy procedures during laboratory pre-clinical imaging and dosimetry equipment testing, calibration and data processing, in collaboration... electronics and detector instrumentation development; 4) breast phantom construction and implantation; 5) laboratory pre-clinical device testing...such as the ionization chamber, diode, radiographic verification 6 films and thermoluminescent dosimeters ( TLD ) but the scintillator fiber detectors
Replacement Technologies for Precision Cleaning of Aerospace Hardware for Propellant Service
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1997-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-l13- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. Replacement technologies are being investigated for aerospace hardware and for gauges and instrumentation. This paper includes the findings of investigations of aqueous cleaning and verification of aerospace hardware using known contaminants, such as hydraulic fluid and commonly used oils. The results correlate nonvolatile residue with CFC 113. The studies also include enhancements to aqueous sampling for organic and particulate contamination. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon 225 (HCFC 225), HCFC 141b, HFE 7100(R), and Vertrel MCA(R) was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC 113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autogenous ignition and liquid oxygen mechanical impact testing.
NASA Technical Reports Server (NTRS)
Norman, I.; Rochelle, W. C.; Kimbrough, B. S.; Ritrivi, C. A.; Ting, P. C.; Dotts, R. L.
1982-01-01
Thermal performance verification of Reusable Surface Insulation (RSI) has been accomplished by comparisons of STS-2 Orbiter Flight Test (OFT) data with Thermal Math Model (TMM) predictions. The OFT data were obtained from Development Flight Instrumentation RSI plug and gap thermocouples. Quarter-tile RSI TMMs were developed using measured flight data for surface temperature and pressure environments. Reference surface heating rates, derived from surface temperature data, were multiplied by gap heating ratios to obtain tile sidewall heating rates. This TMM analysis resulted in good agreement of predicted temperatures with flight data for thermocouples located in the RSI, Strain Isolation Pad, filler bar, and structure.
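The heating-rate bookkeeping described above reduces to a simple scaling: a reference surface heating rate derived from surface-temperature data is multiplied by a gap heating ratio to estimate tile-sidewall heating. The rate and ratios below are illustrative assumptions, not STS-2 flight data.

```python
# Sidewall heating estimate: reference surface rate scaled by gap heating ratio.

def sidewall_heating(reference_rate, gap_ratio):
    return reference_rate * gap_ratio

q_ref = 2.5                    # assumed reference surface heating rate (Btu/ft^2-s)
ratios = [0.30, 0.15, 0.05]    # assumed gap heating ratio vs. depth into the gap
print([sidewall_heating(q_ref, r) for r in ratios])  # rates fall off with depth
```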
NASA Technical Reports Server (NTRS)
Wu, S. T.
1987-01-01
The goal for the SAMEX magnetograph's optical system is to accurately measure the polarization state of sunlight in a narrow spectral bandwidth over the field of view of an active region to make an accurate determination of the magnetic field in that region. The instrumental polarization is characterized. The optics and coatings were designed to minimize this spurious polarization introduced by foreoptics. The method developed to calculate the instrumental polarization of the SAMEX optics is described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jankovic, John; Zontek, Tracy L.; Ogle, Burton R.
2015-01-27
We examined the calibration records of two direct-reading instruments designated as condensation particle counters in order to determine the number of times they were found to be out of tolerance at annual manufacturer's recalibration. Both instruments were found to be out of tolerance more times than within tolerance, and it was concluded that annual calibration alone was insufficient to provide operational confidence in an instrument's response. Thus, a method based on subsequent agreement with data gathered from a newly calibrated instrument was developed to confirm operational readiness between annual calibrations, hereafter referred to as bump testing. The method consists of measuring source particles produced by a gas grille spark igniter in a gallon-size jar. Sampling from this chamber with a newly calibrated instrument to determine the calibrated response over the particle concentration range of interest serves as a reference. Agreement between this reference response and subsequent responses at later dates implies that the instrument is performing as it was at the time of calibration. Side-by-side sampling allows the level of agreement between two or more instruments to be determined. This is useful when simultaneously collected data are compared for differences, i.e., background with process aerosol concentrations. A reference set of data was obtained using the spark igniter. The generation system was found to be reproducible and suitable to form the basis of calibration verification. Finally, the bump test is simple enough to be performed periodically throughout the calibration year or prior to field monitoring.
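The acceptance logic of such a bump test can be sketched as a tolerance-band comparison against the reference response recorded just after calibration. The counts and the +/-20% agreement band below are illustrative assumptions, not values from the study.

```python
# Toy bump-test acceptance check: field counts must track the reference
# response within a chosen ratio band at every test concentration.

def bump_test(reference_counts, field_counts, band=0.20):
    """Pass if every paired count ratio is within 1 +/- band of the reference."""
    for ref, field in zip(reference_counts, field_counts):
        ratio = field / ref
        if abs(ratio - 1.0) > band:
            return False
    return True

reference = [1200, 5400, 9800]   # counts at three chamber concentrations
print(bump_test(reference, [1150, 5600, 9500]))  # small drift -> passes
print(bump_test(reference, [700, 3100, 6000]))   # out of band -> fails
```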
ExoMars Raman laser spectrometer breadboard overview
NASA Astrophysics Data System (ADS)
Díaz, E.; Moral, A. G.; Canora, C. P.; Ramos, G.; Barcos, O.; Prieto, J. A. R.; Hutchinson, I. B.; Ingley, R.; Colombo, M.; Canchal, R.; Dávila, B.; Manfredi, J. A. R.; Jiménez, A.; Gallego, P.; Pla, J.; Margoillés, R.; Rull, F.; Sansano, A.; López, G.; Catalá, A.; Tato, C.
2011-10-01
The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments on the ExoMars mission, within ESA's Aurora Exploration Programme. The RLS instrument will perform Raman spectroscopy on crushed powdered samples deposited in a small container after crushing of the cores obtained by the Rover's drill system. In response to ESA requirements for a delta-PDR to be held in mid-2012, an instrument breadboard (BB) programme was carried out by the RLS Assembly, Integration and Verification (AIV) team during late 2010 and throughout 2011 to achieve Technology Readiness Level 5 (TRL5). The RLS instrument is currently being developed, pending its Conceptual Design Review (CoDR) with ESA in October 2011. A fully operative breadboard, composed of different unit and sub-unit breadboards, is planned to demonstrate the end-to-end performance of the flight-representative units by Q4 2011.
Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1995-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.
2015-10-01
Hawaii; HASP Health and Safety Plan; IDA Institute for Defense Analyses; IVS Instrument Verification Strip; m meter; mm millimeter; MPV man portable... Use of the ArcSecond laser ranger was impractical due to the requirement to maintain line-of-sight for three rovers and tedious calibration. The SERDP... within 0.1 m spacing and 99% within 0.15 m; repeatability of Instrument Verification Strip (IVS) survey; amplitude of EM anomaly
STORM-SEWER FLOW MEASUREMENT AND RECORDING SYSTEM.
Kilpatrick, Frederick A.; Kaehrle, William R.
1986-01-01
A comprehensive study and development of instruments and techniques for measuring all components of flow in a storm-sewer drainage system were undertaken by the U.S. Geological Survey under the sponsorship of the FHWA. The study involved laboratory and field calibration and testing of measuring flumes, pipe insert meters, weirs, and electromagnetic velocity meters, as well as the development and calibration of pneumatic bubbler and pressure transducer head-measuring systems. Tracer dilution and acoustic-flowmeter measurements were used in field verification tests. A single micrologger was used to record data from all the instruments and also to activate on command the electromagnetic velocity meter and tracer dilution systems.
The AAO fiber instrument data simulator
NASA Astrophysics Data System (ADS)
Goodwin, Michael; Farrell, Tony; Smedley, Scott; Heald, Ron; Heijmans, Jeroen; De Silva, Gayandhi; Carollo, Daniela
2012-09-01
The fiber instrument data simulator is an in-house software tool that simulates detector images of fiber-fed spectrographs developed by the Australian Astronomical Observatory (AAO). In addition to helping validate the instrument designs, the resulting simulated images are used to develop the required data reduction software. Example applications that have benefited from the tool's usage are the HERMES and SAMI instrument projects for the Anglo-Australian Telescope (AAT). Given the sophistication of these projects, an end-to-end data simulator that accurately models the predicted detector images is required. The data simulator encompasses all aspects of the transmission and optical aberrations of the light path: from the science object, through the atmosphere, telescope, fibers, spectrograph and finally the camera detectors. The simulator runs under a Linux environment that uses pre-calculated information derived from ZEMAX models and processed data from MATLAB. In this paper, we discuss aspects of the model and software, with example simulations and verification.
Experiment S-191 visible and infrared spectrometer
NASA Technical Reports Server (NTRS)
Linnell, E. R.
1974-01-01
The design, development, fabrication, test, and utilization of the visible and infrared spectrometer portion of the S-191 experiment, part of the Earth Resources Experiment Package on board Skylab, are discussed. The S-191 program is described, as well as conclusions and recommendations for improvement of this type of instrument for future applications. Design requirements, instrument design approaches, and the test verification program are presented along with test results, including flight hardware calibration data. A brief discussion of operation during the Skylab mission is included. Documentation associated with the program is listed.
2010-09-01
...what level of detail is needed to build their teams, and they can add more detailed items from the model in order to tap deeper into the performance of... This Technical Report documents the findings of a project on 'Command Team Effectiveness' by Task Group 127 for the RTO Human Factors and Medicine Panel (RTG HFM-127), including verification of the model and of the instrument.
40 CFR 1065.305 - Verifications for accuracy, repeatability, and noise.
Code of Federal Regulations, 2010 CFR
2010-07-01
... prescribed by the instrument manufacturer. (2) Zero the instrument as you would before an emission test by introducing a zero signal. Depending on the instrument, this may be a zero-concentration gas, a reference... a zero gas that meets the specifications of § 1065.750. (3) Span the instrument as you would before...
40 CFR 1065.305 - Verifications for accuracy, repeatability, and noise.
Code of Federal Regulations, 2013 CFR
2013-07-01
... prescribed by the instrument manufacturer. (2) Zero the instrument as you would before an emission test by introducing a zero signal. Depending on the instrument, this may be a zero-concentration gas, a reference... a zero gas that meets the specifications of § 1065.750. (3) Span the instrument as you would before...
40 CFR 1065.305 - Verifications for accuracy, repeatability, and noise.
Code of Federal Regulations, 2014 CFR
2014-07-01
... prescribed by the instrument manufacturer. (2) Zero the instrument as you would before an emission test by introducing a zero signal. Depending on the instrument, this may be a zero-concentration gas, a reference... a zero gas that meets the specifications of § 1065.750. (3) Span the instrument as you would before...
40 CFR 1065.305 - Verifications for accuracy, repeatability, and noise.
Code of Federal Regulations, 2011 CFR
2011-07-01
... prescribed by the instrument manufacturer. (2) Zero the instrument as you would before an emission test by introducing a zero signal. Depending on the instrument, this may be a zero-concentration gas, a reference... a zero gas that meets the specifications of § 1065.750. (3) Span the instrument as you would before...
40 CFR 1065.305 - Verifications for accuracy, repeatability, and noise.
Code of Federal Regulations, 2012 CFR
2012-07-01
... prescribed by the instrument manufacturer. (2) Zero the instrument as you would before an emission test by introducing a zero signal. Depending on the instrument, this may be a zero-concentration gas, a reference... a zero gas that meets the specifications of § 1065.750. (3) Span the instrument as you would before...
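The CFR excerpts above all describe the same zero/span procedure for analyzer verification. As a rough illustration of that logic (a minimal sketch; the tolerance values and function shape are our assumptions, not the regulatory limits of § 1065.305):

```python
# Sketch of a zero/span drift check (illustrative tolerances, not the
# regulatory values): zero the analyzer with a zero gas, span it with a
# certified span gas, and compare the responses against limits.

def verify_zero_span(zero_reading, span_reading, span_reference,
                     zero_tol=0.5, span_tol_pct=2.0):
    """Return (ok, messages) for a simple zero/span verification.

    zero_reading   -- analyzer response to zero gas (instrument units)
    span_reading   -- analyzer response to span gas
    span_reference -- certified concentration of the span gas
    zero_tol       -- allowed absolute zero offset (illustrative value)
    span_tol_pct   -- allowed span error, percent of reference (illustrative)
    """
    messages = []
    if abs(zero_reading) > zero_tol:
        messages.append(f"zero drift {zero_reading:+.2f} exceeds {zero_tol}")
    span_err_pct = 100.0 * (span_reading - span_reference) / span_reference
    if abs(span_err_pct) > span_tol_pct:
        messages.append(f"span error {span_err_pct:+.1f}% exceeds {span_tol_pct}%")
    return (not messages), messages

ok, msgs = verify_zero_span(zero_reading=0.2, span_reading=101.0,
                            span_reference=100.0)
```

In practice the zero and span gases must themselves meet the purity and certification specifications of § 1065.750, as the excerpts note.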
Environment Modeling Using Runtime Values for JPF-Android
NASA Technical Reports Server (NTRS)
van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem
2015-01-01
Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that trigger the application's execution. For testing and verification, the environment of an application is abstracted and simplified using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
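The runtime-value idea above can be sketched in a few lines. The snippet below is a hypothetical Python analogue of the approach (the actual tool targets Android/Java under JPF): instrument a method to log its return values during a real run, then generate a stub that replays those values during verification instead of returning defaults.

```python
# Sketch (assumed simplification of the JPF-Android approach): log return
# values at runtime, then replay them from a generated stub.
import functools

RUNTIME_LOG = {}  # method name -> list of observed return values

def record(fn):
    """Instrumentation wrapper: log every return value of fn."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        RUNTIME_LOG.setdefault(fn.__name__, []).append(result)
        return result
    return wrapper

def make_stub(name, default=None):
    """Build a stub that replays logged values, falling back to a default."""
    values = list(RUNTIME_LOG.get(name, []))
    def stub(*_args, **_kwargs):
        return values.pop(0) if values else default
    return stub

# Example: a "native library" call we cannot analyze directly.
@record
def get_device_id():
    return "emulator-5554"

get_device_id()                    # real run populates the log
stub = make_stub("get_device_id")  # verification-time stub replays the value
```

An empty stub would have returned `None` here and missed any code path guarded by a concrete device ID; the replayed value lets that path execute.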
The U.S. Environmental Protection Agency, Through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...
NASA Technical Reports Server (NTRS)
Gracey, Renee; Bartoszyk, Andrew; Cofie, Emmanuel; Comber, Brian; Hartig, George; Howard, Joseph; Sabatke, Derek; Wenzel, Greg; Ohl, Raymond
2016-01-01
The James Webb Space Telescope includes the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI), including a Guider. We performed extensive structural, thermal, and optical performance (STOP) modeling in support of all phases of ISIM development. In this paper, we focus on modeling and results associated with test and verification. ISIM's test program is bound by ground environments, most notably the 1g and test-chamber thermal environments. This paper describes STOP modeling used to predict ISIM system performance in 0g and at various on-orbit temperature environments. The predictions are used to project results obtained during testing to on-orbit performance.
NASA Astrophysics Data System (ADS)
Golobokov, M.; Danilevich, S.
2018-04-01
In order to assess calibration reliability and to automate that assessment, procedures for data collection and a simulation study of the thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing existing calibration techniques and developing efficient new ones has been suggested and tested. A type of software has been studied that generates instrument calibration reports automatically, monitors their proper configuration, processes measurement results, and assesses instrument validity. The use of such software reduces the man-hours spent on finalizing calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.
2010-04-01
...the development process, increase its quality and reduce development time through automation of synthesis, analysis or verification. For this purpose... made of time-non-deterministic systems, improving efficiency and reducing complexity of formal analysis. We also show how our theory relates to, and... of the most recent investigations for Earth and Mars atmospheres will be discussed in the following sections. 2.4.1 Earth: lunar return NASA's
Instrument performance and simulation verification of the POLAR detector
NASA Astrophysics Data System (ADS)
Kole, M.; Li, Z. H.; Produit, N.; Tymieniecka, T.; Zhang, J.; Zwolinska, A.; Bao, T. W.; Bernasconi, T.; Cadoux, F.; Feng, M. Z.; Gauvin, N.; Hajdas, W.; Kong, S. W.; Li, H. C.; Li, L.; Liu, X.; Marcinkowski, R.; Orsi, S.; Pohl, M.; Rybka, D.; Sun, J. C.; Song, L. M.; Szabelski, J.; Wang, R. J.; Wang, Y. H.; Wen, X.; Wu, B. B.; Wu, X.; Xiao, H. L.; Xiong, S. L.; Zhang, L.; Zhang, L. Y.; Zhang, S. N.; Zhang, X. F.; Zhang, Y. J.; Zhao, Y.
2017-11-01
POLAR is a new satellite-borne detector aiming to measure the polarization of an unprecedented number of Gamma-Ray Bursts in the 50-500 keV energy range. The instrument, launched on board the Tiangong-2 Chinese space lab on the 15th of September 2016, is designed to measure the polarization of the hard X-ray flux by measuring the distribution of the azimuthal scattering angles of the incoming photons. A detailed understanding of the polarimeter, and specifically of the systematic effects induced by the instrument's non-uniformity, is required for this purpose. In order to study the instrument's response to polarization, POLAR underwent a beam test at the European Synchrotron Radiation Facility in France. In this paper both the beam test and the instrument performance will be described. This is followed by an overview of the Monte Carlo simulation tools developed for the instrument. Finally, a comparison of the measured and simulated instrument performance will be provided and the instrument response to polarization will be presented.
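For context, the azimuthal scattering-angle distribution such a Compton polarimeter measures is commonly modeled with a cosine modulation curve. The sketch below shows the textbook form (an assumption on our part; the paper's actual instrument response is derived from detailed Monte Carlo simulation, not this idealized curve):

```python
# Idealized modulation curve for a Compton polarimeter (standard textbook
# form, not POLAR's measured response):
#   N(phi) = A * (1 + mu * cos(2 * (phi - phi0)))
# where mu is the modulation amplitude and phi0 encodes the polarization angle.
import math

def modulation_curve(phi, amplitude, mu, phi0):
    """Expected counts at azimuthal scattering angle phi (radians)."""
    return amplitude * (1.0 + mu * math.cos(2.0 * (phi - phi0)))

def modulation_factor(n_max, n_min):
    """Estimate mu from the extremes of the measured curve."""
    return (n_max - n_min) / (n_max + n_min)
```

Instrument non-uniformity distorts this curve, which is why a beam test against simulation, as described in the abstract, is needed before the polarization of real bursts can be extracted.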
Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification
NASA Technical Reports Server (NTRS)
Melton, D. M.
1998-01-01
Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing of components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specification. The approach consisted of (1) selection of a supersonic gas-liquid cleaning system; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-dichloroethylene), and HFE 7100DE (HFE/1,2-dichloroethylene)); and (3) evaluation of an instrumental analytical post-cleaning verification technique. This document is presented in viewgraph format.
Development of CO2 laser Doppler instrumentation for detection of clear air turbulence, volume 1
NASA Technical Reports Server (NTRS)
Harris, C. E.; Jelalian, A. V.
1979-01-01
Modification, construction, test and operation of an advanced airborne carbon dioxide laser Doppler system for detecting clear air turbulence are described. The second generation CAT program and those auxiliary activities required to support and verify such a first-of-a-kind system are detailed: aircraft interface; ground and flight verification tests; data analysis; and laboratory examinations.
UAVSAR Program: Initial Results from New Instrument Capabilities
NASA Technical Reports Server (NTRS)
Lou, Yunling; Hensley, Scott; Moghaddam, Mahta; Moller, Delwyn; Chapin, Elaine; Chau, Alexandra; Clark, Duane; Hawkins, Brian; Jones, Cathleen; Marks, Phillip;
2013-01-01
UAVSAR is an imaging radar instrument suite that serves as NASA's airborne facility instrument to acquire scientific data for Principal Investigators as well as a radar test-bed for new radar observation techniques and radar technology demonstration. Since commencing operational science observations in January 2009, the compact, reconfigurable, pod-based radar has been acquiring L-band fully polarimetric SAR (POLSAR) data with repeat-pass interferometric (RPI) observations underneath NASA Dryden's Gulfstream-III jet to provide measurements for science investigations in solid earth and cryospheric studies, vegetation mapping and land use classification, archaeological research, soil moisture mapping, geology and cold land processes. In the past year, we have made significant upgrades to add new instrument capabilities and new platform options to accommodate the increasing demand for UAVSAR to support scientific campaigns to measure subsurface soil moisture, acquire data in the polar regions, and for algorithm development, verification, and cross-calibration with other airborne/spaceborne instruments.
The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...
Management of the JWST MIRI pFM environmental and performance verification test campaign
NASA Astrophysics Data System (ADS)
Eccleston, Paul; Glasse, Alistair; Grundy, Timothy; Detre, Örs Hunor; O'Sullivan, Brian; Shaughnessy, Bryan; Sykes, Jon; Thatcher, John; Walker, Helen; Wells, Martyn; Wright, Gillian; Wright, David
2012-09-01
The Mid-Infrared Instrument (MIRI) is one of four scientific instruments on the James Webb Space Telescope (JWST) observatory, scheduled for launch in 2018. It will provide unique capabilities to probe the distant or deeply dust-enshrouded regions of the Universe, investigating the history of star and planet formation from the earliest universe to the present day. To enable this the instrument optical module must be cooled below 7K, presenting specific challenges for the environmental testing and calibration activities. The assembly, integration and verification (AIV) activities for the proto-flight model (pFM) instrument ran from March 2010 to May 2012 at RAL where the instrument has been put through a full suite of environmental and performance tests with a non-conventional single cryo-test approach. In this paper we present an overview of the testing conducted on the MIRI pFM including ambient alignment testing, vibration testing, gravity release testing, cryogenic performance and calibration testing, functional testing at ambient and operational temperatures, thermal balance tests, and Electro-Magnetic Compatibility (EMC) testing. We discuss how tests were planned and managed to ensure that the whole AIV process remained on schedule and give an insight into the lessons learned from this process. We also show how the process of requirement verification for this complex system was managed and documented. We describe how the risks associated with a single long duration test at operating temperature were controlled so that the complete suite of environmental tests could be used to build up a full picture of instrument compliance.
NASA Astrophysics Data System (ADS)
Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.
2004-11-01
The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
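The log-ratio technique mentioned above can be illustrated in minimal form. In the sketch below, `k_x` is a hypothetical sensitivity coefficient, not a LANSCE calibration value; real BPM electronics must also compensate for the temperature drifts and component imperfections the abstract describes.

```python
# Idealized log-ratio position estimate for a two-electrode beam position
# monitor: near the axis, the beam offset is approximately proportional to
# the log of the ratio of the signals induced on opposing pickup lobes.
import math

def log_ratio_position(v_right, v_left, k_x=1.0):
    """Estimate beam position from opposing electrode signals.

    For a centered beam v_right == v_left, so the estimate is 0; the log
    ratio grows roughly linearly with offset close to the axis.
    k_x is a hypothetical sensitivity coefficient (units of length).
    """
    return k_x * math.log10(v_right / v_left)
```

The appeal of the log ratio is that it is insensitive to common-mode gain changes: scaling both electrode signals by the same factor leaves the position estimate unchanged.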
The JWST Science Instrument Payload: Mission Context and Status
NASA Technical Reports Server (NTRS)
Greenhouse, Matthew A.
2014-01-01
The James Webb Space Telescope (JWST) is the scientific successor to the Hubble Space Telescope. It is a cryogenic infrared space observatory with a 25 sq m aperture (6 m class) telescope that will achieve diffraction-limited angular resolution at a wavelength of 2 microns. The science instrument payload includes four passively cooled near-infrared instruments providing broad- and narrow-band imagery, coronagraphy, as well as multi-object and integral-field spectroscopy over the 0.6 < lambda < 5.0 microns spectrum. An actively cooled mid-infrared instrument provides broad-band imagery, coronagraphy, and integral-field spectroscopy over the 5.0 < lambda < 29 microns spectrum. The JWST is being developed by NASA, in partnership with the European and Canadian Space Agencies, as a general user facility with science observations to be proposed by the international astronomical community in a manner similar to the Hubble Space Telescope. Technology development and mission design are complete. Construction, integration, and verification testing are underway in all areas of the program. The JWST is on schedule for launch during 2018.
Development and testing of highway storm-sewer flow measurement and recording system
Kilpatrick, F.A.; Kaehrle, W.R.; Hardee, Jack; Cordes, E.H.; Landers, M.N.
1985-01-01
A comprehensive study and development of measuring instruments and techniques for measuring all components of flow in a storm-sewer drainage system was undertaken by the U.S. Geological Survey under the sponsorship of the Federal Highway Administration. The study involved laboratory and field calibration and testing of measuring flumes, pipe insert meters, weirs, and electromagnetic velocity meters, as well as the development and calibration of pneumatic-bubbler pressure transducer head-measuring systems. Tracer-dilution and acoustic flow meter measurements were used in field verification tests. A single micrologger was used to record data from all the above instruments as well as from a tipping-bucket rain gage, and also to activate on command the electromagnetic velocity meter and tracer-dilution systems. (Author's abstract)
Stennis Space Center Verification & Validation Capabilities
NASA Technical Reports Server (NTRS)
Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; ONeal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir
2005-01-01
Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial and moderate resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists are also using the SSC V&V site to characterize thermal infrared systems and active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.
Remote Sensing Product Verification and Validation at the NASA Stennis Space Center
NASA Technical Reports Server (NTRS)
Stanley, Thomas M.
2005-01-01
Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.
40 CFR 1065.378 - NO2-to-NO converter conversion verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Measurements § 1065.378 NO2-to-NO converter conversion verification. (a) Scope and frequency. If you use an... catalytic activity of the NO2-to-NO converter has not deteriorated. (b) Measurement principles. An NO2-to-NO.... Allow for stabilization, accounting only for transport delays and instrument response. (ii) Use an NO...
The Deep Space Network: A Radio Communications Instrument for Deep Space Exploration
NASA Technical Reports Server (NTRS)
Renzetti, N. A.; Stelzried, C. T.; Noreen, G. K.; Slobin, S. D.; Petty, S. M.; Trowbridge, D. L.; Donnelly, H.; Kinman, P. W.; Armstrong, J. W.; Burow, N. A.
1983-01-01
The primary purpose of the Deep Space Network (DSN) is to serve as a communications instrument for deep space exploration, providing communications between the spacecraft and the ground facilities. The uplink communications channel provides instructions or commands to the spacecraft. The downlink communications channel provides command verification and spacecraft engineering and science instrument payload data.
WFIRST: Update on the Coronagraph Science Requirements
NASA Astrophysics Data System (ADS)
Douglas, Ewan S.; Cahoy, Kerri; Carlton, Ashley; Macintosh, Bruce; Turnbull, Margaret; Kasdin, Jeremy; WFIRST Coronagraph Science Investigation Teams
2018-01-01
The WFIRST Coronagraph instrument (CGI) will enable direct imaging and low resolution spectroscopy of exoplanets in reflected light and imaging polarimetry of circumstellar disks. The CGI science investigation teams were tasked with developing a set of science requirements which advance our knowledge of exoplanet occurrence and atmospheric composition, as well as the composition and morphology of exozodiacal debris disks, cold Kuiper Belt analogs, and protoplanetary systems. We present the initial content, rationales, validation, and verification plans for the WFIRST CGI, informed by detailed and still-evolving instrument and observatory performance models. We also discuss our approach to the requirements development and management process, including the collection and organization of science inputs, open source approach to managing the requirements database, and the range of models used for requirements validation. These tools can be applied to requirements development processes for other astrophysical space missions, and may ease their management and maintenance. These WFIRST CGI science requirements allow the community to learn about and provide insights and feedback on the expected instrument performance and science return.
VINCI: the VLT Interferometer commissioning instrument
NASA Astrophysics Data System (ADS)
Kervella, Pierre; Coudé du Foresto, Vincent; Glindemann, Andreas; Hofmann, Reiner
2000-07-01
The Very Large Telescope Interferometer (VLTI) is a complex system, made of a large number of separated elements. To prepare an early successful operation, it will require a period of extensive testing and verification to ensure that the many devices involved work properly together, and can produce meaningful data. This paper describes the concept chosen for the VLTI commissioning instrument, LEONARDO da VINCI, and details its functionalities. It is a fiber based two-way beam combiner, associated with an artificial star and an alignment verification unit. The technical commissioning of the VLTI is foreseen as a stepwise process: fringes will first be obtained with the commissioning instrument in an autonomous mode (no other parts of the VLTI involved); then the VLTI telescopes and optical trains will be tested in autocollimation; finally fringes will be observed on the sky.
Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform
NASA Astrophysics Data System (ADS)
Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.
2017-03-01
The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.
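As a stand-in for the full geometric optimization of six sensors and their targets, the sketch below fits a single capacitive sensor's gain and offset from readings at known gap distances by closed-form linear least squares (our simplified construction, not the paper's actual calibration model):

```python
# Minimal per-sensor calibration sketch (assumed linear model, not the
# paper's geometric model): fit reading = gain * distance + offset by
# ordinary least squares from readings at known gap distances.

def fit_gain_offset(distances, readings):
    """Closed-form least-squares fit of reading = gain*distance + offset."""
    n = len(distances)
    mean_d = sum(distances) / n
    mean_r = sum(readings) / n
    # Centered sums give the usual slope/intercept formulas.
    sxx = sum((d - mean_d) ** 2 for d in distances)
    sxy = sum((d - mean_d) * (r - mean_r)
              for d, r in zip(distances, readings))
    gain = sxy / sxx
    offset = mean_r - gain * mean_d
    return gain, offset
```

The real IMP procedure optimizes the geometric features of all six sensors and targets jointly, but each sensor's raw response must first be mapped to a distance in some such calibrated form.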
Radiation effects on science instruments in Grand Tour type missions
NASA Technical Reports Server (NTRS)
Parker, R. H.
1972-01-01
The extent of the radiation effects problem is delineated, along with the status of protective designs for 15 representative science instruments. Designs for protecting science instruments from radiation damage are discussed for the various instruments to be employed in Grand Tour type missions. A literature search effort was undertaken to collect damage/interference effects data on the various sensitive science instrument components, such as Si detectors and vidicon tubes. A small experimental effort is underway to provide verification of the radiation effects predictions.
NASA Technical Reports Server (NTRS)
Robinson, D. N.
1985-01-01
Three major categories of testing are identified that are necessary to support the development of constitutive equations for high temperature alloys: exploratory, characterization, and verification tests. Each category is addressed and specific examples of each are given. An extensive, but not exhaustive, set of references is provided concerning pertinent experimental results and their relationships to theoretical development. This guide to formulating a meaningful testing effort in support of constitutive equation development can also aid in defining the testing equipment and instrumentation necessary for the establishment of a deformation and structures testing laboratory.
Experience with advanced instrumentation in a hot section cascade
NASA Technical Reports Server (NTRS)
Yeh, Frederick C.; Gladden, Herbert J.
1989-01-01
The Lewis Research Center gas turbine Hot Section Test Facility was developed to provide a real engine environment with known boundary conditions for the aerothermal performance evaluation and verification of computer design codes. This verification process requires experimental measurements in a hostile environment. The research instruments used in this facility are presented, and their characteristics and how they perform in this environment are discussed. The research instrumentation consisted of conventional pressure and temperature sensors, as well as thin-film thermocouples and heat flux gages. The hot gas temperature was measured by an aspirated temperature probe and by a dual-element, fast-response temperature probe. The data acquisition mode was both steady state and time dependent. These experiments were conducted over a wide range of gas Reynolds numbers, exit gas Mach numbers, and heat flux levels. This facility was capable of testing at temperatures up to 1600 K, and at pressures up to 18 atm. These corresponded to an airfoil exit Reynolds number range of 0.5 x 10^6 to 2.5 x 10^6 based on the airfoil chord of 5.55 cm. The results characterize the performance capability and the durability of the instrumentation. The challenge of making measurements in hostile environments is also discussed. The instruments exhibited more than adequate durability to achieve the measurement profile. About 70 percent of the thin-film thermocouples and the dual-element temperature probe survived several hundred thermal cycles and more than 35 hr at gas temperatures up to 1600 K. Within the experimental uncertainty, the steady-state and transient heat flux measurements were comparable and consistent over the range of Reynolds numbers tested.
Multibody modeling and verification
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1989-01-01
A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on existing experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low-frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and Robot Enhancement (RE) laboratories were considered during the laboratory development.
Charged Particle lunar Environment Experiment (CPLEE)
NASA Technical Reports Server (NTRS)
Reasoner, D. L.
1974-01-01
Research development in the Charged Particle Lunar Environment Experiment (CPLEE) is reported. The CPLEE is an ion-electron spectrometer placed on the lunar surface to measure charged-particle fluxes impacting the moon from a variety of regions and to study the interactions between space plasmas and the lunar surface. The principal accomplishments reported include: (1) furnishing design specifications for construction of the CPLEE instruments; (2) development of an advanced computer-controlled facility for automated instrument calibration; (3) active participation in the deployment and post-deployment operational phases with regard to data verification and operational mode selection; and (4) publication of research papers, including a study of lunar photoelectrons, a study of plasmas resulting from man-made lunar impact events, a study of magnetotail and magnetosheath particle populations, and a study of solar-flare interplanetary particles.
1968-01-01
AS-204, the fourth Saturn IB launch vehicle, developed by the Marshall Space Flight Center (MSFC), awaits its January 22, 1968 liftoff from Cape Canaveral, Florida for the unmanned Apollo 5 mission. Primary mission objectives included the verification of the Apollo Lunar Module's (LM) ascent and descent propulsion systems and an evaluation of the S-IVB stage instrument unit performance. In all, nine Saturn IB flights were made, ending with the Apollo-Soyuz Test Project in July 1975.
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system with a simple and easily implemented algorithm for effective runtime verification, into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java, and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivative of RuleR that adds a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission, MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, so this approach adds no instrumentation overhead. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
Report on the formal specification and partial verification of the VIPER microprocessor
NASA Technical Reports Server (NTRS)
Brock, Bishop; Hunt, Warren A., Jr.
1991-01-01
The formal specification and partial verification of the VIPER microprocessor is reviewed. The VIPER microprocessor was designed by RSRE, Malvern, England, for safety critical computing applications (e.g., aircraft, reactor control, medical instruments, armaments). The VIPER was carefully specified and partially verified in an attempt to provide a microprocessor with completely predictable operating characteristics. The specification of VIPER is divided into several levels of abstraction, from a gate-level description up to an instruction execution model. Although the consistency between certain levels was demonstrated with mechanically-assisted mathematical proof, the formal verification of VIPER was never completed.
NASA Technical Reports Server (NTRS)
Roder, H. M.
1974-01-01
Information is presented on instrumentation for density measurement, liquid level measurement, quantity gauging, and phase measurement. Coverage of existing information directly concerned with oxygen was given primary emphasis. A description of the physical principle of measurement for each instrumentation type is included. The basic materials of construction are listed if available from the source document for each instrument discussed. Cleaning requirements, procedures, and verification techniques are included.
Automatic performance budget: towards a risk reduction
NASA Astrophysics Data System (ADS)
Laporte, Philippe; Blake, Simon; Schmoll, Jürgen; Rulten, Cameron; Savoie, Denis
2014-08-01
In this paper, we discuss the performance matrix of the SST-GATE telescope, developed to allow us to partition and allocate the important characteristics to the various subsystems, and we describe the process used to verify that the current design will deliver the required performance. Due to the integrated nature of the telescope, a large number of parameters have to be controlled, and effective calculation tools such as an automatic performance budget must be developed. Its main advantages are alleviating the work of the system engineer when changes occur in the design, avoiding errors during any re-allocation process, and automatically recalculating the scientific performance of the instrument. We explain in this paper the method for converting the ensquared energy (EE) and the signal-to-noise ratio (SNR) required by the science cases into requirements on the "as designed" instrument. To ensure successful design, integration, and verification of the next generation of instruments, it is of the utmost importance to have methods to control and manage the instrument's critical performance characteristics at the very early design steps, to limit technical and cost risks in the project development. Such a performance budget is a tool towards this goal.
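As a toy illustration of what such a tool automates (the subsystem names and numbers below are invented, not SST-GATE allocations), independent error allocations can be combined by root-sum-square so that any re-allocation immediately propagates to the top-level performance figure:

```python
import math

def rss_budget(allocations):
    """Combine independent error allocations (e.g. arcsec of image blur)
    into a single top-level figure by root-sum-square."""
    return math.sqrt(sum(v ** 2 for v in allocations.values()))

# illustrative allocations, not real SST-GATE numbers
budget = {"mirror_figure": 0.30, "alignment": 0.20, "structure": 0.25}
top_level = rss_budget(budget)

# a design change is just a dictionary update followed by an
# automatic recomputation, with no manual re-derivation by hand
budget["alignment"] = 0.35
revised_top_level = rss_budget(budget)
```

A real performance budget also carries correlated terms and unit conversions, but the automation principle is the same: the roll-up is recomputed, never re-derived by hand.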
Patterns relationships of student’s creativity with its indicators in learning optical instrument
NASA Astrophysics Data System (ADS)
Sukarmin; Dhian, T. E. V.; Nonoh, S. A.; Delisma, W. A.
2017-01-01
This study aims to identify patterns in the relationships between students' creativity and its indicators in learning about optical instruments. The study was conducted at SMPN 2 Sawo, SMPN 1 Jetis, SMPIT Darut Taqwa, SMPN 1 Dander, Bojonegoro, and SMPN 3 Plus Al-Fatima. Data were analyzed descriptively using Confirmatory Factor Analysis. The creativity test instruments used had previously been validated. The creativity indicators used are personal (self-confidence, perseverance), press (spirit, unyieldingness), process (preparation, incubation, illumination, verification), and product (knowledge, skills). The results show that perseverance and incubation are the strongest capabilities and verification the weakest. All indicators of student creativity can still be improved. The relationships between creativity and the indicators are grouped into strong, moderate, weak, and no relation. Indicators with a strong relationship (r ≥ 0.50) are personal (self-confidence, perseverance) and process (illumination). Indicators with a moderate relationship (0.30 ≤ r ≤ 0.49) are press (spirit) and process (verification). Indicators with a very low correlation (0.10 ≤ r ≤ 0.29) are press (unyieldingness), process (preparation), process (incubation), and product (skills), as shown in Figure 1. The indicator with no relationship to student creativity is product (knowledge).
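The banding of correlations into strong, moderate, weak, and none can be expressed as a small helper; the thresholds follow the values quoted above, while the sample indicator values are illustrative, not the study's measured coefficients:

```python
def correlation_band(r):
    """Group a correlation coefficient into the bands used in the study."""
    r = abs(r)
    if r >= 0.50:
        return "strong"
    if r >= 0.30:
        return "moderate"
    if r >= 0.10:
        return "weak"
    return "none"

# illustrative values only, not the study's results
sample = {"self-confidence": 0.62, "spirit": 0.41,
          "incubation": 0.18, "knowledge": 0.04}
bands = {name: correlation_band(r) for name, r in sample.items()}
```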
NASA Astrophysics Data System (ADS)
Dubroca, Guilhem; Richert, Michaël.; Loiseaux, Didier; Caron, Jérôme; Bézy, Jean-Loup
2015-09-01
To increase the accuracy of earth-observation spectro-imagers, it is necessary to achieve high levels of depolarization of the incoming beam. The preferred device in space instruments is the so-called polarization scrambler, made of birefringent crystal wedges arranged in a single or dual Babinet. Today, with required radiometric accuracies of the order of 0.1%, it is necessary to develop tools to find optimal, low-sensitivity solutions quickly and to measure the performance with a high level of accuracy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy
The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups, and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of "virtual" fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a "universal" GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.
SAMS-II Requirements and Operations
NASA Technical Reports Server (NTRS)
Wald, Lawrence W.
1998-01-01
The Space Acceleration Measurement System (SAMS) II is the primary instrument for the measurement, storage, and communication of the microgravity environment aboard the International Space Station (ISS). SAMS-II is being developed by the NASA Lewis Research Center Microgravity Science Division primarily to support the Office of Life and Microgravity Science and Applications (OLMSA) Microgravity Science and Applications Division (MSAD) payloads aboard the ISS. The SAMS-II is currently in the test and verification phase at NASA LeRC, prior to its first hardware delivery scheduled for July 1998. This paper will provide an overview of the SAMS-II instrument, including the system requirements and topology, physical and electrical characteristics, and the Concept of Operations for SAMS-II aboard the ISS.
Towards an Integrated Model of the WEAVE Performance
NASA Astrophysics Data System (ADS)
Ham, S. J.; Dalton, G.
2016-10-01
WEAVE is a new facility instrument for the 4.2 m William Herschel Telescope (WHT). The instrument has a 2° field of view and covers a wavelength range of 366-950 nm with up to 960 simultaneous spectra in each observation. The spectrograph consists of a collimator mirror and two correcting lenses before a VPH grating and two 8-lens cameras. The two cameras have been designed to have the same lens shapes. Here we report on the development of detailed simulations for the verification of the whole data reduction procedure and analysis pipeline, and for the generation of high signal-to-noise reference images that can be used as fitting templates for fiber positions and PSF mapping.
NASA Technical Reports Server (NTRS)
Gregory, J. C.
1986-01-01
Instrument design and data analysis expertise was provided in support of several space radiation monitoring programs. The Verification of Flight Instrumentation (VFI) program at NASA included both the Active Radiation Detector (ARD) and the Nuclear Radiation Monitor (NRM). Design, partial fabrication, calibration and partial data analysis capability to the ARD program was provided, as well as detector head design and fabrication, software development and partial data analysis capability to the NRM program. The ARD flew on Spacelab-1 in 1983, performed flawlessly and was returned to MSFC after flight with unchanged calibration factors. The NRM, flown on Spacelab-2 in 1985, also performed without fault, not only recording the ambient gamma ray background on the Spacelab, but also recording radiation events of astrophysical significance.
NASA Technical Reports Server (NTRS)
Werrett, Stephen; Seivold, Alfred L.
1990-01-01
A detailed nodal computer model was developed to thermally represent the hardware, and sensitivity studies were performed to evaluate design parameters and orbital environmental effects of an instrument cooling system for IR detectors. Thermal-vacuum testing showed excellent performance of the system and a correspondence with math model predictions to within 3 K. Results show cold stage temperature sensitivity to cold patch backload, outer stage external surface emittance degradation, and cold stage emittance degradation, respectively. The increase in backload on the cold patch over the mission lifetime is anticipated to be less than 3.0 watts, which translates to less than a 3-degree increase in detector temperatures.
Fiber Lasers and Amplifiers for Space-based Science and Exploration
NASA Technical Reports Server (NTRS)
Yu, Anthony W.; Krainak, Michael A.; Stephen, Mark A.; Chen, Jeffrey R.; Coyle, Barry; Numata, Kenji; Camp, Jordan; Abshire, James B.; Allan, Graham R.; Li, Steven X.;
2012-01-01
We present current and near-term uses of high-power fiber lasers and amplifiers for NASA science and spacecraft applications. Fiber lasers and amplifiers offer numerous advantages for the deployment of instruments on exploration and science remote sensing satellites. Ground-based and airborne systems provide an evolutionary path to space and a means for calibration and verification of space-borne systems. NASA fiber-laser-based instruments include laser sounders and lidars for measuring atmospheric carbon dioxide, oxygen, water vapor and methane and a pulsed or pseudo-noise (PN) code laser ranging system in the near infrared (NIR) wavelength band. The associated fiber transmitters include high-power erbium, ytterbium, and neodymium systems and a fiber laser pumped optical parametric oscillator. We discuss recent experimental progress on these systems and instrument prototypes for ongoing development efforts.
NASA Astrophysics Data System (ADS)
Lukyanov, A. D.; Alekseev, V. V.; Bogomolov, Yu V.; Dunaeva, O. A.; Malakhov, V. V.; Mayorov, A. G.; Rodenko, S. A.
2017-01-01
The analysis of experimental data on primary positron and antiproton fluxes obtained by the PAMELA spectrometer, and recently confirmed by the AMS-02 spectrometer, is of great interest to the scientific community, especially at energies above 100 GV, where a signal from dark matter particles could appear. In this work we present a method for verifying the charge sign of high-energy antiprotons measured by the magnetic tracking system of the PAMELA spectrometer, which can be mimicked by protons due to scattering or finite instrumental resolution at high energies (so-called “spillover”). Our approach is based on developing a set of distinctive features, represented by differently computed rigidities, and training an AdaBoost classifier, which shows a good classification accuracy of 98% on Monte-Carlo simulation data for rigidities up to 600 GV.
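As an illustration of the classification idea only (not the authors' actual feature set or implementation, which was trained on PAMELA Monte-Carlo data), the sketch below hand-rolls AdaBoost over decision stumps on a synthetic one-dimensional feature standing in for the disagreement between differently computed rigidities: true antiprotons get consistent estimates, spillover protons a larger spread. All scales are invented.

```python
import math
import random

def train_adaboost(X, y, rounds=20):
    """Boost decision stumps of the form: +sign if x < thr else -sign."""
    n = len(X)
    w = [1.0 / n] * n
    stumps = []
    for _ in range(rounds):
        best = None
        for thr in sorted(set(X)):
            for sign in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (sign if xi < thr else -sign) != yi)
                if best is None or err < best[0]:
                    best = (err, thr, sign)
        err, thr, sign = best
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-10))
        stumps.append((alpha, thr, sign))
        # up-weight misclassified events so the next stump focuses on them
        w = [wi * math.exp(-alpha * yi * (sign if xi < thr else -sign))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return stumps

def predict(stumps, x):
    score = sum(alpha * (sign if x < thr else -sign)
                for alpha, thr, sign in stumps)
    return 1 if score >= 0 else -1

# synthetic feature: relative spread between two rigidity estimates
random.seed(0)
X = ([random.gauss(0.05, 0.02) for _ in range(50)]     # antiprotons (y = +1)
     + [random.gauss(0.30, 0.05) for _ in range(50)])  # spillover protons (y = -1)
y = [1] * 50 + [-1] * 50
model = train_adaboost(X, y)
accuracy = sum(predict(model, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

In practice one would use a library implementation and a held-out Monte-Carlo sample; the point here is only the boosting-over-weak-learners structure.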
Instrumented urethral catheter and its ex vivo validation in a sheep urethra
NASA Astrophysics Data System (ADS)
Ahmadi, Mahdi; Rajamani, Rajesh; Timm, Gerald; Sezen, Serdar
2017-03-01
This paper presents the design and fabrication of an instrumented catheter for instantaneous measurement of distributed urethral pressure profiles. Since the catheter enables a new type of urological measurement, a process for accurate ex vivo validation of the catheter is developed. A flexible sensor strip is first fabricated with nine pressure sensors and integrated electronic pads for an associated sensor IC chip. The flexible sensor strip and associated IC chip are assembled on a 7 Fr Foley catheter. A sheep bladder and urethra are extracted and used in an ex vivo setup for verification of the developed instrumented catheter. The bladder and urethra are suspended in a test rig, and pressure cuffs are placed to apply known static and dynamic pressures around the urethra. A significant challenge in the performance of the sensor system is the presence of parasitics that introduce large bias and drift errors in the capacitive sensor signals. An algorithm based on the use of reference parasitic transducers is used to compensate for the parasitics. Extensive experimental results verify that the developed compensation method works effectively. Results on pressure variation profiles circumferentially around the urethra and longitudinally along the urethra are presented. The developed instrumented catheter will be useful in improved urodynamics to more accurately diagnose the source of urinary incontinence in patients.
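The compensation idea described above, a reference transducer that sees only the parasitic content and is subtracted from the sensing channel, can be sketched as a common-mode subtraction. The signal shapes, units, and gain below are invented for illustration and are not the paper's algorithm:

```python
def compensate(sensor_counts, reference_counts, gain=1.0):
    """Remove common-mode parasitic content from a capacitive channel."""
    return [gain * (s - r) for s, r in zip(sensor_counts, reference_counts)]

# a slow parasitic drift seen identically by both channels
drift = [0.5 * k for k in range(5)]
true_pressure = [10.0, 12.0, 15.0, 12.0, 10.0]   # invented pressure samples
sensor = [p + d for p, d in zip(true_pressure, drift)]
recovered = compensate(sensor, drift)
```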
The SRI Model 86 1 OC gas chromatograph (GC) is a transportable instrument that can provide on-site analysis of soils for explosives. Coupling this transportable gas chromatograph with a thermionic ionization detector (TID) allows for the determination of explosives in soil matri...
The CHANDRA X-Ray Observatory: Thermal Design, Verification, and Early Orbit Experience
NASA Technical Reports Server (NTRS)
Boyd, David A.; Freeman, Mark D.; Lynch, Nicolie; Lavois, Anthony R. (Technical Monitor)
2000-01-01
The CHANDRA X-ray Observatory (formerly AXAF), one of NASA's "Great Observatories", was launched aboard the Shuttle in July 1999. CHANDRA comprises a grazing-incidence X-ray telescope of unprecedented focal length, collecting area, and angular resolution -- better than two orders of magnitude improvement in imaging performance over any previous soft X-ray (0.1-10 keV) mission. Two focal-plane instruments, one with a 150 K passively-cooled detector, provide celestial X-ray images and spectra. Thermal control of CHANDRA includes active systems for the telescope mirror and environment and the optical bench, and largely passive systems for the focal-plane instruments. Performance testing of these thermal control systems required 1-1/2 years at increasing levels of integration, culminating in thermal-balance testing of the fully-configured observatory during the summer of 1998. This paper outlines details of thermal design tradeoffs and methods for both the Observatory and the two focal-plane instruments, describes the thermal verification philosophy of the Chandra program (what to test and at what level), and summarizes the results of the instrument, optical system, and observatory testing.
First results of the wind evaluation breadboard for ELT primary mirror design
NASA Astrophysics Data System (ADS)
Reyes García-Talavera, Marcos; Viera, Teodora; Núñez, Miguel
2010-07-01
The Wind Evaluation Breadboard (WEB) is a primary mirror and telescope simulator formed by seven aluminium segments, including position sensors, electromechanical support systems, and support structures. WEB has been developed to evaluate technologies for primary-mirror wavefront control and to evaluate the performance of the control of wind-buffeting disturbance on ELT segmented mirrors. For this purpose, the WEB electromechanical setup simulates the real operational constraints applied to large segmented mirrors. This paper describes the WEB assembly, integration, and verification; the instrument characterization; and the closed-loop control design, including the dynamical characterization of the instrument and the control architecture. The performance of the new technologies developed for position sensing, actuation, and control is evaluated. The integration of the instrument in the observatory and the results of the first experiments are summarized for different wind conditions and elevation and azimuth angles of incidence. Conclusions are drawn with respect to the wind-rejection performance and the control strategy for an ELT. WEB has been designed and developed by IAC, ESO, ALTRAN, and JUPASA, with the integration of subsystems from FOGALE and TNO.
Lin, Wen-Yen; Chou, Wen-Cheng; Tsai, Tsai-Hsuan; Lin, Chung-Chih; Lee, Ming-Yih
2016-12-17
Body posture and activity are important indices for assessing health and quality of life, especially for elderly people. Therefore, an easily wearable device or instrumented garment would be valuable for monitoring elderly people's postures and activities to facilitate healthy aging. In particular, such devices should be accepted by elderly people so that they are willing to wear it all the time. This paper presents the design and development of a novel, textile-based, intelligent wearable vest for real-time posture monitoring and emergency warnings. The vest provides a highly portable and low-cost solution that can be used both indoors and outdoors in order to provide long-term care at home, including health promotion, healthy aging assessments, and health abnormality alerts. The usability of the system was verified using a technology acceptance model-based study of 50 elderly people. The results indicated that although elderly people are anxious about some newly developed wearable technologies, they look forward to wearing this instrumented posture-monitoring vest in the future.
Atmospheric verification mission for the TSS/STARFAC tethered satellite
NASA Technical Reports Server (NTRS)
Wood, George M., Jr.; Stuart, Thomas D.; Crouch, Donald S.; Deloach, Richard; Brown, Kenneth G.
1991-01-01
Two types of tethered satellite system (TSS) are considered: a basic 1.8-m-diameter spherical spacecraft and the Shuttle Tethered Aerothermodynamic Research Facility (STARFAC). Issues related to the deployment and retrieval of a large satellite with exceedingly long tethers are discussed, and the objectives of an Atmospheric Verification Mission (AVM) are outlined. Attention is focused on the AVM satellite, which will fly after TSS-1 and before the fully instrumented and costlier TSS-2. The differences between the AVM and TSS-2, including the configuration of the aerodynamic stabilizers, the instrumentation, and the materials of construction, are outlined. The basic Kevlar tether defined for the TSS-2 is being considered for use with the AVM; however, a more complex tether is under consideration as well.
Gran Telescopio Canarias Commissioning Instrument Optomechanics
NASA Astrophysics Data System (ADS)
Espejo, Carlos; Cuevas, Salvador; Sanchez, Beatriz; Flores, Ruben; Lara, Gerardo; Farah, Alejandro; Godoy, Javier; Bringas, Vicente; Chavoya, Armando; Dorantes, Ariel; Manuel Montoya, Juan; Rangel, Juan Carlos; Devaney, Nicholas; Castro, Javier; Cavaller, Luis
2003-02-01
Under a contract with GRANTECAN, the Commissioning Instrument is a project developed by a team of Mexican scientists and engineers from the Instrumentation Department of the Astronomy Institute at the UNAM and the CIDESI Engineering Center. This paper will discuss in some detail the final Commissioning Instrument (CI) mechanical design and fabrication. We will also explain the error budget and the barrel designs, as well as their thermal compensation. The optical design and the control system are discussed in other papers. The CI will act as a diagnostic tool for image-quality verification during the GTC Commissioning Phase. This phase is a quality-control process for achieving, verifying, and documenting the performance of each GTC subsystem, and is a very important step in the telescope's life. It will begin on the first day of commissioning and will last for a year. The CI project started in December 2000. The critical design phase was reviewed in July 2001. The CI manufacturing is currently in progress and most parts are finished. We are now approaching the factory acceptance stage.
Establishing and Monitoring an Aseptic Workspace for Building the MOMA Mass Spectrometer
NASA Technical Reports Server (NTRS)
Lalime, Erin
2016-01-01
Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the ESA ExoMars 2018 Rover, and its Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). As MOMA-MS is a life-detection instrument, it falls into the most stringent category of Planetary Protection (PP) biological cleanliness requirements: less than 0.03 spore/m2 is allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low-bioburden environment. The MOMA-MS project at GSFC maintains three cleanrooms with varying levels of bioburden control. The Aseptic Assembly Cleanroom has the highest level of control, applying three different bioburden-reducing methods in rotation: 70% IPA, 7.5% hydrogen peroxide, and ultraviolet-C light. Each method kills microbes by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Cleanrooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of the cleanrooms and the verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capability to run standard growth assays for spores or vegetative bacteria and rapid bioburden analysis that detects adenosine triphosphate (ATP), plus autoclave and DHMR verification. The cleanrooms are monitored both for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic cleanroom and the other cleanrooms.
Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases
NASA Astrophysics Data System (ADS)
Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.
2013-12-01
NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a six-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and in onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions.
A series of verification checks has been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to implement modifications to the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details of the ATLAS Onboard Receiver Algorithms and the databases development, respectively.
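The onboard signal-finding step described above (histogramming photon event times and flagging statistically significant bins) can be sketched as follows. The bin width, detection threshold, and synthetic data are illustrative assumptions, not ATLAS flight parameters.

```python
import numpy as np

def find_signal_band(event_times_ns, bin_width_ns=200.0, z_thresh=5.0):
    """Locate surface-echo photons by histogramming photon event times.

    Surface returns cluster in time while background noise is roughly
    uniform, so bins whose counts exceed the noise level by a statistical
    margin are flagged as signal. Bin width and threshold are illustrative
    choices, not the flight values.
    """
    edges = np.arange(event_times_ns.min(),
                      event_times_ns.max() + bin_width_ns, bin_width_ns)
    counts, edges = np.histogram(event_times_ns, bins=edges)
    # Estimate the noise level from the median (robust to the signal spike)
    # and model bin counts as Poisson: sigma ~ sqrt(mean).
    noise_mean = np.median(counts)
    sigma = max(np.sqrt(noise_mean), 1.0)
    signal_bins = np.where(counts > noise_mean + z_thresh * sigma)[0]
    if signal_bins.size == 0:
        return None
    # Report the center of the strongest flagged bin as the signal location.
    peak = signal_bins[np.argmax(counts[signal_bins])]
    return 0.5 * (edges[peak] + edges[peak + 1])

# Synthetic check: uniform background plus a narrow surface echo near 5000 ns
rng = np.random.default_rng(0)
times = np.concatenate([rng.uniform(0, 10000, 2000),   # background noise
                        rng.normal(5000, 30, 500)])    # correlated echo
center = find_signal_band(times)
```

On this synthetic input the recovered band center falls within one bin of the injected echo; a flight implementation would additionally clip the search region to the DEM min/max heights before histogramming.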
Fiber lasers and amplifiers for science and exploration at NASA Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Krainak, Michael A.; Abshire, James; Allan, Graham R.; Stephen, Mark
2005-01-01
We discuss present and near-term uses for high-power fiber lasers and amplifiers for NASA- specific applications including planetary topography and atmospheric spectroscopy. Fiber lasers and amplifiers offer numerous advantages for both near-term and future deployment of instruments on exploration and science remote sensing orbiting satellites. Ground-based and airborne systems provide an evolutionary path to space and a means for calibration and verification of space-borne systems. We present experimental progress on both the fiber transmitters and instrument prototypes for ongoing development efforts. These near-infrared instruments are laser sounders and lidars for measuring atmospheric carbon dioxide, oxygen, water vapor and methane and a pseudo-noise (PN) code laser ranging system. The associated fiber transmitters include high-power erbium, ytterbium, neodymium and Raman fiber amplifiers. In addition, we will discuss near-term fiber laser and amplifier requirements and programs for NASA free space optical communications, planetary topography and atmospheric spectroscopy.
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
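The rule-based verification the survey describes (QC status, instrument warnings, verification limits, delta check) can be sketched as a simple release/hold decision. The analyte, limits, and thresholds below are hypothetical illustrations, not values reported by the survey.

```python
def autoverify(result, prev_result=None,
               limits=(2.5, 7.0),   # hypothetical verification limits, mmol/L
               delta_limit=1.5,     # hypothetical delta-check threshold
               qc_passed=True, instrument_flags=()):
    """Decide whether a result can be released without manual review.

    A minimal sketch of a typical autoverification rule chain: internal QC
    status, instrument warnings, verification limits, and a delta check
    against the patient's previous value.
    """
    if not qc_passed:
        return "hold: internal QC failure"
    if instrument_flags:
        return "hold: instrument warning " + ",".join(instrument_flags)
    low, high = limits
    if not (low <= result <= high):
        return "hold: outside verification limits"
    if prev_result is not None and abs(result - prev_result) > delta_limit:
        return "hold: delta check failed"
    return "release"

print(autoverify(5.2, prev_result=5.0))  # → release
print(autoverify(9.1))                   # → hold: outside verification limits
```

Real systems layer many such rules per analyte in the laboratory information system; the survey's point is that these rules and limits are not standardized across laboratories.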
[Development of a measurement of intellectual capital for hospital nursing organizations].
Kim, Eun A; Jang, Keum Seong
2011-02-01
This study was done to develop an instrument for measuring intellectual capital in hospital nursing organizations and to assess its validity and reliability in identifying the components of intellectual capital: human capital, structure capital, and customer capital. The participants were 950 regular clinical nurses who had worked for over 13 months in 7 medical hospitals, including 4 national university hospitals and 3 private university hospitals. The data were collected through a questionnaire survey conducted from July 2 to August 25, 2009. Data from 906 nurses were used for the final analysis. Data were analyzed using descriptive statistics, Cronbach's alpha coefficients, item analysis, and factor analysis (principal component analysis, Varimax rotation) with the SPSS PC+ 17.0 for Windows program. Developing the instrument involved a literature review, development of preliminary items, and verification of validity and reliability. The final instrument was a self-report form on a 5-point Likert scale, with 29 items on human capital (5 domains), 21 items on customer capital (4 domains), and 26 items on structure capital (4 domains). The results of this study may be useful for assessing the levels of intellectual capital of hospital nursing organizations.
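The reliability verification above relies on Cronbach's alpha coefficients. As a brief illustration of that statistic (not the study's SPSS procedure), alpha can be computed directly from a respondents-by-items score matrix:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score),
    where k is the number of items.
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Three perfectly consistent Likert items yield alpha = 1.0
scores = [[5, 5, 5], [4, 4, 4], [3, 3, 3], [2, 2, 2]]
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally taken as acceptable internal consistency for a new scale, which is the kind of threshold an instrument-development study like this one would report per domain.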
Optimization and Verification of a Brushless DC-Motor for Cryogenic Mechanisms
NASA Astrophysics Data System (ADS)
Eggens, M.; van Loon, D.; Smit, H. P.; Jellema, W.; Dieleman, P.; Detrain, A.; Stokroos, M.; Nieuwenhuizen, A. C. T.
2013-09-01
In this paper we report the results of an investigation into the feasibility of a cryogenic motor for a Filter Wheel Mechanism (FWM) for the SpicA FAR-infrared Instrument (SAFARI). The maximum allowed dissipation of 1 mW is a key requirement, a consequence of the limited cooling resources of the satellite. Therefore a quasi-3D electromagnetic (EM) model of a Brushless DC (BLDC) motor has been developed. To withstand the severe launch loads, a mechanical concept has been designed that limits the friction torque in the bearings. The model was verified by room-temperature and cryogenic measurements on an existing motor from the test setup, and shows that the proposed BLDC motor design fulfills the requirements.
The Earth Observing System AM Spacecraft - Thermal Control Subsystem
NASA Technical Reports Server (NTRS)
Chalmers, D.; Fredley, J.; Scott, C.
1993-01-01
Mission requirements for the EOS-AM Spacecraft intended to monitor global changes of the entire earth system are considered. The spacecraft is based on an instrument set containing the Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multiangle Imaging Spectro-Radiometer (MISR), Moderate-Resolution Imaging Spectrometer (MODIS), and Measurements of Pollution in the Troposphere (MOPITT). Emphasis is placed on the design, analysis, development, and verification plans for the unique EOS-AM Thermal Control Subsystem (TCS) aimed at providing the required environments for all the onboard equipment in a densely packed layout. The TCS design maximizes the use of proven thermal design techniques and materials, in conjunction with a capillary pumped two-phase heat transport system for instrument thermal control.
Commissioning Instrument for the GTC
NASA Astrophysics Data System (ADS)
Cuevas, S.; Sánchez, B.; Bringas, V.; Espejo, C.; Flores, R.; Chapa, O.; Lara, G.; Chavolla, A.; Anguiano, G.; Arciniega, S.; Dorantes, A.; González, J. L.; Montoya, J. M.; Toral, R.; Hernández, H.; Nava, R.; Devaney, N.; Castro, J.; Cavaller-Marqués, L.
2005-12-01
During the GTC integration phase, the Commissioning Instrument (CI) will be a diagnostic tool for performance verification. The CI features four operation modes: imaging, pupil imaging, curvature WFS, and high-resolution Shack-Hartmann WFS. The instrument was built by the Instituto de Astronomía UNAM and the Centro de Ingeniería y Desarrollo Industrial (CIDESI) under GRANTECAN contract after a public bid. In this paper we give a general instrument overview and show some of the final performance results obtained during the Factory Acceptance tests prior to its transport to La Palma.
ChemCam rock laser for Mars Science Laboratory "Curiosity"
Wiens, Roger
2018-02-06
Los Alamos has a long history of space-related instruments, tied primarily to its role in defense-related treaty verification. Space-based detectors have helped determine the differences between signals from lightning bolts and potential nuclear explosions. LANL-developed gamma-ray detection instruments first revealed the existence of what we now know as gamma-ray bursts, an exciting area of astrophysical research. And the use of LANL instruments on varied space missions continues with such products as the ChemCam rock laser for NASA, shown here. The Engineering Model of the ChemCam Mars Science Laboratory rover instrument arrived at NASA's Jet Propulsion Laboratory on February 6, 2008. The Flight Model was shipped in August, 2010 for installation on the rover at JPL. ChemCam will use imaging and laser-induced breakdown spectroscopy (LIBS) to determine rock and soil compositions on Mars, up to 9 meters from the rover. The engineering model is being integrated into the rover test bed for the development and testing of the rover software. The actual flight model components were concurrently assembled at Los Alamos and in Toulouse, France. The Mars Science Laboratory is scheduled to launch in 2011. Animations courtesy of JPL/NASA.
ChemCam rock laser for Mars Science Laboratory "Curiosity"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiens, Roger
2010-09-03
Los Alamos has a long history of space-related instruments, tied primarily to its role in defense-related treaty verification. Space-based detectors have helped determine the differences between signals from lightning bolts and potential nuclear explosions. LANL-developed gamma-ray detection instruments first revealed the existence of what we now know as gamma-ray bursts, an exciting area of astrophysical research. And the use of LANL instruments on varied space missions continues with such products as the ChemCam rock laser for NASA, shown here. The Engineering Model of the ChemCam Mars Science Laboratory rover instrument arrived at NASA's Jet Propulsion Laboratory on February 6, 2008. The Flight Model was shipped in August 2010 for installation on the rover at JPL. ChemCam will use imaging and laser-induced breakdown spectroscopy (LIBS) to determine rock and soil compositions on Mars, up to 9 meters from the rover. The engineering model is being integrated into the rover test bed for the development and testing of the rover software. The actual flight model components were concurrently assembled at Los Alamos and in Toulouse, France. The Mars Science Laboratory is scheduled to launch in 2011. Animations courtesy of JPL/NASA.
ChemCam Rock Laser for the Mars Science Laboratory
LANL
2017-12-09
Los Alamos has a long history of space-related instruments, tied primarily to its role in defense-related treaty verification. Space-based detectors have helped determine the differences between signals from lightning bolts and potential nuclear explosions. LANL-developed gamma-ray detection instruments first revealed the existence of what we now know as gamma-ray bursts, an exciting area of astrophysical research. And the use of LANL instruments on varied space missions continues with such products as the ChemCam rock laser for NASA, shown here. The Engineering Model of the ChemCam Mars Science Laboratory rover instrument arrived at NASA's Jet Propulsion Laboratory on February 6, 2008. ChemCam will use imaging and laser-induced breakdown spectroscopy (LIBS) to determine rock and soil compositions on Mars, up to 9 meters from the rover. The engineering model is being integrated into the rover test bed for the development and testing of the rover software. The actual flight model components are concurrently being assembled at Los Alamos and in Toulouse, France, and will be delivered to JPL in July. The Mars Science Laboratory is scheduled to launch in 2009. Animations courtesy of JPL/NASA.
Performance of the EGRET astronomical gamma ray telescope
NASA Technical Reports Server (NTRS)
Nolan, P. L.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hofstadter, R.; Hughes, E. B.; Hunter, S. D.; Kanbach, G.; Kniffen, D. A.; Lin, Y. C.
1992-01-01
On April 5, 1991, the Space Shuttle Atlantis carried the Compton Gamma Ray Observatory (CGRO) into orbit, deploying the satellite on April 7. The EGRET instrument was activated on April 15, and the first month of operations was devoted to verification of the instrument performance. Measurements made during that month and in the subsequent sky survey phase have verified that the instrument time resolution, angular resolution, and gamma ray detection efficiency are all within nominal limits.
Russian-US collaboration on implementation of the active well coincidence counter (AWCC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mozhajev, V.; Pshakin, G.; Stewart, J.
The feasibility of using a standard AWCC at the Obninsk IPPE has been demonstrated through active measurements of single UO{sub 2} (36% enriched) disks and through passive measurements of plutonium metal disks used for simulating reactor cores. The role of the measurements is to verify passport values assigned to the disks by the facility, and thereby facilitate the mass accountability procedures developed for the very large inventory of fuel disks at the facility. The AWCC is a very flexible instrument for verification measurements of the large variety of nuclear material items at the Obninsk IPPE and other Russian facilities. Future work at the IPPE will include calibration and verification measurements for other materials, both in individual disks and in multi-disk storage tubes; it will also include training in the use of the AWCC.
NASA Technical Reports Server (NTRS)
Fey, M. G.
1981-01-01
The experimental verification system for the production of silicon via the arc heater-sodium reduction of SiCl4 was designed, fabricated, installed, and operated. Each of the attendant subsystems was checked out and operated to ensure performance requirements were met. These subsystems included the arc heaters/reactor, cooling water system, gas system, power system, control and instrumentation system, Na injection system, SiCl4 injection system, effluent disposal system, and gas burnoff system. Prior to introducing the reactants (Na and SiCl4) to the arc heater/reactor, a series of gas-only power tests was conducted to establish the operating parameters of the system's three arc heaters. Following the successful completion of the gas-only power tests and the readiness tests of the sodium and SiCl4 injection systems, a shakedown test of the complete experimental verification system was conducted.
CSTI Earth-to-orbit propulsion research and technology program overview
NASA Technical Reports Server (NTRS)
Gentz, Steven J.
1993-01-01
NASA supports a vigorous Earth-to-orbit (ETO) research and technology program as part of its Civil Space Technology Initiative. The purpose of this program is to provide an up-to-date technology base to support future space transportation needs for a new generation of lower-cost, operationally efficient, long-lived, and highly reliable ETO propulsion systems by enhancing the knowledge, understanding, and design methodology applicable to advanced oxygen/hydrogen and oxygen/hydrocarbon ETO propulsion systems. Program areas of interest include analytical models, advanced component technology, instrumentation, and validation/verification testing. Organizationally, the program is divided into two areas: (1) technology acquisition and (2) technology verification.
Monitoring/Verification using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan Weeks, Kevin Kyle, Manuel Manard
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas-phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.
Monitoring/Verification Using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Kyle; Stephan Weeks
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.
Stennis Space Center Verification & Validation Capabilities
NASA Technical Reports Server (NTRS)
Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir
2007-01-01
Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.
EOS Aura MLS, first year post-launch engineering assessment
NASA Technical Reports Server (NTRS)
Lee, Karen A.; Lay, Richard R.; Jarnot, Robert F.; Cofield, Richard E.; Flower, Dennis A.; Pickett, Herbert M.
2005-01-01
This paper discusses the current status of the MLS instrument, which now continuously provides data to produce global maps of targeted chemical species as well as temperature, cloud ice, and gravity wave activity. Performance trends are assessed with respect to characterization during initial on-orbit activation of the instrument, and with data from ground test verification prior to launch.
NASA Technical Reports Server (NTRS)
Higgins, T.
1998-01-01
An antenna drive subsystem test was performed on the METSAT AMSU-A2 S/N 106 instrument. The objective of the test was to demonstrate compliance with applicable paragraphs of AMSU-A specifications S480-80. Tests were conducted at both the subassembly and instrument level.
Henzlova, Daniela; Menlove, Howard Olsen; Rael, Carlos D.; ...
2015-10-09
Our paper presents results of the first experimental demonstration of the Californium Interrogation Prompt Neutron (CIPN) instrument developed within a multi-year effort launched by the Next Generation Safeguards Initiative Spent Fuel Project of the United States Department of Energy. The goals of this project focused on developing viable non-destructive assay techniques with capabilities to improve an independent verification of spent fuel assembly characteristics. For this purpose, the CIPN instrument combines active and passive neutron interrogation, along with passive gamma-ray measurements, to provide three independent observables. We describe the initial feasibility demonstration of the CIPN instrument, which involved measurements of four pressurized-water-reactor spent fuel assemblies with different levels of burnup and two initial enrichments. The measurements were performed at the Post-Irradiation Examination Facility at the Korea Atomic Energy Institute in the Republic of Korea. The key aim of the demonstration was to evaluate CIPN instrument performance under realistic deployment conditions, with the focus on a detailed assessment of systematic uncertainties that are best evaluated experimentally. The measurements revealed good positioning reproducibility, as well as a high degree of insensitivity of the CIPN instrument's response to irregularities in a radial burnup profile. Systematic uncertainty of individual CIPN instrument signals due to assembly rotation was found to be <4.5%, even for assemblies with fairly extreme gradients in the radial burnup profile. Lastly, these features suggest that the CIPN instrument is capable of providing a good representation of assembly average characteristics, independent of assembly orientation in the instrument.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henzlova, Daniela; Menlove, Howard Olsen; Rael, Carlos D.
Our paper presents results of the first experimental demonstration of the Californium Interrogation Prompt Neutron (CIPN) instrument developed within a multi-year effort launched by the Next Generation Safeguards Initiative Spent Fuel Project of the United States Department of Energy. The goals of this project focused on developing viable non-destructive assay techniques with capabilities to improve an independent verification of spent fuel assembly characteristics. For this purpose, the CIPN instrument combines active and passive neutron interrogation, along with passive gamma-ray measurements, to provide three independent observables. We describe the initial feasibility demonstration of the CIPN instrument, which involved measurements of four pressurized-water-reactor spent fuel assemblies with different levels of burnup and two initial enrichments. The measurements were performed at the Post-Irradiation Examination Facility at the Korea Atomic Energy Institute in the Republic of Korea. The key aim of the demonstration was to evaluate CIPN instrument performance under realistic deployment conditions, with the focus on a detailed assessment of systematic uncertainties that are best evaluated experimentally. The measurements revealed good positioning reproducibility, as well as a high degree of insensitivity of the CIPN instrument's response to irregularities in a radial burnup profile. Systematic uncertainty of individual CIPN instrument signals due to assembly rotation was found to be <4.5%, even for assemblies with fairly extreme gradients in the radial burnup profile. Lastly, these features suggest that the CIPN instrument is capable of providing a good representation of assembly average characteristics, independent of assembly orientation in the instrument.
NASA Technical Reports Server (NTRS)
Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond;
2007-01-01
The James Webb Space Telescope (JWST) is a 6.6 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approx. 40 K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element, which contains four science instruments (SI) including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1 x 2.2 x 1.9 m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5 m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean room conditions. SI performance, including focus, pupil shear, and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans, and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.
Life cycle management of analytical methods.
Parr, Maria Kristina; Schmidt, Alexander H
2018-01-05
In modern process management, the life cycle concept is gaining importance. It focuses on the total costs of a process, from investment through operation to retirement. In recent years, interest in this concept has also grown for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification, and method transfer) and finally retirement of the method. Regulatory bodies have also increased their awareness of life cycle management for analytical methods. The International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉, "The Analytical Procedure Lifecycle," for integration into the USP. Furthermore, a growing interest in life cycle management is seen in the non-regulated environment as well. Quality-by-design based method development results in increased method robustness, which reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results. This strongly contributes to reduced costs over the method's life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.
Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder
NASA Technical Reports Server (NTRS)
Lindsey, A. E.; Pecheur, Charles
2004-01-01
AI software is often used as a means for providing greater autonomy to automated systems, making them capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
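The approach LPF takes, exhaustively exploring the states reachable by a controller embedded in a simulated environment, can be illustrated with a generic breadth-first search. The functions and the toy valve model below are hypothetical sketches, not the actual LPF interface.

```python
from collections import deque

def explore(initial_state, actions, step, is_error):
    """Breadth-first exploration of a controller-in-simulator state space.

    `step(state, action)` advances the simulated environment plus embedded
    controller and returns the next state; `is_error` flags property
    violations. Returns the shortest action trace reaching an error state,
    or None if none is reachable.
    """
    frontier = deque([(initial_state, [])])
    visited = {initial_state}
    while frontier:
        state, trace = frontier.popleft()
        if is_error(state):
            return trace
        for action in actions:
            nxt = step(state, action)
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [action]))
    return None

# Toy model: a valve that faults after being opened three times in a row
def step(state, action):
    streak = state + 1 if action == "open" else 0
    return min(streak, 3)

trace = explore(0, ["open", "close"], step, lambda s: s == 3)
print(trace)  # → ['open', 'open', 'open']
```

Real state spaces are far too large for naive enumeration, which is why tools in this family combine such search with heuristics, symmetry reduction, or guided simulation rather than plain BFS.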
Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems
NASA Technical Reports Server (NTRS)
Powell, John D.; Gilliam, David
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.
NASA Technical Reports Server (NTRS)
McGill, Matthew; Markus, Thorsten; Scott, V. Stanley; Neumann, Thomas
2012-01-01
The Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) mission is currently under development by NASA. The primary mission of ICESat-2 will be to measure elevation changes of the Greenland and Antarctic ice sheets, document changes in sea ice thickness distribution, and derive important information about the current state of the global ice coverage. To make this important measurement, NASA is implementing a new type of satellite-based surface altimetry based on sensing of laser pulses transmitted to, and reflected from, the surface. Because the ICESat-2 measurement approach is different from that used for previous altimeter missions, a high-fidelity aircraft instrument, the Multiple Altimeter Beam Experimental Lidar (MABEL), was developed to demonstrate the measurement concept and provide verification of the ICESat-2 methodology. The MABEL instrument will serve as a prototype for the ICESat-2 mission and also provides a science tool for studies of land surface topography. This paper outlines the science objectives for the ICESat-2 mission, the current measurement concept for ICESat-2, and the instrument concept and preliminary data from MABEL.
Active Interrogation for Spent Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swinhoe, Martyn Thomas; Dougan, Arden
2015-11-05
The DDA instrument for nuclear safeguards is a fast, non-destructive assay, active neutron interrogation technique using an external 14 MeV DT neutron generator for characterization and verification of spent nuclear fuel assemblies.
33 CFR 155.790 - Deck lighting.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Procedures, Equipment, and Records § 155.790 Deck lighting. (a) A self-propelled vessel with a capacity of... inadequate the OCMI or COTP may require verification by instrument of the levels of illumination. On a...
Assessing personal talent determinants in young racquet sport players: a systematic review.
Faber, Irene R; Bustin, Paul M J; Oosterveld, Frits G J; Elferink-Gemser, Marije T; Nijhuis-Van der Sanden, Maria W G
2016-01-01
Since junior performances have little predictive value for future success, other solutions are sought to assess a young player's potential. The objectives of this systematic review are (1) to provide an overview of instruments measuring personal talent determinants of young players in racquet sports, and (2) to evaluate these instruments regarding their validity for talent development. Electronic searches were conducted in PubMed, PsychINFO, Web of Knowledge, ScienceDirect and SPORTDiscus (1990 to 31 March 2014). Search terms represented tennis, table tennis, badminton and squash, the concept of talent, methods of testing and children. Thirty articles with information regarding over 100 instruments were included. Validity evaluation showed that instruments focusing on intellectual and perceptual abilities, and coordinative skills discriminate elite from non-elite players and/or are related to current performance, but their predictive validity is not confirmed. There is moderate evidence that the assessments of mental and goal management skills predict future performance. Data on instruments measuring physical characteristics prohibit a conclusion due to conflicting findings. This systematic review yielded an ambiguous end point. The lack of longitudinal studies precludes verification of the instrument's capacity to forecast future performance. Future research should focus on instruments assessing multidimensional talent determinants and their predictive value in longitudinal designs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewberry, R.; Ayers, J.; Tietze, F.
The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required confirmatory measurements for the semi-annual inventories for fifteen years using sodium iodide and high purity germanium (HpGe) gamma-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate gamma-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe gamma-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a gamma-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore, all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement.
The pass/fail criteria of reference 7 stated 'The facility will report measured values, book values, and statistical control limits for the selected items to DOE SR...', and 'The site/facility operator must develop, document, and maintain measurement methods for all nuclear material on inventory'. These new requirements exceeded SRNL's experience with prior semi-annual inventory expectations, but allowed the AD nuclear field measurement group to demonstrate its adaptability and flexibility in responding to unanticipated expectations from the DOE customer. The requirements yielded five SRNL items subject to Pu verification and two SRNL items subject to HEU verification. These items are listed and described in Table 1.
Establishing and monitoring an aseptic workspace for building the MOMA mass spectrometer
NASA Astrophysics Data System (ADS)
Lalime, Erin N.; Berlin, David
2016-09-01
Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the European Space Agency (ESA) ExoMars 2020 Rover, and the Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). MOMA-MS is a life-detection instrument and thus falls in the most stringent category of Planetary Protection (PP) biological cleanliness requirements: fewer than 0.03 spores/m2 are allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low bioburden environment. The MOMA-MS project at GSFC maintains three clean rooms with varying levels of bioburden control. The Aseptic Assembly Clean room has the highest level of control, applying three different bioburden-reducing methods: 70% Isopropyl Alcohol (IPA), 7.5% Hydrogen Peroxide, and Ultra-Violet C (UVC) light. The three methods are used in rotation, and each kills microorganisms by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Clean rooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of clean rooms and verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capabilities of standard growth assays for spore or vegetative bacteria, rapid bioburden analysis that detects Adenosine Triphosphate (ATP), plus autoclave and Dry Heat Microbial Reduction (DHMR) verification. The clean rooms are monitored for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic clean room and the other clean rooms.
Earth system dynamics: The interrelation of atmospheric, ocean and solid earth dynamics
NASA Technical Reports Server (NTRS)
Tapley, Byron D.; Asrar, Ghassem
1993-01-01
The research work performed during the time period 16 Oct. 1992 through 31 Dec. 1993 is summarized. The overall research activity, including a list of the major findings of the EOS IDS research to date, is described, the publications and presentations are listed, and a budget request for the subsequent year is attached. Specifically, the report covers: EOS panel activities; major findings of research; team member contributions; new research directions; EOS restructuring effect; changes in requirements; plans for using existing data; collaborations with other EOS and non-EOS investigations; EOS instrument team interaction; instrument development verification and validation; interaction with EOSDIS and DAAC's; team coordination; overall management; summary of response to site review questions and comments; science computing facility; and additional new research activities.
CAVE: the design of a precision metrology instrument for studying performance of KDP crystals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hibbard, R.L., LLNL
1998-03-30
A device has been developed to measure the frequency conversion performance of large aperture potassium dihydrogen phosphate (KDP) crystals. Third harmonic generation using KDP is critical to the function of the National Ignition Facility (NIF) laser. The crystals in the converter can be angularly or thermally tuned but are subject to large-aperture inhomogeneities that are functions of growth, manufacturing, and mounting. The CAVE (Crystal Alignment Verification Equipment) instrument scans the crystals in a thermally and mechanically controlled environment to determine the local peak tuning angles. The CAVE can then estimate the optimum tuning angle and conversion efficiency over the entire aperture. Coupled with other metrology techniques, the CAVE will help determine which crystal life-cycle components most affect harmonic conversion.
NASA Astrophysics Data System (ADS)
Freudling, M.; Egner, S.; Hering, M.; Carbó, F. L.; Thiele, H.
2017-09-01
The Meteosat Third Generation (MTG) Programme will ensure the future continuity and enhancement of meteorological data from geostationary orbit as currently provided by the Meteosat Second Generation (MSG) system. The industrial prime contractor for the space segment is Thales Alenia Space (France), with a core team consortium including OHB System AG (Germany).
Acceptance test procedure for the L-070 project mechanical equipment and instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loll, C.M.
1996-04-19
This document contains the acceptance test procedure for the mechanical equipment and instrumentation installed per the L-070 Project. The specific systems to be tested are the pump controls for the 3906 Lift Station and the 350-A Lift Station. In addition, verification that signals are being received by the 300 Area Treated Effluent Disposal Facility control system is also performed.
Probabilistic Sizing and Verification of Space Ceramic Structures
NASA Astrophysics Data System (ADS)
Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit
2012-07-01
Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
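The probabilistic approach described above — computing a probability of failure from the flaw distribution rather than a single allowable load — is commonly expressed with a two-parameter Weibull model, and proof testing truncates the flaw population of surviving parts. The sketch below illustrates that idea; the characteristic strength, Weibull modulus, and stresses are invented numbers, not material data from the study:

```python
import math

def weibull_failure_probability(stress, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull model for brittle (ceramic) parts:
    P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
    return 1.0 - math.exp(-volume_ratio * (stress / sigma0) ** m)

def proof_tested_pf(service_stress, proof_stress, sigma0, m):
    """Conditional failure probability for a part that already survived
    a proof test: the proof load screens out the weakest flaws."""
    pf_service = weibull_failure_probability(service_stress, sigma0, m)
    pf_proof = weibull_failure_probability(proof_stress, sigma0, m)
    return max(0.0, (pf_service - pf_proof) / (1.0 - pf_proof))

# Illustrative values only: sigma0 = 300 MPa, Weibull modulus m = 10.
pf = weibull_failure_probability(150.0, 300.0, 10.0)
print(f"P_f at 150 MPa service stress: {pf:.2e}")
```

Note how a proof test at any stress above the service stress drives the conditional failure probability of survivors to zero in this idealized model — the risk-reduction rationale the abstract mentions.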
A Study on Performance and Safety Tests of Electrosurgical Equipment.
Tavakoli Golpaygani, A; Movahedi, M M; Reza, M
2016-09-01
Modern medicine employs a wide variety of instruments with different physiological effects and measurements. Periodic verifications are routinely used in legal metrology for industrial measuring instruments. The correct operation of electrosurgical generators is essential to ensure patient safety and to manage the risks associated with the use of high- and low-frequency electrical currents on the human body. The metrological reliability of 20 electrosurgical units in six hospitals (3 private and 3 public) was evaluated in one of the provinces of Iran according to international and national standards. The results show that the HF leakage currents of ground-referenced generators are higher than those of isolated generators, that only eight units delivered acceptable output power values, and that the precision of the output power measurements was low. These results indicate a need for new and stricter regulations on periodic performance verification and medical equipment quality control programs, especially for high-risk instruments. It is also necessary to provide training courses for operating staff in the field of metrology in medicine, so that they are acquainted with the critical parameters needed to obtain accurate results with operating room equipment.
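An output-power verification of the kind performed in this study reduces to a pass/fail comparison of delivered power against the set value. The ±20% tolerance and the readings below are assumptions for illustration only; the limits of the applicable standard govern in practice:

```python
def power_within_tolerance(set_power_w, measured_power_w, tol=0.20):
    """Pass/fail check of an electrosurgical generator's delivered power
    against its set value. The 20% relative tolerance is an assumed,
    illustrative limit, not taken from the paper or a specific standard."""
    if set_power_w <= 0:
        raise ValueError("set power must be positive")
    deviation = abs(measured_power_w - set_power_w) / set_power_w
    return deviation <= tol

# Hypothetical bench readings: set power (W) -> analyzer measurement (W)
readings = {30: 27.5, 60: 46.0, 120: 118.0}
results = {p: power_within_tolerance(p, m) for p, m in readings.items()}
print(results)  # → {30: True, 60: False, 120: True}
```

The 60 W setting fails here because its relative deviation (about 23%) exceeds the assumed tolerance — the kind of out-of-tolerance result the survey reports for twelve of the twenty units.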
Protocol Gas Verification Program Audit Reports
View the full reports from 2010 and 2013 of the PGVP audits, which tested the EPA Protocol gases that are used to calibrate continuous emission monitoring systems (CEMS), and the instruments used in EPA reference methods.
42 CFR 136a.16 - Beneficiary Identification Cards and verification of tribal membership.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the charter, articles of incorporation, or other legal instruments or traditional processes of the... making their determination. (Approved by the Office of Management and Budget under control number 0915...
Climate Absolute Radiance and Refractivity Observatory (CLARREO)
NASA Technical Reports Server (NTRS)
Leckey, John P.
2015-01-01
The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a mission, led and developed by NASA, that will measure a variety of climate variables with an unprecedented accuracy to quantify and attribute climate change. CLARREO consists of three separate instruments: an infrared (IR) spectrometer, a reflected solar (RS) spectrometer, and a radio occultation (RO) instrument. The mission will contain orbiting radiometers with sufficient accuracy, including on orbit verification, to calibrate other space-based instrumentation, increasing their respective accuracy by as much as an order of magnitude. The IR spectrometer is a Fourier Transform spectrometer (FTS) working in the 5 to 50 microns wavelength region with a goal of 0.1 K (k = 3) accuracy. The FTS will achieve this accuracy using phase change cells to verify thermistor accuracy and heated halos to verify blackbody emissivity, both on orbit. The RS spectrometer will measure the reflectance of the atmosphere in the 0.32 to 2.3 microns wavelength region with an accuracy of 0.3% (k = 2). The status of the instrumentation packages and potential mission options will be presented.
Directly polished lightweight aluminum mirror
NASA Astrophysics Data System (ADS)
ter Horst, Rik; Tromp, Niels; de Haan, Menno; Navarro, Ramon; Venema, Lars; Pragt, Johan
2017-11-01
During the last ten years, Astron has been a major contractor for the design and manufacturing of astronomical instruments for space- and Earth-based observatories, such as VISIR, MIDI, SPIFFI, X-Shooter and MIRI. Driven by the need to reduce the weight of optically ultra-stiff structures, two promising techniques have been developed in recent years: ASTRON Extreme Lightweighting [1][2] for mechanical structures and an improved polishing technique for aluminum mirrors. Using a single material for both optical components and mechanical structure simplifies the design of a cryogenic instrument significantly, is very beneficial during instrument test and verification, and makes the instrument insensitive to temperature changes. Aluminum has been the main material used for cryogenic optical instruments, and optical aluminum mirrors are generally diamond turned. The application of a polishable hard top coating like nickel removes excess stray light caused by the groove pattern, but limits the degree of lightweighting of the mirrors due to the bi-metal effect. By directly polishing the aluminum mirror surface, the recent developments at Astron allow for using a non-exotic material for lightweighted yet accurate optical mirrors, with a lower surface roughness (~1 nm RMS), higher surface accuracy and reduced light scattering. This paper presents the techniques, obtained results and a global comparison with alternative lightweight mirror solutions. Recent discussions indicate possible extensions of the extreme lightweighting technology to alternative materials such as Zerodur or Silicon Carbide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lukac, Martin; Ramanathan, Nithya; Graham, Eric
2013-09-10
Black carbon (BC) emissions from traditional cooking fires and other sources are significant anthropogenic drivers of radiative forcing. Clean cookstoves present a more energy-efficient and cleaner-burning vehicle for cooking than traditional wood-burning stoves, yet many existing cookstoves reduce emissions by only modest amounts. Further research into cookstove use, fuel types, and verification of emissions is needed as adoption rates for such stoves remain low. Accelerated innovation requires techniques for measuring and verifying such cookstove performance. The overarching goal of the proposed program was to develop a low-cost, wireless instrument to provide a high-resolution profile of cookstove BC emissions and usage in the field. We proposed transferring the complexity of analysis away from the sampling hardware at the measurement site and to software at a centrally located server to easily analyze data from thousands of sampling instruments. We were able to build a low-cost field-based instrument that produces repeatable, low-cost estimates of cookstove usage, fuel estimates, and emission values with low variability. Emission values from our instrument were consistent with published ranges of emissions for similar stove and fuel types.
Ham, Y.; Kerr, P.; Sitaraman, S.; ...
2016-05-05
Here, the need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities, as diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly important and even urgent as many countries have started to transfer spent fuel to so-called "difficult-to-access" areas such as dry storage casks, reprocessing or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into "difficult-to-access" areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies without use of operator data, and further reported the validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust, as it is relatively invariant to characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17×17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development is presented, followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed at a commercial spent fuel storage site known as the Central Interim Spent Fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator-declared average burnup of the assembly as well as intra-assembly burnup levels.
The Hubble Space Telescope Scientific Instruments
NASA Technical Reports Server (NTRS)
Moore, J. V.
1986-01-01
The paper describes the status of the five Scientific Instruments (SIs) to be flown on the Hubble Space Telescope (HST), which is planned to be launched by the Space Transportation System in the last half of 1986. Concentration is on the testing experience for each of the instruments, both at the instrument level and in conjunction with the other instruments and subsystems of the HST. Since the Acceptance/Flight Qualification Program of the HST is currently underway, a description of the test and verification plans to be accomplished prior to shipment to the Kennedy Space Center (KSC) and of the pre-launch test plans is provided. The paper concludes with a brief description of anticipated orbital performance.
Instrumentation for Verification of Bomb Damage Repair Computer Code.
1981-09-01
record the data, a conventional 14-track FM analog tape recorder was retained. The unknown factors of signal duration, test duration, and signal ...Kirtland Air Force Base computer centers for more detailed analyses. In addition to the analog recorder, signal conditioning equipment and amplifiers were...necessary to allow high quality data to be recorded. An Interrange Instrumentation Group (IRIG) code generator/reader placed a coded signal on the tape
In-orbit verification of MHS spectral channels co-registration using the moon
NASA Astrophysics Data System (ADS)
Bonsignori, Roberto
2017-09-01
In-orbit verification of the co-registration of channels in a scanning microwave or infrared radiometer can in principle be done during normal in-orbit operation, by using the regular events of lunar intrusion in the instrument's cold space calibration view. A technique of data analysis based on a best fit of data across lunar intrusions has been used to check the mutual alignment of the spectral channels of the MHS instrument. MHS (Microwave Humidity Sounder) is a cross-track scanning radiometer in the millimetre-wave range flying on EUMETSAT and NOAA polar satellites, used operationally for the retrieval of atmospheric parameters in numerical weather prediction and nowcasting. This technique does not require any special operation or manoeuvre and relies only on analysis of data from the nominal scanning operation. The co-alignment of sounding channels and window channels can be evaluated by this technique, which would not be possible using earth landmarks, due to the absorption effect of the atmosphere. The analysis reported in this paper shows an achievable accuracy below 0.5 mrad against a beam width at 3 dB and spatial sampling interval of about 20 mrad. In-orbit results for the MHS instrument on Metop-B are also compared with the pre-launch instrument characterisation, showing a good correlation.
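The best-fit idea behind the lunar-intrusion technique can be illustrated with a toy estimate: each channel's boresight is located from its radiance-weighted response as the Moon crosses the space view, and the difference between channels gives the co-registration offset. The Gaussian beams and the 0.4 mrad offset below are synthetic, not MHS data or the paper's actual fitting procedure:

```python
import math

def beam_centroid(scan_angles_mrad, radiances):
    """Radiance-weighted centroid of a channel's response as the Moon
    crosses the space view: a simple estimate of that channel's boresight."""
    total = sum(radiances)
    return sum(a * r for a, r in zip(scan_angles_mrad, radiances)) / total

def co_registration_offset(angles, chan_a, chan_b):
    """Mutual alignment of two channels from the same lunar intrusion."""
    return beam_centroid(angles, chan_a) - beam_centroid(angles, chan_b)

# Synthetic Gaussian beams with a 20 mrad 3 dB width, one shifted 0.4 mrad.
angles = [i * 0.5 for i in range(-80, 81)]       # scan angle, mrad
sigma = 20.0 / 2.3548                             # FWHM -> Gaussian sigma
beam = lambda c: [math.exp(-0.5 * ((a - c) / sigma) ** 2) for a in angles]
offset = co_registration_offset(angles, beam(0.4), beam(0.0))
print(f"estimated offset: {offset:.3f} mrad")
```

Recovering a 0.4 mrad shift from beams twenty times wider is what makes sub-beamwidth accuracies of the kind reported (below 0.5 mrad) plausible: the centroid averages over many samples across the beam.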
NASA Technical Reports Server (NTRS)
Yew, Calinda; Whitehouse, Paul; Lui, Yan; Banks, Kimberly
2016-01-01
JWST Integrated Science Instruments Module (ISIM) has completed its system-level testing program at the NASA Goddard Space Flight Center (GSFC). In March 2016, ISIM was successfully delivered for integration with the Optical Telescope Element (OTE) after the successful verification of the system through a series of three cryo-vacuum (CV) tests. The first test served as a risk reduction test; the second test provided the initial verification of the fully-integrated flight instruments; and the third test verified the system in its final flight configuration. The complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. As JWST progressed through its CV testing campaign, deficiencies in the test configuration and support equipment were uncovered from one test to the next. Subsequent upgrades and modifications were implemented to improve the facility support capabilities required to achieve test requirements. This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) compares the overall facility performance and instrumentation results from the three ISIM CV tests, and (3) summarizes lessons learned from the ISIM testing campaign.
MEMS for Space Flight Applications
NASA Technical Reports Server (NTRS)
Lawton, R.
1998-01-01
Micro-Electro-Mechanical Systems (MEMS) are entering the stage of design and verification to demonstrate the utility of the technology for a wide range of applications, including sensors and actuators for military, space, medical, industrial, consumer, automotive and instrumentation products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demuth, Scott F.; Trahan, Alexis Chanel
2017-06-26
DIV of facility layout, material flows, and other information provided in the DIQ; material accountancy through an annual PIV and a number of interim inventory verifications, including UF6 cylinder identification and counting, NDA of cylinders, and DA on a sample collection of UF6; application of C/S technologies utilizing seals and tamper-indicating devices (TIDs) on cylinders, containers, storage rooms, and IAEA instrumentation to provide continuity of knowledge between inspections; and verification of the absence of undeclared material and operations, especially HEU production, through SNRIs, LFUA of cascade halls, and environmental swipe sampling.
Low level vapor verification of monomethyl hydrazine
NASA Technical Reports Server (NTRS)
Mehta, Narinder
1990-01-01
The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
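As a sketch of the master-program-plus-screen-file concept described above, the fragment below checks a discharge series against per-station criteria held in a screen file; the station number, criteria values, and units are hypothetical.

```python
# Hypothetical screen file: per-station verification criteria
# (discharge range in ft^3/s and maximum allowed sample-to-sample step).
SCREEN_FILE = {
    "03339000": {"min": 0.0, "max": 50000.0, "max_step": 5000.0},
}

def verify_discharge(station, series):
    """Return indices of samples failing the screening criteria."""
    crit = SCREEN_FILE[station]
    flagged, prev = [], None
    for i, value in enumerate(series):
        if not (crit["min"] <= value <= crit["max"]):
            flagged.append(i)          # out-of-range value
        elif prev is not None and abs(value - prev) > crit["max_step"]:
            flagged.append(i)          # implausible jump between samples
        prev = value
    return flagged

# The spike at index 2 is flagged, as is the return to normal at index 3
# (the step back down also exceeds max_step) and the negative value at 4.
flags = verify_discharge("03339000", [120.0, 130.0, 9000.0, 140.0, -5.0])
```

A real screening pass would route flagged samples to manual review rather than reject them outright, which matches the report's separation of automated checks from published, verified files.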
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify, in part, the SIFT ultrareliable aircraft computer. Topics covered include: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
EGSE customization for the Euclid NISP Instrument AIV/AIT activities
NASA Astrophysics Data System (ADS)
Franceschi, E.; Trifoglio, M.; Gianotti, F.; Conforti, V.; Andersen, J. J.; Stephen, J. B.; Valenziano, L.; Auricchio, N.; Bulgarelli, A.; De Rosa, A.; Fioretti, V.; Maiorano, E.; Morgante, G.; Nicastro, L.; Sortino, F.; Zoli, A.; Balestra, A.; Bonino, D.; Bonoli, C.; Bortoletto, F.; Capobianco, V.; Corcione, L.; Dal Corso, F.; Debei, S.; Di Ferdinando, D.; Dusini, S.; Farinelli, R.; Fornari, F.; Giacomini, F.; Guizzo, G. P.; Laudisio, F.; Ligori, S.; Mauri, N.; Medinaceli, E.; Patrizii, L.; Sirignano, C.; Sirri, G.; Stanco, L.; Tenti, M.; Valieri, C.; Ventura, S.
2016-07-01
The Near Infrared Spectro-Photometer (NISP) on board the Euclid ESA mission will be developed and tested at various levels of integration using various test equipment. The Electrical Ground Support Equipment (EGSE) is required to support the assembly, integration, verification and testing (AIV/AIT) and calibration activities at instrument level before delivery to ESA, and at satellite level, when the NISP instrument is mounted on the spacecraft. In the case of the Euclid mission, this EGSE will be provided by ESA to the NISP team in the HW/SW framework called "CCS Lite", with a possible first usage already during the Warm Electronics (WE) AIV/AIT activities. In this paper we discuss how we will customize the "CCS Lite" as required to support both the WE and instrument test activities. This customization will primarily involve building the NISP Mission Information Base (the CCS MIB tables) by gathering the relevant data from the instrument sub-units and validating these inputs through specific tools. Secondarily, it will imply developing a suitable set of test sequences, using uTOPE (an extension to the TCL scripting language, included in the CCS framework), in order to implement the foreseen test procedures. In addition and in parallel, custom interfaces will be set up between the CCS and the NI-IWS (the NISP Instrument Workstation, in use at all levels starting from the WE activities), and between the CCS and the TCC (the Telescope Control and Command Computer, to be used only during the instrument-level tests).
NASA Technical Reports Server (NTRS)
Comber, Brian; Glazer, Stuart
2012-01-01
The James Webb Space Telescope (JWST) is an upcoming flagship observatory mission scheduled to be launched in 2018. Three of the four science instruments are passively cooled to their operational temperature range of 36K to 40K, and the fourth instrument is actively cooled to its operational temperature of approximately 6K. The requirement for multiple thermal zones results in the instruments being thermally connected to five external radiators via individual high-purity aluminum heat straps. Thermal-vacuum and thermal-balance testing of the flight instruments at the Integrated Science Instrument Module (ISIM) element level will take place within a newly constructed shroud cooled by gaseous helium inside Goddard Space Flight Center's (GSFC) Space Environment Simulator (SES). The flight external radiators are not available during ISIM-level thermal-vacuum/thermal-balance testing, so they will be replaced in test with stable and adjustable thermal boundaries with identical physical interfaces to the flight radiators. Those boundaries are provided by specially designed test hardware which also measures the heat flow within each of the five heat straps to an accuracy of better than 2 mW, which is less than 5% of the minimum predicted heat-flow values. Measurement of the heat loads to this accuracy is essential to ISIM thermal-model correlation, since thermal models are more accurately correlated when temperature data are supplemented by accurate knowledge of heat flows. It also provides direct verification by test of several high-level thermal requirements. Devices that measure heat flow in this manner have historically been referred to as "Q-meters".
Perhaps the most important feature of the JWST Q-meter design is that it does not depend on the absolute accuracy of its temperature sensors, but rather on precise knowledge of the heater power required to maintain a constant temperature difference between sensors on two stages, for which a table is empirically developed during a calibration campaign in a small chamber at GSFC. This paper provides a brief review of the Q-meter design and discusses the Q-meter calibration procedure, including calibration-chamber modifications and accommodations, handling of differing conditions between calibration and usage, the calibration process itself, and the results of the tests used to determine whether the calibration is successful.
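The calibration-table idea can be sketched as a simple lookup: the heater power measured at the fixed sensor delta-T is converted to strap heat flow by interpolating in the empirically developed table. All numbers below are hypothetical placeholders, not JWST calibration data.

```python
import bisect

# Hypothetical calibration table: control-heater power (mW) needed to
# hold the fixed delta-T, tabulated against the known applied heat
# flow (mW) during the calibration campaign. Power drops as load rises.
CAL_POWER = [10.0, 8.0, 6.0, 4.0, 2.0]
CAL_FLOW  = [0.0, 50.0, 100.0, 150.0, 200.0]

def heat_flow_from_power(power_mw):
    """Linearly interpolate strap heat flow from measured heater power."""
    pts = sorted(zip(CAL_POWER, CAL_FLOW))       # ascending in power
    powers = [p for p, _ in pts]
    i = bisect.bisect_left(powers, power_mw)
    i = min(max(i, 1), len(pts) - 1)             # clamp to table edges
    (p0, q0), (p1, q1) = pts[i - 1], pts[i]
    return q0 + (q1 - q0) * (power_mw - p0) / (p1 - p0)

q = heat_flow_from_power(7.0)   # midway between the 8 mW and 6 mW entries
```

The point of this design is visible in the sketch: only the heater power enters the conversion, so absolute sensor calibration errors that are common to both stages cancel out of the measurement.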
Image quality specification and maintenance for airborne SAR
NASA Astrophysics Data System (ADS)
Clinard, Mark S.
2004-08-01
Specification, verification, and maintenance of image quality over the lifecycle of an operational airborne SAR begin with the specification for the system itself. Verification of image quality-oriented specification compliance can be enhanced by including a specification requirement that a vendor provide appropriate imagery at the various phases of the system life cycle. The nature and content of the imagery appropriate for each stage of the process depends on the nature of the test, the economics of collection, and the availability of techniques to extract the desired information from the data. At the earliest lifecycle stages, Concept and Technology Development (CTD) and System Development and Demonstration (SDD), the test set could include simulated imagery to demonstrate the mathematical and engineering concepts being implemented thus allowing demonstration of compliance, in part, through simulation. For Initial Operational Test and Evaluation (IOT&E), imagery collected from precisely instrumented test ranges and targets of opportunity consisting of a priori or a posteriori ground-truthed cultural and natural features are of value to the analysis of product quality compliance. Regular monitoring of image quality is possible using operational imagery and automated metrics; more precise measurements can be performed with imagery of instrumented scenes, when available. A survey of image quality measurement techniques is presented along with a discussion of the challenges of managing an airborne SAR program with the scarce resources of time, money, and ground-truthed data. Recommendations are provided that should allow an improvement in the product quality specification and maintenance process with a minimal increase in resource demands on the customer, the vendor, the operational personnel, and the asset itself.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several of these shortcomings of embedded-system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Developing the Cleanliness Requirements for an Organic-detection Instrument MOMA-MS
NASA Technical Reports Server (NTRS)
Perry, Radford; Canham, John; Lalime, Erin
2015-01-01
The cleanliness requirements for an organic-detection instrument, like the Mars Organic Molecule Analyzer Mass Spectrometer (MOMA-MS), on a Planetary Protection Class IVb mission can be extremely stringent. These include surface molecular and particulate cleanliness, outgassing, and bioburden. The prime contractor for the European Space Agency's ExoMars 2018 project, Thales Alenia Space Italy, provided requirements based on a standard, conservative approach to defining limits, which yielded levels that are unverifiable by standard cleanliness verification methods. Additionally, the conservative method for determining contamination surface area relies on underestimation, while the conservative bioburden surface area relies on overestimation, which results in inconsistencies in the normalized reporting. This presentation will provide a survey of the challenge of defining requirements that can be reasonably verified and still remain appropriate to the core science of the ExoMars mission.
Data Verification Tools for Minimizing Management Costs of Dense Air-Quality Monitoring Networks.
Miskell, Georgia; Salmond, Jennifer; Alavi-Shoshtari, Maryam; Bart, Mark; Ainslie, Bruce; Grange, Stuart; McKendry, Ian G; Henshaw, Geoff S; Williams, David E
2016-01-19
Aiming to minimize both the capital and maintenance costs of an extensive air-quality measurement network, we present simple statistical methods, requiring no extensive training data sets, for automated real-time verification of the reliability of data delivered by a spatially dense hybrid network of low-cost and reference ozone measurement instruments. Ozone is a pollutant with a relatively smooth spatial spread at large scale, although there can be significant small-scale variations. We take advantage of these characteristics and demonstrate detection of instrument calibration drift within a few days using a rolling 72 h comparison of hourly averaged data from the test instrument with that from suitably defined proxies. We define the required characteristics of the proxy measurements by working from a definition of the network purpose and specification: in this case, reliable determination of the proportion of hourly averaged ozone measurements above a threshold in any given day, and detection of calibration drift greater than ±30% in slope or ±5 parts-per-billion in offset. By analyzing results from an extensive deployment of low-cost instruments in the Lower Fraser Valley, we demonstrate that proxies can be established using land-use criteria and that simple statistical comparisons can identify low-cost instruments that are not stable and therefore need replacing. We propose that a minimal set of compliant reference instruments can be used to verify the reliability of data from a much more extensive network of low-cost devices.
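A caricature of the rolling 72 h drift check might look as follows: hourly test-instrument data are regressed against the proxy over each window, and windows whose fitted slope or offset fall outside the stated tolerances are flagged. The window stepping, proxy construction, and noise levels are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def drift_flags(test_hourly, proxy_hourly, slope_tol=0.30, offset_tol=5.0):
    """Flag 72 h windows whose test-vs-proxy regression indicates
    calibration drift beyond +/-30% in slope or +/-5 ppb in offset."""
    window, flags = 72, []
    for start in range(0, len(test_hourly) - window + 1, window):
        y = np.asarray(test_hourly[start:start + window], dtype=float)
        x = np.asarray(proxy_hourly[start:start + window], dtype=float)
        slope, offset = np.polyfit(x, y, 1)   # slope first, then intercept
        flags.append(bool(abs(slope - 1.0) > slope_tol
                          or abs(offset) > offset_tol))
    return flags

# Six days of synthetic hourly ozone (ppb): a healthy instrument tracks
# the proxy; a drifted one has lost half its sensitivity.
rng = np.random.default_rng(0)
proxy = 30.0 + 10.0 * np.sin(np.linspace(0.0, 12.0, 144)) + rng.normal(0.0, 1.0, 144)
healthy = proxy + rng.normal(0.0, 1.0, 144)
drifted = 0.5 * proxy + rng.normal(0.0, 1.0, 144)
```

Because ozone varies smoothly over large scales, a land-use-matched neighbour can stand in for a co-located reference, which is what makes this proxy regression workable without dense reference coverage.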
Science verification of operational aerosol and cloud products for TROPOMI on Sentinel-5 precursor
NASA Astrophysics Data System (ADS)
Lelli, Luca; Gimeno-Garcia, Sebastian; Sanders, Abram; Sneep, Maarten; Rozanov, Vladimir V.; Kokhanvosky, Alexander A.; Loyola, Diego; Burrows, John P.
2016-04-01
With the approaching launch of the Sentinel-5 Precursor (S-5P) satellite, scheduled for mid-2016, one preparatory task of the L2 working group (composed of the Institute of Environmental Physics (IUP) Bremen, the Royal Netherlands Meteorological Institute (KNMI) De Bilt, and the German Aerospace Center (DLR) Oberpfaffenhofen) has been the assessment of biases among the aerosol and cloud products that will be inferred by the respective algorithms from measurements of the platform's payload, the TROPOspheric Monitoring Instrument (TROPOMI). The instrument will measure terrestrial radiance at varying moderate spectral resolutions from the ultraviolet through the shortwave infrared. Specifically, all the operational and verification algorithms involved in this comparison exploit the sensitivity of molecular oxygen absorption (the A-band, 755-775 nm, with a resolution of 0.54 nm) to changes in optical and geometrical parameters of tropospheric scattering layers. The targeted properties are therefore aerosol layer height (ALH) and thickness (AOT), and cloud top height (CTH), thickness (COT) and albedo (CA). First, the verification of these properties was accomplished upon synchronisation of the respective forward radiative transfer models for a variety of atmospheric scenarios. Then, biases against independent techniques were evaluated with real measurements of selected GOME-2 orbits. Global seasonal bias assessment was carried out for CTH, CA and COT, whereas the verification of ALH and AOT is based on analysis of the ash plume emitted by the Icelandic volcano Eyjafjallajökull in May 2010 and of selected dust scenes off the Saharan west coast sensed by SCIAMACHY in 2009.
High resolution microwave spectrometer sounder (HIMSS), volume 1, book 2
NASA Technical Reports Server (NTRS)
1990-01-01
The following topics are presented with respect to the high resolution microwave spectrometer sounder (HIMSS) that is to be used as an instrument for NASA's Earth Observing System (EOS): (1) preliminary program plans; (2) contract end item (CEI) specification; and (3) the instrument interface description document. Under the preliminary program plans section, plans dealing with the following subject areas are discussed: spares, performance assurance, configuration management, software implementation, contamination, calibration management, and verification.
NASA Astrophysics Data System (ADS)
Malphrus, Benjamin Kevin
1990-01-01
The purpose of this study is to examine the sequence of events that led to the establishment of the NRAO, the construction and development of instrumentation, and the contributions and discovery events, and to relate the significance of these events to the evolution of the sciences of radio astronomy and cosmology. After an overview of the resources, a brief discussion of the early days of the science is given to set the stage for an examination of events that led to the establishment of the NRAO. The developmental and construction phases of the major instruments, including the 85-foot Tatel telescope, the 300-foot telescope, the 140-foot telescope, and the Green Bank Interferometer, are examined. The technical evolution of these instruments is traced and their relevance to scientific programs and discovery events is discussed. The history is told in narrative format interspersed with technical and scientific explanations. Through the use of original data, technical and scientific information of historical concern is provided to elucidate major developments and events. An interpretive discussion of selected programs, events and technological developments that epitomize the contributions of the NRAO to the science of radio astronomy is provided. Scientific programs conducted with the NRAO instruments that were significant to galactic and extragalactic astronomy are presented. NRAO research programs presented include continuum and source surveys, mapping, a high-precision verification of general relativity, and SETI programs. Cosmic phenomena investigated in these programs include galactic and extragalactic HI and HII, emission nebulae, supernova remnants, cosmic masers, giant molecular clouds, radio stars, normal and radio galaxies, and quasars. Modern NRAO instruments including the VLA and VLBA and their scientific programs are presented in the final chapter, as well as plans for future NRAO instruments such as the GBT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekechukwu, A.
This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO), and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.
EMIR, the NIR MOS and Imager for the GTC
NASA Astrophysics Data System (ADS)
Garzón, F.; EMIR Team
2016-10-01
EMIR is one of the first common-user instruments for the GTC, the 10-meter telescope operating at the Roque de los Muchachos Observatory (La Palma, Canary Islands, Spain). EMIR is being built by a consortium of Spanish and French institutes led by the Instituto de Astrofísica de Canarias (IAC). EMIR is primarily designed to be operated as a MOS in the near-IR band, but offers a wide range of observing modes, including imaging and spectroscopy, both long-slit and multi-object, in the wavelength range 0.9 to 2.5 μm. This contribution reports on the results achieved so far during the verification phase at the IAC prior to shipment of the instrument to the GTC for commissioning, due by mid-2015. EMIR is equipped with a set of three dispersive elements, one for each of the atmospheric windows J, H and K, each formed by a high-quality transmission grating embedded between two large ZnSe prisms, plus a low-resolution standard replicated grism, functional in the HK and ZJ windows in first and second dispersion orders respectively. The multi-object capability is achieved by means of the Cold Slit Unit (CSU), a cryogenic robotic reconfigurable multi-slit mask system capable of making user-specified patterns with 55 different slitlets distributed across the EMIR focal plane. We will describe the principal units and features of the EMIR instrument and the main results of the verification performed so far, with special emphasis on the NIR MOS capabilities. The development and fabrication of EMIR is funded by GRANTECAN and the Plan Nacional de Astronomía y Astrofísica (National Plan for Astronomy and Astrophysics, Spain).
Vision-based aircraft guidance
NASA Technical Reports Server (NTRS)
Menon, P. K.
1993-01-01
Early research on the development of machine vision algorithms to serve as pilot aids in aircraft flight operations is discussed. The research is useful for synthesizing new cockpit instrumentation that can enhance flight safety and efficiency. With the present work as the basis, future research will produce a low-cost instrument by integrating a conventional TV camera with off-the-shelf digitizing hardware for flight-test verification. The initial focus of the research will be on developing pilot aids for clear-night operations. The latter part of the research will examine synthetic-vision issues for poor-visibility flight operations. Both research efforts will contribute towards the high-speed civil transport aircraft program. It is anticipated that the research reported here will also produce pilot aids for conducting helicopter flight operations during emergency search and rescue. The primary emphasis of the present research effort is on near-term, flight-demonstrable technologies. This report discusses pilot aids for night landing and takeoff, and synthetic vision as an aid to low-visibility landing.
Hybrid Gamma Emission Tomography (HGET): FY16 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Smith, Leon E.; Wittman, Richard S.
2017-02-01
Current International Atomic Energy Agency (IAEA) methodologies for the verification of fresh low-enriched uranium (LEU) and mixed-oxide (MOX) fuel assemblies are volume-averaging methods that lack sensitivity to individual pins. Further, as fresh fuel assemblies become more and more complex (e.g., heavy gadolinium loading, high degrees of axial and radial variation in fissile concentration), the accuracy of current IAEA instruments degrades and measurement time increases. Particularly in light of the fact that no special tooling is required to remove individual pins from modern fuel assemblies, the IAEA needs new capabilities for the verification of unirradiated (i.e., fresh LEU and MOX) assemblies to ensure that fissile material has not been diverted. Passive gamma emission tomography has demonstrated potential to provide pin-level verification of spent fuel, but gamma-ray emission rates from unirradiated fuel are significantly lower, precluding purely passive tomography methods. The work presented here introduces the concept of Hybrid Gamma Emission Tomography (HGET) for verification of unirradiated fuels, in which a neutron source is used to actively interrogate the fuel assembly and the resulting gamma-ray emissions are imaged using tomographic methods to provide pin-level verification of fissile material concentration.
The High Level Data Reduction Library
NASA Astrophysics Data System (ADS)
Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.
2015-09-01
The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments and are used both in ESO's operational environment and by science users who receive VLT data. All the pipelines are highly instrument-specific. However, experience showed that the independently developed pipelines include significant overlap, duplication and slight variations of similar algorithms. In order to reduce the cost of development, verification and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions to be used in all future pipelines. The routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using data sets from several instruments. Once the best algorithm and error model have been chosen, we start a design and implementation phase. The coding of HDRL is done in plain C using Common Pipeline Library (CPL) functionality. HDRL adopts consistent function-naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel-quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.
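As an illustration of what an error-propagating high-level routine does (sketched here in Python rather than HDRL's actual C/CPL API), mean-combining image frames propagates independent Gaussian errors in quadrature:

```python
import numpy as np

def collapse_mean(frames, errors):
    """Mean-combine image frames with first-order error propagation.
    Illustrative sketch only, not an HDRL function."""
    frames = np.asarray(frames, dtype=float)
    errors = np.asarray(errors, dtype=float)
    mean = frames.mean(axis=0)
    # Independent Gaussian errors add in quadrature; the mean divides by N.
    err = np.sqrt((errors ** 2).sum(axis=0)) / frames.shape[0]
    return mean, err

# Two tiny 1x2 "frames" with per-pixel errors.
m, e = collapse_mean([[1.0, 2.0], [3.0, 4.0]],
                     [[0.3, 0.4], [0.4, 0.3]])
```

Carrying the error plane through every operation like this is what lets a pipeline report scientifically meaningful uncertainties on its final products instead of estimating them after the fact.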
REAL-TIME MONITORING OF DIOXINS AND OTHER ...
This project is part of EPA's EMPACT program, which was begun in 1998 and is jointly administered by EPA's Office of Research and Development, the National Center for Environmental Research and Quality Assurance (NCERQA), and the National Center for Environmental Assessment. The program was developed to provide understandable environmental information to the public in a timely manner on various issues of importance. This particular project involves development of an on-line, real-time, trace organic air toxics monitor, with special emphasis on dioxin-related compounds. Research efforts demonstrate the utility and usefulness of the Resonance Enhanced Multi-Photon Ionization (REMPI) analytical method for trace organics control, monitoring, and compliance assurance. Project objectives are to develop the REMPI instrumental method into a tool that will be used for assessment of potential dioxin sources, control and prevention of dioxin formation in known sources, and communication of facility performance. This will be accomplished through instrument development, laboratory verification, thermokinetic modelling, equilibrium modelling, statistical determinations, field validation, program publication and presentation, regulatory office support, and development of data communication/presentation procedures. For additional information on this EMPACT project, visit the website at http://www.epa.gov/appcdwww/crb/empa
Advances in Monte-Carlo code TRIPOLI-4®'s treatment of the electromagnetic cascade
NASA Astrophysics Data System (ADS)
Mancusi, Davide; Bonin, Alice; Hugot, François-Xavier; Malouch, Fadhel
2018-01-01
TRIPOLI-4® is a Monte-Carlo particle-transport code developed at CEA-Saclay (France) that is employed in the domains of nuclear-reactor physics, criticality safety, shielding/radiation protection and nuclear instrumentation. The goal of this paper is to report on current developments, validation and verification work in TRIPOLI-4 in the electron/positron/photon sector. The new capabilities and improvements concern refinements to the electron-transport algorithm, the introduction of a charge-deposition score, the new thick-target bremsstrahlung option, the upgrade of the bremsstrahlung model, and the improvement of electron angular straggling at low energy. The importance of each of these developments is illustrated by comparisons with calculations performed with other codes and with experimental data.
Engineering within the assembly, verification, and integration (AIV) process in ALMA
NASA Astrophysics Data System (ADS)
Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene
2010-07-01
The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000 m in the Chilean Atacama desert. As part of the ALMA construction phase, the Assembly, Integration and Verification (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation, and verifies that functional and performance requirements are met. This paper aims to describe those aspects related to the AIV Engineering team, its role within the 4-station AIV process, the different phases the group underwent, lessons learned, and potential space for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment, and on the first ALMA system installations. With the first antennas arriving on site, the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise, and transition towards a more routine production process.
Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered directly in this paper. It is believed that both continuous improvement and the clear definition of the AIV 4-station model were key factors in achieving the goal of bringing the antennas into a state that is characterized well enough to smoothly start commissioning activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, R.; Grace, W.
1996-07-01
This is the final report of a one-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). We won a 1994 R&D 100 Award for inventing the Bartas Iris Verification System. The system has been delivered to a sponsor and is no longer available to us. This technology can verify the identity of a person for purposes of access control, national security, law enforcement, forensics, counter-terrorism, and medical, financial, or scholastic records. The technique is non-invasive, psychologically acceptable, works in real-time, and obtains more biometric data than any other biometric except DNA analysis. This project sought to develop a new, second-generation prototype instrument.
Fourier transform spectrometer controller for partitioned architectures
NASA Astrophysics Data System (ADS)
Tamas-Selicean, D.; Keymeulen, D.; Berisford, D.; Carlson, R.; Hand, K.; Pop, P.; Wadsworth, W.; Levy, R.
The current trend in spacecraft computing is to integrate applications of different criticality levels on the same platform using no separation. This approach increases the complexity of the development, verification and integration processes, with an impact on the whole system life cycle. Researchers at ESA and NASA advocated for the use of partitioned architecture to reduce this complexity. Partitioned architectures rely on platform mechanisms to provide robust temporal and spatial separation between applications. Such architectures have been successfully implemented in several industries, such as avionics and automotive. In this paper we investigate the challenges of developing and the benefits of integrating a scientific instrument, namely a Fourier Transform Spectrometer, in such a partitioned architecture.
Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies
NASA Technical Reports Server (NTRS)
Shum, C. K.
2000-01-01
This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing accuracy for the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added products for the ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly and bathymetry models, and a study of Antarctica mass balance, which was published in Science in 1998.
NASA Technical Reports Server (NTRS)
Berendes, Todd; Sengupta, Sailes K.; Welch, Ron M.; Wielicki, Bruce A.; Navar, Murgesh
1992-01-01
A semiautomated methodology is developed for estimating cumulus cloud base heights on the basis of high spatial resolution Landsat MSS data, using various image-processing techniques to match cloud edges with their corresponding shadow edges. The cloud base height is then estimated by computing the separation distance between the corresponding generalized Hough transform reference points. The differences between the cloud base heights computed by these means and a manual verification technique are of the order of 100 m or less; accuracies of 50-70 m may soon be possible via EOS instruments.
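The cloud-shadow geometry described above reduces to simple trigonometry once an edge pair is matched. Below is a minimal sketch with hypothetical values; the function name and inputs are illustrative, not the paper's actual implementation, which matches edges via generalized Hough transform reference points:

```python
import math

def cloud_base_height(separation_m, sun_elevation_deg):
    """Cloud base height from the horizontal distance between a cloud
    edge and its matched shadow edge, given the solar elevation angle.
    (Illustrative helper; names and inputs are assumptions.)"""
    return separation_m * math.tan(math.radians(sun_elevation_deg))

# e.g. a 2 km cloud-to-shadow separation at 30 degrees solar elevation
h = cloud_base_height(2000.0, 30.0)  # ~1155 m
```

At a 30 m pixel size (Landsat MSS scale), a one-pixel error in the matched separation maps to a height error of tens of meters, consistent with the 50-100 m accuracies quoted above.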
Architectures Toward Reusable Science Data Systems
NASA Technical Reports Server (NTRS)
Moses, John Firor
2014-01-01
Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today.
A Study on Performance and Safety Tests of Electrosurgical Equipment
Tavakoli Golpaygani, A.; Movahedi, M.M.; Reza, M.
2016-01-01
Introduction: Modern medicine employs a wide variety of instruments with different physiological effects and measurements. Periodic verifications are routinely used in legal metrology for industrial measuring instruments. The correct operation of electrosurgical generators is essential to ensure patient safety and to manage the risks associated with the use of high- and low-frequency electrical currents on the human body. Material and Methods: The metrological reliability of 20 electrosurgical units in six hospitals (3 private and 3 public) was evaluated in one of the provinces of Iran according to international and national standards. Results: The results show that the HF leakage currents of ground-referenced generators are higher than those of isolated generators, that only eight units delivered acceptable output power values, and that the precision of the output power measurements was low. Conclusion: The results indicate a need for new and stringent regulations on periodic performance verification and a medical equipment quality control program, especially for high-risk instruments. It is also necessary to provide training courses for operating staff in the field of metrology in medicine, so that they are acquainted with the critical parameters needed to obtain accurate results with operating room equipment. PMID:27853725
Analysis and discussion on the experimental data of electrolyte analyzer
NASA Astrophysics Data System (ADS)
Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei
2018-06-01
In the subsequent verification of electrolyte analyzers, we found that the instruments achieve good repeatability and stability in repeated measurements over a short period of time, in line with the verification regulation's requirements for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of different manufacturers differ greatly. To identify and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, V.V.; Conley, R.; Anderson, E.H.
Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
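The calibration idea rests on a property of binary pseudo-random (maximum-length) sequences: their power spectrum is flat, so any roll-off in the measured spectrum of a BPR grating is attributable to the instrument's MTF. A minimal sketch of that property, with assumed LFSR taps (not the authors' fabrication or analysis code):

```python
import numpy as np

def mls(n_bits=10, taps=(10, 7)):
    """Maximum-length (binary pseudo-random) sequence from a Fibonacci
    LFSR. Taps (10, 7) correspond to a primitive polynomial, giving the
    full period 2**n_bits - 1."""
    state = [1] * n_bits
    seq = []
    for _ in range(2 ** n_bits - 1):
        seq.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(seq, dtype=float)

s = mls()
# Away from DC, the magnitude spectrum of an m-sequence is constant;
# dividing a measured grating spectrum by this flat ideal yields the MTF.
spectrum = np.abs(np.fft.rfft(s - s.mean()))
```

Because the ideal spectrum is known to be flat, no model of the test sample beyond its fundamental frequency is needed when extracting the instrument response.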
Assessment of the first radiances received from the VISSR Atmospheric Sounder (VAS) instrument
NASA Technical Reports Server (NTRS)
Chesters, D.; Uccellini, L. W.; Montgomery, H.; Mostek, A.; Robinson, W.
1981-01-01
The first orderly, calibrated radiances from the VAS-D instrument on the GOES-4 satellite are examined for: image quality, radiometric precision, radiation transfer verification at clear air radiosonde sites, regression retrieval accuracy, and mesoscale analysis features. Postlaunch problems involving calibration and data processing irregularities of scientific or operational significance are included. The radiances provide good visual and relative radiometric data for empirically conditioned retrievals of mesoscale temperature and moisture fields in clear air.
AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF FOUR MERCURY EMISSION SAMPLING SYSTEMS
CEMs - Tekran Instrument Corp. Series 3300 and Thermo Electron's Mercury Freedom System Continuous Emission Monitors (CEMs) for mercury are designed to determine total and/or chemically speciated vapor-phase mercury in combustion emissions. Performance for mercury CEMs are cont...
Ozone Contamination in Aircraft Cabins: Appendix B: Overview papers. Ozone destruction techniques
NASA Technical Reports Server (NTRS)
Wilder, R.
1979-01-01
The ozone filter test program and ozone instrumentation are presented. Tables on the flight tests, small-scale lab tests, and full-scale lab tests were reviewed. Design verification, flammability, vibration, accelerated contamination, life cycle, and cabin air quality are described.
40 CFR 1065.925 - PEMS preparation for field testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... 1065.925 Section 1065.925 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... purge any gaseous sampling PEMS instruments with ambient air until sampling begins to prevent system contamination from excessive cold-start emissions. (e) Conduct calibrations and verifications. (f) Operate any...
The U.S. Environmental Protection Agency (EPA), through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report describes ...
Conceptual designs of NDA instruments for the NRTA system at the Rokkasho Reprocessing Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T.K.; Klosterbuer, S.F.; Menlove, H.O.
The authors are studying conceptual designs of selected nondestructive assay (NDA) instruments for the near-real-time accounting system at the Rokkasho Reprocessing Plant (RRP) of Japan Nuclear Fuel Limited (JNFL). The JNFL RRP is a large-scale commercial reprocessing facility for spent fuel from boiling-water and pressurized-water reactors. The facility comprises two major components: the main process area to separate and produce purified plutonium nitrate and uranyl nitrate from irradiated reactor spent fuels, and the co-denitration process area to combine and convert the plutonium nitrate and uranyl nitrate into mixed oxide (MOX). The selected NDA instruments for conceptual design studies are the MOX-product canister counter, holdup measurement systems for calcination and reduction furnaces and for blenders in the co-denitration process, the isotope dilution gamma-ray spectrometer for the spent fuel dissolver solution, and unattended verification systems. For more effective and practical safeguards and material control and accounting at RRP, the authors are also studying the conceptual design for the UO3 large-barrel counter. This paper discusses the state-of-the-art NDA conceptual design and research and development activities for the above instruments.
Theoretical considerations and measurements for phoropters
NASA Astrophysics Data System (ADS)
Zhang, Jiyan; Liu, Wenli; Sun, Jie
2008-10-01
A phoropter is one of the most popular ophthalmic instruments used in current optometry practice. The quality and verification of the instrument are of the utmost importance. In 1997, the International Organization for Standardization published the first ISO standard specifying requirements for phoropters. In China, however, few standards or test methods for phoropters have been established. Research work on test methods for phoropters was carried out as early as 2004 by the China National Institute of Metrology. In this paper, the structure of phoropters is first described. Then, theoretical considerations for their optical design are analyzed. Next, a newly developed test instrument is introduced and measurements are taken. By calibration, the indication error of the instrument is within 0.05 m^-1. Finally, the measurement results show that the quality of phoropters is not as good as expected because of production and assembly errors. The optical design should be improved, especially for combinations of spherical and cylindrical lenses with higher power. In addition, the optical requirements specified in the ISO standard are found to be somewhat strict and hard to meet. A proposal for revision of this international standard was drafted and discussed at the ISO meeting held in Tokyo in 2007.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
The Sentinel-4 detectors: architecture and performance
NASA Astrophysics Data System (ADS)
Skegg, Michael P.; Hermsen, Markus; Hohn, Rüdiger; Williges, Christian; Woffinden, Charles; Levillain, Yves; Reulke, Ralf
2017-09-01
The Sentinel-4 instrument is an imaging spectrometer developed by Airbus under ESA contract in the frame of the joint European Union (EU)/ESA Copernicus program. Sentinel-4 will provide accurate measurements of trace gases from geostationary orbit, including key atmospheric constituents such as ozone, nitrogen dioxide, sulfur dioxide, and formaldehyde, as well as aerosol and cloud properties. Key to achieving these atmospheric measurements are the two CCD detectors, covering the wavelength ranges 305 nm to 500 nm (UVVIS) and 750 nm to 775 nm (NIR) respectively. The paper describes the architecture and operation of these two CCD detectors, which have an unusually high full-well capacity and a very specific architecture and read-out sequence to match the requirements of the Sentinel-4 instrument. The key performance aspects and their verification through measurement are presented, with a focus on an unusual, bi-modal dark signal generation rate observed during test.
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos
2016-11-18
This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lays on the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.
The performance evaluation of innovative and alternative environmental technologies is an integral part of the U.S. Environmental Protection Agency's (EPA) mission. Early efforts focused on evaluation technologies that supported the implementation of the Clean Air and Clean Wate...
Performance assessment of FY-3C/MERSI on early orbit
NASA Astrophysics Data System (ADS)
Hu, Xiuqing; Xu, Na; Wu, Ronghua; Chen, Lin; Min, Min; Wang, Ling; Xu, Hanlie; Sun, Ling; Yang, Zhongdong; Zhang, Peng
2014-11-01
FY-3C/MERSI incorporates some remarkable improvements over the previous MERSI instruments, including better spectral response function (SRF) consistency among the detectors within one band, an increased capability for lunar observation through the space view (SV), and improved radiometric response stability of the solar bands. During the in-orbit verification (IOV) commissioning phase, early results indicating representative MERSI performance were derived, including the signal-to-noise ratio (SNR), dynamic range, MTF, band-to-band (B2B) registration, calibration bias and instrument stability. The SNRs of the solar bands (Bands 1-4 and 6-20) were largely beyond the specifications, except for two NIR bands. The in-flight calibration and verification of these bands also rely heavily on vicarious techniques such as the China Radiometric Calibration Sites (CRCS), cross-calibration, lunar calibration, DCC calibration, stability monitoring using Pseudo-Invariant Calibration Sites (PICS), and multi-site radiance simulation. This paper gives the results of these calibration methods and of monitoring the instrument degradation during the early on-orbit period.
Safeguards by Design Challenge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alwin, Jennifer Louise
The International Atomic Energy Agency (IAEA) defines Safeguards as a system of inspection and verification of the peaceful uses of nuclear materials as part of the Nuclear Nonproliferation Treaty. The IAEA oversees safeguards worldwide. Safeguards by Design (SBD) involves the incorporation of safeguards technologies, techniques, and instrumentation during the design phase of a facility, rather than after the fact. The design challenge goals are the following: Design a system of safeguards technologies, techniques, and instrumentation for inspection and verification of the peaceful uses of nuclear materials. Cost should be minimized to work with the IAEA's limited budget. Dose to workers should always be as low as reasonably achievable (ALARA). Time is of the essence in operating facilities, and the flow of material should not be interrupted significantly. Proprietary process information in facilities may need to be protected, thus the amount of information obtained by inspectors should be the minimum required to achieve the measurement goal. Three different design challenges are then detailed: Plutonium Waste Item Measurement System, Marine-based Modular Reactor, and Floating Nuclear Power Plant (FNPP).
Execution of the Spitzer In-orbit Checkout and Science Verification Plan
NASA Technical Reports Server (NTRS)
Miles, John W.; Linick, Susan H.; Long, Stacia; Gilbert, John; Garcia, Mark; Boyles, Carole; Werner, Michael; Wilson, Robert K.
2004-01-01
The Spitzer Space Telescope is an 85-cm telescope with three cryogenically cooled instruments. Following launch, the observatory was initialized and commissioned for science operations during the in-orbit checkout (IOC) and science verification (SV) phases, carried out over a total of 98.3 days. The execution of the IOC/SV mission plan progressively established Spitzer capabilities taking into consideration thermal, cryogenic, optical, pointing, communications, and operational designs and constraints. The plan was carried out with high efficiency, making effective use of cryogen-limited flight time. One key component of the success of the plan was the pre-launch allocation of schedule reserve in the timeline of IOC/SV activities, and how it was used in flight both to cover activity redesign and growth due to continually improving spacecraft and instrument knowledge, and to recover from anomalies. This paper describes the adaptive system design and evolution, implementation, and lessons learned from IOC/SV operations. It is hoped that this information will provide guidance to future missions with similar engineering challenges.
NASA Astrophysics Data System (ADS)
Maroto, Oscar; Diez-Merino, Laura; Carbonell, Jordi; Tomàs, Albert; Reyes, Marcos; Joven-Alvarez, Enrique; Martín, Yolanda; Morales de los Ríos, J. A.; del Peral, Luis; Rodríguez-Frías, M. D.
2014-07-01
The Japanese Experiment Module (JEM) Extreme Universe Space Observatory (EUSO) will be launched and attached to the Japanese module of the International Space Station (ISS). Its aim is to observe UV photon tracks produced by ultra-high energy cosmic rays developing in the atmosphere and producing extensive air showers. The key element of the instrument is a very wide-field, very fast, large-lens telescope that can detect extreme energy particles with energies above 10^19 eV. The Atmospheric Monitoring System (AMS), comprising, among others, the Infrared Camera (IRCAM), which is the Spanish contribution, plays a fundamental role in the understanding of the atmospheric conditions in the Field of View (FoV) of the telescope. It is used to detect the temperature of clouds and to obtain the cloud coverage and cloud top altitude during the observation period of the JEM-EUSO main instrument. SENER is responsible for the preliminary design of the Front End Electronics (FEE) of the Infrared Camera, based on an uncooled microbolometer, and the manufacturing and verification of the prototype model. This paper describes the flight design drivers and key factors to achieve the target features, namely, detector biasing with electrical noise better than 100 μV from 1 Hz to 10 MHz, temperature control of the microbolometer, from 10°C to 40°C with stability better than 10 mK over 4.8 hours, low noise high bandwidth amplifier adaptation of the microbolometer output to differential input before analog to digital conversion, housekeeping generation, microbolometer control, and image accumulation for noise reduction.
It also shows the modifications implemented in the FEE prototype design to perform a trade-off of different technologies, such as the convenience of using linear or switched regulation for the temperature control, the possibility to check the camera performances when both microbolometer and analog electronics are moved further away from the power and digital electronics, and the addition of switching regulators to demonstrate the design is immune to the electrical noise the switching converters introduce. Finally, the results obtained during the verification phase are presented: FEE limitations, verification results, including FEE noise for each channel and its equivalent NETD and microbolometer temperature stability achieved, technologies trade-off, lessons learnt, and design improvement to implement in future project phases.
NASA Technical Reports Server (NTRS)
Connelly, Joseph A.; Ohl, Raymond G.; Mink, Ronald G.; Mentzell, J. Eric; Saha, Timo T.; Tveekrem, June L.; Hylan, Jason E.; Sparr, Leroy M.; Chambers, V. John; Hagopian, John G.
2003-01-01
The Infrared Multi-Object Spectrometer (IRMOS) is a facility instrument for the Kitt Peak National Observatory 4 and 2.1 meter telescopes. IRMOS is a near-IR (0.8 - 2.5 micron) spectrometer with low- to mid-resolving power (R = 300 - 3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view using a commercial Micro Electro-Mechanical Systems (MEMS) Digital Micro-mirror Device (DMD) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems. The focal reducer images the focal plane of the telescope onto the DMD field stop, and the spectrograph images the DMD onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and the ambient and cryogenic imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve to verify alignment, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides further verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides spectral lines at 546.1 nm and 1550 nm, and a CCD camera and IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard test results validate this prediction. We conclude with an instrument performance prediction for first light.
Multichannel forward scattering meter for oceanography
NASA Technical Reports Server (NTRS)
Mccluney, W. R.
1974-01-01
An instrument was designed and built that measures the light scattered at several angles in the forward direction simultaneously. The instrument relies on an optical multiplexing technique for frequency encoding of the different channels suitable for detection by a single photodetector. A Mie theory computer program was used to calculate the theoretical volume scattering function for a suspension of polystyrene latex spheres. The agreement between the theoretical and experimental volume scattering functions is taken as a verification of the calibration technique used.
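The frequency-encoding scheme can be illustrated with a toy digital model: each scattering channel is chopped at its own frequency, the sum falls on a single detector, and each channel's amplitude is recovered by correlating against its reference frequency (a digital lock-in). The frequencies and amplitudes below are assumed for illustration, not the instrument's actual multiplexing parameters:

```python
import numpy as np

fs, T = 10000.0, 1.0                  # sample rate (Hz) and record length (s)
t = np.arange(0, T, 1 / fs)
freqs = [100.0, 170.0, 250.0]         # one chopping frequency per angle
amps = [0.8, 0.3, 0.5]                # true scattered intensities (arbitrary)

# all channels summed on one photodetector
detector = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(amps, freqs))

# lock-in demodulation: correlate against each reference; cross terms
# between distinct integer frequencies average to zero over the record
recovered = [2 * np.mean(detector * np.sin(2 * np.pi * f * t)) for f in freqs]
```

The orthogonality of the reference tones is what lets a single photodetector separate the channels, which is the essence of the multiplexing technique described in the abstract.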
Ethical checklist for dental practice.
Rinchuse, D J; Rinchuse, D J; Deluzio, C
1995-01-01
A checklist for verification of unethical business practices, originally formulated by Drs. Blanchard and Peale, is adapted to dental practice. A scenario is used as a model to demonstrate the applicability of this instrument to dental practice. The instrument asks three questions with regard to an ethical dilemma: 1) Is it legal? 2) Is it fair? 3) How does it make you feel? The paper concludes that the giving of gifts to general dentists by dental specialists for the referral of patients is unethical.
In-flight calibration verification of spaceborne remote sensing instruments
NASA Astrophysics Data System (ADS)
LaBaw, Clayton C.
1990-07-01
The need to verify the performance of unattended instrumentation has been recognized since scientists began sending these instruments into hostile environments to acquire data. The sea floor and the stratosphere have been explored, and the quality and accuracy of the data obtained verified by calibrating the instrumentation in the laboratory, both prior and subsequent to deployment. The inability to make the latter measurements on deep-space missions makes the calibration verification of these instruments a unique problem.
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
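A COTS-database RVC tool of the kind described above can be sketched with a minimal schema linking requirements, traceability, verification planning, and compliance status. The table and column names below are assumptions for illustration; the ISWE project's actual schema is not given in the paper.

```python
# Minimal sketch of an RVC-style database using sqlite3 (names assumed).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE requirement (
    req_id TEXT PRIMARY KEY,
    text   TEXT NOT NULL,
    parent TEXT REFERENCES requirement(req_id)  -- requirements traceability
);
CREATE TABLE verification (
    req_id   TEXT REFERENCES requirement(req_id),
    method   TEXT,   -- e.g. test, analysis, inspection
    criteria TEXT,   -- verification success criteria
    status   TEXT    -- compliance status
);
""")
conn.execute("INSERT INTO requirement VALUES ('R1', 'Weld chamber shall hold vacuum', NULL)")
conn.execute("INSERT INTO verification VALUES ('R1', 'test', 'leak rate below spec', 'open')")

# A tailored compliance report: requirements with open verification items.
rows = conn.execute("""
    SELECT r.req_id, v.method, v.status
    FROM requirement r JOIN verification v ON v.req_id = r.req_id
    WHERE v.status != 'closed'
""").fetchall()
print(rows)  # -> [('R1', 'test', 'open')]
```

The tailored-report capability the abstract highlights corresponds to queries like the one above, run by each team member against the shared database.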
Cryo-Vacuum Testing of the Integrated Science Instrument Module for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Kimble, Randy A.; Davila, P. S.; Drury, M. P.; Glazer, S. D.; Krom, J. R.; Lundquist, R. A.; Mann, S. D.; McGuffey, D. B.; Perry, R. L.; Ramey, D. D.
2011-01-01
With delivery of the science instruments for the James Webb Space Telescope (JWST) to Goddard Space Flight Center (GSFC) expected in 2012, current plans call for the first cryo-vacuum test of the Integrated Science Instrument Module (ISIM) to be carried out at GSFC in early 2013. Plans are well underway for conducting this ambitious test, which will perform critical verifications of a number of optical, thermal, and operational requirements of the ISIM hardware at its deep cryogenic operating temperature. We describe here the facilities, goals, methods, and timeline for this important Integration & Test milestone in the JWST program.
NASA Technical Reports Server (NTRS)
1977-01-01
Two visible infrared spin scan radiometer (VISSR) instruments provided for the Geostationary Operational Environmental Satellite B and C (GOES B and C) spacecraft are described. The instruments are identical to those supplied previously; the changes made to them are summarized. A significant number of changes, primarily involving corrections of drawing errors and omissions, were also performed. All electrical changes were breadboarded (where complexity required this), incorporated into the test module, and subjected to verification of proper operation throughout the full instrument temperature range. Evaluation of the changes also included design operating safety margins to account for component variations and life.
Performance of a Microwave Bale Moisture Content Meter
USDA-ARS's Scientific Manuscript database
Measuring the moisture content of cotton bales has been a topic of intense interest in the last few years. A non-contact microwave-based bale moisture meter, Vomax 851-B (Vomax Instrumentation through Samuel Jackson, Lubbock, TX) has been commercially available but independent verification of these...
Analysis of an Indirect Neutron Signature for Enhanced UF6 Cylinder Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulisek, Jonathan A.; McDonald, Benjamin S.; Smith, Leon E.
2017-02-21
The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.
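The mean-free-path argument behind HEVA's full-volume assay can be illustrated with simple exponential attenuation: the uncollided fraction escaping from depth x falls as exp(-x / mfp). The mean-free-path values below are illustrative assumptions, not numbers from the paper.

```python
# Sketch of the penetration argument: short-mfp 186-keV gammas sample only
# the cylinder surface, while longer-mfp neutrons sample the full volume.
# Both mfp values here are assumed for illustration.
import math

def escape_fraction(depth_cm: float, mfp_cm: float) -> float:
    """Uncollided fraction surviving a slab of the given depth."""
    return math.exp(-depth_cm / mfp_cm)

GAMMA_186KEV_MFP = 0.5   # cm, assumed short mfp in solid UF6
FAST_NEUTRON_MFP = 10.0  # cm, assumed much longer mfp

for depth in (1.0, 5.0, 20.0):
    g = escape_fraction(depth, GAMMA_186KEV_MFP)
    n = escape_fraction(depth, FAST_NEUTRON_MFP)
    print(f"depth {depth:5.1f} cm: gamma {g:.2e}, neutron {n:.2f}")
```

At 20 cm depth essentially no 186-keV gammas escape while a measurable neutron fraction does, which is why the neutron signature supports a full-volume 235U assay.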
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
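The performance-validation task above (comparing simulator performance parameters against reference standards) can be sketched as a tolerance check over critical parameters. Parameter names and tolerances are illustrative assumptions, not values from the study.

```python
# Sketch of simulation performance validation: each critical parameter is
# compared with a reference value within a tolerance. Names/values assumed.

def validate(measured: dict, reference: dict, tolerance: dict) -> dict:
    """Map each parameter to True if |measured - reference| <= tolerance."""
    return {p: abs(measured[p] - reference[p]) <= tolerance[p]
            for p in reference}

reference = {"rise_time_s": 0.80, "overshoot_pct": 5.0}
tolerance = {"rise_time_s": 0.05, "overshoot_pct": 1.0}
measured  = {"rise_time_s": 0.83, "overshoot_pct": 7.2}

results = validate(measured, reference, tolerance)
print(results)  # -> {'rise_time_s': True, 'overshoot_pct': False}
```

The "sources of reference data" question in the study corresponds to where the `reference` values come from: flight data, analytic solutions, or an independent simulation.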
Investigations Into Tank Venting for Propellant Resupply
NASA Technical Reports Server (NTRS)
Hearn, H. C.; Harrison, Robert A. (Technical Monitor)
2002-01-01
Models and simulations have been developed and applied to the evaluation of propellant tank ullage venting, which is integral to one approach for propellant resupply. The analytical effort was instrumental in identifying issues associated with resupply objectives, and it was used to help develop an operational procedure to accomplish the desired propellant transfer for a particular storable bipropellant system. Work on the project was not completed, and several topics have been identified as requiring further study; these include the potential for liquid entrainment during low-g venting and thermal/freezing effects in the vent line and orifice. Verification of the feasibility of this propellant venting and resupply approach still requires additional analyses as well as testing to investigate the fluid and thermodynamic phenomena involved.
Data requirements for verification of ram glow chemistry
NASA Technical Reports Server (NTRS)
Swenson, G. R.; Mende, S. B.
1985-01-01
A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of measurements required for most answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV to visible to IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.
NASA Astrophysics Data System (ADS)
Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.; Dirgantara, Y.; Yuniarti, H.; Sapriadil, S.; Hermita, N.
2018-01-01
This study aimed to investigate the improvement in pre-service teachers' communication skills through a Higher Order Thinking Laboratory (HOT Lab) on the electric circuit topic. The research used a quasi-experimental method with a pretest-posttest control group design. The subjects were 60 Physics Education students at UIN Sunan Gunung Djati Bandung, chosen by random sampling. Communication skill data were collected using an essay-form communication skills test instrument and observation sheets. The results showed that pre-service teachers' communication skills were higher with the HOT Lab than with the verification lab, and communication skills in the HOT Lab groups were not influenced by gender. Communication skills could increase because the HOT Lab is based on problem solving, which can develop communication through hands-on activities. The study therefore concludes that the HOT Lab is more effective than the verification lab for improving pre-service teachers' communication skills on the electric circuit topic, and that gender is not related to a person's communication skills.
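One common effect measure for pretest-posttest designs of this kind is the normalized gain. The paper does not state which statistic it used, so the following is an assumed illustration with made-up group averages.

```python
# Hake's normalized gain <g> = (post - pre) / (100 - pre), for scores in
# percent. Group averages below are hypothetical, not from the study.

def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Fraction of the available improvement actually achieved."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical HOT Lab vs verification lab group averages:
print(round(normalized_gain(40.0, 70.0), 2))  # HOT Lab          -> 0.5
print(round(normalized_gain(40.0, 55.0), 2))  # verification lab -> 0.25
```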
NASA Astrophysics Data System (ADS)
Caillat, A.; Costille, A.; Pascal, S.; Rossin, C.; Vives, S.; Foulon, B.; Sanchez, P.
2017-09-01
Dark matter and dark energy mysteries will be explored by the Euclid ESA M-class space mission, which will be launched in 2020. Millions of galaxies will be surveyed through visible imagery and NIR imagery and spectroscopy in order to map the Universe in three dimensions at different evolution stages over the past 10 billion years. The massive NIR spectroscopic survey will be done efficiently by the NISP instrument thanks to the use of grisms (for "Grating pRISMs") developed under the responsibility of the LAM. In this paper, we present the verification philosophy applied to test and validate each grism before delivery to the project. The test sequence covers a large set of verifications: optical tests to validate the efficiency and WFE of the component, mechanical tests to validate its robustness to vibration, thermal tests to validate its behavior in a cryogenic environment, and a complete metrology of the assembled component. We show the test results obtained on the first grism Engineering and Qualification Model (EQM), which will be delivered to the NISP project in fall 2016.
Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Miller, Karen A.; Garner, James R.
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans.
A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community's understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.
NASA Technical Reports Server (NTRS)
1989-01-01
The design and verification requirements are defined which are appropriate to hardware at the detail, subassembly, component, and engine levels, and these requirements are correlated to the development demonstrations that provide verification that design objectives are achieved. The high pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.
ATM photoheliograph. [at a solar observatory
NASA Technical Reports Server (NTRS)
Prout, R. A.
1975-01-01
The design and fabrication of a 65 cm photoheliograph functional verification unit (FVU) installed in a major solar observatory are presented. The telescope is used in a daily program of solar observation while serving as a test bed for the development of instrumentation to be included in early space shuttle launched solar telescopes. The 65 cm FVU was designed to be mechanically compatible with the ATM spar/canister and would be adaptable to a second ATM flight utilizing the existing spar/canister configuration. An image motion compensation breadboard and a space-hardened, remotely tuned H alpha filter, as well as solar telescopes of different optical configurations or increased aperture, are discussed.
Completing the puzzle: AOLI full-commissioning fresh results and AIV innovations
NASA Astrophysics Data System (ADS)
Velasco, S.; Colodro-Conde, C.; López, R. L.; Oscoz, A.; Valdivia, J. J. F.; Rebolo, R.; Femenía, B.; King, D. L.; Labadie, L.; Mackay, C.; Muthusubramanian, B.; Pérez-Garrido, A.; Puga, M.; Rodríguez-Coira, G.; Rodríguez-Ramos, L. F.; Rodríguez-Ramos, J. M.
2017-03-01
The Adaptive Optics Lucky Imager (AOLI) is a new instrument designed to combine adaptive optics (AO) and lucky imaging (LI) techniques to deliver high spatial resolution in the visible, about 20 mas, from ground-based telescopes. Here we present details of the integration and verification phases, explaining the challenges that we have faced and the innovative and versatile solution of modular integration that we have developed for each of its subsystems. Modularity appears to be a key to opto-mechanical integration success in the extremely-large-telescope era. We present here very fresh preliminary results after the instrument's first fully-working observing run on the WHT.
NASA Technical Reports Server (NTRS)
Barsi, Julia A.
1995-01-01
The first Clouds and the Earth's Radiant Energy System (CERES) instrument will be launched in 1997 to collect data on the Earth's radiation budget. The data retrieved from the satellite will be processed through twelve subsystems. The Single Satellite Footprint (SSF) plot generator software was written to assist scientists in the early stages of CERES data analysis, producing two-dimensional plots of the footprint radiation and cloud data generated by one of the subsystems. Until the satellite is launched, however, software developers need verification tools to check their code. This plot generator will aid programmers by geolocating algorithm results on a global map.
IT Security Standards and Legal Metrology - Transfer and Validation
NASA Astrophysics Data System (ADS)
Thiel, F.; Hartmann, V.; Grottker, U.; Richter, D.
2014-08-01
Legal Metrology's requirements can be transferred into the IT security domain applying a generic set of standardized rules provided by the Common Criteria (ISO/IEC 15408). We will outline the transfer and cross validation of such an approach. As an example serves the integration of Legal Metrology's requirements into a recently developed Common Criteria based Protection Profile for a Smart Meter Gateway designed under the leadership of the Germany's Federal Office for Information Security. The requirements on utility meters laid down in the Measuring Instruments Directive (MID) are incorporated. A verification approach to check for meeting Legal Metrology's requirements by their interpretation through Common Criteria's generic requirements is also presented.
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demon-strated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as
INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...
The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s
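The accuracy and precision objectives shared by the XRF evaluations above can be sketched as two simple statistics: relative bias of the measured mean against a certified concentration, and relative standard deviation (RSD) of replicate readings. The replicate values below are made up for illustration; they are not demonstration data.

```python
# Sketch of XRF evaluation statistics: accuracy as relative bias against a
# certified value, precision as RSD of replicates. Data values assumed.
import statistics

def relative_bias_pct(measured_mean: float, certified: float) -> float:
    """Accuracy: signed bias relative to the certified concentration."""
    return 100.0 * (measured_mean - certified) / certified

def rsd_pct(replicates: list) -> float:
    """Precision: relative standard deviation of replicate readings."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate Pb readings (mg/kg) on a certified 500 mg/kg sample.
replicates = [492.0, 505.0, 498.0, 511.0]
mean = statistics.mean(replicates)
print(f"bias {relative_bias_pct(mean, 500.0):+.1f}%  RSD {rsd_pct(replicates):.1f}%")
```

In the SITE demonstrations these statistics were accumulated over the 326 prepared samples and 13 target elements to rank each instrument's performance.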
Marković, Bojan; Ignjatović, Janko; Vujadinović, Mirjana; Savić, Vedrana; Vladimirov, Sote; Karljiković-Rajić, Katarina
2015-01-01
Inter-laboratory verification of the European Pharmacopoeia (EP) monograph on the derivative spectrophotometry (DS) method, and its application to chitosan hydrochloride, was carried out on two generations of instruments (the earlier GBC Cintra 20 and the current-technology TS Evolution 300). The instruments operate with different versions of the Savitzky-Golay algorithm and different modes of generating digital derivative spectra. For the resolution power parameter, defined as the amplitude ratio A/B in the DS method EP monograph, comparable results were obtained only with the algorithm parameters of 7 smoothing points (SP) and a 2nd-degree polynomial; these provided corresponding data with the other two modes on the TS Evolution 300 (Medium digital indirect and Medium digital direct). Using the quoted algorithm parameters, the differences in percentage between the A/B amplitude-ratio averages were within the accepted criteria (±3%) for assay of a drug product in method transfer. The deviation of 1.76% for the degree-of-deacetylation assessment of chitosan hydrochloride determined on the two instruments (amplitude (1)D202; 2nd-degree polynomial and SP 9 in the Savitzky-Golay algorithm) was acceptable, since it was within the allowed criteria (±2%) for assay deviation of a drug substance in method transfer for pharmaceutical analyses. Copyright © 2015 Elsevier B.V. All rights reserved.
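For the window and polynomial order named above (7 smoothing points, 2nd-degree polynomial), the Savitzky-Golay first derivative at the window center reduces to a simple weighted sum, sum(i * y[k+i]) / (h * sum(i**2)) for i = -3..3. The sketch below shows that reduction; the signal is made up, not spectra from the monograph.

```python
# Savitzky-Golay first derivative, window 7, polyorder 2, central point.
# For a quadratic fit the derivative weights are proportional to the
# offsets i themselves, with normalization sum(i*i) = 28.

def savgol_deriv7(y, h=1.0):
    """First derivative of y via SG smoothing (window 7, polyorder 2)."""
    offsets = range(-3, 4)
    norm = sum(i * i for i in offsets)  # = 28
    out = []
    for k in range(3, len(y) - 3):      # skip edges lacking a full window
        out.append(sum(i * y[k + i] for i in offsets) / (norm * h))
    return out

# On an exactly linear signal the slope is recovered exactly:
y = [2.0 * x for x in range(10)]
print(savgol_deriv7(y))  # -> [2.0, 2.0, 2.0, 2.0]
```

Different instrument firmware can implement this filter with different normalizations or edge handling, which is one plausible source of the inter-instrument differences the study had to control for.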
NASA Astrophysics Data System (ADS)
Moses, J. F.; Jain, P.; Johnson, J.; Doiron, J. A.
2017-12-01
New Earth observation instruments are planned to enable advancements in Earth science research over the next decade. The diversity of Earth observing instruments and their observing platforms will continue to increase as new instrument technologies emerge and are deployed as part of national programs such as the Joint Polar Satellite System (JPSS), the Geostationary Operational Environmental Satellite system (GOES), and Landsat, as well as through many potential CubeSat and aircraft missions. The practical use and value of these observational data often extend well beyond their original purpose. The practicing community needs intuitive and standardized tools to enable quick, unfettered development of tailored products for specific applications and decision support systems. However, the associated data processing system can take years to develop and requires inherent knowledge and the ability to integrate increasingly diverse data types from multiple sources. This paper describes the adaptation of a large-scale data processing system built to support the JPSS algorithm calibration and validation (Cal/Val) node into a simplified science data system for rapid application. The new configurable data system reuses scalable JAVA technologies built for the JPSS Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) system to run within a laptop environment and support product generation and data processing of AURA Ozone Monitoring Instrument (OMI) science products. Of particular interest are the root requirements necessary for integrating experimental algorithms and Hierarchical Data Format (HDF) data access libraries into a science data production system. This study demonstrates the ability to reuse existing ground system technologies to support future missions with minimal changes.
SMAP Instrument Mechanical System Engineering
NASA Technical Reports Server (NTRS)
Slimko, Eric; French, Richard; Riggs, Benjamin
2013-01-01
The Soil Moisture Active Passive (SMAP) mission, scheduled for launch by the end of 2014, is being developed to measure the soil moisture and soil freeze/thaw state on a global scale over a three-year period. The accuracy, resolution, and global coverage of SMAP measurements are invaluable across many science and applications disciplines including hydrology, climate, carbon cycle, and the meteorological, environmental, and ecological applications communities. The SMAP observatory is composed of a despun bus and a spinning instrument platform that includes both a deployable 6 meter aperture low structural frequency Astromesh reflector and a spin control system. The instrument section has engendered challenging mechanical system issues associated with the antenna deployment, flexible antenna pointing in the context of a multitude of disturbances, spun section mass properties, spin control system development, and overall integration with the flight system on both mechanical and control system levels. Moreover, the multitude of organizations involved, including two major vendors providing the spin subsystem and reflector boom assembly plus the flight system mechanical and guidance, navigation, and control teams, has led to several unique system engineering challenges. Capturing the key physics associated with the function of the flight system has been challenging due to the many different domains that are applicable. Key interfaces and operational concepts have led to complex negotiations because of the large number of organizations that integrate with the instrument mechanical system. Additionally, the verification and validation concerns associated with the mechanical system have required far-reaching involvement from both the flight system and other subsystems. The SMAP instrument mechanical systems engineering issues and their solutions are described in this paper.
Analysis of key technologies for virtual instruments metrology
NASA Astrophysics Data System (ADS)
Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang
2008-12-01
Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of the software makes metrological testing of VIs difficult. Key approaches and technologies for metrology evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, supported by the computing power of the host PC. Another concern is evaluation of software characteristics such as correctness, reliability, stability, security and real-time behavior. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed framework.
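The simulation-based uncertainty evaluation the abstract describes — exercising a measurement chain on modeled inputs and studying the statistics of the results — can be sketched as a simple Monte Carlo experiment. The noise model and all numerical values below are illustrative assumptions, not taken from the paper:

```python
import random
import statistics

def vi_measurement(true_value, noise_sd, rng):
    """Simulate one reading from a hypothetical VI channel:
    the true value plus zero-mean Gaussian acquisition noise."""
    return true_value + rng.gauss(0.0, noise_sd)

def monte_carlo_uncertainty(true_value, noise_sd, n_trials=100_000, seed=1):
    """Estimate the mean indication and the standard uncertainty
    contributed by the modeled acquisition chain by repeated simulation."""
    rng = random.Random(seed)
    readings = [vi_measurement(true_value, noise_sd, rng) for _ in range(n_trials)]
    return statistics.mean(readings), statistics.stdev(readings)

# e.g. a 5.000 V stimulus with an assumed 20 mV noise floor
mean, u = monte_carlo_uncertainty(5.0, 0.02)
```

In a real evaluation the single Gaussian term would be replaced by a model of the actual algorithm under test; the Monte Carlo scaffolding stays the same.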
40 CFR 1065.376 - Chiller NO2 penetration.
Code of Federal Regulations, 2010 CFR
2010-07-01
... CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications NOX and N2O Measurements § 1065.376... measurement instrument, but you don't use an NO2-to-NO converter upstream of the chiller, you must perform... after major maintenance. (b) Measurement principles. A chiller removes water, which can otherwise...
The L2000DX Analyzer (dimensions: 9 x 9.5 x 4.25 in.) is a field-portable ion-specific electrode instrument, weighing approximately 5 lb 12 oz, designed to quantify concentrations of PCBs, chlorinated solvents, and pesticides in soils, water, transformer oils, and surface wipes. ...
Drinking water distribution systems reach the majority of American homes, business and civic areas, and are therefore an attractive target for terrorist attack via direct contamination, or backflow events. Instrumental monitoring of such systems may be used to signal the prese...
The Jet REMPI (Resonance Enhanced Multiphoton Ionization) monitor was tested on a hazardous waste firing boiler for its ability to determine concentrations of polychlorinated dibenzodioxins and dibenzofurans (PCDDs/Fs). Jet REMPI is a real time instrument capable of highly selec...
Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.
ERIC Educational Resources Information Center
Kaya, Azmi
1982-01-01
Discusses analytical design and experimental verification of a PID control valve for a temperature-controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
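The PID design itself is not reproduced in this record; a minimal discrete PID loop driving a hypothetical first-order liquid-temperature plant can illustrate the idea. All gains and plant constants here are assumed for the sketch, not from the paper:

```python
class PID:
    """Minimal discrete PID controller (positional form)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical first-order plant: temperature responds to applied heat
# with a lag, and relaxes toward a 20 degC ambient.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
temp = 20.0          # initial liquid temperature, degC
setpoint = 60.0
for _ in range(2000):                       # 200 s of simulated time
    heat = pid.update(setpoint, temp)
    temp += 0.1 * (0.05 * heat - 0.01 * (temp - 20.0))
```

The integral term drives the steady-state error to zero despite the constant heat loss to ambient, which is the property the experimental verification in the paper would be checking.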
Electronic Noses for Environmental Monitoring Applications
Capelli, Laura; Sironi, Selena; Rosso, Renato Del
2014-01-01
Electronic nose applications in environmental monitoring are nowadays of great interest, because of the instruments' proven capability of recognizing and discriminating between a variety of different gases and odors using just a small number of sensors. Such applications in the environmental field include analysis of parameters relating to environmental quality, process control, and verification of the efficiency of odor control systems. This article reviews the findings of recent scientific studies in this field, with particular focus on the abovementioned applications. In general, these studies prove that electronic noses are mostly suitable for the different applications reported, especially if the instruments are specifically developed and fine-tuned. Literature studies also discuss the critical aspects of the different possible uses, as well as research into the development of effective solutions. However, the main limits to the wider adoption of electronic noses as environmental monitoring tools are currently their complexity and the lack of specific regulation for their standardization, since their use entails a large number of degrees of freedom regarding, for instance, training and data-processing procedures. PMID:25347583
NASA Astrophysics Data System (ADS)
Ahlers, B.; Hutchinson, I.; Ingley, R.
2017-11-01
A spectrometer for combined Raman and Laser Induced Breakdown Spectroscopy (LIBS) is amongst the different instruments that have been pre-selected for the Pasteur payload of the ExoMars rover. It is regarded as a fundamental, next-generation instrument for organic, mineralogical and elemental characterisation of Martian soil, rock samples and organic molecules. Raman spectroscopy and LIBS will be integrated into a single instrument sharing many hardware commonalities [1]. The combined Raman / LIBS instrument has been recommended as the highest priority mineralogy instrument to be included in the rover's analytical laboratory for the following tasks: Analyse surface and sub-surface soil and rocks on Mars, identify organics in the search for life and determine soil origin & toxicity. The synergy of the system is evident: the Raman spectrometer is dedicated to molecular analysis of organics and minerals; the LIBS provides information on the sample's elemental composition. An international team, under ESA contract and with the leadership of TNO Science and Industry, has built and tested an Elegant Bread Board (EBB) of the combined Raman / LIBS instrument. The EBB comprises a specifically designed, extremely compact, spectrometer with high resolution over a large wavelength range, suitable for both Raman spectroscopy and LIBS measurements. The EBB also includes lasers, illumination and imaging optics as well as fibre optics for light transfer. A summary of the functional and environmental requirements together with a description of the optical design and its expected performance are described in [2]. The EBB was developed and constructed to verify the instruments' end-to-end functional performance with natural samples. The combined Raman / LIBS EBB realisation and test results of natural samples will be presented. 
For the Flight Model (FM) instrument, currently in the design phase, the Netherlands will be responsible for the design, development and verification of the spectrometer unit, while the UK provides the detector. The differences between the EBB and the FM will be demonstrated.
NASA Technical Reports Server (NTRS)
Glazer, Stuart; Comber, Brian (Inventor)
2016-01-01
The James Webb Space Telescope is a large infrared telescope with a 6.5-meter primary mirror, designed as a successor to the Hubble Space Telescope when launched in 2018. Three of the four science instruments contained within the Integrated Science Instrument Module (ISIM) are passively cooled to their operational temperature range of 36K to 40K with radiators, and the fourth instrument is actively cooled to its operational temperature of approximately 6K. Thermal-vacuum testing of the flight science instruments at the ISIM element level has taken place in three separate highly challenging and extremely complex thermal tests within a gaseous helium-cooled shroud inside Goddard Space Flight Center's Space Environment Simulator. Special data acquisition software was developed for these tests to monitor over 1700 flight and test sensor measurements, track over 50 gradients, component rates, and temperature limits in real time against defined constraints and limitations, and guide the complex transition from ambient to final cryogenic temperatures and back. This extremely flexible system has proven highly successful in safeguarding the nearly $2B science payload during the 3.5-month-long thermal tests. Heat-flow measurement instruments, or Q-meters, were also specially developed for these tests. These devices provide thermal boundaries to the flight hardware while measuring instrument heat loads up to 600 mW with an estimated uncertainty of 2 mW in test, enabling accurate thermal model correlation, hardware design validation, and workmanship verification. The high-accuracy heat load measurements provided first evidence of a potentially serious hardware design issue that was subsequently corrected.
This paper provides an overview of the ISIM-level thermal-vacuum tests and thermal objectives; explains the thermal test configuration and thermal balances; describes special measurement instrumentation and monitoring and control software; presents key test thermal results; lists problems encountered during testing and lessons learned.
Verification of Ceramic Structures
NASA Astrophysics Data System (ADS)
Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit
2012-07-01
In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
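The Weibull transfer from elementary coupon data to a full-scale structure that the guideline addresses can be illustrated with a weakest-link sketch: failure probability scales with stressed volume, so a large structure must be derated relative to the coupons. The characteristic strength, Weibull modulus, volumes, and target reliability below are invented for illustration:

```python
import math

def failure_probability(stress, sigma0, m, volume, v0=1.0):
    """Two-parameter Weibull failure probability with volume scaling
    (weakest-link model): Pf = 1 - exp(-(V/V0) * (stress/sigma0)**m)."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

def allowable_stress(pf_target, sigma0, m, volume, v0=1.0):
    """Invert the model: the stress giving a target failure probability
    for a structure of the given effective volume."""
    return sigma0 * (-math.log(1.0 - pf_target) * v0 / volume) ** (1.0 / m)

# Assumed coupon data: sigma0 = 300 MPa, m = 10, reference volume 1 cm^3.
# A 1000 cm^3 structure derated to keep Pf at 1e-4:
s = allowable_stress(1e-4, 300.0, 10.0, 1000.0)
```

The same inversion is what lets elementary test statistics be carried to a full-scale structure, which is the step the proposed methodology formalizes.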
Dental equipment test during zero-gravity flight
NASA Technical Reports Server (NTRS)
Young, John; Gosbee, John; Billica, Roger
1991-01-01
The overall objectives of this program were to establish performance criteria and develop prototype equipment for use in the Health Maintenance Facility (HMF) in meeting the needs of dental emergencies during space missions. The primary efforts during this flight test were to test patient-operator relationships, patient (manikin) restraint and positioning, task lighting systems, use and operation of dental rotary instruments, the suction and particle containment system, dental hand instrument delivery and control procedures, and the use of dental treatment materials. The initial efforts during the flight focused on verification of the efficiency of the particle containment system. An absorptive barrier was also tested in lieu of the suction collector. To test the instrument delivery system, teeth in the manikin were prepared with the dental drill to receive restorations, some with temporary filling materials and another with definitive filling material (composite resin). The best particle containment came from the combination use of the laminar-air/suction collector in concert with immediate area suction from a surgical high-volume suction tip. Lighting in the treatment area was provided by a flexible fiberoptic probe. This system is quite effective for small areas, but for general tasks ambient illumination is required. The instrument containment system (elastic cord network) was extremely effective and easy to use. The most serious problem with instrument delivery and actual treatment was lack of time during the microgravity sequences. The restorative materials handled and finished well.
von Lühmann, Alexander; Herff, Christian; Heger, Dominic; Schultz, Tanja
2015-01-01
Brain-Computer Interfaces (BCIs) and neuroergonomics research have high requirements regarding robustness and mobility. Additionally, fast applicability and customization are desired. Functional Near-Infrared Spectroscopy (fNIRS) is an increasingly established technology with a potential to satisfy these conditions. EEG acquisition technology, currently one of the main modalities used for mobile brain activity assessment, is widely spread and open for access and thus easily customizable. fNIRS technology on the other hand has either to be bought as a predefined commercial solution or developed from scratch using published literature. To help reducing time and effort of future custom designs for research purposes, we present our approach toward an open source multichannel stand-alone fNIRS instrument for mobile NIRS-based neuroimaging, neuroergonomics and BCI/BMI applications. The instrument is low-cost, miniaturized, wireless and modular and openly documented on www.opennirs.org. It provides features such as scalable channel number, configurable regulated light intensities, programmable gain and lock-in amplification. In this paper, the system concept, hardware, software and mechanical implementation of the lightweight stand-alone instrument are presented and the evaluation and verification results of the instrument's hardware and physiological fNIRS functionality are described. Its capability to measure brain activity is demonstrated by qualitative signal assessments and a quantitative mental arithmetic based BCI study with 12 subjects. PMID:26617510
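Lock-in amplification, listed among the instrument's features, recovers a weak modulated optical signal at a known carrier frequency by quadrature demodulation and low-pass filtering. A software sketch of the principle follows; the sample rate, frequencies, and amplitudes are arbitrary test values and not the openNIRS design:

```python
import math

def lockin_amplitude(signal, fs, f_ref):
    """Digital lock-in detection: multiply by quadrature references at
    the modulation frequency and low-pass by averaging over the record."""
    n = len(signal)
    i = sum(s * math.cos(2 * math.pi * f_ref * k / fs) for k, s in enumerate(signal))
    q = sum(s * math.sin(2 * math.pi * f_ref * k / fs) for k, s in enumerate(signal))
    return 2.0 * math.hypot(i, q) / n

# Synthetic detector signal: a 1 kHz modulated component of amplitude 0.5
# buried under a larger 3 kHz interferer, sampled at 20 kHz for 0.1 s.
fs, n = 20000.0, 2000
sig = [0.5 * math.cos(2 * math.pi * 1000 * k / fs)
       + 0.8 * math.cos(2 * math.pi * 3000 * k / fs) for k in range(n)]
amp = lockin_amplitude(sig, fs, 1000.0)
```

Because the interferer is orthogonal to the reference over the averaging window, the recovered amplitude is that of the 1 kHz component alone — the same mechanism that lets a multichannel fNIRS device separate simultaneously driven light sources.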
Completing and sustaining IMS network for the CTBT Verification Regime
NASA Astrophysics Data System (ADS)
Meral Ozel, N.
2015-12-01
The CTBT International Monitoring System will comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations have environmental, logistical and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable and state-of-the-art network covering 4 technologies: seismic, infrasound, hydroacoustic and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing its investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for onsite inspection purposes, optimization of Beta-Gamma detectors for xenon detection, assessing and improving the efficiency of wind noise reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
V Yashchuk; R Conley; E Anderson
Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1] and [2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
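The binary pseudo-random patterns behind BPR gratings and BPRML samples are typically maximal-length sequences, whose flat power spectrum is what makes them useful as MTF calibration targets. A minimal generator is sketched below; the tap choice is one standard primitive configuration for a 7-bit register, not necessarily the one used for the actual samples:

```python
def lfsr_sequence(nbits, taps, seed=1):
    """One full period (2**nbits - 1) of a maximal-length binary
    pseudo-random sequence from a Fibonacci LFSR.  `taps` lists
    feedback exponents; they must come from a primitive polynomial
    for the sequence to be maximal."""
    period = 2 ** nbits - 1
    state = seed
    out = []
    for _ in range(period):
        out.append(state & 1)          # output the low bit
        fb = 0
        for t in taps:
            fb ^= (state >> (nbits - t)) & 1
        state = (state >> 1) | (fb << (nbits - 1))
    return out

# A 7-bit register with taps (7, 6) yields a 127-element sequence
# with the balance property (64 ones, 63 zeros).
seq = lfsr_sequence(7, (7, 6))
```

Each 0/1 then maps to one of two layer materials (or grating states) at a fixed fundamental pitch, giving the pseudo-randomly distributed layers described above.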
NASA Technical Reports Server (NTRS)
Amason, David L.
2008-01-01
The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus. It is a high-fidelity test bed for Integration & Test (I&T), flight software, and flight operations. For I&T purposes, FlatSat will be used to develop and dry-run electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing of the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics, and the S-band communications card. FlatSat will also benefit the flight operations team through post-launch development and verification of flight software code and table updates, and verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access roles and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and the data tables needed to store the results, and attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
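One of the conveniences mentioned — automatically produced crosstab reports over collected form records — reduces to a small grouping routine. The field names and records below are made up for illustration and are not from the Starr County instrument:

```python
from collections import Counter

def crosstab(records, row_key, col_key):
    """Two-way frequency table from collected form records,
    returned as {row_value: {col_value: count}}."""
    counts = Counter((r[row_key], r[col_key]) for r in records)
    rows = sorted({r for r, _ in counts})
    cols = sorted({c for _, c in counts})
    return {r: {c: counts.get((r, c), 0) for c in cols} for r in rows}

# hypothetical survey responses: does the student work, and what grade
records = [
    {"works": "yes", "grade": "11"},
    {"works": "no",  "grade": "11"},
    {"works": "yes", "grade": "12"},
    {"works": "yes", "grade": "11"},
]
table = crosstab(records, "works", "grade")
```

Generating such summaries directly from the stored records is what removes the separate data-entry and verification steps the paper credits the approach with eliminating.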
Verifax: Biometric instruments measuring neuromuscular disorders/performance impairments
NASA Astrophysics Data System (ADS)
Morgenthaler, George W.; Shrairman, Ruth; Landau, Alexander
1998-01-01
VeriFax, founded in 1990 by Dr. Ruth Shrairman and Mr. Alex Landau, began operations with the aim of developing a biometric tool for the verification of signatures from a distance. In the course of developing this VeriFax Autograph technology, two other related applications for the technologies under development at VeriFax became apparent. The first application was in the use of biometric measurements as clinical monitoring tools for physicians investigating neuromuscular diseases (embodied in VeriFax's Neuroskill technology). The second application was to evaluate persons with critical skills (e.g., airline pilots, bus drivers) for physical and mental performance impairments caused by stress, physiological disorders, alcohol, drug abuse, etc. (represented by VeriFax's Impairoscope prototype instrument). This last application raised the possibility of using a space-qualified Impairoscope variant to evaluate astronaut performance with respect to the impacts of stress, fatigue, excessive workload, build-up of toxic chemicals within the space habitat, etc. The three applications of VeriFax's patented technology are accomplished by application-specific modifications of the customized VeriFax software. Strong commercial market potentials exist for all three VeriFax technology applications, and market progress will be presented in more detail below.
NASA Technical Reports Server (NTRS)
Salomonson, Vincent V.
1991-01-01
The Moderate Resolution Imaging Spectrometer (MODIS) is a key observing facility to be flown on the Earth Observing System (EOS). The facility is composed of two instruments called MODIS-N (nadir) and MODIS-T (tilt). The MODIS-N is being built under contract to NASA by the Santa Barbara Research Center. The MODIS-T is being fabricated by the Engineering Directorate at the Goddard Space Flight Center. The MODIS Science Team has defined nearly 40 biogeophysical data products for studies of the ocean and land surface and properties of the atmosphere including clouds that can be expected to be produced from the MODIS instruments shortly after the launch of EOS. The ocean, land, atmosphere, and calibration groups of the MODIS Science Team are now proceeding to plan and implement the operations and facilities involving the analysis of data from existing spaceborne, airborne, and in-situ sensors required to develop and validate the algorithms that will produce the geophysical data products. These algorithm development and validation efforts will be accomplished wherever possible within the context of existing or planned national and international experiments or programs such as those in the World Climate Research Program.
NASA Technical Reports Server (NTRS)
DiPirro, M.; Homan, J.; Havey, K.; Ousley, W.
2017-01-01
The James Webb Space Telescope (JWST) is the largest cryogenic instrument telescope to be developed for space flight. The telescope will be passively cooled to 50 K and the instrument package will be at 40 K with the mid-infrared instrument at 6 K. The final cryogenic test of the Optical Telescope Element (OTE) and Integrated Science Instrument Module (ISIM) as an assembly (OTE + ISIM OTIS) will be performed in the largest 15 K chamber in the world, Chamber A at Johnson Space Center. The planned duration of this test will be 100 days in the middle of 2017. Needless to say, this ultimate test of OTIS, the cryogenic portion of JWST will be crucial in verifying the end-to-end performance of JWST. A repeat of this test would not only be expensive, but would delay the launch schedule (currently October 2018). Therefore a series of checkouts and verifications of the chamber and ground support equipment were planned and carried out between 2012 and 2016. This paper will provide a top-level summary of those tests, trades in coming up with the test plan, as well as some details of individual issues that were encountered and resolved in the course of testing.
A Method For The Verification Of Wire Crimp Compression Using Ultrasonic Inspection
NASA Technical Reports Server (NTRS)
Cramer, K. E.; Perey, Daniel F.; Yost, William T.
2010-01-01
The development of a new ultrasonic measurement technique to quantitatively assess wire crimp terminations is discussed. The amplitude change of a compressional ultrasonic wave propagating at right angles to the wire axis and through the junction of a crimp termination is shown to correlate with the results of a destructive pull test, which is a standard for assessing crimp wire junction quality. To demonstrate the technique, the case of incomplete compression of crimped connections is ultrasonically tested, and the results are correlated with pull tests. Results show that the nondestructive ultrasonic measurement technique consistently predicts good crimps when the ultrasonic transmission is above a certain threshold amplitude level. A quantitative measure of the quality of the crimped connection based on the ultrasonic energy transmitted is shown to respond accurately to crimp quality. A wave propagation model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process. The model agrees with the ultrasonic measurements to within 6%. A prototype instrument for applying this technique while wire crimps are installed is also presented. The instrument is based on a two-jaw type crimp tool suitable for butt-splice type connections. A comparison of the results of two different instruments shows reproducibility between instruments within a 95% confidence bound.
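The accept/reject logic implied by the threshold finding — transmission above a certain amplitude predicts a good crimp — can be sketched as below. The calibration pairs (pull force, transmitted amplitude), the pull-force spec, and the midpoint threshold rule are all invented for illustration, not data from the paper:

```python
def pick_threshold(calibration, min_pull):
    """Choose an amplitude threshold midway between the weakest
    acceptable crimp and the strongest unacceptable one, based on
    destructive pull-test calibration data."""
    good = [amp for pull, amp in calibration if pull >= min_pull]
    bad = [amp for pull, amp in calibration if pull < min_pull]
    return (min(good) + max(bad)) / 2.0

def classify_crimp(amplitude_mv, threshold_mv):
    """Nondestructive pass/suspect decision from the transmitted
    ultrasonic amplitude."""
    return "good" if amplitude_mv >= threshold_mv else "suspect"

# hypothetical (pull force in N, transmitted amplitude in mV) pairs
calibration = [(4.2, 180.0), (8.9, 420.0), (9.4, 455.0), (3.1, 150.0)]
threshold = pick_threshold(calibration, min_pull=8.0)
labels = [classify_crimp(amp, threshold) for _, amp in calibration]
```

In the prototype instrument this decision would run in-process, as each crimp is installed, rather than against a destructive sample.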
Sediq, Amany Mohy-Eldin; Abdel-Azeez, Ahmad GabAllahm Hala
2014-01-01
The current practice in Zagazig University Hospitals Laboratories (ZUHL) is manual verification of all results before release of reports. This process is time consuming and tedious, with large inter-individual variation that slows the turnaround time (TAT). Autoverification is the process of comparing patient results, generated from interfaced instruments, against laboratory-defined acceptance parameters. This study describes an autoverification engine designed and implemented in ZUHL, Egypt, in a descriptive study conducted from January 2012 to December 2013. A rule-based system was used in designing the engine, which was preliminarily evaluated on a thyroid function panel. A total of 563 rules were written and tested on 563 simulated cases and 1673 archived cases. The engine's decisions were compared to those of 4 independent expert reviewers, and the impact of engine implementation on TAT was evaluated. Agreement was achieved among the 4 reviewers in 55.5% of cases, and with the engine in 51.5% of cases. The autoverification rate for archived cases was 63.8%. Reported lab TAT was reduced by 34.9%, and the TAT segment from completion of analysis to verification was reduced by 61.8%. The developed rule-based autoverification system has a verification rate comparable to that of commercially available software, while its in-house development saved the hospital the cost of a commercial product. Implementation of the system shortened the TAT and minimized the number of samples needing staff revision, enabling laboratory staff to devote more time and effort to problematic test results and to improving the quality of patient care.
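A rule-based engine of this kind reduces each analyte to a set of laboratory-defined checks — reportable-range limits, auto-release limits, and a delta check against the patient's previous result. The sketch below, including the rule values, is a generic illustration and not the ZUHL rule set:

```python
def autoverify(result, rules, previous=None):
    """Apply laboratory-defined acceptance rules to one analyte result.
    Returns (True, []) to auto-release, or (False, [reasons]) to
    queue the result for manual review."""
    reasons = []
    lo, hi = rules["reportable_range"]
    if not (lo <= result <= hi):
        reasons.append("outside reportable range")
    lo, hi = rules["auto_release_range"]
    if not (lo <= result <= hi):
        reasons.append("outside auto-release limits")
    if previous is not None and abs(result - previous) > rules["delta_limit"]:
        reasons.append("delta check failed")
    return (len(reasons) == 0, reasons)

# Illustrative TSH rules (assumed values, not clinical guidance)
tsh_rules = {
    "reportable_range": (0.01, 100.0),
    "auto_release_range": (0.3, 10.0),
    "delta_limit": 5.0,
}
ok, why = autoverify(2.1, tsh_rules, previous=1.8)         # auto-released
flagged, why2 = autoverify(15.0, tsh_rules, previous=2.0)  # held for review
```

Each flagged result carries its reasons forward, so reviewers see why the engine held it — the workflow the TAT reductions above depend on.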
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
Verification of the Sentinel-4 focal plane subsystem
NASA Astrophysics Data System (ADS)
Williges, Christian; Uhlig, Mathias; Hilbert, Stefan; Rossmann, Hannes; Buchwinkler, Kevin; Babben, Steffen; Sebastian, Ilse; Hohn, Rüdiger; Reulke, Ralf
2017-09-01
The Sentinel-4 payload is a multi-spectral camera system, designed to monitor atmospheric conditions over Europe from a geostationary orbit. The German Aerospace Center, DLR Berlin, conducted the verification campaign of the Focal Plane Subsystem (FPS) during the second half of 2016. The FPS consists of two Focal Plane Assemblies (FPAs), two Front End Electronics (FEEs), one Front End Support Electronic (FSE) and one Instrument Control Unit (ICU). The FPAs are designed for two spectral ranges: UV-VIS (305 nm - 500 nm) and NIR (750 nm - 775 nm). In this publication, we will present in detail the set-up of the verification campaign of the Sentinel-4 Qualification Model (QM). This set-up will also be used for the upcoming Flight Model (FM) verification, planned for early 2018. The FPAs have to be operated at 215 K +/- 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. The test campaign consists mainly of radiometric tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously over a distance of approximately 1 m from outside the TVC. Selected test analyses and results will be presented.
In-orbit commissioning of the NIRSpec instrument on the James Webb Space Telescope
NASA Astrophysics Data System (ADS)
Böker, T.; Muzerolle, J.; Bacinski, J.; Alves de Oliveira, C.; Birkmann, S.; Ferruit, P.; Karl, H.; Lemke, R.; Lützgendorf, N.; Marston, A.; Mosner, P.; Rawle, T.; Sirianni, M.
2016-07-01
The James Webb Space Telescope (JWST), scheduled for launch in 2018, promises to revolutionize observational astronomy, due to its unprecedented sensitivity at near and mid-infrared wavelengths. Following launch, a ~6 month long commissioning campaign aims to verify the observatory performance. A key element in this campaign is the verification and early calibration of the four JWST science instruments, one of which is the Near-Infrared Spectrograph (NIRSpec). This paper summarizes the objectives of the NIRSpec commissioning campaign, and outlines the sequence of activities needed to achieve these objectives.
NASA Astrophysics Data System (ADS)
Ohno, M.; Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y.; Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K.; Wada, Y.; Nakazawa, K.; Mimura, T.; Kataoka, J.; Ichinohe, Y.; Uchida, Y.; Katsuragawa, M.; Yoneda, H.; Sato, G.; Sato, R.; Kawaharada, M.; Harayama, A.; Odaka, H.; Hayashi, K.; Ohta, M.; Watanabe, S.; Kokubun, M.; Takahashi, T.; Takeda, S.; Kinoshita, M.; Yamaoka, K.; Tajima, H.; Yatsu, Y.; Uchiyama, H.; Saito, S.; Yuasa, T.; Makishima, K.; ASTRO-H HXI/SGD Team
2016-09-01
The Hard X-ray Imager (HXI) and Soft Gamma-ray Detector (SGD) onboard ASTRO-H provide high sensitivity to hard X-rays (5-80 keV) and soft gamma-rays (60-600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick bismuth germanate (BGO) scintillators. We have developed the signal-processing system for the avalanche photodiodes in the BGO active shields and have demonstrated its effectiveness after assembly in the flight models of the HXI/SGD sensors and after integration into the satellite. The energy threshold achieved is about 150 keV, and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal-processing system successfully reduces the room-background level of the main detector.
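The anti-coincidence scheme described above, vetoing main-detector events that coincide with a BGO shield hit, can be sketched in a few lines. The timestamps and the 5-microsecond window below are illustrative assumptions, not ASTRO-H flight parameters:

```python
# Sketch of anti-coincidence rejection: main-detector events that fall
# within a short time window of a shield hit are vetoed as background.
import bisect

def veto_events(main_times, shield_times, window=5e-6):
    """Return main-detector event times with no shield hit within +/- window."""
    shield_times = sorted(shield_times)
    kept = []
    for t in main_times:
        # First shield hit at or after the start of the coincidence window
        i = bisect.bisect_left(shield_times, t - window)
        coincident = i < len(shield_times) and shield_times[i] <= t + window
        if not coincident:
            kept.append(t)
    return kept

events = [1.0, 2.0, 3.0]
shield = [2.000002]          # shield hit coincident with the 2.0 s event
print(veto_events(events, shield))  # [1.0, 3.0]
```

The same window-search idea scales to high rates because the shield list is sorted once and each event needs only a binary search.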
Large Volume, Optical and Opto-Mechanical Metrology Techniques for ISIM on JWST
NASA Technical Reports Server (NTRS)
Hadjimichael, Theo
2015-01-01
The final, flight build of the Integrated Science Instrument Module (ISIM) element of the James Webb Space Telescope is the culmination of years of work across many disciplines and partners. This paper covers the large volume, ambient, optical and opto-mechanical metrology techniques used to verify the mechanical integration of the flight instruments in ISIM, including optical pupil alignment. We present an overview of ISIM's integration and test program, which is in progress, with an emphasis on alignment and optical performance verification. This work is performed at NASA Goddard Space Flight Center, in close collaboration with the European Space Agency, the Canadian Space Agency, and the Mid-Infrared Instrument European Consortium.
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
Microgravity Acceleration Measurement System (MAMS) Flight Configuration Verification and Status
NASA Technical Reports Server (NTRS)
Wagar, William
2000-01-01
The Microgravity Acceleration Measurement System (MAMS) is a precision spaceflight instrument designed to measure and characterize the microgravity environment existing in the US Lab Module of the International Space Station. Both vibratory and quasi-steady triaxial acceleration data are acquired and provided to an Ethernet data link. The MAMS Double Mid-Deck Locker (DMDL) EXPRESS Rack payload meets all the ISS IDD and ICD interface requirements, as discussed in the paper, which also presents flight configuration illustrations. The overall MAMS sensor and data acquisition performance and verification data are presented, in addition to a discussion of the Command and Data Handling features implemented via the ISS downlink and the GRC Telescience Center displays.
Methods for identification and verification using vacuum XRF system
NASA Technical Reports Server (NTRS)
Kaiser, Bruce (Inventor); Schramm, Fred (Inventor)
2005-01-01
Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and provide detection by a non-line-of-sight method to establish the origin of objects, as well as their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower-atomic-number elements in the field with a portable instrument.
24 CFR 960.259 - Family information and verification.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...
The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.
ERIC Educational Resources Information Center
National Evaluation Systems, Inc., Amherst, MA.
National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…
24 CFR 960.259 - Family information and verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...
NASA Technical Reports Server (NTRS)
Yechout, T. R.; Braman, K. B.
1984-01-01
The development, implementation, and flight test evaluation of a performance modeling technique that required only a limited amount of quasi-steady-state flight test data to predict the overall one-g performance characteristics of an aircraft are described. The concept definition phase of the program included development of: (1) the relationships for defining aerodynamic characteristics from quasi-steady-state maneuvers; (2) a simplified in-flight thrust and airflow prediction technique; (3) a flight test maneuvering sequence which efficiently provided definition of baseline aerodynamic and engine characteristics, including power effects on lift and drag; and (4) the algorithms necessary for cruise and flight trajectory predictions. Implementation of the concept included design of the overall flight test data flow, definition of the instrumentation system and ground test requirements, development and verification of all applicable software, and consolidation of the overall requirements in a flight test plan.
75 FR 82140 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... post-test verification. Two or three teams of verifiers will then be sent out with field instruments to... test pilot project beginning April 1, 2011, for a period of up to 1 year on the main tracks between... nonstop continuous rail test, CSX will not perform parallel/redundant start/stop rail testing on track...
77 FR 8325 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... defective locations for post-test verification. Verifiers will then be sent out with field instruments to... III of its nonstop continuous rail test pilot project beginning April 1, 2012, for a period of up to 1... nonstop continuous rail test, CSX will not perform parallel or redundant start/stop rail testing on track...
Survey of L Band Tower and Airborne Sensor Systems Relevant to Upcoming Soil Moisture Missions
USDA-ARS?s Scientific Manuscript database
Basic research on the physics of microwave remote sensing of soil moisture has been conducted for almost thirty years using ground-based (tower- or truck-mounted) microwave instruments at L band frequencies. Early small point-scale studies were aimed at improved understanding and verification of mi...
Validation and Verification of Composite Pressure Vessel Design
NASA Technical Reports Server (NTRS)
Kreger, Stephen T.; Ortyl, Nicholas; Grant, Joseph; Taylor, F. Tad
2006-01-01
Ten composite pressure vessels were instrumented with fiber Bragg grating sensors and pressure tested through burst. This paper and presentation discuss the testing methodology and the test results, compare the test results to the analytical model, and compare the fiber Bragg grating sensor data against that obtained from foil strain gages.
Interpreter composition issues in the formal verification of a processor-memory module
NASA Technical Reports Server (NTRS)
Fura, David A.; Cohen, Gerald C.
1994-01-01
This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
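The Method of Manufactured Solutions named in this abstract can be illustrated on a much smaller problem than the LAVA solver addresses. The sketch below is a stand-alone assumption rather than LAVA code: it manufactures u = sin(pi x) for a 1D Poisson problem and checks that a second-order finite-difference solver converges at the expected rate under grid refinement:

```python
import numpy as np

def solve_poisson(n):
    # Second-order finite-difference solve of -u'' = f on (0,1), u(0)=u(1)=0.
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    # Forcing manufactured from the chosen exact solution u = sin(pi x)
    f = np.pi**2 * np.sin(np.pi * x[1:-1])
    # Standard tridiagonal stiffness matrix for the interior unknowns
    A = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)
    return x, u

def mms_error(n):
    # Max-norm error against the manufactured solution
    x, u = solve_poisson(n)
    return np.max(np.abs(u - np.sin(np.pi * x)))

# Halving h should cut the error ~4x for a second-order scheme
e1, e2 = mms_error(32), mms_error(64)
order = np.log2(e1 / e2)
print(f"observed order of accuracy: {order:.2f}")  # ≈ 2
```

The same refinement logic, applied to full Euler or Navier-Stokes discretizations with a sufficiently smooth manufactured field, is what a solver-level MMS verification automates.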
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
24 CFR 5.512 - Verification of eligible immigration status.
Code of Federal Regulations, 2010 CFR
2010-04-01
... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...
The space shuttle launch vehicle aerodynamic verification challenges
NASA Technical Reports Server (NTRS)
Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.
1985-01-01
The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance through flight measurements. Historically, launch vehicle flight test programs that faced these same challenges were unmanned, instrumented flights of simple aerodynamically shaped vehicles. The manned SSV flight test program, however, made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRBs). The analyses of flight data did not verify the preflight aerodynamic or performance predictions for the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community was also challenged to understand the discrepancy between the wind-tunnel-defined and flight-defined aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analysis challenges that led to the SSV system performance verification, and that will lead to verification of the operational ascent aerodynamics database, are presented.
Built-in-Test Verification Techniques
1987-02-01
This report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was... Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was... two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities whose results are fed back into the project. Verification includes analytical and test activities that demonstrate the capability of each SRM component, subassembly, and assembly to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced, along with the controls that will monitor and track the verification program. Integral to the design and certification of the SRM is other equipment used in transportation, handling, and testing that influences the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
24 CFR 985.3 - Indicators, HUD verification methods and ratings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...
Verification test report on a solar heating and hot water system
NASA Technical Reports Server (NTRS)
1978-01-01
Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiency, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.
24 CFR 1000.128 - Is income verification required for assistance under NAHASDA?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Is income verification required for assistance under NAHASDA? 1000.128 Section 1000.128 Housing and Urban Development Regulations Relating to... § 1000.128 Is income verification required for assistance under NAHASDA? (a) Yes, the recipient must...
A study on Korean nursing students' educational outcomes
Oh, Kasil; Lee, Hyang-Yeon; Lee, Sook-Ja; Kim, In-Ja; Choi, Kyung-Sook; Ko, Myung-Sook
2011-01-01
The purpose of this study was to describe outcome indicators of nursing education, including critical thinking, professionalism, leadership, and communication, and to evaluate differences among nursing programs and academic years. A descriptive research design was employed. A total of 454 students from four-year baccalaureate (BS) nursing programs and two three-year associate degree (AD) programs consented to complete self-administered questionnaires. The variables were critical thinking, professionalism, leadership, and communication. Descriptive statistics, the χ2-test, t-tests, ANOVA, and the Tukey test were utilized for the data analysis. All the mean scores of the variables were above average for the test instruments utilized. Among the BS students, those in the upper classes tended to attain higher scores, but this tendency was not identified in AD students. There were significant differences between BS students and AD students in the mean scores of leadership and communication. These findings suggest the need for further research to define properties of nursing educational outcomes and to develop standardized instruments for research replication and verification. PMID:21602914
NASA Astrophysics Data System (ADS)
Brázdil, R.; Büntgen, U.; Dobrovolný, P.; Trnka, M.; Kyncl, T.
2010-09-01
Precipitation is one of the most important meteorological elements for natural processes as well as for human society. Its long-term fluctuations in the Czech Lands (the present Czech Republic) can be studied using long instrumental series (Brno since January 1803, Prague-Klementinum since May 1804), a tree-ring chronology from southern Moravian fir (Abies alba Mill.) developed from living and historical trees (since A.D. 1376), and monthly precipitation indices derived from documentary evidence (from A.D. 1500). The analysis focuses on May-June precipitation and drought patterns represented by the Z-index for the past 500 years, these being the months to which the tree-ring chronology shows the highest response in the calibration/verification period between 1803 and 1932. Tree-ring-based and documentary-based May-June Z-index reconstructions, explaining ca. 30-40% of its variability, are compared with existing reconstructions of hydroclimatic patterns of the Central European region. Uncertainties of the tree-ring and documentary datasets and the corresponding reconstructions are discussed.
NASA Astrophysics Data System (ADS)
Lall, U.; Allaire, M.; Ceccato, P.; Haraguchi, M.; Cian, F.; Bavandi, A.
2017-12-01
Catastrophic floods can pose a significant challenge for response and recovery. A key bottleneck in the speed of response is the availability of funds to a country's or region's finance ministry to mobilize resources. Parametric instruments, where the release of funds is tied to the exceedance of a specified index or threshold rather than to loss verification, are well suited for this purpose. However, designing an appropriate index that is not subject to manipulation and accurately reflects the need is a challenge, especially in developing countries, which have short hydroclimatic and loss records and where rapid land-use change has led to significant changes in exposure and hydrology over time. The use of long records of rainfall from climate re-analyses, and of flooded area and land use from remote sensing, to design and benchmark a parametric index considering the uncertainty and representativeness of potential loss is explored, with applications to Bangladesh and Thailand. Prospects for broader applicability and limitations are discussed.
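A parametric trigger of the kind described, funds released when an index exceeds a threshold rather than after loss verification, can be expressed as a simple payout function. The attachment/exhaustion structure and all numbers below are hypothetical illustrations, not the instrument designed in the study:

```python
# Hypothetical parametric payout: funds release once a flood index (e.g. a
# satellite-derived flooded-area fraction) exceeds an attachment threshold,
# scaling linearly up to an exhaustion point where the full limit is paid.
def parametric_payout(index, attachment, exhaustion, limit):
    """Payout (currency units) for an observed index value."""
    if index <= attachment:
        return 0.0
    frac = min((index - attachment) / (exhaustion - attachment), 1.0)
    return frac * limit

# Example: payout attaches at 25% flooded area, exhausts at 60%, $50M limit
print(parametric_payout(0.40, 0.25, 0.60, 50e6))  # partial payout
print(parametric_payout(0.70, 0.25, 0.60, 50e6))  # full limit
```

Benchmarking such an index then amounts to checking, over a historical record, how often the trigger fires relative to years of actual severe loss (basis risk).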
A Novel Data System for Verification of Internal Parameters of Motor Design
NASA Technical Reports Server (NTRS)
Smith, Doug; Saint Jean, Paul; Everton, Randy; Uresk, Bonnie
2003-01-01
Three major obstacles have limited the amount of information that can be obtained from inside an operating solid rocket motor. The first is a safety issue due to the presence of live propellant interacting with classical, electrical instrumentation. The second is a pressure-vessel feed-through risk arising from bringing a large number of wires through the rocket motor wall safely. The third is an attachment/protection issue associated with connecting gages to live propellant. Thiokol has developed a highly miniaturized, networked, electrically isolated data system that has safely delivered information from classical, electrical instrumentation (even on the burning propellant surface) to the outside world. This system requires only four wires to deliver 80 channels of data at 2300 samples/second/channel. The feed-through leak-path risk is greatly reduced from the current situation, where each gage requires at least three pressure-vessel wire penetrations. The external electrical isolation of the system is better than that of the propellant itself. This paper describes the new system.
NASA Astrophysics Data System (ADS)
de Lera Acedo, E.; Bolli, P.; Paonessa, F.; Virone, G.; Colin-Beltran, E.; Razavi-Ghods, N.; Aicardi, I.; Lingua, A.; Maschio, P.; Monari, J.; Naldi, G.; Piras, M.; Pupillo, G.
2018-03-01
In this paper we present the electromagnetic modeling and beam pattern measurements of a 16-element ultra-wideband sparse random test array for the low-frequency instrument of the Square Kilometer Array telescope. We discuss the importance of a small array test platform for the development of technologies and techniques towards the final telescope, highlighting the most relevant aspects of its design. We also describe the electromagnetic simulations and modeling work as well as the embedded-element and array pattern measurements using an Unmanned Aerial Vehicle (UAV) system. The latter are helpful both for the validation of the models and the design and for the future instrumental calibration of the telescope, thanks to the stable, accurate and strong radio-frequency signal transmitted by the UAV. At this stage of the design, these measurements have shown a general agreement between experimental results and numerical data and have revealed the localized effect of uncalibrated cable lengths in the inner side-lobes of the array pattern.
Cryogen-free operation of the Soft X-ray Spectrometer instrument
NASA Astrophysics Data System (ADS)
Sneiderman, Gary A.; Shirron, Peter J.; Fujimoto, Ryuichi; Bialas, Thomas G.; Boyce, Kevin R.; Chiao, Meng P.; DiPirro, Michael J.; Eckart, Megan E.; Hartz, Leslie; Ishisaki, Yoshitaka; Kelley, Richard L.; Kilbourne, Caroline A.; Masters, Candace; McCammon, Dan; Mitsuda, Kazuhisa; Noda, Hirofumi; Porter, Frederick S.; Szymkowiak, Andrew E.; Takei, Yoh; Tsujimoto, Masahiro; Yoshida, Seiji
2016-07-01
The Soft X-ray Spectrometer (SXS) is the first space-based instrument to implement redundancy in the operation of a sub-Kelvin refrigerator. The SXS cryogenic system consists of a superfluid helium tank and a combination of Stirling and Joule-Thomson (JT) cryocoolers that support the operation of a 3-stage adiabatic demagnetization refrigerator (ADR). When liquid helium is present, the x-ray microcalorimeter detectors are cooled to their 50 mK operating temperature by two ADR stages, which reject their heat directly to the liquid at 1.1 K. When the helium is depleted, all three ADR stages are used to accomplish detector cooling while rejecting heat to the JT cooler operating at 4.5 K. Compared to the simpler helium-mode operation, the cryogen-free mode achieves the same instrument performance by controlling the active cooling devices within the cooling system differently. These include the three ADR stages and four active heat switches, provided by NASA, and five cryocoolers, provided by JAXA. Development and verification details of this capability are presented within this paper and offer valuable insights into the challenges, successes, and lessons that can benefit other missions, particularly those employing cryogen-free cooling systems.
Make the World Safer from Nuclear Weapons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Ted
Senior Nuclear Scientist Ted Bowyer knows firsthand the challenges associated with protecting our nation. Ted and his colleagues help detect the proliferation of nuclear weapons. They developed award-winning technologies that give international treaty verification authorities “eyes and ears” around the globe. The instruments, located in 80 countries, help ensure compliance with the Comprehensive Nuclear-Test-Ban Treaty, or CTBT. They are completely automated radionuclide monitoring systems that would detect airborne radioactive particles if a nuclear detonation occurred in the air, underground or at sea. Some samples collected through these technologies are sent to PNNL's Shallow Underground Laboratory, the only certified U.S. radionuclide laboratory for the CTBT's International Monitoring System Organization.
Composite Overwrapped Pressure Vessel (COPV) Stress Rupture Testing
NASA Technical Reports Server (NTRS)
Greene, Nathanael J.; Saulsberry, Regor L.; Leifeste, Mark R.; Yoder, Tommy B.; Keddy, Chris P.; Forth, Scott C.; Russell, Rick W.
2010-01-01
This paper reports stress rupture testing of Kevlar® composite overwrapped pressure vessels (COPVs) at NASA White Sands Test Facility. This 6-year test program was part of the larger effort to predict and extend the lifetime of flight vessels. Tests were performed to characterize control parameters for stress rupture testing, and vessel life was predicted by statistical modeling. One highly instrumented 102-cm (40-in.) diameter Kevlar® COPV was tested to failure (burst) as a single-point model verification. Significant data were generated that will enhance development of improved NDE methods and predictive modeling techniques, and thus better address stress rupture and other composite durability concerns that affect pressure vessel safety, reliability and mission assurance.
Measurement Sets and Sites Commonly Used for High Spatial Resolution Image Product Characterization
NASA Technical Reports Server (NTRS)
Pagnutti, Mary
2006-01-01
Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site has enabled the in-flight characterization of satellite high spatial resolution remote sensing system products from Space Imaging IKONOS, Digital Globe QuickBird, and ORBIMAGE OrbView, as well as advanced multispectral airborne digital camera products. SSC utilizes engineered geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment and its Instrument Validation Laboratory to characterize high spatial resolution remote sensing data products. This presentation describes the SSC characterization capabilities and techniques in the visible through near-infrared spectrum and gives examples of calibration results.
NASA Technical Reports Server (NTRS)
Hughes, Mark S.; Davis, Dawn M.; Bakker, Henry J.; Jensen, Scott L.
2007-01-01
This viewgraph presentation reviews the design of the electrical systems required for the testing of rockets at the Rocket Propulsion Facility at NASA Stennis Space Center (NASA SSC). NASA SSC's mission in rocket propulsion testing is to acquire test performance data for verification, validation, and qualification of propulsion systems hardware. These data must be accurate, reliable, comprehensive, and timely. Data acquisition in a rocket propulsion test environment is challenging: severe temporal transient dynamic environments, large thermal gradients, and pressure regimes from vacuum to 15 ksi. SSC has developed and employs data acquisition systems, control systems, and robust instrumentation that effectively meet these challenges.
Holmes, Robert R.; Singh, Vijay P.
2016-01-01
The importance of streamflow data to the world’s economy, environmental health, and public safety continues to grow as the population increases. The collection of streamflow data is often an involved and complicated process. The quality of streamflow data hinges on such things as site selection, instrumentation selection, streamgage maintenance and quality assurance, proper discharge measurement techniques, and the development and continued verification of the streamflow rating. This chapter serves only as an overview of the streamflow data collection process as proper treatment of considerations, techniques, and quality assurance cannot be addressed adequately in the space limitations of this chapter. Readers with the need for the detailed information on the streamflow data collection process are referred to the many references noted in this chapter.
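The streamflow rating mentioned above is commonly modeled as a power law, Q = C(h - h0)^b, relating discharge Q to gage height h. A minimal sketch of fitting such a rating by log-linear least squares follows, using entirely hypothetical measurements and an assumed cease-to-flow stage h0:

```python
import numpy as np

# Hypothetical discharge measurements (stage in m, discharge in m^3/s)
h0 = 0.30  # assumed gage height of zero flow, in meters
stage = np.array([0.6, 0.9, 1.4, 2.1, 3.0])
discharge = np.array([1.2, 4.0, 12.5, 33.0, 74.0])

# Fit ln(Q) = ln(C) + b * ln(h - h0) by least squares
b, logC = np.polyfit(np.log(stage - h0), np.log(discharge), 1)
C = np.exp(logC)

def rating(h):
    """Predicted discharge (m^3/s) at gage height h (m)."""
    return C * (h - h0) ** b

print(f"Q = {C:.2f} * (h - {h0})^{b:.2f}")
```

Continued verification of the rating, as the chapter notes, means periodically re-measuring discharge and checking that new measurements still fall within tolerance of the fitted curve.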
The NIRspec assembly integration and test status
NASA Astrophysics Data System (ADS)
Wettemann, Thomas; Ehrenwinkler, Ralf; Johnson, Thomas E.; Maschmann, Marc; Mosner, Peter; te Plate, Maurice; Rödel, Andreas
2017-11-01
The Near-Infrared Spectrograph (NIRSpec) is one of the four instruments on the James Webb Space Telescope (JWST), scheduled for launch in 2018. NIRSpec was manufactured and tested by a European industrial consortium led by Airbus Defence and Space and delivered to the European Space Agency (ESA) and NASA in September 2013. Since then it has successfully been integrated into the JWST Integrated Science Instrument Module (ISIM) and is currently in ISIM Cryo-Vacuum Test #2. However, since two of its most important assemblies, the Focal Plane Assembly (FPA) and the Micro-Shutter Assembly (MSA), need to be replaced by new units, we present the status of the instrument and of its new flight assemblies in manufacturing and testing, and give an outlook on the planned exchange activities and the subsequent instrument re-verification.
A 2100-Year Reconstruction of July Rainfall Over Westcentral New Mexico
NASA Astrophysics Data System (ADS)
Stahle, D.; Cleaveland, M.; Therrell, M.; Grissino-Mayer, H.; Griffin, D.; Fye, F.
2007-05-01
We have developed a new 2,141-year-long tree-ring chronology of latewood (LW) width from ancient Douglas-fir (Pseudotsuga menziesii) and ponderosa pine (Pinus ponderosa) at El Malpais National Monument, New Mexico. This is one of the longest precipitation-sensitive tree-ring chronologies yet constructed for the American Southwest and has been used to develop the first continuous multi-millennial tree-ring reconstruction of July precipitation in the region of the North American Monsoon System (NAMS). Monthly average precipitation increases sharply in July over western New Mexico, marking the dramatic onset of the summer monsoon season. The LW chronology explains 44 percent of the interannual variability of July precipitation in the instrumental record for New Mexico climate divisions 1 and 4 (1960-2004), after removal of the linear dependence of LW width on earlywood width following Meko and Baisan (2001), and has passed statistical tests of verification on independent July precipitation data (1895-1959). The instrumental and tree-ring reconstructed July precipitation data are correlated with the concurrent 500 mb height field over western North America and with the sea surface temperature gradient from the central to eastern North Pacific. The reconstruction exhibits several severe sustained July droughts that exceed any witnessed during the instrumental era, and has significant spectral power at periods near 3-5, 20, and 70 years.
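The calibration/verification procedure described, fitting a transfer function on one period and testing it on withheld data, can be sketched with synthetic stand-ins (the real chronologies and precipitation records are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins: a "July precipitation" series and a correlated
# "latewood width" proxy over 1895-2004 (purely illustrative numbers)
years = np.arange(1895, 2005)
precip = 50 + 15 * rng.standard_normal(years.size)
latewood = 0.04 * precip + 0.3 * rng.standard_normal(years.size)

calib = years >= 1960   # calibration period, as in the abstract
verif = ~calib          # independent verification period (1895-1959)

# Fit a linear transfer function on the calibration period only
slope, intercept = np.polyfit(latewood[calib], precip[calib], 1)
recon = intercept + slope * latewood

# Verification statistics on the withheld early period:
# correlation r, and reduction of error (RE; RE > 0 indicates skill)
r = np.corrcoef(recon[verif], precip[verif])[0, 1]
RE = 1 - (np.sum((precip[verif] - recon[verif]) ** 2)
          / np.sum((precip[verif] - precip[calib].mean()) ** 2))
print(f"verification r = {r:.2f}, RE = {RE:.2f}")
```

Passing such withheld-period tests is what licenses extending the transfer function back over the full pre-instrumental chronology.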
New optical sensor systems for high-resolution satellite, airborne and terrestrial imaging systems
NASA Astrophysics Data System (ADS)
Eckardt, Andreas; Börner, Anko; Lehmann, Frank
2007-10-01
The department of Optical Information Systems (OS) at the Institute of Robotics and Mechatronics of the German Aerospace Center (DLR) has more than 25 years of experience with high-resolution imaging technology. Changes in detector technology, together with significant improvements in manufacturing accuracy and ongoing engineering research, define the next generation of spaceborne sensor systems for Earth observation and remote sensing. The combination of large TDI lines, intelligent synchronization control, fast-readout sensors and new focal-plane concepts opens the door to new remote-sensing instruments. This class of instruments is suited to high-resolution sensor systems in terms of geometry and radiometry and to their data products, such as 3D virtual reality. Systematic approaches are essential for such designs of complex sensor systems for dedicated tasks. Modelling the instrument as a system inside a simulated environment is the starting point of the optimization process for the optical, mechanical and electrical designs. Single modules and the entire system have to be calibrated and verified. Suitable procedures must be defined at component, module and system level for the assembly, test and verification process. This kind of development strategy enables hardware-in-the-loop design. The paper gives an overview of the current activities at DLR in the field of innovative sensor systems for photogrammetric and remote sensing purposes.
Flight instrument and telemetry response and its inversion
NASA Technical Reports Server (NTRS)
Weinberger, M. R.
1971-01-01
Mathematical models of rate gyros, servo accelerometers, pressure transducers, and telemetry systems were derived, and their parameters were obtained from laboratory tests. Analog computer simulations were used extensively to verify the models' validity for fast and large input signals. An optimal inversion method was derived to reconstruct input signals from noisy output signals, and a computer program was prepared.
Code of Federal Regulations, 2010 CFR
2010-07-01
... safety deposit box or other safekeeping services, or cash management, custodian, and trust services. (ii... documents, non-documentary methods, or a combination of both methods as described in this paragraph (b)(2... agreement, or trust instrument. (B) Verification through non-documentary methods. For a bank relying on non...
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. In addition, scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
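UVM itself is a SystemVerilog library, but the coverage-driven flow described above (constrained-random stimulus, a self-checking scoreboard, and functional coverage bins) can be illustrated with a minimal Python sketch. The 8-bit adder DUT, the bin definitions and the trial count are illustrative assumptions, not taken from the paper.

```python
import random

def dut_add(a, b):
    # Device under test (DUT): a model of an 8-bit adder that wraps on overflow
    return (a + b) & 0xFF

class Scoreboard:
    # Self-checking component: compares DUT output against a reference model
    def __init__(self):
        self.errors = 0
    def check(self, a, b, result):
        if result != (a + b) % 256:
            self.errors += 1

# Functional coverage bins: which interesting cases have we exercised?
coverage = {"small": False, "large": False, "overflow": False}

random.seed(1)
sb = Scoreboard()
for _ in range(5000):
    # Constrained-random stimulus generation
    a, b = random.randrange(256), random.randrange(256)
    r = dut_add(a, b)
    sb.check(a, b, r)
    if a < 16 and b < 16:
        coverage["small"] = True
    if a > 240 and b > 240:
        coverage["large"] = True
    if a + b > 255:
        coverage["overflow"] = True

print("errors:", sb.errors, "full coverage:", all(coverage.values()))
```

The run is declared complete not when a fixed list of directed tests passes, but when the scoreboard reports no mismatches and every coverage bin has been hit, which is the essence of the CDV approach.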
Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Brantley
2016-01-01
A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent(R) and produced comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also go to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks go to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
An unattended verification station for UF6 cylinders: Field trial findings
NASA Astrophysics Data System (ADS)
Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.
2017-12-01
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
V&V Plan for FPGA-based ESF-CCS Using System Engineering Approach.
NASA Astrophysics Data System (ADS)
Maerani, Restu; Mayaka, Joyce; El Akrat, Mohamed; Cheon, Jung Jae
2018-02-01
Instrumentation and Control (I&C) systems play an important role in maintaining the safety of Nuclear Power Plant (NPP) operation. However, most current I&C safety systems are based on Programmable Logic Controller (PLC) hardware, which is difficult to verify and validate and is susceptible to software common-cause failure. Therefore, a plan for the replacement of PLC-based safety systems, such as the Engineered Safety Feature - Component Control System (ESF-CCS), with Field Programmable Gate Arrays (FPGAs) is needed. By using a systems engineering approach, which ensures traceability in every phase of the life cycle, from system requirements through design implementation to verification and validation, the system development is guaranteed to be in line with the regulatory requirements. The verification process will ensure that customer and stakeholder needs are satisfied in a high-quality, trustworthy, cost-efficient and schedule-compliant manner throughout the system's entire life cycle. The benefit of the V&V plan is to ensure that the FPGA-based ESF-CCS is correctly built, and that the measured performance indicators provide positive confirmation that "we are doing the right thing" during the re-engineering of the FPGA-based ESF-CCS.
Ultrasonic Method for Deployment Mechanism Bolt Element Preload Verification
NASA Technical Reports Server (NTRS)
Johnson, Eric C.; Kim, Yong M.; Morris, Fred A.; Mitchell, Joel; Pan, Robert B.
2014-01-01
Deployment mechanisms play a pivotal role in mission success. These mechanisms often incorporate bolt elements for which a preload within a specified range is essential for proper operation. A common practice is to torque these bolt elements to a specified value during installation. The resulting preload, however, can vary significantly with applied torque for a number of reasons. The goal of this effort was to investigate ultrasonic methods as an alternative for bolt preload verification in such deployment mechanisms. A family of non-explosive release mechanisms widely used by satellite manufacturers was chosen for the work. A willing contractor permitted measurements on a sampling of bolt elements for these release mechanisms that were installed by a technician following standard practice. A variation of approximately 50% (+/- 25%) in the resultant preloads was observed. An alternative ultrasonic method to set the preloads was then developed and calibration data were accumulated. The method was demonstrated on bolt elements installed in a fixture instrumented with a calibrated load cell and designed to mimic production practice. The ultrasonic method yielded results within +/- 3% of the load cell reading. The contractor has since adopted the alternative method for its future production.
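The ultrasonic approach rests on the fact that bolt tension stretches the fastener and slows the ultrasonic wave, so the fractional change in pulse time-of-flight grows approximately linearly with preload. A minimal Python sketch of that conversion is below; the calibration factor, times and load values are hypothetical illustrations, not data from the study.

```python
def preload_from_tof(t0_us, t_loaded_us, k_per_newton):
    # Pulse-echo ultrasonics: to first order the fractional time-of-flight
    # change is proportional to the axial preload F:
    #   (t_loaded - t0) / t0 ~ k * F
    # so F is recovered by inverting that relation.
    return (t_loaded_us - t0_us) / (t0_us * k_per_newton)

# Hypothetical calibration factor, as would be obtained from load-cell data
K = 2.0e-9            # fractional TOF change per newton (assumed)
t0 = 12.000           # microseconds, unloaded bolt (assumed)
t_loaded = 12.000 * (1 + K * 5000)  # synthetic reading for a 5 kN preload

print(round(preload_from_tof(t0, t_loaded, K)))
```

In practice the calibration factor is established per bolt lot against a load cell, which is what the fixture described in the abstract provides.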
NASA Technical Reports Server (NTRS)
Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.
1981-01-01
The component element method was used to develop a transient dynamic analysis computer program essentially based on modal synthesis combined with a central finite-difference numerical integration scheme. The methodology leads to a modular, building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the types of measured data, and the engine structural model are described.
Development of advanced high-temperature heat flux sensors. Phase 2: Verification testing
NASA Technical Reports Server (NTRS)
Atkinson, W. H.; Cyr, M. A.; Strange, R. R.
1985-01-01
A two-phase program was conducted to develop heat flux sensors capable of making heat flux measurements throughout the hot section of gas turbine engines. In Phase 1, three types of heat flux sensors were selected: embedded thermocouple, laminated, and Gardon gauge sensors. A demonstration of the ability of these sensors to operate in an actual engine environment is reported. A segmented liner of each of two combustors used in the Broad Specification Fuels Combustor program was instrumented with the three types of heat flux sensors and then tested in a high-pressure combustor rig. Radiometer probes were also used to measure the radiant heat loads, to more fully characterize the combustor environment. Test results show the heat flux sensors to be in good agreement with the radiometer probes and the predicted data trends. In general, heat flux sensors have strong potential for use in combustor development programs.
Ada(R) Test and Verification System (ATVS)
NASA Technical Reports Server (NTRS)
Strelich, Tom
1986-01-01
The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.
The Environmental Technology Verification Program, established by the EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance.
NASA Astrophysics Data System (ADS)
Trifoglio, M.; Gianotti, F.; Conforti, V.; Franceschi, E.; Stephen, J. B.; Bulgarelli, A.; Fioretti, V.; Maiorano, E.; Nicastro, L.; Valenziano, L.; Zoli, A.; Auricchio, N.; Balestra, A.; Bonino, D.; Bonoli, C.; Bortoletto, F.; Capobianco, V.; Chiarusi, T.; Corcione, L.; Debei, S.; De Rosa, A.; Dusini, S.; Fornari, F.; Giacomini, F.; Guizzo, G. P.; Ligori, S.; Margiotta, A.; Mauri, N.; Medinaceli, E.; Morgante, G.; Patrizii, L.; Sirignano, C.; Sirri, G.; Sortino, F.; Stanco, L.; Tenti, M.
2016-07-01
The NISP instrument on board the Euclid ESA mission will be developed and tested at different levels of integration using various test equipment which shall be designed and procured through a collaborative and coordinated effort. The NISP Instrument Workstation (NI-IWS) will be part of the EGSE configuration that will support the NISP AIV/AIT activities from the NISP Warm Electronics level up to the launch of Euclid. One workstation is required for the NISP EQM/AVM, and a second one for the NISP FM. Each workstation will follow the respective NISP model after delivery to ESA for Payload and Satellite AIV/AIT and launch. At these levels the NI-IWS shall be configured as part of the Payload EGSE, the System EGSE, and the Launch EGSE, respectively. After launch, the NI-IWS will be also re-used in the Euclid Ground Segment in order to support the Commissioning and Performance Verification (CPV) phase, and for troubleshooting purposes during the operational phase. The NI-IWS is mainly aimed at the local storage in a suitable format of the NISP instrument data and metadata, at local retrieval, processing and display of the stored data for on-line instrument assessment, and at the remote retrieval of the stored data for off-line analysis on other computers. We describe the design of the IWS software that will create a suitable interface to the external systems in each of the various configurations envisaged at the different levels, and provide the capabilities required to monitor and verify the instrument functionalities and performance throughout all phases of the NISP lifetime.
NASA Astrophysics Data System (ADS)
Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.
2017-08-01
In this article the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the programmed algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors and compiles protocols. This system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without it). The distinctive feature of the approach is the development of a universal signal switch operating under software control, which can be configured for various verification (calibration) methods, making it possible to cover the entire range of secondary-equipment controllers of metering units. The use of automatic verification with this hardware and software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
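The error calculation and protocol compilation at the heart of such an automated check can be sketched in a few lines of Python. The 4-20 mA test points, readings and tolerance below are hypothetical illustrations, not values from the article.

```python
def verify_channel(setpoints, readings, max_error_pct):
    # Compare controller readings against calibrator setpoints and
    # build a simple verification protocol (one row per test point).
    protocol = []
    for sp, rd in zip(setpoints, readings):
        err = abs(rd - sp) / sp * 100.0
        protocol.append({"setpoint": sp,
                         "reading": rd,
                         "error_pct": round(err, 3),
                         "pass": err <= max_error_pct})
    return protocol

# Hypothetical 4-20 mA current-loop test points and controller readings
cal = [4.0, 8.0, 12.0, 16.0, 20.0]
meas = [4.002, 8.005, 11.990, 16.020, 19.985]
rows = verify_channel(cal, meas, max_error_pct=0.25)
print(all(r["pass"] for r in rows))
```

In the automated mode described in the article, the setpoints would be commanded to the calibrator and the readings fetched over the controller's communication protocol, with the same comparison applied to every switched channel.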
Development and performance validation of a cryogenic linear stage for SPICA-SAFARI verification
NASA Astrophysics Data System (ADS)
Ferrari, Lorenza; Smit, H. P.; Eggens, M.; Keizer, G.; de Jonge, A. W.; Detrain, A.; de Jonge, C.; Laauwen, W. M.; Dieleman, P.
2014-07-01
In the context of the SAFARI instrument (SpicA FAR-infrared Instrument), SRON is developing a test environment to verify the SAFARI performance. The characterization of the detector focal plane will be performed with a back-illuminated pinhole over a reimaged SAFARI focal plane by an XYZ scanning mechanism that consists of three linear stages stacked together. In order to reduce background radiation that can couple into the high-sensitivity cryogenic detectors (goal NEP of 2x10^-19 W/√Hz and saturation power of a few femtowatts), the scanner is mounted inside the cryostat in the 4 K environment. The required readout accuracy is 3 μm, with a reproducibility of 1 μm along the total travel of 32 mm. The stage will be operated in "on the fly" mode to prevent vibrations of the scanner mechanism and will move at a constant speed varying from 60 μm/s to 400 μm/s. In order to meet the requirements of large stroke, low dissipation (low friction) and high accuracy, a DC motor plus spindle stage solution has been chosen. In this paper we present the stage design and characterization, also describing the measurement setup. The room-temperature performance has been measured with a 3D measuring machine cross-calibrated with a laser interferometer and a 2-axis tilt sensor. The low-temperature verification has been performed in a wet 4 K cryostat, using a laser interferometer to measure the linear displacements and a theodolite to measure the angular displacements. The angular displacements can be calibrated with a precision of 4 arcsec, and the position could be determined with high accuracy. The presence of friction caused higher values of torque than predicted and consequently higher dissipation. The thermal model of the stage has also been verified at 4 K.
WRAP-RIB antenna technology development
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Garcia, N. F.; Iwamoto, H.
1985-01-01
The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large enough to address the same basic problems of design, fabrication, assembly and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.
Hubble Space Telescope: Cycle 1 calibration plan. Version 1.0
NASA Technical Reports Server (NTRS)
Stanley, Peggy (Editor); Blades, Chris (Editor)
1990-01-01
The framework for quantitative scientific analysis with the Hubble Space Telescope (HST) will be established from a detailed calibration program, and a major responsibility of staff in the Telescope & Instruments Branch (TIB) is the development and maintenance of this calibration for the instruments and the telescope. The first in-orbit calibration will be performed by the SI Investigation Definition Teams (IDTs) during the Science Verification (SV) period in Cycle 0 (expected to start 3 months after launch and last for 5 months). Subsequently, instrument scientists in the TIB become responsible for all aspects of the calibration program. Because of the long lead times involved, TIB scientists have already formulated a calibration plan for the next observing period, Cycle 1 (expected to last a year after the end of SV), which has been reviewed and approved by the STScI Director. The purpose here is to describe the contents of this plan. Our primary aim has been to maintain through Cycle 1 the level of calibration that is anticipated by the end of SV. Anticipated accuracies are given here in tabular form - of course, these accuracies can only be best guesses because we do not know how each instrument will actually perform on-orbit. The calibration accuracies are expected to satisfy the normal needs of both the General Observers (GOs) and the Guaranteed Time Observers (GTOs).
Assessment of sensor performance
NASA Astrophysics Data System (ADS)
Waldmann, C.; Tamburri, M.; Prien, R. D.; Fietzek, P.
2010-02-01
There is an international commitment to develop a comprehensive, coordinated and sustained ocean observation system. However, the foundation of any observing, monitoring or research effort is effective and reliable in situ sensor technology that accurately measures key environmental parameters. Ultimately, the data used for modelling efforts, management decisions and rapid responses to ocean hazards are only as good as the instruments that collect them. There is also a compelling need to develop and incorporate new or novel technologies to improve all aspects of existing observing systems and meet various emerging challenges. Assessment of Sensor Performance was a cross-cutting issues session at the international OceanSensors08 workshop in Warnemünde, Germany, a theme that also runs through several of the papers published as a result of the workshop (Denuault, 2009; Kröger et al., 2009; Zielinski et al., 2009). The discussions focused on how best to classify and validate the instruments required for effective and reliable ocean observations and research. The following is a summary of the discussions and conclusions drawn from this workshop, which specifically addresses the characterisation of sensor systems, technology readiness levels, verification of sensor performance and quality management of sensor systems.
Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3
The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA’s Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies’ performance.
ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION
The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, Valeriy V; Conley, Raymond; Anderson, Erik H
Verification of the reliability of metrology data from high-quality x-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification of optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [Proc. SPIE 7077-7 (2007); Opt. Eng. 47(7), 073602-1-5 (2008)] and proven to be an effective calibration method for a number of interferometric microscopes, a phase-shifting Fizeau interferometer, and a scatterometer [Nucl. Instr. and Meth. A 616, 172-82 (2010)]. Here we describe the details of the development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements reveal a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are performed on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of the audio and video modalities for audio-visual speaker verification is compared with the face verification and speaker verification systems alone. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with the prospect of experimenting on the newly developed PDAtabase created within the scope of the SecurePhone project.
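The GMM classifier at the core of such a system accepts or rejects a claimed identity by comparing the likelihood of the observed features under a client model against a background ("world") model. A toy one-dimensional Python sketch of that log-likelihood-ratio test is shown below; the mixture parameters, feature values and threshold are invented for illustration and are unrelated to BECARS or the BANCA data.

```python
import math

def gauss_logpdf(x, mu, var):
    # Log density of a univariate Gaussian
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def gmm_loglik(frames, components):
    # components: list of (weight, mean, variance) tuples of a 1-D toy GMM
    total = 0.0
    for x in frames:
        total += math.log(sum(w * math.exp(gauss_logpdf(x, m, v))
                              for w, m, v in components))
    return total

# Toy client and world (background) models over a single feature dimension
client = [(0.6, 1.0, 0.25), (0.4, 2.0, 0.25)]
world = [(1.0, 0.0, 1.00)]

def verify(frames, threshold=0.0):
    # Accept the identity claim if the average log-likelihood ratio
    # of client vs. world model exceeds the decision threshold
    llr = (gmm_loglik(frames, client) - gmm_loglik(frames, world)) / len(frames)
    return llr > threshold

print(verify([1.1, 0.9, 1.8]), verify([-1.0, -0.5, 0.2]))
```

A real system works the same way but over high-dimensional DCT or cepstral feature vectors, with the threshold tuned on development data to trade off false acceptances against false rejections.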
Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application
NASA Technical Reports Server (NTRS)
Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond
2018-01-01
The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.
Martins, Thomas B
2002-01-01
The ability of the Luminex system to simultaneously quantitate multiple analytes from a single sample source has proven to be a feasible and cost-effective technology for assay development. In previous studies, my colleagues and I introduced two multiplex profiles consisting of 20 individual assays into the clinical laboratory. With the Luminex instrument's ability to classify up to 100 distinct microspheres, however, we have only begun to realize the enormous potential of this technology. By utilizing additional microspheres, it is now possible to add true internal controls to each individual sample. During the development of a seven-analyte serologic viral respiratory antibody profile, internal controls for detecting sample addition and interfering rheumatoid factor (RF) were investigated. To determine if the correct sample was added, distinct microspheres were developed for measuring the presence of sufficient quantities of immunoglobulin G (IgG) or IgM in the diluted patient sample. In a multiplex assay of 82 samples, the IgM verification control correctly identified 23 out of 23 samples with low levels (<20 mg/dl) of this antibody isotype. An internal control microsphere for RF detected 30 out of 30 samples with significant levels (>10 IU/ml) of IgM RF. Additionally, RF-positive samples causing false-positive adenovirus and influenza A virus IgM results were correctly identified. By exploiting the Luminex instrument's multiplexing capabilities, I have developed true internal controls to ensure correct sample addition and identify interfering RF as part of a respiratory viral serologic profile that includes influenza A and B viruses, adenovirus, parainfluenza viruses 1, 2, and 3, and respiratory syncytial virus. Since these controls are not assay specific, they can be incorporated into any serologic multiplex assay.
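The internal-control logic described above (an IgG/IgM microsphere to confirm sample addition, and an RF microsphere to flag interference) amounts to a pair of threshold checks applied to every sample alongside the analyte assays. A minimal Python sketch using the thresholds quoted in the abstract (IgM < 20 mg/dl, RF > 10 IU/ml) is below; the function name and result format are illustrative assumptions.

```python
def interpret_sample(igm_mg_dl, rf_iu_ml, analyte_results):
    # Internal-control checks sketched from the abstract's thresholds:
    # low immunoglobulin suggests a sample-addition failure, and
    # rheumatoid factor (RF) can cause false-positive IgM analyte results.
    flags = []
    if igm_mg_dl < 20:
        flags.append("insufficient sample IgM: verify sample addition")
    if rf_iu_ml > 10:
        flags.append("RF positive: IgM analyte positives may be false")
    return flags or ["controls passed"], analyte_results

flags, _ = interpret_sample(igm_mg_dl=12, rf_iu_ml=25,
                            analyte_results={"influenza A IgM": "positive"})
print(flags)
```

Because the control microspheres are read from the same well as the analyte microspheres, these checks are per-sample rather than per-plate, which is what makes them true internal controls.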
Ultra-Sensitive Electrostatic Accelerometers and Future Fundamental Physics Missions
NASA Astrophysics Data System (ADS)
Touboul, Pierre; Christophe, Bruno; Rodrigues, M.; Marque, Jean-Pierre; Foulon, Bernard
Ultra-sensitive electrostatic accelerometers have in the last decade demonstrated their unique performance and reliability in orbit, leading to the success of the three Earth geodesy missions presently in operation. In the near future, space fundamental physics missions are in preparation and highlight the importance of this instrument for achieving new scientific objectives. A cornerstone of General Relativity, the Equivalence Principle may be violated, as predicted by attempts at Grand Unification. A verification experiment at a level of at least 10^-15 is the objective of the CNES-ESA mission MICROSCOPE, thanks to a differential accelerometer configuration with concentric cylindrical test masses. To achieve the numerous severe requirements of the mission, the instrument is also used to control the attitude and the orbital motion of the space laboratory, leading to a pure geodesic motion of the drag-free satellite. The performance of the accelerometer is a few tenths of a femto-g at the selected test frequency of about 10^-3 Hz, i.e. several orbit frequencies. Another important experimental research topic in gravity is the verification of the Einstein metric, in particular its dependence on the distance to the attractive body. The Gravity Advanced Package (GAP) is proposed for the future EJSM planetary mission, with the objective of verifying this scale dependence of the gravitation law from Earth to Jupiter. This verification is performed, during the interplanetary cruise, by following precisely the satellite trajectory in the planet and Sun fields with an accurate measurement of the non-gravitational accelerations, in order to evaluate deviations from the geodesic motion. Accelerations in the DC and very-low-frequency domain are concerned, and the natural bias of the electrostatic accelerometer is therefore compensated down to 5x10^-11 m/s^2 thanks to a specific bias calibration device.
More ambitiously, the dedicated mission Odyssey, proposed for Cosmic Vision, would fly in the Solar System beyond Saturn. Based on the same instrument, the scientific return will be enlarged by the better performance achievable on a dedicated satellite and by the larger distance to the Sun. Fly-by gravitational effects will also be carefully observed. Finally, gravitational sensors exploit a similar instrument concept, configuration, and technologies to achieve purely free inertial masses, the references of the LISA mission interferometer for the observation of gravitational waves.
NASA Astrophysics Data System (ADS)
Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo
Computer memories are not supposed to forget, but they do. Because of the proximity of the Sun, from the Solar Orbiter boot software perspective it is mandatory to watch for permanent memory errors resulting from single-event latch-up (SEL) failures in application binaries stored in EEPROM and in their SDRAM deployment areas. In this situation, the last line of defense established by FDIR mechanisms is the capability of the boot software to provide an accurate report of the memory damage and to perform an application software update that avoids the damaged locations by flashing the EEPROM with a new binary. This paper describes the verification of the over-the-air (OTA) EEPROM firmware update procedure of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. Since the maximum number of rewrites on a real EEPROM is limited, and permanent memory faults cannot easily be emulated in real hardware, the verification has been accomplished using a LEON2 virtual platform (Leon2ViP) with fault injection capabilities and real SpaceWire interfaces, developed by the Space Research Group (SRG) of the University of Alcalá. This way it is possible to run exactly the same target binary software as would run on the real ICU platform. Furthermore, this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled, and deterministic environment.
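The fault-injection idea behind the Leon2ViP campaign (emulate the memory so that permanent faults can be planted at will, and the boot software's damage report checked deterministically) can be sketched as follows. The class, page layout, and scrub pass are hypothetical illustrations, not the SRG platform's actual interface.

```python
class FaultyEEPROM:
    """Software-emulated EEPROM in which permanent stuck-at faults can be
    planted, loosely in the spirit of the Leon2ViP fault-injection approach.
    All names and the page layout are illustrative assumptions."""

    def __init__(self, size, page_size=256):
        self.mem = bytearray(size)
        self.page_size = page_size
        self.stuck = {}                      # address -> permanently stuck value

    def inject_stuck_at(self, addr, value):
        self.stuck[addr] = value             # emulate a latch-up-damaged cell

    def write(self, addr, value):
        self.mem[addr] = value

    def read(self, addr):
        return self.stuck.get(addr, self.mem[addr])

    def scrub(self):
        """Write/read-back sweep; returns the set of damaged page numbers,
        the kind of report boot software needs before re-flashing around
        harmed locations."""
        bad_pages = set()
        for addr in range(len(self.mem)):
            saved = self.mem[addr]
            self.write(addr, 0xA5)
            if self.read(addr) != 0xA5:
                bad_pages.add(addr // self.page_size)
            self.write(addr, saved)
        return bad_pages

# Plant a permanent fault in page 1 and let the scrub pass report it.
rom = FaultyEEPROM(1024)
rom.inject_stuck_at(300, 0x00)
bad = rom.scrub()
```

Because the fault lives in the emulator rather than in silicon, the sweep can be repeated indefinitely, which is exactly what a limited-rewrite physical EEPROM forbids.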
Engineering of the LISA Pathfinder mission—making the experiment a practical reality
NASA Astrophysics Data System (ADS)
Warren, Carl; Dunbar, Neil; Backler, Mike
2009-05-01
LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.
The commissioning instrument for the Gran Telescopio Canarias: made in Mexico
NASA Astrophysics Data System (ADS)
Cuevas, Salvador; Sánchez, Beatriz; Bringas, Vicente; Espejo, Carlos; Flores, Rubén; Chapa, Oscar; Lara, Gerardo; Chavoya, Armando; Anguiano, Gustavo; Arciniega, Sadot; Dorantes, Ariel; Gonzalez, José L.; Montoya, Juan M.; Toral, Rafael; Hernández, Hugo; Nava, Roberto; Devaney, Nicolas; Castro, Javier; Cavaller, Luis; Farah, Alejandro; Godoy, Javier; Cobos, Francisco; Tejada, Carlos; Garfias, Fernando
2006-02-01
In March 2004, the Commissioning Instrument (CI) for the Gran Telescopio Canarias (GTC) was accepted at the GTC site on La Palma Island, Spain. During the GTC integration phase, the CI will be a diagnostic tool for performance verification. The CI features four operation modes: imaging, pupil imaging, curvature wave-front sensing (WFS), and high-resolution Shack-Hartmann WFS. The instrument was built by the Instituto de Astronomia UNAM in Mexico City and the Centro de Ingenieria y Desarrollo Industrial (CIDESI) in Queretaro, Qro., under a GRANTECAN contract following an international public bid. Some optical components were built by the Centro de Investigaciones en Optica (CIO) in Leon, Gto., and the largest mechanical parts were manufactured by Vatech in Morelia, Mich. In this paper we give a general description of the CI and relate how this instrument, built to international standards, was made entirely in Mexico.
High precision radial velocities with GIANO spectra
NASA Astrophysics Data System (ADS)
Carleo, I.; Sanna, N.; Gratton, R.; Benatti, S.; Bonavita, M.; Oliva, E.; Origlia, L.; Desidera, S.; Claudi, R.; Sissa, E.
2016-06-01
Radial velocities (RVs) measured from near-infrared (NIR) spectra are a potentially excellent tool for searching for extrasolar planets around cool or active stars. High-resolution infrared (IR) spectrographs now available are approaching the high precision of visible instruments, with constant improvement over time. GIANO is an infrared echelle spectrograph at the Telescopio Nazionale Galileo (TNG) and a powerful tool for providing high-resolution spectra for accurate RV measurements of exoplanets and for chemical and dynamical studies of stellar or extragalactic objects. No other high-spectral-resolution IR instrument matches GIANO's capability to cover the entire NIR wavelength range (0.95-2.45 μm) in a single exposure. In this paper we describe the ensemble of procedures that we have developed to measure high-precision RVs on GIANO spectra acquired during the Science Verification (SV) run, using the telluric lines as the wavelength reference. We used the cross-correlation function (CCF) method to determine the velocity of both the stellar and the telluric lines. For this purpose, we constructed two suitable digital masks that include about 2000 stellar lines and a similar number of telluric lines. The method is applied to various targets of different spectral types, from K2V to M8. We reached different precisions depending mainly on the H magnitude: for H ~ 5 we obtain an rms scatter of ~10 m/s, while for H ~ 9 the standard deviation increases to ~50-80 m/s. The corresponding theoretical error expectations are ~4 m/s and ~30 m/s, respectively. Finally, we provide the RVs measured with our procedure for the targets observed during the GIANO Science Verification.
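The CCF-with-digital-mask approach described above can be illustrated with a minimal sketch: cross-correlate a spectrum against a list of rest-frame line positions and take the trial velocity at which the summed flux at the shifted mask positions is minimal. The wavelength range, line list, and mask construction below are invented for the demo and are not GIANO's actual masks or pipeline.

```python
import numpy as np

C_KMS = 299792.458  # speed of light in km/s

def ccf_velocity(wave, flux, mask_centers, v_grid):
    """Cross-correlate a spectrum against a binary line mask.

    For each trial velocity the mask line centers are Doppler-shifted and
    the flux is sampled there; absorption lines make the summed flux
    minimal when the shift matches the true radial velocity."""
    ccf = np.empty_like(v_grid)
    for i, v in enumerate(v_grid):
        shifted = mask_centers * (1.0 + v / C_KMS)
        ccf[i] = np.interp(shifted, wave, flux).sum()
    return v_grid[np.argmin(ccf)], ccf

# Synthetic demo: three Gaussian absorption lines shifted by +5 km/s.
wave = np.linspace(9500.0, 9600.0, 8000)      # wavelength grid (Angstrom)
lines = np.array([9520.0, 9550.0, 9580.0])    # rest-frame mask entries
v_true = 5.0
flux = np.ones_like(wave)
for c in lines * (1.0 + v_true / C_KMS):
    flux -= 0.5 * np.exp(-0.5 * ((wave - c) / 0.1) ** 2)

v_grid = np.linspace(-20.0, 20.0, 801)        # 0.05 km/s sampling
v_best, _ = ccf_velocity(wave, flux, lines, v_grid)
```

In a real pipeline the stellar CCF gives the stellar velocity and a second, telluric mask plays the role of the wavelength reference, as the abstract describes; here a single mask suffices to show the mechanics.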
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matloch, L.; Vaccaro, S.; Couland, M.
The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)
Engine condition monitoring: CF6 family 60's through the 80's
NASA Technical Reports Server (NTRS)
Kent, H. J.; Dienger, G.
1981-01-01
The on-condition maintenance program is described in terms of its effectiveness as a maintenance tool, both at the line station and at home base, through the early detection of engine faults and erroneous instrumentation signals and through verification of engine health. The system encompasses all known methods, from manual procedures to the fully automated airborne integrated data system.
Software development for airborne radar
NASA Astrophysics Data System (ADS)
Sundstrom, Ingvar G.
Some aspects of software development for a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases (system design, functional design, detailed design, function verification, and system verification) are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages and the use of a digital signal generator for performance verification are emphasized. Some future trends are discussed.
ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS
The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...
ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR
The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario
2016-01-01
Coordinate measuring machines (CMMs) are among the main measurement instruments in laboratories and in industrial quality control. A compensation error model was formulated in Part I; it integrates error and uncertainty into the feature measurement model. Experimental implementation for the verification of this model is carried out by direct testing on a moving-bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table, and roundness of a precision glass hemisphere are presented under repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as are its practical use and its capability to contribute to the improvement of current standard CMM measuring capabilities. PMID:27754441
The Mars Science Laboratory Organic Check Material
NASA Technical Reports Server (NTRS)
Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.
2011-01-01
The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning, and delivery as are used to study martian samples with the Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.
24 CFR 5.659 - Family information and verification.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...
This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...
24 CFR 5.659 - Family information and verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...
BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR
The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...
NASA Astrophysics Data System (ADS)
Meleshenkovskii, I.; Borella, A.; Van der Meer, K.; Bruggeman, M.; Pauly, N.; Labeau, P. E.; Schillebeeckx, P.
2018-01-01
Nowadays there is interest in developing gamma-ray measuring devices based on room-temperature medium-resolution detectors such as semiconductor detectors of the CdZnTe type and scintillators of the LaBr3 type. This is true also for safeguards applications, and the International Atomic Energy Agency (IAEA) has launched a project devoted to the assessment of medium-resolution gamma-ray spectroscopy for the verification of the isotopic composition of U- and Pu-bearing samples. This project is carried out within the Non-Destructive Assay Working Group of the European Safeguards Research and Development Association (ESARDA). In this study we analyze medium-resolution spectra of U and Pu standards with the aim of developing an isotopic composition determination algorithm particularly suited to these types of detectors. We show how the peak shape of a CdZnTe detector is influenced by the instrumentation parameters. The experimental setup consisted of a 500 mm3 CdZnTe detector, a 2x2 inch LaBr3 detector, two types of measurement instrumentation (one analogue and one digital), and a set of certified samples: a 207Bi point source and U and Pu CBNM standards. The results of our measurements indicate that the lowest contribution to the peak asymmetry, and thus the smallest impact on the resolution of the 500 mm3 CdZnTe detector, was achieved with the digital MCA. Analysis of the acquired spectra allowed us to reject poor-quality measurement runs and to produce summed spectra files with the least impact from instrumentation instabilities. This work is preliminary to further studies concerning the development of an isotopic composition determination algorithm particularly suited to CZT and LaBr3 detectors for safeguards applications.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification determines whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
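The kind of benchmark comparison such a QA suite performs can be shown in miniature: solve a PDE numerically, compare against its closed-form solution, and check that the error norm shrinks under grid refinement. The heat-conduction problem below is a generic stand-in for illustration, not one of PFLOTRAN's actual benchmarks.

```python
import numpy as np

def heat_error(nx, alpha=1.0, t_end=0.1):
    """Solve u_t = alpha * u_xx on [0, 1] with u(0)=u(1)=0 and
    u(x, 0) = sin(pi x) by explicit FTCS finite differences, and return
    the L2 error against the exact solution
    u(x, t) = sin(pi x) * exp(-pi^2 * alpha * t)."""
    x = np.linspace(0.0, 1.0, nx + 1)
    dx = x[1] - x[0]
    dt = 0.4 * dx**2 / alpha              # within the FTCS stability limit of 0.5
    nt = int(np.ceil(t_end / dt))
    dt = t_end / nt                       # land exactly on t_end
    u = np.sin(np.pi * x)
    for _ in range(nt):
        u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * alpha * t_end)
    return np.sqrt(np.mean((u - exact) ** 2))

# Grid refinement: with dt tied to dx^2, the error should fall roughly as dx^2,
# i.e. by about a factor of four when the grid spacing is halved.
e_coarse = heat_error(20)
e_fine = heat_error(40)
```

Observing the expected convergence rate, not merely a small error, is the substance of code verification: it confirms the discretization is implemented as designed.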
An unattended verification station for UF 6 cylinders: Field trial findings
Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...
2017-08-26
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.
NASA Technical Reports Server (NTRS)
Mckenzie, Robert L.
1988-01-01
An analytical study and its experimental verification are described which show the performance capabilities and the hardware requirements of a method for measuring atmospheric density along the Space Shuttle flightpath during entry. Using onboard instrumentation, the technique relies on Rayleigh scattering of light from a pulsed ArF excimer laser operating at a wavelength of 193 nm. The method is shown to be capable of providing density measurements with an uncertainty of less than 1 percent and with a spatial resolution along the flightpath of 1 km, over an altitude range from 50 to 90 km. Experimental verification of the signal linearity and the expected signal-to-noise ratios is demonstrated in a simulation facility at conditions that duplicate the signal levels of the flight environment.
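The quoted density uncertainty of less than 1 percent is consistent with simple shot-noise statistics: the Rayleigh return signal is proportional to number density, so Poisson counting gives a relative error of 1/sqrt(N). A minimal sketch follows; the photon-count figure is an illustrative assumption, not a number from the paper.

```python
import math

def density_uncertainty(n_photons):
    """Relative 1-sigma uncertainty of a shot-noise-limited Rayleigh-scatter
    density measurement: signal is proportional to number density, so
    Poisson statistics give sigma_rho / rho = 1 / sqrt(N)."""
    return 1.0 / math.sqrt(n_photons)

# Roughly 1e4 detected photons per 1 km altitude bin would suffice for a
# ~1 percent density uncertainty of the kind quoted for the technique.
rel_err = density_uncertainty(10_000)
```

This is why the simulation-facility verification of signal linearity and signal-to-noise ratio matters: the achievable precision is set directly by the detected photon budget.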
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lafleur, Adrienne M.; Ulrich, Timothy J. II; Menlove, Howard O.
The objective is to investigate the use of Passive Neutron Albedo Reactivity (PNAR) and Self-Interrogation Neutron Resonance Densitometry (SINRD) to quantify fissile content in FUGEN spent fuel assemblies (FAs). The methodology was: (1) design a detector using fission chambers (FCs); (2) optimize the design via MCNPX simulations; and (3) plan to build and field-test the instrument in FY13. The aim was to improve safeguards verification of spent fuel assemblies in water and to increase sensitivity to partial defects. MCNPX simulations were performed to optimize the design of the SINRD+PNAR detector. The PNAR ratio was less sensitive to FA positioning than SINRD, and the SINRD ratio was more sensitive to Pu fissile mass than PNAR. The significance is that the integration of these techniques can be used to improve verification of spent fuel assemblies in water.
NASA Technical Reports Server (NTRS)
1976-01-01
System specifications to be used by the Mission Control Center (MCC) for the Shuttle orbital flight test (OFT) time frame are described. The three support systems discussed are the communication interface system (CIS), the data computation complex (DCC), and the display and control system (DCS), all of which may interface with, and share processing facilities with, other applications processing supporting current MCC programs. The MCC shall provide centralized control of the Space Shuttle OFT from launch through orbital flight, entry, and landing until the Orbiter comes to a stop on the runway. This control shall include the functions of vehicle management in the areas of hardware configuration (verification), flight planning, communication and instrumentation configuration management, trajectory, software and consumables, payloads management, flight safety, and verification of test conditions/environment.
The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification.
Wang, Hui; Shi, Tujin; Qian, Wei-Jun; Liu, Tao; Kagan, Jacob; Srivastava, Sudhir; Smith, Richard D; Rodland, Karin D; Camp, David G
2016-01-01
Mass spectrometry (MS) -based proteomics has become an indispensable tool with broad applications in systems biology and biomedical research. With recent advances in liquid chromatography (LC) and MS instrumentation, LC-MS is making increasingly significant contributions to clinical applications, especially in the area of cancer biomarker discovery and verification. To overcome challenges associated with analyses of clinical samples (for example, a wide dynamic range of protein concentrations in bodily fluids and the need to perform high throughput and accurate quantification of candidate biomarker proteins), significant efforts have been devoted to improve the overall performance of LC-MS-based clinical proteomics platforms. Reviewed here are the recent advances in LC-MS and its applications in cancer biomarker discovery and quantification, along with the potentials, limitations and future perspectives.
Measuring Fluctuating Pressure Levels and Vibration Response in a Jet Plume
NASA Technical Reports Server (NTRS)
Osterholt, Douglas J.; Knox, Douglas M.
2011-01-01
The characterization of loads due to solid rocket motor plume impingement allows for more accurate analyses of components subjected to such an environment. Typically, test verification of predicted loads under these conditions is widely overlooked or unsuccessful. ATA Engineering, Inc., performed testing during a solid rocket motor firing to obtain acceleration and pressure responses in the hydrodynamic field surrounding the jet plume. The test environment necessitated a robust design to facilitate measurements made in close proximity to the jet plume. This paper presents the process of designing a test fixture and an instrumentation package that could withstand the solid rocket plume environment and protect the required instrumentation.
A3 Subscale Diffuser Test Article Design
NASA Technical Reports Server (NTRS)
Saunders, G. P.
2009-01-01
This paper gives a detailed description of the A3 Subscale Diffuser Test (SDT) article design. The subscale diffuser is a geometrically accurate scale model of the A3 altitude rocket test facility. It was designed and built to support the SDT risk mitigation project located at the E3 facility at Stennis Space Center, MS (SSC), supporting the design and construction of the A3 facility at SSC. The subscale test article is outfitted with a large array of instrumentation to support the design verification of the A3 facility. The mechanical design of the subscale diffuser and the test instrumentation are described here.
The Environmental Technology Verification (ETV) Program, established by the U.S. EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance. The Air Pollution Control Technolog...
The U.S. EPA's Office of Research and Development operates the Environmental Technology Verification (ETV) program to facilitate the deployment of innovative technologies through performance verification and information dissemination. Congress funds ETV in response to the belief ...
24 CFR 4001.112 - Income verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements: (a...
NASA Project Constellation Systems Engineering Approach
NASA Technical Reports Server (NTRS)
Dumbacher, Daniel L.
2005-01-01
NASA's Office of Exploration Systems (OExS) is organized to empower the Vision for Space Exploration with transportation systems that result in achievable, affordable, and sustainable human and robotic journeys to the Moon, Mars, and beyond. In the process of delivering these capabilities, the systems engineering function is key to implementing policies, managing mission requirements, and ensuring technical integration and verification of hardware and support systems in a timely, cost-effective manner. The OExS Development Programs Division includes three main areas: (1) human and robotic technology, (2) Project Prometheus for nuclear propulsion development, and (3) Constellation Systems for space transportation systems development, including a Crew Exploration Vehicle (CEV). Constellation Systems include Earth-to-orbit, in-space, and surface transportation systems; maintenance and science instrumentation; and robotic investigators and assistants. In parallel with development of the CEV, robotic explorers will serve as trailblazers to reduce the risk and costs of future human operations on the Moon, as well as missions to other destinations, including Mars. Additional information is included in the original extended abstract.
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification of non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned at the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and were also calculated by the VA; point measurements were collected in a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
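An independent point-dose MU check of the kind such a spreadsheet embeds can be sketched as a single factor-based formula. The factor names and the numerical values below are illustrative assumptions in the general spirit of TG-114-style checks, not the authors' actual spreadsheet cells or commissioning data.

```python
def monitor_units(dose_cgy, ref_output_cgy_per_mu, pdd, output_factor, ssd_factor=1.0):
    """Hypothetical SSD-setup monitor-unit check:
    MU = prescribed dose / (reference output * percent depth dose *
    field output factor * SSD/inverse-square factor).
    All factors here are illustrative placeholders."""
    return dose_cgy / (ref_output_cgy_per_mu * pdd * output_factor * ssd_factor)

# Example: 200 cGy prescribed, 1 cGy/MU reference output at dmax,
# PDD of 0.80 at the prescription depth, output factor 1.0.
mu = monitor_units(200.0, 1.0, 0.80, 1.0)
```

In practice the independently computed MU is compared against the TPS value, and a disagreement beyond an action level (a few percent for simple geometries) triggers review before treatment.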
Surface contamination analysis technology team overview
NASA Astrophysics Data System (ADS)
Burns, H. Dewitt, Jr.
1996-11-01
The Surface Contamination Analysis Technology (SCAT) team originated as a working group of NASA civil service, Space Shuttle contractor, and university groups. Participating members of the SCAT team have included personnel from NASA Marshall Space Flight Center's Materials and Processes Laboratory and Langley Research Center's Instrument Development Group; contractors, including Thiokol Corporation's Inspection Technology Group, the AC Engineering support contractor, Aerojet, SAIC, the Lockheed Martin/Oak Ridge Y-12 support contractor, and the Shuttle External Tank prime contractor; and the University of Alabama in Huntsville's Center for Robotics and Automation. The goal of the SCAT team, as originally defined, was to develop and integrate a multi-purpose inspection head for robotic application to in-process inspection of contamination-sensitive surfaces. One area of interest was replacement of the ozone-depleting solvents currently used for surface cleanliness verification. The team approach brought together the appropriate personnel to determine which surface inspection techniques were applicable to multi-program surface cleanliness inspection. Major substrates of interest were chosen to simulate Space Shuttle critical bonding surfaces or surfaces sensitive to contamination, such as fuel system component surfaces. Inspection techniques evaluated include optically stimulated electron emission (photoelectron emission), Fourier transform infrared spectroscopy, near-infrared fiber optic spectroscopy, and ultraviolet fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years of the initiation of this effort in 1992. Instrumentation specifications and designs developed under this effort include a portable diffuse reflectance FTIR system built by Surface Optics Corporation and a third-generation optically stimulated electron emission system built by LaRC.
This paper will discuss the evaluation of the various techniques on a number of substrate materials contaminated with hydrocarbons, silicones, and fluorocarbons. Discussion will also include standards development for instrument calibration and testing.
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification (D4V) approach considers verification from the application development perspective, in which the system architecture is designed explicitly according to the application's key properties. The D4V hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility, and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
Fresh Fuel Measurements With the Differential Die-Away Self-Interrogation Instrument
NASA Astrophysics Data System (ADS)
Trahan, Alexis C.; Belian, Anthony P.; Swinhoe, Martyn T.; Menlove, Howard O.; Flaska, Marek; Pozzi, Sara A.
2017-07-01
The purpose of the Next Generation Safeguards Initiative (NGSI)-Spent Fuel (SF) Project is to strengthen the technical toolkit of safeguards inspectors and/or other interested parties. The NGSI-SF team is working to achieve the following technical goals more easily and efficiently than in the past using nondestructive assay measurements of spent fuel assemblies: 1) verify the initial enrichment, burnup, and cooling time of the facility declaration; 2) detect the diversion or replacement of pins; 3) estimate the plutonium mass; 4) estimate decay heat; and 5) determine the reactivity of spent fuel assemblies. The differential die-away self-interrogation (DDSI) instrument is one of the instruments assessed over several years for its feasibility for robust, timely verification of spent fuel assemblies. The instrument was recently built and was tested using fresh fuel assemblies in a variety of configurations, including varying enrichment, neutron absorber content, and symmetry. The early die-away method, a multiplication determination method developed in simulation space, was successfully tested on the fresh fuel assembly data and determined multiplication with a root-mean-square (RMS) error of 2.9%. The experimental results were also compared with MCNP simulations of the instrument. Low-multiplication assemblies showed agreement with an average RMS error of 0.2% in the singles count rate (i.e., total neutrons detected per second) and 3.4% in the doubles count rate (i.e., neutrons detected in coincidence per second). High-multiplication assemblies showed agreement with an average RMS error of 4.1% in the singles and 13.3% in the doubles count rates.
Mathur, Gagan; Haugen, Thomas H; Davis, Scott L; Krasowski, Matthew D
2014-01-01
Interfacing of clinical laboratory instruments with the laboratory information system (LIS) via "middleware" software is increasingly common. Our clinical laboratory implemented capillary electrophoresis using a Sebia® Capillarys-2™ (Norcross, GA, USA) instrument for serum and urine protein electrophoresis. Using Data Innovations Instrument Manager, an interface was established with the LIS (Cerner) that allowed for bi-directional transmission of numeric data. However, the text of the interpretive pathology report was not properly transferred. To reduce manual effort and the possibility for error in text data transfer, we developed scripts in AutoHotkey, a free, open-source macro-creation and automation software utility. Scripts were written to create macros that automated mouse actions and keystrokes. The scripts retrieve the specimen accession number, capture user input text, and insert the text interpretation into the correct patient record in the desired format. The scripts accurately and precisely transfer narrative interpretation into the LIS. Combined with bar-code reading by the electrophoresis instrument, the scripts transfer data efficiently to the correct patient record. In addition, the AutoHotkey scripts automated repetitive keystrokes required for manual entry into the LIS, making protein electrophoresis sign-out easier to learn and faster to use by the pathology residents. Scripts allow for either preliminary verification by residents or final sign-out by the attending pathologist. Using the open-source AutoHotkey software, we successfully improved the transfer of text data between capillary electrophoresis software and the LIS. The use of open-source software tools should not be overlooked as tools to improve interfacing of laboratory instruments.
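The core transfer logic the scripts implement can be sketched outside of AutoHotkey. The Python sketch below (hypothetical record fields and line width, not the authors' code) shows the idea of matching interpretation text to a patient record by accession number and formatting it for LIS entry:

```python
# Illustrative sketch (not the authors' AutoHotkey macros): match an
# interpretation to the correct patient record by accession number and
# wrap it into LIS-ready result lines. Field names are hypothetical.

def format_interpretation(text: str, width: int = 60) -> list:
    """Wrap free-text interpretation into fixed-width LIS result lines."""
    lines, line = [], ""
    for word in text.split():
        if line and len(line) + 1 + len(word) > width:
            lines.append(line)
            line = word
        else:
            line = (line + " " + word).strip()
    if line:
        lines.append(line)
    return lines

def transfer(records: dict, accession: str, interpretation: str) -> dict:
    """Insert the formatted interpretation into the record matching the
    accession number; raise if it is unknown (wrong-patient guard)."""
    if accession not in records:
        raise KeyError("accession %s not found in LIS" % accession)
    records[accession]["interpretation"] = format_interpretation(interpretation)
    return records[accession]
```

Keying every insertion on the accession number is what prevents the narrative text from landing in the wrong patient's record, mirroring the bar-code-plus-script workflow described in the abstract.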
In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauld, Ian C.; Hu, Jianwei; De Baere, P.
Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy-EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm the integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel verification technique.
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
MESA: Message-Based System Analysis Using Runtime Verification
NASA Technical Reports Server (NTRS)
Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter
2017-01-01
In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site, and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
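The kinds of trace properties found here (duplicate and out-of-order messages) can be illustrated with a minimal runtime monitor. The Python sketch below is a simplification over a hypothetical message structure of (id, seq) pairs, not the TraceContract Scala DSL that MESA actually uses:

```python
# Minimal runtime-verification monitor: scan a message trace once and
# report duplicate-id and decreasing-sequence-number violations.

def monitor(messages):
    """Return a list of (property, message_id) violations in a trace."""
    seen = set()          # message ids observed so far
    last_seq = None       # last sequence number observed
    violations = []
    for msg_id, seq in messages:
        if msg_id in seen:
            violations.append(("duplicate", msg_id))
        seen.add(msg_id)
        if last_seq is not None and seq < last_seq:
            violations.append(("out-of-order", msg_id))
        last_seq = seq
    return violations
```

A real monitor over a live stream would run the same logic incrementally per message rather than over a completed trace, which is the nonintrusive, message-queue-level setup the paper describes.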
Assessment of test methods for evaluating effectiveness of cleaning flexible endoscopes.
Washburn, Rebecca E; Pietsch, Jennifer J
2018-06-01
Strict adherence to each step of reprocessing is imperative to removing potentially infectious agents. Multiple methods for verifying proper reprocessing exist; however, each presents challenges and limitations, and best practice within the industry has not been established. Our goal was to evaluate endoscope cleaning verification tests, with particular interest in the evaluation of the manual cleaning step. The results of the cleaning verification tests were compared with microbial culturing to see if a positive cleaning verification test would be predictive of microbial growth. This study was conducted at 2 high-volume endoscopy units within a multisite health care system. Each of the 90 endoscopes was tested for adenosine triphosphate, protein, microbial growth via agar plate, and rapid gram-negative culture via assay. The endoscopes were tested in 3 locations: the instrument channel, control knob, and elevator mechanism. This analysis showed a substantial level of agreement between protein detection post-manual cleaning and protein detection post-high-level disinfection at the control head for scopes sampled sequentially. This study suggests that if protein is detected post-manual cleaning, there is a significant likelihood that protein will also be detected post-high-level disinfection. It also indicates that a cleaning verification test is not predictive of microbial growth. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
ICESat-2 / ATLAS Flight Science Receiver Algorithms
NASA Astrophysics Data System (ADS)
Mcgarry, J.; Carabajal, C. C.; Degnan, J. J.; Mallama, A.; Palm, S. P.; Ricklefs, R.; Saba, J. L.
2013-12-01
NASA's Advanced Topographic Laser Altimeter System (ATLAS) will be the single instrument on the ICESat-2 spacecraft, which is expected to launch in 2016 with a 3-year mission lifetime. The ICESat-2 orbital altitude will be 500 km with a 92 degree inclination and 91-day repeat tracks. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. Without some method of eliminating solar background noise in near real-time, the volume of ATLAS telemetry would far exceed the normal X-band downlink capability. To reduce the data volume to an acceptable level, a set of onboard Receiver Algorithms has been developed. These Algorithms limit the daily data volume by distinguishing surface echoes from the background noise and allow the instrument to telemeter only a small vertical region about the signal. This is accomplished through the use of an onboard Digital Elevation Model (DEM), signal processing techniques, and an onboard relief map. Similar to what was flown on the ATLAS predecessor GLAS (Geoscience Laser Altimeter System), the DEM provides minimum and maximum heights for each 1 degree x 1 degree tile on the Earth. This information allows the onboard algorithm to limit its signal search to the region between minimum and maximum heights (plus some margin for errors). The understanding that surface echoes will tend to clump while noise will be randomly distributed led us to histogram the received event times. The selection of the signal locations is based on those histogram bins with statistically significant counts. Once the signal location has been established, the onboard Digital Relief Map (DRM) is used to determine the vertical width of the telemetry band about the signal. The ATLAS Receiver Algorithms are nearing completion of the development phase and are currently being tested using a Monte Carlo Software Simulator that models the instrument, the orbit, and the environment.
This Simulator makes it possible to check all logic paths that could be encountered by the Algorithms on orbit. In addition the NASA airborne instrument MABEL is collecting data with characteristics similar to what ATLAS will see. MABEL data is being used to test the ATLAS Receiver Algorithms. Further verification will be performed during Integration and Testing of the ATLAS instrument and during Environmental Testing on the full ATLAS instrument. Results from testing to date show the Receiver Algorithms have the ability to handle a wide range of signal and noise levels with a very good sensitivity at relatively low signal to noise ratios. In addition, preliminary tests have demonstrated, using the ICESat-2 Science Team's selected land ice and sea ice test cases, the capability of the Algorithms to successfully find and telemeter the surface echoes. In this presentation we will describe the ATLAS Flight Science Receiver Algorithms and the Software Simulator, and will present results of the testing to date. The onboard databases (DEM, DRM and the Surface Reference Mask) are being developed at the University of Texas at Austin as part of the ATLAS Flight Science Receiver Algorithms. Verification of the onboard databases is being performed by ATLAS Receiver Algorithms team members Claudia Carabajal and Jack Saba.
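The histogram-based signal selection described above can be sketched as follows. The bin size, the Poisson-style significance threshold, and the band construction are illustrative assumptions, not the flight algorithm's actual parameters:

```python
# Sketch of the onboard signal-finding idea: histogram photon event
# heights within the DEM min/max window and keep only bins whose counts
# rise significantly above the expected background noise level.
import math

def find_signal_band(heights, dem_min, dem_max, bin_size=10.0, n_sigma=3.0):
    """Return (low, high) of a telemetry band around significant bins,
    or None if no bin exceeds the noise floor."""
    # Restrict the search to events inside the DEM-bounded window
    events = [h for h in heights if dem_min <= h <= dem_max]
    if not events:
        return None
    nbins = max(1, int(math.ceil((dem_max - dem_min) / bin_size)))
    counts = [0] * nbins
    for h in events:
        i = min(int((h - dem_min) / bin_size), nbins - 1)
        counts[i] += 1
    # Treat background as Poisson: significant if count > mean + n_sigma*sqrt(mean)
    mean = len(events) / float(nbins)
    threshold = mean + n_sigma * math.sqrt(mean)
    sig = [i for i, c in enumerate(counts) if c > threshold]
    if not sig:
        return None
    return (dem_min + min(sig) * bin_size,
            dem_min + (max(sig) + 1) * bin_size)
```

Because surface echoes clump into a few tall bins while solar background spreads nearly uniformly, only the band around the tall bins needs to be telemetered, which is how the onboard algorithms cut the data volume.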
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.
2003-01-01
Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.
Bulk silica transmission grating made by reactive ion etching for NIR space instruments
NASA Astrophysics Data System (ADS)
Caillat, Amandine; Pascal, Sandrine; Tisserand, Stéphane; Dohlen, Kjetil; Grange, Robert; Sauget, Vincent; Gautier, Sophie
2014-07-01
A GRISM, made of a grating on a prism, allows combining imaging and spectroscopy of the same field of view with the same optical system and detector, thus simplifying the instrument concept. New GRISM designs impose technical specifications difficult to reach with classical grating manufacturing processes: large useful aperture (>100 mm), low groove frequency (<30 g/mm), small blaze angle (<3°) and, last but not least, line curvature allowing wavefront corrections. In addition, gratings are commonly made of resin, which may not be suitable to withstand the extreme space environment. Therefore, in the frame of an R&D project financed by CNES, SILIOS Technologies developed a new resin-free grating manufacturing process and realized a first 80 mm diameter prototype, optically tested at LAM. We present detailed specifications of this resin-free grating, the manufacturing process, the optical setups and models for optical performance verification, and very encouraging results obtained on the first 80 mm diameter grating prototype: >80% transmitted efficiency, <30 nm RMS wavefront error, and groove shape and roughness very close to theory and uniform over the useful aperture.
NASA Astrophysics Data System (ADS)
Coulter, Phillip; Beaton, Alexander; Gum, Jeffery S.; Hadjimichael, Theodore J.; Hayden, Joseph E.; Hummel, Susann; Hylan, Jason E.; Lee, David; Madison, Timothy J.; Maszkiewicz, Michael; Mclean, Kyle F.; McMann, Joseph; Melf, Markus; Miner, Linda; Ohl, Raymond G.; Redman, Kevin; Roedel, Andreas; Schweiger, Paul; Te Plate, Maurice; Wells, Martyn; Wenzel, Greg W.; Williams, Patrick K.; Young, Jerrod
2014-09-01
While efforts within the optics community focus on the development of high-quality systems and data products, comparatively little attention is paid to their use. Our standards for verification and validation are high; but in some user domains, standards are either lax or do not exist at all. In forensic imagery analysis, for example, standards exist to judge image quality, but not to judge the quality of an analysis. In litigation, a high-quality analysis is by default the one performed by the victorious attorney's expert. This paper argues for the need to extend quality standards into the domain of imagery analysis, which is expected to increase in national visibility and significance with the increasing deployment of unmanned aerial vehicle (UAV, or "drone") sensors in the continental U.S. It argues that, like a good radiometric calibration, made as independent of the calibrated instrument as possible, a good analysis should be subject to standards, the most basic of which is the separation of issues of scientific fact from analysis results.
Experimental Verification of an Instrument to Test Flooring Materials
NASA Astrophysics Data System (ADS)
Philip, Rony; Löfgren, Hans, Dr
2018-02-01
The focus of this work is to validate a fluid model of different flooring materials and the measurements of an instrument that tests flooring materials and their force-attenuating capabilities, using mathematical models to describe the signature and coefficients of the floor. The main contribution of the present work is the development of a mathematical fluid model for floors. The aim of the thesis was to analyze and compare different floor materials and to study the linear dynamics of falling impacts on floors. The impact of the hammer during a fall is captured by an accelerometer, and the response is collected using a PicoScope. The collected data were analyzed using a MATLAB least-squares method coded according to the fluid model. The findings from this thesis showed that the fluid model works for more elastic materials but not for rigid materials such as wood. The parameters that influence the model during a falling impact on a floor, such as velocity, mass, energy loss, and other floor coefficients, were identified, and a standardized testing method was established.
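The least-squares fitting step can be illustrated with a simplified stand-in for the fluid model. The linear force model F = k·x + c·v below is a hypothetical example chosen for clarity, not the thesis's actual model (which was coded in MATLAB):

```python
# Illustrative least-squares identification of floor coefficients from
# impact data: fit stiffness k and damping c in F = k*x + c*v, where x is
# deflection and v is deflection rate. The model form is a hypothetical
# simplification of the fluid model described in the thesis.
import numpy as np

def fit_floor_coefficients(x, v, f):
    """Solve min ||[x v] @ [k, c] - f||_2 for stiffness k and damping c."""
    A = np.column_stack([x, v])
    coeffs, *_ = np.linalg.lstsq(A, f, rcond=None)
    return coeffs  # array([k, c])
```

With measured acceleration converted to force and deflection, the same one-line solve recovers the floor coefficients that characterize its force attenuation; poor fit residuals for rigid materials such as wood would flag where the model breaks down.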
Wire Crimp Termination Verification Using Ultrasonic Inspection
NASA Technical Reports Server (NTRS)
Perey, Daniel F.; Cramer, K. Elliott; Yost, William T.
2007-01-01
The development of a new ultrasonic measurement technique to quantitatively assess wire crimp terminations is discussed. The amplitude change of a compressional ultrasonic wave propagating through the junction of a crimp termination and wire is shown to correlate with the results of a destructive pull test, which is a standard for assessing crimp wire junction quality. Various crimp junction pathologies such as undercrimping, missing wire strands, incomplete wire insertion, partial insulation removal, and incorrect wire gauge are ultrasonically tested, and their results are correlated with pull tests. Results show that the nondestructive ultrasonic measurement technique consistently predicts good crimps (as confirmed by destructive testing) when ultrasonic transmission is above a certain threshold amplitude level. A physics-based model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process. This model agrees with the ultrasonic measurements to within 6%. A prototype instrument for applying this technique while wire crimps are installed is also presented. The instrument is based on a two-jaw type crimp tool suitable for butt-splice type connections. Finally, an approach for application to multipin indenter type crimps is discussed.
Community Radiative Transfer Model for Inter-Satellites Calibration and Verification
NASA Astrophysics Data System (ADS)
Liu, Q.; Nalli, N. R.; Ignatov, A.; Garrett, K.; Chen, Y.; Weng, F.; Boukabara, S. A.; van Delst, P. F.; Groff, D. N.; Collard, A.; Joseph, E.; Morris, V. R.; Minnett, P. J.
2014-12-01
Developed at the Joint Center for Satellite Data Assimilation, the Community Radiative Transfer Model (CRTM) [1] operationally supports satellite radiance assimilation for weather forecasting. The CRTM also supports the JPSS/NPP and GOES-R missions [2] for instrument calibration, validation, monitoring of long-term trending, and satellite retrieved products [3]. The CRTM is used daily at the NOAA NCEP to quantify the biases and standard deviations between radiance simulations and satellite radiance measurements in a time series and angular dependency. The purposes of monitoring the data assimilation system are to ensure the proper performance of the assimilation system and to diagnose problems with the system for future improvements. The CRTM is a very useful tool for cross-sensor verifications. Using the double difference method, it can remove the biases caused by slight differences in spectral response and geometric angles between measurements of the two instruments. The CRTM is particularly useful for reducing the difference between instruments for climate studies [4]. In this study, we will carry out the assessment of the Suomi National Polar-orbiting Partnership (SNPP) [5] Cross-track Infrared Sounder (CrIS) data [6], Advanced Technology Microwave Sounder (ATMS) data, and data for the Visible Infrared Imaging Radiometer Suite (VIIRS) [7][8] thermal emissive bands. We use dedicated radiosondes and surface data acquired from NOAA Aerosols and Ocean Science Expeditions (AEROSE) [9]. The high-quality radiosondes were launched when Suomi NPP flew over the NOAA Ship Ronald H. Brown situated in the tropical Atlantic Ocean. The atmospheric data include profiles of temperature, water vapor, and ozone, as well as total aerosol optical depths. The surface data include air temperature and humidity at 2 meters, skin temperature (Marine Atmospheric Emitted Radiance Interferometer, M-AERI [10]), surface temperature, and surface wind vector. [1] Liu, Q., and F. Weng, 2006: JAS. [2] Liu, Q., and S. Boukabara, 2013: RSE. [3] Boukabara et al., 2011: TGARS. [4] Wang, L. K., and C.-Z. Zou, 2013: JGR. [5] Weng et al., 2012: JGR. [6] Han, Y., et al., 2013: JGR. [7] Cao et al., 2013: GR. [8] Liang, X., and A. Ignatov, 2013: JGR. [9] Nalli et al., 2011: BAMS. [10] Minnett et al., 2001: JAOT.
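In its simplest scalar form, the double difference method mentioned above reduces to differencing each instrument's observed-minus-simulated bias, so that the shared scene signal and the model-captured spectral/geometry effects cancel. A minimal sketch:

```python
# Double difference for cross-sensor comparison of brightness temperatures:
# each instrument's bias (observed minus RT-model simulation) is computed
# first, then the two biases are differenced, removing effects the model
# accounts for (spectral response, viewing geometry).
def double_difference(obs_a, sim_a, obs_b, sim_b):
    """DD = (Obs_A - Sim_A) - (Obs_B - Sim_B), in kelvin."""
    return (obs_a - sim_a) - (obs_b - sim_b)
```

A nonzero DD then points to a genuine inter-instrument calibration offset rather than a difference in what the two sensors viewed, which is why the method is useful for the climate-quality cross-verifications the abstract describes.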
NASA Astrophysics Data System (ADS)
Houtz, Derek Anderson
Microwave radiometers allow remote sensing of earth and atmospheric temperatures from space, anytime, anywhere, through clouds, and in the dark. Data from microwave radiometers are high-impact operational inputs to weather forecasts and are used to provide a vast array of climate data products including land and sea surface temperatures, soil moisture, ocean salinity, cloud precipitation and moisture height profiles, and even wind speed and direction, to name a few. Space-borne microwave radiometers have a major weakness when it comes to long-term climate trends due to their lack of traceability. Because there is no standard, or absolute reference, for microwave brightness temperature, nationally or internationally, individual instruments must each rely on their own internal calibration source to set an absolute reference to the fundamental unit of Kelvin. This causes each subsequent instrument to have a calibration offset, and there is no 'true' reference. The work introduced in this thesis addresses this vacancy by proposing and introducing a NIST microwave brightness temperature source that may act as the primary reference. The NIST standard will allow pre-launch calibration of radiometers across a broad range of frequencies pertinent to remote sensing, between 18 GHz and 220 GHz. The blackbody will be capable of reaching temperatures ranging between the liquid nitrogen boiling point at approximately 77 K and a warm-target temperature of 350 K. The brightness temperature of the source has an associated standard uncertainty ranging, as a function of frequency, between 0.084 K and 0.111 K. The standard can be transferred to the calibration source in the instrument, providing traceability of all subsequent measurements back to the primary standard. The development of the NIST standard source involved predicting and measuring its brightness temperature, and minimizing the associated uncertainty of this quantity.
Uniform and constant physical temperature along with well characterized and maximized emissivity are fundamental to a well characterized blackbody. The chosen geometry is a microwave absorber coated copper cone. Electromagnetic and thermal simulations are introduced to optimize the design. Experimental verifications of the simulated quantities confirm the predicted performance of the blackbody.
NASA Technical Reports Server (NTRS)
Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; Harrington, Gary; Frisbie, Troy
2006-01-01
A simple, cost-effective hyperspectral sun photometer was developed for radiometric vicarious calibration of remote sensing systems, air quality monitoring, and potentially in-situ planetary climatological studies. The device was constructed solely from off-the-shelf components and was designed to be easily deployable for support of short-term verification and validation data collects. Not only does this sun photometer provide the same data products as existing multi-band sun photometers, it also requires a simpler setup and less data acquisition time and allows for a more direct calibration approach. Fielding this instrument has also enabled Stennis Space Center (SSC) Applied Sciences Directorate personnel to cross-calibrate existing sun photometers. This innovative research will position SSC personnel to perform air quality assessments in support of the NASA Applied Sciences Program's National Applications program element as well as to develop techniques to evaluate aerosols in a Martian or other planetary atmosphere.
Control research in the NASA high-alpha technology program
NASA Technical Reports Server (NTRS)
Gilbert, William P.; Nguyen, Luat T.; Gera, Joseph
1990-01-01
NASA is conducting a focused technology program, known as the High-Angle-of-Attack Technology Program, to accelerate the development of flight-validated technology applicable to the design of fighters with superior stall and post-stall characteristics and agility. A carefully integrated effort is underway combining wind tunnel testing, analytical predictions, piloted simulation, and full-scale flight research. A modified F-18 aircraft has been extensively instrumented for use as the NASA High-Angle-of-Attack Research Vehicle for flight verification of new methods and concepts. This program stresses the importance of providing improved aircraft control capabilities both by powered control (such as thrust-vectoring) and by innovative aerodynamic control concepts. The program is accomplishing extensive coordinated ground and flight testing to assess and improve available experimental and analytical methods and to develop new concepts for enhanced aerodynamics and for effective control, guidance, and cockpit displays essential for effective pilot utilization of the increased agility provided.
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
The U.S. EPA's Office of Research and Development operates the Environmental Technology Verification (ETV) program to facilitate the deployment of innovative technologies through performance verification and information dissemination. Congress funds ETV in response to the belief ...
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based... Domain-specific languages (DSLs) drive both implementation and formal verification
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...
Joint ETV/NOWATECH test plan for the Sorbisense GSW40 passive sampler
The joint test plan is the implementation of a test design developed for verification of the performance of an environmental technology following the NOWATECH ETV method. The verification is a joint verification with the US EPA ETV scheme and the Advanced Monitoring Systems Cent...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S R; Bihari, B L; Salari, K
As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
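Code verification on analytic test problems such as the Sod shock tube and Sedov blast typically quantifies correctness via the observed order of accuracy from a grid-refinement study, comparing computed and exact solutions on successively finer meshes. A generic sketch of that calculation with illustrative numbers (not ARES results):

```python
import math

def observed_order(e_coarse, e_fine, h_coarse, h_fine):
    """Observed order of accuracy p from a two-grid refinement study:
    p = ln(e_coarse / e_fine) / ln(h_coarse / h_fine),
    where e is the error norm against the exact solution and h the
    mesh spacing."""
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# Illustrative L1 errors when halving the mesh spacing
p = observed_order(4.0e-3, 1.0e-3, 0.02, 0.01)  # p close to 2: second order
```

Agreement between the observed order and the scheme's theoretical order is the usual acceptance criterion in such verification suites.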
NASA Astrophysics Data System (ADS)
Cohen, K. K.; Klara, S. M.; Srivastava, R. D.
2004-12-01
The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's Program, carbon sequestration MM&V has numerous instrumental roles: measurement of a site's characteristics and capability for sequestration; monitoring of the site to ensure storage integrity; verification that the CO2 is safely stored; and protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and the frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below-ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role for MM&V.
It is likely that successful MM&V programs will incorporate multiple technologies, including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; surface/near-surface methods such as soil gas monitoring and infrared sensors; and aerial and satellite imagery. This abstract will describe results, similarities, and contrasts for funded studies from the U.S. DOE's Carbon Sequestration Program, including examples from the Sleipner North Sea Project, the Canadian Weyburn Field/Dakota Gasification Plant Project, the Frio Formation Texas Project, and the Yolo County Bioreactor Landfill Project. The abstract will also address the following: How are the terms ``measurement,'' ``mitigation,'' and ``verification'' defined in the Program? What is the U.S. DOE's Carbon Sequestration Program Roadmap and what are the Roadmap goals for MM&V? What is the current status of MM&V technologies?
The NIRCam Optical Telescope Simulator (NOTES)
NASA Technical Reports Server (NTRS)
Kubalak, David; Hakun, Claef; Greeley, Bradford; Eichorn, William; Leviton, Douglas; Guishard, Corina; Gong, Qian; Warner, Thomas; Bugby, David; Robinson, Frederick;
2007-01-01
The Near Infra-Red Camera (NIRCam), the 0.6-5.0 micron imager and wavefront sensing instrument for the James Webb Space Telescope (JWST), will be used on orbit both as a science instrument, and to tune the alignment of the telescope. The NIRCam Optical Telescope Element Simulator (NOTES) will be used during ground testing to provide an external stimulus to verify wavefront error, imaging characteristics, and wavefront sensing performance of this crucial instrument. NOTES is being designed and built by NASA Goddard Space Flight Center with the help of Swales Aerospace and Orbital Sciences Corporation. It is a single-point imaging system that uses an elliptical mirror to form an U20 image of a point source. The point source will be fed via optical fibers from outside the vacuum chamber. A tip/tilt mirror is used to change the chief ray angle of the beam as it passes through the aperture stop and thus steer the image over NIRCam's field of view without moving the pupil or introducing field aberrations. Interchangeable aperture stop elements allow us to simulate perfect JWST wavefronts for wavefront error testing, or introduce transmissive phase plates to simulate a misaligned JWST segmented mirror for wavefront sensing verification. NOTES will be maintained at an operating temperature of 80K during testing using thermal switches, allowing it to operate within the same test chamber as the NIRCam instrument. We discuss NOTES' current design status and on-going development activities.
Implementation and verification of global optimization benchmark problems
NASA Astrophysics Data System (ADS)
Posypkin, Mikhail; Usov, Alexander
2017-12-01
The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. From a single description, the library automates the generation of a function's value and gradient at a given point, as well as interval estimates of the function and its gradient on a given box. Based on this functionality, we developed a collection of tests for automatic verification of the proposed benchmarks. The verification showed that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
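Deriving both a function's value and its gradient from a single expression description, as the abstract's C++ library does, is commonly achieved with forward-mode automatic differentiation using dual numbers, which propagate (value, derivative) pairs through arithmetic. A minimal Python sketch of that technique (an illustration of the idea, not the authors' library):

```python
class Dual:
    """Forward-mode AD: carries (value, derivative) through arithmetic."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = self._wrap(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def f(x):
    # One description serves for both value and gradient: x^3 - 2x + 1
    return x * x * x - 2 * x + 1

x = Dual(2.0, 1.0)   # seed derivative 1.0 at x = 2
y = f(x)             # y.val = f(2) = 5.0, y.dot = f'(2) = 3*4 - 2 = 10.0
```

Replacing the (value, derivative) pair with interval endpoints and interval arithmetic rules yields the box estimates the library also provides.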
Portable traceability solution for ground-based calibration of optical instruments
NASA Astrophysics Data System (ADS)
El Gawhary, Omar; van Veghel, Marijn; Kenter, Pepijn; van der Leden, Natasja; Dekker, Paul; Revtova, Elena; Heemskerk, Maurice; Trarbach, André; Vink, Ramon; Doyle, Dominic
2017-11-01
We present a portable traceability solution for the ground-based optical calibration of earth observation (EO) instruments. Currently, traceability for this type of calibration is typically based on spectral irradiance sources (e.g. FEL lamps) calibrated at a national metrology institute (NMI). Disadvantages of this source-based traceability are the inflexibility in operating conditions of the source, which are limited to the settings used during calibration at the NMI, and the susceptibility to aging, which requires frequent recalibrations, and which cannot be easily checked on-site. The detector-based traceability solution presented in this work uses a portable filter radiometer to calibrate light sources onsite, immediately before and after, or even during instrument calibration. The filter radiometer itself is traceable to the primary standard of radiometry in the Netherlands. We will discuss the design and realization, calibration and performance verification.
Practical aspects of instrumentation system installation, volume 13
NASA Technical Reports Server (NTRS)
Borek, R. W.; Pool, A. (Editor); Sanderson, K. C. (Editor)
1981-01-01
A review of factors influencing installation of aircraft flight test instrumentation is presented. Requirements, including such factors as environment, reliability, maintainability, and system safety, are discussed. The assessment of the mission profile is followed by an overview of electrical and mechanical installation factors, with emphasis on shock/vibration isolation systems and standardization of the electric wiring installation, two factors often overlooked by instrumentation engineers. A discussion of installation hardware reviews the performance capabilities of wiring, connectors, fuses, and circuit breakers, and a guide to proper selection is provided. The discussion of the installation is primarily concerned with electrical wire routing, shield terminations, and grounding. Also included are some examples of installation mistakes that could affect system accuracy. System verification procedures and special considerations such as sneak circuits, pyrotechnics, aircraft antenna patterns, and lightning strikes are discussed.
Aqueous cleaning and verification processes for precision cleaning of small parts
NASA Technical Reports Server (NTRS)
Allen, Gale J.; Fishell, Kenneth A.
1995-01-01
The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.
Formal verification of mathematical software
NASA Technical Reports Server (NTRS)
Sutherland, D.
1984-01-01
Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
Guidelines for qualifying cleaning and verification materials
NASA Technical Reports Server (NTRS)
Webb, D.
1995-01-01
This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down selecting a cleaning/verification media.
This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuel, D; Testa, M; Park, Y
Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques; one example is the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system, with hardware, software, and workflow protocols, to provide a complete solution simultaneously for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm in the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.
NASA Astrophysics Data System (ADS)
Randunu Pathirannehelage, Nishantha
Fourier telescopy imaging is a recently-developed imaging method that relies on active structured-light illumination of the object. Reflected/scattered light is measured by a large "light bucket" detector; processing of the detected signal yields the magnitude and phase of spatial frequency components of the object reflectance or transmittance function. An inverse Fourier transform results in the image. In 2012 a novel method, known as time-average Fourier telescopy (TAFT), was introduced by William T. Rhodes as a means for diffraction-limited imaging through ground-level atmospheric turbulence. This method, which can be applied to long horizontal-path terrestrial imaging, addresses a need that is not solved by the adaptive optics methods being used in astronomical imaging. Field-experiment verification of the TAFT concept requires instrumentation that is not available at Florida Atlantic University. The objective of this doctoral research program is thus to demonstrate, in the absence of full-scale experimentation, the feasibility of time-average Fourier telescopy through (a) the design, construction, and testing of small-scale laboratory instrumentation capable of exploring basic Fourier telescopy data-gathering operations, and (b) the development of MATLAB-based software capable of demonstrating the effect of kilometer-scale passage of laser beams through ground-level turbulence in a numerical simulation of TAFT.
Solid electrolyte oxygen regeneration system
NASA Technical Reports Server (NTRS)
Shumar, J. W.; See, G. G.; Schubert, F. H.; Powell, J. D.
1976-01-01
A program to design, develop, fabricate and assemble a one-man, self-contained, solid electrolyte oxygen regeneration system (SX-1) incorporating solid electrolyte electrolyzer drums was completed. The SX-1 is a preprototype engineering model designed to produce 0.952 kg (2.1 lb)/day of breathable oxygen (O2) from the electrolysis of metabolic carbon dioxide (CO2) and water vapor. The CO2 supply rate was established based on the metabolic CO2 generation rate for one man of 0.998 kg (2.2 lb)/day. The water supply rate (0.254 kg (0.56 lb)/day) was designed to be sufficient to make up the difference between the 0.952 kg (2.1 lb)/day O2 generation specification and the O2 available through CO2 electrolysis, 0.726 kg (1.6 lb)/day. The SX-1 was successfully designed, fabricated and assembled. Design verification tests (DVT) of the CO disproportionators, H2 separators, control instrumentation, monitor instrumentation, and water feed mechanism were successfully completed. The erratic occurrence of electrolyzer drum leakage prevented the completion of the CO2 electrolyzer module and water electrolyzer module DVTs and also prevented the performance of SX-1 integrated testing. Further development work is required to improve the solid electrolyte cell high temperature seals.
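The oxygen budget quoted in the abstract can be checked against water electrolysis stoichiometry (2 H2O → 2 H2 + O2): the 0.254 kg/day water feed supplies almost exactly the O2 shortfall between the 0.952 kg/day specification and the 0.726 kg/day available from CO2 electrolysis. A quick arithmetic check using the figures from the text:

```python
# Figures quoted in the abstract (kg/day)
o2_spec     = 0.952   # required O2 production
o2_from_co2 = 0.726   # O2 available from CO2 electrolysis
h2o_feed    = 0.254   # water supply rate

# 2 H2O -> 2 H2 + O2: 2 mol water (36.03 g) yields 1 mol O2 (32.00 g)
M_H2O, M_O2 = 18.015, 31.998
o2_from_h2o = h2o_feed * M_O2 / (2.0 * M_H2O)        # about 0.226 kg/day

balance_gap = o2_spec - (o2_from_co2 + o2_from_h2o)  # near zero: budget closes
```

The residual is below a gram per day, confirming the water feed rate was sized to close the oxygen mass balance.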
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including any changes to the original requirements. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The verification plan should be created once the system requirements are documented. The plan should ensure that every requirement is formally verified, that the methods and responsible organizations are specified, and that the plan is reviewed by all parties. The option of having the engineering team involved in all phases of development, as opposed to having another organization continue the process once the design is complete, is also discussed.
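The presentation's point that a verification plan should guarantee every requirement a verification method (test, inspection, analysis, or demonstration) and a responsible organization can be illustrated as a simple audit over a requirements matrix. A sketch with hypothetical requirement IDs and field names:

```python
# Hypothetical verification-matrix audit: every requirement must name a
# valid method and a responsible organization before the plan is approved.
VALID_METHODS = {"test", "inspection", "analysis", "demonstration"}

def audit(matrix):
    """Return a list of human-readable problems; empty means the plan
    satisfies the two completeness checks."""
    problems = []
    for req_id, entry in matrix.items():
        if entry.get("method") not in VALID_METHODS:
            problems.append(f"{req_id}: missing or invalid method")
        if not entry.get("org"):
            problems.append(f"{req_id}: no responsible organization")
    return problems

plan = {
    "REQ-001": {"method": "test", "org": "Propulsion"},
    "REQ-002": {"method": "analysis", "org": ""},
    "REQ-003": {"method": None, "org": "Avionics"},
}
issues = audit(plan)   # REQ-002 and REQ-003 are flagged
```

In practice such checks run against the project's requirements database, but the completeness criteria are the same ones the presentation lists.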
Cryo Testing of the James Webb Space Telescope's Integrated Science Instrument Module
NASA Technical Reports Server (NTRS)
VanCampen, Julie
2004-01-01
The Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope will be integrated and tested at the Environmental Test Facilities at Goddard Space Flight Center (GSFC). The cryogenic thermal vacuum testing of the ISIM will be the most difficult and problematic portion of the GSFC integration and test flow. The test is to validate the coupled interface of the science instruments and the ISIM structure, and to sufficiently stress that interface while validating the image quality of the science instruments. The instruments and the structure are not made from the same materials and have different coefficients of thermal expansion (CTE). Test objectives and verification rationale are currently being evaluated in Phase B of the project plan. The test program will encounter engineering challenges and limitations driven by cost and technology, many of which can be mitigated by facility upgrades, creative GSE, and thorough forethought. The cryogenic testing of the ISIM will involve a number of risks, such as the implementation of unique metrology techniques and of mechanical, electrical, and optical simulators housed within the cryogenic vacuum environment. These potential risks are investigated and possible solutions are proposed.
FPGA Flash Memory High Speed Data Acquisition
NASA Technical Reports Server (NTRS)
Gonzalez, April
2013-01-01
The purpose of this research is to design and implement a VHDL ONFI controller module for a Modular Instrumentation System. The goal of the Modular Instrumentation System is a low-power device that stores data and sends it at low speed to a processor. Such a system offers an advantage over purchased binary IP by allowing NASA to re-use and modify the memory controller module. To meet the low-power performance criteria, an in-house auxiliary board (Flash/ADC board), an FPGA development kit, a debug board, and a modular instrumentation board are used jointly for data acquisition. The Flash/ADC board contains four 1 MSPS input channel signals and an Open NAND Flash memory module with an analog-to-digital converter. The ADC, data bits, and control line signals from the board are sent to a Microsemi/Actel FPGA development kit for VHDL programming of the flash memory WRITE, READ, READ STATUS, ERASE, and RESET operation waveforms using Libero software. The debug board is used for verification of the analog input signal and can communicate via serial interface with the modular instrumentation. The scope of the new controller module was to find and develop an ONFI controller, with the debug board layout designed and completed for manufacture. Flash memory operation waveform test routines were completed, simulated, and successfully tested on the FPGA board. When the Flash/ADC board was connected to the FPGA, it was found that the device specifications were not being met, with Vdd reaching only half of its voltage. Further testing showed that the manufactured Flash/ADC board contained a misalignment in the ONFI memory module traces. The errors proved too great to fix within the time limit set for the project.
Recommendations for fluorescence instrument qualification: the new ASTM Standard Guide.
DeRose, Paul C; Resch-Genger, Ute
2010-03-01
Aimed at improving quality assurance and quantitation for modern fluorescence techniques, ASTM International (ASTM) is about to release a Standard Guide for Fluorescence, reviewed here. The guide's main focus is on steady state fluorometry, for which available standards and instrument characterization procedures are discussed along with their purpose, suitability, and general instructions for use. These include the most relevant instrument properties needing qualification, such as linearity and spectral responsivity of the detection system, spectral irradiance reaching the sample, wavelength accuracy, sensitivity or limit of detection for an analyte, and day-to-day performance verification. With proper consideration of method-inherent requirements and limitations, many of these procedures and standards can be adapted to other fluorescence techniques. In addition, procedures for the determination of other relevant fluorometric quantities including fluorescence quantum yields and fluorescence lifetimes are briefly introduced. The guide is a clear and concise reference geared for users of fluorescence instrumentation at all levels of experience and is intended to aid in the ongoing standardization of fluorescence measurements.
Results of the SOLCON FREESTAR Total Solar Irradiance measurements
NASA Astrophysics Data System (ADS)
Dewitte, S.; Joukoff, A.; Crommelynck, D.
2003-04-01
The measurement of the Total Solar Irradiance from space has been ongoing since 1978. A long-term series requires the combination of the time-limited measurements of individual instruments. The accuracy of the long-term series is limited by the absolute accuracy of the instruments and by their ageing in space due to exposure to UV radiation. As a reference for the combination of the different instruments, we use the measurements of the SOLar CONstant (SOLCON) instrument, which is flown regularly on the Space Shuttle. In this paper we present the results of the most recent SOLCON flight, the Fast Reaction Experiments Enabling Science, Technology, Applications and Research (FREESTAR) flight foreseen from 16 Jan. 2003 to 1 Feb. 2003. The anticipated results are: 1) comparison of SOLCON with the new Active Cavity Radiometer Irradiance Monitor (ACRIM) III instrument, 2) comparison with the Total Irradiance Monitor (TIM) on the Solar Radiation and Climate Experiment (SORCE) satellite, and 3) verification of the ageing of the Variability of IRradiance and Gravity Oscillations (VIRGO) radiometers.
Validation and Verification (V and V) Testing on Midscale Flame Resistant (FR) Test Method
2016-12-16
Validation and verification (V&V) testing was conducted on a midscale flame resistant (FR) test method developed by the Natick Soldier Research, Development and Engineering Center (NSRDEC) to complement (not replace) the capabilities of the ASTM F1930 Standard Test Method for Evaluation of Flame Resistant Clothing for Protection against Fire Simulations Using an Instrumented Manikin.
Some General Principles in Cryogenic Design, Implementation, and Testing
NASA Technical Reports Server (NTRS)
Dipirro, Michael James
2015-01-01
Brief Course Description: In two hours only the most basic principles of cryogenics can be presented. I will concentrate on the differences between a room-temperature thermal analysis and a cryogenic thermal analysis, namely temperature-dependent properties. I will talk about practical materials for thermal contact and isolation. I will finish by describing the verification process and instrumentation that are unique to cryogenic (generally below 100 K) systems.
A Study on Run Time Assurance for Complex Cyber Physical Systems
2013-04-18
A safety verification approach was applied by Jiang et al. [36] to synchronization of the distributed local clocks of the nodes on a CAN bus. Based on the mode of interaction between the instrumented system and the checker, we distinguish between synchronous and asynchronous monitoring. Synchronous monitoring may deliver a higher degree of assurance than asynchronous monitoring, because it can block a dangerous action before it has occurred.
2017-06-06
Keywords: Geophysical Mapping, Electromagnetic Induction, Instrument Verification Strip, Time Domain Electromagnetic, Unexploded Ordnance.
Calculation of far-field scattering from nonspherical particles using a geometrical optics approach
NASA Technical Reports Server (NTRS)
Hovenac, Edward A.
1991-01-01
A numerical method was developed using geometrical optics to predict far-field optical scattering from particles that are symmetric about the optic axis. The diffractive component of scattering is calculated and combined with the reflective and refractive components to give the total scattering pattern. The phase terms of the scattered light are calculated as well. Verification of the method was achieved by assuming a spherical particle and comparing the results to Mie scattering theory. Agreement with the Mie theory was excellent in the forward-scattering direction. However, small-amplitude oscillations near the rainbow regions were not observed using the numerical method. Numerical data from spheroidal particles and hemispherical particles are also presented. The use of hemispherical particles as a calibration standard for intensity-type optical particle-sizing instruments is discussed.
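The diffraction component combined into the total scattering pattern above is, for a sphere, the classical Airy pattern: for size parameter x = 2πa/λ the forward lobe falls off as [2 J1(x sin θ)/(x sin θ)]², which is why the geometrical-optics result agrees with Mie theory in the forward direction. The sketch below evaluates that diffraction term only (not the full reflective/refractive method of the paper); the droplet size and wavelength are illustrative assumptions, and J1 is computed from its integral representation so only the standard library is needed.

```python
# Diffraction (Airy) component of forward scattering from a sphere,
# normalized to 1 in the exact forward direction.
import math

def bessel_j1(x, n=2000):
    """J1(x) via its integral form (1/pi) * integral of cos(t - x sin t), t in [0, pi]."""
    h = math.pi / n
    s = 0.0
    for k in range(n + 1):
        t = k * h
        w = 0.5 if k in (0, n) else 1.0   # trapezoidal weights
        s += w * math.cos(t - x * math.sin(t))
    return s * h / math.pi

def airy_intensity(theta, size_param):
    """Normalized diffraction intensity [2 J1(u)/u]^2 with u = x sin(theta)."""
    u = size_param * math.sin(theta)
    if abs(u) < 1e-12:
        return 1.0                        # limit as u -> 0
    return (2.0 * bessel_j1(u) / u) ** 2

x = 2 * math.pi * 5.0 / 0.6328            # a = 5 um droplet, He-Ne wavelength (assumed)
print(round(airy_intensity(0.0, x), 3))   # 1.0 at theta = 0
print(airy_intensity(0.2, x) < 0.01)      # lobe falls off rapidly off-axis
```

For nonspherical particles the paper replaces this closed form with a numerical diffraction integral over the particle's projected outline, but the forward-lobe behavior is the same.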
Experimental Flow Models for SSME Flowfield Characterization
NASA Technical Reports Server (NTRS)
Abel, L. C.; Ramsey, P. E.
1989-01-01
Full scale flow models with extensive instrumentation were designed and manufactured to provide data necessary for flow field characterization in rocket engines of the Space Shuttle Main Engine (SSME) type. These models include accurate flow path geometries from the pre-burner outlet through the throat of the main combustion chamber. The turbines are simulated with static models designed to provide the correct pressure drop and swirl for specific power levels. The correct turbopump-hot gas manifold interfaces were designed into the flow models to permit parametric/integration studies for new turbine designs. These experimental flow models provide a vehicle for understanding the fluid dynamics associated with specific engine issues and also fill the more general need for establishing a more detailed fluid dynamic base to support development and verification of advanced math models.
Comparison of Fiber Optic Strain Demodulation Implementations
NASA Technical Reports Server (NTRS)
Quach, Cuong C.; Vazquez, Sixto L.
2005-01-01
NASA Langley Research Center is developing instrumentation based upon principles of Optical Frequency-Domain Reflectometry (OFDR) for the provision of large-scale, dense distribution of strain sensors using fiber optics embedded with Bragg gratings. Fiber Optic Bragg Grating technology enables the distribution of thousands of sensors immune to moisture and electromagnetic interference with negligible weight penalty. At Langley, this technology provides a key component for research and development relevant to comprehensive aerospace vehicle structural health monitoring. A prototype system is under development that includes hardware and software necessary for the acquisition of data from an optical network and conversion of the data into strain measurements. This report documents the steps taken to verify the software that implements the algorithm for calculating the fiber strain. Brief descriptions of the strain measurement system and the test article are given. The scope of this report is the verification of software implementations as compared to a reference model. The algorithm will be detailed along with comparison results.
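The core conversion such an interrogator performs is from Bragg wavelength shift to strain. A minimal sketch of that relation is shown below, using the standard fiber-optics form Δλ/λ0 = (1 - p_e)·ε with photo-elastic coefficient p_e ≈ 0.22 for silica fiber; the coefficient and sample wavelengths are generic textbook assumptions, not the algorithm or calibration of the Langley system described in the report.

```python
# Strain recovered from a Bragg grating's wavelength shift (generic FBG relation).

def strain_microstrain(lambda0_nm, lambda_nm, p_e=0.22):
    """Convert a measured Bragg wavelength (nm) to strain in microstrain."""
    shift = (lambda_nm - lambda0_nm) / lambda0_nm   # relative wavelength shift
    return shift / (1.0 - p_e) * 1e6                # invert shift = (1 - p_e) * strain

# A 1550 nm grating stretched so its reflection peak moves to 1550.605 nm:
print(round(strain_microstrain(1550.0, 1550.605)))  # about 500 microstrain
```

In an OFDR system this conversion is applied per grating, after the peak wavelength of each sensor along the fiber has been localized in the frequency-domain data.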
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
Code of Federal Regulations, 2010 CFR
2010-04-01
24 CFR 886.305 - Disclosure and verification of Social Security and Employer Identification Numbers by owners. Housing and Urban Development, Regulations Relating to Housing and Urban Development (Continued), OFFICE OF THE ASSISTANT...
24 CFR 5.233 - Mandated use of HUD's Enterprise Income Verification (EIV) System.
Code of Federal Regulations, 2010 CFR
2010-04-01
24 CFR 5.233 - Mandated use of HUD's Enterprise Income Verification (EIV) System. Office of the Secretary, Department of Housing and Urban Development, GENERAL HUD PROGRAM REQUIREMENTS; WAIVERS Disclosure...
24 CFR 5.240 - Family disclosure of income information to the responsible entity and verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
24 CFR 5.240 - Family disclosure of income information to the responsible entity and verification. Office of the Secretary, Department of Housing and Urban Development, GENERAL HUD PROGRAM REQUIREMENTS...
OFCC based voltage and transadmittance mode instrumentation amplifier
NASA Astrophysics Data System (ADS)
Nand, Deva; Pandey, Neeta; Pandey, Rajeshwari; Tripathi, Prateek; Gola, Prashant
2017-07-01
The operational floating current conveyor (OFCC) is a versatile active block due to the availability of both low- and high-impedance input and output terminals. This paper addresses the realization of OFCC-based voltage-mode and transadmittance-mode instrumentation amplifiers (VMIA and TAM IA). The VMIA employs three OFCCs and seven resistors. The transadmittance-mode operation can easily be obtained by simply connecting an OFCC-based voltage-to-current converter at the output. The effect of non-idealities of the OFCC, in particular finite transimpedance and tracking error, on system performance is also dealt with, and corresponding mathematical expressions are derived. Functional verification is performed through SPICE simulation using a CMOS-based implementation of the OFCC.
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.
MC and A instrumentation catalog
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neymotin, L.; Sviridova, V.
1998-06-01
In 1981 and 1985, two editions of a catalog of non-destructive nuclear measurement instrumentation, and material control and surveillance equipment, were published by Brookhaven National Laboratory (BNL). The last edition of the catalog included one hundred and twenty-five entries covering a wide range of devices developed in the US and abroad. More than ten years have elapsed since the publication of the more recent catalog. Devices described in it have undergone significant modifications, and new devices have been developed. Therefore, in order to assist specialists in the field of Material Control and Accounting (MC and A), a new catalog has been created. Work on this instrumentation catalog started in 1997 as a cooperative effort of Brookhaven National Laboratory (BNL), operated by Brookhaven Science Associates under contract to the US Department of Energy, and the All-Russian Research Institute of Automatics (VNIIA), subordinate institute of the Atomic Energy Ministry of the Russian Federation, within the collaborative US-Russia Material Protection, Control, and Accounting (MPC and A) Program. Most of the equipment included in the Catalog are non-destructive assay (NDA) measurement devices employed for purposes of accounting, confirmation, and verification of nuclear materials. Other devices also included in the Catalog are employed in the detection and deterrence of unauthorized access to or removal of nuclear materials (material control: containment and surveillance). Equipment found in the Catalog comprises either: (1) complete devices or systems that can be used for MC and A applications; or (2) parts or components of complete systems, such as multi-channel analyzers, detectors, neutron generators, and software. All devices are categorized by their status of development, from prototype to serial production.
Construction Status and Early Science with the Daniel K. Inouye Solar Telescope
NASA Astrophysics Data System (ADS)
McMullin, Joseph P.; Rimmele, Thomas R.; Warner, Mark; Martinez Pillet, Valentin; Craig, Simon; Woeger, Friedrich; Tritschler, Alexandra; Berukoff, Steven J.; Casini, Roberto; Goode, Philip R.; Knoelker, Michael; Kuhn, Jeffrey Richard; Lin, Haosheng; Mathioudakis, Mihalis; Reardon, Kevin P.; Rosner, Robert; Schmidt, Wolfgang
2016-05-01
The 4-m Daniel K. Inouye Solar Telescope (DKIST) is in its seventh year of overall development and its fourth year of site construction on the summit of Haleakala, Maui. The Site Facilities (Utility Building and Support & Operations Building) are in place with ongoing construction of the Telescope Mount Assembly within. Off-site, the fabrication of the component systems is completing, with early integration testing and verification starting. Once complete, this facility will provide the highest sensitivity and resolution for study of solar magnetism and the drivers of key processes impacting Earth (solar wind, flares, coronal mass ejections, and variability in solar output). The DKIST will be equipped initially with a battery of first-light instruments which cover a spectral range from the UV (380 nm) to the near IR (5000 nm), and are capable of providing both imaging and spectro-polarimetric measurements throughout the solar atmosphere (photosphere, chromosphere, and corona); these instruments are being developed by the National Solar Observatory (Visible Broadband Imager), High Altitude Observatory (Visible Spectro-Polarimeter), Kiepenheuer Institute (Visible Tunable Filter) and the University of Hawaii (Cryogenic Near-Infrared Spectro-Polarimeter and the Diffraction-Limited Near-Infrared Spectro-Polarimeter). Further, a United Kingdom consortium led by Queen's University Belfast is driving the development of high-speed cameras essential for capturing the highly dynamic processes measured by these instruments. Finally, a state-of-the-art adaptive optics system will support diffraction-limited imaging capable of resolving features approximately 20 km in scale on the Sun. We present the overall status of the construction phase along with the current challenges, as well as a review of the planned science testing and the transition into early science operations.
Advanced verification methods for OVI security ink
NASA Astrophysics Data System (ADS)
Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom
2006-02-01
OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
An automatic verification system was developed for ultrasonic flaw detection equipment with a remote-control interface. By using Extensible Markup Language to build the protocol instruction set and the data-analysis method database in the system software, the design becomes configurable and accommodates the diversity of undisclosed device interfaces and protocols. A dynamic error-compensation method, using a signal generator cascaded with a fixed attenuator, is proposed; it performs the role of the fixed attenuator in traditional verification and improves the accuracy of verification results. Operating results confirm the feasibility of the system's hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
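The XML-driven instruction-set idea described above can be sketched as follows: each supported instrument's command protocol is declared in XML, and the software looks commands up at run time instead of hard-coding any vendor interface. The element names, model name, and command strings below are invented for illustration; they are not from the actual system.

```python
# Look up device commands from an XML protocol description instead of
# hard-coding a vendor-specific interface.
import xml.etree.ElementTree as ET

PROTOCOLS = """
<instrument model="UT-100">
  <command name="set_gain">GAIN {value:.1f}</command>
  <command name="read_waveform">WAVE?</command>
</instrument>
"""

def build_command(xml_text, name, terminator="\r\n", **params):
    """Fill the named command template from the XML protocol description."""
    root = ET.fromstring(xml_text)
    for cmd in root.iter("command"):
        if cmd.get("name") == name:
            return cmd.text.format(**params) + terminator
    raise KeyError(name)

print(repr(build_command(PROTOCOLS, "set_gain", value=24)))  # 'GAIN 24.0\r\n'
```

Supporting a new flaw detector then means adding an XML description rather than changing the controller software, which is the configurability the abstract claims.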
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luke, S J
2011-12-20
This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime.
To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach, coined Information Loss Analysis, might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often aims to optimize the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce a project's schedule and cost is giving rise to a dialectic process inside project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The model philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities) as shown in Fig. 1 (from ECSS-E-10). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted.
Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. (Fig. 1: Model philosophy, Verification and Test Programme definition.) The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. Qualification and Acceptance).
Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of adaptive autonomous systems. IDA examined evidence-based licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. Elements include translation of natural-language requirements to testable (preferably machine-testable) specifications, and design of architectures that treat development and verification jointly.
Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...
Signature Verification Using N-tuple Learning Machine.
Maneechot, Thanin; Kitjaidure, Yuttana
2005-01-01
This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
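The N-tuple technique named above can be sketched in a few lines: binarized feature vectors are split into fixed random tuples of n bit positions; training stores each tuple's address pattern in a per-tuple table, and a test signature scores by how many of its tuple addresses were seen during training. The binarization and the simple score threshold here are simplified stand-ins for the paper's feature extraction and Gaussian thresholding, and the sample vectors are invented.

```python
# Minimal N-tuple (RAM-based) learning machine over binary feature vectors.
import random

class NTupleVerifier:
    def __init__(self, num_bits, n=4, num_tuples=16, seed=0):
        rng = random.Random(seed)
        # each tuple is a fixed random choice of n bit positions
        self.tuples = [rng.sample(range(num_bits), n) for _ in range(num_tuples)]
        self.memory = [set() for _ in range(num_tuples)]

    def _address(self, bits, positions):
        return tuple(bits[p] for p in positions)

    def train(self, bits):
        for mem, pos in zip(self.memory, self.tuples):
            mem.add(self._address(bits, pos))

    def score(self, bits):
        """Fraction of tuples whose address was seen in genuine training data."""
        hits = sum(self._address(bits, pos) in mem
                   for mem, pos in zip(self.memory, self.tuples))
        return hits / len(self.tuples)

genuine = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
forgery = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # bitwise opposite of genuine

v = NTupleVerifier(num_bits=12)
v.train(genuine)
print(v.score(genuine))        # 1.0: every tuple address matches
print(v.score(forgery) < 1.0)  # a dissimilar signature scores lower
```

In a full system the score would be compared against a threshold fitted to the distribution of genuine-signature scores, which is where the Gaussian thresholding enters.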
Review and verification of CARE 3 mathematical model and code
NASA Technical Reports Server (NTRS)
Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.
1983-01-01
This report documents the CARE III mathematical model and code verification performed by Boeing Computer Services. The mathematical model was verified for permanent and intermittent faults; the transient fault model was not addressed. The code verification was performed on CARE III, Version 3. CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.
NASA Technical Reports Server (NTRS)
Bourkland, Kristin L.; Liu, Kuo-Chia
2011-01-01
The Solar Dynamics Observatory (SDO), launched in 2010, is a NASA-designed spacecraft built to study the Sun. SDO has tight pointing requirements and instruments that are sensitive to spacecraft jitter. Two High Gain Antennas (HGAs) are used to continuously send science data to a dedicated ground station. Preflight analysis showed that jitter resulting from motion of the HGAs was a cause for concern. Three jitter mitigation techniques were developed and implemented to overcome effects of jitter from different sources. These mitigation techniques include: the random step delay, stagger stepping, and the No Step Request (NSR). During the commissioning phase of the mission, a jitter test was performed onboard the spacecraft, in which various sources of jitter were examined to determine their level of effect on the instruments. During the HGA portion of the test, the jitter amplitudes from the single step of a gimbal were examined, as well as the amplitudes due to the execution of various gimbal rates. The jitter levels were compared with the gimbal jitter allocations for each instrument. The decision was made to consider implementing two of the jitter mitigating techniques on board the spacecraft: stagger stepping and the NSR. Flight data with and without jitter mitigation enabled was examined, and it is shown in this paper that HGA tracking is not negatively impacted with the addition of the jitter mitigation techniques. Additionally, the individual gimbal steps were examined, and it was confirmed that the stagger stepping and NSRs worked as designed. An Image Quality Test was performed to determine the amount of cumulative jitter from the reaction wheels, HGAs, and instruments during various combinations of typical operations. The HGA-induced jitter on the instruments is well within the jitter requirement when the stagger step and NSR mitigation options are enabled.
Towards SMOS: The 2006 National Airborne Field Experiment Plan
NASA Astrophysics Data System (ADS)
Walker, J. P.; Merlin, O.; Panciera, R.; Kalma, J. D.
2006-05-01
The 2006 National Airborne Field Experiment (NAFE) is the second in a series of two intensive experiments to be conducted in different parts of Australia. The NAFE'05 experiment was undertaken in the Goulburn River catchment during November 2005, with the objective to provide high resolution data for process-level understanding of soil moisture retrieval, scaling and data assimilation. The NAFE'06 experiment will be undertaken in the Murrumbidgee catchment during November 2006, with the objective to provide data for SMOS (Soil Moisture and Ocean Salinity) level soil moisture retrieval, downscaling and data assimilation. To meet this objective, PLMR (Polarimetric L-band Multibeam Radiometer) and supporting instruments (TIR and NDVI) will be flown at an altitude of 10,000 ft AGL to provide 1km resolution passive microwave data (and 20m TIR) across a 50km x 50km area every 2-3 days. This will both simulate a SMOS pixel and provide the 1km soil moisture data required for downscaling verification, allowing downscaling and near-surface soil moisture assimilation techniques to be tested with remote sensing data which is consistent with that from current (MODIS) and planned (SMOS) satellite sensors. Additionally, two transects will be flown across the area to provide both 1km multi-angular passive microwave data for SMOS algorithm development and, on the same day, 50m resolution passive microwave data for algorithm verification. The study area contains a total of 13 soil moisture profile and rainfall monitoring sites for assimilation verification, and the transect flight lines are planned to go through 5 of these. Ground monitoring of surface soil moisture and vegetation for algorithm verification will be targeted at these 5 focus farms, with soil moisture measurements made at 250m spacing for 1km resolution flights and 50m spacing for 50m resolution flights.
While this experiment has a particular emphasis on the remote sensing of soil moisture, it is open for collaboration from interested scientists from all disciplines of environmental remote sensing and its application. See www.nafe.unimelb.edu.au for more detailed information on these experiments.
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec, Turin (Italy). The verification process of the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints.
The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper mainly focuses on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified by analysis alone; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in a simulated Mars environment, and the simulation of Rover operations on 'Mars like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) up to the final verification close-out of those requirements with the final verification reports.
Martignon, Stefania; Bautista-Mendoza, Gloria; González-Carrera, María; Lafaurie-Villamil, Gloria; Morales, Veicy; Santamaría, Ruth
2008-01-01
This study aimed at designing three instruments for evaluating oral health knowledge, attitudes and practice in parents/caregivers of low socio-economic status 0-5 year-olds, evaluating the instruments' reliability in terms of internal consistency, and analysing the items. Three instruments were constructed for evaluating low socio-economic status 0-5 year-olds' parents/caregivers' oral health knowledge, attitudes and practice in the municipality of Usaquén, Bogotá, Colombia. 47 parents/caregivers were given a test establishing the instruments' reliability in terms of internal consistency and the adults' level of knowledge, attitudes and practice. A sub-sample was qualitatively analysed (content verification and understanding). Reliability was evaluated using Cronbach's alpha coefficient. Items were analysed to improve the construction and understanding of the questions, taking four criteria into account: corrected homogeneity index (CHI), response trend, correlation between items, and qualitative analysis. Cronbach's alpha coefficient for knowledge, attitudes and practice was 0.82, 0.80 and 0.62, respectively. Participants' level of knowledge, attitudes and practice was acceptable (60%, 55% and 91%, respectively). This study found two of the three evaluated instruments to be reliable (knowledge and attitudes); all three were then redesigned. The resulting instruments represent a valuable tool which can be used in future studies for describing and evaluating preventative programmes.
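The internal-consistency criterion used above can be reproduced with a short calculation. The sketch below is a plain-Python illustration of Cronbach's alpha, not the authors' analysis code; the score data passed to it would come from the questionnaire responses.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: one list per questionnaire item, aligned by respondent
    (the j-th entry of every inner list belongs to respondent j).
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    k = len(item_scores)
    respondents = list(zip(*item_scores))        # one tuple per respondent
    totals = [sum(r) for r in respondents]       # each respondent's total score
    item_var_sum = sum(pvariance(item) for item in item_scores)
    total_var = pvariance(totals)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Two perfectly consistent items give alpha = 1.0
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```

By convention, alpha above roughly 0.7-0.8 is read as acceptable reliability, which is why the knowledge (0.82) and attitudes (0.80) scales passed while practice (0.62) did not.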
Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument
NASA Technical Reports Server (NTRS)
Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather;
2011-01-01
NESSI: the New Mexico Tech Extrasolar Spectroscopic Survey Instrument is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter telescope, sited in the Magdalena Mountains about 48 km west of Socorro, NM. NESSI is mounted stationary with respect to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable on both short and long observation timescales, within a wide range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration and verification.
Development and Verification of Sputtered Thin-Film Nickel-Titanium (NiTi) Shape Memory Alloy (SMA)
2015-08-01
by Cory R Knick and Christopher J Morris. Approved for public release; distribution unlimited.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until encountered by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
Options and Risk for Qualification of Electric Propulsion System
NASA Technical Reports Server (NTRS)
Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)
2002-01-01
Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper addresses the alternatives for qualification of electric propulsion spacecraft systems. The approach taken is to address the considerations for qualification at the various levels of system definition. Additionally, for each level of qualification, the system-level risk implications are developed. The paper also explores the implications of analysis versus test at the various levels of system definition, while retaining the objectives of a verification program. The limitations of terrestrial testing are explored, along with the risk and implications of orbital demonstration testing. The paper seeks to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper addresses the political and programmatic factors which may impact options for system verification.
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method, which uses multiple colors and multiple directions of illumination. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with only a 0.1 percent false-alarm rate.
Pinheiro, Leonardo B; O'Brien, Helen; Druce, Julian; Do, Hongdo; Kay, Pippa; Daniels, Marissa; You, Jingjing; Burke, Daniel; Griffiths, Kate; Emslie, Kerry R
2017-11-07
Use of droplet digital PCR technology (ddPCR) is expanding rapidly in the diversity of applications and number of users around the world. Access to relatively simple and affordable commercial ddPCR technology has attracted wide interest in use of this technology as a molecular diagnostic tool. For ddPCR to effectively transition to a molecular diagnostic setting requires processes for method validation and verification and demonstration of reproducible instrument performance. In this study, we describe the development and characterization of a DNA reference material (NMI NA008 High GC reference material) comprising a challenging methylated GC-rich DNA template under a novel 96-well microplate format. A scalable process using high precision acoustic dispensing technology was validated to produce the DNA reference material with a certified reference value expressed in amount of DNA molecules per well. An interlaboratory study, conducted using blinded NA008 High GC reference material to assess reproducibility among seven independent laboratories demonstrated less than 4.5% reproducibility relative standard deviation. With the exclusion of one laboratory, laboratories had appropriate technical competency, fully functional instrumentation, and suitable reagents to perform accurate ddPCR based DNA quantification measurements at the time of the study. The study results confirmed that NA008 High GC reference material is fit for the purpose of being used for quality control of ddPCR systems, consumables, instrumentation, and workflow.
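The interlaboratory reproducibility figure quoted above (a relative standard deviation under 4.5%) is a standard summary statistic. The helper below is an illustrative reconstruction with made-up per-laboratory results, not the study's analysis code.

```python
from statistics import mean, stdev

def reproducibility_rsd(lab_means):
    """Percent relative standard deviation (RSD) of per-laboratory
    mean results: 100 * sample standard deviation / grand mean."""
    return 100.0 * stdev(lab_means) / mean(lab_means)

# Hypothetical ddPCR results (DNA molecules per well) from seven labs
labs = [100.0, 102.0, 98.0, 101.0, 99.0, 100.0, 100.0]
print(reproducibility_rsd(labs))
```

An RSD computed this way below the study's 4.5% threshold would indicate that the participating laboratories quantified the reference material consistently.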
Screenee perception and health-related quality of life in colorectal cancer screening: a review.
Pizzo, Elena; Pezzoli, Alessandro; Stockbrugger, Reinhold; Bracci, Enrico; Vagnoni, Emidia; Gullini, Sergio
2011-01-01
Screening for colorectal cancer (CRC) has become established to varying degrees in several Western countries over the past 30 years. Because of its effectiveness, screening has been adopted or is planned in a number of other countries. In most countries, the screening method (e.g., fecal occult blood test [FOBT], sigmoidoscopy) is followed by colonoscopy for verification. In other countries (e.g., United States, Germany), colonoscopy is the preferred first-line investigation method. However, because colonoscopy is considered invasive, might be poorly tolerated, and can be associated with complications, the adoption of colonoscopy as the primary screening method has met resistance. Negative effects of screening methods can reduce participation in programs and thereby negate the desired effect on individual and societal health. At present, there is no generally accepted method to assess either the perception and satisfaction of patients screened or the outcome of the screening procedures in CRC. In this review, we discuss the past development and present availability of instruments to measure health-related quality of life (HRQoL), the scarce studies in which such instruments have been used in screening campaigns, and the findings. We suggest the creation of a specific instrument for the assessment of HRQoL in CRC screening. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Investigation into Practical Implementations of a Zero Knowledge Protocol.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Krentz-Wee, Rebecca E.
In recent years, the concept of Zero Knowledge Protocols (ZKP) as a useful approach to nuclear warhead verification has become increasingly popular. Several implementations of ZKP have been proposed, driving technology development toward proof-of-concept demonstrations. While proposed implementations seem to fall within the general class of template-based techniques, all physical implementations of ZKPs proposed to date have a complication: once the instrumentation is prepared, it is no longer authenticatable; the instrument physically contains sensitive information. In this work we explore three different concepts that may offer more authenticatable and practical ZKP implementations, and evaluate the sensitive information that may be at risk when doing so: sharing a subset of detector counts in a preloaded image (with spatial information removed), real-time image subtraction, and a new concept, CONfirmation using a Fast-neutron Imaging Detector with Anti-image NULL-positive Time Encoding (CONFIDANTE). CONFIDANTE promises to offer an almost ideal implementation of ZKP: a positive result is indicated by a constant rate at all times, allowing the monitoring party the possibility of full access to the instrument before, during, and after confirmation. A prototype of CONFIDANTE was designed and built, and its performance was evaluated in a series of measurements of several objects including a set of plutonium dioxide hemispheres. Very encouraging results proving feasibility are presented. Rebecca Krentz-Wee is currently a graduate student in Nuclear Engineering at UC Berkeley.
VizieR Online Data Catalog: NIR spectroscopy of new L and T dwarf candidates (Kellogg+, 2017)
NASA Astrophysics Data System (ADS)
Kellogg, K.; Metchev, S.; Miles-Paez, P. A.; Tannock, M. E.
2018-02-01
We implemented a photometric search for peculiar L and T dwarfs using combined optical (SDSS), near-infrared (2MASS) and mid-infrared (WISE) fluxes. In Paper I (Kellogg et al. 2015AJ....150..182K), we reported a sample of 314 objects that passed all of our selection criteria and visual verification. After refining our visual verification, our total candidate L and T dwarf list was cut to 156 objects, including 104 new candidates. We obtained near-infrared spectroscopic observations of the remaining 104 objects in our survey (66 peculiarly red, 13 candidate binary, and 25 general ultra-cool dwarf candidates) using the SpeX instrument on the NASA Infrared Telescope Facility (IRTF) and the Gemini Near-Infrared Spectrograph (GNIRS) instrument on the Gemini North telescope. We obtained the majority of our follow-up observations (91 of 104) with the SpeX spectrograph on the IRTF in prism mode (0.75-2.5μm; R~75-150), between 2014 October and 2016 April. The observing sequences and instrument settings were the same as those in Paper I (Kellogg et al. 2015AJ....150..182K). Table 1 gives observation epochs and SpeX instrument settings for each science target. We followed up the remaining 13 objects in our candidate list using the Gemini Near-Infrared Spectrograph (GNIRS) on Gemini North (0.9-2.5μm). We observed these objects in queue mode between 2015 October and 2017 May. We took the observations in cross-dispersed mode with the short-blue camera with a 32 l/mm grating and a 1.0''x7.0'' slit, resulting in a resolution of R~500. We used a standard A-B-B-A nodding sequence along the slit to record object and sky spectra. Individual exposure times were 120 s per pointing. Table 2 gives Gemini/GNIRS observation epochs for each science target. (4 data files).
The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...
Handbook: Design of automated redundancy verification
NASA Technical Reports Server (NTRS)
Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.
1971-01-01
The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.
The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) development
NASA Technical Reports Server (NTRS)
Shimamoto, Mike S.
1993-01-01
The development of an anthropomorphic, undersea manipulator system, the TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) is described. The TOPS system's design philosophy, which results from NRaD's experience in undersea vehicles and manipulator systems development and operations, is presented. The TOPS design approach, task teams, manipulator, and vision system development and results, conclusions, and recommendations are presented.
Statistical Time Series Models of Pilot Control with Applications to Instrument Discrimination
NASA Technical Reports Server (NTRS)
Altschul, R. E.; Nagel, P. M.; Oliver, F.
1984-01-01
A general description is given of the methodology used in obtaining the transfer function models and verifying model fidelity, along with frequency-domain plots of the modeled transfer functions, numerical results from an analysis of poles and zeroes obtained from z-plane to s-plane conversions of the transfer functions, and the results of a study on the sequential introduction of other variables, both exogenous and endogenous, into the loop.
Grazing-Angle Fourier Transform Infrared Spectroscopy for Surface Cleanliness Verification
2003-03-01
coating. North Island personnel were also interested in using the portable FTIR instrument to detect a trivalent chromium conversion coating on... trivalent chromium coating on aluminum panels. Following the successful field-test at NADEP North Island in December 2000, a second demonstration of...contaminated, the panels were allowed to dry under a fume hood to evaporate the solvent. They were then placed in a desiccator for final drying.
Verification of Disarmament or Limitation of Armaments: Instruments, Negotiations, Proposals
1992-05-01
explosions and may complicate the process of detection. An even greater difficulty faced by seismologists is the ambient background of seismic "noise...suspected event would be a complex operation. It would consist of surveys of the area of the presumed nuclear explosion in order to measure ambient ...Draft Resolution to the OAS General Assembly, June 1991 and OAS Resolution "Cooperacion para la seguridad en el hemisferio. Limitacion de la
Development and Verification of the Charring Ablating Thermal Protection Implicit System Solver
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan D.; Kirk, Benjamin S.
2010-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method with first and second order implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the fully implicit linear system is solved with the Generalized Minimal Residual method. Verification results from exact solutions and the Method of Manufactured Solutions are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
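The solution strategy described above (a fully coupled nonlinear system solved by Newton's method, with the linear system at each Newton step solved by GMRES) can be illustrated in miniature. The sketch below is not the CATPISS solver; it applies SciPy's `newton_krylov` (Newton's method wrapped around a Krylov linear solver, here GMRES) to a hypothetical 1-D nonlinear diffusion-reaction residual standing in for the heat-and-mass-transfer equations.

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    """Residual of a hypothetical steady 1-D problem u'' + exp(u) = 0
    on (0, 1) with u = 0 at both boundaries, discretized by second
    differences on len(u) interior points."""
    h = 1.0 / (len(u) + 1)
    r = np.empty_like(u)
    r[0] = (u[1] - 2 * u[0]) / h**2 + np.exp(u[0])
    r[-1] = (u[-2] - 2 * u[-1]) / h**2 + np.exp(u[-1])
    r[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2 + np.exp(u[1:-1])
    return r

# Newton's method with GMRES as the inner linear solver, as in the
# solver architecture described above (Jacobian applied matrix-free).
u0 = np.zeros(50)
sol = newton_krylov(residual, u0, method='gmres', f_tol=1e-9)
```

The matrix-free Jacobian-vector products that `newton_krylov` uses are one common reason to pair Newton's method with a Krylov solver such as GMRES on large coupled systems like the one in the paper.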
Development and Verification of the Charring, Ablating Thermal Protection Implicit System Simulator
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan; Kirk, Benjamin S.
2011-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver (CATPISS) is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method (FEM) with first and second order fully implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the linear system is solved via the Generalized Minimal Residual method (GMRES). Verification results from exact solutions and the Method of Manufactured Solutions (MMS) are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
NASA Astrophysics Data System (ADS)
Roed-Larsen, Trygve; Flach, Todd
The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit of best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely on these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of the efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.
Requirement Specifications for a Design and Verification Unit.
ERIC Educational Resources Information Center
Pelton, Warren G.; And Others
A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…
NASA Technical Reports Server (NTRS)
Haigh, R.; Krimchansky, S. (Technical Monitor)
2000-01-01
This is the Performance Verification Report, METSAT (S/N 108) AMSU-A1 Receiver Assemblies P/N 1356429-1 S/N F05 and P/N 1356409-1 S/N F05, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The ATP for the AMSU-A Receiver Subsystem, AE-26002/6A, is prepared to describe in detail the configuration of the test setups and the procedures of the tests to verify that the receiver subsystem meets the specifications as required either in the AMSU-A Instrument Performance and Operation Specifications, S-480-80, or in AMSU-A Receiver Subsystem Specifications, AE-26608, derived by the Aerojet System Engineering. Test results that verify the conformance to the specifications demonstrate the acceptability of that particular receiver subsystem.
NASA Technical Reports Server (NTRS)
Tsoucalas, George; Daniels, Taumi S.; Zysko, Jan; Anderson, Mark V.; Mulally, Daniel J.
2010-01-01
As part of the National Aeronautics and Space Administration's Aviation Safety and Security Program, the Tropospheric Airborne Meteorological Data Reporting project (TAMDAR) developed a low-cost sensor for aircraft flying in the lower troposphere. This activity was a joint effort with support from Federal Aviation Administration, National Oceanic and Atmospheric Administration, and industry. This paper reports the TAMDAR sensor performance validation and verification, as flown on board NOAA Lockheed WP-3D aircraft. These flight tests were conducted to assess the performance of the TAMDAR sensor for measurements of temperature, relative humidity, and wind parameters. The ultimate goal was to develop a small low-cost sensor, collect useful meteorological data, downlink the data in near real time, and use the data to improve weather forecasts. The envisioned system will initially be used on regional and package carrier aircraft. The ultimate users of the data are National Centers for Environmental Prediction forecast modelers. Other users include air traffic controllers, flight service stations, and airline weather centers. NASA worked with an industry partner to develop the sensor. Prototype sensors were subjected to numerous tests in ground and flight facilities. As a result of these earlier tests, many design improvements were made to the sensor. The results of tests on a final version of the sensor are the subject of this report. The sensor is capable of measuring temperature, relative humidity, pressure, and icing. It can compute pressure altitude, indicated air speed, true air speed, ice presence, wind speed and direction, and eddy dissipation rate. Summary results from the flight test are presented along with corroborative data from aircraft instruments.
Fresh Fuel Measurements With the Differential Die-Away Self-Interrogation Instrument
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trahan, Alexis C.; Belian, Anthony P.; Swinhoe, Martyn T.
The purpose of the Next Generation Safeguards Initiative (NGSI)-Spent Fuel (SF) Project is to strengthen the technical toolkit of safeguards inspectors and/or other interested parties. Thus the NGSI-SF team is working to achieve the following technical goals more easily and efficiently than in the past using nondestructive assay measurements of spent fuel assemblies: 1) verify the initial enrichment, burnup, and cooling time of facility declaration; 2) detect the diversion or replacement of pins; 3) estimate the plutonium mass; 4) estimate decay heat; and 5) determine the reactivity of spent fuel assemblies. The differential die-away self-interrogation (DDSI) instrument is one instrument that was assessed for years regarding its feasibility for robust, timely verification of spent fuel assemblies. The instrument was recently built and was tested using fresh fuel assemblies in a variety of configurations, including varying enrichment, neutron absorber content, and symmetry. The early die-away method, a multiplication determination method developed in simulation space, was successfully tested on the fresh fuel assembly data and determined multiplication with a root-mean-square (RMS) error of 2.9%. The experimental results were compared with MCNP simulations of the instrument as well. Low-multiplication assemblies had agreement with an average RMS error of 0.2% in the singles count rate (i.e., total neutrons detected per second) and 3.4% in the doubles count rates (i.e., neutrons detected in coincidence per second). High-multiplication assemblies had agreement with an average RMS error of 4.1% in the singles and 13.3% in the doubles count rates.
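The RMS agreement figures quoted above summarize per-assembly percent differences between simulated and measured count rates. The helper below is an illustrative reconstruction of that summary statistic with made-up count rates, not the NGSI-SF analysis code.

```python
from math import sqrt

def rms_percent_error(measured, simulated):
    """Root-mean-square of the percent differences between simulated
    and measured values, paired element by element."""
    errs = [100.0 * (s - m) / m for m, s in zip(measured, simulated)]
    return sqrt(sum(e * e for e in errs) / len(errs))

# Hypothetical singles count rates (neutrons/s) for two assemblies
measured = [100.0, 200.0]
simulated = [102.0, 196.0]
print(rms_percent_error(measured, simulated))  # each off by 2%, so RMS is 2.0
```

Applied to the full set of assemblies, a figure like 0.2% in the singles rate indicates near-perfect model agreement, while 13.3% in the doubles rate for high-multiplication assemblies points to the harder-to-model coincidence physics.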
Fresh Fuel Measurements With the Differential Die-Away Self-Interrogation Instrument
Trahan, Alexis C.; Belian, Anthony P.; Swinhoe, Martyn T.; ...
2017-01-05
The purpose of the Next Generation Safeguards Initiative (NGSI)-Spent Fuel (SF) Project is to strengthen the technical toolkit of safeguards inspectors and/or other interested parties. Thus the NGSI-SF team is working to achieve the following technical goals more easily and efficiently than in the past using nondestructive assay measurements of spent fuel assemblies: 1) verify the initial enrichment, burnup, and cooling time of facility declaration; 2) detect the diversion or replacement of pins; 3) estimate the plutonium mass; 4) estimate decay heat; and 5) determine the reactivity of spent fuel assemblies. The differential die-away self-interrogation (DDSI) instrument is one instrument that was assessed for years regarding its feasibility for robust, timely verification of spent fuel assemblies. The instrument was recently built and was tested using fresh fuel assemblies in a variety of configurations, including varying enrichment, neutron absorber content, and symmetry. The early die-away method, a multiplication determination method developed in simulation space, was successfully tested on the fresh fuel assembly data and determined multiplication with a root-mean-square (RMS) error of 2.9%. The experimental results were compared with MCNP simulations of the instrument as well. Low-multiplication assemblies had agreement with an average RMS error of 0.2% in the singles count rate (i.e., total neutrons detected per second) and 3.4% in the doubles count rates (i.e., neutrons detected in coincidence per second). High-multiplication assemblies had agreement with an average RMS error of 4.1% in the singles and 13.3% in the doubles count rates.
NASA Astrophysics Data System (ADS)
Destyanto, A. R.; Putri, O. A.; Hidayatno, A.
2017-11-01
Due to the advantages that serious simulation games offer, many areas of study, including energy, have used serious simulation games as instruments. However, serious simulation games in the field of energy transition have still received little attention. In this study, a serious simulation game is developed and tested as a public-education activity about energy transition, namely the conversion from oil to natural gas. The aim of the game development is to create understanding and awareness about the importance of energy transition for society, in order to accelerate the process of energy transition in Indonesia; since 1987 the energy transition program has not achieved the conversion target, owing to the lack of public education about energy transition. Developed as a digital serious simulation game following the framework of integrated game design, the Transergy game has been tested on 15 users and then analysed. The result of verification and validation of the game shows that Transergy helps users understand, and triggers awareness of, the need for oil to natural gas conversion.
Development of syntax of intuition-based learning model in solving mathematics problems
NASA Astrophysics Data System (ADS)
Yeni Heryaningsih, Nok; Khusna, Hikmatul
2018-01-01
The aim of the research was to produce a syntax for an Intuition Based Learning (IBL) model for solving mathematics problems that is valid, practical, and effective in improving students' mathematics achievement. The subjects of the research were 2 classes of grade XI students at SMAN 2 Sragen, Central Java. The research was of the Research and Development (R&D) type. The development process followed the Plomp and Borg & Gall development models: a preliminary investigation step, a design step, a realization step, and an evaluation and revision step. The development steps were as follows: (1) In the preliminary investigation step, information was collected and theories were studied concerning intuition, learning model development, student conditions, and topic analysis; (2) A syntax was designed that could bring up intuition in solving mathematics problems, and the research instruments were designed; the phases that could bring up intuition were a Preparation phase, an Incubation phase, an Illumination phase, and a Verification phase; (3) The designed syntax of the IBL model was realized as a first draft; (4) The first draft was validated by the validators; (5) The syntax of the IBL model was tested in classrooms to establish its effectiveness; (6) A Focus Group Discussion (FGD) was conducted to evaluate the classroom testing, and the IBL model syntax was revised accordingly. The research produced a syntax of the IBL model for solving mathematics problems that is valid, practical, and effective.
The syntax of the IBL model in the classroom was: (1) Opening with apperception and motivation, building students' positive perceptions; (2) The teacher explains the material in general terms; (3) Group discussion of the material; (4) The teacher gives students mathematics problems; (5) Individual exercises to solve mathematics problems using the steps that bring up students' intuition: Preparation, Incubation, Illumination, and Verification; (6) Closure with a review of what students have learned, or assignment of homework.
Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...
Formally verifying Ada programs which use real number types
NASA Technical Reports Server (NTRS)
Sutherland, David
1986-01-01
Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of P, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.
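A central difficulty with formally modeling real number types is that machine arithmetic only approximates the reals. One well-known illustration (a generic floating-point fact, not drawn from the Ada work above) is that floating-point addition is not associative, so a faithful model must capture rounding rather than just the ideal field axioms:

```python
# Floating-point addition is not associative: the order of operations
# changes the result because intermediate sums are rounded.
a, b, c = 1e16, -1e16, 1.0
left = (a + b) + c    # cancellation happens first, so the 1.0 survives
right = a + (b + c)   # the 1.0 is absorbed by rounding at magnitude 1e16
print(left, right, left == right)  # 1.0 0.0 False
```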
NASA Technical Reports Server (NTRS)
1975-01-01
The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
Dynamic testing for shuttle design verification
NASA Technical Reports Server (NTRS)
Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.
1972-01-01
Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.
Developing a lower-cost atmospheric CO2 monitoring system using commercial NDIR sensor
NASA Astrophysics Data System (ADS)
Arzoumanian, E.; Bastos, A.; Gaynullin, B.; Laurent, O.; Vogel, F. R.
2017-12-01
Cities release to the atmosphere about 44% of global energy-related CO2. Accurate estimates of the magnitude of anthropogenic and natural urban emissions are clearly needed to assess their influence on the carbon balance. A dense ground-based CO2 monitoring network in cities would potentially allow sector-specific CO2 emission estimates to be retrieved when combined with an atmospheric inversion framework, provided the observations are reasonably accurate (ca. 1 ppm for hourly means). One major barrier to denser observation networks is the high cost of high-precision instruments, or the high calibration cost of cheaper, unstable instruments. We have developed and tested novel inexpensive NDIR sensors for CO2 measurement which fulfil the cost and performance requirements (i.e., signal stability, efficient handling, and connectivity) necessary for this task. Such sensors are essential to the market for city emission estimates from continuous monitoring networks, as well as for leak detection in MRV (monitoring, reporting, and verification) services for industrial sites. We conducted extensive laboratory tests (short- and long-term repeatability, cross-sensitivities, etc.) on a series of prototypes, and the final versions were also tested in a climatic chamber. On four final HPP prototypes the sensitivities to pressure and temperature were precisely quantified and correction and calibration strategies developed. Furthermore, we fully integrated these HPP sensors into a Raspberry Pi platform containing the CO2 sensor and additional sensors (pressure, temperature, and humidity), a gas supply pump, and a fully automated data acquisition unit. This platform was deployed in parallel with Picarro G2401 instruments at the peri-urban site Saclay, near Paris, and at the urban site Jussieu, in Paris, France.
These measurements were conducted over several months in order to characterize the long-term drift of our HPP instruments and the ability of the correction and calibration scheme to provide bias free observations. From the lessons learned in the laboratory tests and field measurements, we developed a specific correction and calibration strategy for our NDIR sensors. Latest results and calibration strategies will be shown.
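A correction-and-calibration scheme of the general kind described above might be sketched as follows; the linear pressure/temperature coefficients and the two-point span calibration are illustrative assumptions, not the HPP coefficients or the authors' actual scheme:

```python
def correct_co2(raw_ppm, p_hpa, t_celsius,
                kp=0.05, kt=0.3, p_ref=1013.25, t_ref=25.0):
    """Apply illustrative linear pressure/temperature corrections to a raw
    NDIR CO2 reading. Coefficients kp, kt are placeholders, not HPP values."""
    return raw_ppm - kp * (p_hpa - p_ref) - kt * (t_celsius - t_ref)

def two_point_calibration(readings, references):
    """Fit gain and offset from measurements of two calibration gases."""
    (x1, x2), (y1, y2) = readings, references
    gain = (y2 - y1) / (x2 - x1)
    offset = y1 - gain * x1
    return lambda x: gain * x + offset

# Hypothetical calibration: sensor read 398 and 602 ppm on 400/600 ppm gases.
cal = two_point_calibration((398.0, 602.0), (400.0, 600.0))
print(cal(correct_co2(500.0, 1013.25, 25.0)))
```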
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
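The verification-plan structure described (requirement → plan with method, level, owner → activities grouped into events) can be sketched as plain data types; the class and field names below mirror the paper's terminology, but the example values are hypothetical and this is not the LSST SysML model itself:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: List[str]   # e.g. Test, Analysis, Inspection, Demonstration
    level: str           # e.g. Component, Subsystem, System
    owner: str

@dataclass
class VerificationActivity:
    plan: VerificationPlan
    method: str          # each method of a plan becomes one activity

@dataclass
class VerificationEvent:
    """A collection of activities executed concurrently/together."""
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)

# Hypothetical example of mapping one requirement into an event.
plan = VerificationPlan("LSST-REQ-001", "Verify image quality budget",
                        "PSF FWHM within allocation", ["Test", "Analysis"],
                        "System", "Systems Engineering")
event = VerificationEvent("Commissioning Run 1",
                          [VerificationActivity(plan, "Test")])
print(len(event.activities))  # 1
```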
Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael
This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
NASA Technical Reports Server (NTRS)
Lay, Richard R.; Lee, Karen A.; Holden, James R.; Oswald, John E.; Jarnot, Robert F.; Pickett, Herbert M.; Stek, Paul C.; Cofield, Richard E., III; Flower, Dennis A.; Schwartz, Michael J.;
2005-01-01
The Microwave Limb Sounder (MLS) instrument was launched aboard NASA's EOS Aura satellite in July 2004. The overall scientific objectives for MLS are to measure temperature, pressure, and several important chemical species in the upper troposphere and stratosphere relevant to ozone processes and climate change. MLS consists of a suite of radiometers designed to operate from 118 GHz to 2.5 THz, with two antennas (one for 2.5 THz, the other for the lower frequencies) that scan vertically through the atmospheric limb, and spectrometers with a spectral resolution of 6 MHz at spectral line centers. This paper describes the on-orbit commissioning of the MLS instrument, which includes activation as well as engineering functional verifications and calibrations.
NASA Technical Reports Server (NTRS)
Sharp, William E.; Knoll, Glenn
1989-01-01
A feasibility study of conducting a joint NASA/GSFC and Soviet Space Agency long-duration balloon flight over Antarctica in January 1993 is reported. The objective of the mission is the verification and calibration of gamma ray and neutron remote sensing instruments which can be used to obtain geochemical maps of the surfaces of planetary bodies. The gamma ray instruments in question are the GRAD and the Soviet Phobos prototype. The neutron detectors are supplied by Los Alamos National Laboratory and the Soviet Phobos prototype. These are to be carried aboard a gondola that provides data handling and power for flights of up to two weeks.