Hubble Space Telescope high speed photometer science verification test report
NASA Technical Reports Server (NTRS)
Richards, Evan E.
1992-01-01
The purpose of this report is to summarize the results of the HSP Science Verification (SV) tests, the status of the HSP at the end of the SV period, and the work remaining to be done. The HSP OV report (November 1991) covered all activities (OV, SV, and SAO) from launch to the completion of phase three alignment, OV 3233 performed in the 91154 SMS, on June 8, 1991. This report covers subsequent activities through May 1992.
Spitzer Space Telescope in-orbit checkout and science verification operations
NASA Technical Reports Server (NTRS)
Linick, Sue H.; Miles, John W.; Gilbert, John B.; Boyles, Carol A.
2004-01-01
Spitzer Space Telescope, the fourth and final of NASA's Great Observatories and the first mission in NASA's Origins Program, was launched 25 August 2003 into an Earth-trailing solar orbit. The observatory was designed to probe and explore the universe in the infrared. Before science data could be acquired, however, the observatory had to be initialized, characterized, calibrated, and commissioned. A two-phase operations approach was defined to complete this work; the phases were identified as In-Orbit Checkout (IOC) and Science Verification (SV). Because the observatory lifetime is cryogen-limited, these operations had to be highly efficient. The IOC/SV operations design accommodated a pre-defined distributed organizational structure and a complex, cryogenic flight system. Many checkout activities were interdependent, and therefore the operations concept and ground data system had to provide the flexibility required for a short-turnaround environment. This paper describes the adaptive operations system design and evolution, implementation, and lessons learned from the completion of IOC/SV.
NASA Astrophysics Data System (ADS)
Kärcher, Hans J.; Kunz, Nans; Temi, Pasquale; Krabbe, Alfred; Wagner, Jörg; Süß, Martin
2014-07-01
The original pointing accuracy requirement of the Stratospheric Observatory for Infrared Astronomy (SOFIA) was defined at the beginning of the program in the late 1980s as a very challenging 0.2 arcsec rms. The early science flights of the observatory started in December 2010, and the observatory has in the meantime reached nearly 0.7 arcsec rms, which is sufficient for most of the SOFIA science instruments. NASA and DLR, the owners of SOFIA, are now planning a future four-year program to bring the pointing down to the ultimate 0.2 arcsec rms. This may be the right time to recall the history of the pointing requirement and its verification, from early computer models and wind tunnel tests, through later computer-aided end-to-end simulations, up to the first commissioning flights some years ago. The paper recollects the tools used in the different project phases for the verification of the pointing performance, explains the achievements, and may give hints for the planning of the upcoming final pointing-improvement phase.
Scientific Data Purchase Project Overview Presentation
NASA Technical Reports Server (NTRS)
Holekamp, Kara; Fletcher, Rose
2001-01-01
The Scientific Data Purchase (SDP) project acquires science data from commercial sources. It is a demonstration project to test a new way of doing business, tap new sources of data, support Earth science research, and support the commercial remote sensing industry. Phase I of the project reviews simulated/prototypical data sets from 10 companies. Phase II of the project is a 3-year purchase and distribution of select data from 5 companies. The status of several SDP projects is reviewed in this viewgraph presentation, as is the SDP process of tasking, verification, validation, and data archiving. The presentation also lists SDP results for turnaround time, metrics, customers, data use, science research, applications research, and user feedback.
NASA Astrophysics Data System (ADS)
Maud, L. T.; Tilanus, R. P. J.; van Kempen, T. A.; Hogerheijde, M. R.; Schmalzl, M.; Yoon, I.; Contreras, Y.; Toribio, M. C.; Asaki, Y.; Dent, W. R. F.; Fomalont, E.; Matsushita, S.
2017-09-01
The Atacama Large millimetre/submillimetre Array (ALMA) makes use of water vapour radiometers (WVRs), which monitor the atmospheric water vapour line at 183 GHz along the line of sight above each antenna to correct for phase delays introduced by the wet component of the troposphere. The application of WVR-derived phase corrections improves the image quality and facilitates successful observations in weather conditions that were classically marginal or poor. We present work indicating that a scaling factor applied to the WVR solutions can act to further improve the phase stability and image quality of ALMA data. We find reduced phase noise statistics for 62 out of 75 datasets from the long-baseline science verification campaign after a WVR scaling factor is applied. The improvement in phase noise translates to an expected coherence improvement in 39 datasets. When imaging the bandpass source, we find that 33 of the 39 datasets show an improvement in the signal-to-noise ratio (S/N) of between a few and 30 percent. There are 23 datasets where the S/N of the science image is improved: 6 by <1%, 11 between 1 and 5%, and 6 above 5%. The higher frequencies studied (band 6 and band 7) are those most improved, specifically datasets with low precipitable water vapour (PWV), <1 mm, where the dominance of the wet component is reduced. Although these improvements are not profound, phase stability improvements via the WVR scaling factor come into play for higher-frequency (>450 GHz) and long-baseline (>5 km) observations. These inherently have poorer phase stability and are taken in low-PWV (<1 mm) conditions, for which we find the scaling to be most effective. A promising explanation for the scaling factor is the mixing of dry and wet air components, although other origins are discussed. We have produced a Python code to allow ALMA users to undertake WVR scaling tests and make improvements to their data.
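The authors' released Python tool is its own code; as a hedged illustration of what a WVR scaling test involves, the sketch below (all function and variable names are hypothetical, not the paper's API) applies a trial scaling factor to per-antenna WVR phase corrections and grid-searches the value that minimises the residual phase noise, assuming phases in radians.

```python
# Illustrative sketch, not the authors' released tool: score a trial WVR
# scaling factor by the residual phase noise it leaves behind.
import numpy as np

def residual_phase_rms(raw_phase, wvr_correction, scale):
    """RMS of the phase after applying a scaled WVR correction (radians)."""
    corrected = raw_phase - scale * wvr_correction
    # Wrap to (-pi, pi] so the RMS is not inflated by 2*pi ambiguities.
    wrapped = np.angle(np.exp(1j * corrected))
    return np.sqrt(np.mean(wrapped ** 2))

def best_scaling(raw_phase, wvr_correction, scales=np.linspace(0.8, 1.5, 71)):
    """Grid-search the scaling factor that minimises residual phase noise."""
    rms = [residual_phase_rms(raw_phase, wvr_correction, s) for s in scales]
    return scales[int(np.argmin(rms))]

# Synthetic check: the 'true' wet delay is under-corrected unless the
# WVR solution is scaled by ~1.2.
rng = np.random.default_rng(0)
wet = 0.5 * rng.standard_normal(1000)
raw = 1.2 * wet + 0.05 * rng.standard_normal(1000)
print(best_scaling(raw, wet))   # ~1.2
```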
Execution of the Spitzer In-orbit Checkout and Science Verification Plan
NASA Technical Reports Server (NTRS)
Miles, John W.; Linick, Susan H.; Long, Stacia; Gilbert, John; Garcia, Mark; Boyles, Carole; Werner, Michael; Wilson, Robert K.
2004-01-01
The Spitzer Space Telescope is an 85-cm telescope with three cryogenically cooled instruments. Following launch, the observatory was initialized and commissioned for science operations during the in-orbit checkout (IOC) and science verification (SV) phases, carried out over a total of 98.3 days. The execution of the IOC/SV mission plan progressively established Spitzer capabilities, taking into consideration thermal, cryogenic, optical, pointing, communications, and operational designs and constraints. The plan was carried out with high efficiency, making effective use of cryogen-limited flight time. One key component of the success of the plan was the pre-launch allocation of schedule reserve in the timeline of IOC/SV activities, and how it was used in flight both to cover activity redesign and growth due to continually improving spacecraft and instrument knowledge, and to recover from anomalies. This paper describes the adaptive system design and evolution, implementation, and lessons learned from IOC/SV operations. It is hoped that this information will provide guidance to future missions with similar engineering challenges.
Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation
NASA Technical Reports Server (NTRS)
Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna
2000-01-01
This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.
78 FR 18305 - Notice of Request for Extension of a Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Identity Verification (PIV) Request for Credential, the USDA Homeland Security Presidential Directive 12... consists of two phases of implementation: Personal Identity Verification phase I (PIV I) and Personal Identity Verification phase II (PIV II). The information requested must be provided by Federal employees...
Three-step method for menstrual and oral contraceptive cycle verification.
Schaumberg, Mia A; Jenkins, David G; Janse de Jonge, Xanne A K; Emmerton, Lynne M; Skinner, Tina L
2017-11-01
Fluctuating endogenous and exogenous ovarian hormones may influence exercise parameters, yet control and verification of ovarian hormone status is rarely reported and limits current exercise science and sports medicine research. The purpose of this study was to determine the effectiveness of an individualised three-step method in identifying the mid-luteal or high-hormone phase in endogenous and exogenous hormone cycles in recreationally active women, and to determine hormone and demographic characteristics associated with unsuccessful classification. Cross-sectional study design. Fifty-four recreationally active women who were either long-term oral contraceptive users (n=28) or experiencing regular natural menstrual cycles (n=26) completed step-wise menstrual mapping, urinary ovulation prediction testing, and venous blood sampling for serum/plasma hormone analysis on two days, 6-12 days after positive ovulation prediction, to verify ovarian hormone concentrations. The mid-luteal phase was successfully verified in 100% of oral contraceptive users and 70% of naturally menstruating women. Thirty percent of participants were classified as luteal phase deficient; when these were excluded, the success of the method was 89%. Lower age, lower body fat, and longer menstrual cycles were significantly associated with luteal phase deficiency. A step-wise method including menstrual cycle mapping, urinary ovulation prediction, and serum/plasma hormone measurement was effective at verifying ovarian hormone status. Additional consideration of age, body fat, and cycle length enhanced identification of luteal phase deficiency in physically active women. These findings enable the development of stricter exclusion criteria for female participants in research studies and minimise the influence of ovarian hormone variations within sports and exercise science and medicine research.
75 FR 43943 - Defense Science Board; Task Force on Nuclear Treaty Monitoring and Verification
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board; Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... held September 13-14, and 25-26, 2010. ADDRESSES: The meetings will be held at Science Applications...
SAMS-II Requirements and Operations
NASA Technical Reports Server (NTRS)
Wald, Lawrence W.
1998-01-01
The Space Acceleration Measurements System (SAMS) II is the primary instrument for the measurement, storage, and communication of the microgravity environment aboard the International Space Station (ISS). SAMS-II is being developed by the NASA Lewis Research Center Microgravity Science Division to primarily support the Office of Life and Microgravity Science and Applications (OLMSA) Microgravity Science and Applications Division (MSAD) payloads aboard the ISS. The SAMS-II is currently in the test and verification phase at NASA LeRC, prior to its first hardware delivery scheduled for July 1998. This paper will provide an overview of the SAMS-II instrument, including the system requirements and topology, physical and electrical characteristics, and the Concept of Operations for SAMS-II aboard the ISS.
NASA Technical Reports Server (NTRS)
Gracey, Renee; Bartoszyk, Andrew; Cofie, Emmanuel; Comber, Brian; Hartig, George; Howard, Joseph; Sabatke, Derek; Wenzel, Greg; Ohl, Raymond
2016-01-01
The James Webb Space Telescope includes the Integrated Science Instrument Module (ISIM) element, which contains four science instruments (SIs), including a guider. We performed extensive structural, thermal, and optical performance (STOP) modeling in support of all phases of ISIM development. In this paper, we focus on modeling and results associated with test and verification. ISIM's test program is bounded by ground environments, most notably the 1g and test chamber thermal environments. This paper describes STOP modeling used to predict ISIM system performance in 0g and at various on-orbit temperature environments. The predictions are used to project results obtained during testing to on-orbit performance.
Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H
2018-01-01
The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (V̇O2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (V̇O2) during a RI test to that obtained during a verification phase aimed to confirm attainment of V̇O2max. Sixty-one healthy males [31 older (O), 65 ± 5 yrs; 30 younger (Y), 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant-load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest V̇O2 from the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or by the intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not reveal an under-estimation of V̇O2max derived from a RI test in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal V̇O2 elicited during the RI test and the verification phase. Thus, a verification phase does not add any validation to the determination of V̇O2max. Therefore, the recommendation that a verification phase should become a gold-standard procedure, although initially appealing, is not supported by the experimental data.
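As a hedged illustration of the comparison summarised above, the sketch below computes the correlation and a Bland-Altman bias/precision (taking precision as 1.96 times the SD of the paired differences, one common convention) on synthetic data loosely matching the reported summary statistics; it is not the authors' analysis script.

```python
# Minimal sketch: paired highest VO2 from the ramp incremental (RI) test and
# the verification phase, summarised with a correlation and Bland-Altman
# bias/precision. All data below are synthetic placeholders.
import numpy as np

def bland_altman(ri, verif):
    """Return (bias, precision), precision = 1.96 * SD of the differences."""
    diff = np.asarray(verif) - np.asarray(ri)
    return diff.mean(), 1.96 * diff.std(ddof=1)

rng = np.random.default_rng(1)
ri = rng.normal(39.8, 11.5, 61)            # mL·kg^-1·min^-1
verif = ri + rng.normal(0.25, 0.8, 61)     # small offset, tight agreement
bias, precision = bland_altman(ri, verif)
r = np.corrcoef(ri, verif)[0, 1]
print(f"bias={bias:.2f}, precision=±{precision:.2f}, r={r:.2f}")
```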
The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?
Schaun, Gustavo Z
2017-12-08
Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature, and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g., intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
75 FR 34439 - Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-17
... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... Applications International Corporation, 4001 North Fairfax Drive, Suite 300, Arlington, VA. FOR FURTHER...
NASA Astrophysics Data System (ADS)
Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang
2017-01-01
A novel optical information verification and encryption method is proposed based on the interference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this newly proposed method.
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
First SN Discoveries from the Dark Energy Survey
NASA Astrophysics Data System (ADS)
Abbott, T.; Abdalla, F.; Achitouv, I.; Ahn, E.; Aldering, G.; Allam, S.; Alonso, D.; Amara, A.; Annis, J.; Antonik, M.; Aragon-Salamanca, A.; Armstrong, R.; Ashall, C.; Asorey, J.; Bacon, D.; Balbinot, E.; Banerji, M.; Barbary, K.; Barkhouse, W.; Baruah, L.; Bauer, A.; Bechtol, K.; Becker, M.; Bender, R.; Benoist, C.; Benoit-Levy, A.; Bernardi, M.; Bernstein, G.; Bernstein, J. P.; Bernstein, R.; Bertin, E.; Beynon, E.; Bhattacharya, S.; Biesiadzinski, T.; Biswas, R.; Blake, C.; Bloom, J. S.; Bocquet, S.; Brandt, C.; Bridle, S.; Brooks, D.; Brown, P. J.; Brunner, R.; Buckley-Geer, E.; Burke, D.; Burkert, A.; Busha, M.; Campa, J.; Campbell, H.; Cane, R.; Capozzi, D.; Carlstrom, J.; Carnero Rosell, A.; Carollo, M.; Carrasco-Kind, M.; Carretero, J.; Carter, M.; Casas, R.; Castander, F. J.; Chen, Y.; Chiu, I.; Chue, C.; Clampitt, J.; Clerkin, L.; Cohn, J.; Colless, M.; Copeland, E.; Covarrubias, R. A.; Crittenden, R.; Crocce, M.; Cunha, C.; da Costa, L.; d'Andrea, C.; Das, S.; Das, R.; Davis, T. M.; Deb, S.; DePoy, D.; Derylo, G.; Desai, S.; de Simoni, F.; Devlin, M.; Diehl, H. T.; Dietrich, J.; Dodelson, S.; Doel, P.; Dolag, K.; Efstathiou, G.; Eifler, T.; Erickson, B.; Eriksen, M.; Estrada, J.; Etherington, J.; Evrard, A.; Farrens, S.; Fausti Neto, A.; Fernandez, E.; Ferreira, P. C.; Finley, D.; Fischer, J. A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Furlanetto, C.; Garcia-Bellido, J.; Gaztanaga, E.; Gelman, M.; Gerdes, D.; Giannantonio, T.; Gilhool, S.; Gill, M.; Gladders, M.; Gladney, L.; Glazebrook, K.; Gray, M.; Gruen, D.; Gruendl, R.; Gupta, R.; Gutierrez, G.; Habib, S.; Hall, E.; Hansen, S.; Hao, J.; Heitmann, K.; Helsby, J.; Henderson, R.; Hennig, C.; High, W.; Hirsch, M.; Hoffmann, K.; Holhjem, K.; Honscheid, K.; Host, O.; Hoyle, B.; Hu, W.; Huff, E.; Huterer, D.; Jain, B.; James, D.; Jarvis, M.; Jarvis, M. J.; Jeltema, T.; Johnson, M.; Jouvel, S.; Kacprzak, T.; Karliner, I.; Katsaros, J.; Kent, S.; Kessler, R.; Kim, A.; Kim-Vy, T.; King, L.; Kirk, D.; Kochanek, C.; Kopp, M.; Koppenhoefer, J.; Kovacs, E.; Krause, E.; Kravtsov, A.; Kron, R.; Kuehn, K.; Kuemmel, M.; Kuhlmann, S.; Kunder, A.; Kuropatkin, N.; Kwan, J.; Lahav, O.; Leistedt, B.; Levi, M.; Lewis, P.; Liddle, A.; Lidman, C.; Lilly, S.; Lin, H.; Liu, J.; Lopez-Arenillas, C.; Lorenzon, W.; LoVerde, M.; Ma, Z.; Maartens, R.; Maccrann, N.; Macri, L.; Maia, M.; Makler, M.; Manera, M.; Maraston, C.; March, M.; Markovic, K.; Marriner, J.; Marshall, J.; Marshall, S.; Martini, P.; Marti Sanahuja, P.; Mayers, J.; McKay, T.; McMahon, R.; Melchior, P.; Merritt, K. W.; Merson, A.; Miller, C.; Miquel, R.; Mohr, J.; Moore, T.; Mortonson, M.; Mosher, J.; Mould, J.; Mukherjee, P.; Neilsen, E.; Ngeow, C.; Nichol, R.; Nidever, D.; Nord, B.; Nugent, P.; Ogando, R.; Old, L.; Olsen, J.; Ostrovski, F.; Paech, K.; Papadopoulos, A.; Papovich, C.; Patton, K.; Peacock, J.; Pellegrini, P. S. S.; Peoples, J.; Percival, W.; Perlmutter, S.; Petravick, D.; Plazas, A.; Ponce, R.; Poole, G.; Pope, A.; Refregier, A.; Reyes, R.; Ricker, P.; Roe, N.; Romer, K.; Roodman, A.; Rooney, P.; Ross, A.; Rowe, B.; Rozo, E.; Rykoff, E.; Sabiu, C.; Saglia, R.; Sako, M.; Sanchez, A.; Sanchez, C.; Sanchez, E.; Sanchez, J.; Santiago, B.; Saro, A.; Scarpine, V.; Schindler, R.; Schmidt, B. P.; Schmitt, R. L.; Schubnell, M.; Seitz, S.; Senger, R.; Sevilla, I.; Sharp, R.; Sheldon, E.; Sheth, R.; Smith, R. 
C.; Smith, M.; Snigula, J.; Soares-Santos, M.; Sobreira, F.; Song, J.; Soumagnac, M.; Spinka, H.; Stebbins, A.; Stoughton, C.; Suchyta, E.; Suhada, R.; Sullivan, M.; Sun, F.; Suntzeff, N.; Sutherland, W.; Swanson, M. E. C.; Sypniewski, A. J.; Szepietowski, R.; Talaga, R.; Tarle, G.; Tarrant, E.; Balan, S. Thaithara; Thaler, J.; Thomas, D.; Thomas, R. C.; Tucker, D.; Uddin, S. A.; Ural, S.; Vikram, V.; Voigt, L.; Walker, A. R.; Walker, T.; Wechsler, R.; Weinberg, D.; Weller, J.; Wester, W.; Wetzstein, M.; White, M.; Wilcox, H.; Wilman, D.; Yanny, B.; Young, J.; Zablocki, A.; Zenteno, A.; Zhang, Y.; Zuntz, J.
2012-12-01
The Dark Energy Survey (DES) reports the discovery of the first set of supernovae (SNe) from the project. The images were observed as part of the DES Science Verification phase using the newly installed 570-megapixel Dark Energy Camera on the CTIO Blanco 4-m telescope by observers J. Annis, E. Buckley-Geer, and H. Lin. SN observations are planned throughout the observing campaign on a regular cadence of 4-6 days in each of the ten 3-deg² fields in the DES griz filters.
NASA Astrophysics Data System (ADS)
Buttu, Marco; D'Amico, Nichi; Egron, Elise; Iacolina, Maria Noemi; Marongiu, Pasqualino; Migoni, Carlo; Pellizzoni, Alberto; Poppi, Sergio; Possenti, Andrea; Trois, Alessio; Vargiu, Gian Paolo
2013-05-01
During the Sardinia Radio Telescope (SRT) science verification phase, we observed PSR J1745-2900, first detected as an X-ray flare from Sgr A* by Swift and then identified as a 3.76-s X-ray magnetar with the NuSTAR telescope (ATels #5006, #5020, #5027, #5032, #5033, #5035), at a central frequency of 7.30 GHz. We used a cryogenically cooled receiver at the beam waveguide focus (system temperature ~25 K).
Goddard high resolution spectrograph science verification and data analysis
NASA Technical Reports Server (NTRS)
1992-01-01
The data analysis performed to support the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described; detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data onto an equally spaced image grid.
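The report's blocked iterative algorithm operates directly on the raw scans and is not reproduced here; as a hedged stand-in for the general idea of iterative image restoration, the sketch below is a minimal Richardson-Lucy deconvolution, a related but different algorithm.

```python
# Hedged illustration of iterative image restoration (Richardson-Lucy),
# not the report's blocked algorithm: iteratively refine an estimate so
# that, blurred by the PSF, it matches the observed image.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=30):
    """Restore a 2D image given its point spread function (PSF)."""
    observed = np.asarray(observed, dtype=float)
    estimate = np.full_like(observed, observed.mean())  # flat starting guess
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```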
Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies
NASA Technical Reports Server (NTRS)
Shum, C. K.
2000-01-01
This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled "Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies" for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 through June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing the accuracy of the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies, such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added data for the ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the review and improvement of the Envisat (2001 launch) GDR algorithms. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly, and bathymetry models, and to a study of Antarctic mass balance, which was published in Science in 1998.
Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research
NASA Technical Reports Server (NTRS)
Ryan, Robert E.; Terrie, Greg; Berglund, Judith
2006-01-01
This presentation introduces a draft plan for characterizing commercial data products for Earth science research. The general approach to commercial product verification and validation includes focused selection of readily available commercial remote sensing products that support Earth science research. Ongoing product verification and characterization will determine whether a product meets its specifications and will examine its fundamental properties, potential, and limitations. Validation will encourage product evaluation for specific science research and applications. Specific commercial products included in the characterization plan are high-spatial-resolution multispectral (HSMS) imagery and LIDAR data products. Future efforts in this process will include briefing NASA Headquarters and modifying plans based on feedback, increased engagement with the science community and refinement of details, coordination with commercial vendors and the Joint Agency Commercial Imagery Evaluation (JACIE) for HSMS satellite acquisitions, acquiring waveform LIDAR data, and performing verification and validation.
The Danish Environmental Technology Verification program (DANETV) Water Test Centre, operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...
EPA created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. It seeks to achieve this goal by providing high-quality, peer r...
Using Small-Step Refinement for Algorithm Verification in Computer Science Education
ERIC Educational Resources Information Center
Simic, Danijela
2015-01-01
Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…
SMAP Verification and Validation Project - Final Report
NASA Technical Reports Server (NTRS)
Murry, Michael
2012-01-01
In 2007, the National Research Council (NRC) released the decadal survey of Earth science. The survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component of systems engineering and is vital to the success of any space mission. V&V is a process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... are engineers. UL today is comprised of five businesses, Product Safety, Verification Services, Life..., Director--Global Technical Research, UL Verification Services. Subscribed and sworn to before me this 20... (431.447(c)(4)) General Personnel Overview UL is a global independent safety science company with more...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melchior, P.; Suchyta, E.; Huff, E.
2015-03-31
We measure the weak-lensing masses and galaxy distributions of four massive galaxy clusters observed during the Science Verification phase of the Dark Energy Survey. This pathfinder study is meant to 1) validate the DECam imager for the task of measuring weak-lensing shapes, and 2) utilize DECam's large field of view to map out the clusters and their environments over 90 arcmin. We conduct a series of rigorous tests on astrometry, photometry, image quality, PSF modeling, and shear measurement accuracy to single out flaws in the data and also to identify the optimal data processing steps and parameters. We find Science Verification data from DECam to be suitable for the lensing analysis described in this paper. The PSF is generally well-behaved, but the modeling is rendered difficult by a flux-dependent PSF width and ellipticity. We employ photometric redshifts to distinguish between foreground and background galaxies, and a red-sequence cluster finder to provide cluster richness estimates and cluster-galaxy distributions. By fitting NFW profiles to the clusters in this study, we determine weak-lensing masses that are in agreement with previous work. For Abell 3261, we provide the first estimates of redshift, weak-lensing mass, and richness. In addition, the cluster-galaxy distributions indicate the presence of filamentary structures attached to 1E 0657-56 and RXC J2248.7-4431, stretching out as far as 1 degree (approximately 20 Mpc), showcasing the potential of DECam and DES for detailed studies of degree-scale features on the sky.
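For reference, the NFW profile that such weak-lensing mass fits are based on has a simple closed form; the sketch below encodes the standard density and enclosed-mass expressions (parameter values are placeholders, and a full lensing fit also requires a projected shear profile not shown here).

```python
# Hedged sketch of the Navarro-Frenk-White (NFW) profile underlying
# weak-lensing mass fits of this kind; rho_s and r_s are free parameters
# of the fit, with placeholder meaning only in this sketch.
import numpy as np

def nfw_density(r, rho_s, r_s):
    """NFW density: rho_s / ((r/r_s) * (1 + r/r_s)**2)."""
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def nfw_enclosed_mass(r, rho_s, r_s):
    """Mass inside radius r: 4*pi*rho_s*r_s**3 * (ln(1+x) - x/(1+x))."""
    x = r / r_s
    return 4.0 * np.pi * rho_s * r_s ** 3 * (np.log(1.0 + x) - x / (1.0 + x))
```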
NASA Technical Reports Server (NTRS)
Pines, D.
1999-01-01
This is the Performance Verification Report, METSAT (Meteorological Satellites) Phase Locked Oscillator Assembly, P/N 1348360-1, S/N F09 and F10, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).
The Environmental Technology Verification report discusses the technology and performance of the Static Pac System, Phase II, natural gas reciprocating compressor rod packing manufactured by the C. Lee Cook Division, Dover Corporation. The Static Pac System is designed to seal th...
Ishioka, Noriaki; Suzuki, Hiromi; Asashima, Makoto; Kamisaka, Seiichiro; Mogami, Yoshihiro; Ochiai, Toshimasa; Aizawa-Yano, Sachiko; Higashibata, Akira; Ando, Noboru; Nagase, Mutsumu; Ogawa, Shigeyuki; Shimazu, Toru; Fukui, Keiji; Fujimoto, Nobuyoshi
2004-03-01
Japan Aerospace Exploration Agency (JAXA) has developed a cell biology experiment facility (CBEF) and a clean bench (CB) as common hardware in which life science experiments can be performed in the Japanese Experiment Module (JEM, known as "Kibo") of the International Space Station (ISS). The CBEF, a CO2 incubator with a turntable that provides variable gravity levels, is the basic hardware required to carry out biological experiments using microorganisms, cells, tissues, small animals, plants, etc. The CB provides a closed aseptic operation area for life science and biotechnology experiments in Kibo. A phase-contrast and fluorescence microscope is installed inside the CB. The biological experiment units (BEU) are designed to run individual experiments using the CBEF and the CB. A plant experiment unit (PEU) and two cell experiment units (CEU type 1 and type 2) for the BEU have been developed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Capozzi, D.; et al.
We present the first study of the evolution of the galaxy luminosity and stellar-mass functions (GLF and GSMF) carried out by the Dark Energy Survey (DES). We describe the COMMODORE galaxy catalogue selected from Science Verification images. This catalogue is made of ~4 × 10^6 galaxies at 0...
PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory
NASA Astrophysics Data System (ADS)
Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.
2018-02-01
PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto, a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015, and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests, and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was carried out at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.
1981-03-01
overcome the shortcomings of this system. A phase III study develops the breakup model of the Space Shuttle cluster at various times into flight. The... [table-of-contents residue removed; the report's sections cover the rocket model, combustion chamber operation, and results]
NASA Technical Reports Server (NTRS)
1991-01-01
The second phase of a task is described whose ultimate purpose is to ensure that adequate Expert System (ES) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements that will extend their applicability to NASA ESs.
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart-card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; the results show that the false rejection rate is improved.
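For context, the double random phase encoding scheme named above has a compact classical form; the sketch below is a minimal numpy rendering of textbook DRPE (the paper's smart-card storage and fingerprint core-detection preprocessing are omitted).

```python
# Minimal sketch of classical double random phase encoding (DRPE):
# random phase masks in the input plane (n1) and Fourier plane (n2)
# act as the key; the cipher is a white-noise-like complex field.
import numpy as np

rng = np.random.default_rng(42)

def drpe_encrypt(image, n1, n2):
    """Encode with random phase masks in the input and Fourier planes."""
    field = image * np.exp(2j * np.pi * n1)
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * n2))

def drpe_decrypt(cipher, n2):
    """Undo the Fourier-plane mask; the amplitude recovers the image."""
    return np.abs(np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * n2)))

image = rng.random((64, 64))                      # real, non-negative input
n1, n2 = rng.random((64, 64)), rng.random((64, 64))
cipher = drpe_encrypt(image, n1, n2)
recovered = drpe_decrypt(cipher, n2)
assert np.allclose(recovered, image, atol=1e-10)  # exact up to FFT round-off
```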
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependencies and sequences are modeled using Activity Diagrams. The methodology employed also ties into the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
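As a hedged sketch of the per-requirement verification-planning structure described above (class and field names are illustrative stand-ins, not the LSST SysML schema), the Verification Plan and its Activity/Event grouping could be modelled as plain data types:

```python
# Illustrative data model for the structure described in the paper:
# one Verification Plan per requirement, activities grouped into events.
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationActivity:
    name: str
    method: str   # e.g. "Inspection", "Analysis", "Demonstration", "Test"

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    level: str    # e.g. "System", "Subsystem"
    owner: str
    activities: List[VerificationActivity] = field(default_factory=list)

@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)

# Hypothetical example entry, for illustration only.
plan = VerificationPlan(
    requirement_id="REQ-0042",
    verification_requirement="Verify image quality budget at zenith.",
    success_criteria="PSF FWHM within budget in all bands.",
    level="System",
    owner="Systems Engineering",
    activities=[VerificationActivity("On-sky PSF measurement", "Test")],
)
```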
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has...
This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...
ERIC Educational Resources Information Center
Applied Management Sciences, Inc., Silver Spring, MD.
Presented in this report are selected findings of the Income Verification Pilot Project (IVPP), an investigation examining misreporting of applicant income and family size on applications for government-sponsored school meal benefits. As reported here, Phase II of the project provided for a comprehensive assessment of specific quality assurance…
ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
The Mars Science Laboratory Organic Check Material
NASA Astrophysics Data System (ADS)
Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.
2012-09-01
Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).
NASA Astrophysics Data System (ADS)
Lobanov, P. D.; Usov, E. V.; Butov, A. A.; Pribaturin, N. A.; Mosunova, N. A.; Strizhov, V. F.; Chukhno, V. I.; Kutlimetov, A. E.
2017-10-01
Experiments with impulse gas injection into model coolants, such as water or Rose's alloy, performed at the Novosibirsk Branch of the Nuclear Safety Institute, Russian Academy of Sciences, are described. The test facility and the experimental conditions are presented in detail. The dependence of coolant pressure on the injected gas flow and the time of injection was determined. The purpose of these experiments was to verify the physical models of thermohydraulic codes for calculation of the processes that could occur during the rupture of tubes of a steam generator with heavy liquid metal coolant or during fuel rod failure in water-cooled reactors. The experimental results were used for verification of the HYDRA-IBRAE/LM system thermohydraulic code developed at the Nuclear Safety Institute, Russian Academy of Sciences. The models of gas bubble transport in a vertical channel used in the code are described in detail. A two-phase flow pattern diagram and correlations for predicting the friction of bubbles and slugs as they float up a vertical channel, and the two-phase flow friction factor, are presented. Based on the results of simulating these experiments with the HYDRA-IBRAE/LM code, the arithmetic mean error in predicted pressures was calculated, and the predictions were analyzed considering the uncertainty in the input data, the geometry of the test facility, and the error of the empirical correlations. The analysis revealed the major factors having a considerable effect on the predictions. Recommendations are given on updating the experimental results and improving the models used in the thermohydraulic code.
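As a small illustration of the comparison metric named above, the sketch below computes an arithmetic mean error between code-predicted and measured pressures; the authors' exact normalisation may differ from this assumed relative form.

```python
# Hedged sketch of a code-to-experiment comparison metric: the arithmetic
# mean relative error of predicted vs. measured pressures, in percent.
import numpy as np

def arithmetic_mean_error(predicted, measured):
    """Mean relative deviation of predictions from measurements (percent)."""
    predicted, measured = np.asarray(predicted), np.asarray(measured)
    return 100.0 * np.mean((predicted - measured) / measured)

# Hypothetical values for illustration only.
print(arithmetic_mean_error([1.02, 0.98, 1.05], [1.00, 1.00, 1.00]))  # ~1.67
```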
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
Automatic programming for critical applications
NASA Technical Reports Server (NTRS)
Loganantharaj, Raj L.
1988-01-01
The important phases of a software life cycle include verification and maintenance. Execution performance is usually an expected requirement in a software development process. Unfortunately, the verification and the maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.
Quantum blind dual-signature scheme without arbitrator
NASA Astrophysics Data System (ADS)
Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying
2016-03-01
Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., an initial phase, a signing phase, and a verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. The scheme does not rely much on an arbitrator in the verification phase, as previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology.
Generic interpreters and microprocessor verification
NASA Technical Reports Server (NTRS)
Windley, Phillip J.
1990-01-01
The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.
NASA Technical Reports Server (NTRS)
1978-01-01
The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.
System Verification of MSL Skycrane Using an Integrated ADAMS Simulation
NASA Technical Reports Server (NTRS)
White, Christopher; Antoun, George; Brugarolas, Paul; Lih, Shyh-Shiuh; Peng, Chia-Yen; Phan, Linh; San Martin, Alejandro; Sell, Steven
2012-01-01
Mars Science Laboratory (MSL) will use the Skycrane architecture to execute final descent and landing maneuvers. The Skycrane phase uses closed-loop feedback control throughout, starting with rover separation, through mobility deploy and touchdown, and ending only when the bridles have gone completely slack. The integrated ADAMS simulation described in this paper couples complex dynamical models created by the mechanical subsystem with actual GNC flight software algorithms that have been compiled and linked into ADAMS. These integrated simulations provide the project with the best means to verify key Skycrane requirements that have a tightly coupled GNC-mechanical aspect. They also provide the best opportunity to validate the design of the algorithm that determines when to cut the bridles. The simulation results show the excellent performance of the Skycrane system.
NASA Technical Reports Server (NTRS)
Miller, Richard B.
1992-01-01
The development and operations costs of the Space IR Telescope Facility (SIRTF) are discussed with a view to minimizing total outlays and optimizing efficiency. The development phase cannot extend into the post-launch segment, which is planned to support only system verification and calibration followed by operations with a 70-percent efficiency goal. The importance of reducing the ground-support staff is demonstrated, and the value of the highly sensitive observations to the general astronomical community is described. The Failure Protection Algorithm for the SIRTF is designed around the 5-yr lifetime and the continuous venting of cryogen, and a science-driven ground/operations system is described. Attention is given to balancing cost and performance, prototyping during the development phase, incremental development, the utilization of standards, and the integration of ground system/operations with flight system integration and test.
Weak lensing magnification in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration
2018-05-01
In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated in redshift, using the Dark Energy Survey Science Verification data set. The analysis is carried out for galaxies selected only by their photometric redshifts. An extensive analysis of the systematic effects is performed using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.
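For background, such count cross-correlations rest on the standard magnification-bias relation: magnification mu rescales both source fluxes and the observed solid angle, so cumulative counts with logarithmic slope s (the slope of log10 N versus magnitude) transform as N_obs = N_0 * mu^(2.5*s - 1). A quick numeric check of this well-known relation:

```python
# Standard magnification-bias relation (not this paper's estimator code):
# the fractional change in observed counts for magnification mu and count
# slope s, from N_obs = N_0 * mu**(2.5*s - 1).
def count_modulation(mu, s):
    """Fractional change in observed counts for magnification mu, slope s."""
    return mu ** (2.5 * s - 1.0) - 1.0

# Steep counts (s > 0.4) gain sources under magnification;
# shallow counts are diluted instead.
print(count_modulation(1.02, 0.6))   # approx +1.0%
print(count_modulation(1.02, 0.2))   # approx -1.0%
```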
Cryo Testing of the James Webb Space Telescope's Integrated Science Instrument Module
NASA Technical Reports Server (NTRS)
VanCampen, Julie
2004-01-01
The Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope will be integrated and tested at the Environmental Test Facilities at Goddard Space Flight Center (GSFC). The cryogenic thermal vacuum testing of the ISIM will be the most difficult and problematic portion of the GSFC integration and test flow. The test is to validate the coupled interface of the science instruments and the ISIM structure and to sufficiently stress that interface while validating the image quality of the science instruments. The instruments and the structure are not made from the same materials and have different coefficients of thermal expansion (CTEs). Test objectives and verification rationale are currently being evaluated in Phase B of the project plan. The test program will encounter engineering challenges and limitations driven by cost and technology, many of which can be mitigated by facility upgrades, creative ground support equipment (GSE), and thorough forethought. The cryogenic testing of the ISIM will involve a number of risks, such as the implementation of unique metrology techniques and of mechanical, electrical, and optical simulators housed within the cryogenic vacuum environment. These potential risks are investigated and possible solutions are proposed.
The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...
Software for imaging phase-shift interference microscope
NASA Astrophysics Data System (ADS)
Malinovski, I.; França, R. S.; Couceiro, I. B.
2018-03-01
In recent years, an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). The instrument is, by its principle of operation, an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colours as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different methodology of phase evaluation is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
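For contrast with the instrument-specific evaluation mentioned above, the textbook four-step PSI algorithm fits in a few lines of Python: with frames captured at nominal phase shifts of 0, pi/2, pi, and 3pi/2, the wrapped phase follows from an arctangent. This is the standard method, not INMETRO's in-house procedure.

import numpy as np

def four_step_phase(i1, i2, i3, i4):
    # Wrapped phase in (-pi, pi] from four phase-shifted interferograms
    # I_k = A + B*cos(phi + k*pi/2): I4 - I2 = 2B sin(phi), I1 - I3 = 2B cos(phi).
    return np.arctan2(i4 - i2, i1 - i3)

def unwrap_rows(phase):
    # 1-D unwrapping along rows; a real instrument needs 2-D unwrapping.
    return np.unwrap(phase, axis=1)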
Expediting Combinatorial Data Set Analysis by Combining Human and Algorithmic Analysis.
Stein, Helge Sören; Jiao, Sally; Ludwig, Alfred
2017-01-09
A challenge in combinatorial materials science remains the efficient analysis of X-ray diffraction (XRD) data and its correlation to functional properties. Rapid identification of phase regions and proper assignment of corresponding crystal structures is necessary to keep pace with the improved methods for synthesizing and characterizing materials libraries. Therefore, a new modular software package called htAx (high-throughput analysis of X-ray and functional properties data) is presented that couples human intelligence tasks used for "ground-truth" phase-region identification with subsequent unbiased verification by an algorithm to efficiently analyze which phases are present in a materials library. Identified phases and phase regions may then be correlated to functional properties in an expedited manner. To demonstrate the functionality of htAx, two previously published XRD benchmark data sets of the materials systems Al-Cr-Fe-O and Ni-Ti-Cu are analyzed with htAx. The analysis of ∼1000 XRD patterns takes less than 1 day with htAx. The proposed method reliably identifies phase-region boundaries and robustly identifies multiphase structures. The method also addresses the problem of identifying regions with previously unpublished crystal structures using a special daisy ternary plot.
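The human-plus-algorithm division of labour can be caricatured as follows: people supply "ground-truth" phase-region labels, and an algorithm then independently checks that the XRD patterns within each labelled region are mutually consistent. The cosine-similarity criterion in this Python sketch is an assumption for illustration; htAx's actual verification algorithm is described in the paper.

import numpy as np

def verify_phase_regions(patterns, labels, min_similarity=0.95):
    """patterns: (n_samples, n_2theta) array of XRD intensities;
    labels: human-assigned region id per sample.
    Returns region ids whose members disagree with the region mean."""
    flagged = []
    for region in np.unique(labels):
        members = patterns[labels == region]
        mean = members.mean(axis=0)
        sims = members @ mean / (np.linalg.norm(members, axis=1)
                                 * np.linalg.norm(mean))
        if sims.min() < min_similarity:
            flagged.append(region)
    return flagged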
Verification Study of Buoyancy-Driven Turbulent Nuclear Combustion
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-01-01
Buoyancy-driven turbulent nuclear combustion determines the rate of nuclear burning during the deflagration phase (i.e., the ordinary nuclear flame phase) of Type Ia supernovae, and hence the amount of nuclear energy released during this phase. It therefore determines the amount the white dwarf star expands prior to initiation of a detonation wave, and so the amount of radioactive nickel and thus the peak luminosity of the explosion. However, this key physical process is not fully understood. To better understand this process, the Flash Center has conducted an extensive series of large-scale 3D simulations of buoyancy-driven turbulent nuclear combustion for three different physical situations. This movie shows the results for some of these simulations. Credits: Science: Ray Bair, Katherine Riley, Argonne National Laboratory; Anshu Dubey, Don Lamb, Dongwook Lee, University of Chicago; Robert Fisher, University of Massachusetts at Dartmouth; and Dean Townsley, University of Alabama. Visualization: Jonathan Gallagher, University of Chicago; Randy Hudson, John Norris and Michael E. Papka, Argonne National Laboratory/University of Chicago.
Model-Based Building Verification in Aerial Photographs.
1987-09-01
In this paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and…
Requirement Specifications for a Design and Verification Unit.
ERIC Educational Resources Information Center
Pelton, Warren G.; And Others
A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…
Design, analysis, and test verification of advanced encapsulation systems
NASA Technical Reports Server (NTRS)
Mardesich, N.; Minning, C.
1982-01-01
Design sensitivities are established for the development of photovoltaic module criteria and the definition of needed research tasks. The program consists of three phases. In Phase I, analytical models were developed to perform optical, thermal, electrical, and structural analyses on candidate encapsulation systems. From these analyses several candidate systems will be selected for qualification testing during Phase II. Additionally, during Phase II, test specimens of various types will be constructed and tested to determine the validity of the analysis methodology developed in Phase I. In Phase III, a finalized optimum design based on the knowledge gained in Phases I and II will be developed. All verification testing was completed during this period. Preliminary results and observations are discussed. Descriptions of the thermal, thermal-structural, and structural deflection test setups are included.
Mechanical verification of a schematic Byzantine clock synchronization algorithm
NASA Technical Reports Server (NTRS)
Shankar, Natarajan
1991-01-01
Schneider generalizes a number of protocols for Byzantine fault-tolerant clock synchronization and presents a uniform proof of their correctness. The authors present a machine-checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.
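The egocentric mean at the heart of the Interactive Convergence Algorithm is simple to state: a processor averages the clock differences it observes, replacing any reading whose magnitude exceeds a threshold delta (and which may therefore come from a faulty clock) with zero. A minimal Python sketch, ignoring the read-error and round bookkeeping of the full protocol:

def egocentric_mean(clock_differences, delta):
    """clock_differences[q] = observed skew of clock q relative to this
    processor (zero for itself). Returns this processor's correction."""
    adjusted = [d if abs(d) <= delta else 0.0 for d in clock_differences]
    return sum(adjusted) / len(adjusted)

The mechanical verification referred to above establishes that this function satisfies the abstract conditions (for example, translation invariance) that Schneider's schematic protocol imposes on a convergence function.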
ESTEST: An Open Science Platform for Electronic Structure Research
ERIC Educational Resources Information Center
Yuan, Gary
2012-01-01
Open science platforms in support of data generation, analysis, and dissemination are becoming indispensible tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.
This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?
SPHERES: From Ground Development to Operations on ISS
NASA Technical Reports Server (NTRS)
Katterhagen, A.
2015-01-01
SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) is an internal International Space Station (ISS) Facility that supports multiple investigations for the development of multi-spacecraft and robotic control algorithms. The SPHERES Facility on ISS is managed and operated by the SPHERES National Lab Facility at NASA Ames Research Center (ARC) at Moffett Field, California. The SPHERES Facility on ISS consists of three self-contained eight-inch diameter free-floating satellites, which perform the various flight algorithms and serve as a platform to support the integration of experimental hardware. To help make science a reality on the ISS, the SPHERES ARC team supports a Guest Scientist Program (GSP). This program allows anyone with new science to interface with the SPHERES team and hardware. In addition to highlighting the available SPHERES hardware on ISS and on the ground, this presentation will also highlight ground support, facilities, and resources available to guest researchers. Investigations on the ISS evolve through four main phases: Strategic, Tactical, Operations, and Post Operations. The Strategic Phase encompasses early planning, beginning with initial contact between the Principal Investigator (PI) and the SPHERES program, which may work with the PI to assess what assistance the PI may need. Once the basic parameters are understood, the investigation moves to the Tactical Phase, which involves more detailed planning, development, and testing. Depending on the nature of the investigation, the Tactical Phase may be split into the Lab Tactical Phase or the ISS Tactical Phase due to the difference in requirements for the two destinations. The Operations Phase is when the actual science is performed; this can be either in the lab or on the ISS. The Post Operations Phase encompasses data analysis and distribution, and generation of summary status and reports. The SPHERES Operations and Engineering teams at ARC are composed of experts who can guide the Payload Developer (PD) and Principal Investigator (PI) in reaching critical milestones to make their science a reality using the SPHERES platform. From performing integrated safety and verification assessments, to assisting in developing crew procedures and operations products, to organizing, planning, and executing all test sessions, to helping manage data products, the SPHERES team at ARC is available to support microgravity research with the SPHERES Guest Scientist Program.
DOT National Transportation Integrated Search
2014-04-01
The objective of this project was to quantify the effectiveness of the rail inspection ground verification process. More specifically, the project focused on comparing the effectiveness of conventional versus phased array probes to manually detect ...
Fluor Daniel Hanford Inc. integrated safety management system phase 1 verification final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
PARSONS, J.E.
1999-10-28
The purpose of this review is to verify the adequacy of documentation as submitted to the Approval Authority by Fluor Daniel Hanford, Inc. (FDH). This review is not only a review of the Integrated Safety Management System (ISMS) System Description documentation, but is also a review of the procedures, policies, and manuals of practice used to implement safety management in an environment of organizational restructuring. The FDH ISMS should support the Hanford Strategic Plan (DOE-RL 1996) to safely clean up and manage the site's legacy waste; deploy science and technology while incorporating the ISMS theme to ''Do work safely''; and protect human health and the environment.
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine on which it is to be used. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any difference between any of the files will cause a verification error. Because of the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
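Scripted, the described check is just a diff of each new output file against its stored reference. A minimal Python sketch (the file-naming convention here is hypothetical):

import difflib
import pathlib

def compare_outputs(new_dir, ref_dir, problems=range(1, 26)):
    # Diff each of the 25 sample-problem outputs against the reference copy.
    failures = []
    for n in problems:
        new = pathlib.Path(new_dir, f"sample{n:02d}.out").read_text().splitlines()
        ref = pathlib.Path(ref_dir, f"sample{n:02d}.out").read_text().splitlines()
        diff = list(difflib.unified_diff(ref, new, lineterm=""))
        if diff:
            failures.append((n, diff))
    return failures

Consistent with the caveat above, a nonempty diff is only a flag for closer inspection (run dates and timing lines can differ harmlessly), not proof of a defect.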
DOT National Transportation Integrated Search
2005-09-01
This document describes a procedure for verifying a dynamic testing system (closed-loop servohydraulic). The procedure is divided into three general phases: (1) electronic system performance verification, (2) calibration check and overall system perf...
NASA Astrophysics Data System (ADS)
Yankovskii, A. P.
2015-05-01
An indirect verification of a structural model describing the creep of a composite medium reinforced by honeycombs and made of nonlinear hereditary phase materials obeying the Rabotnov theory of creep is presented. It is shown that the structural model proposed is trustworthy and can be used in practical calculations. For different kinds of loading, creep curves for a honeycomb core made of a D16T aluminum alloy are calculated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-01-12
This report describes the work done under Phase II, the verification testing of the Kinetic Extruder. The main objective of the test program was to determine failure modes and wear rates. Only minor auxiliary equipment malfunctions were encountered. Wear rates indicate useful life expectancy of from 1 to 5 years for wear-exposed components. Recommendations are made for adapting the equipment for pilot plant and commercial applications. 3 references, 20 figures, 12 tables.
The Screaming Boredom of Learning Science
ERIC Educational Resources Information Center
Krips, H.
1977-01-01
Advocates changing the role of secondary school science from one of theory verification and problem solving to the formulation and acceptance of hypotheses for observed phenomena. Provides an example of the procedure using Hooke's Law. (CP)
Developing a Test for Assessing Elementary Students' Comprehension of Science Texts
ERIC Educational Resources Information Center
Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien
2012-01-01
This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…
Code of Federal Regulations, 2010 CFR
2010-10-01
... the site-specific application programs, run timers, read inputs, drive outputs, perform self... validation process is to determine “whether the correct product was built.” Verification means the process of... established at the start of that phase. The goal of the verification process is to determine “whether the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.
We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data spans approximately 250 sq. deg. with median i
Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; ...
2017-09-01
We report the results of our searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data spans approximately 250 sq. deg. with median i
Earth Science Activities: A Guide to Effective Elementary School Science Teaching.
ERIC Educational Resources Information Center
Kanis, Ira B.; Yasso, Warren E.
The primary emphasis of this book is on new or revised earth science activities that promote concept development rather than mere verification of concepts learned by passive means. Chapter 2 describes philosophies, strategies, methods, and techniques to guide preservice and inservice teachers, school building administrators, and curriculum…
Large liquid rocket engine transient performance simulation system
NASA Technical Reports Server (NTRS)
Mason, J. R.; Southwick, R. D.
1989-01-01
Phase 1 of the Rocket Engine Transient Simulation (ROCETS) program consists of seven technical tasks: architecture; system requirements; component and submodel requirements; submodel implementation; component implementation; submodel testing and verification; and subsystem testing and verification. These tasks were completed. Phase 2 of ROCETS consists of two technical tasks: Technology Test Bed Engine (TTBE) model data generation; and system testing verification. During this period specific coding of the system processors was begun and the engineering representations of Phase 1 were expanded to produce a simple model of the TTBE. As the code was completed, some minor modifications to the system architecture centering on the global variable common, GLOBVAR, were necessary to increase processor efficiency. The engineering modules completed during Phase 2 are: INJTOO - main injector; MCHBOO - main chamber; NOZLOO - nozzle thrust calculations; PBRNOO - preburner; PIPE02 - compressible flow without inertia; PUMPOO - polytropic pump; ROTROO - rotor torque balance/speed derivative; and TURBOO - turbine. Detailed documentation of these modules is in the Appendix. In addition to the engineering modules, several submodules were also completed. These submodules include combustion properties, component performance characteristics (maps), and specific utilities. Specific coding was begun on the system configuration processor. All functions necessary for multiple-module operation were completed, but the SOLVER implementation is still under development. This system, the Verification Checkout Facility (VCF), allows interactive comparison of module results to stored data as well as providing an intermediate checkout of the processor code. After validation using the VCF, the engineering modules and submodules were used to build a simple TTBE model.
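To illustrate the granularity of such engineering modules, a pump routine in the spirit of PUMPOO might look like the Python sketch below; the interface, property values, and formulas are assumptions for illustration, not the actual ROCETS code.

def pump_module(p_in, t_in, head_m, efficiency,
                rho=1141.0, cp=1700.0, g=9.80665):
    """p_in [Pa], t_in [K]; rho and cp default to rough liquid-oxygen values.
    Returns (discharge pressure [Pa], discharge temperature [K])."""
    dp = rho * g * head_m  # pressure rise from pump head (incompressible)
    dt = dp * (1.0 - efficiency) / (rho * cp * efficiency)  # loss heating
    return p_in + dp, t_in + dt

In a framework like ROCETS, such a module would read its head and efficiency from component performance maps and expose its outputs to the system solver rather than being called standalone.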
The Environmental Technology Verification report discusses the technology and performance of Seal Assist System (SAS) for natural gas reciprocating compressor rod packing manufactured by A&A Environmental Seals, Inc. The SAS uses a secondary containment gland to collect natural g...
ESA's X-ray space observatory XMM takes first pictures
NASA Astrophysics Data System (ADS)
2000-02-01
Under the aegis of Prof. Roger Bonnet, ESA Director of Science, the mission's Principal Investigators will be presenting these spectacular first images at a press conference to be held on 9 February at the ESA Vilspa facility at Villafranca/Madrid in Spain, where the XMM Science Operations Centre is located. The event will also be the occasion for several major announcements concerning the XMM mission. In particular, Professor Bonnet will launch the third XMM competition, "Stargazing", previously announced in September 1999. This will address European youngsters, 16 to 18 years old, who will be offered the unique opportunity of winning observing time on the X-ray telescope. Commissioning phase starts: After a successful launch from Kourou on Ariane 504 on 10 December 1999, XMM was brought to its final operational orbit in the following week. The telescope doors on the X-ray Mirror Modules and on the Optical Monitor telescope were opened on 17/18 December. The Radiation Monitor was activated on 19 December and the spacecraft was put into a quiet mode over the Christmas and New Year period. The mission's scientific data is being received, processed and dispatched to astronomers by the XMM Science Operations Centre in Villafranca. Operations with the spacecraft restarted there on 4 January when, as part of the commissioning phase, all the science payloads were switched on one after the other for initial verifications. By the week of 17 January functional tests had begun on the Optical Monitor, the EPIC pn, the two EPIC MOS and the two RGS instruments. The internal doors of the EPIC cameras were opened whilst keeping the camera filter wheels closed. Astounding first images: After a series of engineering exposures, all three EPIC cameras were used in turn, between 19 and 24 January, to take several views of two different extragalactic regions of the Universe. These views, featuring a variety of extended and X-ray point sources, were chosen to demonstrate the full functioning of the observatory. The Optical Monitor also simultaneously viewed the same regions. One RGS spectrometer obtained its first spectra on 25 January; the other will be commissioned at the start of February. This initial series of short and long duration exposures has delighted the Project management team and the scientists even more. First analyses confirm that the spacecraft is extremely stable, the XMM telescopes are focusing perfectly, and the EPIC cameras, Optical Monitor and RGS spectrometers are working exactly as expected. The Science Operations Centre infrastructure, processing and archiving the science data telemetry from the spacecraft, is also performing well. Initial inspection of the first commissioning images immediately showed some unique X-ray views of several celestial objects, to be presented on 9 February. The occasion will give Principal Investigators and Project management the opportunity to comment on the pictures and the excellent start of the XMM mission. The Calibration and Performance Verification phase for XMM's science instruments is to begin on 3 March, with routine science operations starting in June. The press is invited to attend the press conference that will be held at the Villafranca/Madrid-Vilspa facility (ESA's Satellite Tracking Station), Apartado 50727, E-28080 Madrid, Spain. The press event will be broadcast to the other ESA establishments: ESA Headquarters, Paris; ESA/ESTEC (Space Expo), Noordwijk, the Netherlands; ESA/ESOC, Darmstadt, Germany; and ESA/ESRIN, Frascati, Italy.
Media representatives wishing to attend the event are kindly requested to fill out the attached reply form and fax it back to the establishment of their choice.
Cognitive neuroscience in forensic science: understanding and utilizing the human element
Dror, Itiel E.
2015-01-01
The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. PMID:26101281
Cognitive neuroscience in forensic science: understanding and utilizing the human element.
Dror, Itiel E
2015-08-05
The human element plays a critical role in forensic science. It is not limited only to issues relating to forensic decision-making, such as bias, but also relates to most aspects of forensic work (some of which even take place before a crime is ever committed or long after the verification of the forensic conclusion). In this paper, I explicate many aspects of forensic work that involve the human element and therefore show the relevance (and potential contribution) of cognitive neuroscience to forensic science. The 10 aspects covered in this paper are proactive forensic science, selection during recruitment, training, crime scene investigation, forensic decision-making, verification and conflict resolution, reporting, the role of the forensic examiner, presentation in court and judicial decisions. As the forensic community is taking on the challenges introduced by the realization that the human element is critical for forensic work, new opportunities emerge that allow for considerable improvement and enhancement of the forensic science endeavour. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.
Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke
2018-05-21
Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad-extent, fine-grained approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science, and the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high-quality science can be produced with massive, largely one-off, participation if data collection is simple and quality control includes algorithm voting, statistical pruning and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
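The "algorithm voting" mentioned for one-off mass participation reduces, in its simplest form, to accepting a crowd classification only when enough independent volunteers agree. The thresholds in this Python sketch are illustrative, not values from any particular project.

from collections import Counter

def consensus(labels, min_votes=5, min_agreement=0.8):
    """labels: classifications of one object by different volunteers.
    Returns the consensus label, or None if support is insufficient."""
    if len(labels) < min_votes:
        return None
    label, count = Counter(labels).most_common(1)[0]
    return label if count / len(labels) >= min_agreement else None

Objects that fail to reach consensus can then be routed to experts, which is one place the "complexity with care" and "simplification at scale" strategies meet in practice.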
Software development for airborne radar
NASA Astrophysics Data System (ADS)
Sundstrom, Ingvar G.
Some aspects of the development of software for a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases (system design, functional design, detailed design, function verification, and system verification) are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages and of a digital signal generator for performance verification is emphasized. Some future trends are discussed.
ERIC Educational Resources Information Center
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien
2016-01-01
The purpose of this study was to develop a computer-based measure of elementary students' science talk and to report students' benchmarks. The development procedure had three steps: defining the framework of the test, collecting and identifying key reference sets of science talk, and developing and verifying the science talk instrument. The…
Hubble Space Telescope high speed photometer orbital verification
NASA Technical Reports Server (NTRS)
Richards, Evan E.
1991-01-01
The purpose of this report is to provide a summary of the results of the HSP (High Speed Photometer) Orbital Verification (OV) tests and to report conclusions and lessons learned from the initial operations of the HSP. The HSP OV plan covered the activities through fine (phase 3) alignment. This report covers all activities (OV, SV, and SAO) from launch to the completion of phase 3 alignment. Those activities in this period that are not OV tests are described to the extent that they relate to OV activities.
Aqueous cleaning and verification processes for precision cleaning of small parts
NASA Technical Reports Server (NTRS)
Allen, Gale J.; Fishell, Kenneth A.
1995-01-01
The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.
2008-02-28
An ER-2 high-altitude Earth science aircraft banks away during a flight over the southern Sierra Nevada. NASA’s Armstrong Flight Research Center operates two of the Lockheed-built aircraft on a wide variety of environmental science, atmospheric sampling, and satellite data verification missions.
Leak Mitigation in Mechanically Pumped Fluid Loops for Long Duration Space Missions
NASA Technical Reports Server (NTRS)
Miller, Jennifer R.; Birur, Gajanana; Bame, David; Mastropietro, A. J.; Bhandari, Pradeep; Lee, Darlene; Karlmann, Paul; Liu, Yuanming
2013-01-01
Mechanically pumped fluid loops (MPFLs) are increasingly considered for spacecraft thermal control. A concern for long-duration space missions is fluid leakage, which can lead to performance degradation or potential loop failure. An understanding of leak rates through analysis, as well as destructive and non-destructive testing, provides a verifiable means to quantify them. The system can then be appropriately designed to maintain safe operating pressures and temperatures throughout the mission. Two MPFLs on the Mars Science Laboratory spacecraft, launched November 26, 2011, maintain the temperature of sensitive electronics and science instruments within a -40 deg C to 50 deg C range during launch, cruise, and Mars surface operations. With over 100 meters of complex tubing, fittings, joints, flex lines, and pumps, the system must maintain a minimum pressure through all phases of the mission to provide appropriate performance. This paper describes the process of design, qualification, test, verification, and validation of the components and assemblies employed to minimize risks associated with excessive fluid leaks from pumped fluid loop systems.
Critical Surface Cleaning and Verification Alternatives
NASA Technical Reports Server (NTRS)
Melton, Donald M.; McCool, A. (Technical Monitor)
2000-01-01
As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification testing on flight hardware.
The DES Science Verification Weak Lensing Shear Catalogs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarvis, M.
We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.
The DES Science Verification Weak Lensing Shear Catalogs
Jarvis, M.
2016-05-01
We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.
Adaptive optics self-calibration using differential OTF (dOTF)
NASA Astrophysics Data System (ADS)
Rodack, Alexander T.; Knight, Justin M.; Codona, Johanan L.; Miller, Kelsey L.; Guyon, Olivier
2015-09-01
We demonstrate self-calibration of an adaptive optical system using differential OTF [Codona, J. L., Opt. Eng. 52(9), 097105 (2013), doi:10.1117/1.OE.52.9.097105]. We use a deformable mirror (DM) along with science camera focal plane images to implement a closed-loop servo that both flattens the DM and corrects for non-common-path aberrations within the telescope. The pupil field modification required for dOTF measurement is introduced by displacing actuators near the edge of the illuminated pupil. Simulations were used to develop methods to retrieve the phase from the complex amplitude dOTF measurements for both segmented and continuous sheet MEMS DMs, and tests were performed using a Boston Micromachines continuous sheet DM for verification. We compute the actuator correction updates directly from the phase of the dOTF measurements, reading out displacements and/or slopes at segment and actuator positions. Through simulation, we also explore the effectiveness of these techniques for a variety of photon counts collected in each dOTF exposure pair.
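The dOTF measurement itself is straightforward to express: the OTF is the Fourier transform of the PSF, and the dOTF is the difference between the OTFs of a baseline image and an image taken with an edge actuator displaced. A minimal Python sketch (centering conventions assumed):

import numpy as np

def dotf(psf_baseline, psf_poked):
    # Complex-valued dOTF from two focal-plane images of equal exposure.
    otf0 = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf_baseline)))
    otf1 = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf_poked)))
    return otf1 - otf0

# Up to the pupil-overlap region, a conjugate twin, and a sign convention,
# np.angle(dotf(...)) gives the pupil phase used for the actuator updates.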
Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for fluid-particle flows
NASA Astrophysics Data System (ADS)
Kong, Bo; Patel, Ravi G.; Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney O.
2017-11-01
In this work, we study the performance of three simulation techniques for fluid-particle flows: (1) a volume-filtered Euler-Lagrange approach (EL), (2) a quadrature-based moment method using the anisotropic Gaussian closure (AG), and (3) a traditional two-fluid model (TFM). By simulating two problems, particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT), the convergence of the methods under grid refinement is found to depend on the simulation method and the specific problem, with CIT simulations facing fewer difficulties than HIT. Although EL converges under refinement for both HIT and CIT, its statistical results exhibit dependence on the techniques used to extract statistics for the particle phase. For HIT, converging both EE methods (TFM and AG) poses challenges, while for CIT, AG and EL produce similar results. Overall, all three methods face challenges when trying to extract converged, parameter-independent statistics due to the presence of shocks in the particle phase. National Science Foundation and National Energy Technology Laboratory.
Manned Mars mission accommodation: Sprint mission
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Kaszubowski, Martin J.; Ayers, J. Kirk; Llewellyn, Charles P.; Weidman, Deene J.; Meredith, Barry D.
1988-01-01
The results of a study conducted at the NASA-LaRC to assess the impacts on the Phase 2 Space Station of accommodating a manned mission to Mars are documented. In addition, several candidate transportation node configurations are presented to accommodate the assembly and verification of the Mars mission vehicles. This study includes an identification of a life science research program that would need to be completed, on-orbit, prior to mission departure and an assessment of the necessary orbital technology development and demonstration program needed to accomplish the mission. Also included is an analysis of the configuration mass properties and a preliminary analysis of the Space Station control system sizing that would be required to control the station. Results of the study indicate the Phase 2 Space Station can support a manned mission to Mars with the addition of a supporting infrastructure that includes a propellant depot, assembly hangar, and a heavy-lift launch vehicle to support the large launch requirements.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.
1992-01-01
The design and formal verification of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, is presented. The RCP uses N-Multiply Redundant (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).
RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.
The National Human Exposure Assessment Sur...
The report presents results of tests determining the efficacy of A&A Environmental Seals, Inc's Seal Assist System (SAS) in preventing natural gas compressor station's compressor rod packing leaks from escaping into the atmosphere. The SAS consists of an Emission Containment Glan...
Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia
1996-01-01
The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Steve
2011-01-01
The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.
The concept verification testing of materials science payloads
NASA Technical Reports Server (NTRS)
Griner, C. S.; Johnston, M. H.; Whitaker, A.
1976-01-01
The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other agency experiment and support systems concepts that may be used in the Shuttle. A dedicated materials science payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.
Melchior, P.; Gruen, D.; McClintock, T.; ...
2017-05-16
Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melchior, P.; Gruen, D.; McClintock, T.
Here, we use weak-lensing shear measurements to determine the mean mass of optically selected galaxy clusters in Dark Energy Survey Science Verification data. In a blinded analysis, we split the sample of more than 8000 redMaPPer clusters into 15 subsets, spanning ranges in the richness parameter 5 ≤ λ ≤ 180 and redshift 0.2 ≤ z ≤ 0.8, and fit the averaged mass density contrast profiles with a model that accounts for seven distinct sources of systematic uncertainty: shear measurement and photometric redshift errors; cluster-member contamination; miscentring; deviations from the NFW halo profile; halo triaxiality and line-of-sight projections.
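For reference, the NFW halo profile named among the model ingredients has the standard form rho(r) = rho_s / [(r/r_s)(1 + r/r_s)^2]. A minimal Python sketch of the profile and its enclosed mass (textbook expressions, not the paper's full seven-systematic model):

import numpy as np

def nfw_density(r, rho_s, r_s):
    # 3-D NFW density; rho_s and r_s are the characteristic density and scale radius.
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def nfw_enclosed_mass(r, rho_s, r_s):
    # Mass inside radius r: 4*pi*rho_s*r_s^3 * [ln(1+x) - x/(1+x)].
    x = r / r_s
    return 4.0 * np.pi * rho_s * r_s ** 3 * (np.log(1.0 + x) - x / (1.0 + x))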
ERIC Educational Resources Information Center
Scholes, Robert J.; And Others
The effects of sentence imitation and picture verification on the recall of subsequent digits were studied. Stimuli consisted of 20 sentences, each sentence followed by a string of five digit names, and five structural types of sentences were presented. Subjects were instructed to listen to the sentence and digit string and then either immediately…
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.
The role of science in treaty verification.
Gavron, Avigdor
2005-01-01
Technologically advanced nations are currently applying more science to treaty verification than ever before. Satellites gather a multitude of information relating to proliferation concerns using thermal imaging analysis, nuclear radiation measurements, and optical and radio frequency signals detection. Ground stations gather complementary signals such as seismic events and radioactive emissions. Export controls in many countries attempt to intercept materials and technical means that could be used for nuclear proliferation. Nevertheless, we have witnessed a plethora of nuclear proliferation episodes that went undetected (or were belatedly detected) by these technologies: the Indian nuclear tests in 1998, the Libyan nuclear buildup, the Iranian enrichment program, and the North Korean nuclear weapons program are some prime examples. In this talk, we will discuss some of the technologies used for proliferation detection. In particular, we will note some of the issues relating to nuclear materials control agreements that epitomize political difficulties as they impact the implementation of science and technology.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems
1994-07-29
The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.
2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter
2007-08-23
Topics include: systems trade-off analysis and optimization; verification and validation; on-board diagnostics and self-healing; security and anti-tampering; rapid ... verification; safety and reliability analysis of flight- and mission-critical systems; model-based monitoring and ... self-healing; autonomic computing; network intrusion detection and prevention; and anti-tampering and trust.
Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...
The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...
The report presents results of a Phase I test of emissions packing technology offered by France Compressor Products which is designed to reduce methane leaks from compressor rod packing when a compressor is in a standby and pressurized state. This Phase I test was executed betwee...
Commercial Art. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Brown, Ted; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of commercial art, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train commercial artists. Section 1 contains general information: purpose of Phase I; description…
Electric power system test and verification program
NASA Technical Reports Server (NTRS)
Rylicki, Daniel S.; Robinson, Frank, Jr.
1994-01-01
Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banerji, M.; Jouvel, S.; Lin, H.
2014-11-25
We present the combination of optical data from the Science Verification phase of the Dark Energy Survey (DES) with near-infrared (NIR) data from the European Southern Observatory VISTA Hemisphere Survey (VHS). The deep optical detections from DES are used to extract fluxes and associated errors from the shallower VHS data. Joint seven-band (grizYJK) photometric catalogues are produced in a single 3 sq-deg dedicated camera field centred at 02h26m-04d36m where the availability of ancillary multiwavelength photometry and spectroscopy allows us to test the data quality. Dual photometry increases the number of DES galaxies with measured VHS fluxes by a factor of ~4.5 relative to a simple catalogue level matching and results in a ~1.5 mag increase in the 80 per cent completeness limit of the NIR data. Almost 70 per cent of DES sources have useful NIR flux measurements in this initial catalogue. Photometric redshifts are estimated for a subset of galaxies with spectroscopic redshifts and initial results, although currently limited by small number statistics, indicate that the VHS data can help reduce the photometric redshift scatter at both z < 0.5 and z > 1. We present example DES+VHS colour selection criteria for high-redshift luminous red galaxies (LRGs) at z ~ 0.7 as well as luminous quasars. Using spectroscopic observations in this field we show that the additional VHS fluxes enable a cleaner selection of both populations with <10 per cent contamination from galactic stars in the case of spectroscopically confirmed quasars and <0.5 per cent contamination from galactic stars in the case of spectroscopically confirmed LRGs. The combined DES+VHS data set, which will eventually cover almost 5000 sq-deg, will therefore enable a range of new science and be ideally suited for target selection for future wide-field spectroscopic surveys.
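The dual-photometry idea, measuring NIR fluxes at positions fixed by the deeper optical detections, can be caricatured with a crude forced circular aperture; production pipelines use tools such as SExtractor in dual-image mode, with matched apertures and local-sky estimation, so the Python below is purely illustrative.

import numpy as np

def forced_aperture_flux(image, x0, y0, radius):
    # Sum pixel values within radius pixels of (x0, y0); no sky subtraction.
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return float(image[mask].sum())

def dual_photometry(nir_image, optical_positions, radius=5.0):
    # Measure at every optical detection, even below the NIR detection limit.
    return [forced_aperture_flux(nir_image, x, y, radius)
            for x, y in optical_positions]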
A Comparison of the Effects of Two Instructional Sequences Involving Science Laboratory Activities.
ERIC Educational Resources Information Center
Ivins, Jerry Edward
This study attempted to determine if students learn science concepts better when laboratories are used to verify concepts already introduced through lectures and textbooks (verification laboratories) or whether achievement and retention are improved when laboratories are used to introduce new concepts (directed discovery learning laboratories). The…
Shuttle-tethered satellite system definition study extension
NASA Technical Reports Server (NTRS)
1980-01-01
A system requirements definition and configuration study (Phase B) of the Tethered Satellite System (TSS) was conducted during the period 14 November 1977 to 27 February 1979. Subsequently a study extension was conducted during the period 13 June 1979 to 30 June 1980, for the purpose of refining the requirements identified during the main phase of the study, and studying in some detail the implications of accommodating various types of scientific experiments on the initial verification flight mission. An executive overview is given of the Tethered Satellite System definition developed during the study. The results of specific study tasks undertaken in the extension phase of the study are reported. Feasibility of the Tethered Satellite System has been established with reasonable confidence and the groundwork laid for proceeding with hardware design for the verification mission.
Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs
Bass, Ellen J.
2011-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two-phase process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
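As a toy illustration of the model-checking style of verification the abstract describes, the following Python sketch performs explicit-state reachability over an invented device model and tests a safety predicate at every reachable state; the states, transitions, and property are hypothetical and far simpler than the paper's pump models.

    from collections import deque

    # Toy device model (all invented; the paper's PCA pump models are far
    # richer): states and a transition relation.
    transitions = {
        "idle":        ["programming"],
        "programming": ["idle", "confirm"],
        "confirm":     ["programming", "infusing"],
        "infusing":    ["idle", "alarm"],
        "alarm":       ["idle"],
        # "bolus_unconfirmed" would be a violating state, if reachable.
    }

    def violates(state):
        # Safety property: no unconfirmed drug-delivery state is reachable.
        return state == "bolus_unconfirmed"

    def check(initial="idle"):
        """Explicit-state reachability: visit every state once (BFS) and
        test the safety predicate, as a model checker would."""
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            if violates(s):
                return f"violation: {s} is reachable"
            for t in transitions.get(s, []):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return f"property holds over {len(seen)} reachable states"

    print(check())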
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, B.; Stern, W.; Colley, J.
International Atomic Energy Agency (IAEA) safeguards involves verification activities at a wide range of facilities in a variety of operational phases (e.g., under construction, start-up, operating, shutdown, closed-down, and decommissioned). Safeguards optimization for each different facility type and operational phase is essential for the effectiveness of safeguards implementation. The IAEA's current guidance regarding safeguards for the different facility types in the various lifecycle phases is provided in its Design Information Examination (DIE) and Verification (DIV) procedure. Greater efficiency in safeguarding facilities that are shut down or closed down, including those being decommissioned, could allow the IAEA to use a greater portion of its effort to conduct other verification activities. Consequently, the National Nuclear Security Administration's Office of International Nuclear Safeguards sponsored this study to evaluate whether there is an opportunity to optimize safeguards approaches for facilities that are shut down or closed down. The purpose of this paper is to examine existing safeguards approaches for shutdown and closed-down facilities, including facilities being decommissioned, and to seek to identify whether they may be optimized.
Concept Verification Test - Evaluation of Spacelab/Payload operation concepts
NASA Technical Reports Server (NTRS)
Mcbrayer, R. O.; Watters, H. H.
1977-01-01
The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed, two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.
Verification testing of the ADI International Inc. Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8, 2003 through May 28,...
Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program
NASA Technical Reports Server (NTRS)
Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby
2017-01-01
Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way by the developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented, including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the management of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers, drawing on our unique perspectives across multiple organizations of a large NASA program.
Engineering within the assembly, verification, and integration (AIV) process in ALMA
NASA Astrophysics Data System (ADS)
Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene
2010-07-01
The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000m in the Chilean Atacama desert. As part of the ALMA construction phase the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation, and verifies that functional and performance requirements are met. This paper describes those aspects related to the AIV Engineering team, its role within the 4-station AIV process, the different phases the group underwent, lessons learned, and potential areas for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment, and on the first ALMA system installations. With the first antennas arriving on site the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed, which focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise, and transition towards a more routine production process. Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered directly in this paper. It is believed that both continuous improvement and the clear definition of the AIV 4-station model were key factors in bringing the antennas into a state well enough characterized to smoothly start commissioning activities.
Avionics Technology Contract Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Squires, Shiela S.
This document reports on Phase I of a project that examined the occupation of avionics technician, established appropriate committees, and conducted task verification. Results of this phase provide the basic information required to develop the program standards and to guide and set up the committee structure to guide the project. Section 1…
A Teamwork-Oriented Air Traffic Control Simulator
2006-06-01
…the software development methodology of this work, this chapter is viewed as the acquisition phase of this model. The end of the… maintenance phase… because the different controllers working in these phases usually… traditional operation such as scaling the airport and personalizing the working environment. 4. Pilot Specification…
Sheet Metal Contract. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Kirkpatrick, Thomas; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of sheet metal, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train sheet metal workers. Section 1 contains general information: purpose of Phase I; description…
Restricted access processor - An application of computer security technology
NASA Technical Reports Server (NTRS)
Mcmahon, E. M.
1985-01-01
This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.
Gyrokinetic Particle Simulation of Turbulent Transport in Burning Plasmas (GPS - TTBP) Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chame, Jacqueline
2011-05-27
The goal of this project is the development of the Gyrokinetic Toroidal Code (GTC) Framework and its applications to problems related to the physics of turbulence and turbulent transport in tokamaks. The project involves physics studies, code development, noise effect mitigation, supporting computer science efforts, diagnostics and advanced visualizations, and verification and validation. Its main scientific themes are mesoscale dynamics and non-locality effects on transport, the physics of secondary structures such as zonal flows, and strongly coherent wave-particle interaction phenomena at magnetic precession resonances. Special emphasis is placed on the implications of these themes for rho-star and current scalings and for the turbulent transport of momentum. GTC-TTBP also explores applications to electron thermal transport, particle transport, ITB formation, and cross-cuts such as edge-core coupling, interaction of energetic particles with turbulence, and neoclassical tearing mode trigger dynamics. Code development focuses on major initiatives in the development of full-f formulations and the capacity to simulate flux-driven transport. In addition to the full-f formulation, the project includes the development of numerical collision models and methods for coarse graining in phase space. Verification is pursued by linear stability study comparisons with the FULL and HD7 codes and by benchmarking with the GKV, GYSELA and other gyrokinetic simulation codes. Validation of gyrokinetic models of ion and electron thermal transport is pursued by systematic stressing comparisons with fluctuation and transport data from the DIII-D and NSTX tokamaks. The physics and code development research programs are supported by complementary efforts in computer sciences, high performance computing, and data management.
ERIC Educational Resources Information Center
Suits, Jerry P.
2004-01-01
A laboratory practical examination was used to compare the investigative skills developed in two different types of general-chemistry laboratory courses. Science and engineering majors (SEM) in the control group used a traditional verification approach (SEM-Ctrl), whereas those in the treatment group learned from an innovative, inquiry-based…
The Babushka Concept--An Instructional Sequence to Enhance Laboratory Learning in Science Education
ERIC Educational Resources Information Center
Gårdebjer, Sofie; Larsson, Anette; Adawi, Tom
2017-01-01
This paper deals with a novel method for improving the traditional "verification" laboratory in science education. Drawing on the idea of integrated instructional units, we describe an instructional sequence which we call the Babushka concept. This concept consists of three integrated instructional units: a start-up lecture, a laboratory…
Remote Sensing Product Verification and Validation at the NASA Stennis Space Center
NASA Technical Reports Server (NTRS)
Stanley, Thomas M.
2005-01-01
Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni; Maki, Justin N.; Cucullu, Gordon C.
2008-01-01
Package Qualification and Verification (PQV) of advanced electronic packaging and interconnect technologies and various other types of qualification hardware for the Mars Exploration Rover/Mars Science Laboratory flight projects has been performed to enhance mission assurance. The qualification of hardware (Engineering Camera and Platinum Resistance Thermometer, PRT) under extreme cold temperatures has been performed with reference to various project requirements. Flight-like packages, sensors, and subassemblies were selected for the study to survive three times (3x) the total number of expected temperature cycles resulting from all environmental and operational exposures occurring over the life of the flight hardware, including all relevant manufacturing, ground operations, and mission phases. Qualification has been performed by subjecting the flight-like qualification hardware to the environmental temperature extremes and assessing any structural failures or degradation in electrical performance due to either overstress or thermal cycle fatigue. The qualification test results for this flight-like hardware are described in this paper.
NASA Astrophysics Data System (ADS)
Diehl, H. T.; Buckley-Geer, E. J.; Lindgren, K. A.; Nord, B.; Gaitsch, H.; Gaitsch, S.; Lin, H.; Allam, S.; Collett, T. E.; Furlanetto, C.; Gill, M. S. S.; More, A.; Nightingale, J.; Odden, C.; Pellico, A.; Tucker, D. L.; da Costa, L. N.; Fausti Neto, A.; Kuropatkin, N.; Soares-Santos, M.; Welch, B.; Zhang, Y.; Frieman, J. A.; Abdalla, F. B.; Annis, J.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; Desai, S.; Dietrich, J. P.; Drlica-Wagner, A.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Nichol, R. C.; Nugent, P.; Ogando, R. L. C.; Plazas, A. A.; Reil, K.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Sheldon, E.; Smith, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.; DES Collaboration
2017-09-01
We report the results of searches for strong gravitational lens systems in the Dark Energy Survey (DES) Science Verification and Year 1 observations. The Science Verification data span approximately 250 sq. deg. with a median i-band limiting magnitude for extended objects (10σ) of 23.0. The Year 1 data span approximately 2000 sq. deg. and have an i-band limiting magnitude for extended objects (10σ) of 22.9. As these data sets are both wide and deep, they are particularly useful for identifying strong gravitational lens candidates. Potential strong gravitational lens candidate systems were initially identified based on a color and magnitude selection in the DES object catalogs or because the system is at the location of a previously identified galaxy cluster. Cutout images of potential candidates were then visually scanned using an object viewer and numerically ranked according to whether or not we judged them to be likely strong gravitational lens systems. Having scanned nearly 400,000 cutouts, we present 374 candidate strong lens systems, of which 348 are identified for the first time. We provide the R.A. and decl., the magnitudes and photometric properties of the lens and source objects, and the distance (radius) of the source(s) from the lens center for each system.
RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree-configured) distribution feeders were undertaken. That work demonstrated the feasibility and validity of the model based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification to a greater variety of situations and network sizes, (2) extending the model capabilities to reverse-direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots. Results are summarized.
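A minimal sketch of how the steady-state sinusoidal response of a radial feeder can be computed, assuming the feeder is approximated as a cascade of two-port ABCD matrices; the line and load impedances below are invented placeholders, not the model or data from this study.

    import numpy as np

    def series(z):
        """ABCD matrix of a series impedance (line section)."""
        return np.array([[1, z], [0, 1]], dtype=complex)

    def shunt(y):
        """ABCD matrix of a shunt admittance (load tap)."""
        return np.array([[1, 0], [y, 1]], dtype=complex)

    # Toy radial feeder at one carrier frequency: three line sections with
    # shunt loads (all impedance values are placeholders, not feeder data).
    f = 10e3                              # e.g. a 10 kHz carrier
    w = 2 * np.pi * f
    sections = [series(0.5 + 1j * w * 1e-5), shunt(1 / 50),
                series(0.5 + 1j * w * 1e-5), shunt(1 / 80),
                series(0.5 + 1j * w * 1e-5)]

    abcd = np.eye(2, dtype=complex)
    for m in sections:
        abcd = abcd @ m                   # cascade from source to feeder end

    A, B, C, D = abcd.ravel()
    z_load = 100.0                        # receiving-end termination
    # Voltage transfer from sending end to receiving end: Vr/Vs = 1/(A + B/Zl)
    h = 1 / (A + B / z_load)
    print(f"|H| = {abs(h):.3f}, phase = {np.angle(h, deg=True):.1f} deg")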
Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)
2016-03-01
…up game Binary Fission, which was deployed during Phase Two of CHEKOFV. Xylem: The Code of Plants is a casual game for players using mobile… There are the design and engineering challenges of building a game infrastructure that integrates verification technology with crowd participation… the backend processes that annotate the originating software. Allowing players to construct their own equations opened up the flexibility to receive…
Applying Independent Verification and Validation to Automatic Test Equipment
NASA Technical Reports Server (NTRS)
Calhoun, Cynthia C.
1997-01-01
This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur or of all development and maintenance items that can be validated and verified, during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle are described.
Radiation effects on science instruments in Grand Tour type missions
NASA Technical Reports Server (NTRS)
Parker, R. H.
1972-01-01
The extent of the radiation effects problem is delineated, along with the status of protective designs for 15 representative science instruments. Designs for protecting science instruments from radiation damage are discussed for the various instruments to be employed in the Grand Tour type missions. A literature search effort was undertaken to collect science instrument component damage/interference effects data on the various sensitive components such as Si detectors, vidicon tubes, etc. A small experimental effort is underway to provide verification of the radiation effects predictions.
25 CFR 39.230 - How will the provisions of this subpart be phased in?
Code of Federal Regulations, 2010 CFR
2010-04-01
... SCHOOL EQUALIZATION PROGRAM Administrative Procedures, Student Counts, and Verifications Phase-in Period... rolling average of ADM for each school and for the entire Bureau-funded school system will be phased-in as shown in the following table. Time period How OIEP must calculate ADM (a) First school year after May 31...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason D. Hales; Veena Tikare
2014-04-01
The Used Fuel Disposition (UFD) program has initiated a project to develop a hydride formation modeling tool using a hybrid Potts-phase field approach. The Potts model is incorporated in the SPPARKS code from Sandia National Laboratories. The phase field model is provided through MARMOT from Idaho National Laboratory.
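For readers unfamiliar with the Potts side of such a hybrid approach, here is a self-contained Metropolis sweep for a 2D Q-state Potts model in Python; it is a generic textbook sketch, not the SPPARKS/MARMOT coupling itself.

    import numpy as np

    rng = np.random.default_rng(1)
    Q, N, kT = 8, 64, 0.8                 # spin states, lattice size, temperature
    spins = rng.integers(0, Q, size=(N, N))

    def site_energy(s, i, j):
        """Potts energy: count unlike nearest neighbours (coupling J = 1)."""
        nbrs = [s[(i+1) % N, j], s[(i-1) % N, j],
                s[i, (j+1) % N], s[i, (j-1) % N]]
        return sum(s[i, j] != n for n in nbrs)

    def sweep(s):
        """One Metropolis sweep: propose a new state at N*N random sites."""
        for _ in range(N * N):
            i, j = rng.integers(0, N, size=2)
            old, new = s[i, j], rng.integers(0, Q)
            e_old = site_energy(s, i, j)
            s[i, j] = new
            e_new = site_energy(s, i, j)
            if e_new > e_old and rng.random() >= np.exp((e_old - e_new) / kT):
                s[i, j] = old             # reject the move
        return s

    for step in range(5):
        sweep(spins)
    print("distinct grain labels after 5 sweeps:", len(np.unique(spins)))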
Experimental verification of arm-locking for LISA using electronic phase delay
NASA Astrophysics Data System (ADS)
Thorpe, J. I.; Mueller, G.
2005-07-01
We present results of an electronic model of arm-locking, a proposed technique for reducing the laser phase noise in the laser interferometer space antenna (LISA). The model is based on a delay of 500 ms, achieved using the electronic phase delay (EPD) method. The observed behavior is consistent with predictions.
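The essential behavior of such an arm-locking loop can be sketched in a few lines: the phase sensor of a single-arm lock is S(f) = 1 - exp(-2*pi*i*f*tau), and the residual laser noise fraction is |1/(1 + G(f) S(f))|. The controller gain below is an invented 1/f example, not the one used in the EPD experiment; note how suppression disappears at multiples of 1/tau (2 Hz for tau = 500 ms), the well-known arm-locking nulls.

    import numpy as np

    tau = 0.5                                   # 500 ms delay, as in the abstract
    f = np.array([0.01, 0.1, 1.0, 2.0, 10.0])   # Fourier frequencies in Hz
    sensor = 1 - np.exp(-2j * np.pi * f * tau)  # single-arm locking sensor
    gain = 1e3 / (2j * np.pi * f)               # illustrative 1/f controller gain
    residual = np.abs(1 / (1 + gain * sensor))  # fraction of laser noise left
    for fi, s in zip(f, residual):
        print(f"f = {fi:6.2f} Hz  residual noise fraction = {s:.2e}")
    # At f = n/tau (2 Hz, 4 Hz, ...) the sensor response vanishes and the
    # loop provides no suppression: the characteristic arm-locking nulls.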
6 CFR 5.2 - Public reading rooms.
Code of Federal Regulations, 2010 CFR
2010-01-01
... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...
6 CFR 5.2 - Public reading rooms.
Code of Federal Regulations, 2014 CFR
2014-01-01
... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...
6 CFR 5.2 - Public reading rooms.
Code of Federal Regulations, 2011 CFR
2011-01-01
... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...
6 CFR 5.2 - Public reading rooms.
Code of Federal Regulations, 2013 CFR
2013-01-01
... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...
6 CFR 5.2 - Public reading rooms.
Code of Federal Regulations, 2012 CFR
2012-01-01
... the non-proliferation and verification research and development program; (v) The life sciences..., Washington, DC 20472 (for regional offices, consult your phone book); (20) For the Federal Computer Incident...
CARMENES: an overview six months after first light
NASA Astrophysics Data System (ADS)
Quirrenbach, A.; Amado, P. J.; Caballero, J. A.; Mundt, R.; Reiners, A.; Ribas, I.; Seifert, W.; Abril, M.; Aceituno, J.; Alonso-Floriano, F. J.; Anwand-Heerwart, H.; Azzaro, M.; Bauer, F.; Barrado, D.; Becerril, S.; Bejar, V. J. S.; Benitez, D.; Berdinas, Z. M.; Brinkmöller, M.; Cardenas, M. C.; Casal, E.; Claret, A.; Colomé, J.; Cortes-Contreras, M.; Czesla, S.; Doellinger, M.; Dreizler, S.; Feiz, C.; Fernandez, M.; Ferro, I. M.; Fuhrmeister, B.; Galadi, D.; Gallardo, I.; Gálvez-Ortiz, M. C.; Garcia-Piquer, A.; Garrido, R.; Gesa, L.; Gómez Galera, V.; González Hernández, J. I.; Gonzalez Peinado, R.; Grözinger, U.; Guàrdia, J.; Guenther, E. W.; de Guindos, E.; Hagen, H.-J.; Hatzes, A. P.; Hauschildt, P. H.; Helmling, J.; Henning, T.; Hermann, D.; Hernández Arabi, R.; Hernández Castaño, L.; Hernández Hernando, F.; Herrero, E.; Huber, A.; Huber, K. F.; Huke, P.; Jeffers, S. V.; de Juan, E.; Kaminski, A.; Kehr, M.; Kim, M.; Klein, R.; Klüter, J.; Kürster, M.; Lafarga, M.; Lara, L. M.; Lamert, A.; Laun, W.; Launhardt, R.; Lemke, U.; Lenzen, R.; Llamas, M.; Lopez del Fresno, M.; López-Puertas, M.; López-Santiago, J.; Lopez Salas, J. F.; Magan Madinabeitia, H.; Mall, U.; Mandel, H.; Mancini, L.; Marin Molina, J. A.; Maroto Fernández, D.; Martín, E. L.; Martín-Ruiz, S.; Marvin, C.; Mathar, R. J.; Mirabet, E.; Montes, D.; Morales, J. C.; Morales Muñoz, R.; Nagel, E.; Naranjo, V.; Nowak, G.; Palle, E.; Panduro, J.; Passegger, V. M.; Pavlov, A.; Pedraz, S.; Perez, E.; Pérez-Medialdea, D.; Perger, M.; Pluto, M.; Ramón, A.; Rebolo, R.; Redondo, P.; Reffert, S.; Reinhart, S.; Rhode, P.; Rix, H.-W.; Rodler, F.; Rodríguez, E.; Rodríguez López, C.; Rohloff, R. R.; Rosich, A.; Sanchez Carrasco, M. A.; Sanz-Forcada, J.; Sarkis, P.; Sarmiento, L. F.; Schäfer, S.; Schiller, J.; Schmidt, C.; Schmitt, J. H. M. M.; Schöfer, P.; Schweitzer, A.; Shulyak, D.; Solano, E.; Stahl, O.; Storz, C.; Tabernero, H. M.; Tala, M.; Tal-Or, L.; Ulbrich, R.-G.; Veredas, G.; Vico Linares, J. I.; Vilardell, F.; Wagner, K.; Winkler, J.; Zapatero Osorio, M.-R.; Zechmeister, M.; Ammler-von Eiff, M.; Anglada-Escudé, G.; del Burgo, C.; Garcia-Vargas, M. L.; Klutsch, A.; Lizon, J.-L.; Lopez-Morales, M.; Ofir, A.; Pérez-Calpena, A.; Perryman, M. A. C.; Sánchez-Blanco, E.; Strachan, J. B. P.; Stürmer, J.; Suárez, J. C.; Trifonov, T.; Tulloch, S. M.; Xu, W.
2016-08-01
The CARMENES instrument is a pair of high-resolution (R > 80,000) spectrographs covering the wavelength range from 0.52 to 1.71 μm, optimized for precise radial velocity measurements. It was installed and commissioned at the 3.5m telescope of the Calar Alto observatory in Southern Spain in 2015. The first large science program of CARMENES is a survey of 300 M dwarfs, which started on Jan 1, 2016. We present an overview of all subsystems of CARMENES (front end, fiber system, visible-light spectrograph, near-infrared spectrograph, calibration units, etalons, facility control, interlock system, instrument control system, data reduction pipeline, data flow, and archive), and summarize the assembly, integration, verification, and commissioning phases of the project. We show initial results and discuss further plans for the scientific use of CARMENES.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y; Yin, F; Ren, L
Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam together with the orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change, and phase shift. Limited-angle orthogonal kV and beam's eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground-truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The technique enables LIVE to accurately reconstruct 4D-CBCT images using only an orthogonal 6° angle, which greatly improves the efficiency and reduces the dose of LIVE for intrafraction verification.
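The two figures of merit used above are easy to reproduce for binary tumor masks. The sketch below assumes the common definitions, VPD as the mismatched (XOR) volume relative to the ground-truth volume and COMS as the Euclidean distance between mask centroids; the synthetic sphere data are for illustration only.

    import numpy as np

    def vpd(recon, truth):
        """Volume percentage difference between binary masks, assuming the
        usual definition: mismatched volume over ground-truth volume."""
        mismatch = np.logical_xor(recon, truth).sum()
        return 100.0 * mismatch / truth.sum()

    def coms(recon, truth, spacing=(1.0, 1.0, 1.0)):
        """Center-of-mass shift in mm between two binary masks."""
        c1 = np.array(np.nonzero(recon)).mean(axis=1) * spacing
        c2 = np.array(np.nonzero(truth)).mean(axis=1) * spacing
        return np.linalg.norm(c1 - c2)

    # Tiny synthetic example: a sphere and a 1-voxel-shifted copy of it.
    z, y, x = np.mgrid[:32, :32, :32]
    truth = (z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2 < 64
    recon = np.roll(truth, 1, axis=0)
    print(f"VPD = {vpd(recon, truth):.1f}%, COMS = {coms(recon, truth):.2f} mm")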
Code of Federal Regulations, 2014 CFR
2014-01-01
... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and Technology...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... Controller of Imports and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 11011 Indian IC For the “organized” sector, except for computers and related equipment: Directorate... 110011 Indian IC For computers and related electronic items: Department of Electronics, Lok Nayak Bhawan... Controller of Imports and Exports 5, Civic Center Islamabad IC Joint Science Advisor, Ministry of Science and...
1988-03-01
…observed in the laboratory, and to determine the degree of correlation between the bioaccumulation of contaminants and bioenergetic responses… toxicity of liquid, suspended particulate, and solid phases; (c) estimating the potential contaminant bioaccumulation; and (d) describing the initial… bioaccumulation of dredged material contaminants with biological responses from laboratory and field exposure to dredged material. However, this study…
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.
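The mid-fidelity modelling approach that WEC3 targets reduces, in its simplest single-degree-of-freedom form, to integrating a heave equation of motion with constant hydrodynamic coefficients. The sketch below uses invented coefficients and a regular-wave excitation, and checks the integrator against the analytic steady-state amplitude; real WEC codes add convolution radiation forces, multibody dynamics, and power take-off models.

    import numpy as np

    # Single-DOF heave model: (m + A) x'' = -B x' - C x + F cos(w t)
    m, A, B, C = 1e5, 5e4, 2e4, 4e5     # kg, kg, N·s/m, N/m (placeholders)
    F, w = 1e5, 0.8                     # excitation amplitude (N), freq (rad/s)

    def deriv(t, s):
        x, v = s
        a = (F * np.cos(w * t) - B * v - C * x) / (m + A)
        return np.array([v, a])

    # Fixed-step RK4 time integration
    dt, T = 0.01, 200.0
    s, amp = np.zeros(2), 0.0
    for t in np.arange(0.0, T, dt):
        k1 = deriv(t, s)
        k2 = deriv(t + dt/2, s + dt/2 * k1)
        k3 = deriv(t + dt/2, s + dt/2 * k2)
        k4 = deriv(t + dt, s + dt * k3)
        s = s + dt/6 * (k1 + 2*k2 + 2*k3 + k4)
        if t > T / 2:                   # record steady-state amplitude only
            amp = max(amp, abs(s[0]))

    # Compare with the analytic steady-state response amplitude
    analytic = F / np.hypot(C - (m + A) * w**2, B * w)
    print(f"simulated amplitude {amp:.3f} m vs analytic {analytic:.3f} m")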
Integrated verification and testing system (IVTS) for HAL/S programs
NASA Technical Reports Server (NTRS)
Senn, E. H.; Ames, K. R.; Smith, K. A.
1983-01-01
The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.
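To illustrate what a static-analysis pass over a compiler intermediate form can look like (IVTS works on HALMAT; the three-address format below is invented for illustration), here is a single-pass use-before-set check in Python.

    # Toy three-address IR: (op, dest, src1, src2); None for unused fields.
    # The instruction format is invented; HALMAT's actual encoding is richer.
    ir = [
        ("load",  "a", "INPUT1", None),
        ("add",   "b", "a", "c"),        # 'c' is read before any assignment
        ("mul",   "d", "b", "a"),
        ("store", None, "d", None),
    ]

    def use_before_set(program, predefined=("INPUT1",)):
        """Single-pass static check: flag operands read before definition."""
        defined = set(predefined)
        findings = []
        for idx, (op, dest, s1, s2) in enumerate(program):
            for src in (s1, s2):
                if src is not None and src not in defined:
                    findings.append((idx, op, src))
            if dest is not None:
                defined.add(dest)
        return findings

    for idx, op, name in use_before_set(ir):
        print(f"instr {idx} ({op}): '{name}' used before set")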
Recruitment Strategies of Methamphetamine-Using Men Who Have Sex with Men into an Online Survey
Wilkerson, J. Michael; Shenk, Jared E.; Grey, Jeremy A.; Simon Rosser, B. R.; Noor, Syed W.
2014-01-01
Recruiting hidden populations into online research remains challenging. In this manuscript, we report lessons learned from our efforts to recruit methamphetamine-using men who have sex with men. Between July and October 2012, we implemented a four-phase recruitment strategy to enroll a total of 343 methamphetamine-using MSM into an online survey about recent substance use, sexual behavior, and various psychosocial measures. The four phases were implemented sequentially. During phase one, we placed advertisements on mobile applications, and during phase two, we placed advertisements on traditional websites formatted for browsers. During phase three, we used e-mail to initiate snowball recruitment, and during phase four, we used social media for snowball recruitment. Advertisements on mobile devices and websites formatted for browsers proved to be expensive options and resulted in few eligible participants. Our attempts to initiate a snowball through e-mail also proved unsuccessful. The majority (n=320) of observations in our final dataset came from our use of social media. However, participant fraud was a concern, requiring us to implement a strong participant verification protocol. For maximum recruitment and cost-effectiveness, researchers should use social media for recruitment provided they employ strong participant verification protocols. PMID:25642143
Exploring system interconnection architectures with VIPACES: from direct connections to NOCs
NASA Astrophysics Data System (ADS)
Sánchez-Peña, Armando; Carballo, Pedro P.; Núñez, Antonio
2007-05-01
This paper presents a simple environment for the verification of AMBA 3 AXI systems through Verification IP (VIP) production, called VIPACES (Verification Interface Primitives for the development of AXI Compliant Elements and Systems). These primitives are provided as an uncompiled library written in SystemC in which interfaces are the core of the library. Defining interfaces instead of generic modules lets the user construct custom modules, reducing the resources spent during the verification phase and making it easy to adapt modules to the AMBA 3 AXI protocol. This interface-centric approach is the central idea of the VIPACES library. The paper focuses on comparing and contrasting the main interconnection schemes for AMBA 3 AXI as modeled by VIPACES. For assessing these results we propose a validation scenario with a particular architecture belonging to the domain of MPEG4 video decoding, which comprises an AXI bus connecting an IDCT and other processing resources.
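VIPACES itself is a SystemC library, but the interface-centric idea can be sketched in Python: modules are written against an abstract channel interface, and a monitor implementing that interface checks a protocol rule. The handshake rule below is one simplified, AXI-like property (data stable while VALID is high); real AXI compliance involves many more.

    from abc import ABC, abstractmethod

    class WriteChannel(ABC):
        """Abstract interface that drivers, monitors, and DUT adapters share."""
        @abstractmethod
        def drive(self, valid: bool, ready: bool, data): ...

    class HandshakeMonitor(WriteChannel):
        """Checks one simplified rule: once VALID is asserted, DATA must
        hold stable until READY accepts the transfer."""
        def __init__(self):
            self.pending = None
        def drive(self, valid, ready, data):
            if valid:
                if self.pending is not None and data != self.pending:
                    raise AssertionError("DATA changed while VALID held high")
                self.pending = data
            if valid and ready:
                print(f"transfer accepted: {data!r}")
                self.pending = None

    mon = HandshakeMonitor()
    mon.drive(valid=True, ready=False, data=0xA5)   # master waits for slave
    mon.drive(valid=True, ready=True,  data=0xA5)   # slave accepts transfer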
Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features
NASA Technical Reports Server (NTRS)
Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed
2012-01-01
Phase I: The use of adhesive locking features or liquid locking compounds (LLCs) (e.g., Loctite) as a means of providing a secondary locking feature has been used on NASA programs since the Apollo program. In many cases Loctite was used as a last resort when (a) self-locking fasteners were no longer functioning per their respective drawing specification, (b) access was limited for removal and replacement, or (c) replacement could not be accomplished without severe impact to schedule. Long-term use of Loctite became inevitable in cases where removal and replacement of worn hardware was not cost-effective and Loctite was assumed to be fully cured and working. The NASA Engineering & Safety Center (NESC) and United Space Alliance (USA) recognized the need for more extensive testing of Loctite grades to better understand their capabilities and limitations as a secondary locking feature. These tests, identified as Phase I, were designed to identify processing sensitivities, determine the proper cure time and the correct primer to use on aerospace nutplate, insert, and bolt materials such as A286 and MP35N, and establish the minimum amount of Loctite required to achieve optimum breakaway torque values. The .1900-32 fastener size was tested because of its wide usage in the aerospace industry. Three different grades of Loctite were tested. Results indicate that, with proper controls, adhesive locking features can be successfully used in the repair of locking features and should be considered for design.
Phase II: Threaded fastening systems used in aerospace programs typically have a requirement for a redundant locking feature. The primary locking method is the fastener preload, and the traditional redundant locking feature is a self-locking mechanical device that may include deformed threads, non-metallic inserts, split beam features, or other methods that impede movement between threaded members. The self-locking resistance of traditional locking features can be directly verified during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine if the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.
Dental Laboratory Technology. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Smith, Debra S.
This report provides results of Phase I of a project that researched the occupational area of dental laboratory technology, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train dental laboratory technicians. Section 1 contains general information:…
Environmental Horticulture. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Bachler, Mike; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of environmental horticulture, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to address the needs of the horticulture field. Section 1 contains general information:…
Instrumentation Technology. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Squires, Sheila S.
This report provides results of Phase I of a project that researched the occupational area of instrumentation technology, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train instrumentation technicians. Section 1 contains general information: purpose of…
NASA Technical Reports Server (NTRS)
Martin, F. H.
1972-01-01
An overview of the executive system design task is presented. The flight software executive system, software verification, phase B baseline avionics system review, higher order languages and compilers, and computer hardware features are also discussed.
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon the formal requirements, including any changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of the development, as opposed to having some other organization continue the process once the design is complete, are discussed.
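The plan-completeness checks described above lend themselves to automation. The following Python sketch audits a hypothetical requirements set against a verification plan, flagging requirements with no planned verification, an unrecognized method, or no responsible organization; all record formats and names are invented.

    # Hypothetical requirement and plan records, illustrating the checks the
    # presentation calls for: every requirement formally verified, with a
    # method and a responsible organization recorded.
    METHODS = {"test", "inspection", "analysis", "demonstration"}

    requirements = ["SYS-001", "SYS-002", "SYS-003"]
    plan = {
        "SYS-001": {"method": "test", "org": "Propulsion"},
        "SYS-002": {"method": "analysis", "org": "GN&C"},
        # SYS-003 is missing: the audit below should flag it.
    }

    def audit(reqs, plan):
        problems = []
        for r in reqs:
            entry = plan.get(r)
            if entry is None:
                problems.append(f"{r}: no verification planned")
            elif entry["method"] not in METHODS:
                problems.append(f"{r}: unknown method {entry['method']!r}")
            elif not entry.get("org"):
                problems.append(f"{r}: no responsible organization")
        return problems

    for p in audit(requirements, plan):
        print(p)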
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often has the objective of optimizing the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedule and cost is generating a dialectical process inside the project teams, involving programme management and design authorities, aimed at optimizing the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its particular mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities) as shown in Fig. 1 (from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static, and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. (Fig. 1: Model philosophy, Verification and Test Programme definition.) The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. Qualification and Acceptance).
2013-09-01
…to an XML file, code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a… C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13…C10P07R05–075, 2013. [21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS.
Hubble Space Telescope Fine Guidance Sensors Instrument Handbook, version 4.0
NASA Technical Reports Server (NTRS)
Holfeltz, S. T. (Editor)
1994-01-01
This is a revised version of the Hubble Space Telescope Fine Guidance Sensor Instrument Handbook. The main goal of this edition is to help the potential General Observer (GO) learn how to most efficiently use the Fine Guidance Sensors (FGS's). First, the actual performance of the FGS's as scientific instruments is reviewed. Next, each of the available operating modes of the FGS's are reviewed in turn. The status and findings of pertinent calibrations, including Orbital Verification, Science Verification, and Instrument Scientist Calibrations are included as well as the relevant data reduction software.
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2014 CFR
2014-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2010 CFR
2010-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2013 CFR
2013-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2012 CFR
2012-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2011 CFR
2011-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
Partial Support of Meeting of the Board on Mathematical Sciences and Their Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weidman, Scott
2014-08-31
During the performance period, BMSA released the following major reports: Transforming Combustion Research through Cyberinfrastructure (2011); Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification (2012); Fueling Innovation and Discovery: The Mathematical Sciences in the 21st Century (2012); Aging and the Macroeconomy: Long-Term Implications of an Older Population (2012); The Mathematical Sciences in 2025 (2013); Frontiers in Massive Data Analysis (2013); and Developing a 21st Century Global Library for Mathematics Research (2014).
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael
1994-01-01
In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.
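The fault-masking principle behind NMR-style redundancy is majority voting across redundant channels. The sketch below illustrates only that principle in Python; the RCP's actual internal voting and its formal treatment in Ehdm are far more involved.

    from collections import Counter

    def majority_vote(replicas):
        """Mask faults by majority: return the value agreed by a strict
        majority of redundant channels, or raise if no majority exists."""
        value, count = Counter(replicas).most_common(1)[0]
        if count * 2 <= len(replicas):
            raise RuntimeError("no majority: too many simultaneous faults")
        return value

    # 4-channel NMR example: one transiently faulty channel is out-voted.
    sensors = [42.0, 42.0, 17.3, 42.0]
    print(majority_vote(sensors))   # -> 42.0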
NASA Technical Reports Server (NTRS)
Williams, David E.
2007-01-01
The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 ACS, AR, and WRM design and detailed Element Verification methodologies utilized during the Qualification phase for Node 1.
NASA Technical Reports Server (NTRS)
Platt, R.
1999-01-01
This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.
23 CFR 230.409 - Contract compliance review procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... (Actions R-1 and R-2). (2) Contractor Notification (Action R-3). (3) Preliminary Analysis (Phase I) (Action R-4). (4) Onsite Verification and Interviews (Phase II) (Action R-5). (5) Exit Conference (Action R-6). (6) Compliance Determination and Formal Notification (Actions R-8, R-9, R-10, R-11, R-12). The...
A critique of the hypothesis, and a defense of the question, as a framework for experimentation.
Glass, David J
2010-07-01
Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being: inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goals of experimental science, which are verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework, the query/model approach, which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.
SETI and the media: Views from inside and out
NASA Astrophysics Data System (ADS)
Tarter, Donald E.
Results are presented from a detailed questionnaire sent to members of the international SETI (Search for Extraterrestrial Intelligence) community and the international science media. Both groups are compared on the following dimensions: perceived importance of SETI, perceived level of information about SETI available to the media and public, perceived credibility of SETI, and attitudes toward information policy options to govern an announcement of a SETI discovery. The results indicate that SETI is perceived to be an extremely important endeavor, but it enjoys only marginal credibility among the public and the SETI community's professional constituencies. Both the SETI community and the media agree that an erroneous announcement of a discovery of extraterrestrial intelligence could be very damaging. In order to minimize the dangers of false announcement and to bring a degree of order to SETI, a scientific protocol agreement and the establishment of a contact verification committee have been recommended. Both received endorsement from the SETI community and the international science media. The science media feels that from its viewpoint, a contact verification committee would be a more effective way of assuring accurate information about SETI programs and discoveries.
VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.
2015-12-01
A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information that can be used to assess model performance; automated information on code performance allows a systematic methodology for assessing the quality of model approximations. The software implements common and accepted code verification schemes: the Method of Manufactured Solutions (MMS), the Method of Exact Solutions (MES), cross-code verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software can also calculate the Grid Convergence Index (GCI), and can handle adaptive meshes and models that implement mixed-order schemes. Four examples demonstrate the use of the software for code and solution verification, model validation, and uncertainty quantification: code verification of a mixed-order compact difference heat transport solver; solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; model validation of a two-phase flow computation in a hydraulic jump against experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
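As an illustration of the solution-verification machinery described above, the sketch below computes an observed order of accuracy and a fine-grid Grid Convergence Index from solutions on three systematically refined grids. The function name, safety factor, and sample values are illustrative assumptions, not the VAVUQ API.

```python
import math

def richardson_gci(f_coarse, f_medium, f_fine, r, fs=1.25):
    """Observed order of accuracy and Grid Convergence Index (GCI)
    from three grids with constant refinement ratio r, assuming
    monotone convergence of the solution functional f."""
    # Observed order from the ratio of successive solution differences.
    p_obs = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    # Relative error between the two finest grids.
    e21 = abs((f_medium - f_fine) / f_fine)
    # Fine-grid GCI with safety factor fs.
    gci_fine = fs * e21 / (r ** p_obs - 1.0)
    return p_obs, gci_fine

# Hypothetical functional values on grids refined by r = 2.
p, gci = richardson_gci(0.9700, 0.9925, 0.9981, r=2.0)
print(f"observed order ~ {p:.2f}, fine-grid GCI ~ {gci:.2%}")
```

Error bands of the kind mentioned in the abstract can be formed from the same extrapolated estimate, by bracketing the exact solution around the fine-grid value.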
Biometric Subject Verification Based on Electrocardiographic Signals
NASA Technical Reports Server (NTRS)
Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)
2014-01-01
A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
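The patent abstract does not name the two graph comparison metrics, so the sketch below stands in a normalized zero-lag cross-correlation for the second-stage comparison; the template length, acceptance threshold, and synthetic PQRST-like cycles are all hypothetical.

```python
import numpy as np

def composite(cycles):
    """Average a set of aligned, equal-length heart-cycle graphs into
    one representative template."""
    return np.mean(np.stack(cycles), axis=0)

def resembles(a, b, threshold=0.95):
    """Stand-in second metric: normalized cross-correlation at zero lag.

    The identity assertion is accepted when the score falls in the
    selected range (here, above a fixed threshold)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    score = float(np.dot(a, b) / len(a))
    return score >= threshold, score

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
make_cycle = lambda: np.sin(t) + 0.02 * rng.standard_normal(t.size)
reference = composite([make_cycle() for _ in range(5)])   # enrollment phase
candidate = composite([make_cycle() for _ in range(5)])   # verification phase
accepted, score = resembles(reference, candidate)
print(accepted, round(score, 3))
```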
User interface and operational issues with thermionic space power systems
NASA Technical Reports Server (NTRS)
Dahlberg, R. C.; Fisher, C. R.
1987-01-01
Thermionic space power systems have unique features which facilitate predeployment operations, provide operational flexibility and simplify the interface with the user. These were studied in some detail during the SP-100 program from 1983 to 1985. Three examples are reviewed in this paper: (1) system readiness verification in the prelaunch phase; (2) startup, shutdown, and dormancy in the operations phase; (3) part-load operation in the operations phase.
NASA Astrophysics Data System (ADS)
Varseev, E.
2017-11-01
The present work is dedicated to the verification of a numerical model in a standard solver of the open-source CFD code OpenFOAM for two-phase flow simulation, and to the determination of so-called “baseline” model parameters. An investigation of the heterogeneous coolant flow parameters that lead to an abnormal friction increase in channels with two-phase adiabatic “water-gas” flows at low void fractions is presented.
Architectures Toward Reusable Science Data Systems
NASA Technical Reports Server (NTRS)
Moses, John Firor
2014-01-01
Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today.
A clocking discipline for two-phase digital integrated circuits
NASA Astrophysics Data System (ADS)
Noice, D. C.
1983-09-01
Sooner or later a designer of digital circuits must face the problem of timing verification in order to avoid errors caused by clock skew, critical races, and hazards. Unlike previous verification methods, such as timing simulation and timing analysis, the approach presented here guarantees correct operation despite uncertainty about delays in the circuit. The result is a clocking discipline that deals with timing abstractions only. It is not based on delay calculations; it is concerned only with correct, synchronous operation at some clock rate. Accordingly, it may be used earlier in the design cycle, which is particularly important for integrated circuit designs. The clocking discipline consists of a notation of clocking types and composition rules for using the types. Together, the notation and rules define a formal theory of two-phase clocking. The notation defines the names and exact characteristics of the different signals used in a two-phase digital system, and makes it possible to develop rules for propagating the clocking types through particular circuits.
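The flavor of such a discipline can be conveyed with a toy type-propagation rule: a latch clocked on one phase requires an input stable during that phase and produces an output stable during the other. The type names and rules below are illustrative simplifications under that assumption, not the notation defined in the work itself.

```python
# Toy clocking-type propagation for a two-phase system.
VALID_PHASES = {"phi1", "phi2"}

def other(phase):
    return "phi2" if phase == "phi1" else "phi1"

def latch(input_type, latch_phase):
    """A latch transparent during latch_phase needs an input stable in
    that phase and yields an output stable in the opposite phase."""
    if latch_phase not in VALID_PHASES:
        raise ValueError("unknown clock phase")
    if input_type != f"stable_{latch_phase}":
        raise TypeError(f"clocking violation: {input_type} into a {latch_phase} latch")
    return f"stable_{other(latch_phase)}"

def logic(input_type):
    """Combinational logic preserves the clocking type of its inputs."""
    return input_type

# A legal phi1 -> logic -> phi2 pipeline stage checks without any
# delay calculation, mirroring the delay-independent guarantee.
t = latch("stable_phi1", "phi1")  # -> stable_phi2
t = latch(logic(t), "phi2")       # -> stable_phi1
print(t)
```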
NASA Technical Reports Server (NTRS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.;
2016-01-01
NASA's James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full field, cryogenic JWST telescope simulator. SI performance, including alignment and wave front error, were evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.
2016-09-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Deer, Maria Soledad
The auditory experience of using a hearing aid or a cochlear implant simultaneously with a cell phone is driven by a number of factors: radiofrequency and baseband interference, speech intelligibility, sound quality, handset design, volume control, and signal strength. The purpose of this study was to develop a tool that hearing aid and cochlear implant users can use in retail stores as they try cell phones before buying them. The tool is meant to be an efficient, practical, and systematic consumer selection aid that captures and documents information on all the domains that play a role in the auditory experience of using a cell phone with a hearing aid or cochlear implant. Development involved three steps: preparation, verification, and measurement of success against a predefined criterion. First, the consumer tool, consisting of a comparison chart and speech material, was prepared. Second, the tool was evaluated by groups of subjects in a two-step verification process: Phase I was conducted in a controlled setting and was followed by Phase II, which took place under real-world (field) conditions. To evaluate the tool systematically, two questionnaires were developed, one for each phase. Both questionnaires involved five quantitative variables scored with rating scales. These ratings were averaged, yielding an Overall Consumer Performance Score, and a qualitative performance category corresponding to the Mean Opinion Score (MOS) was allocated to each final score on a scale from 1 to 5 (where 5 = excellent and 1 = bad). Finally, development was deemed successful if at least 80% of the participants in verification Phase II rated the comparison chart as excellent or good according to the qualitative MOS score. The results for verification Phase II (field conditions) indicated that the Overall Consumer Performance Score for 92% of the subjects (11/12) was 3.7 or above, corresponding to the Good and Excellent MOS categories. It was concluded that this is a practical and efficient tool for hearing aid and cochlear implant users as they approach a cell phone selection process.
Design, analysis and test verification of advanced encapsulation systems, phase 2 program results
NASA Astrophysics Data System (ADS)
Garcia, A.; Minning, C.; Breen, R. T.; Coakley, J. F.; Duncan, L. B.; Gllaspy, D. M.; Kiewert, R. H.; McKinney, F. G.; Taylor, W. E.; Vaughn, L. E.
1982-06-01
Optical, electrical isolation, thermal structural, structural deflection, and thermal tests are reported. The utility of the optical, series capacitance, and structural deflection models was verified.
Design, analysis and test verification of advanced encapsulation systems, phase 2 program results
NASA Technical Reports Server (NTRS)
Garcia, A.; Minning, C.; Breen, R. T.; Coakley, J. F.; Duncan, L. B.; Gllaspy, D. M.; Kiewert, R. H.; Mckinney, F. G.; Taylor, W. E.; Vaughn, L. E.
1982-01-01
Optical, electrical isolation, thermal structural, structural deflection, and thermal tests are reported. The utility of the optical, series capacitance, and structural deflection models was verified.
Intermediate Experimental Vehicle (IXV): Avionics and Software of the ESA Reentry Demonstrator
NASA Astrophysics Data System (ADS)
Malucchi, Giovanni; Dussy, Stephane; Camuffo, Fabrizio
2012-08-01
The IXV project is conceived as a technology platform that performs a step forward with respect to the Atmospheric Reentry Demonstrator (ARD) by increasing system maneuverability and verifying critical technology performance against a wider reentry corridor. The main objective is to design, develop, and perform an in-flight verification of an autonomous lifting, aerodynamically controlled (by a combined use of thrusters and aerodynamic surfaces) reentry system. The project also includes the verification and experimentation of a set of critical reentry technologies and disciplines: Thermal Protection System (TPS), for verification and characterization of thermal protection technologies in a representative operational environment; Aerodynamics and Aerothermodynamics (AED-ATD), for understanding and validation of aerodynamic and aerothermodynamic phenomena with improvement of design tools; Guidance, Navigation and Control (GNC), for verification of guidance, navigation, and control techniques in a representative operational environment (i.e., reentry from Low Earth Orbit); and flight dynamics, to update and validate the vehicle model during actual flight, focused on stability and control derivatives. These activities are being performed through a strict system design-to-cost approach with a proto-flight model development philosophy. In 2008 and 2009, the IXV project reached the successful completion of project Phase-B, including the System PDR, and early project Phase-C. In 2010, following a reorganization of the industrial consortium, the project successfully completed a design consolidation leading to an optimization of the technical baseline, including the GNC, avionics (i.e., power, data handling, radio frequency and telemetry), measurement sensors, hot and cold composite structures, and thermal protection and control, with significant improvements of the main system budgets. The project successfully closed the System CDR during 2011 and is currently running Phase-D, with the target of launching on Vega from Kourou in 2014. The paper will provide an overview of the IXV design and mission objectives in the frame of the overall atmospheric reentry activities, focusing on the avionics and software architecture and design.
Separating stages of arithmetic verification: An ERP study with a novel paradigm.
Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes
2015-08-01
In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail.
A field study of the accuracy and reliability of a biometric iris recognition system.
Latman, Neal S; Herb, Emily
2013-06-01
The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that potentially could circumvent the use of iris recognitions systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus.
NASA Astrophysics Data System (ADS)
Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.
2004-11-01
The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, a PC running a National Instruments LabVIEW virtual instrument (VI) periodically verifies continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day, while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system are the topic of this paper.
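The log-ratio technique itself reduces to a one-line position estimate: the beam offset is proportional to the logarithm of the ratio of opposing pickup-electrode signals. The sketch below assumes a hypothetical calibration scale k in millimeters per decade; in the real modules, the periodic calibrations described above fold temperature-drift corrections into this mapping.

```python
import math

def bpm_position(v_a, v_b, k=10.0):
    """Log-ratio beam position estimate from opposing electrode
    signals: x ~ k * log10(Va / Vb). k (mm per decade) comes from
    calibration and absorbs channel gain imbalance."""
    if v_a <= 0 or v_b <= 0:
        raise ValueError("electrode signals must be positive")
    return k * math.log10(v_a / v_b)

print(bpm_position(1.00, 1.00))             # centered beam -> 0.0
print(round(bpm_position(1.10, 0.95), 3))   # offset toward electrode A
```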
SWARM: A 32 GHz Correlator and VLBI Beamformer for the Submillimeter Array
NASA Astrophysics Data System (ADS)
Primiani, Rurik A.; Young, Kenneth H.; Young, André; Patel, Nimesh; Wilson, Robert W.; Vertatschitsch, Laura; Chitwood, Billie B.; Srinivasan, Ranjani; MacMahon, David; Weintroub, Jonathan
2016-03-01
A 32 GHz bandwidth VLBI-capable correlator and phased array has been designed and deployed at the Smithsonian Astrophysical Observatory's Submillimeter Array (SMA). The SMA Wideband Astronomical ROACH2 Machine (SWARM) integrates two instruments: a correlator with 140 kHz spectral resolution across its full 32 GHz band, used for connected interferometric observations, and a phased array summer used when the SMA participates as a station in the Event Horizon Telescope (EHT) very long baseline interferometry (VLBI) array. For each SWARM quadrant, Reconfigurable Open Architecture Computing Hardware (ROACH2) units, shared under an open-source model by the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are equipped with a pair of ultra-fast analog-to-digital converters (ADCs), a field programmable gate array (FPGA) processor, and eight 10 Gigabit Ethernet (GbE) ports. A VLBI data recorder interface designated the SWARM digital back end, or SDBE, is implemented with a ninth ROACH2 per quadrant, feeding four Mark6 VLBI recorders with an aggregate recording rate of 64 Gbps. This paper describes the design and implementation of SWARM, as well as its deployment at SMA with reference to verification and science data.
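For readers unfamiliar with the FX architecture underlying correlators like SWARM, the sketch below shows the core operation on two synthetic antenna voltage streams: channelize with an FFT (the 'F' stage), then multiply and accumulate spectra (the 'X' stage) to form a complex cross-power spectrum. This is a NumPy illustration of the general technique under assumed channel counts and signals, not SWARM's FPGA implementation.

```python
import numpy as np

def fx_correlate(x, y, n_chan=1024):
    """Minimal FX correlation of two real voltage streams: FFT
    channelization followed by cross-multiplication and time
    averaging of the resulting spectra."""
    n = (len(x) // n_chan) * n_chan
    xs = np.fft.rfft(x[:n].reshape(-1, n_chan), axis=1)
    ys = np.fft.rfft(y[:n].reshape(-1, n_chan), axis=1)
    return (xs * np.conj(ys)).mean(axis=0)  # accumulate over frames

rng = np.random.default_rng(1)
sky = rng.standard_normal(1 << 16)            # common correlated signal
x = sky + 0.5 * rng.standard_normal(1 << 16)  # antenna 1 voltages
y = sky + 0.5 * rng.standard_normal(1 << 16)  # antenna 2 voltages
spectrum = fx_correlate(x, y)
print(spectrum.shape, float(np.abs(spectrum).mean()))
```

A phased-array summer, by contrast, aligns and adds the station voltages before recording, which is the beamformer role SWARM plays for EHT VLBI.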
NASA Astrophysics Data System (ADS)
Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.
2008-07-01
In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships, and radio-frequency antennas. The planning and execution of CREATE are based on the lessons learned from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.
Monitoring/Verification using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan Weeks, Kevin Kyle, Manuel Manard
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas-phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements.
NASA Technical Reports Server (NTRS)
Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan
1997-01-01
The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R), MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1, 7 and 30 day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.
A drinking water method for 12 chemicals, predominately pesticides, is presented that addresses the occurrence monitoring needs of the U.S. Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs solid phase ext...
NASA Astrophysics Data System (ADS)
Zhong-Zhen, Wu; Shu, Xiao; Sui-Han, Cui; Ricky, K. Y. Fu; Xiu-Bo, Tian; Paul, K. Chu; Feng, Pan
2016-07-01
No abstract available. Supported by the National Natural Science Foundation of China under Grant Nos 51301004 and U1330110, the Guangdong Innovative and Entrepreneurial Research Team Program under Grant No 2013N080, the Shenzhen Science and Technology Research Grant under Grant Nos JCYJ20140903102215536 and JCYJ20150828093127698, and the City University of Hong Kong Applied Research Grant under Grant No 9667104.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spangler, Lee; Cunningham, Alfred; Lageson, David
2011-03-31
ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO{sub 2}; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical, and hydrological investigations of CO{sub 2} storage; and investigation of innovative, bio-based mitigation strategies.
Computer Modeling and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pronskikh, V. S.
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
Leistedt, B.; Peiris, H. V.; Elsner, F.; ...
2016-10-17
Spatially-varying depth and characteristics of observing conditions, such as seeing, airmass, or sky background, are major sources of systematic uncertainties in modern galaxy survey analyses, in particular in deep multi-epoch surveys. We present a framework to extract and project these sources of systematics onto the sky, and apply it to the Dark Energy Survey (DES) to map the observing conditions of the Science Verification (SV) data. The resulting distributions and maps of sources of systematics are used in several analyses of DES SV to perform detailed null tests with the data, and also to incorporate systematics in survey simulations. We illustrate the complementarity of these two approaches by comparing the SV data with the BCC-UFig, a synthetic sky catalogue generated by forward-modelling of the DES SV images. We then analyse the BCC-UFig simulation to construct galaxy samples mimicking those used in SV galaxy clustering studies. We show that the spatially-varying survey depth imprinted in the observed galaxy densities and the redshift distributions of the SV data are successfully reproduced by the simulation and well-captured by the maps of observing conditions. The combined use of the maps, the SV data and the BCC-UFig simulation allows us to quantify the impact of spatial systematics on N(z), the redshift distributions inferred using photometric redshifts. We conclude that spatial systematics in the SV data are mainly due to seeing fluctuations and are under control in current clustering and weak lensing analyses. However, they will need to be carefully characterised in upcoming phases of DES in order to avoid biasing the inferred cosmological results. The framework presented is relevant to all multi-epoch surveys, and will be essential for exploiting future surveys such as the Large Synoptic Survey Telescope, which will require detailed null-tests and realistic end-to-end image simulations to correctly interpret the deep, high-cadence observations of the sky.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leistedt, B.; Peiris, H. V.; Elsner, F.
Spatially varying depth and the characteristics of observing conditions, such as seeing, airmass, or sky background, are major sources of systematic uncertainties in modern galaxy survey analyses, particularly in deep multi-epoch surveys. We present a framework to extract and project these sources of systematics onto the sky, and apply it to the Dark Energy Survey (DES) to map the observing conditions of the Science Verification (SV) data. The resulting distributions and maps of sources of systematics are used in several analyses of DES-SV to perform detailed null tests with the data, and also to incorporate systematics in survey simulations. We illustrate the complementary nature of these two approaches by comparing the SV data with BCC-UFig, a synthetic sky catalog generated by forward-modeling of the DES-SV images. We analyze the BCC-UFig simulation to construct galaxy samples mimicking those used in SV galaxy clustering studies. We show that the spatially varying survey depth imprinted in the observed galaxy densities and the redshift distributions of the SV data are successfully reproduced by the simulation and are well-captured by the maps of observing conditions. The combined use of the maps, the SV data, and the BCC-UFig simulation allows us to quantify the impact of spatial systematics on N(z), the redshift distributions inferred using photometric redshifts. We conclude that spatial systematics in the SV data are mainly due to seeing fluctuations and are under control in current clustering and weak-lensing analyses. However, they will need to be carefully characterized in upcoming phases of DES in order to avoid biasing the inferred cosmological results. The framework presented here is relevant to all multi-epoch surveys and will be essential for exploiting future surveys such as the Large Synoptic Survey Telescope, which will require detailed null tests and realistic end-to-end image simulations to correctly interpret the deep, high-cadence observations of the sky.
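One common way to build such maps of observing conditions, sketched below assuming a HEALPix pixelization via the healpy package, is to average a per-exposure quantity such as seeing over all exposures touching each pixel. The function name, resolution, and synthetic data are illustrative, not the DES pipeline.

```python
import numpy as np
import healpy as hp

def systematics_map(ra_deg, dec_deg, values, nside=64):
    """Project a per-exposure observing condition (e.g. seeing) onto
    the sky as the per-pixel mean over contributing exposures."""
    npix = hp.nside2npix(nside)
    pix = hp.ang2pix(nside, ra_deg, dec_deg, lonlat=True)
    summed = np.bincount(pix, weights=values, minlength=npix)
    counts = np.bincount(pix, minlength=npix)
    out = np.full(npix, hp.UNSEEN)
    covered = counts > 0
    out[covered] = summed[covered] / counts[covered]
    return out

rng = np.random.default_rng(3)
ra = rng.uniform(0.0, 60.0, 10_000)    # synthetic footprint
dec = rng.uniform(-60.0, -40.0, 10_000)
seeing = rng.normal(0.9, 0.1, 10_000)  # arcsec, made up
m = systematics_map(ra, dec, seeing)
print(int(np.sum(m != hp.UNSEEN)), "pixels covered")
```

Maps of this kind can then be cross-correlated with galaxy density or used as templates in null tests, which is how the spatially varying depth discussed above is diagnosed.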
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2013-07-01
The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). This proceedings volume contains over 250 full papers, with topics ranging from reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; and algorithms for advanced architectures; to validation, verification, and uncertainty quantification.
NASA Astrophysics Data System (ADS)
Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.
2009-04-01
The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the ability of today's models to forecast heavy precipitation and flood events in the Alpine region. To this end, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. Part of this system are 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verification against Swiss radar and rain gauge data, both from a qualitative point of view (in terms of alerts) and from a quantitative perspective (in terms of precipitation rate). Various influencing factors, such as lead time, accumulation time, selection of warning thresholds, and bias corrections, will be discussed. In addition to traditional verification of area-average precipitation amounts, the ability of the models to predict the correct precipitation statistics without requiring a point-to-point match will be assessed using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models compared to coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals remarkably high variability between different models, and makes it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.
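As a concrete example of the fuzzy verification techniques mentioned above, the sketch below computes the Fractions Skill Score, which compares neighbourhood exceedance fractions of forecast and observed precipitation fields instead of demanding a point-to-point match. The threshold, window size, and synthetic fields are arbitrary illustration choices, not the D-PHASE configuration.

```python
import numpy as np
from scipy.signal import convolve2d

def fractions_skill_score(fcst, obs, threshold, window):
    """Fractions Skill Score (a common 'fuzzy' verification measure):
    1 is a perfect neighbourhood match, 0 is no skill."""
    def fractions(field):
        binary = (field >= threshold).astype(float)
        kernel = np.ones((window, window)) / window**2
        return convolve2d(binary, kernel, mode="same")
    pf, po = fractions(fcst), fractions(obs)
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf**2) + np.mean(po**2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 2.0, size=(100, 100))   # synthetic rain field
fcst = np.roll(obs, 3, axis=0)               # displaced, else perfect
print(round(fractions_skill_score(fcst, obs, threshold=5.0, window=11), 3))
```

Because the displaced forecast still scores highly at an 11-point neighbourhood, the measure rewards getting the precipitation statistics right even when exact placement is off, which is exactly the property exploited here.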
NASA Technical Reports Server (NTRS)
Williams, David E.
2007-01-01
The International Space Station (ISS) Pressurized Mating Adapters (PMAs) Environmental Control and Life Support (ECLS) System is comprised of three subsystems: Atmosphere Control and Supply (ACS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). PMA 1 and PMA 2 flew to ISS on Flight 2A and PMA 3 flew to ISS on Flight 3A. This paper provides a summary of the PMAs ECLS design and the detailed Element Verification methodologies utilized during the Qualification phase for the PMAs.
International Space Station Temperature and Humidity Control Subsystem Verification for Node 1
NASA Technical Reports Server (NTRS)
Williams, David E.
2007-01-01
The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the nominal operation of the Node 1 THC subsystem design. The paper will also provide a discussion of the detailed Element Verification methodologies for nominal operation of the Node 1 THC subsystem operations utilized during the Qualification phase.
Tethered satellite system dynamics and control review panel and related activities, phase 3
NASA Technical Reports Server (NTRS)
1991-01-01
Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.
NASA Astrophysics Data System (ADS)
Arakelian, S.; Kucherik, A.; Kutrovskaya, S.; Osipov, A.; Istratov, A.; Skryabin, I.
2018-01-01
A clear physical model for verifying quantum states in nanocluster structures with jump/tunneling electrical conductivity is studied in both theory and experiment. The emphasis is on low-dimensional structures in which structural phase transitions occur and a tendency toward strongly enhanced electrical conductivity is observed. The results establish a basis for new physical principles for creating functional elements for optoelectronics and photonics in a hybrid setup (optics + electrophysics) using the nanocluster technology approach.
Space Weather Models and Their Validation and Verification at the CCMC
NASA Technical Reports Server (NTRS)
Hesse, Michael
2010-01-01
The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so that users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.
NASA Astrophysics Data System (ADS)
Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.
2006-03-01
The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system for tracking and verifying patients and staff in a clinical environment, using wireless and facial biometric technology to monitor and automatically identify them, was developed in order to streamline patient workflow, protect against erroneous examinations, and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology, based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as a first step for the development and implementation of this system.
Phase equilibria computations of multicomponent mixtures at specified internal energy and volume
NASA Astrophysics Data System (ADS)
Myint, Philip C.; Nichols, Albert L., III; Springer, H. Keo
2017-06-01
Hydrodynamic simulation codes for high-energy density science applications often use internal energy and volume as their working variables. As a result, the codes must determine the thermodynamic state that corresponds to the specified energy and volume by finding the global maximum in entropy. This task is referred to as the isoenergetic-isochoric flash. Solving it for multicomponent mixtures is difficult because one must find not only the temperature and pressure consistent with the energy and volume, but also the number of phases present and the composition of the phases. The few studies on isoenergetic-isochoric flash that currently exist all require the evaluation of many derivatives that can be tedious to implement. We present an alternative approach that is based on a derivative-free method: particle swarm optimization. The global entropy maximum is found by running several instances of particle swarm optimization over different sets of randomly selected points in the search space. For verification, we compare the predicted temperature and pressure to results from the related, but simpler problem of isothermal-isobaric flash. All of our examples involve the equation of state we have recently developed for multiphase mixtures of the energetic materials HMX, RDX, and TNT. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
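A bare-bones version of the derivative-free search is easy to write down. The sketch below maximizes a stand-in objective with particle swarm optimization over box bounds; the swarm parameters, bounds, and toy entropy surface are assumptions for illustration, and the approach described above additionally repeats such runs over different randomly selected point sets.

```python
import numpy as np

def pso_maximize(objective, bounds, n_particles=40, iters=200, seed=0):
    """Minimal particle swarm optimization of a derivative-free scalar
    objective (standing in for the mixture entropy) over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# Toy stand-in entropy surface with its maximum at T = 900, p = 1e5.
s = lambda z: -((z[0] - 900.0) / 100.0) ** 2 - ((z[1] - 1e5) / 1e4) ** 2
best, val = pso_maximize(s, bounds=[(300.0, 2000.0), (1e4, 1e7)])
print(np.round(best, 1), round(val, 4))
```

In the full flash problem the search space is higher-dimensional, covering phase amounts and compositions as well; the toy above only conveys the optimizer.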
NASA Technical Reports Server (NTRS)
Skinner, S. Ballou
1991-01-01
Chlorofluorocarbons (CFCs) in the atmosphere are believed to present a major environmental problem because they are able to interact with and deplete the ozone layer. NASA has been mandated to replace chlorinated solvents in precision cleaning, cleanliness verification, and degreasing of aerospace fluid systems hardware and ground support equipment. KSC has a CFC phase-out plan which provides for the elimination of over 90 percent of the CFC and halon use by 1995. The Materials Science Laboratory at KSC is evaluating four analytical methods for the determination of nonvolatile residue removal by water: (1) infrared analysis using attenuated total reflectance; (2) surface tension analysis; (3) total organic content analysis; and (4) turbidity analysis. This research project examined the ultrasonic-turbidity responses for 22 hydrocarbons in an effort to determine: (1) whether ultrasonics in heated water (70 C) will clean hydrocarbons (oils, greases, gels, and fluids) from aerospace hardware; (2) whether the cleaning process by ultrasonics will simultaneously emulsify the removed hydrocarbons in the water; and (3) whether a turbidimeter can be used successfully as an analytical instrument for quantifying the removal of hydrocarbons. Sixteen of the 22 hydrocarbons tested showed that ultrasonics would remove at least 90 percent of the contaminating hydrocarbon from the hardware in 10 minutes or less, giving a good ultrasonic-turbidity response. Six hydrocarbons had a lower percentage removal, a slower removal rate, and a marginal ultrasonic-turbidity response.
DOT National Transportation Integrated Search
1984-01-01
The study reported here addresses some of the earlier phases in the development of a pavement management system for the state of Virginia. Among the issues discussed are the development of an adequate data base and the implementation of a condition r...
First Cryo-Vacuum Test of the JWST Integrated Science Instrument Module
NASA Astrophysics Data System (ADS)
Kimble, Randy A.; Antonille, S. R.; Balzano, V.; Comber, B. J.; Davila, P. S.; Drury, M. D.; Glasse, A.; Glazer, S. D.; Lundquist, R.; Mann, S. D.; McGuffey, D. B.; Novo-Gradac, K. J.; Penanen, K.; Ramey, D. D.; Sullivan, J.; Van Campen, J.; Vila, M. B.
2014-01-01
The integration and test program for the Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope (JWST) calls for three cryo-vacuum tests of the ISIM hardware. The first is a risk-reduction test aimed at checking out the test hardware and procedures; this will be followed by two formal verification tests that will bracket other key aspects of the environmental test program (e.g. vibration and acoustics, EMI/EMC). The first of these cryo-vacuum tests, the risk-reduction test, was executed at NASA's Goddard Space Flight Center starting in late August, 2013. Flight hardware under test included two (of the eventual four) flight instruments, the Mid-Infrared Instrument (MIRI) and the Fine Guidance Sensor/Near-Infrared Imager and Slitless Spectrograph (FGS/NIRISS), mounted to the ISIM structure, as well as the ISIM Electronics Compartment (IEC). The instruments were cooled to their flight operating temperatures (~40K for FGS/NIRISS, ~6K for MIRI) and optically tested against a cryo-certified telescope simulator. Key goals for the risk-reduction test included: 1) demonstration of controlled cooldown and warmup, stable control at operating temperature, and measurement of heat loads, 2) operation of the science instruments with ISIM electronics systems at temperature, 3) health trending of the science instruments against instrument-level test results, 4) measurement of the pupil positions and six degree of freedom alignment of the science instruments against the simulated telescope focal surface, 5) detailed optical characterization of the NIRISS instrument, 6) verification of the signal-to-noise performance of the MIRI, and 7) exercise of the Onboard Script System that will be used to operate the instruments in flight. In addition, the execution of the test is expected to yield invaluable logistical experience - development and execution of procedures, communications, analysis of results - that will greatly benefit the subsequent verification tests. At the time of this submission, the hardware had reached operating temperature and was partway through the cryo test program. We report here on the test configuration, the overall process, and the results that were ultimately obtained.
Solar power satellite system definition study, phase 2.
NASA Technical Reports Server (NTRS)
1979-01-01
A program plan for the Solar Power Satellite Program is presented. The plan includes a research, development, and evaluation phase; an engineering development and cost verification phase; prototype construction; and commercialization. Cost estimates and task requirements are given for the following technology areas: (1) solar arrays; (2) thermal engines and thermal systems; (3) power transmission (to earth); (4) large space structures; (5) materials technology; (6) system control; (7) space construction; (8) space transportation; and (9) power distribution and space environment effects.
Technology verification phase. Dynamic isotope power system. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halsey, D.G.
1982-03-10
The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test, and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed, and the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique, and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)
DIESEL ENGINE RETROFIT TECHNOLOGY VERIFICATION
This presentation will be given at the EPA Science Forum 2005 in Washington, DC. According to recent estimates, there are approximately 7.9 million heavy-duty diesel trucks and buses in use in the United States. Emissions from these vehicles account for substantial portions of t...
Verification and Validation of NASA-Supported Enhancements to PECAD's Decision Support Tools
NASA Technical Reports Server (NTRS)
McKellip, Rodney; Ross, Kenton W.
2006-01-01
The NASA Applied Sciences Directorate (ASD), part of the Earth-Sun System Division of NASA's Science Mission Directorate, has partnered with the U.S. Department of Agriculture (USDA) to enhance decision support in the area of agricultural efficiency-an application of national importance. The ASD integrated the results of NASA Earth science research into USDA decision support tools employed by the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD), which supports national decision making by gathering, analyzing, and disseminating global crop intelligence. Verification and validation of the following enhancements are summarized: 1) Near-real-time Moderate Resolution Imaging Spectroradiometer (MODIS) products through PECAD's MODIS Image Gallery; 2) MODIS Normalized Difference Vegetation Index (NDVI) time series data through the USDA-FAS MODIS NDVI Database; and 3) Jason-1 and TOPEX/Poseidon lake level estimates through PECAD's Global Reservoir and Lake Monitor. Where possible, each enhanced product was characterized for accuracy, timeliness, and coverage, and the characterized performance was compared to PECAD operational requirements. The MODIS Image Gallery and the GRLM are more mature and have achieved a semi-operational status, whereas the USDA-FAS MODIS NDVI Database is still evolving and should be considered
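For reference, the NDVI served through the database above is a simple band ratio computed from near-infrared and red reflectances. The sketch below shows the computation; the sample band values are hypothetical, and the epsilon guard is an implementation convenience rather than part of the MODIS product definition.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index:
    NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical MODIS-like reflectances (band 2 = NIR, band 1 = red):
print(np.round(ndvi([0.45, 0.30], [0.08, 0.25]), 3))  # dense vs sparse vegetation
```

Time series of this index underlie the USDA-FAS MODIS NDVI Database characterized above.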
NASA Technical Reports Server (NTRS)
Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond;
2007-01-01
The James Webb Space Telescope (JWST) is a 6.6 m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approx. 40 K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI) including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1 x 2.2 x 1.9 m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5 m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean room conditions. SI performance, including focus, pupil shear and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans, and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.
NASA Technical Reports Server (NTRS)
Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.
1993-01-01
To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of each chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying them as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques that cannot be quantitatively assessed, qualitative judgments are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.
Automated solar panel assembly line
NASA Technical Reports Server (NTRS)
Somberg, H.
1981-01-01
The initial stage of the automated solar panel assembly line program was devoted to concept development and proof of approach through simple experimental verification. In this phase, laboratory bench models were built to demonstrate and verify concepts. Following this phase was machine design and integration of the various machine elements. The third phase was machine assembly and debugging. In this phase, the various elements were operated as a unit and modifications were made as required. The final stage of development was the demonstration of the equipment in a pilot production operation.
E-st@r-I experience: Valuable knowledge for improving the e-st@r-II design
NASA Astrophysics Data System (ADS)
Corpino, S.; Obiols-Rabasa, G.; Mozzillo, R.; Nichele, F.
2016-04-01
Many universities all over the world have now established hands-on education programs based on CubeSats. These small and cheap platforms are becoming more and more attractive also for other-than-educational missions, such as technology demonstration, science applications, and Earth observation. This new paradigm requires the development of adequate technology to increase CubeSat performance and mission reliability, because educationally driven missions have often failed. In 2013 the ESA Education Office launched the Fly Your Satellite! Programme, which aims at increasing CubeSat mission reliability through several actions: improving design implementation, defining best practices for conducting the verification process, and making the CubeSat community aware of the importance of verification. Within this framework, the CubeSat team at Politecnico di Torino developed the e-st@r-II CubeSat as a follow-on to the e-st@r-I satellite, launched in 2012 on the VEGA Maiden Flight. E-st@r-I and e-st@r-II are both 1U satellites with educational and technology demonstration objectives: to give hands-on experience to university students and to test an active attitude determination and control system based on inertial and magnetic measurements with magnetic actuation. This paper describes the know-how gained from the e-st@r-I mission, and how this heritage has been translated into improvements to the new CubeSat in several areas and lifecycle phases. The CubeSat design has been reviewed to reduce the complexity of the assembly procedure and to deal with possible failures of the on-board computer, for example by re-coding the software in the communications subsystem. New procedures have been designed and assessed for the verification campaign, according to ECSS rules and with the support of ESA specialists. Different operating modes have been implemented to handle some anomalies observed during the operations of the first satellite. A new version of the on-board software is one of the main modifications; in particular, the activation sequence has been modified to allow a stepwise switch-on of the satellite. In conclusion, the e-st@r-I experience provided valuable lessons during its development, verification, and on-orbit operations, and this know-how has become crucial for the development of the e-st@r-II CubeSat, as illustrated in this article.
NASA Astrophysics Data System (ADS)
Cohen, K. K.; Klara, S. M.; Srivastava, R. D.
2004-12-01
The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's program, carbon sequestration MM&V has numerous instrumental roles: measurement of a site's characteristics and capability for sequestration; monitoring of the site to ensure storage integrity; verification that the CO2 is safely stored; and protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and the frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below-ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role in MM&V. It is likely that successful MM&V programs will incorporate multiple technologies, including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; surface/near-surface methods such as soil gas monitoring and infrared sensors; and aerial and satellite imagery. This abstract will describe results, similarities, and contrasts for funded studies from the U.S. DOE's Carbon Sequestration Program, including examples from the Sleipner North Sea Project, the Canadian Weyburn Field/Dakota Gasification Plant Project, the Frio Formation Texas Project, and the Yolo County Bioreactor Landfill Project. The abstract will also address the following: How are the terms "measurement," "mitigation," and "verification" defined in the Program? What is the U.S. DOE's Carbon Sequestration Program Roadmap, and what are the Roadmap goals for MM&V? What is the current status of MM&V technologies?
Approaches to environmental verification of STS free-flier and pallet payloads
NASA Technical Reports Server (NTRS)
Keegan, W. B.
1982-01-01
This paper presents an overview of the environmental verification programs followed on an STS-launched free-flier payload, using the Tracking and Data Relay Satellite (TDRS) as an example, and a pallet payload, using the Office of Space Sciences-1 (OSS-1) as an example. Differences are assessed and rationale given as to why the differing programs were used on the two example payloads. It is concluded that the differences between the programs are due to inherent differences in the payload configuration, their respective mission performance objectives and their operational scenarios rather than to any generic distinctions that differentiate between a free-flier and a pallet payload.
NASA Technical Reports Server (NTRS)
Stehura, Aaron; Rozek, Matthew
2013-01-01
The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.
Random phase encoding for optical security
NASA Astrophysics Data System (ADS)
Wang, RuiKang K.; Watson, Ian A.; Chatwin, Christopher R.
1996-09-01
A new optical encoding method for security applications is proposed. The encoded image (encrypted into the security products) is merely a random phase image statistically and randomly generated by a random number generator using a computer, which contains no information from the reference pattern (stored for verification) or the frequency plane filter (a phase-only function for decoding). The phase function in the frequency plane is obtained using a modified phase retrieval algorithm. The proposed method uses two phase-only functions (images) at both the input and frequency planes of the optical processor leading to maximum optical efficiency. Computer simulation shows that the proposed method is robust for optical security applications.
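As a rough illustration of the scheme (not the authors' implementation), the sketch below uses a Gerchberg-Saxton-style iteration to retrieve a phase-only frequency-plane filter H such that a random phase-only input yields a stored reference pattern at the output of a 4f correlator; the pattern, array size, and iteration count are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
reference = np.zeros((N, N))
reference[24:40, 24:40] = 1.0                          # stored verification pattern (toy)
input_phase = np.exp(2j * np.pi * rng.random((N, N)))  # random phase-only input image

# Gerchberg-Saxton-style retrieval of a phase-only frequency-plane filter H
H = np.exp(2j * np.pi * rng.random((N, N)))
for _ in range(300):
    out = np.fft.ifft2(np.fft.fft2(input_phase) * H)
    out = reference * np.exp(1j * np.angle(out))       # impose the target amplitude
    H = np.exp(1j * (np.angle(np.fft.fft2(out)) - np.angle(np.fft.fft2(input_phase))))

recon = np.abs(np.fft.ifft2(np.fft.fft2(input_phase) * H))
print("correlation with reference:", np.corrcoef(recon.ravel(), reference.ravel())[0, 1])
```

Both the input and the filter remain phase-only, which is what gives the optical processor its high light efficiency.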
Verification and benchmark testing of the NUFT computer code
NASA Astrophysics Data System (ADS)
Lee, K. H.; Nitao, J. J.; Kulshrestha, A.
1993-10-01
This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasi-analytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
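Mesh convergence testing of this kind usually reports an observed order of accuracy computed from errors on successively refined grids. A minimal sketch of that calculation, with hypothetical error values:

```python
import numpy as np

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed convergence order p from errors on two grids refined by the given factor."""
    return np.log(err_coarse / err_fine) / np.log(refinement)

# hypothetical L2 errors against an analytical solution on successively halved grids
errors = [4.1e-2, 1.1e-2, 2.8e-3]
for e_c, e_f in zip(errors, errors[1:]):
    print(f"observed order ~ {observed_order(e_c, e_f):.2f}")
```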
NASA Astrophysics Data System (ADS)
Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.
2008-08-01
The technology verification satellite TET (Technologie ErprobungsTräger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support German space industry and research facilities in the on-orbit verification of satellite technologies. The TET satellite is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated satellite BIRD and a newly developed payload platform with a new payload handling system called NVS (Nutzlastversorgungssystem). The NVS comprises three major parts: the power supply, the processor boards, and the I/O interfaces. The NVS is realized via several PCBs in Europe format, which are connected to each other via an integrated backplane; the payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of this project was successfully finished last year.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
Accurate Biomass Estimation via Bayesian Adaptive Sampling
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay
2005-01-01
The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS land cover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.
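The abstract gives no algorithmic detail, but the Bayesian adaptive sampling named in (a) can be caricatured as: maintain a posterior over the unknown field and repeatedly sample where posterior uncertainty is largest. In the sketch below the kernel, noise level, and 'biomass' field are illustrative assumptions, not values from the work.

```python
import numpy as np

rng = np.random.default_rng(1)
xs = np.linspace(0, 1, 101)                      # candidate sampling sites
truth = np.sin(6 * xs) + 0.3 * xs                # hypothetical biomass field
kern = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / 0.1) ** 2)

X, y = [], []
for step in range(8):
    if X:
        K = kern(np.array(X), np.array(X)) + 1e-4 * np.eye(len(X))
        Ks = kern(xs, np.array(X))
        w = np.linalg.solve(K, Ks.T)
        mu = w.T @ np.array(y)                    # posterior mean estimate of the field
        var = 1.0 - np.einsum('ij,ji->i', Ks, w)  # posterior variance at each site
    else:
        mu, var = np.zeros_like(xs), np.ones_like(xs)
    i = int(np.argmax(var))                       # sample where uncertainty is largest
    X.append(xs[i])
    y.append(truth[i] + 0.05 * rng.standard_normal())
print("sampled sites:", np.round(sorted(X), 2))
```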
Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data
Chang, C.
2015-07-29
We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg2 from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We also find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arcmin smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. Finally, we summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.
Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data.
Chang, C; Vikram, V; Jain, B; Bacon, D; Amara, A; Becker, M R; Bernstein, G; Bonnett, C; Bridle, S; Brout, D; Busha, M; Frieman, J; Gaztanaga, E; Hartley, W; Jarvis, M; Kacprzak, T; Kovács, A; Lahav, O; Lin, H; Melchior, P; Peiris, H; Rozo, E; Rykoff, E; Sánchez, C; Sheldon, E; Troxel, M A; Wechsler, R; Zuntz, J; Abbott, T; Abdalla, F B; Allam, S; Annis, J; Bauer, A H; Benoit-Lévy, A; Brooks, D; Buckley-Geer, E; Burke, D L; Capozzi, D; Carnero Rosell, A; Carrasco Kind, M; Castander, F J; Crocce, M; D'Andrea, C B; Desai, S; Diehl, H T; Dietrich, J P; Doel, P; Eifler, T F; Evrard, A E; Fausti Neto, A; Flaugher, B; Fosalba, P; Gruen, D; Gruendl, R A; Gutierrez, G; Honscheid, K; James, D; Kent, S; Kuehn, K; Kuropatkin, N; Maia, M A G; March, M; Martini, P; Merritt, K W; Miller, C J; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Roodman, A; Sako, M; Sanchez, E; Sevilla, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Tarle, G; Thaler, J; Thomas, D; Tucker, D; Walker, A R
2015-07-31
We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg2 from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arc min smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematics uncertainties in the map. We summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.
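The mapping methodology is detailed in the companion paper; as context, a standard flat-sky Kaiser-Squires inversion (the textbook route from measured shear to a convergence, i.e. mass, map) can be sketched in a few lines. The grid size and round-trip self-test below are illustrative; in practice the maps are additionally smoothed (e.g. with a 20 arcmin Gaussian) and treated for masks and shape noise.

```python
import numpy as np

def kaiser_squires(g1, g2):
    """Flat-sky Kaiser-Squires inversion: convergence (kappa) from shear maps."""
    ny, nx = g1.shape
    k1 = np.fft.fftfreq(nx)[None, :]
    k2 = np.fft.fftfreq(ny)[:, None]
    k_sq = k1**2 + k2**2
    k_sq[0, 0] = 1.0                       # avoid division by zero at the k = 0 mode
    g_hat = np.fft.fft2(g1 + 1j * g2)
    kappa_hat = (k1**2 - k2**2 - 2j * k1 * k2) / k_sq * g_hat
    kappa_hat[0, 0] = 0.0                  # the mean convergence is unconstrained
    return np.fft.ifft2(kappa_hat).real

# round-trip self-test: make shear from a known kappa, then invert it back
ny = nx = 128
rng = np.random.default_rng(2)
kappa = rng.standard_normal((ny, nx))
k1 = np.fft.fftfreq(nx)[None, :]; k2 = np.fft.fftfreq(ny)[:, None]
k_sq = k1**2 + k2**2; k_sq[0, 0] = 1.0
g = np.fft.ifft2((k1**2 - k2**2 + 2j * k1 * k2) / k_sq * np.fft.fft2(kappa))
print(np.allclose(kaiser_squires(g.real, g.imag), kappa - kappa.mean(), atol=1e-8))
```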
RF model of the distribution system as a communication channel, phase 2. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
Program documentation concerning the design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial configured distribution feeders is presented in these appendices.
DOT National Transportation Integrated Search
2018-01-01
Connected vehicles (CVs) and their integration with transportation infrastructure provide new approaches to wrong-way driving (WWD) detection, warning, verification, and intervention that will help practitioners further reduce the occurrence and seve...
Spacecraft servicing demonstration plan
NASA Technical Reports Server (NTRS)
Bergonz, F. H.; Bulboaca, M. A.; Derocher, W. L., Jr.
1984-01-01
A preliminary spacecraft servicing demonstration plan is prepared which leads to a fully verified operational on-orbit servicing system based on module exchange, refueling, and resupply technologies. The resulting system can be applied at the space station, in low Earth orbit with an orbital maneuvering vehicle (OMV), or be carried with an OMV to geosynchronous orbit by an orbital transfer vehicle. The three-phase plan includes ground demonstrations, cargo bay demonstrations, and free flight verifications. The plan emphasizes the exchange of multimission modular spacecraft (MMS) modules, which involves space-repairable satellites. The three servicer mechanism configurations are the engineering test unit, a protoflight-quality unit, and two fully operational units that have been qualified and documented for use in free flight verification activity. The plan balances costs and risks by overlapping study phases, utilizing existing equipment for ground demonstrations, maximizing use of existing MMS equipment, and renting a spacecraft bus.
Verification and Validation of EnergyPlus Phase Change Material Model for Opaque Wall Assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.
2012-08-01
Phase change materials (PCMs) represent a technology that may reduce peak loads and HVAC energy consumption in buildings. A few building energy simulation programs have the capability to simulate PCMs, but their accuracy has not been completely tested. This study shows the procedure used to verify and validate the PCM model in EnergyPlus using an approach similar to that dictated by ASHRAE Standard 140, which consists of analytical verification, comparative testing, and empirical validation. This process was valuable, as two bugs were identified and fixed in the PCM model, and version 7.1 of EnergyPlus will have a validated PCM model. Preliminary results using whole-building energy analysis show that careful analysis should be done when designing PCMs in homes, as their thermal performance depends on several variables such as PCM properties and location in the building envelope.
NASA Technical Reports Server (NTRS)
Watson, Leela R.
2011-01-01
The 45th Weather Squadron Launch Weather Officers use 12-km resolution North American Mesoscale model (MesoNAM) forecasts to support launch weather operations. In Phase I, the performance of the model at KSC/CCAFS was measured objectively by conducting a detailed statistical analysis of model output compared to observed values. The objective analysis compared the MesoNAM forecast winds, temperature, and dew point to the observed values from the sensors in the KSC/CCAFS wind tower network. In Phase II, the AMU modified the current tool by adding an additional 15 months of model output to the database and recalculating the verification statistics. The bias, standard deviation of the bias, root mean square error, and a hypothesis test for bias were calculated to verify the performance of the model. The results indicated that accuracy decreased as the forecast progressed, that there was a diurnal signal in temperature with a cool bias during the late night and a warm bias during the afternoon, and that there was a diurnal signal in dew point temperature with a low bias during the afternoon and a high bias during the late night.
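A concrete restatement of the Phase II statistics (bias, standard deviation of the bias, RMSE, and a one-sample t-test that the mean bias is zero), with hypothetical forecast/observation pairs:

```python
import numpy as np
from scipy import stats

def verify(forecast, observed):
    """Bias, standard deviation of bias, RMSE, and p-value of a zero-mean-bias t-test."""
    err = np.asarray(forecast) - np.asarray(observed)
    bias = err.mean()
    sd = err.std(ddof=1)
    rmse = np.sqrt((err ** 2).mean())
    _, p = stats.ttest_1samp(err, 0.0)
    return bias, sd, rmse, p

fcst = [22.1, 23.4, 25.0, 24.2]   # hypothetical 2 m temperature forecasts (deg C)
obs = [21.5, 23.9, 24.1, 23.8]    # hypothetical tower observations (deg C)
print(verify(fcst, obs))
```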
Aerothermal modeling program, phase 2
NASA Technical Reports Server (NTRS)
Mongia, H. C.; Patankar, S. V.; Murthy, S. N. B.; Sullivan, J. P.; Samuelsen, G. S.
1985-01-01
The main objectives of the Aerothermal Modeling Program, Phase 2, are: to develop an improved numerical scheme for incorporation in a 3-D combustor flow model; to conduct a benchmark-quality experiment to study the interaction of a primary jet with a confined swirling crossflow and to assess current and advanced turbulence and scalar transport models; and to conduct an experimental evaluation of the air swirler interaction with fuel injectors, assessments of current two-phase models, and verification of the improved spray evaporation/dispersion models.
Life sciences laboratory breadboard simulations for shuttle
NASA Technical Reports Server (NTRS)
Taketa, S. T.; Simmonds, R. C.; Callahan, P. X.
1975-01-01
Breadboard simulations of life sciences laboratory concepts for conducting bioresearch in space were undertaken as part of the concept verification testing program. Breadboard simulations were conducted to test concepts of and scope problems associated with bioresearch support equipment and facility requirements and their operational integration for conducting manned research in earth orbital missions. It emphasized requirements, functions, and procedures for candidate research on crew members (simulated) and subhuman primates and on typical radioisotope studies in rats, a rooster, and plants.
An Exploratory Analysis of Economic Factors in the Navy Total Force Strength Model (NTFSM)
2015-12-01
NTFSM is still in the testing phase and its overall behavior is largely unknown. In particular, the analysts that NTFSM was designed to help are...
Lightweight Towed Howitzer Demonstrator. Phase 1 and Partial Phase 2. Volume A. Overview.
1987-04-01
FMC structural verification: beam stress calculations on the supporting trails allow 70 kpsi in a quasi-isotropic layup of graphite epoxy... addressed utilizing a damage-tolerant design criterion. Strength calculations are questionable because of the dry room-temperature values used. The...
NASA Technical Reports Server (NTRS)
Richardson, David
2018-01-01
Model-Based Systems Engineering (MBSE) is the formalized application of modeling to support system requirements, design, analysis, verification, and validation activities, beginning in the conceptual design phase and continuing throughout development and later life cycle phases. This presentation will discuss the value proposition that MBSE offers for systems engineering, and the associated culture change needed to adopt it.
Proton Therapy Dose Characterization and Verification
2016-10-01
...than recommended, as these patients are on a separate UPENN research study where the maximum accepted dose was 6700 cGy... All patients must have a signed Informed Consent Form and an on-study confirmation form... Phase 1 concentrated on designing and building a multi-leaf collimator for use in proton therapy. Phase 2 focused on studying the...
Landing System Development- Design and Test Prediction of a Lander Leg Using Nonlinear Analysis
NASA Astrophysics Data System (ADS)
Destefanis, Stefano; Buchwald, Robert; Pellegrino, Pasquale; Schroder, Silvio
2014-06-01
Several mission studies have been performed focusing on a soft and precision landing using landing legs. Examples of such missions are Mars Sample Return scenarios (MSR), lunar landing scenarios (MoonNEXT, Lunar Lander) and small body sample return studies (Marco Polo, MMSR, Phootprint). Such missions foresee a soft landing on the planet surface for delivering payload in a controlled manner and limiting the landing loads. To ensure a successful final landing phase, a landing system is needed, capable of absorbing the residual velocities (vertical, horizontal and angular) at touch-down, and ensuring a controlled attitude after landing. Such requirements can be fulfilled by using landing legs with adequate damping. The Landing System Development (LSD) study, currently in its phase 2, foresees the design, analysis, verification, manufacturing and testing of a representative landing leg breadboard based on the Phase B design of the ESA Lunar Lander. Drop tests of a single leg will be performed both on rigid and soft ground, at several impact angles. The activity is covered under ESA contract with TAS-I as prime contractor, responsible for analysis and verification, Astrium GmbH for design and test, and QinetiQ Space for manufacturing. Drop tests will be performed at the Institute of Space Systems of the German Aerospace Center (DLR-RY) in Bremen. This paper presents an overview of the analytical simulations (test predictions and design verification) performed, comparing the results produced by the Astrium-made multi-body model (rigid bodies, nonlinearities accounted for in mechanical joints and force definitions, based on development tests) and the TAS-I-made nonlinear explicit model (fully deformable bodies).
DIESEL ENGINE RETROFIT TECHNOLOGY VERIFICATION (POSTER)
ETV is presenting a poster at the EPA's 2005 Science Forum from May 16-18, 2005 in Washington, DC. This poster will contain a summary of the performance results realized by the six verified diesel retrofit technologies, as well as potential impacts that could be realized if sigi...
Reliability and Qualification of Hardware to Enhance the Mission Assurance of JPL/NASA Projects
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni
2010-01-01
Packaging Qualification and Verification (PQV) and life testing of advanced electronic packaging, mechanical assemblies (motors/actuators), and interconnect technologies (flip-chip), platinum temperature thermometer attachment processes, and various other types of hardware for Mars Exploration Rover (MER)/Mars Science Laboratory (MSL), and JUNO flight projects was performed to enhance the mission assurance. The qualification of hardware under extreme cold to hot temperatures was performed with reference to various project requirements. The flight like packages, assemblies, test coupons, and subassemblies were selected for the study to survive three times the total number of expected temperature cycles resulting from all environmental and operational exposures occurring over the life of the flight hardware including all relevant manufacturing, ground operations, and mission phases. Qualification/life testing was performed by subjecting flight-like qualification hardware to the environmental temperature extremes and assessing any structural failures, mechanical failures or degradation in electrical performance due to either overstress or thermal cycle fatigue. Experimental flight qualification test results will be described in this presentation.
An Overview of the JPSS Ground Project Algorithm Integration Process
NASA Astrophysics Data System (ADS)
Vicente, G. A.; Williams, R.; Dorman, T. J.; Williamson, R. C.; Shaw, F. J.; Thomas, W. M.; Hung, L.; Griffin, A.; Meade, P.; Steadley, R. S.; Cember, R. P.
2015-12-01
The smooth transition, implementation, and operationalization of scientific software from the National Oceanic and Atmospheric Administration (NOAA) development teams to the Joint Polar Satellite System (JPSS) Ground Segment requires a variety of experience and expertise. This task has been accomplished by a dedicated group of scientists and engineers working in close collaboration with the NOAA National Environmental Satellite, Data, and Information Service (NESDIS) Center for Satellite Applications and Research (STAR) science teams for the JPSS/Suomi National Polar-orbiting Partnership (S-NPP) Advanced Technology Microwave Sounder (ATMS), Cross-track Infrared Sounder (CrIS), Visible Infrared Imaging Radiometer Suite (VIIRS), and Ozone Mapping and Profiler Suite (OMPS) instruments. The purpose of this presentation is to describe the JPSS project process for algorithm implementation, from the very early delivery stages by the science teams to full operationalization in the Interface Data Processing Segment (IDPS), the processing system that provides Environmental Data Records (EDRs) to NOAA. Special focus is given to the NASA Data Products Engineering and Services (DPES) Algorithm Integration Team (AIT) functional and regression test activities. In the functional testing phase, the AIT uses one or a few specific chunks of data (granules), selected by the NOAA STAR Calibration and Validation (cal/val) teams, to demonstrate that a small change in the code performs properly and does not disrupt the rest of the algorithm chain. In the regression testing phase, the modified code is placed into the Government Resources for Algorithm Verification, Integration, Test and Evaluation (GRAVITE) Algorithm Development Area (ADA), a simulated and smaller version of the operational IDPS. Baseline files are swapped out, not edited, and the whole code package runs on one full orbit of Science Data Records (SDRs), using Calibration Look-Up Tables (Cal LUTs) for the time of the orbit. The purpose of the regression test is to identify unintended outcomes. Overall, the presentation provides a general and easy-to-follow overview of the JPSS Algorithm Change Process (ACP) and is intended to facilitate the audience's understanding of a very extensive and complex process.
Planning and Scheduling of Payloads of AstroSat During Initial and Normal Phase Observations
NASA Astrophysics Data System (ADS)
Pandiyan, R.; Subbarao, S. V.; Nagamani, T.; Rao, Chaitra; Rao, N. Hari Prasad; Joglekar, Harish; Kumar, Naresh; Dumpa, Surya Ratna Prakash; Chauhan, Anshu; Dakshayani, B. P.
2017-06-01
On 28 September 2015, India successfully launched its first astronomical space observatory, AstroSat. AstroSat carries five astronomy payloads, namely (i) the Cadmium Zinc Telluride Imager (CZTI), (ii) the Large Area X-ray Proportional Counter (LAXPC), (iii) the Soft X-ray Telescope (SXT), (iv) the Ultra Violet Imaging Telescope (UVIT), and (v) the Scanning Sky Monitor (SSM), and therefore has the capability to observe celestial objects at multiple wavelengths. Four of the payloads are co-aligned along the positive roll axis of the spacecraft, and the remaining one is placed along the positive yaw axis. All the payloads are sensitive to bright objects and, specifically, require keeping the bright Sun outside a safe zone around their bore axes in orbit. Further, there are other operational constraints, from both the spacecraft side and the payload side, which must be strictly enforced during operations. On-orbit spacecraft manoeuvres are constrained about two of the axes in order to keep the bright Sun outside this safe zone, and a special constrained-manoeuvre strategy is exercised during manoeuvres. The planning and scheduling of the payloads during the Performance Verification (PV) phase was carried out in a semi-autonomous/manual mode, and complete automation is exercised for normal-phase/Guaranteed Time Observation (GuTO) operations. The process was found to be labour-intensive, and several operational software tools, encompassing spacecraft subsystem, on-orbit, domain, and environmental constraints, were built and made to interact with the scheduling tool for appropriate decision-making and science scheduling. The procedural details of the complex scheduling of a multi-wavelength astronomy space observatory, and its working in the PV and normal/GuTO phases, are presented in this paper.
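A bright-Sun keep-out constraint of the kind described reduces to simple vector geometry; in the sketch below the 45 degree half-angle is an assumed placeholder, not the actual AstroSat safe-zone value.

```python
import numpy as np

def sun_safe(boresight, sun_vec, keep_out_deg=45.0):
    """True if the Sun lies outside the keep-out cone around a payload bore axis."""
    b = boresight / np.linalg.norm(boresight)
    s = sun_vec / np.linalg.norm(sun_vec)
    angle = np.degrees(np.arccos(np.clip(b @ s, -1.0, 1.0)))
    return angle > keep_out_deg

# payloads along the +roll axis, Sun off to the side: constraint satisfied?
print(sun_safe(np.array([1.0, 0.0, 0.0]), np.array([0.3, 0.9, 0.1])))
```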
Polarization-multiplexed plasmonic phase generation with distributed nanoslits.
Lee, Seung-Yeol; Kim, Kyuho; Lee, Gun-Yeal; Lee, Byoungho
2015-06-15
Methods for multiplexing surface plasmon polaritons (SPPs) have been attracting much attention due to their potential for plasmonic integrated systems, plasmonic holography, and optical tweezing. Here, using closely spaced distributed nanoslits, we propose a method for generating polarization-multiplexed SPP phase profiles which can be applied to implement general SPP phase distributions. Two independent types of SPP phase generation mechanisms, a polarization-independent one and a polarization-reversible one, are combined to generate fully arbitrary phase profiles for each optical handedness. As a simple verification of the proposed scheme, we experimentally demonstrate that the location of the plasmonic focus can be arbitrarily designed and switched by a change of optical handedness.
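Schematically, if phi_c is the polarization-independent term and phi_r the polarization-reversible term (its sign flips with handedness), any target pair (phi_L, phi_R) is met by phi_c = (phi_L + phi_R)/2 and phi_r = (phi_L - phi_R)/2. The sketch below only illustrates this decomposition; the slit geometry, wavenumber, and target profiles are toy assumptions, not the paper's design.

```python
import numpy as np

def slit_phases(phi_L, phi_R):
    """Split target SPP phases for left/right handedness into a polarization-
    independent term and a polarization-reversible term (schematic reading)."""
    phi_common = 0.5 * (phi_L + phi_R)    # same sign for both handednesses
    phi_reverse = 0.5 * (phi_L - phi_R)   # flips sign with handedness
    return phi_common, phi_reverse

x = np.linspace(-10e-6, 10e-6, 201)       # slit positions (m), illustrative
k_spp = 1.1e7                             # SPP wavenumber (rad/m), illustrative
phi_L = -0.5 * k_spp * x**2 / 5e-6        # toy focusing profile for one handedness
phi_R = k_spp * x * np.sin(0.2)           # toy tilted-launch profile for the other
pc, pr = slit_phases(phi_L, phi_R)
assert np.allclose(pc + pr, phi_L) and np.allclose(pc - pr, phi_R)
```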
New analytical solutions to the two-phase water faucet problem
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-06-17
Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, Ransom's solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at steady state, the anticipated second-order spatial accuracy could not be achieved when the results were compared to Ransom's existing analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at steady state, considering the effect of the gas phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom transient solutions for the gas phase velocity and pressure are derived, under the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom solutions are also presented.
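For reference, the widely quoted transient solution of the original Ransom faucet (massless gas, no wall or interfacial friction) can be evaluated directly. The sketch below restates that classical solution with the customary inlet conditions (10 m/s, liquid fraction 0.8); it is not the new steady-state solution set derived in this work.

```python
import numpy as np

def faucet(z, t, v0=10.0, alpha0=0.8, g=9.81):
    """Ransom water faucet transient solution: liquid velocity and liquid
    volume fraction at depth z below the inlet, at time t."""
    z = np.asarray(z, dtype=float)
    front = v0 * t + 0.5 * g * t * t          # position of the acceleration front
    v_in = np.sqrt(v0**2 + 2.0 * g * z)       # free-fall branch behind the front
    v = np.where(z <= front, v_in, v0 + g * t)
    alpha = np.where(z <= front, alpha0 * v0 / v_in, alpha0)  # mass conservation
    return v, alpha

v, a = faucet(np.linspace(0.0, 12.0, 5), t=0.4)
print(np.round(v, 3), np.round(a, 3))
```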
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F.
2011-12-01
Almost all natural phenomena on Earth are highly nonlinear, and even simplified descriptions of nature usually end up as nonlinear partial differential equations. The transport (ADR) equation is pivotal in the atmospheric sciences and in water quality modeling. Because this nonlinear equation must be solved numerically for practical purposes, academics and engineers rely heavily on numerical codes, and such codes require verification before they are used for applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is solved as described in the design document. CFD verification is not a straightforward, well-defined course: only a complete test suite can uncover all limitations and bugs, and results must be assessed to distinguish bug-induced defects from the innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure, and sometimes novel tricks work out. This study conveys a synopsis of the experience we gained during a comprehensive verification process for a transport solver. A test suite was designed, including unit tests and algorithmic tests, layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, a mesh convergence study, the main craft of verification, was performed; to that end, analytical solutions of the ADR equation were gathered and a new solution was derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Two bugs that were concealed during the mesh convergence study were then uncovered by the method of false injection and by visualization of the results. Symmetry had a dual function: one bug was hidden by the symmetric nature of a test (it was detected afterward using artificial false injection), while self-symmetry was used to design a new test in a case where the analytical solution of the ADR equation was unknown. Assisting subroutines were designed to check and post-process conservation of mass and oscillatory behavior. Finally, the capability of the solver was also checked for stiff reaction source terms. The test suite was not only a decent tool for error detection but also provided thorough feedback on the ADR solver's limitations; such information is the crux of rigorous numerical modeling for a modeler dealing with surface/subsurface pollution transport.
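Where the abstract mentions manufactured solutions, the idea is to postulate a solution field and derive the source term that makes the governing equation hold exactly, so any discretization can be tested against a known answer. A minimal symbolic sketch for a generic 1-D ADR equation c_t + u c_x = D c_xx - k c + S (the chosen field and coefficients are illustrative):

```python
import sympy as sp

x, t = sp.symbols('x t')
u, D, k = sp.symbols('u D k', positive=True)  # advection speed, diffusivity, decay rate
c = sp.exp(-t) * sp.sin(sp.pi * x)            # manufactured concentration field

# source term that forces c to satisfy c_t + u*c_x = D*c_xx - k*c + S exactly
S = sp.diff(c, t) + u * sp.diff(c, x) - D * sp.diff(c, x, 2) + k * c
print(sp.simplify(S))
```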
NASA Technical Reports Server (NTRS)
Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian
2000-01-01
This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings from exploring the correlation between the underlying models of the Advanced Risk Reduction Tool (ARRT) and how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.
Gravity Fields and Interiors of the Saturnian Satellites
NASA Technical Reports Server (NTRS)
Rappaport, N. J.; Armstrong, J. W.; Asmar, Sami W.; Iess, L.; Tortora, P.; Somenzi, L.; Zingoni, F.
2006-01-01
This viewgraph presentation reviews the Gravity Science Objectives and accomplishments of the Cassini Radio Science Team: (1) Mass and density of icy satellites (2) Quadrupole field of Titan and Rhea (3) Dynamic Love number of Titan (4) Moment of inertia of Titan (in collaboration with the Radar Team) (5) Gravity field of Saturn. The proposed measurements for the extended tour are: (1) Quadrupole field of Enceladus (2) More accurate measurement of Titan k2 (3) Local gravity/topography correlations for Iapetus (4) Verification/disproof of "Pioneer anomaly".
Earth Sciences annual report, 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younker, L.W.; Donohue, M.L.; Peterson, S.J.
1988-12-01
The Earth Sciences Department at Lawrence Livermore National Laboratory conducts work in support of the Laboratory's energy, defense, and research programs. The Department is organized into ten groups. Five of these -- Nuclear Waste Management, Fossil Energy, Containment, Verification, and Research -- represent major programmatic activities within the Department. Five others -- Experimental Geophysics, Geomechanics, Geology/Geological Engineering, Geochemistry, and Seismology/Applied Geophysics -- are major disciplinary areas that support these and other laboratory programs. This report summarizes work carried out in 1987 by each group and contains a bibliography of their 1987 publications.
2002-12-17
KENNEDY SPACE CENTER, FLA. -- Attached underneath the Orbital Sciences L-1011 aircraft is the Pegasus XL Expendable Launch Vehicle, which will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.
Phase noise measurements of the 400-kW, 2.115-GHz (S-band) transmitter
NASA Technical Reports Server (NTRS)
Boss, P.; Hoppe, D.; Bhanji, A.
1987-01-01
The measurement theory is described and a test method to perform phase noise verification using off-the-shelf components and instruments is presented. The measurement technique described consists of a double-balanced mixer used as phase detector, followed by a low noise amplifier. An FFT spectrum analyzer is then used to view the modulation components. A simple calibration procedure is outlined that ensures accurate measurements. A block diagram of the configuration is presented as well as actual phase noise data from the 400 kW, 2.115 GHz (S-band) klystron transmitter.
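With this setup the digitized mixer output is proportional to the phase fluctuation, and the single-sideband phase noise follows from its power spectral density as L(f) = 10*log10(S_phi(f)/2) under the small-angle approximation. A sketch with synthetic data (sample rate and noise level are assumptions):

```python
import numpy as np
from scipy.signal import welch

fs = 1.0e5                                   # sample rate of digitized mixer output (Hz)
rng = np.random.default_rng(0)
phi = 1e-4 * rng.standard_normal(2**18)      # hypothetical demodulated phase (rad)

f, S_phi = welch(phi, fs=fs, nperseg=2**12)  # one-sided phase PSD, rad^2/Hz
L = 10.0 * np.log10(S_phi / 2.0)             # SSB phase noise L(f), dBc/Hz
print(f"L({f[1]:.1f} Hz) = {L[1]:.1f} dBc/Hz")
```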
Interfering with the neutron spin
NASA Astrophysics Data System (ADS)
Wagh, Apoorva G.; Rakhecha, Veer Chand
2004-07-01
Charge neutrality, a spin of 1/2, and an associated magnetic moment of the neutron make it an ideal probe of quantal spinor evolutions. Polarized neutron interferometry in magnetic field Hamiltonians has thus scored several firsts, such as direct verification of Pauli anticommutation, experimental separation of geometric and dynamical phases, and observation of non-cyclic amplitudes and phases. This paper provides a flavour of the physics learnt from such experiments.
45 CFR 1705.5 - Disclosure of requested information to individuals.
Code of Federal Regulations, 2010 CFR
2010-10-01
... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual...
45 CFR 1705.5 - Disclosure of requested information to individuals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual...
45 CFR 1705.5 - Disclosure of requested information to individuals.
Code of Federal Regulations, 2013 CFR
2013-10-01
... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual...
45 CFR 1705.5 - Disclosure of requested information to individuals.
Code of Federal Regulations, 2012 CFR
2012-10-01
... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual...
45 CFR 1705.5 - Disclosure of requested information to individuals.
Code of Federal Regulations, 2014 CFR
2014-10-01
... COMMISSION ON LIBRARIES AND INFORMATION SCIENCE PRIVACY REGULATIONS § 1705.5 Disclosure of requested information to individuals. Upon verification of identity, the System Manager shall disclose to the individual...
NASA Technical Reports Server (NTRS)
Gavin, Thomas R.
2006-01-01
This viewgraph presentation reviews the many parts of the JPL mission planning process that the project manager has to work with. Some of them are: NASA & JPL's institutional requirements, the mission systems design requirements, the science interactions, the technical interactions, financial requirements, verification and validation, safety and mission assurance, and independent assessment, review and reporting.
Information Society: Agenda for Action in the UK.
ERIC Educational Resources Information Center
Phillips of Ellesmere, Lord
1997-01-01
Explains the House of Lords Select Committee on Science and Technology in the UK (United Kingdom) and discusses its report that addresses the need for information technology planning on a national basis. Topics include electronic publishing for access to government publications, universal access, regulatory framework, encryption and verification,…
Automatic extraction of numeric strings in unconstrained handwritten document images
NASA Astrophysics Data System (ADS)
Haji, M. Mehdi; Bui, Tien D.; Suen, Ching Y.
2011-01-01
Numeric strings such as identification numbers carry vital pieces of information in documents. In this paper, we present a novel algorithm for automatic extraction of numeric strings in unconstrained handwritten document images. The algorithm has two main phases: pruning and verification. In the pruning phase, the algorithm first performs a new segment-merge procedure on each text line, and then using a new regularity measure, it prunes all sequences of characters that are unlikely to be numeric strings. The segment-merge procedure is composed of two modules: a new explicit character segmentation algorithm which is based on analysis of skeletal graphs and a merging algorithm which is based on graph partitioning. All the candidate sequences that pass the pruning phase are sent to a recognition-based verification phase for the final decision. The recognition is based on a coarse-to-fine approach using probabilistic RBF networks. We developed our algorithm for the processing of real-world documents where letters and digits may be connected or broken in a document. The effectiveness of the proposed approach is shown by extensive experiments done on a real-world database of 607 documents which contains handwritten, machine-printed and mixed documents with different types of layouts and levels of noise.
Bai, Zhiliang; Chen, Shili; Jia, Lecheng; Zeng, Zhoumo
2018-01-01
Embracing the fact that one can recover certain signals and images from far fewer measurements than traditional methods use, compressive sensing (CS) provides solutions to the huge data-collection burden in phased-array-based material characterization. This article describes how a CS framework can be utilized to effectively compress ultrasonic phased array images in the time and frequency domains. By projecting the image onto its Discrete Cosine Transform domain, a novel scheme was implemented to verify the potential of CS for data reduction, as well as to explore its reconstruction accuracy. The results from CIVA simulations indicate that both time- and frequency-domain CS can accurately reconstruct array images using fewer samples than the minimum required by the Nyquist theorem. For experimental verification on three types of artificial flaws, although considerable data reduction can be achieved with defects clearly preserved, it is currently impossible to break the Nyquist limitation in the time domain. Fortunately, qualified recovery in the frequency domain makes it happen, representing a real breakthrough for phased array image reconstruction. As a case study, the proposed CS procedure is applied to the inspection of an engine cylinder cavity containing different pit defects, and the results show that orthogonal matching pursuit (OMP)-based CS guarantees the performance for real applications. PMID:29738452
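A minimal sketch of OMP recovery for a signal that is sparse in the DCT domain (signal length, measurement count, and sparsity level are illustrative; this is not the article's code):

```python
import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8
coeffs = np.zeros(n)
coeffs[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
signal = idct(coeffs, norm='ortho')               # signal is sparse in the DCT domain

Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random measurement matrix
A = Phi @ idct(np.eye(n), norm='ortho', axis=0)   # sensing matrix on DCT coefficients
y = Phi @ signal                                  # compressed measurements (m << n)

# orthogonal matching pursuit: greedily pick atoms, re-fit by least squares
support, r = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ sol

x_hat = np.zeros(n)
x_hat[support] = sol
print("relative error:", np.linalg.norm(x_hat - coeffs) / np.linalg.norm(coeffs))
```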
Cognitive Bias in Systems Verification
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge; studies are based on correlations, and strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.
Phase II NCAT test track results.
DOT National Transportation Integrated Search
2006-12-01
There is a need to be able to quickly test materials and mixtures in-place, under real traffic. : There have been many developments during the last few years that need verification prior to : adopting. One such development is the new proposed mechani...
NASA Astrophysics Data System (ADS)
Dartevelle, S.
2006-12-01
Large-scale volcanic eruptions are inherently hazardous events and cannot be described by detailed and accurate in situ measurements; as a result, explosive volcanic phenomenology is inadequately constrained in terms of initial and inflow conditions, and little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. Code verification and validation nevertheless remains a necessary step, particularly as volcanologists increasingly use numerical data for the mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data on the 'real world' physics. The verification step is rather simple to achieve formally, while, in the context of 'real world' explosive volcanism, validation is all but impossible. Hence, instead of validating a computer code against the whole, largely unconstrained volcanic phenomenology, we suggest focusing on the key physics that control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key phenomenologies separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase CFD FORTRAN codes recently redeveloped to meet the strict quality assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Dept. of Energy. GMFIX solves the Navier-Stokes and energy partial differential equations for each phase, with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the Mach disk, and the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes this code ideal for further investigations of equivalent volcanological phenomena. One of the most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We therefore also validated GMFIX by comparing velocity profiles and turbulence quantities against well-constrained analog experiments: the velocity profiles agree with the analog ones, as do those of the production of turbulent quantities. Overall, the verification and validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase supersonic flows and jets.
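For context on the Mach disk benchmark, a classic empirical correlation for single-phase underexpanded jets (often attributed to Crist and co-workers) places the first Mach disk roughly 0.67*sqrt(K) nozzle diameters downstream of the exit, where K is the stagnation-to-ambient pressure ratio; the particle-loading trend noted in the abstract has no comparably simple closed form, so the sketch covers the pressure-ratio dependence only.

```python
import numpy as np

def mach_disk_distance(K, D=1.0):
    """Empirical (Crist-type) estimate of the first Mach disk standoff distance
    for an underexpanded jet: x_m ~ 0.67 * D * sqrt(K)."""
    return 0.67 * D * np.sqrt(K)

for K in (5.0, 20.0, 80.0):
    print(f"K = {K:5.1f} -> Mach disk at ~{mach_disk_distance(K):.2f} nozzle diameters")
```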
Perceptual processing affects conceptual processing.
Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W
2008-04-05
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems.
Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Miller, Karen A.; Garner, James R.
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF 6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.
Multibody modeling and verification
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1989-01-01
A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research on modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.
Photometric redshift analysis in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Sánchez, C.; Carrasco Kind, M.; Lin, H.; Miquel, R.; Abdalla, F. B.; Amara, A.; Banerji, M.; Bonnett, C.; Brunner, R.; Capozzi, D.; Carnero, A.; Castander, F. J.; da Costa, L. A. N.; Cunha, C.; Fausti, A.; Gerdes, D.; Greisel, N.; Gschwend, J.; Hartley, W.; Jouvel, S.; Lahav, O.; Lima, M.; Maia, M. A. G.; Martí, P.; Ogando, R. L. C.; Ostrovski, F.; Pellegrini, P.; Rau, M. M.; Sadeh, I.; Seitz, S.; Sevilla-Noarbe, I.; Sypniewski, A.; de Vicente, J.; Abbot, T.; Allam, S. S.; Atlee, D.; Bernstein, G.; Bernstein, J. P.; Buckley-Geer, E.; Burke, D.; Childress, M. J.; Davis, T.; DePoy, D. L.; Dey, A.; Desai, S.; Diehl, H. T.; Doel, P.; Estrada, J.; Evrard, A.; Fernández, E.; Finley, D.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Honscheid, K.; Kim, A.; Kuehn, K.; Kuropatkin, N.; Lidman, C.; Makler, M.; Marshall, J. L.; Nichol, R. C.; Roodman, A.; Sánchez, E.; Santiago, B. X.; Sako, M.; Scalzo, R.; Smith, R. C.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Uddin, S. A.; Valdés, F.; Walker, A.; Yuan, F.; Zuntz, J.
2014-12-01
We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-z methods using, for instance, artificial neural networks or random forests, yield the best performance in the tests, achieving core photo-z resolutions σ68 ~ 0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, thereby providing an excellent precedent for future DES data sets.
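The core resolution metric quoted above can be reproduced in a few lines; the percentile-based definition of σ68 below is the common one in the photo-z literature, though the paper's exact convention may differ in detail.

```python
import numpy as np

def sigma_68(z_phot, z_spec):
    """Core photo-z resolution: half-width of the central 68% interval
    of dz = (z_phot - z_spec) / (1 + z_spec). Percentile-based definition;
    the DES analysis may use a slightly different convention."""
    dz = (np.asarray(z_phot) - np.asarray(z_spec)) / (1.0 + np.asarray(z_spec))
    p16, p84 = np.percentile(dz, [15.865, 84.135])
    return 0.5 * (p84 - p16)

# Toy check: Gaussian scatter of 0.08 in dz should return ~0.08.
rng = np.random.default_rng(0)
z_s = rng.uniform(0.2, 1.2, 15000)
z_p = z_s + 0.08 * (1 + z_s) * rng.standard_normal(z_s.size)
print(f"sigma_68 = {sigma_68(z_p, z_s):.3f}")
```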
The NASA Carbon Monitoring System
NASA Astrophysics Data System (ADS)
Hurtt, G. C.
2015-12-01
Greenhouse gas emission inventories, forest carbon sequestration programs (e.g., Reducing Emissions from Deforestation and Forest Degradation (REDD and REDD+)), cap-and-trade systems, self-reporting programs, and their associated monitoring, reporting and verification (MRV) frameworks depend upon data that are accurate, systematic, practical, and transparent. A sustained, observationally-driven carbon monitoring system using remote sensing data has the potential to significantly improve the relevant carbon cycle information base for the U.S. and world. Initiated in 2010, NASA's Carbon Monitoring System (CMS) project is prototyping and conducting pilot studies to evaluate technological approaches and methodologies to meet carbon monitoring and reporting requirements for multiple users and over multiple scales of interest. NASA's approach emphasizes exploitation of the satellite remote sensing resources, computational capabilities, scientific knowledge, airborne science capabilities, and end-to-end system expertise that are major strengths of the NASA Earth Science program. Through user engagement activities, the NASA CMS project is taking specific actions to be responsive to the needs of stakeholders working to improve carbon MRV frameworks. The first phase of NASA CMS projects focused on developing products for U.S. biomass/carbon stocks and global carbon fluxes, and on scoping studies to identify stakeholders and explore other potential carbon products. The second phase built upon these initial efforts, with a large expansion in prototyping activities across a diversity of systems, scales, and regions, including research focused on prototype MRV systems and utilization of COTS technologies. Priorities for the future include: 1) utilizing future satellite sensors, 2) prototyping with commercial off-the-shelf technology, 3) expanding the range of prototyping activities, 4) rigorous evaluation, uncertainty quantification, and error characterization, 5) stakeholder engagement, 6) partnerships with other U.S. agencies and international partners, and 7) modeling and data assimilation.
NASA Astrophysics Data System (ADS)
Jeffrey, N.; Abdalla, F. B.; Lahav, O.; Lanusse, F.; Starck, J.-L.; Leonard, A.; Kirk, D.; Chang, C.; Baxter, E.; Kacprzak, T.; Seitz, S.; Vikram, V.; Whiteway, L.; Abbott, T. M. C.; Allam, S.; Avila, S.; Bertin, E.; Brooks, D.; Rosell, A. Carnero; Kind, M. Carrasco; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Davis, C.; De Vicente, J.; Desai, S.; Doel, P.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; Hoyle, B.; James, D. J.; Jarvis, M.; Kuehn, K.; Lima, M.; Lin, H.; March, M.; Melchior, P.; Menanteau, F.; Miquel, R.; Plazas, A. A.; Reil, K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.
2018-05-01
Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion, not accounting for survey masks or noise. The Wiener filter is well-motivated for Gaussian density fields in a Bayesian framework. GLIMPSE uses sparsity, aiming to reconstruct non-linearities in the density field. We compare these methods with several tests using public Dark Energy Survey (DES) Science Verification (SV) data and realistic DES simulations. The Wiener filter and GLIMPSE offer substantial improvements over smoothed KS with a range of metrics. Both the Wiener filter and GLIMPSE convergence reconstructions show a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations (a standard data vector when inferring cosmology from peak statistics); the maximum signal-to-noise of these peak statistics is increased by a factor of 3.5 for the Wiener filter and 9 for GLIMPSE. With simulations we measure the reconstruction of the harmonic phases; the phase residuals' concentration is improved 17% by GLIMPSE and 18% by the Wiener filter. The correlation between reconstructions from data and foreground redMaPPer clusters is increased 18% by the Wiener filter and 32% by GLIMPSE.
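As a reference for the direct-inversion baseline discussed above, a minimal flat-sky Kaiser-Squires implementation follows; it deliberately ignores masks and noise, which is precisely the limitation motivating the Wiener filter and GLIMPSE comparisons. It is a sketch of the standard Fourier-space relation, not the paper's pipeline.

```python
import numpy as np

def kaiser_squires(gamma1, gamma2):
    """Flat-sky Kaiser-Squires inversion of shear maps to convergence.

    Fourier-space relation:
        kappa_hat = ((l1^2 - l2^2)*g1_hat + 2*l1*l2*g2_hat) / l^2
    No treatment of masks, noise, or curved-sky geometry.
    """
    g1, g2 = np.fft.fft2(gamma1), np.fft.fft2(gamma2)
    n1, n2 = gamma1.shape
    l1 = np.fft.fftfreq(n1)[:, None]
    l2 = np.fft.fftfreq(n2)[None, :]
    l_sq = l1**2 + l2**2
    l_sq[0, 0] = 1.0                     # avoid division by zero at the mean mode
    kappa = ((l1**2 - l2**2) * g1 + 2.0 * l1 * l2 * g2) / l_sq
    kappa[0, 0] = 0.0                    # mean convergence is unconstrained
    return np.real(np.fft.ifft2(kappa))  # E-mode map (imaginary part holds B modes)
```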
NASA Astrophysics Data System (ADS)
Malphrus, Benjamin Kevin
1990-01-01
The purpose of this study is to examine the sequence of events that led to the establishment of the NRAO, the construction and development of instrumentation, and the contributions and discovery events, and to relate the significance of these events to the evolution of the sciences of radio astronomy and cosmology. After an overview of the resources, a brief discussion of the early days of the science is given to set the stage for an examination of the events that led to the establishment of the NRAO. The developmental and construction phases of the major instruments, including the 85-foot Tatel telescope, the 300-foot telescope, the 140-foot telescope, and the Green Bank Interferometer, are examined. The technical evolution of these instruments is traced and their relevance to scientific programs and discovery events is discussed. The history is told in a narrative format that is interspersed with technical and scientific explanations. Through the use of original data, technical and scientific information of historical concern is provided to elucidate major developments and events. An interpretive discussion of selected programs, events, and technological developments that epitomize the contributions of the NRAO to the science of radio astronomy is provided. Scientific programs conducted with the NRAO instruments that were significant to galactic and extragalactic astronomy are presented. NRAO research programs presented include continuum and source surveys, mapping, a high-precision verification of general relativity, and SETI programs. Cosmic phenomena investigated in these programs include galactic and extragalactic HI and HII, emission nebulae, supernova remnants, cosmic masers, giant molecular clouds, radio stars, normal and radio galaxies, and quasars. Modern NRAO instruments, including the VLA and VLBA, and their scientific programs are presented in the final chapter, as well as plans for future NRAO instruments such as the GBT.
A New Virtual and Remote Experimental Environment for Teaching and Learning Science
NASA Astrophysics Data System (ADS)
Lustigova, Zdena; Lustig, Frantisek
This paper describes how a scientifically exact, problem-solving-oriented remote and virtual science experimental environment might help build a new strategy for science education. The main features are: remote observation and control of real-world phenomena, their processing and evaluation, and verification of hypotheses, combined with the development of critical thinking; sophisticated tools for searching, classifying, and storing relevant information; and a collaborative environment supporting argumentative writing and teamwork, public presentations, and defense of achieved results, all either in real presence, in telepresence, or in a combination of both. Only then can a real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL - Remote and Open Laboratory) has been developed and used by Charles University in Prague since 1996; it has been offered to science students in both formal and informal learning and, since 2003, to science teachers within their professional development studies.
Space Shuttle External Tank Project status
NASA Technical Reports Server (NTRS)
Davis, R. M.
1980-01-01
The External Tank Project is reviewed with emphasis on the DDT&E and production phases and the lightweight tank development. It is noted that the DDT&E phase is progressing well, with the structural and ground vibration test article programs complete, the propulsion test article program progressing well, and the component qualification and verification testing 92% complete. New tools and facilities are being brought on line to support the increased build rate for the production phase. The lightweight tank, which will provide additional payload to orbit, is progressing on schedule, with first delivery in early 1982.
4MOST systems engineering: from conceptual design to preliminary design review
NASA Astrophysics Data System (ADS)
Bellido-Tirado, Olga; Frey, Steffen; Barden, Samuel C.; Brynnel, Joar; Giannone, Domenico; Haynes, Roger; de Jong, Roelof S.; Phillips, Daniel; Schnurr, Olivier; Walcher, Jakob; Winkler, Roland
2016-08-01
The 4MOST Facility is a high-multiplex, wide-field, fibre-fed spectrograph system for the ESO VISTA telescope. It aims to create a world-class spectroscopic survey facility unique in its combination of wide-field multiplex, spectral resolution, spectral coverage, and sensitivity. At the end of 2014, after a successful concept optimization design phase, 4MOST entered its Preliminary Design Phase. Here we present the process and tools adopted during the Preliminary Design Phase to define the subsystem specifications, coordinate the interface control documents, and draft the system verification procedures.
AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF FOUR MERCURY EMISSION SAMPLING SYSTEMS
Tekran Instrument Corp.'s Series 3300 and Thermo Electron's Mercury Freedom System Continuous Emission Monitors (CEMs) for mercury are designed to determine total and/or chemically speciated vapor-phase mercury in combustion emissions. Performance for mercury CEMs is cont...
Evaluating shallow-flow rock structures as scour countermeasures at bridges.
DOT National Transportation Integrated Search
2009-12-01
This study determined whether shallow-flow rock structures could reliably be used at bridge abutments in place of riprap. Research was conducted in a two-phase effort, beginning with numerical modeling and ending with field verification of model...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-26
... from nearly all of the parties in this proceeding. All of these parties raised issues of first impression that were not addressed in the initial phase of this proceeding. The Office is studying these new...
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to report the state of the practice in Verification and Validation (V and V) of Expert Systems (ESs) in current NASA and industry applications. This is the first task of a series whose ultimate purpose is to ensure that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state of the practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of Expert Systems.
Satellite Power System (SPS) concept definition study (exhibit C)
NASA Technical Reports Server (NTRS)
Haley, G. M.
1979-01-01
The major outputs of the study are the constructability studies which resulted in the definition of the concepts for satellite, rectenna, and satellite construction base construction. Transportation analyses resulted in definition of heavy-lift launch vehicle, electric orbit transfer vehicle, personnel orbit transfer vehicle, and intra-orbit transfer vehicle as well as overall operations related to transportation systems. The experiment/verification program definition resulted in the definition of elements for the Ground-Based Experimental Research and Key Technology plans. These studies also resulted in conceptual approaches for early space technology verification. The cost analysis defined the overall program and cost data for all program elements and phases.
The PLAID graphics analysis impact on the space program
NASA Technical Reports Server (NTRS)
Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.
1994-01-01
An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of the project might depend on the complete verification of a particular stage. Currently, several software packages at JSC provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize ongoing studies in kinematics, lighting, and EVA activities, and we discuss various applications in the mission planning of current Space Shuttle flights and the assembly sequence of Space Station Freedom, with emphasis on the redesign effort.
Test and training simulator for ground-based teleoperated in-orbit servicing
NASA Technical Reports Server (NTRS)
Schaefer, Bernd E.
1989-01-01
For the post-IOC (In-Orbit Construction) phase of COLUMBUS, it is intended to use robotic devices for the routine operations of ground-based teleoperated in-orbit servicing. A hardware simulator for verification of the relevant in-orbit operations technologies, the Servicing Test Facility, is necessary; it will mainly support the Flight Control Center for the Manned Space Laboratories in operation-specific tasks such as system simulation, training of teleoperators, parallel simulation run simultaneously with actual in-orbit activities, and verification of the ground operations segment for telerobotics. The present status of the definition of the facility's functional and operational concept is described.
Cryo-Vacuum Testing of the Integrated Science Instrument Module for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Kimble, Randy A.; Davila, P. S.; Drury, M. P.; Glazer, S. D.; Krom, J. R.; Lundquist, R. A.; Mann, S. D.; McGuffey, D. B.; Perry, R. L.; Ramey, D. D.
2011-01-01
With delivery of the science instruments for the James Webb Space Telescope (JWST) to Goddard Space Flight Center (GSFC) expected in 2012, current plans call for the first cryo-vacuum test of the Integrated Science Instrument Module (ISIM) to be carried out at GSFC in early 2013. Plans are well underway for conducting this ambitious test, which will perform critical verifications of a number of optical, thermal, and operational requirements of the ISIM hardware at its deep cryogenic operating temperature. We describe here the facilities, goals, methods, and timeline for this important Integration & Test milestone in the JWST program.
Ames Research Center life sciences payload
NASA Technical Reports Server (NTRS)
Callahan, P. X.; Tremor, J. W.
1982-01-01
In response to a recognized need for an in-flight animal housing facility to support Spacelab life sciences investigators, a rack- and system-compatible Research Animal Holding Facility (RAHF) has been developed. A series of ground tests is planned to ensure its satisfactory performance under certain simulated conditions of flight exposure and use. However, even under the best conditions of simulation, confidence gained in ground testing will not approach that resulting from actual spaceflight operation. Spacelab Mission 3 provides an opportunity to perform an in-flight Verification Test (VT) of the RAHF. Lessons learned from the RAHF-VT and baseline performance data will be invaluable in preparation for subsequent dedicated life sciences missions.
System engineering of the Atacama Large Millimeter/submillimeter Array
NASA Astrophysics Data System (ADS)
Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel
2012-09-01
The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high-precision antennas located at 5000 meters altitude in northern Chile. This paper presents the methodology, tools, and processes adopted to system engineer a project of high technical complexity, carried out by system engineering teams that are remotely located and drawn from different cultures, on a demanding schedule and within tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation, and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed tools to analyze the system performance, incorporating the key parameters that contribute to the ultimate performance, modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptance of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads troubleshooting efforts during the testing phases of the construction project. Finally, the team is conducting system-level verification and diagnostics activities to assess the overall performance of the observatory. This paper also shares lessons learned from these system engineering and verification approaches.
Radiometric, geometric, and image quality assessment of ALOS AVNIR-2 and PRISM sensors
Saunier, S.; Goryl, P.; Chander, G.; Santer, R.; Bouvet, M.; Collet, B.; Mambimba, A.; Kocaman, Aksakal S.
2010-01-01
The Advanced Land Observing Satellite (ALOS) was launched on January 24, 2006, by a Japan Aerospace Exploration Agency (JAXA) H-IIA launcher. It carries three remote-sensing sensors: 1) the Advanced Visible and Near-Infrared Radiometer type 2 (AVNIR-2); 2) the Panchromatic Remote-Sensing Instrument for Stereo Mapping (PRISM); and 3) the Phased-Array type L-band Synthetic Aperture Radar (PALSAR). Within the framework of the ALOS Data European Node, the European Space Research Institute of the European Space Agency (ESA) worked alongside JAXA to contribute to the ALOS commissioning phase plan. This paper summarizes the strategy adopted by ESA to define and implement a data verification plan for missions operated by external agencies; these missions are classified by ESA as third-party missions. ESA was supported in the design and execution of this plan by GAEL Consultant. The verification of ALOS optical data from the PRISM and AVNIR-2 sensors was initiated 4 months after satellite launch, and a team of principal investigators was assembled to provide technical expertise. This paper includes a description of the verification plan and summarizes the methodologies used for radiometric, geometric, and image quality assessment. The successful completion of the commissioning phase has led to the sensors being declared fit for operations. The consolidated measurements indicate that the radiometric calibration of the AVNIR-2 sensor is stable and agrees with the Landsat-7 Enhanced Thematic Mapper Plus and the Envisat MEdium-Resolution Imaging Spectrometer calibration. The geometrical accuracy of PRISM and AVNIR-2 products improved significantly and remains under control. The PRISM modulation transfer function is monitored for improved characterization.
A System for Mailpiece ZIP Code Assignment through Contextual Analysis. Phase 2
1991-03-01
Keywords: segmentation; address block interpretation; automatic feature generation; word recognition; feature detection; word verification; optical character recognition; directory... Motivation: The United States Postal Service (USPS) deploys large numbers of optical character recognition (OCR) machines...
Verification of target motion effects on SAR imagery using the Gotcha GMTI challenge dataset
NASA Astrophysics Data System (ADS)
Hack, Dan E.; Saville, Michael A.
2010-04-01
This paper investigates the relationship between a ground moving target's kinematic state and its SAR image. While effects such as cross-range offset, defocus, and smearing appear well understood, their derivations in the literature typically employ simplifications of the radar/target geometry and assume point scattering targets. This study adopts a geometrical model for understanding target motion effects in SAR imagery, termed the target migration path, and focuses on experimental verification of predicted motion effects using both simulated and empirical datasets based on the Gotcha GMTI challenge dataset. Specifically, moving target imagery is generated from three data sources: first, simulated phase history for a moving point target; second, simulated phase history for a moving vehicle derived from a simulated Mazda MPV X-band signature; and third, empirical phase history from the Gotcha GMTI challenge dataset. Both simulated target trajectories match the truth GPS target position history from the Gotcha GMTI challenge dataset, allowing direct comparison between all three imagery sets and the predicted target migration path. This paper concludes with a discussion of the parallels between the target migration path and the measurement model within a Kalman filtering framework, followed by conclusions.
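The best known of these motion effects, the cross-range (azimuth) offset, follows from a simple first-order relation; the sketch below uses the standard point-scatterer, broadside approximation that the paper notes is typical of the literature, with illustrative numbers.

```python
def azimuth_offset(v_radial, slant_range, v_platform):
    """First-order azimuth (cross-range) displacement of a moving target
    in a SAR image: dx ~ (v_r / v_p) * R. Point-scatterer, broadside
    approximation of the kind the paper says the literature relies on."""
    return (v_radial / v_platform) * slant_range

# Example: 5 m/s radial velocity, 10 km slant range, 100 m/s platform speed.
print(f"offset = {azimuth_offset(5.0, 10e3, 100.0):.0f} m")  # -> 500 m
```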
From Cookbook to Collaborative: Transforming a University Biology Laboratory Course
ERIC Educational Resources Information Center
Herron, Sherry S.
2009-01-01
As described in "How People Learn," "Developing Biological Literacy," and by the Commission on Undergraduate Education in the Biological Sciences during the 1960s and early 1970s, laboratories should promote guided-inquiries or investigations, and not simply consist of cookbook or verification activities. However, the only word that could describe…
A Software Hub for High Assurance Model-Driven Development and Analysis
2007-01-23
WOSMIP II- Workshop on Signatures of Medical and Industrial Isotope Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Murray; Achim, Pascal; Auer, M.
2011-11-01
Medical and industrial radioisotopes are fundamental tools used in science, medicine, and industry, with an ever-expanding usage in medical practice, where their availability is vital. Very sensitive environmental radionuclide monitoring networks have been developed for nuclear-security-related monitoring [particularly Comprehensive Test-Ban-Treaty (CTBT) compliance verification] and are now operational.
Weak lensing magnification in the Dark Energy Survey Science Verification Data
Garcia-Fernandez, M.; et al.
2018-02-02
In this paper the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using data from the Dark Energy Survey Science Verification dataset. The analysis is carried out for two photometrically selected galaxy samples, with mean photometric redshifts in the $0.2 < z < 0.4$ and $0.7 < z < 1.0$ ranges, in the riz bands. A signal is detected with a $3.5\sigma$ significance level in each of the bands tested, and is compatible with the magnification predicted by the $\Lambda$CDM model. After an extensive analysis, it cannot be attributed to any known systematic effect. The detection of the magnification signal is robust to estimated uncertainties in the outlier rate of the photometric redshifts, but this will be an important issue for the use of photometric redshifts in magnification measurements from larger samples. In addition to the detection of the magnification signal, a method to select the sample with the maximum signal-to-noise is proposed and validated with data.
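The underlying magnification-bias relation being tested is compact enough to state in code; the sketch below uses the textbook number-count relation, not the paper's full cross-correlation estimator.

```python
def magnified_counts(n0, mu, s):
    """Weak-lensing magnification bias on cumulative number counts:
    n_obs = n0 * mu**(2.5*s - 1), where s = d log10 N(<m) / dm is the
    slope of the counts at the flux limit. For s > 0.4 counts are
    enhanced, for s < 0.4 depleted. Textbook relation only."""
    return n0 * mu**(2.5 * s - 1.0)

# A 2% magnification changes the counts differently depending on the slope.
for s in (0.2, 0.4, 0.8):
    print(f"s={s:.1f}: n_obs/n0 = {magnified_counts(1.0, 1.02, s):.4f}")
```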
Commissioning and Science Verification of JAST/T80
NASA Astrophysics Data System (ADS)
Ederoclte, A.; Cenarro, A. J.; Marín-Franch, A.; Cristóbal-Hornillos, D.; Vázquez Ramió, H.; Varela, J.; Hurier, G.; Moles, M.; Lamadrid, J. L.; Díaz-Martín, M. C.; Iglesias Marzoa, R.; Tilve, V.; Rodríguez, S.; Maícas, N.; Abri, J.
2017-03-01
Located at the Observatorio Astrofísico de Javalambre, the "Javalambre Auxiliary Survey Telescope" is an 80 cm telescope with an unvignetted 2 square degree field of view. The telescope is equipped with T80Cam, a camera with a large-format CCD and two filter wheels which can host, at any given time, 12 filters. The telescope has been designed to provide optical quality across the full field of view, which is achieved with a field corrector. In this talk, I review the commissioning of the telescope. The optical performance in the centre of the field of view has been tested with the lucky imaging technique, yielding a telescope PSF of 0.4'', close to the one expected from theory. Moreover, the tracking of the telescope does not affect the image quality: stars appear round even in exposures of 10 minutes obtained without guiding. Most importantly, we present preliminary results of science verification observations which combine the two main characteristics of this telescope: the large field of view and the special filter set.
Experimental Verification of the Theory of Oscillating Airfoils
NASA Technical Reports Server (NTRS)
Silverstein, Abe; Joyner, Upshur T
1939-01-01
Measurements have been made of the lift on an airfoil in pitching oscillation with a continuous-recording, instantaneous-force balance. The experimental values for the phase difference between the angle of attack and the lift are shown to be in close agreement with theory.
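The theory referred to here is classical unsteady thin-airfoil theory, in which the phase of the circulatory lift relative to the pitching motion is set by Theodorsen's function; assuming that identification, a minimal evaluation of C(k) is sketched below.

```python
import numpy as np
from scipy.special import hankel2

def theodorsen(k):
    """Theodorsen's function C(k) = H1(k) / (H1(k) + i*H0(k)), with H the
    Hankel functions of the second kind, for reduced frequency
    k = omega*b/U. Its argument gives the phase lag of the circulatory
    lift relative to the quasi-steady value for a pitching airfoil."""
    h0, h1 = hankel2(0, k), hankel2(1, k)
    return h1 / (h1 + 1j * h0)

for k in (0.05, 0.2, 0.5):
    c = theodorsen(k)
    print(f"k={k:.2f}: |C|={abs(c):.3f}, phase={np.degrees(np.angle(c)):.1f} deg")
```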
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, Ransom's solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when compared to the existing Ransom's analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the effect of the gas phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom's transient solutions for the gas phase velocity and pressure are derived, with the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom's solutions are also presented.
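For readers wanting to reproduce the benchmark, the widely used form of Ransom's original transient solution (massless gas, no friction) is sketched below with the customary parameters (10 m/s inlet velocity, 0.8 inlet liquid fraction, 12 m pipe); the paper's new steady-state and extended solutions are not reproduced here.

```python
import numpy as np

# Ransom's classic transient solution for the water faucet problem,
# under the stated simplifications. Customary parameters:
G = 9.81          # gravity, m/s^2
V0 = 10.0         # initial/inlet liquid velocity, m/s
ALPHA_L0 = 0.8    # initial/inlet liquid volume fraction

def liquid_velocity(x, t):
    """Free-fall velocity profile below the expansion front x_f(t)."""
    front = V0 * t + 0.5 * G * t**2
    return np.where(x <= front, np.sqrt(V0**2 + 2.0 * G * x), V0 + G * t)

def void_fraction(x, t):
    """Gas volume fraction from liquid mass conservation."""
    front = V0 * t + 0.5 * G * t**2
    alpha_l = np.where(x <= front,
                       ALPHA_L0 * V0 / np.sqrt(V0**2 + 2.0 * G * x),
                       ALPHA_L0)
    return 1.0 - alpha_l

x = np.linspace(0.0, 12.0, 5)   # 12 m pipe, as in the usual setup
print(void_fraction(x, 0.4))
```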
NASA Astrophysics Data System (ADS)
1999-12-01
The early orbit phase came to an end on 16 December after XMM had been manoeuvred to its final orbit. This required four firings of its thrusters, on successive passages at apogee, in order to increase XMM's velocity, thus elongating its orbit and raising the perigee from 826 km to 7,365 km. One burn was then made to fine-tune the apogee to around 114,000 km. The spacecraft, being tracked by ground stations in Perth, Kourou and Villafranca, is now circling the Earth in this highly elliptical orbit once every 48 hours. The XMM flight operations staff have found themselves controlling a spacecraft that responds exceptionally well. During these first orbits, the satellite has been oriented several times with razor-sharp precision. On-board systems have responded without incident to several thousand instructions sent by controllers. "XMM is flying so beautifully" says Dietmar Heger, XMM Spacecraft Operations Manager. "The satellite is behaving better in space than all our pre-launch simulations and we have been able to adjust our shifts to this more relaxed situation". On his return from French Guiana, Robert Lainé, XMM Project Manager, immediately visited the Darmstadt Mission Control Centre at ESOC. "The perfect behaviour of XMM at this early stage reflects the constructive cooperation of European industrial companies and top scientists. Spacecraft operations are in the hands of professionals who will endeavour to fulfill the expectations of the astronomers and astrophysicists of the world. I am very happy that ESA could provide them with such a wonderful precision tool". During the early orbit phase, controllers activated part of XMM's science payload. The three EPIC X-ray cameras have been switched on and vented. On 17 December the telescope doors were opened, allowing the spacecraft's golden X-ray Multi Mirror modules to see the sky. The Optical Monitor telescope door was opened on 18 December. During this last weekend, XMM's Radiation Monitor, which records the flux of cosmic particles and radiation, was switched on. Mission controllers have now placed XMM in a quiescent mode for the Christmas and New Year period. Full operations will resume on 4 January with the start of the spacecraft commissioning phase, due to last until 15 February. ESA's XMM Science Operations Centre at Villafranca will be brought online in early January, allowing the start of the exhaustive calibration and performance verification phase of XMM's science instruments. Progress on this calibration should allow the telescope to target and take "first-light pictures" of its first X-ray sources next March.
Consortium for Verification Technology Fellowship Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadler, Lorraine E.
2017-06-01
As one recipient of the Consortium for Verification Technology (CVT) Fellowship, I spent eight days as a visiting scientist at the University of Michigan, Department of Nuclear Engineering and Radiological Sciences (NERS). During this time, I participated in multiple department and research group meetings and presentations, met with individual faculty and students, toured multiple laboratories, and taught one-half of a one-unit class on Risk Analysis in Nuclear Arms Control (six 1.5-hour lectures). The following report describes some of the interactions that I had during my time, as well as a brief discussion of the impact of this fellowship on members of the consortium and on me and my laboratory's technical knowledge and network.
The ALMA Band 9 receiver. Design, construction, characterization, and first light
NASA Astrophysics Data System (ADS)
Baryshev, A. M.; Hesper, R.; Mena, F. P.; Klapwijk, T. M.; van Kempen, T. A.; Hogerheijde, M. R.; Jackson, B. D.; Adema, J.; Gerlofsma, G. J.; Bekema, M. E.; Barkhof, J.; de Haan-Stijkel, L. H. R.; van den Bemt, M.; Koops, A.; Keizer, K.; Pieters, C.; Koops van het Jagt, J.; Schaeffer, H. H. A.; Zijlstra, T.; Kroug, M.; Lodewijk, C. F. J.; Wielinga, K.; Boland, W.; de Graauw, M. W. M.; van Dishoeck, E. F.; Jager, H.; Wild, W.
2015-05-01
Aims: We describe the design, construction, and characterization of the Band 9 heterodyne receivers (600-720 GHz) for the Atacama Large Millimeter/submillimeter Array (ALMA). First-light Band 9 data, obtained during ALMA commissioning and science verification phases, are presented as well. Methods: The ALMA Band 9 receiver units (so-called "cartridges"), which are installed in the telescope's front end, have been designed to detect and down-convert two orthogonal linear polarization components of the light collected by the ALMA antennas. The light entering the front end is refocused with a compact arrangement of mirrors, which is fully contained within the cartridge. The arrangement contains a grid to separate the polarizations and two beam splitters to combine each resulting beam with a local oscillator signal. The combined beams are fed into independent double-sideband mixers, each with a corrugated feedhorn coupling the radiation by way of a waveguide with backshort cavity into an impedance-tuned superconductor-insulator-superconductor (SIS) junction that performs the heterodyne down-conversion. Finally, the generated intermediate frequency (IF) signals are amplified by cryogenic and room-temperature HEMT amplifiers and exported to the telescope's IF back end for further processing and, finally, correlation. Results: The receivers have been constructed and tested in the laboratory and they show an excellent performance, complying with ALMA requirements. Performance statistics on all 73 Band 9 receivers are reported. Importantly, two different tunnel-barrier technologies (necessitating different tuning circuits) for the SIS junctions have been used, namely conventional AlOx barriers and the more recent high-current-density AlN barriers. On-sky characterization and tests of the performance of the Band 9 cartridges are presented using commissioning data. Continuum and line images of the low-mass protobinary IRAS 16293-2422 are presented which were obtained as part of the ALMA science verification program. An 8 GHz wide Band 9 spectrum extracted over a 0.3'' × 0.3'' region near source B, containing more than 100 emission lines, illustrates the quality of the data.
Lessons Learned from Optical Payload for Lasercomm Science (OPALS) Mission Operations
NASA Technical Reports Server (NTRS)
Sindiy, Oleg V.; Abrahamson, Matthew J.; Biswas, Abhijit; Wright, Malcolm W.; Padams, Jordan H.; Konyha, Alexander L.
2015-01-01
This paper provides an overview of Optical Payload for Lasercomm Science (OPALS) activities and lessons learned during mission operations. Activities described cover the periods of commissioning, prime, and extended mission operations, during which primary and secondary mission objectives were achieved for demonstrating space-to-ground optical communications. Lessons learned cover Mission Operations System topics in areas of: architecture verification and validation, staffing, mission support area, workstations, workstation tools, interfaces with support services, supporting ground stations, team training, procedures, flight software upgrades, post-processing tools, and public outreach.
Structural Safety of a Hubble Space Telescope Science Instrument
NASA Technical Reports Server (NTRS)
Lou, M. C.; Brent, D. N.
1993-01-01
This paper gives an overview of safety requirements related to structural design and verification of payloads to be launched and/or retrieved by the Space Shuttle. To demonstrate the general approach used to implement these requirements in the development of a typical Shuttle payload, the Wide Field/Planetary Camera II, a second-generation science instrument currently being developed by the Jet Propulsion Laboratory (JPL) for the Hubble Space Telescope, is used as an example. In addition to verification of strength and dynamic characteristics, special emphasis is placed upon the fracture control implementation process, including parts classification and fracture control acceptability.
2002-12-17
KENNEDY SPACE CENTER, FLA. - An Orbital Sciences L-1011 aircraft arrives at the Cape Canaveral Air Force Station Skid Strip. Attached underneath the aircraft is the Pegasus XL Expendable Launch Vehicle, which will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.
2002-12-17
KENNEDY SPACE CENTER, FLA. -- Workers at the Cape Canaveral Air Force Station Skid Strip stand next to the Pegasus XL Expendable Launch Vehicle underneath the Orbital Sciences L-1011 aircraft. The Pegasus will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-08-24
This study presents a numerical investigation of using the Jacobian-free Newton–Krylov (JFNK) method to solve the two-phase flow four-equation drift flux model with realistic constitutive correlations ('closure models'). The drift flux model is based on the work of Ishii and his collaborators. Additional constitutive correlations for vertical channel flow, such as two-phase flow pressure drop, flow regime map, wall boiling, and interfacial heat transfer models, were taken from the RELAP5-3D Code Manual and included to complete the model. The staggered-grid finite volume method and the fully implicit backward Euler method were used for the spatial discretization and time integration schemes, respectively. The Jacobian-free Newton–Krylov method shows no difficulty in solving the two-phase flow drift flux model with a discrete flow regime map. In addition to the Jacobian-free approach, the preconditioning matrix is obtained by using the default finite differencing method provided in the PETSc package, and consequently the labor-intensive implementation of a complex analytical Jacobian matrix is avoided. Extensive and successful numerical verification and validation have been performed to prove the correct implementation of the models and methods. Code-to-code comparison with RELAP5-3D has further demonstrated the successful implementation of the drift flux model.
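The solver strategy described above can be illustrated with SciPy's built-in Newton-Krylov implementation; the toy residual below is a stand-in for the paper's fully implicit drift flux discretization, showing only how the nonlinear system is solved without forming a Jacobian matrix.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Jacobian-free Newton-Krylov on a toy nonlinear two-point problem:
# -u'' + u**3 = 1 on (0,1) with u(0) = u(1) = 0, finite differences.
# Illustrates the solver strategy only; the paper's residual function is
# the fully implicit four-equation drift flux discretization, not this.
N = 64
h = 1.0 / (N + 1)

def residual(u):
    d2u = np.empty_like(u)
    d2u[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
    d2u[0] = (u[1] - 2.0 * u[0]) / h**2      # ghost value u(0) = 0
    d2u[-1] = (u[-2] - 2.0 * u[-1]) / h**2   # ghost value u(1) = 0
    return -d2u + u**3 - 1.0

# Jacobian-vector products are approximated by finite differences inside
# the Krylov iteration (LGMRES by default in SciPy); no Jacobian matrix
# is ever assembled.
u = newton_krylov(residual, np.zeros(N), f_tol=1e-8)
print(u.max())
```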
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has so far been addressed through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering (MBSSE) process derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim to develop methodological, theoretical, and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1992-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low- from high-fault-density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models, intended to fulfill specific software engineering needs (i.e., dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered; and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
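As an illustration of the shape of this classification task (not the paper's own modeling technique, which builds multivariate stochastic models rather than using a modern ML library), a generic stand-in on synthetic metrics might look like the sketch below; all feature names, values, and thresholds are hypothetical.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: static metrics in, low/high fault density out.
rng = np.random.default_rng(1)
n = 400
X = np.column_stack([
    rng.integers(10, 500, n),   # e.g. component size (LOC) -- hypothetical
    rng.integers(1, 40, n),     # e.g. cyclomatic complexity -- hypothetical
    rng.integers(0, 20, n),     # e.g. number of revisions -- hypothetical
])
# Synthetic label: larger, more complex components tend to be high-density.
y = (0.004 * X[:, 0] + 0.05 * X[:, 1] + 0.1 * rng.standard_normal(n)) > 1.5

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```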
An unattended verification station for UF6 cylinders: Field trial findings
NASA Astrophysics Data System (ADS)
Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.
2017-12-01
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
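The "NDA Fingerprint" idea lends itself to a simple illustration: compare a re-scan of a cylinder against its enrollment signature, channel by channel. The channel values, uncertainties, and 3-sigma test below are assumptions for illustration only; the actual UCVS algorithms and measured uncertainties are not specified in this abstract.

```python
import numpy as np

def fingerprint_consistent(baseline, rescan, sigma, threshold=3.0):
    """Toy 'NDA Fingerprint' check: flag a cylinder whose re-measured
    signature departs from its enrollment scan by more than `threshold`
    combined standard deviations in any channel. The sqrt(2) assumes the
    two scans have equal, independent per-channel uncertainty sigma."""
    z = np.abs(np.asarray(rescan) - np.asarray(baseline)) / (np.sqrt(2.0) * np.asarray(sigma))
    return bool(np.all(z < threshold)), z

baseline = np.array([1.00e5, 2.40e4, 3.1e3])  # hypothetical detector channels
sigma    = np.array([3.0e2,  1.5e2,  6.0e1])  # per-channel 1-sigma, hypothetical
rescan   = baseline + np.array([2.0e2, -1.0e2, 4.0e1])
ok, z = fingerprint_consistent(baseline, rescan, sigma)
print(ok, np.round(z, 2))
```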
DOT National Transportation Integrated Search
2015-01-01
This study examines the feasibility of using driven piles to stabilize highway embankment slopes. The literature review showed that there has been significant research done concerning the lateral capacity of piles. This research tends to be focused o...
NASA Astrophysics Data System (ADS)
Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong
2011-04-01
As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limit imposed by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, so it is impractical to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, delay time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows a bias that varies with the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. By optimizing the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In this paper, a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout, instead of applying an etch model, is presented.
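A minimal sketch of the kind of pitch-dependent biasing rule proposed here follows; the bias table values are assumed for illustration, standing in for a rule calibrated against measured etch bias.

```python
def pitch_bias(pitch_nm):
    """HYPOTHETICAL lookup of a per-edge bias (nm) versus line pitch,
    standing in for the calibrated etch-bias rule the paper optimizes.
    Dense lines etch smaller, so they receive a larger positive bias."""
    table = [(100, 6.0), (200, 4.0), (400, 2.0)]  # (max pitch, bias) -- assumed values
    for max_pitch, bias in table:
        if pitch_nm <= max_pitch:
            return bias
    return 1.0                                    # isolated-line default

def bias_rect(rect, pitch_nm):
    """Grow a metal rectangle (x1, y1, x2, y2) by the pitch-dependent bias
    before running the contact/via coverage check against contours."""
    b = pitch_bias(pitch_nm)
    x1, y1, x2, y2 = rect
    return (x1 - b, y1 - b, x2 + b, y2 + b)

print(bias_rect((0, 0, 50, 400), pitch_nm=120))
```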
In-Space Engine (ISE-100) Development - Design Verification Test
NASA Technical Reports Server (NTRS)
Trinh, Huu P.; Popp, Chris; Bullard, Brad
2017-01-01
In the past decade, NASA has formulated science mission concepts in anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been pursued to maximize science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster is based on heritage Missile Defense Agency (MDA) technology and is aimed at a lightweight system that is efficient in terms of volume and packaging. It runs on a hypergolic bi-propellant combination of MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2) for NASA spacecraft applications. This propellant combination provides a propulsion system capable of operating over a wide temperature range, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lb(sub f) of thrust with pulse-mode capability for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign is dedicated to design verification of the thruster. This presentation reports the design verification hot-fire test program of the ISE-100 thruster, conducted in collaboration between the NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advanced Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation also provides a summary of key points from the test results.
Ground based ISS payload microgravity disturbance assessments.
McNelis, Anne M; Heese, John A; Samorezov, Sergey; Moss, Larry A; Just, Marcus L
2005-01-01
In order to verify that the International Space Station (ISS) payload facility racks do not disturb the microgravity environment of neighboring facility racks, and that facility science operations are not compromised, a testing and analytical verification process must be followed. Currently no facility racks have taken this process from start to finish. The authors are participants in implementing this process for the NASA Glenn Research Center (GRC) Fluids and Combustion Facility (FCF). To address the testing part of the verification process, the Microgravity Emissions Laboratory (MEL) was developed at GRC. The MEL is a 6-degree-of-freedom inertial measurement system capable of characterizing inertial response forces (emissions) of components, sub-rack payloads, or rack-level payloads down to 10^-7 g. The inertial force output data, generated from the steady-state or transient operations of the test articles, are utilized in analytical simulations to predict the on-orbit vibratory environment at specific science or rack interface locations. Once the facility payload rack and disturbers are properly modeled, an assessment can be made as to whether the required microgravity levels are achieved. The modeling is utilized to develop microgravity predictions, which lead to the development of microgravity-sensitive ISS experiment operations once on orbit. The on-orbit measurements will be verified by use of the NASA GRC Space Acceleration Measurement System (SAMS). The major topics addressed in this paper are: (1) Microgravity Requirements, (2) Microgravity Disturbers, (3) MEL Testing, (4) Disturbance Control, (5) Microgravity Control Process, and (6) On-Orbit Predictions and Verification. Published by Elsevier Ltd.
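The prediction step lends itself to a simple frequency-domain sketch: a measured emission force spectrum is propagated through a structural transfer function to an acceleration spectrum at a science location. The FRF, spectra, and numbers below are placeholders, not FCF/MEL data:

```python
import numpy as np

# Hedged sketch: propagate a measured emission force PSD through an assumed
# structural frequency response function (FRF) to predict the acceleration
# environment at a science location. All numbers are placeholders.

freqs = np.linspace(0.1, 300.0, 1000)              # Hz
df = freqs[1] - freqs[0]
force_psd = 1e-4 / (1.0 + (freqs / 50.0) ** 2)     # emission force PSD, N^2/Hz (illustrative)

# Placeholder FRF magnitude |H(f)| in g/N: one lightly damped mode at 60 Hz
fn, zeta, gain = 60.0, 0.02, 1e-6
H = gain / np.sqrt((1 - (freqs / fn) ** 2) ** 2 + (2 * zeta * freqs / fn) ** 2)

accel_psd = (H ** 2) * force_psd                   # predicted acceleration PSD, g^2/Hz
grms = np.sqrt(np.sum(accel_psd) * df)             # broadband RMS level
print(f"Predicted broadband level: {grms:.2e} g RMS")
```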
TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, G.J.; Pruess
1992-11-01
The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications. Good and efficient numerical performance, as well as accurate results, were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.
Design verification and cold-flow modeling test report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-07-01
This report presents a compilation of the following three test reports prepared by TRW for the Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.
NASA Technical Reports Server (NTRS)
Muheim, Danniella; Menzel, Michael; Mosier, Gary; Irish, Sandra; Maghami, Peiman; Mehalick, Kimberly; Parrish, Keith
2010-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2014. System-level verification of critical performance requirements will rely on integrated observatory models that predict the wavefront error accurately enough to verify that the allocated top-level wavefront error of 150 nm root-mean-square (rms) through to the wavefront sensor focal plane is met. The assembled models themselves are complex and require the insight of technical experts to assess their ability to meet their objectives. This paper describes the systems engineering and modeling approach used on JWST through the detailed design phase.
The Deep Space Network: A Radio Communications Instrument for Deep Space Exploration
NASA Technical Reports Server (NTRS)
Renzetti, N. A.; Stelzried, C. T.; Noreen, G. K.; Slobin, S. D.; Petty, S. M.; Trowbridge, D. L.; Donnelly, H.; Kinman, P. W.; Armstrong, J. W.; Burow, N. A.
1983-01-01
The primary purpose of the Deep Space Network (DSN) is to serve as a communications instrument for deep space exploration, providing communications between the spacecraft and the ground facilities. The uplink communications channel provides instructions or commands to the spacecraft. The downlink communications channel provides command verification and spacecraft engineering and science instrument payload data.
ERIC Educational Resources Information Center
Herrington, Deborah G.; Yezierski, Ellen J.
2014-01-01
The recent revisions to the advanced placement (AP) chemistry curriculum promote deep conceptual understanding of chemistry content over more rote memorization of facts and algorithmic problem solving. For many teachers, this will mean moving away from traditional worksheets and verification lab activities that they have used to address the vast…
1992-01-01
[Participant roster; extraction residue:] Norman, University of California, San Diego, CA; Dan R. Olsen, Jr., Brigham...; Peter G. Polson, University of Colorado, Boulder, CO; James R. Rhyne, IBM T. J. Watson... and artificial intelligence, among which are: reasoning about concurrent systems, including program verification (Barringer, 1985), operating
GENESA as an Aid to Incubation/Imagery.
ERIC Educational Resources Information Center
Bruch, Catherine; And Others
1979-01-01
The article focuses on the applications of the GENESA model (a life sized model of the geometry of a biological cell) in the enhancement of the creative processes during the stages of incubation, illumination, and verification, with emphasis primarily on the phase of incubation/imagery through potential illumination. (SBH)
Phase 2 of the Array Automated Assembly Task for the Low Cost Silicon Solar Array Project
NASA Technical Reports Server (NTRS)
Wihl, M.; Torro, J.; Scheinine, A.; Anderson, J.
1978-01-01
An automated process sequence to manufacture photovoltaic modules at a capacity of approximately 500 MW per year and a cost of approximately $0.50 per peak watt is described. Verification tests were performed and are reported along with cost predictions.
DOT National Transportation Integrated Search
1987-08-01
One of the primary reasons that highway departments are hesitant to use heat-straightening techniques to repair damaged steel girders is the lack of experimental verification of the process. A comprehensive experimental program on the subject has bee...
Geothermal Resource Verification for Air Force Bases,
1981-06-01
phase of reservoir ... geothermal techniques will begin to focus on the deeper, confined reservoirs that will have little or no definitive surfa... 1976. ...ison, D. L., PROGRAM REVIEW, GEOTHERMAL EXPLORATION AND ASSESSMENT TECHNOLOGY PROGRAM, U.S. Department of Energy, DOE/ET/27002-6, December 1979
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PCB DETECTION TECHNOLOGY, HYBRIZYME DELFIA TM ASSAY
The DELFIA PCB Assay is a solid-phase time-resolved fluoroimmunoassay based on the sequential addition of sample extract and europium-labeled PCB tracer to a monoclonal antibody reagent specific for PCBs. In this assay, the antibody reagent and sample extract are added to a strip...
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
1991-06-28
and examined various models as possible alternatives to TRANSMO. None of the candidate models met all of CAA's requirements, so a major TERP recommendation... will simulate the mobilization of U.S. forces, the deployment of forces and supplies across an intertheater network, and the deployment of forces and supplies to the combat zone. 1.2 Phase II IV&V Summary: Potomac Systems Engineering, Inc. (PSE), is providing IV&V support to CAA during the GDAS development
ALMA observations of TiO2 around VY Canis Majoris
NASA Astrophysics Data System (ADS)
De Beck, E.; Vlemmings, W.; Muller, S.; Black, J. H.; O'Gorman, E.; Richards, A. M. S.; Baudry, A.; Maercker, M.; Decin, L.; Humphreys, E. M.
2015-08-01
Context. Titanium dioxide, TiO2, is a refractory species that could play a crucial role in the dust-condensation sequence around oxygen-rich evolved stars. To date, gas phase TiO2 has been detected only in the complex environment of the red supergiant VY CMa. Aims: We aim to constrain the distribution and excitation of TiO2 around VY CMa in order to clarify its role in dust formation. Methods: We analyse spectra and channel maps for TiO2 extracted from ALMA science verification data. Results: We detect 15 transitions of TiO2, and spatially resolve the emission for the first time. The maps demonstrate a highly clumpy, anisotropic outflow in which the TiO2 emission likely traces gas exposed to the stellar radiation field. An accelerating bipolar-like structure is found, oriented roughly east-west, of which the blue component runs into and breaks up around a solid continuum component. A distinct tail to the south-west is seen for some transitions, consistent with features seen in the optical and near-infrared. Conclusions: We find that a significant fraction of TiO2 remains in the gas phase outside the dust-formation zone and suggest that this species might play only a minor role in the dust-condensation process around extreme oxygen-rich evolved stars like VY CMa. Appendix A is available in electronic form at http://www.aanda.org
Using virtual reality for science mission planning: A Mars Pathfinder case
NASA Technical Reports Server (NTRS)
Kim, Jacqueline H.; Weidner, Richard J.; Sacks, Allan L.
1994-01-01
NASA's Mars Pathfinder Project requires a Ground Data System (GDS) that supports both engineering and scientific payloads with reduced mission operations staffing and short planning schedules. In addition, successful surface operation of the lander camera requires efficient mission planning and accurate pointing of the camera. To meet these challenges, a new software strategy was developed that integrates virtual reality technology with existing navigational ancillary information and image processing capabilities. The result is an interactive, workstation-based application that provides a high-resolution, 3-dimensional, stereo display of Mars as if it were viewed through the lander camera. The design, implementation strategy, and parametric specification phases for the development of this software were completed, and the prototype tested. When completed, the software will allow scientists and mission planners to access simulated and actual scenes of Mars' surface. The perspective from the lander camera will enable scientists to plan activities more accurately and completely. The application will also support the sequence and command generation process and will allow testing and verification of camera pointing commands via simulation.
Improving the fiber coupling efficiency for DARWIN by loss-less shaping of the receive beams
NASA Astrophysics Data System (ADS)
Voland, Ch.; Weigel, Th.; Dreischer, Th.; Wallner, O.; Ergenzinger, K.; Ries, H.; Jetter, R.; Vosteen, A.
2017-11-01
For the DARWIN mission the extremely low planet signal levels require an optical instrument design with utmost efficiency to guarantee the required science performance. By shaping the transverse amplitude and phase distributions of the receive beams, the singlemode fibre coupling efficiency can be increased to almost 100%, thus allowing for a gain of more than 20% compared to conventional designs. We show that the use of "tailored freeform surfaces" for purpose of beam shaping dramatically reduces the coupling degradations, which otherwise result from mode mismatch between the Airy pattern of the image and the fibre mode, and therefore allows for achieving a performance close to the physical limitations. We present an application of tailored surfaces for building a beam shaping optics that shall enhance fibre coupling performance as core part of a space based interferometer in the future DARWIN mission and present performance predictions by wave-optical simulations. We assess the feasibility of manufacturing the corresponding tailored surfaces and describe the proof of concept demonstrator we use for experimental performance verification.
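The quantity being maximized here is the modal overlap integral. A minimal numerical sketch (Python/NumPy with SciPy's Bessel function; the grid size, mode radius w0, and Airy scale k are arbitrary assumptions, not DARWIN instrument values) compares an unshaped Airy-like focal field against a perfectly shaped one:

```python
import numpy as np
from scipy.special import j1

# Hedged sketch of the figure of merit behind the beam-shaping argument:
# single-mode fibre coupling efficiency as the normalized overlap integral
# |<E_in, E_mode>|^2 / (<E_in, E_in><E_mode, E_mode>). All scales assumed.

N, L = 512, 40.0                          # grid points, half-window (microns)
x = np.linspace(-L, L, N)
X, Y = np.meshgrid(x, x)
R = np.hypot(X, Y)

w0 = 5.0                                  # fibre mode-field radius (assumed)
fibre_mode = np.exp(-(R / w0) ** 2)

k = 0.6                                   # sets the Airy core size (assumed)
arg = np.where(R == 0, 1e-12, k * R)
airy = 2 * j1(arg) / arg                  # unshaped focal field (jinc pattern)

def coupling_eta(E_in, E_mode):
    num = abs(np.sum(np.conj(E_in) * E_mode)) ** 2
    den = np.sum(abs(E_in) ** 2) * np.sum(abs(E_mode) ** 2)
    return num / den

print(f"Airy -> fibre:   {coupling_eta(airy, fibre_mode):.3f}")  # well below 1
# A lossless shaper that morphs the focal field into the mode itself -> ~1.0
print(f"Shaped -> fibre: {coupling_eta(fibre_mode, fibre_mode):.3f}")
```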
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy
The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups, and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of "virtual" fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a "universal" GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.
NASA Astrophysics Data System (ADS)
Gozzard, David R.; Schediwy, Sascha W.; Dodson, Richard; Rioja, María J.; Hill, Mike; Lennon, Brett; McFee, Jock; Mirtschin, Peter; Stevens, Jamie; Grainge, Keith
2017-07-01
In order to meet its cutting-edge scientific objectives, the Square Kilometre Array (SKA) telescope requires high-precision frequency references to be distributed to each of its antennas. The frequency references are distributed via fiber-optic links and must be actively stabilized to compensate for phase noise imposed on the signals by environmental perturbations on the links. SKA engineering requirements demand that any proposed frequency reference distribution system be proved in “astronomical verification” tests. We present results of the astronomical verification of a stabilized frequency reference transfer system proposed for SKA-mid. The dual-receiver architecture of the Australia Telescope Compact Array was exploited to subtract the phase noise of the sky signal from the data, allowing the phase noise of observations performed using a standard frequency reference, as well as the stabilized frequency reference transfer system transmitting over 77 km of fiber-optic cable, to be directly compared. Results are presented for the fractional frequency stability and phase drift of the stabilized frequency reference transfer system for celestial calibrator observations at 5 and 25 GHz. These observations plus additional laboratory results for the transferred signal stability over a 166 km metropolitan fiber-optic link are used to show that the stabilized transfer system under test exceeds all SKA phase-stability requirements within a broad range of observing conditions. Furthermore, we have shown that alternative reference dissemination systems that use multiple synthesizers to supply reference signals to sub-sections of an array may limit the imaging capability of the telescope.
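Fractional frequency stability of this kind is conventionally quoted as an Allan deviation. Below is a minimal sketch of the overlapping Allan deviation estimator, run on synthetic white-noise data rather than any SKA or ATCA measurement:

```python
import numpy as np

# Hedged sketch: overlapping Allan deviation, the usual metric for the
# fractional frequency stability of a stabilized fibre link. The synthetic
# data stand in for measured fractional frequency offsets y(t).

def overlapping_adev(y, m):
    """Overlapping Allan deviation at averaging time m * tau0,
    where tau0 is the sample interval of y (fractional frequency data)."""
    y = np.asarray(y, dtype=float)
    ybar = np.convolve(y, np.ones(m) / m, mode="valid")  # m-sample averages
    d = ybar[m:] - ybar[:-m]                             # overlapping differences
    return np.sqrt(0.5 * np.mean(d ** 2))

rng = np.random.default_rng(0)
y = 1e-13 * rng.standard_normal(20_000)                  # white-FM-like noise floor
for m in (1, 10, 100, 1000):
    print(f"tau = {m:5d} * tau0   sigma_y = {overlapping_adev(y, m):.2e}")
```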
An unattended verification station for UF 6 cylinders: Field trial findings
Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...
2017-08-26
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.
Optical Verification Laboratory Demonstration System for High Security Identification Cards
NASA Technical Reports Server (NTRS)
Javidi, Bahram
1997-01-01
Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development, and publications in security technology. Some ID cards, credit cards, and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured with a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied to security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the primary pattern [1-3]. We have demonstrated experimentally an optical processor for security verification of objects, products, and persons. This demonstration is very important to encourage industries to consider the proposed system for research and development.
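The central property, that a pure phase mask carries no intensity contrast for a CCD yet is fully discriminable to a coherent correlator, can be illustrated in a few lines. This sketch uses random masks and a plain normalized inner product; it is not the authors' optical processor architecture:

```python
import numpy as np

# Hedged sketch of the core property: a pure phase mask is invisible to an
# intensity detector (a CCD records |field|^2 = 1 everywhere), yet a
# coherent correlator keyed to the stored mask separates genuine cards from
# forgeries. Random masks stand in for the bonded mask and a forger's guess.

rng = np.random.default_rng(1)
N = 256
mask = np.exp(1j * 2 * np.pi * rng.random((N, N)))      # bonded phase mask
forgery = np.exp(1j * 2 * np.pi * rng.random((N, N)))   # attacker's guess

# What a CCD sees: uniform intensity, so the mask cannot be copied this way.
print("CCD intensity uniform:", np.allclose(np.abs(mask) ** 2, 1.0))   # True

def match_score(card, reference):
    """Normalized coherent correlation against the stored reference."""
    return abs(np.vdot(reference, card)) / card.size

print("genuine:", match_score(mask, mask))       # 1.0
print("forgery:", match_score(forgery, mask))    # near 0
```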
ERIC Educational Resources Information Center
Grooms, Jonathon; Sampson, Victor; Golden, Barry
2014-01-01
This quasi-experimental study uses a pre-/post-intervention approach to investigate the quality of undergraduate students' arguments in the context of socioscientific issues (SSI) based on experiencing a semester of traditional "cookbook" instruction (N = 79) or a semester of argument-based instruction (N = 73) in the context of an…
Applications of HCMM satellite data to the study of urban heating patterns
NASA Technical Reports Server (NTRS)
Carlson, T. N. (Principal Investigator)
1980-01-01
A research summary is presented, divided into two major areas, one developmental and the other basic science. Under the first, three sub-categories are discussed: image processing techniques, especially the method whereby surface temperature images are converted to images of surface energy budget, moisture availability, and thermal inertia; model development; and model verification. Basic science includes the use of a method to further the understanding of the urban heat island and anthropogenic modification of surface heating, evaporation over vegetated surfaces, and the effect of surface heat flux on plume spread.
2002-12-17
KENNEDY SPACE CENTER, FLA. -- Workers at the Cape Canaveral Air Force Station Skid Strip get ready to remove the Pegasus XL Expendable Launch Vehicle attached underneath the Orbital Sciences L-1011 aircraft. The Pegasus will be transported to the Multi-Payload Processing Facility for testing and verification. The Pegasus will undergo three flight simulations prior to its scheduled launch in late January 2003. The Pegasus XL will carry NASA's Solar Radiation and Climate Experiment (SORCE) into orbit. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere.
A Study of the Access to the Scholarly Record from a Hospital Health Science Core Collection *
Williams, James F.; Pings, Vern M.
1973-01-01
This study is an effort to determine possible service performance levels in hospital libraries based on access to the scholarly record of medicine through selected lists of clinical journals and indexing and abstracting journals. The study was designed to test a methodology as well as to provide data for planning and management decisions for health science libraries. Findings and conclusions cover the value of a core collection of journals, length of journal files, performance of certain bibliographic instruments in citation verification, and the implications of study data for library planning and management. PMID:4744345
Adherence to balance tolerance limits at the Upper Mississippi Science Center, La Crosse, Wisconsin.
Myers, C.T.; Kennedy, D.M.
1998-01-01
Verification of balance accuracy entails applying a series of standard masses to a balance prior to use and recording the measured values. The recorded values for each standard should have lower and upper weight limits or tolerances that are accepted as verification of balance accuracy under normal operating conditions. Balance logbooks for seven analytical balances at the Upper Mississippi Science Center were checked over a 3.5-year period to determine if the recorded weights were within the established tolerance limits. A total of 9435 measurements were checked. There were 14 instances in which the balance malfunctioned and operators recorded a rationale in the balance logbook. Sixty-three recording errors were found. Twenty-eight operators were responsible for two types of recording errors: Measurements of weights were recorded outside of the tolerance limit but not acknowledged as an error by the operator (n = 40); and measurements were recorded with the wrong number of decimal places (n = 23). The adherence rate for following tolerance limits was 99.3%. To ensure the continued adherence to tolerance limits, the quality-assurance unit revised standard operating procedures to require more frequent review of balance logbooks.
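A minimal sketch of the audit logic described here, with invented tolerance limits and logbook entries (the actual limits and decimal conventions at the Science Center are not given in the abstract), showing the two recording-error classes alongside the in/out-of-tolerance test:

```python
# Hedged sketch of the logbook audit described above: recorded check weights
# are screened against per-standard tolerance limits, and entries with the
# wrong number of decimal places are flagged. Limits, decimal convention,
# and entries are invented for illustration.

TOLERANCES = {           # standard mass (g): (lower, upper) acceptance limits
    1.0:   (0.9998, 1.0002),
    10.0:  (9.9980, 10.0020),
    100.0: (99.9900, 100.0100),
}
REQUIRED_DECIMALS = 4    # e.g., a balance read to 0.1 mg

def audit_entry(standard_g, recorded):
    """Classify one logbook entry the way the 3.5-year review did."""
    decimals = len(recorded.split(".")[1]) if "." in recorded else 0
    if decimals != REQUIRED_DECIMALS:
        return "recording error: wrong number of decimal places"
    lo, hi = TOLERANCES[standard_g]
    if not (lo <= float(recorded) <= hi):
        return "outside tolerance limits: flag for review"
    return "ok"

for std, rec in [(10.0, "10.0005"), (10.0, "10.0040"), (100.0, "100.001")]:
    print(std, rec, "->", audit_entry(std, rec))
```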
CMB lensing tomography with the DES Science Verification galaxies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giannantonio, T.
We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z_phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the LCDM Planck cosmology, a 1.7σ deviation.
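The closing statement corresponds to a one-parameter amplitude fit of the measured bandpowers to the LCDM prediction. A sketch of that weighted least-squares fit on synthetic bandpowers (not DES/SPT/Planck data), assuming a diagonal covariance:

```python
import numpy as np

# Hedged sketch: fit a single amplitude A with data ~ A * theory, where
# A = 1 means perfect agreement and A ~ 0.73 is the value reported.
# Bandpowers and errors are synthetic placeholders.

rng = np.random.default_rng(2)
theory = np.array([1.2, 0.9, 0.7, 0.5, 0.35])      # predicted bandpowers
sigma = np.array([0.25, 0.20, 0.18, 0.15, 0.12])   # 1-sigma errors (diagonal cov)
data = 0.73 * theory + sigma * rng.standard_normal(theory.size)

# Minimum-chi^2 amplitude: A = sum(d t / s^2) / sum(t^2 / s^2)
w = theory / sigma ** 2
A = np.sum(w * data) / np.sum(w * theory)
A_err = 1.0 / np.sqrt(np.sum(w * theory))          # var(A) = 1 / sum(t^2 / s^2)
print(f"A = {A:.2f} +/- {A_err:.2f} ({abs(A - 1) / A_err:.1f} sigma from A = 1)")
```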
Verification System: First System-Wide Performance Test
NASA Astrophysics Data System (ADS)
Chernobay, I.; Zerbo, L.
2006-05-01
System-wide performance tests are essential for the development, testing, and evaluation of individual components of the verification system. In addition to evaluating global readiness, such a test helps establish the practical and financial requirements for eventual operations. The first system-wide performance test (SPT1) was conducted in three phases: a preparatory phase in May-June 2004; a performance testing phase in April-June 2005; and an evaluation phase in the last half of 2005. The preparatory phase was developmental in nature. The main objectives of the performance testing phase included establishing a performance baseline under the current provisional mode of operation (CTBT/PC-19/1/Annex II, CTBT/WGB-21/1) and examining established requirements and procedures for operation and maintenance. To establish a system-wide performance baseline, the system configuration was fixed for April-May 2005. The third month (June 2005) was used to implement 21 test-case scenarios examining either particular operational procedures or the response of system components to failures simulated under controlled conditions. A total of 163 stations and 5 certified radionuclide laboratories of the International Monitoring System (IMS) participated in the performance testing phase - about 50% of the eventual IMS network. 156 IMS facilities and 40 National Data Centres (NDCs) were connected to the International Data Centre (IDC) via Global Communications Infrastructure (GCI) links. In addition, 12 legacy stations in the auxiliary seismic network sent data to the IDC over the Internet. During the performance testing phase, the IDC produced all required products and analysed more than 6100 seismic events and 1700 radionuclide spectra. The performance of all system elements was documented and analysed, and IDC products were compared with the results of data processing at the NDCs. On the basis of statistics and information collected during SPT1, a system-wide performance baseline under the current guidelines for provisional operation and maintenance was established. The test provided feedback for further development of the draft IMS and IDC Operational Manuals and identified priority areas for further system development.
The Static Pac was verified at a natural gas compressor station operated by ANR Pipeline Company. The test was carried out on two engines (8-cylinder, 2000 hp), each with two reciprocating compressors operating in parallel (4 in. rods). The evaluation focused on two shutdown proc...
Marcus Schortemeyer; Ken Thomas; Robert A. Haack; Adnan Uzunovic; Kelli Hoover; Jack A. Simpson; Cheryl A. Grgurinovic
2011-01-01
Following the increasing international phasing out of methyl bromide for quarantine purposes, the development of alternative treatments for timber pests becomes imperative. The international accreditation of new quarantine treatments requires verification standards that give confidence in the effectiveness of a treatment. Probit-9 mortality is a standard for treatment...
DOT National Transportation Integrated Search
1978-12-01
This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...
ERDEC Contribution to the 1993 International Treaty Verification Round Robin Exercise 4
1994-07-01
COLUMN - Detector: MS (Finnigan 5100); Phase: AT-5; Manufacturer: Alltech. GC CONDITIONS - Length: 25 m; Carrier gas: helium; Inner diameter: 0.25 mm; Carrier... ionized in the ion source. The resulting CH4+ then chemically reacts with the analyte. The advantage of this technique is that because less energy is
Standardized Radiation Shield Design Methods: 2005 HZETRN
NASA Technical Reports Server (NTRS)
Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.
2006-01-01
Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for the engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be made available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.
Special features of the CLUSTER antenna and radial booms design, development and verification
NASA Technical Reports Server (NTRS)
Gianfiglio, G.; Yorck, M.; Luhmann, H. J.
1995-01-01
CLUSTER is a scientific space mission to investigate in situ the Earth's plasma environment by means of four identical spin-stabilized spacecraft. Each spacecraft is provided with a set of four rigid booms: two Antenna Booms and two Radial Booms. This paper presents a summary of the boom development and verification phases, addressing the key aspects of the Radial Boom design. In particular, it concentrates on the difficulties encountered in simultaneously fulfilling the requirements of minimum torque ratio and maximum allowed shock loads at boom latching for this two-degree-of-freedom boom. The paper also provides an overview of the analysis campaign and testing program performed to achieve sufficient confidence in the boom performance and operation.
NASA Technical Reports Server (NTRS)
Platt, R.
1999-01-01
This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. 1.2 Test procedure sequence: the sequence in which the several phases of this test procedure shall take place is shown in Figure 1, but the phases can be executed in any order.
Array automated assembly task, phase 2. Low cost silicon solar array project
NASA Technical Reports Server (NTRS)
Rhee, S. S.; Jones, G. T.; Allison, K. T.
1978-01-01
Several modifications instituted in the wafer surface preparation process significantly reduced the process cost, to 1.55 cents per peak watt in 1975 cents. Performance verification tests of a laser scanning system showed a limited capability to detect hidden cracks or defects, but with potential equipment modifications this cost-effective system could be rendered suitable for the application. Installation of an electroless nickel plating system was completed, along with optimization of the wafer plating process. The solder coating and flux removal process verification test was completed. An optimum temperature range of 500-550 C was found to produce uniform solder coating, with the restriction that a modified dipping procedure be utilized. Finally, construction of the spray-on dopant equipment was completed.
NASA Technical Reports Server (NTRS)
Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)
1983-01-01
Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.
Results of the life sciences DSOs conducted aboard the space shuttle 1981-1986
NASA Technical Reports Server (NTRS)
Bungo, Michael W.; Bagian, Tandi M.; Bowman, Mark A.; Levitan, Barry M.
1987-01-01
Results are presented for a number of life sciences investigations sponsored by the Space Biomedical Research Institute at the NASA Lyndon B. Johnson Space Center and conducted as Detailed Supplementary Objectives (DSOs) on Space Shuttle flights between 1981 and 1986. An introduction and a description of the DSO program are followed by summary reports on the investigations. Reports are grouped into the following disciplines: Biochemistry and Pharmacology, Cardiovascular Effects and Fluid Shifts, Equipment Testing and Experiment Verification, Microbiology, Space Motion Sickness, and Vision. In the appendix, the status of every medical/life science DSO is presented in graphical form, which enables the flight history, the number of subjects tested, and the experiment results to be reviewed at a glance.
A new phase-correlation-based iris matching for degraded images.
Krichen, Emine; Garcia-Salicetti, Sonia; Dorizzi, Bernadette
2009-08-01
In this paper, we present a new phase-correlation-based iris matching approach in order to deal with degradations in iris images due to unconstrained acquisition procedures. Our matching system is a fusion of global and local Gabor phase-correlation schemes. The main originality of our local approach is that we do not only consider the correlation peak amplitudes but also their locations in different regions of the images. Results on several degraded databases, namely, the CASIA-BIOSECURE and Iris Challenge Evaluation 2005 databases, show the improvement of our method compared to two available reference systems, Masek and Open Source for Iris (OSRIS), in verification mode.
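The underlying phase-only correlation operation can be sketched compactly: normalize the cross-power spectrum to unit magnitude so that only phase survives, then locate the correlation peak. The paper's fusion of global and local Gabor schemes, and its use of peak locations per region, is not reproduced here:

```python
import numpy as np

# Hedged sketch of basic phase-only correlation (POC), the building block
# the paper's matcher refines. Random textures stand in for iris images.

def phase_correlation(a, b):
    """Return (peak value, peak location) of the POC of images a and b."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    poc = np.real(np.fft.ifft2(cross))
    idx = np.unravel_index(np.argmax(poc), poc.shape)
    return poc[idx], idx

rng = np.random.default_rng(3)
iris = rng.random((64, 64))
shifted = np.roll(iris, (3, 5), axis=(0, 1))   # same texture, misaligned
other = rng.random((64, 64))                   # different texture

print(phase_correlation(shifted, iris))        # peak ~1.0 at shift (3, 5)
print(phase_correlation(other, iris)[0])       # much smaller peak
```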
NASA Technical Reports Server (NTRS)
Disimile, Peter J.; Heist, Timothy J.
1990-01-01
The fluid behavior in normal gravity of a single phase gas system and a two phase gas/liquid system in an enclosed circular cylinder heated suddenly and nonuniformly from above was investigated. Flow visualization was used to obtain qualitative data on both systems. The use of thermochromatic liquid crystal particles as liquid phase flow tracers was evaluated as a possible means of simultaneously gathering both flow pattern and temperature gradient data for the two phase system. The results of the flow visualization experiments performed on both systems can be used to gain a better understanding of the behavior of such systems in a reduced gravity environment and aid in the verification of a numerical model of the system.
SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Jinfeng; Cao, Ruifen; Dai, Yumei
Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model in the DPM code, extending the ability of DPM to handle arbitrary incident angles and irregular, inhomogeneous fields. Methods: The dose distribution of an irregular, inhomogeneous irradiation field was calculated using a virtual source and an energy spectrum unfolded from accelerator measurement data, combined with optimized intensity maps. The irradiation source model of the accelerator was replaced by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of each emitter was set by the grid intensity, and its direction by the combination of the virtual source position and the emitter position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating for the contaminating electron source. For verification, measured data and a realistic clinical IMRT plan were compared with DPM dose calculations. Results: The regular field was verified against measured data; the differences were acceptable (<2% inside the field, 2-3 mm in the penumbra). The dose calculation of an irregular field by DPM simulation was also compared with that of FSPB (Finite Size Pencil Beam), and the passing rate of the gamma analysis was 95.1% for a peripheral lung cancer case. Both the regular field and the irregular rotational field were within the permitted error range. The computing time for regular fields was less than 2 h, and for the peripheral lung cancer test 160 min. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted, parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and speed satisfy clinical requirements, and it is a promising Monte Carlo dose verification tool for IMRT plans. Funding: Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03040000); National Natural Science Foundation of China (81101132).
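The quoted 95.1% figure comes from a gamma analysis. A minimal 1D sketch of the gamma passing-rate computation under a generic 3%/3 mm global criterion (synthetic profiles, not the paper's dose data; the paper does not state its criterion):

```python
import numpy as np

# Hedged sketch of a gamma analysis in 1D with a generic 3%/3 mm global
# criterion. Dose profiles are synthetic placeholders.

def gamma_pass_rate(ref, evl, x, dose_tol=0.03, dist_tol_mm=3.0):
    """Fraction of evaluated points with gamma index <= 1."""
    d_max = ref.max()                            # global dose normalization
    passed = 0
    for xi, di in zip(x, evl):
        dd = (di - ref) / (dose_tol * d_max)     # dose-difference term
        dx = (xi - x) / dist_tol_mm              # distance-to-agreement term
        passed += np.sqrt(dd ** 2 + dx ** 2).min() <= 1.0
    return passed / len(x)

x = np.linspace(-50, 50, 201)                    # mm
ref = np.exp(-(x / 20.0) ** 2)                   # reference profile
evl = 1.02 * np.exp(-((x - 1.0) / 20.0) ** 2)    # 2% hotter, 1 mm shifted
print(f"gamma passing rate: {100 * gamma_pass_rate(ref, evl, x):.1f}%")
```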
Kuc, Roman
2018-04-01
This paper describes phase-sensitive and phase-insensitive processing of monaural echolocation waveforms to generate target maps. Composite waveforms containing both the emission and echoes are processed to estimate the target impulse response using an audible sonar. Phase-sensitive processing yields the composite signal envelope, while phase-insensitive processing, which starts with the composite waveform power spectrum, yields the envelope of the autocorrelation function. Analysis and experimental verification show that multiple echoes form an autocorrelation function that produces near-range phantom-reflector artifacts. These artifacts interfere with true target echoes when the first true echo occurs at a time that is less than the total duration of the target echoes. An initial comparison of phase-sensitive and phase-insensitive maps shows that both display important target features, suggesting that phase is not vital. A closer comparison illustrates the improved resolution of phase-sensitive processing, the near-range phantom reflectors produced by phase-insensitive processing, and echo-interference and multiple-reflection artifacts that were independent of the processing.
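The phantom-reflector mechanism is easy to reproduce numerically: the autocorrelation of a waveform with echoes at 3 and 4 ms acquires an extra peak at their 1 ms difference. A sketch under assumed pulse and sampling parameters (not the paper's sonar hardware):

```python
import numpy as np
from scipy.signal import hilbert

# Hedged sketch of the two processing paths on a synthetic composite
# waveform (emission + two echoes). Phase-sensitive processing keeps the
# signal envelope; phase-insensitive processing starts from the power
# spectrum and so yields the envelope of the autocorrelation, whose
# echo-echo cross term at the 1 ms *difference* of the echo delays is the
# near-range phantom reflector. All parameters are assumed.

fs, fc = 100_000, 40_000                        # sample rate, carrier (Hz)
t = np.arange(0, 0.01, 1 / fs)                  # 10 ms record
pulse = np.sin(2 * np.pi * fc * np.arange(80) / fs) * np.hanning(80)

sig = np.zeros_like(t)
for delay_s, amp in [(0.0, 1.0), (0.003, 0.5), (0.004, 0.4)]:
    i = int(delay_s * fs)
    sig[i:i + pulse.size] += amp * pulse

env_sensitive = np.abs(hilbert(sig))            # energy at 0, 3, 4 ms only
acf = np.fft.irfft(np.abs(np.fft.rfft(sig)) ** 2)
half = t.size // 2
env_insensitive = np.abs(hilbert(acf))[:half]   # positive lags only

def report(name, env, times, thresh=0.1):
    mask = env > thresh * env.max()
    print(name, "energy near (ms):", sorted(set(np.round(times[mask] * 1e3).astype(int))))

report("phase-sensitive  ", env_sensitive, t)            # 0, 3, 4 ms regions
report("phase-insensitive", env_insensitive, t[:half])   # adds a 1 ms phantom
```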
WFIRST: Update on the Coronagraph Science Requirements
NASA Astrophysics Data System (ADS)
Douglas, Ewan S.; Cahoy, Kerri; Carlton, Ashley; Macintosh, Bruce; Turnbull, Margaret; Kasdin, Jeremy; WFIRST Coronagraph Science Investigation Teams
2018-01-01
The WFIRST Coronagraph instrument (CGI) will enable direct imaging and low resolution spectroscopy of exoplanets in reflected light and imaging polarimetry of circumstellar disks. The CGI science investigation teams were tasked with developing a set of science requirements which advance our knowledge of exoplanet occurrence and atmospheric composition, as well as the composition and morphology of exozodiacal debris disks, cold Kuiper Belt analogs, and protoplanetary systems. We present the initial content, rationales, validation, and verification plans for the WFIRST CGI, informed by detailed and still-evolving instrument and observatory performance models. We also discuss our approach to the requirements development and management process, including the collection and organization of science inputs, open source approach to managing the requirements database, and the range of models used for requirements validation. These tools can be applied to requirements development processes for other astrophysical space missions, and may ease their management and maintenance. These WFIRST CGI science requirements allow the community to learn about and provide insights and feedback on the expected instrument performance and science return.
NASA Technical Reports Server (NTRS)
Johnston, Shaida
2004-01-01
The term verification implies compliance verification in the language of treaty negotiation and implementation, particularly in the fields of disarmament and arms control. The term monitoring, on the other hand, has in both environmental and arms control treaties a much broader interpretation, which allows for the use of supporting data sources that are not necessarily acceptable or adequate for direct verification. There are many ways that satellite Earth observation (EO) data can support international environmental agreements, from national forest inventories to use in geographic information system (GIS) tools. Though only a few references to satellite EO data and their use exist in the treaties themselves, an expanding list of applications can be considered in support of multilateral environmental agreements (MEAs). This paper explores the current uses of satellite Earth observation data in support of the monitoring activities of major environmental treaties and draws conclusions about future missions and their data use. The scope of the study includes all phases of environmental treaty fulfillment - development, monitoring, and enforcement - and includes a multinational perspective on the use of satellite Earth observation data for treaty support.
NASA Astrophysics Data System (ADS)
Monteiro, Anna Karina
Research acknowledges that if students are to be successful in science, they must learn to navigate and cross the cultural borders that exist between their own cultures and the subculture of science. This dissertation utilized a mixed-methods approach to explore how inservice science teachers working in urban schools construct their ideas of, and apply the concepts about, the culture of science and cultural border crossing as relevant to the teaching and learning of science. The study used the lenses of cultural capital, social constructivism, and cultural congruency in the design and analysis of each of the three phases of data collection. Phase I identified the perspectives of six inservice science teachers on science culture, cultural border crossing, and which border crossing methods, if any, they used during science teaching. Phase II took a dialectical approach as the teachers read about science culture and cultural border crossing during three informal professional learning community meetings. This phase explored how teachers constructed their understanding of cultural border crossing and how the concept applied to the teaching and learning of science. Phase III evaluated how teachers' perspectives changed from Phase I. In addition, classroom observations were used to determine whether teachers' practices in their science classrooms changed from Phase I to Phase III. All three phases collected data through qualitative (interviews, classroom observations, and surveys) and quantitative (Likert items) means. The findings indicated that teachers found great value in learning about the culture of science and cultural border crossing as these pertained to their teaching methods. This was evidenced not only by their interviews and surveys but also by the methods they used in their classrooms. Final conclusions included how the use of student capital resources (prior experiences, understandings and knowledge, ideas and interests, and personal beliefs), if supported by science practices and skills, increases student cultural capital. With greater cultural capital, students experience cultural congruency between their cultures and the culture of science, enabling them to cross such borders in the science classroom. The implications of such findings for teacher training programs and professional development are discussed.
Construction Status and Early Science with the Daniel K. Inouye Solar Telescope
NASA Astrophysics Data System (ADS)
McMullin, Joseph P.; Rimmele, Thomas R.; Warner, Mark; Martinez Pillet, Valentin; Craig, Simon; Woeger, Friedrich; Tritschler, Alexandra; Berukoff, Steven J.; Casini, Roberto; Goode, Philip R.; Knoelker, Michael; Kuhn, Jeffrey Richard; Lin, Haosheng; Mathioudakis, Mihalis; Reardon, Kevin P.; Rosner, Robert; Schmidt, Wolfgang
2016-05-01
The 4-m Daniel K. Inouye Solar Telescope (DKIST) is in its seventh year of overall development and its fourth year of site construction on the summit of Haleakala, Maui. The Site Facilities (Utility Building and Support & Operations Building) are in place, with construction of the Telescope Mount Assembly ongoing within them. Off-site, fabrication of the component systems is nearing completion, with early integration testing and verification starting. Once complete, this facility will provide the highest sensitivity and resolution for the study of solar magnetism and the drivers of key processes impacting Earth (solar wind, flares, coronal mass ejections, and variability in solar output). The DKIST will initially be equipped with a battery of first-light instruments covering a spectral range from the UV (380 nm) to the near IR (5000 nm) and capable of providing both imaging and spectro-polarimetric measurements throughout the solar atmosphere (photosphere, chromosphere, and corona); these instruments are being developed by the National Solar Observatory (Visible Broadband Imager), the High Altitude Observatory (Visible Spectro-Polarimeter), the Kiepenheuer Institute (Visible Tunable Filter), and the University of Hawaii (Cryogenic Near-Infrared Spectro-Polarimeter and the Diffraction-Limited Near-Infrared Spectro-Polarimeter). Further, a United Kingdom consortium led by Queen's University Belfast is driving the development of the high-speed cameras essential for capturing the highly dynamic processes measured by these instruments. Finally, a state-of-the-art adaptive optics system will support diffraction-limited imaging capable of resolving features approximately 20 km in scale on the Sun. We present the overall status of the construction phase and its current challenges, along with a review of the planned science testing and the transition into early science operations.
FPGA-based real-time phase measuring profilometry algorithm design and implementation
NASA Astrophysics Data System (ADS)
Zhan, Guomin; Tang, Hongwei; Zhong, Kai; Li, Zhongwei; Shi, Yusheng
2016-11-01
Phase measuring profilometry (PMP) has been widely used in many fields, such as Computer Aided Verification (CAV) and Flexible Manufacturing Systems (FMS). High frame-rate (HFR), real-time, vision-based feedback control will be a common demand in the near future. However, the instruction time delay in a computer caused by numerous repetitive operations greatly limits the efficiency of data processing. An FPGA has the advantages of a pipeline architecture and parallel execution, and it is well suited to the PMP algorithm. In this paper, we design a fully pipelined hardware architecture for PMP. The functions of the hardware architecture include rectification, phase calculation, phase shifting, and stereo matching. Experiments verified the performance of this method, and the factors that may influence the computational accuracy were analyzed.
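The core per-pixel operation such a pipeline implements is the wrapped-phase computation from phase-shifted fringe images. A sketch of the standard 4-step formula on synthetic fringes (the paper's FPGA details, rectification, and stereo matching are not reproduced):

```python
import numpy as np

# Hedged sketch of the per-pixel wrapped-phase computation at the heart of
# 4-step PMP, the operation the FPGA architecture pipelines. Fringe images
# are synthesized here; in the real system each I_k is a captured frame.

H, W = 4, 6
phi_true = np.linspace(0, np.pi, H * W).reshape(H, W)    # stand-in surface phase

A, B = 0.5, 0.4                                          # background, modulation
I = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]

# Standard 4-step formula: phi = atan2(I4 - I2, I1 - I3)
phi = np.arctan2(I[3] - I[1], I[0] - I[2])

print(np.allclose(np.mod(phi, 2 * np.pi), np.mod(phi_true, 2 * np.pi)))  # True
```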
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis: RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence this approach adds no instrumentation overhead. While post-mortem log analysis precludes the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
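To give a flavor of what such rule-based log monitoring does, here is a toy monitor, in plain Python rather than the actual RuleR or LogScope notation, for the invented property "every dispatched command eventually completes"; the event names and log are illustrative:

```python
# A toy rule-based trace monitor in the spirit of RuleR/LogScope (the real
# tools' syntax and semantics differ; this is only an illustrative sketch).
# Property checked: every dispatched command must eventually complete.

def monitor(trace):
    pending = set()          # commands dispatched but not yet completed
    errors = []
    for i, (event, cmd) in enumerate(trace):
        if event == "dispatch":
            pending.add(cmd)
        elif event == "complete":
            if cmd not in pending:
                errors.append(f"step {i}: complete({cmd}) without dispatch")
            pending.discard(cmd)
    errors.extend(f"end of log: dispatch({c}) never completed"
                  for c in sorted(pending))
    return errors

log = [("dispatch", "PICTURE"), ("complete", "PICTURE"), ("dispatch", "DRIVE")]
print(monitor(log))   # -> ['end of log: dispatch(DRIVE) never completed']
```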
NASA Astrophysics Data System (ADS)
Akers, James C.; Passe, Paul J.; Cooper, Beth A.
2005-09-01
The Acoustical Testing Laboratory (ATL) at the NASA John H. Glenn Research Center (GRC) in Cleveland, OH, provides acoustic emission testing and noise control engineering services for a variety of specialized customers, particularly developers of equipment and science experiments manifested for NASA's manned space missions. The ATL's primary customer has been the Fluids and Combustion Facility (FCF), a multirack microgravity research facility being developed at GRC for the USA Laboratory Module of the International Space Station (ISS). Since opening in September 2000, ATL has conducted acoustic emission testing of components, subassemblies, and partially populated FCF engineering model racks. The culmination of this effort has been the acoustic emission verification tests on the FCF Combustion Integrated Rack (CIR) and Fluids Integrated Rack (FIR), employing a procedure that incorporates ISO 11201 ("Acoustics - Noise emitted by machinery and equipment - Measurement of emission sound pressure levels at a work station and at other specified positions - Engineering method in an essentially free field over a reflecting plane"). This paper will provide an overview of the test methodology, software, and hardware developed to perform the acoustic emission verification tests on the CIR and FIR flight racks, and lessons learned from these tests.
Automated Array Assembly, Phase 2
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1979-01-01
The Automated Array Assembly Task, Phase 2 of the Low Cost Silicon Solar Array Project, is a process development task. The contract provides for the fabrication of modules from large-area tandem junction cells (TJC). During this quarter, effort was focused on the design of a large-area, approximately 36 sq cm, TJC and on process verification runs. The large-area TJC design was optimized for minimum I²R power losses. In the tandem junction module (TJM) activity, the cell-module interfaces were defined, module substrates were formed and heat treated, and clad metal interconnect strips were fabricated.
1982-11-01
...ment, source selection, design reviews, audits, validation, verification (of computer programs), testing, and acceptance ... forwarded to HQ USAF/RDM. ... Development phases of the system acquisition in order to prevent duplication. (7) Test planning during the production and post-deployment phase will be designed ... response to AIRTASKS will be identified in the SLCL to prevent duplication and permit dissemination of the total information available concerning the ...
A High Power Density Single-Phase PWM Rectifier with Active Ripple Energy Storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, Puqi; Wang, Ruxi; Wang, Fei
It is well known that a second-order harmonic current, and a corresponding ripple voltage, exist on the dc bus of single-phase PWM rectifiers. The low-frequency harmonic current is normally filtered using a bulk capacitor on the bus, which results in low power density. This paper proposes an active ripple energy storage method that can effectively reduce the required energy storage capacitance. The feed-forward control method and design considerations are provided. Simulation and 15 kW experimental results are provided for verification purposes.
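The capacitance saving targeted by such active buffering follows from a second-order ripple-energy balance. A back-of-envelope sketch is below; every numeric value except the 15 kW rating is an illustrative assumption, not a figure from the paper:

```python
import math

# Back-of-envelope sizing of the dc-bus ripple problem the paper addresses.
P = 15e3                 # rectifier power, W (matches the 15 kW prototype)
f_line = 60.0            # line frequency, Hz (assumed)
w = 2 * math.pi * f_line

# Single-phase input power pulsates at 2*f_line, so the bus must buffer a
# ripple energy swing of E = P / w per half line cycle.
E = P / w                # ~39.8 J

# Passive option: bulk bus capacitor with a small allowed voltage ripple.
V_dc, dV = 400.0, 20.0   # bus voltage, allowed peak-to-peak ripple (assumed)
C_bulk = P / (w * V_dc * dV)               # ~5.0 mF

# Active option: auxiliary capacitor swung over a wide voltage range.
V_max, V_min = 350.0, 100.0                # assumed auxiliary-cap swing
C_active = 2 * E / (V_max**2 - V_min**2)   # ~0.7 mF

print(f"ripple energy:         {E:.1f} J")
print(f"passive bus capacitor: {C_bulk*1e3:.1f} mF")
print(f"active storage cap:    {C_active*1e3:.2f} mF")
```

The order-of-magnitude reduction in capacitance comes from letting the auxiliary capacitor's voltage swing widely, so it stores the same ripple energy with far less capacitance than a tightly regulated bus capacitor.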
Modeling the Water Balloon Slingshot
NASA Astrophysics Data System (ADS)
Bousquet, Benjamin D.; Figura, Charles C.
2013-01-01
In the introductory physics courses at Wartburg College, we have been working to create a lab experience focused on the scientific process itself rather than verification of physical laws presented in the classroom or textbook. To this end, we have developed a number of open-ended modeling exercises suitable for a variety of learning environments, from non-science major classes to algebra-based and calculus-based introductory physics classes.
76 FR 51147 - Medicaid Program; Eligibility Changes Under the Affordable Care Act of 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... elect to phase-in'' coverage for this optional group ``based on the categorical group (including non... (Sec. 435.603(f)(2)) (2) Household Composition for Non-Filers (Sec. 435.603(f)(3)) (3) Retention of... (Sec. 435.952) 5. Verification of Other Non-Financial Information (Sec. 435.956) H. Periodic...
Development Status for the Stennis Space Center LIDAR Product Characterization Range
NASA Technical Reports Server (NTRS)
Zanoni, Vicki; Berglund, Judith; Ross, Kenton
2004-01-01
The presentation describes efforts to develop a LIDAR in-flight product characterization range at Stennis Space Center as the next phase of the NASA Verification and Validation activities. It describes the status of surveying efforts on targets of interest to LIDAR vendors as well as the potential guidelines that will be used for product characterization.
The purpose of this SOP is to ensure suitable temperature maintenance of freezers used for storage of samples. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: freezers; operation.
The National H...
The purpose of this SOP is to assure suitable temperature maintenance in refrigerators and freezers used for sample storage during the Arizona NHEXAS project and the "Border" study. Keywords: lab; equipment; refrigerators and freezers.
The National Human Exposure Assessment Su...
The purpose of this SOP is to provide a standard method for the "first stage" of cleaning data. The first cleaning stage takes place after data verification and before master database appendage. This procedure applies to (1) post-keypunch data collected by the NHEXAS Arizona st...
VizieR Online Data Catalog: 2 new candidate luminous blue variables (Gvaramadze+ 2012)
NASA Astrophysics Data System (ADS)
Gvaramadze, V. V.; Kniazev, A. Y.; Miroshnichenko, A. S.; Berdnikov, L. N.; Langer, N.; Stringfellow, G. S.; Todt, H.; Hamann, W.-R.; Grebel, E. K.; Buckley, D.; Crause, L.; Crawford, S.; Gulbis, A.; Hettlage, C.; Hooper, E.; Husser, T.-O.; Kotze, P.; Loaring, N.; Nordsieck, K. H.; O'Donoghue, D.; Pickering, T.; Potter, S.; Romero Colmenero, E.; Vaisanen, P.; Williams, T.; Wolf, M.; Reichart, D. E.; Ivarsen, K. M.; Haislip, J. B.; Nysewander, M. C.; Lacluyze, A. P.
2013-03-01
Two circular shells, which are the main subject of this paper, were discovered using the WISE 22-um archival data. Spectral observations of WS1 and WS2 were conducted with SALT on 2011 June 12 and 13 during the performance verification phase of the Robert Stobie Spectrograph. (3 data files).
A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid pha...
This treatability study reports on the results of one of a series of field trials using various remedial action technologies that may be capable of restoring Herbicide Orange (HO)XDioxin contaminated sites. A full-scale field trial using a rotary kiln incinerator capable of pro...
Park, Yang-Kyun; Son, Tae-geun; Kim, Hwiyoung; Lee, Jaegi; Sung, Wonmo; Kim, Il Han; Lee, Kunwoo; Bang, Young-bong; Ye, Sung-Joon
2013-09-06
Phase-based respiratory-gated radiotherapy relies on the reproducibility of patient breathing during the treatment. To monitor the positional reproducibility of patient breathing against a 4D CT simulation, we developed a real-time motion verification system (RMVS) using optical tracking technology. The system in the treatment room was integrated with a real-time position management (RPM) system. To test the system, an anthropomorphic phantom mounted on a motion platform was driven with a programmed breathing pattern and then underwent a 4D CT simulation with RPM. The phase-resolved anterior surface lines were extracted from the 4D CT data to constitute 4D reference lines. In the treatment room, three infrared reflective markers were attached to the superior, middle, and inferior parts of the phantom along the body midline, and RMVS tracked those markers using an optical camera system. The real-time phase information extracted from RPM was delivered to RMVS via in-house network software. Thus, the real-time anterior-posterior positions of the markers were simultaneously compared with the 4D reference lines. The technical feasibility of RMVS was evaluated by repeating the above procedure under several scenarios, such as the ideal case (identical motion parameters between simulation and treatment), cycle change, baseline shift, displacement change, and breathing type changes (abdominal or chest breathing). The system's capability to operate under irregular breathing was also investigated using real patient data. The evaluation results showed that RMVS is able to detect phase-matching errors between the patient's motion during treatment and the 4D CT simulation. Thus, we concluded that RMVS could be used as an online quality assurance tool for phase-based gating treatments.
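The core online check described above, comparing a tracked marker position against the phase-resolved 4D reference line, reduces to a cyclic interpolation plus a tolerance test. A minimal sketch follows, with invented reference data and a hypothetical tolerance; it is not the RMVS implementation:

```python
import numpy as np

# Illustrative sketch of the comparison RMVS is described as performing:
# the reference anterior-posterior (AP) marker position is tabulated per
# respiratory phase from 4D CT, then compared online against the optically
# tracked position at the current phase. All data values are invented.

ref_phase = np.linspace(0.0, 1.0, 10, endpoint=False)   # 10 phase bins
ref_ap_mm = 5.0 * np.sin(2 * np.pi * ref_phase)         # reference AP line

def phase_matching_error(phase_now, ap_now_mm, tol_mm=3.0):
    """Interpolate the reference AP position at the current phase (cyclic)
    and flag a mismatch beyond a tolerance."""
    ap_ref = np.interp(phase_now, ref_phase, ref_ap_mm, period=1.0)
    error = ap_now_mm - ap_ref
    return error, abs(error) > tol_mm

err, flagged = phase_matching_error(0.25, 8.9)
print(f"error = {err:+.1f} mm, gating alarm = {flagged}")
```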
A year after lift-off, XMM-Newton is impressing the X-ray astronomy community
NASA Astrophysics Data System (ADS)
2000-11-01
XMM-Newton was launched from Kourou on 10 December 1999 on the first Ariane-5 commercial flight. After in-orbit commissioning of the spacecraft, and calibration and performance verification of its science instruments, the observatory entered its routine operations phase on 1 July. At the press conference, ESA's Director of Science Prof. Roger-Maurice Bonnet and XMM-Newton Project Scientist Fred Jansen will present some of the many scientific results from the first eight months of the mission. Also present will be two of Europe's foremost X-ray astronomers, Prof. Johan Bleeker of the Space Research Organisation of the Netherlands, and Prof. Guenther Hasinger of the Astrophysikalisches Institut Potsdam, Germany. Amongst the topics to be illustrated with some remarkably vivid "colour" images of the X-ray Universe will be XMM-Newton's first examination of a cataclysmic binary star, its first insights into some enigmatic black hole systems, analysis of the morphology of a few supernova remnants, and evidence it has collected to end the long-standing mystery over X-ray cosmic background emission... The press conference will also recap on the spacecraft's operations, the performance of its science instruments, the issue of radiation constraints and future aspects of the mission. Media representatives wishing to attend the press event are kindly invited to complete the attached reply form and fax it back to ESA Media Relations Office +33(0)1.53.69.7690. Note to editors XMM-Newton is ESA's second Cornerstone Mission of the Horizon 2000 programme. The spacecraft was built by a European consortium of companies led by Astrium (formerly Dornier Satellitensysteme), Friedrichshafen, Germany. Its X-ray imaging and spectrographic instruments (EPIC and RGS) and its optical telescope (OM) were provided by large consortia, whose principal investigators are from, respectively, the University of Leicester, UK, SRON University of Utrecht Netherlands, and the Mullard Space Science Laboratory, UK.
The NIRCam Optical Telescope Simulator (NOTES)
NASA Technical Reports Server (NTRS)
Kubalak, David; Hakun, Claef; Greeley, Bradford; Eichorn, William; Leviton, Douglas; Guishard, Corina; Gong, Qian; Warner, Thomas; Bugby, David; Robinson, Frederick;
2007-01-01
The Near Infra-Red Camera (NIRCam), the 0.6-5.0 micron imager and wavefront sensing instrument for the James Webb Space Telescope (JWST), will be used on orbit both as a science instrument and to tune the alignment of the telescope. The NIRCam Optical Telescope Element Simulator (NOTES) will be used during ground testing to provide an external stimulus to verify the wavefront error, imaging characteristics, and wavefront sensing performance of this crucial instrument. NOTES is being designed and built by NASA Goddard Space Flight Center with the help of Swales Aerospace and Orbital Sciences Corporation. It is a single-point imaging system that uses an elliptical mirror to form an f/20 image of a point source. The point source will be fed via optical fibers from outside the vacuum chamber. A tip/tilt mirror is used to change the chief ray angle of the beam as it passes through the aperture stop and thus steer the image over NIRCam's field of view without moving the pupil or introducing field aberrations. Interchangeable aperture stop elements allow us to simulate perfect JWST wavefronts for wavefront error testing, or to introduce transmissive phase plates that simulate a misaligned JWST segmented mirror for wavefront sensing verification. NOTES will be maintained at an operating temperature of 80 K during testing using thermal switches, allowing it to operate within the same test chamber as the NIRCam instrument. We discuss NOTES' current design status and on-going development activities.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Phase-synchroniser based on gm-C all-pass filter chain with sliding mode control
NASA Astrophysics Data System (ADS)
Mitić, Darko B.; Jovanović, Goran S.; Stojčev, Mile K.; Antić, Dragan S.
2015-03-01
Phase-synchronisers have many applications in VLSI circuit design. They are used in CMOS RF circuits including phase (de)modulators, phase recovery circuits, multiphase synthesis, etc. In this article, a phase-synchroniser based on a gm-C all-pass filter chain with sliding mode control is presented. The filter chain provides good controllable delay characteristics over the full range of phase and frequency regulation, without deterioration of the input signal amplitude and waveform, while the sliding mode control achieves a fast and predetermined finite locking time. IHP 0.25 µm SiGe BiCMOS technology has been used in the design and verification processes. The circuit operates in the frequency range from 33 MHz up to 150 MHz. Simulation results indicate that a very fast synchronisation time can be achieved: approximately four periods of the input signal during normal operation, and 20 periods during power-on.
Hagen, C K; Diemoz, P C; Endrizzi, M; Rigon, L; Dreossi, D; Arfelli, F; Lopez, F C M; Longo, R; Olivo, A
2014-04-07
X-ray phase contrast imaging (XPCi) methods are sensitive to phase in addition to attenuation effects and, therefore, can achieve improved image contrast for weakly attenuating materials, such as often encountered in biomedical applications. Several XPCi methods exist, most of which have already been implemented in computed tomographic (CT) modality, thus allowing volumetric imaging. The Edge Illumination (EI) XPCi method had, until now, not been implemented as a CT modality. This article provides indications that quantitative 3D maps of an object's phase and attenuation can be reconstructed from EI XPCi measurements. Moreover, a theory for the reconstruction of combined phase and attenuation maps is presented. Both reconstruction strategies find applications in tissue characterisation and the identification of faint, weakly attenuating details. Experimental results for wires of known materials and for a biological object validate the theory and confirm the superiority of the phase over conventional, attenuation-based image contrast.
Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K
2009-07-03
This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C18 stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm, and the ¹H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. The reported standard deviation for the quantification relates to the UV detector, which showed relative standard deviations (RSDs) within ±1.1%, while the NMR detector had a lower limit of detection of 16 μg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged, and efficient.
NASA Astrophysics Data System (ADS)
McIntyre, Gregory; Neureuther, Andrew; Slonaker, Steve; Vellanki, Venu; Reynolds, Patrick
2006-03-01
The initial experimental verification of a polarization monitoring technique is presented. A series of phase shifting mask patterns produce polarization dependent signals in photoresist and are capable of monitoring the Stokes parameters of any arbitrary illumination scheme. Experiments on two test reticles have been conducted. The first reticle consisted of a series of radial phase gratings (RPG) and employed special apertures to select particular illumination angles. Measurement sensitivities of about 0.3 percent of the clear field per percent change in polarization state were observed. The second test reticle employed the more sensitive proximity effect polarization analyzers (PEPA), a more robust experimental setup, and a backside pinhole layer for illumination angle selection and to enable characterization of the full illuminator. Despite an initial complication with the backside pinhole alignment, the results correlate with theory. Theory suggests that, once the pinhole alignment is corrected in the near future, the second reticle should achieve a measurement sensitivity of about 1 percent of the clear field per percent change in polarization state. This corresponds to a measurement of the Stokes parameters after test mask calibration, to within about 0.02 to 0.03. Various potential improvements to the design, fabrication of the mask, and experimental setup are discussed. Additionally, to decrease measurement time, a design modification and double exposure technique is proposed to enable electrical detection of the measurement signal.
Jeffrey, N.; Abdalla, F. B.; Lahav, O.; ...
2018-05-15
Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three different mass map reconstruction methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion method, taking no account of survey masks or noise. The Wiener filter is well motivated for Gaussian density fields in a Bayesian framework. The GLIMPSE method uses sparsity, with the aim of reconstructing non-linearities in the density field. We compare these methods with a series of tests on the public Dark Energy Survey (DES) Science Verification (SV) data and on realistic DES simulations. The Wiener filter and GLIMPSE methods offer substantial improvement on the standard smoothed KS with a range of metrics. For both the Wiener filter and GLIMPSE convergence reconstructions we present a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations. This is a standard data vector when inferring cosmology from peak statistics. The maximum signal-to-noise value of these peak statistic data vectors was increased by a factor of 3.5 for the Wiener filter and by a factor of 9 using GLIMPSE. With simulations we measure the reconstruction of the harmonic phases, showing that the concentration of the phase residuals is improved 17% by GLIMPSE and 18% by the Wiener filter. We show that the correlation between the reconstructions from data and the foreground redMaPPer clusters is increased 18% by the Wiener filter and 32% by GLIMPSE. [Abridged]
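Of the three methods, the direct KS inversion is simple enough to sketch in a few lines. The flat-sky NumPy version below ignores masks and noise, which is precisely the limitation the comparison probes; the grid conventions are illustrative assumptions:

```python
import numpy as np

# Minimal flat-sky Kaiser-Squires inversion sketch (assumes a periodic
# grid with no mask or noise treatment, unlike a production pipeline).
def kaiser_squires(g1, g2):
    """Convergence map kappa from shear maps (g1, g2) on a square grid."""
    ny, nx = g1.shape
    l1 = np.fft.fftfreq(nx)[None, :]
    l2 = np.fft.fftfreq(ny)[:, None]
    l_sq = l1**2 + l2**2
    l_sq[0, 0] = 1.0                      # avoid division by zero at l = 0
    g_hat = np.fft.fft2(g1 + 1j * g2)
    # KS kernel: kappa_hat = (l1^2 - l2^2 - 2i*l1*l2) / l^2 * gamma_hat
    kappa_hat = (l1**2 - l2**2 - 2j * l1 * l2) / l_sq * g_hat
    kappa_hat[0, 0] = 0.0                 # mean convergence is unconstrained
    return np.fft.ifft2(kappa_hat).real   # E-mode; imag part is B-mode
```

Because the KS kernel is a pure phase in Fourier space, masked or noisy pixels leak directly into the reconstruction, which is why the Wiener filter and sparsity-regularized GLIMPSE outperform it on real survey footprints.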
NASA Technical Reports Server (NTRS)
Williams, David E.
2008-01-01
The International Space Station (ISS) Node 1 Environmental Control and Life Support (ECLS) System is comprised of five subsystems: Atmosphere Control and Supply (ACS), Atmosphere Revitalization (AR), Fire Detection and Suppression (FDS), Temperature and Humidity Control (THC), and Water Recovery and Management (WRM). This paper provides a summary of the Node 1 Emergency Response capability, which includes nominal and off-nominal FDS operation, off-nominal ACS operation, and off-nominal THC operation. These subsystems provide the capability to aid the crew members during an emergency cabin depressurization, a toxic spill, or a fire. The paper will also provide a discussion of the detailed Node 1 ECLS Element Verification methodologies for the Node 1 Emergency Response hardware operations utilized during the Qualification phase.
A retrospective of the GREGOR solar telescope in scientific literature
NASA Astrophysics Data System (ADS)
Denker, C.; von der Lühe, O.; Feller, A.; Arlt, K.; Balthasar, H.; Bauer, S.-M.; Bello González, N.; Berkefeld, Th.; Caligari, P.; Collados, M.; Fischer, A.; Granzer, T.; Hahn, T.; Halbgewachs, C.; Heidecke, F.; Hofmann, A.; Kentischer, T.; Klvaňa, M.; Kneer, F.; Lagg, A.; Nicklas, H.; Popow, E.; Puschmann, K. G.; Rendtel, J.; Schmidt, D.; Schmidt, W.; Sobotka, M.; Solanki, S. K.; Soltau, D.; Staude, J.; Strassmeier, K. G.; Volkmer, R.; Waldmann, T.; Wiehr, E.; Wittmann, A. D.; Woche, M.
2012-11-01
In this review, we look back upon the literature, which had the GREGOR solar telescope project as its subject including science cases, telescope subsystems, and post-focus instruments. The articles date back to the year 2000, when the initial concepts for a new solar telescope on Tenerife were first presented at scientific meetings. This comprehensive bibliography contains literature until the year 2012, i.e., the final stages of commissioning and science verification. Taking stock of the various publications in peer-reviewed journals and conference proceedings also provides the ``historical'' context for the reference articles in this special issue of Astronomische Nachrichten/Astronomical Notes.
U.S. Geological Survey Global Seismographic Network - Five-Year Plan 2006-2010
Leith, William S.; Gee, Lind S.; Hutt, Charles R.
2009-01-01
The Global Seismographic Network provides data for earthquake alerting, tsunami warning, nuclear treaty verification, and Earth science research. The system consists of nearly 150 permanent digital stations, distributed across the globe, connected by a modern telecommunications network. It serves as a multi-use scientific facility and societal resource for monitoring, research, and education, by providing nearly uniform, worldwide monitoring of the Earth. The network was developed and is operated through a partnership among the National Science Foundation (http://www.nsf.gov), the Incorporated Research Institutions for Seismology (http://www.iris.edu/hq/programs/gsn), and the U.S. Geological Survey (http://earthquake.usgs.gov/gsn).
Hydrogen and helium under high pressure - A case for a classical theory of dense matter
NASA Astrophysics Data System (ADS)
Celebonovic, Vladan
1989-06-01
When subject to high pressure, H2 and He-3 are expected to undergo phase transitions and to become metallic at a sufficiently high pressure. Using a semiclassical theory of dense matter proposed by Savic and Kasanin, calculations of phase transition and metallization pressures have been performed for these two materials. In hydrogen, metallization occurs at p(M) = (3.0 ± 0.2) Mbar, while for helium the corresponding value is (106 ± 1) Mbar. A phase transition occurs in helium at p(tr) = (10.0 ± 0.4) Mbar. These values are close to the results obtainable by more rigorous methods. Possibilities of experimental verification of the calculations are briefly discussed.
Thinking forensics: Cognitive science for forensic practitioners.
Edmond, Gary; Towler, Alice; Growns, Bethany; Ribeiro, Gianni; Found, Bryan; White, David; Ballantyne, Kaye; Searston, Rachel A; Thompson, Matthew B; Tangen, Jason M; Kemp, Richard I; Martire, Kristy
2017-03-01
Human factors and their implications for forensic science have attracted increasing levels of interest across criminal justice communities in recent years. Initial interest centred on cognitive biases, but has since expanded such that knowledge from psychology and cognitive science is slowly infiltrating forensic practices more broadly. This article highlights a series of important findings and insights of relevance to forensic practitioners. These include research on human perception, memory, context information, expertise, decision-making, communication, experience, verification, confidence, and feedback. The aim of this article is to sensitise forensic practitioners (and lawyers and judges) to a range of potentially significant issues, and encourage them to engage with research in these domains so that they may adapt procedures to improve performance, mitigate risks and reduce errors. Doing so will reduce the divide between forensic practitioners and research scientists as well as improve the value and utility of forensic science evidence.
The spectrum of (136199) Eris between 350 and 2350 nm: results with X-Shooter
NASA Astrophysics Data System (ADS)
Alvarez-Candal, A.; Pinilla-Alonso, N.; Licandro, J.; Cook, J.; Mason, E.; Roush, T.; Cruikshank, D.; Gourgeot, F.; Dotto, E.; Perna, D.
2011-08-01
Context. X-Shooter is the first second-generation instrument for the ESO-Very Large Telescope. It is a spectrograph covering the entire 300-2480 nm spectral range at once with a high resolving power. These properties enticed us to observe the well-known trans-Neptunian object (136199) Eris during the science verification of the instrument. The target has numerous absorption features in the optical and near-infrared domain that have been observed by different authors, showing differences in these features' positions and strengths. Aims: Besides testing the capabilities of X-Shooter to observe minor bodies, we attempt to constrain the existence of super-volatiles, e.g., CH4, CO and N2, and in particular we try to understand the physical-chemical state of the ices on Eris' surface. Methods: We observed Eris in the 300 - 2480 nm range and compared the newly obtained spectra with those available in the literature. We identified several absorption features, measured their positions and depth, and compare them with those of the reflectance of pure methane ice obtained from the optical constants of this ice at 30 K to study shifts in these features' positions and find a possible explanation for their origin. Results: We identify several absorption bands in the spectrum that are all consistent with the presence of CH4 ice. We do not identify bands related to N2 or CO. We measured the central wavelengths of the bands and compared to those measured in the spectrum of pure CH4 at 30 K finding variable spectral shifts. Conclusions: Based on these wavelength shifts, we confirm the presence of a dilution of CH4 in other ice on the surface of Eris and the presence of pure CH4 that is spatially segregated. The comparison of the centers and shapes of these bands with previous works suggests that the surface is heterogeneous. The absence of the 2160 nm band of N2 can be explained if the surface temperature is below 35.6 K, the transition temperature between the alpha and beta phases of this ice. Our results, including the reanalysis of data published elsewhere, point to a heterogeneous surface on Eris. Observations made during X-Shooter Science Verification, program 60.A-9400(A), PIs: Alvarez-Candal and Mason.
NASA Technical Reports Server (NTRS)
Kennedy, Brian; Abrahamson, Matt; Ardito, Alessandro; Han, Dongsuk; Haw, Robert; Mastrodemos, Nicholas; Nandi, Sumita; Park, Ryan; Rush, Brian; Vaughan, Andrew
2013-01-01
The Dawn spacecraft was launched on September 27th, 2007. Its mission is to consecutively rendezvous with and observe the two largest bodies in the asteroid belt, Vesta and Ceres. It has already completed over a year's worth of direct observations of Vesta (spanning from early 2011 through late 2012) and is currently on a cruise trajectory to Ceres, where it will begin scientific observations in mid-2015. Achieving this data collection required careful planning and execution from all spacecraft teams. Dawn's Orbit Determination (OD) team was tasked with accurately predicting the trajectory of the Dawn spacecraft during the Vesta science phases, and also determining the parameters of Vesta to support future science orbit design. The future orbits included the upcoming science phase orbits as well as the transfer orbits between science phases. In all, five science phases were executed at Vesta, and this paper will describe some of the OD team contributions to the planning and execution of those phases.
1984-09-01
"Verification Technique for a Class of Security Kernels," International Symposium on Programming, Lecture Notes in Computer Science 137, Springer-Verlag, New York. September 1984. MTR-9531, J. K. Millen and C. M. Cerniglia, Computer Security Models. Contract sponsor: OUSDRE/C3I & ESD. ABSTRACT: The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development
2007-10-28
Software Engineering, FASE, volume 3442 of Lecture Notes in Computer Science, pages 175-189. Springer, 2005. Andreas Bauer, Martin Leucker, and Jonathan ... Personnel receiving masters degrees: Markus Strohmeier, Gerrit Hanselmann, Jonathan Streit, Ernst Sassen (total number: 4). ... developed and documented mainly within the master thesis by Jonathan Streit [Str06]: Jonathan Streit, Development of a programming language like tem...
2016-08-01
Using Categorical and Object-Based Methods, by John W Raby and Huaqing Cai, Computational and Information Sciences Directorate, ARL. Approved for public release; distribution unlimited.
Open and Crowd-Sourced Data for Treaty Verification
2014-10-01
... the case of user-generated Internet content, such as Wikipedia or Amazon reviews. Another example is the Zooniverse citizen science project, which ... A large number of potential observations can in many instances make up for relatively crude measurements made by the public. Users are motivated to ...
NASA Astrophysics Data System (ADS)
Xie, Qi; Hu, Bin; Chen, Ke-Fei; Liu, Wen-Hao; Tan, Xiao
2015-11-01
In a three-party password-authenticated key exchange (AKE) protocol, two users use their passwords to establish a secure session key over an insecure communication channel with the help of a trusted server; such a protocol may therefore suffer from password-guessing attacks, and the server has to maintain a password table. To eliminate these shortcomings of password-based AKE protocols, Lee et al. [2015 Nonlinear Dyn. 79 2485] very recently proposed, based on chaotic maps, the first three-party authenticated key exchange scheme that does not use passwords, and claimed its security by providing a well-organized BAN logic test. Unfortunately, their protocol cannot resist impersonation attack, which is demonstrated in the present paper. To overcome this security weakness, we propose, using chaotic maps, a biometrics-based anonymous three-party AKE protocol with the same advantages. Further, we use the pi-calculus-based formal verification tool ProVerif to show that our AKE protocol achieves authentication, security and anonymity, and an acceptable efficiency. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LZ12F02005), the Major State Basic Research Development Program of China (Grant No. 2013CB834205), and the National Natural Science Foundation of China (Grant No. 61070153).
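The chaotic-map primitive behind such protocols is the semigroup property of Chebyshev polynomials, T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)), which supports a Diffie-Hellman-style exchange. A toy illustration follows, using the floating-point cosine form; real protocols use extended Chebyshev maps over finite fields for security, and the keys and public parameter below are invented:

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) for x in [-1, 1], via T_n(x) = cos(n*arccos(x))."""
    return math.cos(n * math.acos(x))

x = 0.3                       # public parameter
r, s = 17, 29                 # private keys of the two parties
A = chebyshev(r, x)           # party A publishes T_r(x)
B = chebyshev(s, x)           # party B publishes T_s(x)

# Semigroup property: both sides derive the same shared value T_{r*s}(x).
k_a = chebyshev(r, B)
k_b = chebyshev(s, A)
assert abs(k_a - k_b) < 1e-9
assert abs(k_a - chebyshev(r * s, x)) < 1e-9
```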
NASA Technical Reports Server (NTRS)
Yew, Calinda; Whitehouse, Paul; Lui, Yan; Banks, Kimberly
2016-01-01
JWST Integrated Science Instruments Module (ISIM) has completed its system-level testing program at the NASA Goddard Space Flight Center (GSFC). In March 2016, ISIM was successfully delivered for integration with the Optical Telescope Element (OTE) after the successful verification of the system through a series of three cryo-vacuum (CV) tests. The first test served as a risk reduction test; the second test provided the initial verification of the fully-integrated flight instruments; and the third test verified the system in its final flight configuration. The complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. As JWST progressed through its CV testing campaign, deficiencies in the test configuration and support equipment were uncovered from one test to the next. Subsequent upgrades and modifications were implemented to improve the facility support capabilities required to achieve test requirements. This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) compares the overall facility performance and instrumentation results from the three ISIM CV tests, and (3) summarizes lessons learned from the ISIM testing campaign.
Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data
NASA Astrophysics Data System (ADS)
Clampitt, J.; Sánchez, C.; Kwan, J.; Krause, E.; MacCrann, N.; Park, Y.; Troxel, M. A.; Jain, B.; Rozo, E.; Rykoff, E. S.; Wechsler, R. H.; Blazek, J.; Bonnett, C.; Crocce, M.; Fang, Y.; Gaztanaga, E.; Gruen, D.; Jarvis, M.; Miquel, R.; Prat, J.; Ross, A. J.; Sheldon, E.; Zuntz, J.; Abbott, T. M. C.; Abdalla, F. B.; Armstrong, R.; Becker, M. R.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.
2017-03-01
We present galaxy-galaxy lensing results from 139 deg² of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise ratio of 29 over scales 0.09 < R < 15 Mpc h⁻¹, including all lenses over a wide redshift range 0.2 < z < 0.8. Dividing the lenses into three redshift bins for this constant moving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, NGMIX and IM3SHAPE. We perform a number of null tests on the shear and photometric redshift catalogues and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The result and systematic checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a halo occupation distribution (HOD) model, and demonstrate that our data constrain the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
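The quantity stacked in such a measurement is the mean tangential shear of sources around lenses. A stripped-down sketch of the estimator for a single lens is below; it is a flat-sky simplification with unit weights and no responsivity correction or random-point subtraction, all of which the actual DES pipeline includes:

```python
import numpy as np

def tangential_shear(lens_ra, lens_dec, src_ra, src_dec, g1, g2):
    """Mean tangential shear of sources about one lens (flat-sky sketch).

    src_ra, src_dec, g1, g2 are arrays over background sources;
    positions are in degrees, shears are dimensionless.
    """
    dx = np.radians(src_ra - lens_ra) * np.cos(np.radians(lens_dec))
    dy = np.radians(src_dec - lens_dec)
    phi = np.arctan2(dy, dx)            # position angle of each source
    g_t = -(g1 * np.cos(2 * phi) + g2 * np.sin(2 * phi))
    return g_t.mean()
```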
redMaGiC: selecting luminous red galaxies from the DES Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozo, E.
We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. Additionally, we demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10⁻³ (h⁻¹ Mpc)⁻³, and a median photo-z bias (z_spec − z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.
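The quoted quality numbers are standard photo-z summary statistics. A small sketch of how such metrics might be computed is below; the paper's exact estimator definitions may differ, and the MAD-based scatter used here is just one common robust choice (the test arrays are synthetic):

```python
import numpy as np

def photoz_metrics(z_spec, z_photo):
    """Median bias, robust scatter sigma_z/(1+z), and 5-sigma outlier fraction."""
    dz = (z_photo - z_spec) / (1.0 + z_spec)
    bias = np.median(z_spec - z_photo)
    sigma = 1.4826 * np.median(np.abs(dz - np.median(dz)))  # MAD-based scatter
    outlier_frac = np.mean(np.abs(dz) > 5.0 * sigma)
    return bias, sigma, outlier_frac

# Synthetic catalog roughly matching the quoted scatter and bias.
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.2, 0.8, 10000)
z_photo = z_spec - 0.005 + 0.017 * (1 + z_spec) * rng.standard_normal(10000)
print(photoz_metrics(z_spec, z_photo))
```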
NASA Astrophysics Data System (ADS)
To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian
2008-09-01
Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies where atomistic properties are calibrated by quantum calculations. We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed.
The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.
Student experience of school science
NASA Astrophysics Data System (ADS)
Shirazi, Shaista
2017-09-01
This paper presents the findings of a two-phase mixed-methods research study that explores the link between post-16 students' experiences of school science and their decisions to take up science for their higher studies. In the first phase, students aged 16-17 (n = 569) reflected on the past five years of their school science experience in a quasi-longitudinal approach to determine a typology of experiences. The second phase entailed data collection through interviews of a sample of these students (n = 55) to help triangulate and extend findings from the first phase. Students taking up science post-16 reported significantly more positive experiences of school science than students who had decided not to take science further. Of the school-related factors influencing experiences of school science, curriculum content was the most important, followed by being interested and motivated in the subject. There is evidence that interest and motivation in science depend on teacher practice and on the perception of science as a difficult subject.
Challenges and Support When Teaching Science Through an Integrated Inquiry and Literacy Approach
NASA Astrophysics Data System (ADS)
Ødegaard, Marianne; Haug, Berit; Mork, Sonja M.; Ove Sørvik, Gard
2014-12-01
In the Budding Science and Literacy project, we explored how working with an integrated inquiry-based science and literacy approach may challenge and support the teaching and learning of science at the classroom level. By studying the inter-relationship between multiple learning modalities and phases of inquiry, we wished to illuminate possible dynamics between science inquiry and literacy in an integrated science approach. Six teachers and their students were recruited from a professional development course for the current classroom study. The teachers were to try out the Budding Science teaching model. This paper presents an overall video analysis of our material demonstrating variations and patterns of inquiry-based science and literacy activities. Our analysis revealed that multiple learning modalities (read it, write it, do it, and talk it) are used in the integrated approach; oral activities dominate. The inquiry phases shifted throughout the students' investigations, but the consolidating phases of discussion and communication were given less space. The data phase of inquiry seems essential as a driving force for engaging in science learning in consolidating situations. The multiple learning modalities were integrated in all inquiry phases, but to a greater extent in preparation and data. Our results indicate that literacy activities embedded in science inquiry provide support for teaching and learning science; however, the greatest challenge for teachers is to find the time and courage to exploit the discussion and communication phases to consolidate the students' conceptual learning.
The purpose of this SOP is to define the steps needed to operate the light pens, and verify the values produced by light pens used in the Arizona NHEXAS project and the "Border" study. Keywords: data; equipment; light pens.
The National Human Exposure Assessment Survey (NHEXAS)...
The Search for Nonflammable Solvent Alternatives for Cleaning Aerospace Oxygen Systems
NASA Technical Reports Server (NTRS)
Mitchell, Mark; Lowrey, Nikki
2012-01-01
Oxygen systems are susceptible to fires caused by particle and nonvolatile residue (NVR) contaminants; therefore, cleaning and verification are essential for system safety. Cleaning solvents used on oxygen system components must either be nonflammable in pure oxygen or their complete removal must be assured for system safety. CFC-113 was the solvent of choice before 1996 because it was effective, least toxic, compatible with most materials of construction, and nonreactive with oxygen. When CFC-113 was phased out in 1996, HCFC-225 was selected as an interim replacement for cleaning propulsion oxygen systems at NASA. The HCFC-225 production phase-out date is 01/01/2015. HCFC-225 (AK-225G) is used extensively at Marshall Space Flight Center and Stennis Space Center for cleaning and NVR verification on large propulsion oxygen systems, and on propulsion test stands and ground support equipment. Many components are too large for ultrasonic agitation, which is necessary for effective aqueous cleaning and NVR sampling. Test stand equipment must be cleaned prior to installation of test hardware. Many items must be cleaned by wipe or flush in situ, where complete removal of a flammable solvent cannot be assured. The search for a replacement solvent for these applications is ongoing.
Telemetry and Science Data Software System
NASA Technical Reports Server (NTRS)
Bates, Lakesha; Hong, Liang
2011-01-01
The Telemetry and Science Data Software System (TSDSS) was designed to validate the operational health of a spacecraft, ease test verification, assist in debugging system anomalies, and provide trending data and advanced science analysis. In doing so, the system parses, processes, and organizes raw data from the Aquarius instrument both on the ground and while in space. In addition, it provides a user-friendly telemetry viewer and an instant push-button test report generator. Existing ground data systems can parse and provide simple data processing, but have limitations in advanced science analysis and instant report generation. The TSDSS functions as an offline data analysis system during the I&T (integration and test) and mission operations phases. After raw data are downloaded from an instrument, TSDSS ingests the data files, parses them, converts telemetry to engineering units, and applies advanced algorithms to produce science level 0, 1, and 2 data products. Meanwhile, it automatically schedules upload of the raw data to a remote server and archives all intermediate and final values in a MySQL database in time order. All data saved in the system can be straightforwardly retrieved, exported, and migrated. Using TSDSS's interactive data visualization tool, a user can conveniently choose any combination and mathematical computation of interesting telemetry points from a large range of time periods (life cycle of mission ground data and mission operations testing) and display a graphical and statistical view of the data. With this graphical user interface (GUI), the queried data and graphs can be exported and saved in multiple formats. This GUI is especially useful in trending data analysis, debugging anomalies, and advanced data analysis. At the request of the user, mission-specific instrument performance assessment reports can be generated with a simple click of a button on the GUI. From instrument level to observatory level, the TSDSS has been operating in support of functional and performance tests and refining system calibration algorithms and coefficients, in sync with the Aquarius/SAC-D spacecraft. At the time of this reporting, it was prepared and set up to perform anomaly investigation for mission operations preceding the Aquarius/SAC-D spacecraft launch on June 10, 2011.
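The raw-to-engineering-unit step mentioned above is typically a per-channel calibration polynomial applied to raw counts. A generic sketch follows; the channel names and coefficients are invented for illustration, not Aquarius values:

```python
# Generic sketch of telemetry engineering-unit conversion of the kind a
# system like TSDSS performs; channel names and coefficients are invented.
CALIBRATIONS = {
    # channel: coefficients of c0 + c1*x + c2*x^2 applied to raw counts
    "RAD_TEMP": (-50.0, 0.025, 0.0),
    "BUS_VOLT": (0.0, 0.0012, 0.0),
}

def to_engineering_units(channel, raw_counts):
    """Convert raw telemetry counts to engineering units for one channel."""
    c0, c1, c2 = CALIBRATIONS[channel]
    return c0 + c1 * raw_counts + c2 * raw_counts**2

print(to_engineering_units("RAD_TEMP", 3000))   # -> 25.0 (deg C, notional)
```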
Science verification of operational aerosol and cloud products for TROPOMI on Sentinel-5 precursor
NASA Astrophysics Data System (ADS)
Lelli, Luca; Gimeno-Garcia, Sebastian; Sanders, Abram; Sneep, Maarten; Rozanov, Vladimir V.; Kokhanvosky, Alexander A.; Loyola, Diego; Burrows, John P.
2016-04-01
With the approaching launch of the Sentinel-5 precursor (S-5P) satellite, scheduled for mid-2016, one preparatory task of the L2 working group (composed of the Institute of Environmental Physics (IUP) Bremen, the Royal Netherlands Meteorological Institute (KNMI) De Bilt, and the German Aerospace Center (DLR) Oberpfaffenhofen) has been the assessment of biases among the aerosol and cloud products that will be inferred by the respective algorithms from measurements of the platform's payload, the TROPOspheric Monitoring Instrument (TROPOMI). The instrument will measure terrestrial radiance at varying moderate spectral resolutions from the ultraviolet through the shortwave infrared. Specifically, all the operational and verification algorithms involved in this comparison exploit the sensitivity of molecular oxygen absorption (the A-band, 755-775 nm, with a resolution of 0.54 nm) to changes in the optical and geometrical parameters of tropospheric scattering layers. The targeted properties are therefore aerosol layer height (ALH) and aerosol optical thickness (AOT), cloud top height (CTH), cloud optical thickness (COT), and cloud albedo (CA). First, the verification of these properties was accomplished by synchronising the respective forward radiative transfer models for a variety of atmospheric scenarios. Then, biases against independent techniques were evaluated with real measurements from selected GOME-2 orbits. A global seasonal bias assessment was carried out for CTH, CA, and COT, whereas the verification of ALH and AOT is based on analysis of the ash plume emitted by the Icelandic volcanic eruption of Eyjafjallajökull in May 2010 and selected dust scenes off the Saharan west coast sensed by SCIAMACHY in 2009.
Development and Assessment of CTF for Pin-resolved BWR Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Wysocki, Aaron J; Collins, Benjamin S
2017-01-01
CTF is the modernized and improved version of the subchannel code COBRA-TF. It has been adopted by the Consortium for Advanced Simulation of Light Water Reactors (CASL) for subchannel analysis applications and thermal-hydraulic feedback calculations in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS). CTF is now jointly developed by Oak Ridge National Laboratory and North Carolina State University. Until now, CTF has been used for pressurized water reactor modeling and simulation in CASL, but it is being extended to boiling water reactor designs. This has required development activities to integrate the code into the VERA-CS workflow and to make it more efficient for full-core, pin-resolved simulations. Additionally, there is a significant emphasis in CASL on producing high-quality tools that follow a regimented software quality assurance plan. Part of this plan involves performing validation and verification assessments on the code that are easily repeatable and tied to specific code versions. This work has resulted in the CTF validation and verification matrix being expanded to include several two-phase flow experiments, including the General Electric 3x3 facility and the BWR Full-Size Fine Mesh Bundle Tests (BFBT). Agreement with both experimental databases is reasonable, but the BFBT analysis reveals a tendency of CTF to overpredict void, especially in the slug flow regime. The execution of these tests is fully automated, the analysis is documented in the CTF Validation and Verification manual, and the tests have become part of the CASL continuous regression testing system. This paper summarizes these recent developments and some of the two-phase assessments that have been performed on CTF.
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Stephen B.
2010-01-01
Software plays an increasingly large role in all aspects of NASA's science missions. This extends to the identification, management, and control of faults that affect safety-critical functions and, by extension, the overall success of the mission. Traditionally, the analysis of fault identification, management, and control has been hardware based. With the increasing complexity of systems, there has been a corresponding increase in the complexity of fault management software. The NASA Independent Verification & Validation (IV&V) program is creating processes and procedures to identify and incorporate safety-critical software requirements, along with corresponding software faults, so that potential hazards may be mitigated. This paper, "Specific to Generic ... A Case for Reuse", describes the phases of a dependability and safety study that identifies a new process to create a foundation for reusable assets. These assets support the identification and management of specific software faults and their transformation from specific to generic software faults. This approach also has applications to systems outside of the NASA environment. This paper addresses how a mission-specific dependability and safety case is being transformed to a generic dependability and safety case that can be reused for any type of space mission, with an emphasis on software fault conditions.
Monitoring/Verification Using DMS: TATP Example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Kyle; Stephan Weeks
Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a "smart dust" sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements.
Wang, Xiaogang; Chen, Wen; Chen, Xudong
2015-03-09
In this paper, we develop a new optical information authentication system based on compressed double-random-phase-encoded images and quick-response (QR) codes, where the parameters of the optical lightwave are used as keys for optical decryption and the QR code is a key for verification. An input image with an attached QR code is first optically encoded in a simplified double random phase encoding (DRPE) scheme without an interferometric setup. From the single encoded intensity pattern recorded by a CCD camera, a compressed double-random-phase-encoded image, i.e., the sparse phase distribution used for optical decryption, is generated by using an iterative phase retrieval technique with the QR code. We compare this technique to two other methods proposed in the literature, i.e., Fresnel-domain information authentication based on classical DRPE with holographic techniques, and information authentication based on DRPE and a phase retrieval algorithm. Simulation results show that QR codes are effective in improving the security and data sparsity of the optical information encryption and authentication system.
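For readers unfamiliar with DRPE, the following minimal numpy sketch shows the classical 4f double random phase encoding step on which the scheme builds; the paper's simplified lensless variant, compression, and QR-code-assisted phase retrieval are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encode(img: np.ndarray) -> np.ndarray:
    """Classical 4f double random phase encoding: a random phase mask in the
    input plane and a second one in the Fourier plane; the CCD records only
    the output intensity."""
    m1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane mask (key 1)
    m2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane mask (key 2)
    field = np.fft.ifft2(np.fft.fft2(img * m1) * m2)
    return np.abs(field) ** 2

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0       # toy binary input image
encoded = drpe_encode(img)    # noise-like intensity pattern
print(encoded.shape, float(encoded.mean()))
```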
Phosphoric and electric utility fuel cell technology development
NASA Astrophysics Data System (ADS)
Breault, R. D.; Briggs, T. A.; Congdon, J. V.; Demarche, T. E.; Gelting, R. L.; Goller, G. J.; Luoma, W. I.; McCloskey, M. W.; Mientek, A. P.; Obrien, J. J.
1984-01-01
A program was initiated to advance electric utility cell stack technology and reduce cell stack cost. The cell stack has a nominal 10 ft(2) active area and operates at 120 psia / 405 F. The program comprises six parallel phases, culminating in a full-height, 10-ft(2) stack verification test: Phase 1 provides the information and services needed to manage the effort, including definition of the prototype commercial power plant; Phase 2 develops the technical base for long-term improvements to the cell stack; Phase 3 develops materials and processing techniques for cell stack components incorporating the best available technology; Phase 4 provides the design of hardware and conceptual processing layouts, and updates the power plant definition of Phase 1 to reflect the results of Phases 2 and 3; Phase 5 manufactures the hardware to verify the achievements of Phases 2 and 3, and analyzes the cost of this hardware; and Phase 6 tests the cell stacks assembled from the hardware of Phase 5 to assess the state of development.
Luis Martínez Fuentes, Jose; Moreno, Ignacio
2018-03-05
A new technique for encoding the amplitude and phase of diffracted fields in digital holography is proposed. It is based on random spatial multiplexing of two phase-only diffractive patterns. The first is the phase information of the intended pattern, while the second is a diverging optical element whose purpose is the control of the amplitude. A random number determines the choice between these two diffractive patterns at each pixel, and the amplitude information of the desired field governs its discrimination threshold. The proposed technique is computationally fast, does not require iterative methods, and the complex field reconstruction appears on axis. We experimentally demonstrate this new encoding technique with holograms implemented on a flicker-free phase-only spatial light modulator (SLM), which allows the axial generation of such holograms. The experimental verification includes the phase measurement of generated patterns with a phase-shifting polarization interferometer implemented in the same experimental setup.
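A minimal sketch of the encoding rule as the abstract describes it (random pixel-wise choice between the target phase and a diverging-lens phase, with the normalized amplitude as the selection threshold); the wavelength, pixel pitch, and focal length are assumed example values.

```python
import numpy as np

def encode_complex_field(amplitude, phase, wavelength=633e-9, f=-0.5, pitch=8e-6):
    """Random spatial multiplexing on a phase-only SLM: each pixel displays
    either the target phase or a diverging-lens phase; the normalized
    amplitude sets the probability of choosing the target phase."""
    ny, nx = amplitude.shape
    y, x = np.mgrid[0:ny, 0:nx]
    r2 = ((x - nx / 2) ** 2 + (y - ny / 2) ** 2) * pitch ** 2
    lens = np.mod(-np.pi * r2 / (wavelength * f), 2 * np.pi)  # diverging lens, f < 0
    prob = amplitude / amplitude.max()      # amplitude -> selection threshold
    pick = np.random.rand(ny, nx) < prob    # random pixel-wise choice
    return np.where(pick, phase, lens)

amp = np.random.rand(256, 256)              # toy target amplitude
phi = 2 * np.pi * np.random.rand(256, 256)  # toy target phase
hologram = encode_complex_field(amp, phi)   # phase-only pattern for the SLM
```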
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1994-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
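A toy calculation of the TOC-to-NVR correlation the abstract describes, with an invented sensitivity factor for illustration only:

```python
# Hypothetical sensitivity factor in ppm carbon per (mg/sq ft) of contaminant;
# the real factors are measured per contaminant, as described in the paper.
SENSITIVITY = {"hydraulic_oil": 2.5}

def nvr_level(toc_ppm_carbon: float, contaminant: str) -> float:
    """Infer the NVR surface loading (mg/sq ft) from a TOC reading (ppm C)."""
    return toc_ppm_carbon / SENSITIVITY[contaminant]

print(nvr_level(1.2, "hydraulic_oil"))  # -> 0.48 mg/sq ft
```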
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1995-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m(exp 2).
Modeling and Analysis of Asynchronous Systems Using SAL and Hybrid SAL
NASA Technical Reports Server (NTRS)
Tiwari, Ashish; Dutertre, Bruno
2013-01-01
We present formal models and results of formal analysis of two different asynchronous systems. We first examine a mid-value select module that merges the signals coming from three different sensors that are each asynchronously sampling the same input signal. We then consider the phase locking protocol proposed by Daly, Hopkins, and McKenna. This protocol is designed to keep a set of non-faulty (asynchronous) clocks phase locked even in the presence of Byzantine-faulty clocks on the network. All models and verifications have been developed using the SAL model checking tools and the Hybrid SAL abstractor.
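The mid-value select logic itself can be stated in one line; a trivial sketch follows (the paper's contribution is the formal SAL model and its verification, not this code):

```python
def mid_value_select(a: float, b: float, c: float) -> float:
    """Mid-value selection: return the median of three redundant sensor
    samples, masking a single faulty or stale value."""
    return sorted((a, b, c))[1]

print(mid_value_select(1.02, 0.98, 37.5))  # -> 1.02 (outlier masked)
```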
Sigma Metrics Across the Total Testing Process.
Charuruks, Navapun
2017-03-01
Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown in our data. New tools and techniques for integration are needed.
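The two quantities named above have standard laboratory-medicine forms; a small sketch with illustrative numbers (not drawn from the paper's data):

```python
def defects_per_million(defects: int, opportunities: int) -> float:
    """DPM for extra-analytical steps (e.g. mislabeled or hemolyzed specimens)."""
    return 1e6 * defects / opportunities

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Analytical-phase sigma metric: (TEa - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

print(defects_per_million(12, 48_000))                       # -> 250.0 DPM
print(sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=2.0))  # -> 4.25 sigma
```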
Three-dimensional surface contouring of macroscopic objects by means of phase-difference images.
Velásquez Prieto, Daniel; Garcia-Sucerquia, Jorge
2006-09-01
We report a technique to determine the 3D contour of objects with dimensions at least 4 orders of magnitude larger than the illumination optical wavelength. Our proposal is based on the numerical reconstruction of the optical wave field of digitally recorded holograms. The modulo-2pi phase map required in any contouring process is obtained by direct subtraction of two phase-contrast images acquired under different illumination angles, creating a phase-difference image of a still object. Obtaining the phase-difference images is only possible by using the capability of numerical reconstruction of the complex optical field provided by digital holography. This unique characteristic leads to a robust, reliable, and fast procedure that requires only two images. A theoretical analysis of the contouring system is shown, with verification by means of numerical and experimental results.
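A minimal numpy sketch of the phase-difference step: subtracting the phases of two reconstructed complex fields, with the result automatically wrapped modulo 2pi. The toy fields below stand in for actual hologram reconstructions.

```python
import numpy as np

def phase_difference(field1: np.ndarray, field2: np.ndarray) -> np.ndarray:
    """Wrapped (modulo-2pi) phase difference of two reconstructed complex
    fields; the result lies in (-pi, pi] and encodes the height contours."""
    return np.angle(field1 * np.conj(field2))

# Toy stand-ins for two wave fields reconstructed under different
# illumination angles: tilted plane waves with different carrier frequencies.
y, x = np.mgrid[0:256, 0:256]
f1 = np.exp(1j * 0.05 * x)
f2 = np.exp(1j * 0.02 * x)
contour_map = phase_difference(f1, f2)  # wrapped fringe pattern
```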
Quantum adiabatic machine learning
NASA Astrophysics Data System (ADS)
Pudenz, Kristen L.; Lidar, Daniel A.
2013-05-01
We develop an approach to machine learning and anomaly detection via quantum adiabatic evolution. This approach consists of two quantum phases, with some amount of classical preprocessing to set up the quantum problems. In the training phase we identify an optimal set of weak classifiers, to form a single strong classifier. In the testing phase we adiabatically evolve one or more strong classifiers on a superposition of inputs in order to find certain anomalous elements in the classification space. Both the training and testing phases are executed via quantum adiabatic evolution. All quantum processing is strictly limited to two-qubit interactions so as to ensure physical feasibility. We apply and illustrate this approach in detail to the problem of software verification and validation, with a specific example of the learning phase applied to a problem of interest in flight control systems. Beyond this example, the algorithm can be used to attack a broad class of anomaly detection problems.
In-line phase contrast micro-CT reconstruction for biomedical specimens.
Fu, Jian; Tan, Renbo
2014-01-01
X-ray phase contrast micro computed tomography (micro-CT) can non-destructively provide internal structure information for soft tissues and low-atomic-number materials. It has become an invaluable analysis tool for biomedical specimens. Here an in-line phase contrast micro-CT reconstruction technique is reported, which consists of a projection extraction method and the conventional filtered back-projection (FBP) reconstruction algorithm. The projection extraction is implemented by applying the Fourier transform to the forward projections of in-line phase contrast micro-CT. This work comprises a numerical study of the method and its experimental verification using a biomedical specimen dataset measured at an X-ray tube source micro-CT setup. The numerical and experimental results demonstrate that the presented technique can improve the imaging contrast of biomedical specimens. It will be of interest for a wide range of in-line phase contrast micro-CT applications in medicine and biology.
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but very challenging. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system, with three nested grids (at 15-, 2.5- and 1-km horizontal grid spacing), were selected as background gridded data for generating the integrated nowcasts. The seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling, and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models at 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected variables, regardless of location. This conclusion is based on multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
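For reference, the multi-categorical Heidke skill score used in this kind of verification can be computed from a K x K contingency table; a sketch with toy counts follows.

```python
import numpy as np

def heidke_skill_score(table: np.ndarray) -> float:
    """Multi-category Heidke skill score from a K x K contingency table,
    table[i, j] = count of cases forecast in category i, observed in j."""
    n = table.sum()
    pc = np.trace(table) / n                                    # proportion correct
    pe = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2   # chance agreement
    return (pc - pe) / (1 - pe)

# Toy 3-category example (e.g. wind speed classes)
t = np.array([[30, 5, 2],
              [6, 25, 4],
              [1, 3, 24]])
print(heidke_skill_score(t))  # ~0.68 for these counts
```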
Formalization of the Integral Calculus in the PVS Theorem Prover
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
2004-01-01
The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Though the PVS prover is fully equipped to support deduction in a very general logical framework, namely higher-order logic, it must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
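The culminating theorem, stated in the standard epsilon-delta-tradition form (this is the textbook statement, not a quotation of the PVS theory itself):

```latex
% If f is continuous on [a,b] and F(x) = \int_a^x f(t)\,dt,
% then F is differentiable on (a,b) with F' = f; consequently,
% for any antiderivative G of f:
\int_a^b f(t)\,dt \;=\; G(b) - G(a)
```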
NASA Technical Reports Server (NTRS)
Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.
2017-01-01
As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.
Shuttle avionics software development trials: Tribulations and successes, the backup flight system
NASA Technical Reports Server (NTRS)
Chevers, E. S.
1985-01-01
The development and verification of the Backup Flight System (BFS) software is discussed. The approach taken for the BFS was to develop a very simple and straightforward software program and then test it in every conceivable manner. The result was a program of approximately 12,000 full words, including ground checkout and the built-in test program for the computer. To perform verification, a series of tests was defined using actual flight-type hardware and simulated flight conditions. Simulated flights were then flown and detailed performance analysis was conducted. The intent of most BFS tests was to demonstrate that a stable flightpath could be obtained after engagement from an anomalous initial condition. The extension of the BFS to meet the requirements of the orbital flight test phase is also described.
Cassini's Test Methodology for Flight Software Verification and Operations
NASA Technical Reports Server (NTRS)
Wang, Eric; Brown, Jay
2007-01-01
The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified, and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).
Fiber Lasers and Amplifiers for Space-based Science and Exploration
NASA Technical Reports Server (NTRS)
Yu, Anthony W.; Krainak, Michael A.; Stephen, Mark A.; Chen, Jeffrey R.; Coyle, Barry; Numata, Kenji; Camp, Jordan; Abshire, James B.; Allan, Graham R.; Li, Steven X.;
2012-01-01
We present current and near-term uses of high-power fiber lasers and amplifiers for NASA science and spacecraft applications. Fiber lasers and amplifiers offer numerous advantages for the deployment of instruments on exploration and science remote sensing satellites. Ground-based and airborne systems provide an evolutionary path to space and a means for calibration and verification of space-borne systems. NASA fiber-laser-based instruments include laser sounders and lidars for measuring atmospheric carbon dioxide, oxygen, water vapor and methane and a pulsed or pseudo-noise (PN) code laser ranging system in the near infrared (NIR) wavelength band. The associated fiber transmitters include high-power erbium, ytterbium, and neodymium systems and a fiber laser pumped optical parametric oscillator. We discuss recent experimental progress on these systems and instrument prototypes for ongoing development efforts.
An Overview of the Jupiter Europa Orbiter Concept's Europa Science Phase Orbit Design
NASA Technical Reports Server (NTRS)
Lock, Robert E.; Ludwinski, Jan M.; Petropoulos, Anastassios E.; Clark, Karla B.; Pappalardo, Robert T.
2009-01-01
Jupiter Europa Orbiter (JEO), the proposed NASA element of the proposed joint NASA-ESA Europa Jupiter System Mission (EJSM), could launch in February 2020 and conceivably arrive at Jupiter in December of 2025. The concept is to perform a multi-year study of Europa and the Jupiter system, including 30 months of Jupiter system science and a comprehensive 9-month Europa orbit phase. This paper provides an overview of the JEO concept and describes the Europa Science phase orbit design and the related science priorities, model payload, and operations scenarios needed to conduct the Europa Science phase. This overview is for planning and discussion purposes only.
NASA Astrophysics Data System (ADS)
Smart, Julie Brockman
2009-11-01
This study examined interactions between middle school science students' perceptions of teacher-student interactions and their motivation for learning science. Specifically, in order to better understand factors affecting middle school students' motivation for science, this study investigated the interactions between middle school students' perceptions of teacher interpersonal behavior in their science classroom and their efficacy, task value, mastery orientations, and goal orientation for learning science. This mixed methods study followed a sequential explanatory model (Cresswell & Plano-Clark, 2007). Quantitative and qualitative data were collected in two phases, with quantitative data in the first phase informing the selection of participants for the qualitative phase that followed. The qualitative phase also helped to clarify and explain results from the quantitative phase. Data mixing occurred between Phase One and Phase Two (participant selection) and at the interpretation level (explanatory) after quantitative and qualitative data were analyzed separately. Results from Phase One indicated that students' perceptions of teacher interpersonal behaviors were predictive of their efficacy for learning science, task value for learning science, mastery orientation, and performance orientation. These results were used to create motivation/perception composites, which were used in order to select students for the qualitative interviews. A total of 24 students with high motivation/high perceptions, low motivation/low perceptions, high motivation/low perceptions, and low motivation/high perceptions were selected in order to represent students whose profiles either supported or refuted the quantitative results. Results from Phase Two revealed themes relating to students' construction of their perceptions of teacher interpersonal behavior and dimensions of their efficacy and task value for science. Students who reported high motivation and high perceptions of teacher-student interactions during the quantitative phase described the most instances of teacher cooperative behaviors, such as teacher helpfulness and understanding. Conversely, students reporting low motivation and low perceptions of teacher-student interactions described the most instances of teacher oppositional behavior, such as harsh and impatient behaviors. An in-depth description of categories and subcategories is also provided. This study concludes with an interpretive analysis of quantitative and qualitative results considered both separately and together. Implications for middle grades science education are discussed, including recommendations for behavior management, scaffolding students' transition to middle school, making explicit connections to science careers, and providing opportunities for small successes within the science classroom. Implications for science teacher education, limitations of the study, and future research directions are also discussed.
Application of Regional Arrays in Seismic Verification Research
1990-08-31
2014-09-30
Whales to Acoustic Stimuli, Oceanographic Features, and Prey Availability. Ari S. Friedlaender & Brandon L. Southall (Southall Environmental).
Harnessing Implementation Science to Increase the Impact of Health Equity Research.
Chinman, Matthew; Woodward, Eva N; Curran, Geoffrey M; Hausmann, Leslie R M
2017-09-01
Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows 3 steps: detecting (phase 1), understanding (phase 2), and reducing (phase 3) disparities. Although disparities have narrowed over time, many remain. We argue that implementation science could enhance disparities research by broadening the scope of phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in phase 3 studies. We briefly review the focus of phase 2 and phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in phase 2 studies and, in turn, develop broader disparities-reducing implementation strategies in phase 3 studies. Many phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real-world practice. Disparities can be considered a "special case" of implementation challenges, when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own.
2002-12-18
KENNEDY SPACE CENTER, FLA. -- A Pegasus XL Expendable Launch Vehicle is prepared for towing to the Multi-Purpose Payload Facility (MPPF) where it will undergo testing, verification, and three flight simulations prior to its scheduled launch. The vehicle, nestled beneath an Orbital Sciences L-1011 aircraft, arrived at the Cape Canaveral Air Force Station Skid Strip on Dec. 17. It is commissioned to carry NASA's Solar Radiation and Climate Experiment (SORCE) spacecraft into orbit in late January 2003. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere with instruments built by the University of Colorado's Laboratory for Atmospheric and Space Physics (LASP).
2002-12-18
KENNEDY SPACE CENTER, FLA. -- A Pegasus XL Expendable Launch Vehicle sits atop a transporter following its arrival in the Multi-Purpose Payload Facility (MPPF) where it will undergo testing, verification, and three flight simulations prior to its scheduled launch. The vehicle, nestled beneath an Orbital Sciences L-1011 aircraft, arrived at the Cape Canaveral Air Force Station Skid Strip on Dec. 17. It is commissioned to carry NASA's Solar Radiation and Climate Experiment (SORCE) spacecraft into orbit in late January 2003. Built by Orbital Sciences Space Systems Group, SORCE will study and measure solar irradiance as a source of energy in the Earth's atmosphere with instruments built by the University of Colorado's Laboratory for Atmospheric and Space Physics (LASP).
Li, Yongfeng; Ma, Hua; Wang, Jiafu; Pang, Yongqiang; Zheng, Qiqi; Chen, Hongya; Han, Yajuan; Zhang, Jieqiu; Qu, Shaobo
2017-01-01
A high-efficiency tri-band quasi-continuous phase gradient metamaterial is designed and demonstrated based on spoof surface plasmon polaritons (SSPPs). High-efficiency polarization conversion transmission is first achieved by tailoring the phase difference between the transmissive SSPP and the space wave in orthogonal directions. As an example, a tri-band circular-to-circular (CTC) polarization conversion metamaterial (PCM) was designed using a nonlinearly dispersive phase difference. Using such a PCM unit cell, a tri-band quasi-continuous phase gradient metamaterial (PGM) was then realized by virtue of the Pancharatnam-Berry phase. The distribution of the cross-polarization transmission phase along the x-direction is continuous except for two infinitely small intervals near the phases 0° and 360°, so the phase gradient is defined at any point along the x-direction. The simulated normalized polarization conversion transmission spectra, together with the electric field distributions for circularly and linearly polarized waves, demonstrate the high-efficiency anomalous refraction of the quasi-continuous PGM. Experimental verification for linearly polarized incidence is also provided.
NASA Astrophysics Data System (ADS)
Class, G.; Meyder, R.; Stratmanns, E.
1985-12-01
The large database for validation and development of computer codes for two-phase flow, generated at the COSIMA facility, is reviewed. The aim of COSIMA is to simulate the hydraulic, thermal, and mechanical conditions in the subchannel and the cladding of fuel rods in pressurized water reactors during the blowout phase of a loss-of-coolant accident. In terms of fuel rod behavior, it is found that during blowout under realistic conditions only small strains are reached; extremely high rod internal pressures are necessary for cladding rupture. The behavior of fuel rod simulators and the effect of thermocouples attached to the cladding outer surface are clarified. Calculations performed with the codes RELAP and DRUFAN show satisfactory agreement with the experiments, which can be improved by updating the phase separation models in the codes.
Score-Level Fusion of Phase-Based and Feature-Based Fingerprint Matching Algorithms
NASA Astrophysics Data System (ADS)
Ito, Koichi; Morita, Ayumi; Aoki, Takafumi; Nakajima, Hiroshi; Kobayashi, Koji; Higuchi, Tatsuo
This paper proposes an efficient fingerprint recognition algorithm combining phase-based image matching and feature-based matching. In our previous work, we proposed an efficient fingerprint recognition algorithm using Phase-Only Correlation (POC) and developed commercial fingerprint verification units for access control applications. The use of the Fourier phase information of fingerprint images makes it possible to achieve robust recognition for weakly impressed, low-quality fingerprint images. This paper presents an approach to improving the performance of POC-based fingerprint matching by combining it with feature-based matching, which is introduced in order to improve recognition efficiency for images with nonlinear distortion. Experimental evaluation using two different types of fingerprint image databases demonstrates the efficient recognition performance of combining the POC-based and feature-based algorithms.
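A minimal numpy sketch of phase-only correlation, plus a toy score-level fusion rule; the weighted-sum form and its weight are assumptions for illustration, as the abstract does not specify the actual combination rule.

```python
import numpy as np

def phase_only_correlation(f: np.ndarray, g: np.ndarray) -> np.ndarray:
    """POC surface: inverse FFT of the unit-magnitude cross-power spectrum.
    A sharp peak marks a match; its location gives the shift of g relative to f."""
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    cross = np.conj(F) * G
    return np.real(np.fft.ifft2(cross / (np.abs(cross) + 1e-12)))

def fused_score(poc_peak: float, feature_score: float, w: float = 0.5) -> float:
    """Toy score-level fusion of the two matchers: a simple weighted sum."""
    return w * poc_peak + (1 - w) * feature_score

a = np.random.rand(128, 128)
b = np.roll(a, (3, 7), axis=(0, 1))               # shifted copy as a stand-in match
poc = phase_only_correlation(a, b)
print(np.unravel_index(poc.argmax(), poc.shape))  # peak at the (3, 7) shift
print(fused_score(poc.max(), feature_score=0.8))
```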
NASA Astrophysics Data System (ADS)
Mitryk, Shawn; Mueller, Guido
The Laser Interferometer Space Antenna (LISA) is a space-based modified Michelson interferometer designed to measure gravitational radiation in the frequency range from 30 uHz to 1 Hz. The interferometer measurement system (IMS) uses one-way laser phase measurements to cancel the laser phase noise, reconstruct the proof-mass motion, and extract the gravitational wave (GW) induced laser phase modulations in post-processing, using a technique called time-delay interferometry (TDI). Unfortunately, few hardware verification experiments of the IMS exist. The University of Florida LISA Interferometry Simulator (UFLIS) is designed to perform hardware-in-the-loop simulations of the LISA interferometry system, modeling the characteristics of the LISA mission as accurately as possible. This depends, first, on replicating the laser pre-stabilization by locking the laser phase to an ultra-stable Zerodur cavity length reference using the PDH locking method. Phase measurements of LISA-like photodetector beat-notes are taken using the UF-phasemeter (PM), which can measure the laser beat-note frequency to an accuracy of 0.22 uHz. The inter-spacecraft (SC) laser links, including the time delay due to the 5 Gm light travel time along the LISA arms, the laser Doppler shifts due to differential SC motion, and the GW-induced laser phase modulations, are simulated electronically using the electronic phase delay (EPD) unit. The EPD unit replicates the laser field propagation between SC by measuring a photodetector beat-note frequency with the UF-phasemeter and storing the information in memory. After the requested delay time, the frequency information is added to a Doppler offset and a GW-like frequency modulation, and the signal is then regenerated with the inter-SC laser phase effects applied. Utilizing these components, I will present the first complete TDI simulations performed using the UFLIS. The LISA model is presented alongside the simulation, comparing the generation and measurement of LISA-like signals. Phasemeter measurements are used in post-processing and combined in the linear combinations defined by TDI, thus canceling the laser phase and phase-lock loop noise to extract the applied GW modulation buried under the noise. Nine orders of magnitude of common-mode laser noise cancellation are achieved at a frequency of 1 mHz, and the GW signal is clearly visible after the laser and PLL noise cancellation.
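A toy model of the EPD concept as described above (buffer the measured frequency for the light-travel time, then add a Doppler offset and a GW-like modulation before regenerating the signal); the sampling rate and signal parameters are assumed values for illustration.

```python
from collections import deque
import math

class ElectronicPhaseDelay:
    """Sketch of an EPD-like unit: delay frequency samples by the light-travel
    time, then add a Doppler offset and a small GW-like frequency modulation."""
    def __init__(self, delay_samples: int, doppler_hz: float,
                 gw_amp_hz: float, gw_freq_hz: float, fs_hz: float):
        self.buf = deque([0.0] * delay_samples, maxlen=delay_samples)
        self.doppler = doppler_hz
        self.gw_amp, self.gw_freq, self.fs = gw_amp_hz, gw_freq_hz, fs_hz
        self.n = 0

    def step(self, freq_in_hz: float) -> float:
        delayed = self.buf[0]          # sample from delay_samples ago
        self.buf.append(freq_in_hz)    # store the newest measurement
        gw = self.gw_amp * math.sin(2 * math.pi * self.gw_freq * self.n / self.fs)
        self.n += 1
        return delayed + self.doppler + gw

# ~16.7 s light travel over 5 Gm; at an assumed 10 Hz rate that is 167 samples.
epd = ElectronicPhaseDelay(167, doppler_hz=2e6, gw_amp_hz=1e-3,
                           gw_freq_hz=1e-3, fs_hz=10.0)
out = [epd.step(1e6) for _ in range(400)]
```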
NASA Astrophysics Data System (ADS)
Andrina, G.; Basso, V.; Saitta, L.
2004-08-01
The effort in optimising the AIV process has in recent years been mainly focused on the standardisation of approaches and the application of new methodologies. But the earlier the intervention, the greater the benefits in terms of cost and schedule. Until now, the early phases of the AIV process have relied on standards that need to be tailored through company and personal expertise. A study has therefore been conducted to explore the possibility of developing an expert system that helps in making choices in the early, conceptual phase of Assembly, Integration and Verification, namely the model philosophy and the test definition. The work focused on a hybrid approach, allowing interaction between historical data and human expertise. The expert system that has been prototyped exploits both information elicited from domain experts and the results of a Data Mining activity on the existing databases of completed projects' verification data. The Data Mining algorithms allow the extraction of past experience resident in the ESA/MATD database, which contains information in the form of statistical summaries, costs, and frequencies of on-ground and in-flight failures. The non-trivial associations found can then be used by the experts to manage new decisions in a controlled way (standards-driven) at the beginning of or during the AIV process. Moreover, the expert AIV system could allow compilation of a set of feasible AIV schedules to support further programmatic-driven choices.
STAR-CCM+ Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David
2016-09-30
The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general-purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light Water Reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.
Army Apprenticeship Program (AAP). Evaluation of AAP Operations. Phase 3
1990-05-25
Recommendations to the TRADOC DCST include decreasing the layers of bureaucracy to expedite the verification process and completion approvals, and fostering competition in apprenticeship programs, with the DCST personally giving awards for the top three programs.
ERIC Educational Resources Information Center
Ball, Edward H.
"Reading Basics Plus" consists of an integrated set of texts, workbooks, duplicating masters, word cards, charts, and teacher's guidebooks. By a process of small group trials, students' and teachers' reactions to the proposed content of the "Reading Basics Plus" program for grades four, five, and six were obtained in order to provide editors and…
Traveler Phase 1A Joint Review
NASA Technical Reports Server (NTRS)
St. John, Clint; Scofield, Jan; Skoog, Mark; Flock, Alex; Williams, Ethan; Guirguis, Luke; Loudon, Kevin; Sutherland, Jeffrey; Lehmann, Richard; Garland, Michael;
2017-01-01
The briefing contains the preliminary findings and suggestions for improvement of methods used in the development and evaluation of a multi-monitor runtime assurance architecture for autonomous flight vehicles. Initial system design, implementation, verification, and flight testing have been conducted. As yet, detailed data review is incomplete, and flight testing has been limited to initial monitor force fights. Detailed monitor flight evaluations have yet to be performed.
Flight Test of the F/A-18 Active Aeroelastic Wing Airplane
NASA Technical Reports Server (NTRS)
Voracek, David
2007-01-01
A viewgraph presentation of flight tests performed on the F/A-18 active aeroelastic wing (AAW) airplane is shown. The topics include: 1) F/A-18 AAW Airplane; 2) F/A-18 AAW Control Surfaces; 3) Flight Test Background; 4) Roll Control Effectiveness Regions; 5) AAW Design Test Points; 6) AAW Phase I Test Maneuvers; 7) OBES Pitch Doublets; 8) OBES Roll Doublets; 9) AAW Aileron Flexibility; 10) Phase I - Lessons Learned; 11) Control Law Development and Verification & Validation Testing; 12) AAW Phase II RFCS Envelopes; 13) AAW 1-g Phase II Flight Test; 14) Region I - Subsonic 1-g Rolls; 15) Region I - Subsonic 1-g 360 Roll; 16) Region II - Supersonic 1-g Rolls; 17) Region II - Supersonic 1-g 360 Roll; 18) Region III - Subsonic 1-g Rolls; 19) Roll Axis HOS/LOS Comparison Region II - Supersonic (open-loop); 20) Roll Axis HOS/LOS Comparison Region II - Supersonic (closed-loop); 21) AAW Phase II Elevated-g Flight Test; 22) Region I - Subsonic 4-g RPO; and 23) Phase II - Lessons Learned
NASA Astrophysics Data System (ADS)
de Graauw, T.
2010-01-01
First of all, I would like to wish all of you a happy New Year, which I sincerely hope will bring you success, happiness and interesting new opportunities. For us in ALMA, the end of 2009 and the beginning of 2010 have been very exciting, and this is once more a special moment in the development of our observatory. After transporting our third antenna to the high-altitude Chajnantor plateau, at 5000 meters above sea level, our team successfully combined the outputs of these antennas using "phase closure", a standard method in interferometry. This achievement marks one more milestone along the way to the beginning of Commissioning and Science Verification (CSV), which, once completed, will mark the beginning of Early Science for ALMA. There was an official announcement about this milestone at the AAS meeting in early January, and we also wanted to share this good news with you through this newsletter, which contains the content of the announcement. In another area, this newsletter covers the progress on site and a presentation of the Atacama Compact Array (ACA). This is the second part of a two-part series on antennas, continuing the article in the last newsletter. The ACA plays a crucial part in the imaging of extended sources with ALMA; without the ACA, the ability to produce accurate images would be very restricted. Finally, as you know, we like to show the human face of this great endeavour we are building, and this time we decided to highlight the Department of Technical Services, another fundamental piece working actively to make ALMA the most powerful radio observatory ever built.
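For readers curious about the "phase closure" milestone mentioned above, the identity can be illustrated numerically: summing the measured phases around a triangle of baselines cancels the per-antenna error terms. A small, purely illustrative sketch:

```python
import numpy as np

# Closure phase on a triangle of antennas (1, 2, 3): antenna-based phase
# errors cancel, leaving only the source's intrinsic visibility phases.
rng = np.random.default_rng(1)
true_phi = rng.uniform(-np.pi, np.pi, 3)  # intrinsic phases phi12, phi23, phi31
ant_err = rng.uniform(-np.pi, np.pi, 3)   # per-antenna errors e1, e2, e3

# Measured baseline phase: phi_ij = true_ij + e_i - e_j
m12 = true_phi[0] + ant_err[0] - ant_err[1]
m23 = true_phi[1] + ant_err[1] - ant_err[2]
m31 = true_phi[2] + ant_err[2] - ant_err[0]

closure = np.angle(np.exp(1j * (m12 + m23 + m31)))  # error terms cancel
print(np.isclose(closure, np.angle(np.exp(1j * true_phi.sum()))))  # True
```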
Geometry-constraint-scan imaging for in-line phase contrast micro-CT.
Fu, Jian; Yu, Guangyuan; Fan, Dekai
2014-01-01
X-ray phase contrast computed tomography (CT) uses the phase shift that x-rays undergo when passing through matter, rather than their attenuation, as the imaging signal, and may provide better image quality for soft-tissue and biomedical materials with low atomic number. Here a geometry-constraint-scan imaging technique for in-line phase contrast micro-CT is reported. It consists of two circular-trajectory scans with the x-ray detector at different positions, a phase projection extraction method based on Fresnel free-propagation theory, and the filtered back-projection reconstruction algorithm. This method removes the contact-detector scan and the pure-phase-object assumption of classical in-line phase contrast micro-CT; consequently, it relaxes the experimental conditions and improves the image contrast. This work comprises a numerical study of the technique and its experimental verification using a biomedical composite dataset measured at an x-ray tube source micro-CT setup. The numerical and experimental results demonstrate the validity of the presented method. It will be of interest for a wide range of in-line phase contrast micro-CT applications in biology and medicine.
Harnessing Implementation Science to Increase the Impact of Health Disparity Research
Chinman, Matthew; Woodward, Eva N.; Curran, Geoffrey M.; Hausmann, Leslie R. M.
2017-01-01
Background: Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows three steps: detecting (Phase 1), understanding (Phase 2), and reducing (Phase 3) disparities. While disparities have narrowed over time, many remain. Objectives: We argue that implementation science could enhance disparities research by broadening the scope of Phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in Phase 3 studies. Methods: We briefly review the focus of Phase 2 and Phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Results: Most health disparities research emphasizes patient and provider factors as predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in Phase 2 studies and, in turn, develop broader disparity-reducing implementation strategies in Phase 3 studies. Many Phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs are not able to fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real-world practice. Conclusions: Disparities can be considered a "special case" of implementation challenges, when evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on their own.
Crewed Space Vehicle Battery Safety Requirements
NASA Technical Reports Server (NTRS)
Jeevarajan, Judith A.; Darcy, Eric C.
2014-01-01
This requirements document is applicable to all batteries on crewed spacecraft, including vehicle, payload, and crew equipment batteries. It defines the specific provisions required to design a battery that is safe for ground personnel and crew members to handle and/or operate during all applicable phases of crewed missions, safe for use in the enclosed environment of a crewed space vehicle, and safe for use in launch vehicles, as well as in unpressurized spaces adjacent to the habitable portion of a space vehicle. The required provisions encompass hazard controls, design evaluation, and verification. The extent of the hazard controls and verification required depends on the applicability and credibility of the hazard to the specific battery design and applicable missions under review. Evaluation of the design and verification program results shall be completed prior to certification for flight and ground operations. This requirements document is geared toward the designers of battery systems to be used in crewed vehicles, crew equipment, crew suits, or batteries to be used in crewed vehicle systems and payloads (or experiments). This requirements document also applies to ground handling and testing of flight batteries. Specific design and verification requirements for a battery are dependent upon the battery chemistry, capacity, complexity, charging, environment, and application. The variety of battery chemistries available, combined with the variety of battery-powered applications, results in each battery application having specific, unique requirements pertinent to the specific battery application. However, there are basic requirements for all battery designs and applications, which are listed in section 4. Section 5 includes a description of hazards and controls and also includes requirements.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
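As a rough illustration of the kind of automated test verification described above, the sketch below synchronizes test data to simulation output and flags out-of-tolerance samples; the signals and tolerance are invented and do not represent MAP's actual tools or data.

```python
import numpy as np

def compare_runs(t_test, y_test, t_sim, y_sim, tol):
    """Resample simulation output onto the test timeline and flag samples
    whose difference exceeds a tolerance -- the sort of automated check
    that replaces manual inspection of flight software test results."""
    y_sim_on_test = np.interp(t_test, t_sim, y_sim)  # time synchronization
    err = np.abs(y_test - y_sim_on_test)
    return err, np.flatnonzero(err > tol)

t = np.linspace(0, 10, 501)
truth = np.sin(t)                                 # stand-in HiFi sim output
measured = truth + 0.001 * np.random.randn(t.size)  # stand-in test telemetry
err, bad = compare_runs(t, measured, t, truth, tol=0.01)
print("max error:", err.max(), "violations:", bad.size)
```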
Building the Qualification File of EGNOS with DOORS
NASA Astrophysics Data System (ADS)
Fabre, J.
2008-08-01
EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is approaching final deployment and is being operated initially with a view to qualification and certification, aiming at operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS system design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at the QR is a consolidated, consistent and complete Qualification file. The information included shall give the QR reviewers confidence that the qualification activities performed are complete. An important task for the project team is therefore to present concise and consistent information, as clearly as possible. Traceability to applicable requirements shall be systematically presented. Moreover, to support verification justification, references to details shall be available, and the reviewer shall be able to link automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong support in terms of methodology and tooling, providing the System Engineering and Verification teams with a single technical reference database in which all team members consult the applicable requirements, compliance, justification and design data, and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.
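The core of the approach (every applicable requirement linked to its compliance evidence, with open items surfaced automatically) can be illustrated outside DOORS. Below is a minimal generic sketch assuming a simple record structure; the identifiers are hypothetical and this is not the DOORS/DXL API:

```python
# Conceptual sketch of requirement-to-evidence traceability (generic Python,
# not the DOORS API; record fields and IDs are invented for illustration).
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    evidence: list = field(default_factory=list)  # links to verification records

def open_items(requirements):
    """A requirement stays open until at least one verification record links to it."""
    return [r.req_id for r in requirements if not r.evidence]

reqs = [Requirement("SYS-001", "Accuracy shall meet ...", ["VER-RPT-042"]),
        Requirement("SYS-002", "Integrity alarm shall be raised within 6 s")]
print(open_items(reqs))  # -> ['SYS-002'], i.e. not yet ready for the QR
```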
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention NIOSH Dose Reconstruction Program Ten Year Review--Phase I Report on Quality of Science; Request for Public Review and... Ten Year Review--Phase I Report on Quality of Science.'' This publication is part of a review by NIOSH...
Using the Learning Cycle To Teach Physical Science: A Hands-on Approach for the Middle Grades.
ERIC Educational Resources Information Center
Beisenherz, Paul; Dantonio, Marylou
The Learning Cycle Strategy enables students themselves to construct discrete science concepts and includes an exploration phase, introduction phase, and application phase. This book focuses on the use of the Learning Cycle to teach physical sciences and is divided into three sections. Section I develops a rationale for the Learning Cycle as an…
Monte Carlo simulations to replace film dosimetry in IMRT verification.
Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig
2011-01-01
Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements, and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. The corresponding value was significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for the two verification procedures is comparable; however, the MC-based procedure is far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to replace film dosimetry in the near future; the linac-specific part of QA will then necessarily become more important. In combination with MC simulations, and owing to their simple set-up, point-dose measurements for dosimetric plausibility checks are recommended, at least in the IMRT introduction phase. Copyright © 2010. Published by Elsevier GmbH.
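For readers unfamiliar with the gamma evaluation used above, a minimal one-dimensional version of the 3%/3 mm criterion might look as follows. Clinical tools operate on 2-D and 3-D dose grids; the profiles here are toy data for illustration only.

```python
# Minimal 1-D sketch of the gamma evaluation (3% / 3 mm criteria) comparing
# an evaluated dose profile against a reference profile.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """Gamma index per evaluated point; dd is the fractional dose-difference
    criterion, dta the distance-to-agreement in the units of x (here mm)."""
    d_max = d_ref.max()                       # global normalization
    gammas = np.empty_like(d_eval)
    for i, (x, d) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - x) / dta) ** 2
        dose2 = ((d_ref - d) / (dd * d_max)) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas  # gamma <= 1 means the point passes

x = np.linspace(0.0, 100.0, 201)              # positions in mm
ref = np.exp(-((x - 50.0) / 20.0) ** 2)       # toy reference profile
ev = np.exp(-((x - 51.0) / 20.0) ** 2)        # slightly shifted evaluated profile
print(f"pass rate: {(gamma_1d(x, ref, x, ev) <= 1).mean():.1%}")
```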
1990-02-01
copies P1,...,Pn of a multiple module fp resolve nondeterminism (local or global) in an identical manner. 5. The copies P1,...,Pn are physically... recovery block. A recovery block consists of a conventional block (as in ALGOL or PL/I) which is provided with a means of error detection, called an... improved failures model for communicating processes. In Proceedings, NSF-SERC Seminar on Concurrency, volume 197 of Lecture Notes in Computer Science
CASL Dakota Capabilities Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Simmons, Chris; Williams, Brian J.
2017-10-10
The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.
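As a rough illustration of the forward uncertainty quantification that tools like Dakota automate, the sketch below propagates input uncertainty through a stand-in model by Monte Carlo sampling. It is a generic example, not Dakota's interface; the model and input distributions are invented:

```python
# Toy forward-UQ loop: sample uncertain inputs, run the model, summarize outputs.
import numpy as np

def model(k, q):                    # stand-in for an expensive simulation
    return q / (1.0 + k)

rng = np.random.default_rng(42)
k = rng.normal(2.0, 0.2, 10_000)    # uncertain input, assumed Gaussian
q = rng.uniform(0.9, 1.1, 10_000)   # uncertain input, assumed uniform
out = model(k, q)
print(f"mean = {out.mean():.3f}, 95% interval = "
      f"({np.quantile(out, 0.025):.3f}, {np.quantile(out, 0.975):.3f})")
```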
ERIC Educational Resources Information Center
Dijkman, Bea; Reehuis, Lidwien; Roodbol, Petrie
2017-01-01
Universities of applied sciences in Europe face the challenge of preparing students in health and social care for working with older people and contributing to the innovations needed in light of the ageing of society, along with changes in the health and social care systems in many countries. Dealing with the special needs of older people and the…
1988-03-01
Mechanism; Computer Security. ... denial of service. This paper assumes that the reader is a computer science or engineering professional working in the area of formal specification and... recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy
Galaxy-galaxy lensing in the Dark Energy Survey Science Verification data
Clampitt, J.; Sánchez, C.; Kwan, J.; ...
2016-11-22
We present galaxy-galaxy lensing results from 139 square degrees of Dark Energy Survey (DES) Science Verification (SV) data. Our lens sample consists of red galaxies, known as redMaGiC, which are specifically selected to have a low photometric redshift error and outlier rate. The lensing measurement has a total signal-to-noise of 29 over scales $0.09 < R < 15$ Mpc/$h$, including all lenses over a wide redshift range $0.2 < z < 0.8$. Dividing the lenses into three redshift bins for this constant comoving number density sample, we find no evidence for evolution in the halo mass with redshift. We obtain consistent results for the lensing measurement with two independent shear pipelines, ngmix and im3shape. We perform a number of null tests on the shear and photometric redshift catalogs and quantify resulting systematic uncertainties. Covariances from jackknife subsamples of the data are validated with a suite of 50 mock surveys. The results and systematics checks in this work provide a critical input for future cosmological and galaxy evolution studies with the DES data and redMaGiC galaxy samples. We fit a Halo Occupation Distribution (HOD) model, and demonstrate that our data constrains the mean halo mass of the lens galaxies, despite strong degeneracies between individual HOD parameters.
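The jackknife validation mentioned above can be sketched compactly: re-measure the signal with one spatial patch deleted at a time, then build a covariance from the spread of the resamples. A minimal version with fake data standing in for the lensing signal (illustrative, not the DES pipeline):

```python
# Delete-one jackknife covariance estimate for a binned signal.
import numpy as np

def jackknife_covariance(subsample_signals):
    """subsample_signals: (N_jk, N_bins) array, each row the signal measured
    with one spatial patch removed. Returns the jackknife covariance matrix."""
    signals = np.asarray(subsample_signals)
    n = signals.shape[0]
    resid = signals - signals.mean(axis=0)
    # The (n - 1)/n prefactor accounts for the correlation between resamples.
    return (n - 1) / n * resid.T @ resid

rng = np.random.default_rng(0)
fake = rng.normal(1.0, 0.1, size=(50, 10))    # 50 patches, 10 radial bins
cov = jackknife_covariance(fake)
print(np.sqrt(np.diag(cov)))                  # per-bin error bars
```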
Nord, B.; Buckley-Geer, E.; Lin, H.; ...
2016-08-05
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ~ 0.80–3.2 and in i-band surface brightness $i_{SB}$ ~ 23–25 mag arcsec$^{-2}$ (2'' aperture). For each of the six systems, we estimate the Einstein radius $\theta_E$ and the enclosed mass $M_{enc}$, which have ranges $\theta_E$ ~ 5''–9'' and $M_{enc} \sim 8 \times 10^{12}$ to $6 \times 10^{13} M_\odot$, respectively.
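The enclosed masses quoted above follow from the Einstein-radius relation $M_{enc} = \theta_E^2 c^2 D_l D_s / (4 G D_{ls})$. A worked example using astropy, with the lens and source redshifts assumed purely for illustration (the paper's per-system values differ):

```python
# Mass enclosed within an Einstein radius for an assumed lens configuration.
import astropy.units as u
from astropy.constants import c, G
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
zl, zs = 0.4, 2.0                               # assumed lens/source redshifts
theta_e = (7 * u.arcsec).to(u.rad).value        # Einstein radius in radians
d_l = cosmo.angular_diameter_distance(zl)
d_s = cosmo.angular_diameter_distance(zs)
d_ls = cosmo.angular_diameter_distance_z1z2(zl, zs)
m_enc = (theta_e**2 * c**2 * d_l * d_s / (4 * G * d_ls)).to(u.Msun)
print(f"M_enc ≈ {m_enc:.2e}")                   # of order 10^13 Msun, as quoted
```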
Cosmic shear measurements with Dark Energy Survey Science Verification data
Becker, M. R.
2016-07-06
Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real and Fourier-space statistics. Our results pass a suite of null tests including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, galaxy magnitude, etc. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.
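A brute-force estimator for the real-space shear two-point functions $\xi_\pm(\theta)$ referenced above can be written in a few lines. Production pipelines use optimized tree codes; the sketch below, with randomly generated positions and shears, is for intuition only:

```python
# Pair-count estimator for xi_plus / xi_minus from complex shears e1 + i*e2.
import numpy as np

def xi_pm(pos, e, bin_edges):
    """pos: (n, 2) angular positions; e: complex shears; bin_edges: separations."""
    nbins = len(bin_edges) - 1
    xp, xm, npair = np.zeros(nbins), np.zeros(nbins), np.zeros(nbins)
    for i in range(len(pos) - 1):
        d = pos[i + 1:] - pos[i]
        theta = np.hypot(d[:, 0], d[:, 1])
        phi = np.arctan2(d[:, 1], d[:, 0])          # pair position angle
        k = np.digitize(theta, bin_edges) - 1
        ok = (k >= 0) & (k < nbins)
        ej = e[i + 1:][ok]
        # xi+ = Re<e e'*> (frame independent); xi- = Re<e e' exp(-4i*phi)>
        np.add.at(xp, k[ok], (e[i] * np.conj(ej)).real)
        np.add.at(xm, k[ok], (e[i] * ej * np.exp(-4j * phi[ok])).real)
        np.add.at(npair, k[ok], 1.0)
    return xp / npair, xm / npair

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 5.0, size=(300, 2))          # toy field, degrees
e = rng.normal(0, 0.2, 300) + 1j * rng.normal(0, 0.2, 300)
xi_p, xi_m = xi_pm(pos, e, np.geomspace(0.05, 2.0, 8))
```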
Kwan, J.; Sánchez, C.; Clampitt, J.; ...
2016-10-05
We present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as $\Omega_m = 0.31 \pm 0.09$ and the clustering amplitude of the matter power spectrum as $\sigma_8 = 0.74 \pm 0.13$ after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into $S_8 = \sigma_8(\Omega_m/0.3)^{0.16} = 0.74 \pm 0.12$ for our fiducial lens redshift bin at 0.35 < z < 0.5, while $S_8 = 0.78 \pm 0.09$ using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, Baryon Acoustic Oscillations and Type Ia supernova measurements.
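The $S_8$ combination quoted above is a one-line computation; the following check reproduces the fiducial-bin value from the quoted $\sigma_8$ and $\Omega_m$:

```python
# Quick check of S8 = sigma8 * (Omega_m / 0.3)^0.16 using the quoted values.
sigma8, omega_m = 0.74, 0.31
s8 = sigma8 * (omega_m / 0.3) ** 0.16
print(f"S8 = {s8:.2f}")  # ~0.74, matching the fiducial-bin value in the abstract
```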
Bonnett, C.; Troxel, M. A.; Hartley, W.; ...
2016-08-30
Here we present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods (annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz) are analyzed. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z's. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.38 … of approximately 3%. This shift is within the one sigma statistical errors on σ8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σcrit, finding levels of bias safely less than the statistical power of DES SV data. In conclusion, we recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.
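Photo-z validation for weak lensing, as described above, typically reports a bias in the mean of n(z) and a robust scatter of the scaled residuals. A minimal sketch of such metrics on toy data follows; this is one common convention, not necessarily the exact estimators used in the paper:

```python
# Weak-lensing-oriented photo-z metrics: mean-of-n(z) bias and sigma_68 scatter.
import numpy as np

def photoz_metrics(z_spec, z_phot):
    """Bias of the mean of n(z) and 68th-percentile spread of (zp - zs)/(1 + zs)."""
    dz = (z_phot - z_spec) / (1 + z_spec)
    mean_bias = z_phot.mean() - z_spec.mean()   # shift in the mean of n(z)
    sigma_68 = np.percentile(np.abs(dz - np.median(dz)), 68)
    return mean_bias, sigma_68

rng = np.random.default_rng(3)
zs = rng.uniform(0.3, 1.3, 5000)                        # toy spectroscopic sample
zp = zs + rng.normal(0.01, 0.08, 5000) * (1 + zs)       # toy photo-z errors
print(photoz_metrics(zs, zp))
```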
Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform
NASA Astrophysics Data System (ADS)
Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.
2012-12-01
This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data-management, analysis and visualization platform for earth sciences data. Traditionally, web applications have been built to access data directly from a database via a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope by the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs and to send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, whether web-based for human consumption or programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not have thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also makes it easy to work with distributed datasets. Companion mechanisms for querying data snapshots (a.k.a. time travel), for usage tracking and license management, and for verifying the semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle, which can aid in data citation and verification.
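The decoupled "data application" tier is straightforward to prototype: a small service answers RESTful URI queries and returns JSON. A stdlib-only Python sketch is shown below; the dataset, route, and query parameter are invented for illustration and are not Earth-Base's actual API:

```python
# Minimal "data application" tier: RESTful URI queries answered with JSON.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

SAMPLES = [{"site": "A1", "depth_m": 10, "age_ma": 2.5},
           {"site": "B7", "depth_m": 42, "age_ma": 11.0}]

class DataApp(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path != "/api/v1/samples":
            self.send_error(404)
            return
        q = parse_qs(url.query)                 # e.g. /api/v1/samples?site=A1
        rows = [s for s in SAMPLES
                if s["site"] in q.get("site", [s["site"]])]
        body = json.dumps(rows).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), DataApp).serve_forever()
```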
Development of the Simbol-X science verification model and its contribution for the IXO Mission
NASA Astrophysics Data System (ADS)
Maier, Daniel; Aschauer, Florian; Dick, Jürgen; Distratis, Giuseppe; Gebhardt, Henry; Herrmann, Sven; Kendziorra, Eckhard; Lauf, Thomas; Lechner, Peter; Santangelo, Andrea; Schanz, Thomas; Strüder, Lothar; Tenzer, Chris; Treis, Johannes
2010-07-01
Like the International X-ray Observatory (IXO) mission, the Simbol-X mission is a projected X-ray space telescope with spectral and imaging capabilities covering the energy range from 500 eV up to 80 keV. To detect photons within this wide range of energies, a silicon-based "Depleted P-channel Field Effect Transistor" (DePFET) matrix is used as the Low Energy Detector (LED) on top of an array of CdTe-Caliste modules, which act as the High Energy Detector (HED). A Science Verification Model (SVM), consisting of one LED quadrant in front of one Caliste module, will be set up at our institute (IAAT) and operated under laboratory conditions that approximate the expected environment in space. As a first step we use the SVM to test and optimize the performance of the LED operation and data acquisition chain, consisting of an ADC, an event-preprocessor, a sequencer, and an interface controller. All these components have been developed at our institute with the objective of handling the high readout rate of approximately 8000 frames per second. The second step is to study the behaviour and the interactions of the LED and HED operating as a combined detector system. We report on the development status of the SVM and its associated electronics and present first results of the currently achieved spectral performance.
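The quoted readout rate of roughly 8000 frames per second implies a demanding raw data rate. A back-of-envelope check follows, with the pixel count and ADC word size assumed purely for illustration (the actual detector format may differ):

```python
# Back-of-envelope raw data rate for a ~8000 frames/s detector readout.
frames_per_s = 8000
pixels_per_frame = 64 * 64        # assumed quadrant format, for illustration
bits_per_sample = 16              # assumed ADC word size
rate_mbit_s = frames_per_s * pixels_per_frame * bits_per_sample / 1e6
print(f"raw readout rate ≈ {rate_mbit_s:.0f} Mbit/s")  # ~524 Mbit/s
```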
redMaGiC: Selecting luminous red galaxies from the DES Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozo, E.; Rykoff, E. S.; Abate, A.
Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of $10^{-3}\,(h^{-1}\,{\rm Mpc})^{-3}$, and a median photo-z bias ($z_{spec} - z_{photo}$) and scatter ($\sigma_z/(1 + z)$) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.
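The quality numbers quoted above (median bias, scatter $\sigma_z/(1+z)$, and 5σ outlier fraction) can be computed along the following lines; this is one reasonable convention, not necessarily the paper's exact estimator:

```python
# Photo-z quality metrics: median bias, robust scatter, 5-sigma outlier fraction.
import numpy as np

def photoz_quality(z_spec, z_phot):
    bias = np.median(z_spec - z_phot)                        # median photo-z bias
    dz = (z_phot - z_spec) / (1 + z_spec)
    sigma = 1.4826 * np.median(np.abs(dz - np.median(dz)))   # MAD-based sigma
    outlier_frac = np.mean(np.abs(dz) > 5 * sigma)           # 5-sigma outliers
    return bias, sigma, outlier_frac
```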
Galaxy bias from galaxy-galaxy lensing in the DES Science Verification Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prat, J.; et al.
We present a measurement of galaxy-galaxy lensing around a magnitude-limited ($i_{AB} < 22.5$) sample of galaxies selected from the Dark Energy Survey Science Verification (DES-SV) data. We split these lenses into three photometric-redshift bins from 0.2 to 0.8, and determine the product of the galaxy bias $b$ and cross-correlation coefficient between the galaxy and dark matter overdensity fields $r$ in each bin, using scales above 4 Mpc/$h$ comoving, where we find the linear bias model to be valid given our current uncertainties. We compare our galaxy bias results from galaxy-galaxy lensing with those obtained from galaxy clustering (Crocce et al. 2016) and CMB lensing (Giannantonio et al. 2016) for the same sample of galaxies, and find our measurements to be in good agreement with those in Crocce et al. (2016), while, in the lowest redshift bin ($z \sim 0.3$), they show some tension with the findings in Giannantonio et al. (2016). Our results are found to be rather insensitive to a large range of systematic effects. We measure $b \cdot r$ to be $0.87 \pm 0.11$, $1.12 \pm 0.16$ and $1.24 \pm 0.23$, respectively, for the three redshift bins of width $\Delta z = 0.2$ in the range $0.2 < z < 0.8$.
Verification and Implementation of Operations Safety Controls for Flight Missions
NASA Technical Reports Server (NTRS)
Smalls, James R.; Jones, Cheryl L.; Carrier, Alicia S.
2010-01-01
Safety is one of several engineering disciplines practiced within NASA, alongside reliability, supportability, quality assurance, human factors, and risk management, and it is an extremely important specialty: any consequence involving loss of crew is considered a catastrophic event. Safety is not difficult to achieve when it is properly integrated at the beginning of each space systems project and at the start of mission planning. The key is to ensure proper handling of safety verification throughout each flight/mission phase. Today, Safety and Mission Assurance (S&MA) operations engineers continue to conduct these flight product reviews across all open flight products. These reviews help ensure that each mission is accomplished with safety requirements and controls firmly embedded in the applicable flight products. Most importantly, the S&MA operations engineers are required to look for important design and operations controls so that safety is strictly adhered to and reflected in the final flight product.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
TRAC-PF1 code verification with data from the OTIS test facility [Once-Through Integral System]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childerson, M.T.; Fujita, R.K.
1985-01-01
A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small-break loss-of-coolant accident (LOCA) data from a scaled, experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small break LOCA data set for TRAC verification. The major phases of a small break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater initiated boiler-condenser mode heat transfer.
Three years of operational experience from Schauinsland CTBT monitoring station.
Zähringer, M; Bieringer, J; Schlosser, C
2008-04-01
Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at the Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system maintained a high data availability of 90% within the reporting period. A daily screening process yielded 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events associated with a plausible source; the remaining 64 cases can consistently be explained by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows calibration instabilities during the test phase and good agreement since certification of the system.
Integrated heat pipe-thermal storage system performance evaluation
NASA Technical Reports Server (NTRS)
Keddy, E.; Sena, J. T.; Merrigan, M.; Heidenreich, Gary
1987-01-01
An integrated thermal energy storage (TES) system, developed as part of an organic Rankine cycle solar dynamic power system, is described, and the results of the performance verification tests of this TES system are presented. The integrated system consists of potassium heat-pipe elements that incorporate TES canisters within the vapor space, along with an organic fluid heater tube used as the condenser region of the heat pipe. The heat pipe assembly was operated through the range of design conditions from the nominal design input of 4.8 kW to a maximum of 5.7 kW. The performance verification tests show that the system meets the functional requirements of absorbing the solar energy reflected by the concentrator, transporting the energy to the organic Rankine heater, providing thermal storage for the eclipse phase, and allowing uniform discharge from the thermal storage to the heater.
Kim, Kimin; Park, Jong-Kyu; Boozer, Allen H
2013-05-03
This Letter presents the first numerical verification of the bounce-harmonic (BH) resonance phenomena in the neoclassical transport of a tokamak perturbed by nonaxisymmetric magnetic fields. The BH resonances were predicted by analytic theories of neoclassical toroidal viscosity (NTV): the parallel and perpendicular drift motions can be resonant and result in a great enhancement of the radial momentum transport. A new drift-kinetic δf guiding-center particle code, POCA, clearly verified that the perpendicular drift motions can reduce the transport by phase-mixing, but that in the BH resonances the motions can form closed orbits and particles drift radially outward fast. The POCA calculations of the resulting NTV torque are largely consistent with analytic calculations, and show that the BH resonances can easily dominate the NTV torque when a plasma rotates in the perturbed tokamak; they are therefore critical physics for predicting the rotation and stability of the International Thermonuclear Experimental Reactor.
Development of syntax of intuition-based learning model in solving mathematics problems
NASA Astrophysics Data System (ADS)
Yeni Heryaningsih, Nok; Khusna, Hikmatul
2018-01-01
The aim of the research was to produce a syntax of the Intuition Based Learning (IBL) model for solving mathematics problems, to improve mathematics students' achievement, that is valid, practical and effective. The subjects of the research were two classes of grade XI students of SMAN 2 Sragen, Central Java. The type of the research was Research and Development (R&D). The development process adopted the Plomp and Borg & Gall development models, comprising a preliminary investigation step, a design step, a realization step, and an evaluation and revision step. The development steps were as follows: (1) Collected information and studied theories in the preliminary investigation step, covering intuition, learning model development, student conditions, and topic analysis; (2) Designed a syntax that could bring up intuition in solving mathematics problems, and then designed the research instruments; the phases that could bring up intuition were a preparation phase, an incubation phase, an illumination phase and a verification phase; (3) Realized the syntax of the Intuition Based Learning model that had been designed as the first draft; (4) Had the first draft validated by the validator; (5) Tested the syntax of the Intuition Based Learning model in the classrooms to establish its effectiveness; (6) Conducted a Focus Group Discussion (FGD) to evaluate the results of testing the syntax model in the classrooms, and then revised the syntax of the IBL model. The research produced a syntax of the IBL model for solving mathematics problems that is valid, practical and effective. The syntax of the IBL model in the classroom was: (1) Opening with apperception, motivation, and building students' positive perceptions; (2) The teacher explains the material generally; (3) Group discussion of the material; (4) The teacher gives students mathematics problems; (5) Individual exercises to solve mathematics problems with steps that could bring up students' intuition: preparation, incubation, illumination, and verification; (6) Closure with a review of what students have learned, or giving homework.