Sample records for complex surfaces verification

  1. The greenhouse theory of climate change - A test by an inadvertent global experiment

    NASA Technical Reports Server (NTRS)

    Ramanathan, V.

    1988-01-01

    The greenhouse theory of climate change has reached the crucial stage of verification. Surface warming as large as that predicted by models would be unprecedented during an interglacial period such as the present. The theory, its scope for verification, and the emerging complexities of the climate feedback mechanisms are discussed in this paper. The evidence for change is described and competing nonclimatic forcings are discussed.

  2. Development and Verification of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
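
    The abstract mentions several numerical methods for computing geometric view factors. As a hedged illustration (not the CHAR code's actual algorithm; all names and parameters below are invented), this sketch estimates the view factor between two directly opposed parallel unit squares by Monte Carlo ray casting, a configuration whose known analytical value (about 0.1998) provides a built-in check:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def view_factor_mc(n_rays=200_000, gap=1.0, size=1.0):
        """Monte Carlo estimate of the view factor F12 between two
        parallel, directly opposed square patches of side `size`
        separated by `gap` (a textbook case with a known answer)."""
        # Emission points uniformly distributed on patch 1 (z = 0).
        x0 = rng.uniform(0.0, size, n_rays)
        y0 = rng.uniform(0.0, size, n_rays)
        # Cosine-weighted hemisphere sampling for diffuse emission.
        u1, u2 = rng.uniform(size=n_rays), rng.uniform(size=n_rays)
        sin_t, cos_t = np.sqrt(u1), np.sqrt(1.0 - u1)
        phi = 2.0 * np.pi * u2
        dx, dy, dz = sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t
        # Intersect each ray with the plane of patch 2 (z = gap).
        t = gap / dz
        xh, yh = x0 + t * dx, y0 + t * dy
        hits = (xh >= 0) & (xh <= size) & (yh >= 0) & (yh <= size)
        return hits.mean()

    print(view_factor_mc())   # ~0.1998 for unit squares at unit gap
    ```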

  3. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft, the launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, investigating them for signs of past and present life with exobiological experiments, and studying the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activity flow and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operating conditions cannot be verified by analysis alone; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in the Mars environment, and the simulation of rover operations on a 'Mars like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) through to the final verification close-out of those requirements with the final verification reports.

  4. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented followed by a discussion of the software architecture and some preliminary verification and validation studies.

  5. Numerical Modeling of Ablation Heat Transfer

    NASA Technical Reports Server (NTRS)

    Ewing, Mark E.; Laker, Travis S.; Walker, David T.

    2013-01-01

    A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
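
    The verification workflow named here (analytical solutions plus observed error convergence rates) can be sketched generically. The toy below is not the paper's scheme and uses invented parameters: it solves fixed-grid 1-D conduction (no ablation or moving surface) and checks the L2 error against the classical semi-infinite-solid erfc solution on two grid resolutions:

    ```python
    import numpy as np
    from scipy.special import erfc

    def analytic(x, t, alpha=1e-6, Ts=1.0):
        # Semi-infinite solid after a step change in surface temperature:
        # T(x, t) = Ts * erfc(x / (2 * sqrt(alpha * t)))
        return Ts * erfc(x / (2.0 * np.sqrt(alpha * t)))

    def solve_explicit(nx, L=0.05, t_end=100.0, alpha=1e-6, Ts=1.0):
        """Explicit control-volume solution of 1-D conduction with a
        fixed surface temperature at x=0 and an adiabatic back face."""
        dx = L / (nx - 1)
        dt = 0.4 * dx * dx / alpha        # stable: Fourier number < 0.5
        T = np.zeros(nx)
        T[0] = Ts
        t = 0.0
        while t < t_end:
            step = min(dt, t_end - t)
            lam = alpha * step / dx**2
            T[1:-1] = T[1:-1] + lam * (T[2:] - 2.0 * T[1:-1] + T[:-2])
            T[-1] = T[-2]                 # adiabatic back face
            t += step
        return np.linspace(0.0, L, nx), T

    errors = []
    for nx in (51, 101):
        x, T = solve_explicit(nx)
        errors.append(np.sqrt(np.mean((T - analytic(x, 100.0)) ** 2)))
    # Observed order should approach the scheme's nominal second order.
    print(errors, np.log(errors[0] / errors[1]) / np.log(2.0))
    ```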

  6. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software verification and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  7. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting SoC designs. This increase in complexity poses complicated validation challenges and has led researchers to propose a variety of methodologies to overcome the problem, including dynamic verification, formal verification, and hybrid techniques. It is also very important to discover bugs early in the SoC verification process in order to reduce time consumption and shorten time to market. In this paper we therefore focus on verification methodology at the register transfer level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as an effort towards faster time to market. OVM is thus proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.

  8. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com

    2016-11-15

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.
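
    For orientation, these are the standard definitions such an argument rests on, given here as hedged background rather than as the paper's specific bounds:

    ```latex
    % Background (standard definitions, not the paper's results): the
    % trace distance between the real key state \rho and the ideal
    % uniform key state \sigma, and the trivial range for the number of
    % steps needed to verify a k-bit key by exhaustive checking.
    \[
      D(\rho, \sigma) = \tfrac{1}{2} \, \mathrm{Tr} \, \lvert \rho - \sigma \rvert ,
      \qquad
      1 \le N_{\mathrm{verify}} \le 2^{k} .
    \]
    ```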

  9. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, where the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  10. Development of an Unstructured, Three-Dimensional Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph; Stern, Eric; Palmer, Grant; Muppidi, Suman; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. The extensibility is critical since thermal protection systems are becoming increasingly complex, e.g. woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries as well as multi-dimensional physics, which have been shown to be important in some scenarios and are not captured by one-dimensional models. In this paper, the mathematical and numerical formulation is presented followed by a discussion of the software architecture and some preliminary verification and validation studies.

  11. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    PubMed

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm2, which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.
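
    A minimal sketch of the chemometric step described here, PCA applied to spectra, using synthetic stand-in data rather than real SERS measurements; the library calls are standard scikit-learn, and everything else (band positions, concentrations, noise level) is invented:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)

    # Synthetic stand-in for baseline-corrected SERS spectra: two species
    # with Gaussian bands at different Raman shifts, plus noise.
    shifts = np.linspace(400, 1800, 700)            # cm^-1
    def band(center, width=15.0):
        return np.exp(-((shifts - center) / width) ** 2)
    species_a = band(520) + 0.6 * band(1340)
    species_b = band(780) + 0.8 * band(1600)

    n = 40
    conc = rng.uniform(0, 1, (n, 2))                # (samples, species)
    spectra = conc @ np.vstack([species_a, species_b])
    spectra += rng.normal(0, 0.02, spectra.shape)   # measurement noise

    # Project onto the first two principal components; in a workflow
    # like the paper's, clustering in this score space separates the
    # compounds recovered from the contaminated surface.
    scores = PCA(n_components=2).fit_transform(spectra)
    print(scores[:5])
    ```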

  12. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
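
    A toy sketch of the dependency-graph idea described above: each variable declares its dependencies, and evaluation proceeds in topological order so a variable is computed only after everything it depends on. The variable names and formulas are invented for illustration and are not from Arcos:

    ```python
    from graphlib import TopologicalSorter   # Python 3.9+

    # Hypothetical variables: each entry maps a variable name to
    # (its dependencies, a function computing it from them).
    models = {
        "porosity":      ((), lambda: 0.4),
        "saturation":    ((), lambda: 0.7),
        "water_content": (("porosity", "saturation"),
                          lambda p, s: p * s),
        "conductivity":  (("water_content",),
                          lambda w: 1e-5 * w ** 3),
    }

    # Build the dependency graph and evaluate in topological order, so
    # each model runs only after everything it depends on is up to date.
    graph = {var: set(deps) for var, (deps, _) in models.items()}
    state = {}
    for var in TopologicalSorter(graph).static_order():
        deps, fn = models[var]
        state[var] = fn(*(state[d] for d in deps))
    print(state)
    ```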

  13. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
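
    Two ingredients of the idea in miniature, offered as a generic illustration rather than the authors' method: the bitwise majority voter that TMR relies on, and a toy structural check that every register in a (hypothetical) netlist was actually tripled. All netlist names are invented:

    ```python
    from collections import Counter

    def tmr_vote(a, b, c):
        """Bitwise majority vote across three redundant copies.
        For multi-bit words, vote each bit: (a&b) | (a&c) | (b&c)."""
        return (a & b) | (a & c) | (b & c)

    # A single upset in one copy is masked by the other two.
    golden = 0b1011_0110
    upset = golden ^ 0b0000_0100              # bit flip in copy 2
    assert tmr_vote(golden, upset, golden) == golden

    # Toy verification check in the spirit of the paper: confirm every
    # register appears exactly three times rather than trusting the
    # insertion tool.
    replicas = Counter(["r0_A", "r0_B", "r0_C", "r1_A", "r1_B"])
    registers = {name[:-2] for name in replicas}
    missing = [r for r in registers
               if sum(n.startswith(r + "_") for n in replicas) != 3]
    print("registers missing full TMR:", missing)   # -> ['r1']
    ```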

  14. High resolution simulations of orographic flow over a complex terrain on the Southeast coast of Brazil

    NASA Astrophysics Data System (ADS)

    Chou, S. C.; Zolino, M. M.; Gomes, J. L.; Bustamante, J. F.; Lima-e-Silva, P. P.

    2012-04-01

    The Eta Model has been used operationally by CPTEC to produce weather forecasts over South America since 1997, and has gone through several upgrades. In order to prepare the model for operational higher-resolution forecasts, it is configured and tested over a region of complex topography located near the coast of Southeast Brazil. The Eta Model was configured with 2-km horizontal resolution and 50 layers. The Eta-2km is a second nesting: it is driven by the Eta-15km, which in turn is driven by ERA-Interim reanalyses. The model domain includes the two Brazilian cities of Rio de Janeiro and Sao Paulo, urban areas, preserved tropical forest, pasture fields, and complex terrain and coastline. Mountains rise up to about 700 m, and the region suffers frequent events of floods and landslides. The objective of this work is to evaluate high-resolution simulations of wind and temperature in this complex area. Verification of the model runs uses observations taken from the nuclear power plant in the region. Accurate near-surface wind direction and magnitude are needed for the plant emergency plan, and winds are highly sensitive to model spatial resolution and atmospheric stability. Verification of two summer cases shows that the model has a clear diurnal-cycle signal for wind in that region. The area is characterized by weak winds, which makes the simulation more difficult. The simulated wind magnitude is about 1.5 m/s, close to the observed value of about 2 m/s; however, the observed change of wind direction with the sea breeze is fast, whereas it is slow in the simulations. The nighttime katabatic flow is captured by the simulations. Comparison against Eta-5km runs shows that the valley circulation is better described in the 2-km resolution run. Simulated temperatures closely follow the observed diurnal cycle. Experiments improving some surface conditions, such as surface temperature and land cover, show reduced simulation error and an improved diurnal cycle.

  15. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    PubMed

    To Duc, Khanh

    2017-11-18

    Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to a distorted conclusion on diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A shiny web application for the correction of verification bias in estimation of the ROC surface is also developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/ .
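
    bcROCsurface itself is an R package; as a language-neutral sketch of the quantity it corrects, here is the plain (fully verified, no bias correction) empirical VUS estimator, the U-statistic estimate of P(X1 < X2 < X3), run on synthetic data:

    ```python
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(2)

    # Synthetic continuous test results for three ordered disease states.
    x1 = rng.normal(0.0, 1.0, 30)   # non-diseased
    x2 = rng.normal(1.0, 1.0, 30)   # intermediate
    x3 = rng.normal(2.0, 1.0, 30)   # diseased

    def vus(x1, x2, x3):
        """Empirical volume under the ROC surface: the U-statistic
        estimate of P(X1 < X2 < X3). Chance level is 1/6."""
        count = sum(a < b < c for a, b, c in product(x1, x2, x3))
        return count / (len(x1) * len(x2) * len(x3))

    print(round(vus(x1, x2, x3), 3))
    ```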

  16. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements Hoda Mehrpouyan1, Irem Y. Tumer2, Chris Hoyle2, Dimitra Giannakopoulou3...requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements...methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems Traditional methods and tools

  17. The Interaction between Surface Color and Color Knowledge: Behavioral and Electrophysiological Evidence

    ERIC Educational Resources Information Center

    Bramao, Ines; Faisca, Luis; Forkstam, Christian; Inacio, Filomena; Araujo, Susana; Petersson, Karl Magnus; Reis, Alexandra

    2012-01-01

    In this study, we used event-related potentials (ERPs) to evaluate the contribution of surface color and color knowledge information in object identification. We constructed two color-object verification tasks--a surface and a knowledge verification task--using high color diagnostic objects; both typical and atypical color versions of the same…

  18. On Machine Capacitance Dimensional and Surface Profile Measurement System

    NASA Technical Reports Server (NTRS)

    Resnick, Ralph

    1993-01-01

    A program was awarded under the Air Force Machine Tool Sensor Improvements Program Research and Development Announcement to develop and demonstrate the use of a Capacitance Sensor System including Capacitive Non-Contact Analog Probe and a Capacitive Array Dimensional Measurement System to check the dimensions of complex shapes and contours on a machine tool or in an automated inspection cell. The manufacturing of complex shapes and contours and the subsequent verification of those manufactured shapes is fundamental and widespread throughout industry. The critical profile of a gear tooth; the overall shape of a graphite EDM electrode; the contour of a turbine blade in a jet engine; and countless other components in varied applications possess complex shapes that require detailed and complex inspection procedures. Current inspection methods for complex shapes and contours are expensive, time-consuming, and labor intensive.

  19. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    The 1Xnm technology node lithography uses SMO-ILT, NTD, or even more complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected in aggressive mask features. One key technology of mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology shows excellent correlation with the wafer and is the standard tool for defect verification; however, it is difficult to use for verifying a hundred or more defects. We previously reported the capability of defect verification based on lithography simulation with a SEM system whose architecture and software show excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.

  20. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has

  1. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  2. Near-field plasmonic beam engineering with complex amplitude modulation based on metasurface

    NASA Astrophysics Data System (ADS)

    Song, Xu; Huang, Lingling; Sun, Lin; Zhang, Xiaomeng; Zhao, Ruizhe; Li, Xiaowei; Wang, Jia; Bai, Benfeng; Wang, Yongtian

    2018-02-01

    Metasurfaces have recently attracted extensive interest due to their ability to locally manipulate electromagnetic waves, which provides great flexibility for tailoring both propagating waves and surface plasmon polaritons (SPPs). Manipulation of SPPs with arbitrary complex fields is an important issue in integrated nanophotonics due to their capability of guiding waves with subwavelength footprints. Here, an approach based on metasurfaces composed of nanoaperture arrays is proposed and experimentally demonstrated which can effectively manipulate the complex amplitude of SPPs in the near-field regime. By tailoring the azimuthal angles of individual nanoapertures and simultaneously tuning their geometric parameters, the phase and amplitude are controlled through the Pancharatnam-Berry phase and the apertures' individual transmission coefficients. For verification of the concept, Airy plasmons and axisymmetric Airy-SPPs are generated; the results of numerical simulations and near-field imaging are consistent with each other. Besides the rigorous simulations, we applied a 2D dipole model for additional analysis. This strategy of complex amplitude manipulation with metasurfaces can be used for potential applications in plasmonic beam shaping, integrated optoelectronic systems, and surface wave holography.

  3. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  4. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  5. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE, and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification testing on flight hardware.

  6. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  7. Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.

  8. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
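
    A rough sketch of the inverse-probability-weighting idea on synthetic data: subjects are verified with probability depending on the test result (so the verified subsample is biased), and each ordered triple is weighted by the product of its members' weights. This illustrates the estimator family, not the paper's exact formula or its variance machinery; in practice the true status of unverified subjects is unknown, but their weight is zero so they drop out. All names and parameters are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Test result T, true 3-class status D, verification flag V.
    n = 150
    D = rng.integers(0, 3, n)
    T = D + rng.normal(0.0, 1.0, n)
    pi = 1.0 / (1.0 + np.exp(-(T - 1.0)))    # P(verified | T), assumed known
    V = rng.uniform(size=n) < pi
    w = V / pi                               # inverse probability weights

    def ipw_vus(T, D, w):
        """Weighted U-statistic for P(T0 < T1 < T2): verified subjects
        are up-weighted to stand in for similar unverified ones."""
        idx = [np.flatnonzero(D == k) for k in (0, 1, 2)]
        num = den = 0.0
        for i in idx[0]:
            for j in idx[1]:
                for k in idx[2]:
                    wt = w[i] * w[j] * w[k]
                    den += wt
                    num += wt * (T[i] < T[j] < T[k])
        return num / den

    print(round(ipw_vus(T, D, w), 3))
    ```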

  9. Challenges in the Verification of Reinforcement Learning Algorithms

    NASA Technical Reports Server (NTRS)

    Van Wesel, Perry; Goodloe, Alwyn E.

    2017-01-01

    Machine learning (ML) is increasingly being applied to a wide array of domains from search engines to autonomous vehicles. These algorithms, however, are notoriously complex and hard to verify. This work looks at the assumptions underlying machine learning algorithms as well as some of the challenges in trying to verify ML algorithms. Furthermore, we focus on the specific challenges of verifying reinforcement learning algorithms. These are highlighted using a specific example. Ultimately, we do not offer a solution to the complex problem of ML verification, but point out possible approaches for verification and interesting research opportunities.

  10. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  11. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    NASA Astrophysics Data System (ADS)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In this article the process of verification (calibration) of oil metering units' secondary equipment is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the programmed algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without it). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering unit secondary equipment. Automatic verification with the hardware-software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.

  12. Arms Control: Verification and Compliance. Foreign Policy Association Headline Series, No. 270.

    ERIC Educational Resources Information Center

    Krepon, Michael

    One in a series of booklets whose purpose is to stimulate greater and more effective understanding of world affairs among Americans, this five-chapter report is geared to the nonexpert wanting to know more about the complex topics of verification and compliance with arms control agreements. "Basic Concepts of Verification" examines the…

  13. Clean assembly and integration techniques for the Hubble Space Telescope High Fidelity Mechanical Simulator

    NASA Technical Reports Server (NTRS)

    Hughes, David W.; Hedgeland, Randy J.

    1994-01-01

    A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.

  14. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one-foot layer of soil on the site was removed in its entirety.

  15. Evaluating and Improving Wind Forecasts over South China: The Role of Orographic Parameterization in the GRAPES Model

    NASA Astrophysics Data System (ADS)

    Zhong, Shuixin; Chen, Zitong; Xu, Daosheng; Zhang, Yanxia

    2018-06-01

    Unresolved small-scale orographic (SSO) drags are parameterized in a regional model based on the Global/Regional Assimilation and Prediction System for the Tropical Mesoscale Model (GRAPES TMM). The SSO drags are represented by adding a sink term in the momentum equations. The maximum height of the mountain within the grid box is adopted in the SSO parameterization (SSOP) scheme as compensation for the drag. The effects of the unresolved topography are parameterized as the feedbacks to the momentum tendencies on the first model level in planetary boundary layer (PBL) parameterization. The SSOP scheme has been implemented and coupled with the PBL parameterization scheme within the model physics package. A monthly simulation is designed to examine the performance of the SSOP scheme over the complex terrain areas located in the southwest of Guangdong. The verification results show that the surface wind speed bias has been much alleviated by adopting the SSOP scheme, in addition to reduction of the wind bias in the lower troposphere. The target verification over Xinyi shows that the simulations with the SSOP scheme provide improved wind estimation over the complex regions in the southwest of Guangdong.
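
    The abstract specifies only that the drag enters the momentum equations as a sink term tied to the maximum subgrid mountain height. One conventional shape for such a sink term, assumed here for illustration rather than quoted from the paper, is:

    ```latex
    % Hedged sketch of a subgrid-orographic momentum sink (assumed form):
    \[
      \left( \frac{\partial \mathbf{u}}{\partial t} \right)_{\mathrm{SSO}}
      = - \, C_d \, \alpha(h_{\max}) \, \lvert \mathbf{u} \rvert \, \mathbf{u} ,
    \]
    % where h_max is the maximum unresolved mountain height in the grid
    % box, and the tendency is applied at the lowest model level(s) and
    % handed to the PBL scheme as an additional momentum forcing.
    ```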

  16. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper presents first results of a new code verification effort within LLNL's B Division. In particular, we show results of code verification of the LLNL ASC ARES code on the following test problems: Su-Olson non-equilibrium radiation diffusion, the Sod shock tube, the Sedov point blast modeled with shock hydrodynamics, and the Noh implosion.

  17. Field-scale and wellbore modeling of compaction-induced casing failures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hilbert, L.B. Jr.; Gwinn, R.L.; Moroney, T.A.

    1999-06-01

    Presented in this paper are the results and verification of field- and wellbore-scale large deformation, elasto-plastic, geomechanical finite element models of reservoir compaction and associated casing damage. The models were developed as part of a multidisciplinary team project to reduce the number of costly well failures in the diatomite reservoir of the South Belridge Field near Bakersfield, California. Reservoir compaction of high porosity diatomite rock induces localized shearing deformations on horizontal weak-rock layers and geologic unconformities. The localized shearing deformations result in casing damage or failure. Two-dimensional, field-scale finite element models were used to develop relationships between field operations, surface subsidence, and shear-induced casing damage. Pore pressures were computed for eighteen years of simulated production and water injection, using a three-dimensional reservoir simulator. The pore pressures were input to the two-dimensional geomechanical field-scale model. Frictional contact surfaces were used to model localized shear deformations. To capture the complex casing-cement-rock interaction that governs casing damage and failure, three-dimensional models of a wellbore were constructed, including a frictional sliding surface to model localized shear deformation. Calculations were compared to field data for verification of the models.

  18. Multi-centre audit of VMAT planning and pre-treatment verification.

    PubMed

    Jurado-Bruggeman, Diego; Hernández, Victor; Sáez, Jordi; Navarro, David; Pino, Francisco; Martínez, Tatiana; Alayrach, Maria-Elena; Ailleres, Norbert; Melero, Alejandro; Jornet, Núria

    2017-08-01

    We performed a multi-centre intercomparison of VMAT dose planning and pre-treatment verification. The aims were to analyse the dose plans in terms of dosimetric quality and deliverability, and to validate whether in-house pre-treatment verification results agreed with those of an external audit. The nine participating centres encompassed different machines, equipment, and methodologies. Two mock cases (prostate and head and neck) were planned using one and two arcs. A plan quality index was defined to compare the plans, and different complexity indices were calculated to check their deliverability. We compared gamma index pass rates obtained using each centre's equipment and methodology to those of an external audit (global 3D gamma, absolute dose differences, 10% of maximum dose threshold). Log-file analysis was performed to look for delivery errors. All centres fulfilled the dosimetric goals, but plan quality and delivery complexity were heterogeneous and uncorrelated, depending on the manufacturer and the planner's methodology. Pre-treatment verification results were within tolerance in all cases for the gamma 3%-3mm evaluation. Nevertheless, differences between the external audit and in-house measurements arose due to different equipment or methodology, especially for the 2%-2mm criteria, with differences up to 20%. No correlation was found between complexity indices and verification results amongst centres. All plans fulfilled dosimetric constraints, but plan quality and complexity did not correlate and were strongly dependent on the planner and the vendor. In-house measurements cannot completely replace external audits for credentialing. Copyright © 2017 Elsevier B.V. All rights reserved.
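
    A simplified 1-D sketch of the global gamma evaluation used in the audit (3% dose difference normalized to the maximum reference dose, 3 mm distance-to-agreement, 10% low-dose threshold). Clinical implementations are 3-D and interpolate finely, so this is illustrative only; the profiles below are synthetic:

    ```python
    import numpy as np

    def gamma_pass_rate(dose_ref, dose_eval, x, dd=0.03, dta=3.0, thresh=0.10):
        """Global 1-D gamma analysis: dd is the dose criterion as a
        fraction of the maximum reference dose, dta is the
        distance-to-agreement in mm, thresh is the low-dose cutoff."""
        d_norm = dd * dose_ref.max()
        mask = dose_ref >= thresh * dose_ref.max()
        gammas = []
        for xi, di in zip(x[mask], dose_ref[mask]):
            # Minimize the combined dose/distance metric over all
            # evaluated points.
            g2 = ((dose_eval - di) / d_norm) ** 2 + ((x - xi) / dta) ** 2
            gammas.append(np.sqrt(g2.min()))
        return (np.array(gammas) <= 1.0).mean()

    x = np.linspace(0, 100, 501)                   # position, mm
    ref = np.exp(-((x - 50) / 15) ** 2)            # reference profile
    ev = 1.02 * np.exp(-((x - 50.5) / 15) ** 2)    # measured profile
    print(f"pass rate: {gamma_pass_rate(ref, ev, x):.1%}")
    ```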

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT STORMWATER MANAGEMENT INC., STORMFILTER SYSTEM WITH ZPG MEDIA

    EPA Science Inventory

    Verification testing of the Stormwater Management, Inc. StormFilter Using ZPG Filter Media was conducted on a 0.19 acre portion of the eastbound highway surface of Interstate 794, at an area commonly referred to as the "Riverwalk" site near downtown Milwaukee, Wisconsin...

  20. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.

  1. End-to-End Commitment

    NASA Technical Reports Server (NTRS)

    Newcomb, John

    2004-01-01

    The end-to-end test would verify the complex sequence of events from lander separation to landing. Due to the large distances involved and the significant delay time in sending a command and receiving verification, the lander needed to operate autonomously after it separated from the orbiter. It had to sense conditions, make decisions, and act accordingly. We were flying into a relatively unknown set of conditions-a Martian atmosphere of unknown pressure, density, and consistency to land on a surface of unknown altitude, and one which had an unknown bearing strength.

  2. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    EPA Science Inventory

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  4. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  5. Verification tests of durable TPS concepts

    NASA Technical Reports Server (NTRS)

    Shideler, J. L.; Webb, G. L.; Pittman, C. M.

    1984-01-01

    Titanium multiwall, superalloy honeycomb, and Advanced Carbon-carbon (ACC) multipost Thermal Protection System (TPS) concepts are being developed to provide durable protection for surfaces of future space transportation systems. Verification tests including thermal, vibration, acoustic, water absorption, lightning strike, and aerothermal tests are described. Preliminary results indicate that the three TPS concepts are viable up to a surface temperature in excess of 2300 F.

  6. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
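
    A toy illustration of what an automated FFM check can do: treat the model as a directed graph of failure-effect propagation and verify, for example, that every failure mode reaches an observable symptom. The node names are invented and this is not the NASA tooling:

    ```python
    from collections import defaultdict, deque

    # Toy functional fault model: an edge means "failure effect
    # propagates from -> to".
    edges = [("valve_stuck", "low_flow"),
             ("low_flow", "pump_cavitation"),
             ("pump_cavitation", "low_pressure_alarm"),
             ("sensor_bias", "low_pressure_alarm")]

    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)

    def effects(fault):
        """All downstream effects reachable from a fault node (BFS).
        Automated checks like this can confirm that every modeled
        failure mode reaches at least one observable symptom."""
        seen, queue = set(), deque([fault])
        while queue:
            node = queue.popleft()
            for nxt in graph[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    print(effects("valve_stuck"))
    # -> {'low_flow', 'pump_cavitation', 'low_pressure_alarm'}
    ```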

  7. Updated one-dimensional hydraulic model of the Kootenai River, Idaho-A supplement to Scientific Investigations Report 2005-5110

    USGS Publications Warehouse

    Czuba, Christiana R.; Barton, Gary J.

    2011-01-01

    The Kootenai Tribe of Idaho, in cooperation with local, State, Federal, and Canadian agency co-managers and scientists, is assessing the feasibility of a Kootenai River habitat restoration project in Boundary County, Idaho. The restoration project is focused on recovery of the endangered Kootenai River white sturgeon (Acipenser transmontanus) population, and simultaneously targets habitat-based recovery of other native river biota. River restoration is a complex undertaking that requires a thorough understanding of the river and floodplain landscape prior to restoration efforts. To assist in evaluating the feasibility of this endeavor, the U.S. Geological Survey developed an updated one-dimensional hydraulic model of the Kootenai River in Idaho between river miles (RMs) 105.6 and 171.9 to characterize the current hydraulic conditions. A previously calibrated model of the study area, based on channel geometry data collected during 2002 and 2003, was the basis for this updated model. New high-resolution bathymetric surveys conducted in the study reach between RMs 138 and 161.4 provided additional detail of channel morphology. A light detection and ranging (LIDAR) survey was flown in the Kootenai River valley in 2005 between RMs 105.6 and 159.5 to characterize the floodplain topography. Six temporary gaging stations installed in 2006-08 between RMs 154.1 and 161.2, combined with five permanent gaging stations in the study reach, provided discharge and water-surface elevations for model calibration and verification. Measured discharges ranging from about 4,800 to 63,000 cubic feet per second (ft3/s) were simulated for calibration events, and calibrated water-surface elevations ranged from about 1,745 to 1,820 feet (ft) throughout the extent of the model. Calibration was considered acceptable when the simulated and measured water-surface elevations at gaging stations differed by less than (+/-)0.15 ft. Model verification consisted of simulating 10 additional events with measured discharges ranging from about 4,900 to 52,000 ft3/s, and comparing simulated and measured water-surface elevations at gaging stations. Average water-surface-elevation error in the verification simulations was 0.05 ft, with the error ranging from -1.17 to 0.94 ft over the range of events and gaging stations. Additional verification included a graphical comparison of measured average velocities that range from 1.0 to 6.2 feet per second to simulated velocities at four sites within the study reach for measured discharges ranging from about 7,400 to 46,600 ft3/s. The availability of high-resolution bathymetric and LIDAR data, along with the additional gaging stations in the study reach, allowed for more detail to be added to the model and a more thorough calibration, sensitivity, and verification analysis to be conducted. Model resolution and performance is most improved between RMs 140 and 160, which includes the 18.3-mile reach of the Kootenai River white sturgeon critical habitat.

  8. Normal contour error measurement on-machine and compensation method for polishing complex surface by MRF

    NASA Astrophysics Data System (ADS)

    Chen, Hua; Chen, Jihong; Wang, Baorui; Zheng, Yongcheng

    2016-10-01

    The magnetorheological finishing (MRF) process, which is based on the dwell-time method with constant normal spacing for flexible polishing, introduces normal contour errors when fine-polishing complex surfaces such as aspheres. Normal contour errors change the ribbon's shape and the consistency of the removal characteristics in MRF. A novel method is proposed to measure the normal contour errors along the machining track while polishing a complex surface, by continuously scanning the normal spacing between the workpiece and a laser range finder. Because the normal contour errors are measured dynamically, the workpiece's clamping precision, the multi-axis NC machining program, and the dynamic performance of the MRF machine can all be checked, providing verification and a safety check of the MRF process. A unit for on-machine measurement of the normal contour errors of complex surfaces was designed. Using the measurement unit's results as feedback to adjust the parameters of the feed-forward control and the multi-axis machining, an optimized servo control method is presented to compensate for the normal contour errors. An experiment polishing a 180 mm × 180 mm aspherical workpiece of fused silica by MRF was set up to validate the method. The results show that the normal contour error was controlled to less than 10 μm, and the PV value of the polished surface accuracy improved from 0.95λ to 0.09λ under the same process parameters. The technology has been applied in the PKC600-Q1 MRF machine developed by the China Academy of Engineering Physics since 2014, where it is used in national large-scale optical engineering programs for processing ultra-precision optical parts.

  9. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis, and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  10. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification; in contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers for the advection-diffusion-reaction (ADR) equation, such as those involving nonlinear advection, diffusion, or source terms, as well as non-constant coefficients. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. The tests start from a simple case of unidirectional advection, proceed to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
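
    The backbone of such a suite is the mesh-convergence check: solve a problem with a known exact solution on successively refined grids and confirm that the error decays at the scheme's formal order. The sketch below illustrates the idea on a first-order upwind discretization of linear advection; it is a generic example, not the authors' test suite.

        import numpy as np

        def l2_error(n, t_end=0.25, cfl=0.5):
            """First-order upwind for u_t + u_x = 0 on a periodic [0,1) grid;
            returns the L2 error against the exact translated profile."""
            dx = 1.0 / n
            dt = cfl * dx
            steps = int(round(t_end / dt))
            x = np.arange(n) * dx
            u = np.exp(-200.0 * (x - 0.3) ** 2)       # initial Gaussian pulse
            for _ in range(steps):
                u -= cfl * (u - np.roll(u, 1))        # upwind difference
            d = (x - 0.3 - steps * dt + 0.5) % 1.0 - 0.5  # periodic signed distance
            exact = np.exp(-200.0 * d ** 2)
            return np.sqrt(np.mean((u - exact) ** 2))

        e_coarse, e_fine = l2_error(200), l2_error(400)
        p_observed = np.log2(e_coarse / e_fine)       # expect ~1 for upwind
        print(f"observed order of accuracy: {p_observed:.2f}")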

  11. Verification of monitor unit calculations for non-IMRT clinical radiotherapy: report of AAPM Task Group 114.

    PubMed

    Stern, Robin L; Heaton, Robert; Fraser, Martin W; Goddu, S Murty; Kirby, Thomas H; Lam, Kwok Leung; Molineu, Andrea; Zhu, Timothy C

    2011-01-01

    The requirement of an independent verification of the monitor units (MU) or time calculated to deliver the prescribed dose to a patient has been a mainstay of radiation oncology quality assurance. The need for and value of such a verification were obvious when calculations were performed by hand using look-up tables, and the verification was achieved by a second person independently repeating the calculation. However, in a modern clinic using CT/MR/PET simulation, computerized 3D treatment planning, heterogeneity corrections, and complex calculation algorithms such as convolution/superposition and Monte Carlo, the purpose of and methodology for the MU verification have come into question. In addition, since the verification is often performed using a simpler geometrical model and calculation algorithm than the primary calculation, exact or almost exact agreement between the two can no longer be expected. Guidelines are needed to help the physicist set clinically reasonable action levels for agreement. This report addresses the following charges of the task group: (1) To re-evaluate the purpose and methods of the "independent second check" for monitor unit calculations for non-IMRT radiation treatment in light of the complexities of modern-day treatment planning. (2) To present recommendations on how to perform verification of monitor unit calculations in a modern clinic. (3) To provide recommendations on establishing action levels for agreement between primary calculations and verification, and to provide guidance in addressing discrepancies outside the action levels. These recommendations are to be used as guidelines only and shall not be interpreted as requirements.
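
    The core arithmetic of such a verification is a percent-difference comparison against an action level. A minimal sketch, with an illustrative 5% action level standing in for the clinic- and geometry-specific levels the report recommends establishing:

        def mu_disagreement(mu_primary, mu_check):
            """Percent difference of the verification MU relative to the primary."""
            return 100.0 * (mu_check - mu_primary) / mu_primary

        # Illustrative only; clinics set their own action levels per guidance.
        ACTION_LEVEL_PCT = 5.0

        for primary, check in [(212.0, 208.5), (150.0, 162.0)]:
            diff = mu_disagreement(primary, check)
            flag = "within action level" if abs(diff) <= ACTION_LEVEL_PCT else "INVESTIGATE"
            print(f"primary={primary:.1f} MU, check={check:.1f} MU, diff={diff:+.1f}%  {flag}")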

  12. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear; even simplifications of the equations describing nature usually end up being nonlinear partial differential equations. The transport (advection-diffusion-reaction, ADR) equation is a pivotal equation in atmospheric sciences and water quality. This nonlinear equation needs to be solved numerically for practical purposes, so academics and engineers rely thoroughly on the assistance of numerical codes. Thus, numerical codes require verification before they are utilized for multiple applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is solved as described in the design document. CFD verification is not a straightforward, well-defined course: only a complete test suite can uncover all the limitations and bugs, and results must be assessed to distinguish bug-induced defects from the innate limitations of a numerical scheme. As Roache (2009) notes, numerical verification is a state-of-the-art procedure, and sometimes novel tricks work out. This study conveys a synopsis of the experience we gained during a comprehensive verification process carried out for a transport solver. A test suite was designed that includes unit tests and algorithmic tests, layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and preservation of the initial shape. At the beginning, a mesh-convergence study, which is the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs that had remained concealed during the mesh-convergence study were uncovered with the method of false injection and by visualization of the results. Symmetry had a dual function: one bug was hidden by the symmetric nature of a test (it was detected afterward utilizing artificial false injection), while self-symmetry was used to design a new test in a case where the analytical solution of the ADR equation was unknown. Assisting subroutines were designed to check and post-process conservation of mass and oscillatory behavior. Finally, the capability of the solver to handle stiff reaction source terms was also checked. The above test suite was not only a decent tool for error detection but also provided thorough feedback on the ADR solver's limitations. Such information is the crux of any rigorous numerical modeling for a modeler who deals with surface/subsurface pollution transport.
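
    The self-symmetry trick mentioned above can be made executable without any analytical solution: a solver fed a symmetric problem must return a symmetric field, so residual asymmetry exposes a coding error. A minimal sketch with a deliberately injected asymmetric bug (all numerical details invented):

        import numpy as np

        def diffuse(u, r=0.25, steps=50, bug=False):
            """Explicit diffusion on a 1D grid with fixed boundary values."""
            u = u.copy()
            for _ in range(steps):
                left, right = np.roll(u, 1), np.roll(u, -1)
                if bug:
                    right = 1.001 * right      # tiny asymmetric coding error
                u[1:-1] = u[1:-1] + r * (left[1:-1] - 2 * u[1:-1] + right[1:-1])
            return u

        x = np.linspace(-1.0, 1.0, 101)
        u0 = np.exp(-10 * x ** 2)              # even (symmetric) initial profile
        for bug in (False, True):
            u = diffuse(u0, bug=bug)
            asym = np.max(np.abs(u - u[::-1]))  # mirror the field about x = 0
            print(f"bug={bug}: max asymmetry = {asym:.2e}")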

  13. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new method for calibrating the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly demonstrate that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.
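
    The pixel-equivalent calibration at the heart of such a system reduces to fixing a scale factor from a certified reference length. A minimal sketch with invented numbers (the regulation-specified procedure is more involved):

        # Pixel-equivalent calibration: a reference length imaged by the camera
        # fixes the scale, which then converts pixel measurements to millimetres.
        # All numbers below are invented for illustration.
        reference_length_mm = 10.0          # certified gauge block span
        reference_length_px = 2048.7        # same span measured in the image

        mm_per_px = reference_length_mm / reference_length_px

        measured_px = 1023.9                # spacing of two rule graduations, px
        measured_mm = measured_px * mm_per_px
        error_um = (measured_mm - 5.0) * 1000.0   # deviation from 5 mm nominal
        print(f"scale = {mm_per_px:.6f} mm/px, interval = {measured_mm:.4f} mm, "
              f"error = {error_um:+.1f} um")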

  14. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  15. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  16. Verification procedure for the wavefront quality of the primary mirrors for the MRO interferometer

    NASA Astrophysics Data System (ADS)

    Bakker, Eric J.; Olivares, Andres; Schmell, Reed A.; Schmell, Rodney A.; Gartner, Darren; Jaramillo, Anthony; Romero, Kelly; Rael, Andres; Lewis, Jeff

    2009-08-01

    We present the verification procedure for the 1.4 meter primary mirrors of the Magdalena Ridge Observatory Interferometer (MROI). Six mirrors are in mass production at Optical Surface Technologies (OST) in Albuquerque. The six identical parabolic mirrors will have a radius of curvature of 6300 mm and a final surface wavefront quality of 29 nm rms. The mirrors will be tested in a tower using a computer generated hologram and the Intellium H2000 interferometer from Engineering Synthesis Design, Inc. (ESDI). The mirror fabrication activities are currently in the early stage of polishing and have already delivered some promising results with the interferometer. A complex passive whiffle tree that takes into account the gravity loading for an alt-alt mount has been designed and fabricated by Advanced Mechanical and Optical Systems (AMOS, Belgium). The final testing of the primary mirrors will be completed with the mirror cells that will be used in the telescopes. In addition, we report on shear tests performed on the mirror cell pads glued to the back of the primary mirrors. The shear tests demonstrated that the glue can withstand at least 4.9 kilonewtons, which is within the requirements.

  17. Numerical Simulation of the Fluid-Structure Interaction of a Surface Effect Ship Bow Seal

    NASA Astrophysics Data System (ADS)

    Bloxom, Andrew L.

    Numerical simulations of fluid-structure interaction (FSI) problems were performed in an effort to verify and validate a commercially available FSI tool. This tool uses an iterative partitioned coupling scheme between CD-adapco's STAR-CCM+ finite volume fluid solver and Simulia's Abaqus finite element structural solver to simulate the FSI response of a system. Preliminary verification and validation (V&V) work was carried out to understand the numerical behavior of the codes individually and together as an FSI tool. Completed V&V work included code order verification of the respective fluid and structural solvers with Couette-Poiseuille flow and Euler-Bernoulli beam theory. These results confirmed the second-order accuracy of the spatial discretizations used. Following that, a mixture of solution verifications and model calibrations was performed with the inclusion of the physics models implemented in the solution of the FSI problems. Solution verifications were completed for stand-alone fluid and structural models as well as for the coupled FSI solutions. These results re-confirmed the spatial order of accuracy, now for more complex flows and physics models, as well as the order of accuracy of the temporal discretizations. In lieu of a good material definition, model calibration was performed to reproduce the experimental results; calibration was used for both instances of hyperelastic materials presented in the literature as validation cases, because those materials had been defined only as linear elastic. Calibrated, three-dimensional models of the bow seal on the University of Michigan bow seal test platform were able to reproduce the experimental results qualitatively through averaging of the forces and seal displacements. These simulations represent the only current 3D results for this case. One significant result of this study is the ability to visualize the flow around the seal and to directly measure the seal resistances at varying cushion pressures, seal immersions, forward speeds, and seal materials. SES design analysis could greatly benefit from the inclusion of flexible seals in simulations, and this work is a positive step in that direction. In future work, the inclusion of more complex seal geometries and contact will further enhance the capability of this tool.

  18. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
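
    For reference, the behavior that a correctly inserted TMR topology must realize is bitwise majority voting across three redundant copies of a signal. The following is a small software model of that expectation, not the authors' search-detect-and-verify tool:

        def majority_vote(a: int, b: int, c: int) -> int:
            """Bitwise majority of three redundant signals: (a&b) | (a&c) | (b&c)."""
            return (a & b) | (a & c) | (b & c)

        # A single corrupted replica must never change the voted output.
        golden = 0b1011_0101
        for flipped in range(8):
            corrupted = golden ^ (1 << flipped)   # single-event upset in one copy
            assert majority_vote(golden, golden, corrupted) == golden
        print("single-copy upsets masked in all bit positions")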

  19. Assessing the capability of numerical methods to predict earthquake ground motion: the Euroseistest verification and validation project

    NASA Astrophysics Data System (ADS)

    Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K.

    2009-12-01

    During the last decades, an important effort has been dedicated to developing accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media. The progress in methods and the increasing capability of computers have made it technically feasible to calculate realistic seismograms for frequencies of interest in seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project, an ongoing international collaborative work organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan, and the USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM), and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reaching about 400 m depth, surface S-wave velocity of 200 m/s). The prime target is to simulate 8 local earthquakes with magnitudes from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. Predictions obtained by one FDM team and the SEM team are close to each other and different from the other predictions (consistent with the ESG2006 exercise, which targeted the Grenoble Valley). Diffractions off the basin edges and induced surface-wave propagation mainly contribute to the differences between predictions. The differences are particularly large in the elastic models but remain important in models with attenuation. In the validation, predictions are compared with recordings from a local array of 19 surface and borehole accelerometers. The level of agreement is found to be event-dependent; for the largest-magnitude event the agreement is surprisingly good, even at high frequencies.

  20. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  1. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click

    PubMed Central

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128

  3. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  4. Spatial scale analysis in geophysics - Integrating surface and borehole geophysics in groundwater studies

    USGS Publications Warehouse

    Paillet, Frederick L.; Singhroy, V.H.; Hansen, D.T.; Pierce, R.R.; Johnson, A.I.

    2002-01-01

    Integration of geophysical data obtained at various scales can bridge the gap between localized data from boreholes and site-wide data from regional survey profiles. Specific approaches to such analysis include: 1) comparing geophysical measurements in boreholes with the same measurement made from the surface; 2) regressing geophysical data obtained in boreholes with water-sample data from screened intervals; 3) using multiple, physically independent measurements in boreholes to develop multivariate response models for surface geophysical surveys; 4) defining subsurface cell geometry for most effective survey inversion methods; and 5) making geophysical measurements in boreholes to serve as independent verification of geophysical interpretations. Integrated analysis of surface electromagnetic surveys and borehole geophysical logs at a study site in south Florida indicates that salinity of water in the surficial aquifers is controlled by a simple wedge of seawater intrusion along the coast and by a complex pattern of upward brine seepage from deeper aquifers throughout the study area. This interpretation was verified by drilling three additional test boreholes in carefully selected locations.

  5. Analysis of potential errors in real-time streamflow data and methods of data verification by digital computer

    USGS Publications Warehouse

    Lystrom, David J.

    1972-01-01

    Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
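
    A minimal sketch of the kind of basic screening routine described above, flagging step changes beyond a relative threshold and substituting simulated discharge for missing or suspect readings; the threshold and all values are invented for illustration:

        # ~25% jump threshold, inside the 20-30 percent band noted above.
        THRESHOLD = 0.25

        readings  = [1520.0, 1535.0, 1512.0, None, 2410.0, 1540.0]    # cfs
        simulated = [1518.0, 1530.0, 1515.0, 1521.0, 1528.0, 1537.0]  # model backup

        cleaned = []
        for obs, sim in zip(readings, simulated):
            prev = cleaned[-1] if cleaned else sim
            if obs is None or abs(obs - prev) / prev > THRESHOLD:
                cleaned.append(sim)    # substitute the simulated discharge
            else:
                cleaned.append(obs)
        print(cleaned)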

  6. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs that are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, as well as information about various operating scenarios and identification of the ones with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples using applications of DoE to analysis and verification of the ISS power system are provided.
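
    The run-count savings come from fractional factorial designs: confounding a high-order interaction lets main effects be screened with half (or less) of the full factorial. A generic half-fraction construction, with hypothetical factor names not taken from the ISS analysis:

        from itertools import product

        # Half-fraction 2^(4-1) design: vary A, B, C over all +/-1 combinations
        # and set D = A*B*C (generator I = ABCD), halving the number of runs
        # while keeping main effects estimable. Factor names are illustrative.
        factors = ("source_impedance", "load_power", "cable_length", "filter_C")

        runs = []
        for a, b, c in product((-1, +1), repeat=3):
            runs.append((a, b, c, a * b * c))

        print(f"{len(runs)} runs instead of {2 ** 4} full-factorial runs:")
        for run in runs:
            print(dict(zip(factors, run)))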

  7. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.
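
    A runtime monitor is, at its core, a small state machine that watches an event stream for property violations. A minimal sketch of a bounded-response monitor (the property, event names, and bound are invented; production RV frameworks synthesize such monitors from formal specifications):

        def monitor(events, bound=3):
            """Runtime monitor for 'every request is granted within `bound` steps'.
            Yields a verdict at each step; 'violation' once a request times out."""
            pending = []                      # step indices of unanswered requests
            for i, ev in enumerate(events):
                if ev == "request":
                    pending.append(i)
                elif ev == "grant" and pending:
                    pending.pop(0)
                if pending and i - pending[0] >= bound:
                    yield (i, "violation")
                    return
                yield (i, "ok")

        trace = ["request", "idle", "grant", "request", "idle", "idle", "idle"]
        for step, verdict in monitor(trace):
            print(step, verdict)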

  8. Dosimetric characterization and output verification for conical brachytherapy surface applicators. Part I. Electronic brachytherapy source

    PubMed Central

    Fulkerson, Regina K.; Micka, John A.; DeWerd, Larry A.

    2014-01-01

    Purpose: Historically, treatment of malignant surface lesions has been achieved with linear accelerator based electron beams or superficial x-ray beams. Recent developments in the field of brachytherapy now allow for the treatment of surface lesions with specialized conical applicators placed directly on the lesion. Applicators are available for use with high dose rate (HDR) 192Ir sources, as well as electronic brachytherapy sources. Part I of this paper will discuss the applicators used with electronic brachytherapy sources; Part II will discuss those used with HDR 192Ir sources. Although the use of these applicators has gained in popularity, the dosimetric characteristics including depth dose and surface dose distributions have not been independently verified. Additionally, there is no recognized method of output verification for quality assurance procedures with applicators like these. Existing dosimetry protocols available from the AAPM bookend the cross-over characteristics of a traditional brachytherapy source (as described by Task Group 43) being implemented as a low-energy superficial x-ray beam (as described by Task Group 61) as observed with the surface applicators of interest. Methods: This work aims to create a cohesive method of output verification that can be used to determine the dose at the treatment surface as part of a quality assurance/commissioning process for surface applicators used with HDR electronic brachytherapy sources (Part I) and 192Ir sources (Part II). Air-kerma rate measurements for the electronic brachytherapy sources were completed with an Attix Free-Air Chamber, as well as several models of small-volume ionization chambers to obtain an air-kerma rate at the treatment surface for each applicator. Correction factors were calculated using MCNP5 and EGSnrc Monte Carlo codes in order to determine an applicator-specific absorbed dose to water at the treatment surface from the measured air-kerma rate. Additionally, relative dose measurements of the surface dose distributions and characteristic depth dose curves were completed in-phantom. Results: Theoretical dose distributions and depth dose curves were generated for each applicator and agreed well with the measured values. A method of output verification was created that allows users to determine the applicator-specific dose to water at the treatment surface based on a measured air-kerma rate. Conclusions: The novel output verification methods described in this work will reduce uncertainties in dose delivery for treatments with these kinds of surface applicators, ultimately improving patient care. PMID:24506635
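
    The output-verification arithmetic described above amounts to scaling a measured air-kerma rate by an applicator-specific, Monte Carlo-derived factor. A minimal sketch with hypothetical numbers; the actual factors are applicator- and source-specific values from the paper's MCNP5/EGSnrc calculations:

        # All numeric values below are invented placeholders for illustration.
        air_kerma_rate_gy_min = 0.42          # measured at the treatment plane
        mc_factor = {"10mm_cone": 1.18,       # Gy(water)/Gy(air), hypothetical
                     "20mm_cone": 1.12}

        applicator = "20mm_cone"
        dose_rate = air_kerma_rate_gy_min * mc_factor[applicator]
        treat_time_min = 5.0 / dose_rate      # time to deliver a 5 Gy surface dose
        print(f"{applicator}: {dose_rate:.3f} Gy/min, 5 Gy in {treat_time_min:.1f} min")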

  9. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth, e.g., estuaries and coastal regions. The latter is suited for small surface wave heights compared to depth, because surface elevation was removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  10. Lifting-surface-theory aspect-ratio corrections to the lift and hinge-moment parameters for full-span elevators on horizontal tail surfaces

    NASA Technical Reports Server (NTRS)

    Swanson, Robert S; Crandall, Stewart M

    1948-01-01

    A limited number of lifting-surface-theory solutions for wings with chordwise loadings resulting from angle of attack, parabolic-arc camber, and flap deflection are now available. These solutions were studied with the purpose of determining methods of extrapolating the results in such a way that they could be used to determine lifting-surface-theory values of the aspect-ratio corrections to the lift and hinge-moment parameters, for both angle-of-attack and flap-deflection-type loading, that could be used to predict the characteristics of horizontal tail surfaces from section data with sufficient accuracy for engineering purposes. Such a method was devised for horizontal tail surfaces with full-span elevators. In spite of the fact that the theory involved is rather complex, the method is simple to apply and may be applied without any knowledge of lifting-surface theory. A comparison of experimental finite-span and section values with the estimated values of the lift and hinge-moment parameters for three horizontal tail surfaces was made to provide an experimental verification of the suggested method.

  11. TECHNOLOGY VERIFICATION OF COMMERCIALLY AVAILABLE METHODS FOR DECONTAMINATION OF INDOOR SURFACES CONTAMINATED WITH BIOLOGICAL OR CHEMICAL AGENTS

    EPA Science Inventory

    To support the Nation's Homeland Security Program, this U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) project is conducted to verify the performance of commercially available products, methods, and equipment for decontamination of hard and...

  12. Verification Survey of Rooms 113, 114, and 208 of the Inhalation Toxicology Laboratory, Lovelace Respiratory Research Institute, Albuquerque, NM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.J. Vitkus

    2008-06-25

    The objectives of the verification survey were to confirm that accessible surfaces of the three laboratories meet the DOE’s established criteria for residual contamination. Drain pipes and ductwork were not included within the survey scope.

  13. Letter Report - Verification Survey of Final Grids at the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2009-02-17

    The objective was to conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.

  14. Second order upwind Lagrangian particle method for Euler equations

    DOE PAGES

    Samulyak, Roman; Chen, Hsin -Chiang; Yu, Kwangmin

    2016-06-01

    A new second order upwind Lagrangian particle method for solving Euler equations for compressible inviscid fluid or gas flows is proposed. Similar to smoothed particle hydrodynamics (SPH), the method represents fluid cells with Lagrangian particles and is suitable for the simulation of complex free surface / multiphase flows. The main contributions of our method, which is different from SPH in all other aspects, are (a) significant improvement of approximation of differential operators based on a polynomial fit via weighted least squares approximation and the convergence of prescribed order, (b) an upwind second-order particle-based algorithm with limiter, providing accuracy and long-term stability, and (c) accurate resolution of states at free interfaces. In conclusion, numerical verification tests demonstrating the convergence order for fixed domain and free surface problems are presented.
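
    Contribution (a) can be illustrated in one dimension: approximate a derivative at a point by fitting a low-order polynomial to nearby scattered particles with distance-based weights, then read the derivative off the fitted coefficients. A generic sketch under these assumptions, not the authors' implementation:

        import numpy as np

        def wls_derivative(x, u, x0, h):
            """Estimate du/dx at x0 from scattered particle data by a weighted
            least-squares quadratic fit over neighbors within radius h."""
            mask = np.abs(x - x0) < h
            dx = x[mask] - x0
            w = (1.0 - np.abs(dx) / h) ** 2            # simple distance weighting
            A = np.column_stack([np.ones_like(dx), dx, 0.5 * dx ** 2])
            W = np.sqrt(w)
            coef, *_ = np.linalg.lstsq(A * W[:, None], u[mask] * W, rcond=None)
            return coef[1]                             # coefficient of the linear term

        rng = np.random.default_rng(1)
        x = np.sort(rng.uniform(0.0, 1.0, 200))        # disordered "particles"
        u = np.sin(2 * np.pi * x)
        est = wls_derivative(x, u, 0.5, h=0.08)
        print(f"estimated u'(0.5) = {est:.3f}, exact = {2 * np.pi * np.cos(np.pi):.3f}")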

  16. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

    A traditional binding acceptance criterion for polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design maximum expected operational limit. In this extremely strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. The test may then erroneously accept a submarginal design or reject a reliable one. A technique was developed to identify, monitor, and assess the load transmission error using strain data measured on two back-to-back surfaces. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.
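
    The back-to-back strain comparison rests on a simple decomposition: the average of the two surface strains is the membrane component and half their difference is the bending component, so an unexpectedly large bending fraction signals a distorted load path. A minimal sketch with invented readings and an illustrative flag threshold:

        def membrane_bending(eps_front, eps_back):
            """Decompose back-to-back surface strains into membrane and bending."""
            membrane = 0.5 * (eps_front + eps_back)
            bending = 0.5 * (eps_front - eps_back)
            return membrane, bending

        # Hypothetical microstrain readings at two test load levels.
        for eps_f, eps_b in [(1010.0, 990.0), (2150.0, 1650.0)]:
            m, b = membrane_bending(eps_f, eps_b)
            ratio = abs(b) / abs(m)
            flag = "load path OK" if ratio < 0.05 else "check load transfer"
            print(f"membrane={m:.0f} microstrain, bending={b:.0f} microstrain, "
                  f"ratio={ratio:.2f}  {flag}")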

  17. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution, non-destructive, contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces and is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. The sensing technique used here does not require a physical or chemical visibility enhancement of the fingerprint residue, so the original trace remains unaltered for further investigations. No dedicated feature extraction and verification techniques have yet been applied to such data. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.

  18. Verification of micro-scale photogrammetry for smooth three-dimensional object measurement

    NASA Astrophysics Data System (ADS)

    Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard

    2017-05-01

    By using sub-millimetre laser speckle pattern projection we show that photogrammetry systems are able to measure smooth three-dimensional objects with surface height deviations less than 1 μm. The projection of laser speckle patterns allows correspondences on the surface of smooth spheres to be found, and as a result, verification artefacts with low surface height deviations were measured. A combination of VDI/VDE and ISO standards was also utilised to provide a complete verification method and determine the quality parameters for the system under test. Using the proposed method applied to a photogrammetry system, a 5 mm radius sphere was measured with an expanded uncertainty of 8.5 μm for sizing errors, and 16.6 μm for form errors with a 95 % confidence interval. Sphere spacing lengths between 6 mm and 10 mm were also measured by the photogrammetry system, and were found to have expanded uncertainties of around 20 μm with a 95 % confidence interval.

  19. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for analyzing industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in specifically skilled human resources with sufficient theoretical knowledge in those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that can be encountered and a possible way to handle them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which has nowadays become a common delivery model for many applications because SaaS is typically accessed by users via internet access.

  20. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely used for the development of aerospace systems yet. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus; in particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  1. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  2. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA system test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate. IMA system verification therefore faces a critical problem: how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a complex, highly integrated avionics system, complete testing is hard to achieve. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the test effort, improving efficiency, and consequently lowering the cost of IMA system integration.

  3. Simple method to verify OPC data based on exposure condition

    NASA Astrophysics Data System (ADS)

    Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu

    2006-03-01

    In a world where sub-100 nm lithography tools are an everyday household item for device makers, devices are shrinking at a rate no one ever imagined. With the device shrinking at such a high rate, the demand placed on optical proximity correction (OPC) is like never before. To meet this demand with respect to the shrinkage rate of the device, more aggressive OPC tactics are employed. Aggressive OPC tactics are a must for sub-100 nm lithography, but they ultimately leave greater room for OPC error and increase the complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC data, but each of these methods has its pros and cons. ORC verification of OPC data is accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but its accuracy is a total downfall process-wise. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: a method that inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new verification method was applied to the 80 nm tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.

  4. Verification of the CFD simulation system SAUNA for complex aircraft configurations

    NASA Astrophysics Data System (ADS)

    Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.

    1994-04-01

    This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.

  5. Marbles for the Imagination

    NASA Technical Reports Server (NTRS)

    Shue, Jack

    2004-01-01

    The end-to-end test would verify the complex sequence of events from lander separation to landing. Due to the large distances involved and the significant delay time in sending a command and receiving verification, the lander needed to operate autonomously after it separated from the orbiter. It had to sense conditions, make decisions, and act accordingly. We were flying into a relatively unknown set of conditions: a Martian atmosphere of unknown pressure, density, and consistency, to land on a surface of unknown altitude and unknown bearing strength. In order to touch down safely on Mars, the lander had to orient itself for descent and entry, modulate itself to maintain proper lift, pop a parachute, jettison its aeroshell, deploy landing legs and radar, ignite a terminal descent engine, and fly a given trajectory to the surface. Once on the surface, it would determine its orientation, raise the high-gain antenna, perform a sweep to locate Earth, and begin transmitting information. It was this complicated, autonomous sequence that the end-to-end test was to simulate.

  6. ETV, LT2 and You: How the Environmental Technology Verification Program Can Assist with the Long Term 2 Enhanced Surface Water Treatment Rule

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Drinking Water Systems (DWS) Center has verified the performance of treatment technologies that may be used by communities in meeting the newly promulgated (2006) U.S. Environmental Protection Agency (USEPA) Long Term 2 Enhanced Sur...

  7. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  8. Comparative study of the swabbing properties of seven commercially available swab materials for cleaning verification.

    PubMed

    Corrigan, Damion K; Piletsky, Sergey; McCrossen, Sean

    2009-01-01

    This article compares the technical performance of several commercially available swabbing materials for the purpose of cleaning verification. A steel surface was soiled with solutions of acetaminophen, nicotinic acid, diclofenac, and benzamidine and wiped with each swabbing material. The compounds were extracted with water or ethanol (depending on the polarity of the analyte) and their concentrations in the extracts were quantified spectrophotometrically. The study also investigated swab debris left on the wiped surface. The swab performances were compared and the best swab material was identified.

  9. Verification of Geosat sea surface topography in the Gulf Stream extension with surface drifting buoys and hydrographic measurements

    NASA Astrophysics Data System (ADS)

    Willebrand, J.; Käse, R. H.; Stammer, D.; Hinrichsen, H.-H.; Krauss, W.

    1990-03-01

    Altimeter data from Geosat have been analyzed in the Gulf Stream extension area. Horizontal maps of the sea surface height anomaly relative to an annual mean for various 17-day intervals were constructed using an objective mapping procedure. The mean sea level was approximated by the dynamic topography from climatological hydrographic data. Geostrophic surface velocities derived from the composite maps (mean plus anomaly) are significantly correlated with surface drifter velocities observed during an oceanographic experiment in the spring of 1987. The drifter velocities contain much energy on scales less than 100 km which are not resolved in the altimetric maps. It is shown that the composite sea surface height also agrees well with ground verification from hydrographic data along sections in a triangle between the Azores, Newfoundland, and Bermuda, except in regions of high mean gradients.
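
    The geostrophic calculation underlying those velocity estimates follows from the surface momentum balance, u = -(g/f) dη/dy and v = (g/f) dη/dx. A minimal sketch for one horizontal dimension, with an invented sea-surface-height profile:

        import numpy as np

        G = 9.81                                  # gravity, m/s^2
        OMEGA = 7.2921e-5                         # Earth rotation rate, rad/s
        lat = 38.0                                # Gulf Stream extension latitude, deg
        f = 2 * OMEGA * np.sin(np.radians(lat))   # Coriolis parameter, 1/s

        # Invented sea-surface-height anomaly profile (m) on a 25 km mesh.
        dy = 25e3
        eta = 0.4 * np.exp(-((np.arange(40) - 20.0) ** 2) / 50.0)
        u = -(G / f) * np.gradient(eta, dy)       # zonal geostrophic velocity, m/s
        print(f"peak |u| = {np.max(np.abs(u)):.2f} m/s")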

  10. A Rapid Method to Achieve Aero-Engine Blade Form Detection

    PubMed Central

    Sun, Bin; Li, Bing

    2015-01-01

    This paper proposes a rapid method for detecting aero-engine blade form, based on the characteristics of an aero-engine blade surface. The method first derives an inclination error model for free-form surface measurement based on the non-contact laser triangulation principle. A four-coordinate measuring system was then independently developed, a special fixture was designed according to the blade shape features, and a fast measurement path for the blade features was planned. Finally, by using the inclination error model to correct the acquired data, the measurement error caused by surface tilt is compensated. As a result, the measurement error of the laser displacement sensor was less than 10 μm. Experimental verification showed that the method exploits the speed, high precision, and wide measuring range of optical non-contact measurement. Using a standard gauge block as the measurement reference, the coordinate-system conversion is simple and practical. The method improves both the measurement accuracy of the blade surface and the measurement efficiency, and so increases the value of measurements of complex surfaces. PMID:26039420
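
    The paper's inclination error model is specific to its triangulation geometry; as a generic illustration of the idea, a surface tilted by an angle theta relative to the sensor axis biases the triangulated height, and a first-order correction can be applied once the local tilt is estimated from neighboring samples. A hypothetical sketch (the cosine correction is an assumption standing in for the paper's derived model):

      import numpy as np

      def local_tilt(z, dx):
          """Estimate the local surface slope angle from a profile z sampled at pitch dx."""
          return np.arctan(np.gradient(z, dx))

      def tilt_corrected_height(z_measured, theta_rad):
          """First-order correction of a triangulated height for surface tilt.
          Assumes the bias scales with cos(theta); the actual model in the paper
          follows from the laser-triangulation geometry and may differ."""
          return z_measured * np.cos(theta_rad)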

  11. A rapid method to achieve aero-engine blade form detection.

    PubMed

    Sun, Bin; Li, Bing

    2015-06-01

    This paper proposes a rapid method for detecting aero-engine blade form, based on the characteristics of an aero-engine blade surface. The method first derives an inclination error model for free-form surface measurement based on the non-contact laser triangulation principle. A four-coordinate measuring system was then independently developed, a special fixture was designed according to the blade shape features, and a fast measurement path for the blade features was planned. Finally, by using the inclination error model to correct the acquired data, the measurement error caused by surface tilt is compensated. As a result, the measurement error of the laser displacement sensor was less than 10 μm. Experimental verification showed that the method exploits the speed, high precision, and wide measuring range of optical non-contact measurement. Using a standard gauge block as the measurement reference, the coordinate-system conversion is simple and practical. The method improves both the measurement accuracy of the blade surface and the measurement efficiency, and so increases the value of measurements of complex surfaces.

  12. Hydrologic conditions in urban Miami-Dade County, Florida, and the effect of groundwater pumpage and increased sea level on canal leakage and regional groundwater flow

    USGS Publications Warehouse

    Hughes, Joseph D.; White, Jeremy T.

    2014-01-01

    The model was designed specifically to evaluate the effect of groundwater pumpage on canal leakage at the surface-water-basin scale and thus may not be appropriate for (1) predictions that are dependent on data not included in the calibration process (for example, subdaily simulation of high-intensity events and travel times) and (or) (2) hydrologic conditions that are substantially different from those during the calibration and verification periods. The reliability of the model is limited by the conceptual model of the surface-water and groundwater system, the spatial distribution of physical properties, the scale and discretization of the system, and specified boundary conditions. Some of the model limitations are manifested in model errors. Despite these limitations, however, the model represents the complexities of the interconnected surface-water and groundwater systems that affect how the systems respond to groundwater pumpage, sea-level rise, and other hydrologic stresses. The model also quantifies the relative effects of groundwater pumpage and sea-level rise on the surface-water and groundwater systems.

  13. Formal verification of a set of memory management units

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.
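
    Of the units listed, the base-bounds MMU has the simplest behavior and makes the verification obligation concrete: every successfully translated address must fall within [base, base + bound). A minimal sketch of that specification (names and types are illustrative, not the verified hardware description):

      class Fault(Exception):
          """Raised when a virtual address falls outside the segment bounds."""

      def base_bounds_translate(vaddr: int, base: int, bound: int) -> int:
          """Translate a virtual address under a base-bounds scheme.
          Property to verify: any returned address lies in [base, base + bound)."""
          if not (0 <= vaddr < bound):
              raise Fault(f"address {vaddr:#x} out of bounds {bound:#x}")
          return base + vaddr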

  14. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  15. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  16. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
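
    Of the four methodologies, Guyan reduction is the most compact to state: the stiffness matrix is partitioned into retained (master) and omitted (slave) degrees of freedom, and the slave set is condensed out statically. A minimal numpy sketch of classical Guyan reduction (the modified variant in the paper adds refinements not shown here):

      import numpy as np

      def guyan_reduce(K, M, master):
          """Classical Guyan (static) condensation of stiffness K and mass M.
          master: indices of retained DOFs; all others are condensed out."""
          master = np.asarray(master)
          n = K.shape[0]
          slave = np.setdiff1d(np.arange(n), master)
          Kss = K[np.ix_(slave, slave)]
          Ksm = K[np.ix_(slave, master)]
          # Transformation T maps master DOFs to the full set: x = T @ x_m
          T = np.zeros((n, len(master)))
          T[master, np.arange(len(master))] = 1.0
          T[np.ix_(slave, np.arange(len(master)))] = -np.linalg.solve(Kss, Ksm)
          return T.T @ K @ T, T.T @ M @ T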

  17. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation become nonlinear, so numerical tools are the only practical choice for solving these PDEs. Every numerical solver developed for the transport equation must undergo a code verification procedure before it is put into practice. Code verification is a mathematical activity that uncovers failures and checks for rigorous discretization of the PDEs and correct implementation of initial and boundary conditions. In the computational PDE context, verification is not a well-defined procedure with a clear path: verification tests must be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution, and even the test results must be analyzed mathematically to distinguish an inherent limitation of an algorithm from a coding error. Code verification therefore remains something of an art, in which innovative methods and case-based tricks are common. This study presents the full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections, and we convey our experiences in finding several errors that were not detectable with routine verification techniques. The suite comprises hundreds of unit and system tests, increasing gradually in complexity from simple cases to the most sophisticated. Verification metrics are defined for the required capabilities of the solver: mass conservation, convergence order, the handling of stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. We thereby provide objective, quantitative values in place of subjective, qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. Testing starts from a simple case of unidirectional advection, proceeds through bidirectional advection and tidal flow, and builds up to nonlinear cases, with tests designed to check nonlinearity in velocity, dispersivity, and reactions. For all of these cases we conduct mesh convergence tests, comparing the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we use symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs, and we discuss the capabilities of these code verification techniques in detail. Auxiliary subroutines for automation of the test suite and report generation were also designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface and subsurface pollution transport.
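
    The mesh-convergence metric at the heart of such a suite is the observed order of accuracy: with a known benchmark solution, the discretization error e(h) is computed on successively refined grids and p = log(e(2h)/e(h)) / log(2) is compared against the formal order. A minimal sketch, assuming a uniform refinement ratio (the error values in the example are illustrative):

      import numpy as np

      def observed_order(err_coarse, err_fine, ratio=2.0):
          """Observed order of accuracy from errors on two grids refined by `ratio`."""
          return np.log(err_coarse / err_fine) / np.log(ratio)

      # Example: an L2 error dropping from 4.1e-3 to 1.0e-3 under 2x refinement
      # gives p ~= 2.04, consistent with a formally second-order scheme.
      print(observed_order(4.1e-3, 1.0e-3))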

  18. Model for growth of fractal solid state surface and possibility of its verification by means of atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Kulikov, D. A.; Potapov, A. A.; Rassadin, A. E.; Stepanov, A. V.

    2017-10-01

    In the paper, methods for verifying models of solid-state surface growth by means of atomic force microscopy are suggested. Simulation of the growth of fractals with a cylindrical generatrix on the solid-state surface is presented. Our mathematical model of this process is based on a generalization of the Kardar-Parisi-Zhang equation, the cornerstones of which are a conjecture of anisotropic surface growth and a small-angle approximation. The method of characteristics has been applied to solve the Kardar-Parisi-Zhang equation; its solution should be considered up to the gradient catastrophe. The difficulty posed by the nondifferentiability of the fractal initial generatrix has been overcome by passing from a mathematical fractal to a physical one.
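
    For reference, the standard (isotropic) Kardar-Parisi-Zhang equation for a surface height h(x, t) is shown below in LaTeX form; the paper's anisotropic, small-angle generalization modifies these terms and is not reproduced here:

      \frac{\partial h}{\partial t} = \nu \nabla^{2} h
        + \frac{\lambda}{2} \left( \nabla h \right)^{2} + \eta(\mathbf{x}, t)

    Here \nu is the smoothing (surface-tension-like) coefficient, \lambda the strength of the nonlinearity, and \eta a noise term.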

  19. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited to surface wave heights that are significant compared to the mean water depth, e.g., in estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited to surface wave heights that are small compared to the depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  20. Cleanup Verification Package for the 600-47 Waste Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Cutlip

    This cleanup verification package documents completion of interim remedial action for the 600-47 waste site. This site consisted of several areas of surface debris and contamination near the banks of the Columbia River across from Johnson Island. Contaminated material identified in field surveys included four areas of soil, wood, nuts, bolts, and other metal debris.

  1. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like checkers face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false-positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software-developer expertise in tool application and results filtering, and improving integration with other development tools.

  2. Formal Verification of the AAMP-FV Microcode

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam

    1999-01-01

    This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.

  3. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
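
    The bit-for-bit evaluation mentioned above is the strictest check in such a toolkit: the output fields of a test run must match the benchmark data exactly, and any difference triggers further diagnostics. A minimal sketch of the core comparison (array loading is elided and the function names are illustrative, not LIVV's API):

      import numpy as np

      def bit_for_bit(test: np.ndarray, benchmark: np.ndarray) -> bool:
          """True only if the two fields are identical, element for element.
          (NaNs compare unequal here; a strict byte-level check would compare raw bytes.)"""
          return test.shape == benchmark.shape and np.array_equal(test, benchmark)

      def max_abs_diff(test, benchmark):
          """When bit-for-bit fails, report the magnitude of the difference."""
          return float(np.max(np.abs(test - benchmark)))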

  4. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  5. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. Many of the remaining problems were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance in MCNP verification methods.

  6. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    PubMed

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified against a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV because the complexities of the guiding-center orbits of particles and their collisions cannot be fully captured by analytic theories alone. The results yield the details of the complex dependency of NTV on particle precessions and collisions, which the combined analytic theory predicted only roughly. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  7. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix D: Ionospheric measurements for IVEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.

    As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.

  8. Granular Material Flows with Interstitial Fluid Effects

    NASA Technical Reports Server (NTRS)

    Hunt, Melany L.; Brennen, Christopher E.

    2004-01-01

    The research focused on experimental measurements of the rheological properties of liquid-solid and granular flows. In these flows, the viscous effects of the interstitial fluid, the inertia of the fluid and particles, and the collisional interactions of the particles may all contribute to the flow mechanics. These multiphase flows arise in industrial problems such as coal slurry pipelines, hydraulic fracturing processes, fluidized beds, mining and milling operations, abrasive water jet machining, and polishing and surface erosion technologies. In addition, there is a wide range of geophysical flows such as debris flows, landslides, and sediment transport. In extraterrestrial applications, the study of particulate transport is fundamental to the mining and processing of lunar and Martian soils and the transport of atmospheric dust (National Research Council 2000). Recent images from the Mars Global Surveyor spacecraft dramatically depict the complex sand and dust flows on Mars, including dune formation and dust avalanches on the slip faces of dunes. These aeolian features involve a complex interaction of the prevailing winds with deposition or erosion of the sediment layer, and they make a good test bed for the verification of global circulation models of the Martian atmosphere.

  9. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix B: Surface ground motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, T.A.; Baker, D.F.; Edwards, C.L.

    1993-10-01

    Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25-, and 100-g accelerometers, force-balanced accelerometers and, for some events, golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset-corrected, and deglitched, but are otherwise unfiltered and unprocessed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.
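
    The processing steps named above (calibration, deramping, offset correction) precede the standard step of integrating acceleration once for velocity and twice for displacement. A minimal sketch of that integration with a simple mean-offset correction (the sampling interval and variable names are illustrative, not the report's processing chain):

      import numpy as np

      def integrate_record(accel, dt):
          """Integrate an offset-corrected acceleration record to velocity
          and displacement using cumulative trapezoidal integration."""
          accel = accel - np.mean(accel)   # simple offset correction
          vel = np.concatenate(([0.0], np.cumsum(0.5 * (accel[1:] + accel[:-1]) * dt)))
          disp = np.concatenate(([0.0], np.cumsum(0.5 * (vel[1:] + vel[:-1]) * dt)))
          return vel, disp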

  10. New Physical Optics Method for Curvilinear Refractive Surfaces and its Verification in the Design and Testing of W-band Dual-Aspheric Lenses

    DTIC Science & Technology

    2013-10-01

    Altintas, A. and Yurchenko, V., EEE Department, Bilkent University, Ankara. (Only reference fragments of this record survive, citing: '...Theory and Techn., Vol. 55, 239, 2007'; ZEMAX Development Corporation, Zemax-EE, http://www.zemax.com/; and Pasqualini, D. and Maci, S., 'High-Frequency...'.)

  11. The Verification of a Method for Detecting and Quantifying Diethylene Glycol, Triethylene Glycol, Tetraethylene Glycol, 2-Butoxyethanol and 2-Methoxyethanol in Ground and Surface Waters

    EPA Science Inventory

    This verification study was a special project designed to determine the efficacy of a draft standard operating procedure (SOP) developed by US EPA Region 3 for the determination of selected glycols in drinking waters that may have been impacted by active unconventional oil and ga...

  12. Failure of Cleaning Verification in Pharmaceutical Industry Due to Uncleanliness of Stainless Steel Surface.

    PubMed

    Haidar Ahmad, Imad A; Blasko, Andrei

    2017-08-11

    The aim of this work is to identify the parameters that affect the recovery of pharmaceutical residues from the surface of stainless steel coupons. A series of factors were assessed, including drug product spike levels, spiking procedure, drug-excipient ratios, analyst-to-analyst variability, intraday variability, and the cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Cleaning the surface of the coupons with clean-in-place (CIP) solutions gave high recovery (>90%) and reproducible results (Srel ≤ 4%) regardless of the conditions assessed previously. The approach was successfully applied to cleaning verification of small molecules (MW < 1,000 Da) as well as large biomolecules (MW up to 50,000 Da).

  13. Numerical Simulations For the F-16XL Aircraft Configuration

    NASA Technical Reports Server (NTRS)

    Elmiligui, Alaa A.; Abdol-Hamid, Khaled; Cavallo, Peter A.; Parlette, Edward B.

    2014-01-01

    Numerical simulations of flow around the F-16XL are presented as a contribution to the Cranked Arrow Wing Aerodynamic Project International II (CAWAPI-II). The NASA Tetrahedral Unstructured Software System (TetrUSS) is used to perform the numerical simulations. This CFD suite, developed and maintained by NASA Langley Research Center, includes an unstructured grid generation program called VGRID, a postprocessor named POSTGRID, and the flow solver USM3D. The CRISP CFD package is utilized to provide error estimates and grid adaptation for verification of the USM3D results. A subsonic, high angle-of-attack case, flight condition (FC) 25, is computed and analyzed. Three turbulence models are used in the calculations: the one-equation Spalart-Allmaras (SA), the two-equation shear stress transport (SST), and the k-epsilon turbulence models. Computational results and surface static pressure profiles are presented and compared with flight data. Solution verification is performed using formal grid refinement studies, the solution of error transport equations, and adaptive mesh refinement. The current study shows that the USM3D solver coupled with CRISP CFD can be used in an engineering environment to predict vortex-flow physics on a complex configuration at flight Reynolds numbers.

  14. INTERIM REPORT--INDEPENDENT VERIFICATION SURVEY OF SECTION 3, SURVEY UNITS 1, 4 AND 5 EXCAVATED SURFACES, WHITTAKER CORPORATION, REYNOLDS INDUSTRIAL PARK, TRANSFER, PENNSYLVANIA DCN: 5002-SR-04-0"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ADAMS, WADE C

    At the Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background: one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-size piece of slag while ORAU staff was onsite. With the exception of the golf-ball-size piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.

  15. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of (1) close-tolerance mechanical alignment between two components, and (2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified point of engagement at the interface between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  16. Development of Onboard Computer Complex for Russian Segment of ISS

    NASA Technical Reports Server (NTRS)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    This report presents a description of the Onboard Computer Complex (CC) that was developed during 1994-1998 for the Russian Segment of ISS. The system was developed in cooperation with NASA and ESA. ESA developed a new computation system, called DMS-R, under the RSC Energia Technical Assignment. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on the software simulators and the verification and debugging facilities that were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on ISS.

  17. High-speed autoverifying technology for printed wiring boards

    NASA Astrophysics Data System (ADS)

    Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi

    1996-10-01

    We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, and verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface-mount-technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple directions of illumination. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect-definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory; it verified 1,500 defective samples and detected all significant defects with only a 0.1 percent error rate (false alarms).

  18. Data requirements for verification of ram glow chemistry

    NASA Technical Reports Server (NTRS)

    Swenson, G. R.; Mende, S. B.

    1985-01-01

    A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of the measurements required for most answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV through the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.

  19. Human vs autonomous control of planetary roving vehicles

    NASA Technical Reports Server (NTRS)

    Whitney, W. M.

    1974-01-01

    Supervisory or semiautonomous control has some compelling advantages over step-by-step human command and verification for the operation of roving vehicles on remote planetary surfaces. There are also disadvantages in relation to the complex system that must be mobilized and the chain of events that must be enacted to conduct a mission. Which of the two control methods is better on technical grounds may not be the deciding factor in its acceptance or rejection. Some of the issues that affect changes in spacecraft design and operation are summarized. To accelerate the movement toward more autonomous machines, it will be necessary to understand and to address the problems that such autonomy will create for other elements of the control system and for the control process.

  20. Verification of ANSYS Fluent and OpenFOAM CFD platforms for prediction of impact flow

    NASA Astrophysics Data System (ADS)

    Tisovská, Petra; Peukert, Pavel; Kolář, Jan

    The main goal of the article is a verification of the heat transfer coefficient numerically predicted by two CFD platforms, ANSYS Fluent and OpenFOAM, on the problem of impinging flow issuing from a 2D nozzle. Various mesh parameters and solver settings were tested under several boundary conditions and compared to known experimental results. The solver settings best suited for further optimization of more complex geometries are identified.

  1. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  2. Engineering of the LISA Pathfinder mission—making the experiment a practical reality

    NASA Astrophysics Data System (ADS)

    Warren, Carl; Dunbar, Neil; Backler, Mike

    2009-05-01

    LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.

  3. Calibration and verification of models of organic carbon removal kinetics in Aerated Submerged Fixed-Bed Biofilm Reactors (ASFBBR): a case study of wastewater from an oil-refinery.

    PubMed

    Trojanowicz, Karol; Wójcik, Włodzimierz

    2011-01-01

    The article presents a case study on the calibration and verification of mathematical models of organic carbon removal kinetics in biofilm. The chosen Harremöes and Wanner & Reichert models were calibrated with a set of model parameters obtained both from dedicated studies conducted at pilot and lab scales under petrochemical wastewater conditions and from the literature. Next, the models were successfully verified in studies carried out with a pilot ASFBBR-type bioreactor installed in an oil-refinery wastewater treatment plant. During verification the pilot biofilm reactor worked under varying surface organic loading rates (SOL), dissolved oxygen concentrations, and temperatures. The verification proved that the models can be applied in practice to petrochemical wastewater treatment engineering, e.g., for biofilm bioreactor dimensioning.

  4. The mechanical analysis of thermo-magneto-electric laminated composites in nanoscale with the consideration of surface and flexoelectric effects

    NASA Astrophysics Data System (ADS)

    Shi, Shuanhu; Li, Peng; Jin, Feng

    2018-01-01

    A theoretical thermo-magneto-electric (TME) bilayer model is established based on the Hamilton principle, in which both the surface effect and flexoelectricity are taken into account. The governing equations are derived with the aid of the nonlinear constitutive relations of giant magnetostrictive materials. These equations are general and can be applied to analyze coupled extensional, shear, and bending deformations at both the macroscale and the nanoscale. As a specific example, the coupled extensional and bending motion of a slender beam subjected to an external magnetic field and thermal variation is investigated, in which the Miller-Shenoy coefficient, the magneto-electric (ME) effect, the strain gradient, and the displacement are discussed in detail. After the necessary verification, a critical thickness of the TME model is proposed, below which the surface effect exhibits a remarkable influence on the mechanical behavior and cannot be ignored. It is revealed that the surface effect, the flexoelectric effect, and the temperature increment are beneficial for enhancing the induced electric field. This study can provide a theoretical basis for the design of nanoscale laminates, especially for the performance evaluation of ME composites in complex environments.

  5. Satellite detection of oil on the marine surface

    NASA Technical Reports Server (NTRS)

    Wilson, M. J.; Oneill, P. E.; Estes, J. E.

    1981-01-01

    The ability of two widely dissimilar spaceborne imaging sensors to detect surface oil accumulations in the marine environment has been evaluated using broadly different techniques. Digital Landsat multispectral scanner (MSS) data consisting of two visible and two near infrared channels has been processed to enhance contrast between areas of known oil coverage and background clean surface water. These enhanced images have then been compared to surface verification data gathered by aerial reconnaissance during the October 15, 1975, Landsat overpass. A similar evaluation of oil slick imaging potential has been made for digitally enhanced Seasat-A synthetic aperture radar (SAR) data from July 18, 1979. Due to the premature failure of this satellite, however, no concurrent surface verification data were collected. As a substitute, oil slick configuration information has been generated for the comparison using meteorological and oceanographic data. The test site utilized in both studies was the extensive area of natural seepage located off Coal Oil Point, adjacent to the University of California, Santa Barbara.

  6. 40 CFR 98.448 - Geologic sequestration monitoring, reporting, and verification (MRV) plan.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... than 1 year. (2) Identification of potential surface leakage pathways for CO2 in the maximum monitoring area and the likelihood, magnitude, and timing, of surface leakage of CO2 through these pathways. (3) A...

  7. 40 CFR 98.448 - Geologic sequestration monitoring, reporting, and verification (MRV) plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... than 1 year. (2) Identification of potential surface leakage pathways for CO2 in the maximum monitoring area and the likelihood, magnitude, and timing, of surface leakage of CO2 through these pathways. (3) A...

  8. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for the accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1 g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.

  9. Lagrangian particle method for compressible fluid dynamics

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Wang, Xingyu; Chen, Hsin-Chiang

    2018-06-01

    A new Lagrangian particle method for solving Euler equations for compressible inviscid fluid or gas flows is proposed. Similar to smoothed particle hydrodynamics (SPH), the method represents fluid cells with Lagrangian particles and is suitable for the simulation of complex free surface/multiphase flows. The main contributions of our method, which is different from SPH in all other aspects, are (a) significant improvement of approximation of differential operators based on a polynomial fit via weighted least squares approximation and the convergence of prescribed order, (b) a second-order particle-based algorithm that reduces to the first-order upwind method at local extremal points, providing accuracy and long term stability, and (c) more accurate resolution of entropy discontinuities and states at free interfaces. While the method is consistent and convergent to a prescribed order, the conservation of momentum and energy is not exact and depends on the convergence order. The method is generalizable to coupled hyperbolic-elliptic systems. Numerical verification tests demonstrating the convergence order are presented as well as examples of complex multiphase flows.
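
    The first contribution listed, approximating differential operators by a weighted least-squares polynomial fit over neighboring particles, can be illustrated in one dimension: fit a quadratic to nearby particle values with distance-based weights and read the derivatives off the fitted coefficients. A minimal sketch (the Gaussian kernel and all names are illustrative assumptions, not the authors' implementation):

      import numpy as np

      def wls_derivatives(x_nbr, f_nbr, x0, h):
          """First and second derivatives of f at x0 from scattered neighbors, via a
          weighted least-squares fit f ~ c0 + c1*(x - x0) + c2*(x - x0)**2.
          Requires at least three neighbors."""
          dx = x_nbr - x0
          w = np.exp(-(dx / h) ** 2)              # Gaussian weights (illustrative)
          sw = np.sqrt(w)                          # scale rows so lstsq solves the WLS problem
          A = np.vander(dx, 3, increasing=True)    # columns: 1, dx, dx**2
          c, *_ = np.linalg.lstsq(A * sw[:, None], f_nbr * sw, rcond=None)
          return c[1], 2.0 * c[2]                  # f'(x0), f''(x0)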

  10. CFD: A Castle in the Sand?

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Wood, William A.

    2004-01-01

    The computational simulation community is not routinely publishing independently verifiable tests to accompany new models or algorithms. A survey reveals that only 22% of new models published are accompanied by tests suitable for independently verifying the new model. As the community develops larger codes with increased functionality, and hence increased complexity in terms of the number of building block components and their interactions, it becomes prohibitively expensive for each development group to derive the appropriate tests for each component. Therefore, the computational simulation community is building its collective castle on a very shaky foundation of components with unpublished and unrepeatable verification tests. The computational simulation community needs to begin publishing component level verification tests before the tide of complexity undermines its foundation.

  11. Combining Space Geodesy, Seismology, and Geochemistry for Monitoring Verification and Accounting of CO 2 in Sequestration Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swart, Peter K.; Dixon, Tim

    2014-09-30

    A series of surface geophysical and geochemical techniques are tested in order to demonstrate and validate low-cost approaches for Monitoring, Verification and Accounting (MVA) of the integrity of deep reservoirs for CO2 storage. These techniques are: (i) surface deformation by GPS; (ii) surface deformation by InSAR; (iii) passive-source seismology via broadband seismometers; and (iv) soil gas monitoring with a cavity ring-down spectrometer for measurement of CO2 concentration and carbon isotope ratio. The techniques were tested at an active EOR (Enhanced Oil Recovery) site in Texas. Each approach has demonstrated utility. Assuming Carbon Capture, Utilization and Storage (CCUS) activities become operational in the future, these techniques can be used to augment more expensive down-hole techniques.

  12. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information that can be used to assess model performance. Automated information on code performance allows a systematic methodology for assessing the quality of model approximations. The software implements common and accepted code verification schemes: the Method of Manufactured Solutions (MMS), the Method of Exact Solutions (MES), cross-code verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of the numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed-order schemes. Four examples demonstrate the use of the software for code and solution verification, model validation, and uncertainty quantification: code verification of a mixed-order compact-difference heat transport solver; solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; model validation of a two-phase flow computation in a hydraulic jump against experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
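
    The Grid Convergence Index mentioned above converts a Richardson-extrapolation error estimate into a banded numerical-uncertainty estimate. A minimal sketch of the standard two-grid form, with the customary safety factor of 1.25 (names and example values are illustrative):

      def gci_fine(f_fine, f_coarse, r, p, Fs=1.25):
          """Grid Convergence Index on the fine grid: f_fine and f_coarse are the
          solutions on grids with refinement ratio r; p is the observed order."""
          e_rel = abs((f_coarse - f_fine) / f_fine)   # relative change between grids
          return Fs * e_rel / (r ** p - 1.0)

      # Example: a 2% change between grids with r = 2, p = 2
      # gives GCI ~= 0.83% of the fine-grid solution.
      print(gci_fine(1.00, 1.02, r=2.0, p=2.0))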

  13. Model Checking Satellite Operational Procedures

    NASA Astrophysics Data System (ADS)

    Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri

    2011-08-01

    We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a system as complex as a satellite is a hard task; we overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic, exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs that observe the simulator telemetry and send telecommands to the simulator. To assess the feasibility of our approach we present experimental results on a simple but meaningful scenario. Our results show that we can save up to 90% of verification time.

  14. Investigations on the magnetization behavior of magnetic composite particles

    NASA Astrophysics Data System (ADS)

    Eichholz, Christian; Knoll, Johannes; Lerche, Dietmar; Nirschl, Hermann

    2014-11-01

    In the life sciences, the application of surface-functionalized magnetic composite particles is becoming established in diagnostics and in downstream processing in modern biotechnology. These magnetic composite particles consist of a non-magnetic material, e.g. polystyrene, which serves as a matrix for the second, magnetic component, usually colloidal magnetite. Because of the multitude of magnetic cores, these magnetic beads show a complex magnetization behavior which cannot be described with the available approaches for homogeneous magnetic material. Therefore, in this work a new model for the magnetization behavior of magnetic composite particles is developed. By introducing an effective magnetization and considering an overall demagnetization factor, the deviation from the demagnetization of homogeneously magnetized particles is taken into account. Calculated and experimental results show good agreement, which allows for verification of the adapted model of particle magnetization. In addition, a newly developed magnetic analyzing centrifuge is used for the characterization of magnetic composite particle systems. The experimental results, also used for the model verification, give information about both the magnetic properties and the interaction behavior of particle systems. By adding further components to the particle solution, such as salts or proteins, industrially relevant systems can be reconstructed. The analyzing tool can be used to adapt industrial processes without time-consuming preliminary tests with large samples in the process equipment.
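
    The model's central ingredient, an overall demagnetization factor N acting on an effective magnetization, can be stated compactly. For a homogeneously magnetized body with susceptibility \chi, the internal field is reduced by the demagnetizing field, giving the textbook relation below; the composite-particle model generalizes N to an overall factor for the multi-core bead (this is the standard homogeneous form, not the authors' full model):

      M = \chi H_{\mathrm{int}} = \chi \left( H - N M \right)
      \quad \Rightarrow \quad
      M = \frac{\chi H}{1 + N \chi}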

  15. Cleaning verification: Exploring the effect of the cleanliness of stainless steel surface on sample recovery.

    PubMed

    Haidar Ahmad, Imad A; Tam, James; Li, Xue; Duffield, William; Tarara, Thomas; Blasko, Andrei

    2017-02-05

    The parameters affecting the recovery of pharmaceutical residues from the surface of stainless steel coupons for quantitative cleaning verification method development have been studied, including active pharmaceutical ingredient (API) level, spiking procedure, API/excipient ratio, analyst-to-analyst variability, inter-day variability, and the cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of acid, base, and oxidant washes, as well as the order of treatment, showed that a base-water-acid-water-oxidizer-water wash procedure resulted in consistent, accurate spiked recovery (>90%) and reproducible results (Srel ≤ 4%). By applying this cleaning procedure to previously used coupons that had failed the cleaning acceptance criteria, multiple analysts were able to obtain consistent recoveries from day to day for different APIs and API/excipient ratios at various spike levels. We successfully applied our approach to cleaning verification of small molecules (MW < 1,000 Da) as well as large biomolecules (MW up to 50,000 Da). Method robustness was greatly influenced by the sample preparation procedure, especially for analyses using total organic carbon (TOC) determination.

  16. Hazardous Materials Verification and Limited Characterization Report on Sodium and Caustic Residuals in Materials and Fuel Complex Facilities MFC-799/799A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary Mecham

    2010-08-01

    This report is a companion to the Facilities Condition and Hazard Assessment for Materials and Fuel Complex Sodium Processing Facilities MFC-799/799A and Nuclear Calibration Laboratory MFC-770C (referred to as the Facilities Condition and Hazards Assessment). This report specifically responds to the requirement of Section 9.2, Item 6, of the Facilities Condition and Hazards Assessment to provide an updated assessment and verification of the residual hazardous materials remaining in the Sodium Processing Facilities processing system. The hazardous materials of concern are sodium and sodium hydroxide (caustic). The information supplied in this report supports the end-point objectives identified in the Transition Plan for Multiple Facilities at the Materials and Fuels Complex, Advanced Test Reactor, Central Facilities Area, and Power Burst Facility, as well as deactivation and decommissioning critical decision milestone 1, as specified in U.S. Department of Energy Guide 413.3-8, "Environmental Management Cleanup Projects." Using a tailored approach, and based on information obtained through a combination of process knowledge, emergency management hazard assessment documentation, and visual inspection, this report provides sufficient detail regarding the quantity of hazardous materials for the purposes of facility transfer; it also establishes that further characterization or verification of these materials is unnecessary.

  17. WE-DE-201-11: Sensitivity and Specificity of Verification Methods Based On Total Reference Air Kerma (TRAK) Or On User Provided Dose Points for Graphically Planned Skin HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damato, A; Devlin, P; Bhagwat, M

    Purpose: To investigate the sensitivity and specificity of a novel verification methodology for image-guided skin HDR brachytherapy plans using a TRAK-based reasonableness test, compared to a typical manual verification methodology. Methods: Two methodologies were used to flag treatment plans necessitating additional review due to a potential discrepancy of 3 mm between the planned dose and the clinical target in the skin. Manual verification calculated the discrepancy between the average dose to points positioned at the time of planning, representative of the prescribed depth, and the expected prescription dose. Automatic verification calculated the discrepancy between the TRAK of the clinical plan and its expected value, which was computed from standard plans of varying curvature, ranging from flat to cylindrically circumferential. A plan was flagged if a discrepancy >10% was observed. Sensitivity and specificity were calculated using as the criterion for a true positive that >10% of plan dwells had a distance to the prescription dose >1 mm different than the prescription depth (3 mm + size of applicator). All HDR image-based skin brachytherapy plans treated at our institution in 2013 were analyzed. Results: 108 surface applicator plans to treat skin of the face, scalp, limbs, feet, hands, or abdomen were analyzed. The median number of catheters was 19 (range, 4 to 71) and the median number of dwells was 257 (range, 20 to 1100). Sensitivity/specificity were 57%/78% for manual and 70%/89% for automatic verification. Conclusion: A check based on the expected TRAK value is feasible for irregularly shaped, image-guided skin HDR brachytherapy. This test yielded higher sensitivity and specificity than a test based on the identification of representative points, and can be implemented with a dedicated calculation code or with pre-calculated lookup tables of ideally shaped, uniform surface applicators.
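
    The automatic check and its scoring reduce to a few lines: flag a plan when its TRAK deviates by more than 10% from the value expected for an ideal applicator of matching curvature, then score the flags against the ground-truth criterion. A minimal sketch (function and variable names are illustrative, not the study's code):

      def trak_flag(trak_plan, trak_expected, tol=0.10):
          """Flag a plan whose TRAK deviates from the expected value by more than tol."""
          return abs(trak_plan - trak_expected) / trak_expected > tol

      def sensitivity_specificity(flags, truths):
          """Score flags against ground truth (both sequences of booleans)."""
          tp = sum(f and t for f, t in zip(flags, truths))
          tn = sum((not f) and (not t) for f, t in zip(flags, truths))
          fp = sum(f and (not t) for f, t in zip(flags, truths))
          fn = sum((not f) and t for f, t in zip(flags, truths))
          return tp / (tp + fn), tn / (tn + fp)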

  18. Validation (not just verification) of Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Duren, Riley M.

    2006-01-01

    Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often-used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept, and the rigor and scope applied to it vary widely between organizations and individuals. This is reflected in the findings in recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root or contributing factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling aren't given comparable attention. Another strong motivator is a realization that the rapid growth in complexity of deep-space missions (particularly Planetary Landers and Space Observatories, given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.

  19. Precision segmented reflector, figure verification sensor

    NASA Technical Reports Server (NTRS)

    Manhart, Paul K.; Macenka, Steve A.

    1989-01-01

    The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), which is designed to monitor the active control system of the segments, is described; a best-fit surface is defined; and the image and wavefront quality of the assembled array of reflecting panels is assessed.

  20. Human Factors Analysis and Layout Guideline Development for the Canadian Surface Combatant (CSC) Project

    DTIC Science & Technology

    2013-04-01

    project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on analysis, design, and verification processes for effective room...design, and verification processes that should be used in the development of effective room layouts for Royal Canadian Navy (RCN) ships. The primary...designed CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel. Results: The development of

  1. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  2. Hybrid Gamma Emission Tomography (HGET): FY16 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Erin A.; Smith, Leon E.; Wittman, Richard S.

    2017-02-01

    Current International Atomic Energy Agency (IAEA) methodologies for the verification of fresh low-enriched uranium (LEU) and mixed oxide (MOX) fuel assemblies are volume-averaging methods that lack sensitivity to individual pins. Further, as fresh fuel assemblies become more and more complex (e.g., heavy gadolinium loading, high degrees of axial and radial variation in fissile concentration), the accuracy of current IAEA instruments degrades and measurement time increases. Particularly in light of the fact that no special tooling is required to remove individual pins from modern fuel assemblies, the IAEA needs new capabilities for the verification of unirradiated (i.e., fresh LEU and MOX) assemblies to ensure that fissile material has not been diverted. Passive gamma emission tomography has demonstrated potential to provide pin-level verification of spent fuel, but gamma-ray emission rates from unirradiated fuel are significantly lower, precluding purely passive tomography methods. The work presented here introduces the concept of Hybrid Gamma Emission Tomography (HGET) for verification of unirradiated fuels, in which a neutron source is used to actively interrogate the fuel assembly and the resulting gamma-ray emissions are imaged using tomographic methods to provide pin-level verification of fissile material concentration.

  3. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads, before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels: 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  4. Bullying in School: Case Study of Prevention and Psycho-Pedagogical Correction

    ERIC Educational Resources Information Center

    Ribakova, Laysan A.; Valeeva, Roza A.; Merker, Natalia

    2016-01-01

    The purpose of the study was the theoretical justification and experimental verification of content, complex forms and methods to ensure effective prevention and psycho-pedagogical correction of bullying in school. 53 teenage students from Kazan took part in the experiment. A complex of diagnostic techniques for the detection of violence and…

  5. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance of several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for alternate project applications.

  6. Research on the injectors remanufacturing

    NASA Astrophysics Data System (ADS)

    Daraba, D.; Alexandrescu, I. M.; Daraba, C.

    2017-05-01

    During the remanufacturing process, the injector body, after the disassembly and cleaning process, should be subjected to strict control processes, both visually and with an electron microscope, to evidence any defects that may occur on the sealing surface of the injector body and the atomizer. In this paper we present the path followed by an injector body in the remanufacturing process, exemplifying the verification method for the roughness and hardness of the sealing surfaces, as well as the microscopic analysis of the sealing surface areas around the inlet. These checks indicate which path the injector body has to follow during remanufacturing. The control methodology for the injector body, established on the basis of this research, helps prevent defective injector bodies from entering the remanufacturing process, thus minimizing the number of remanufactured injectors declared non-conforming after the final verification process.

  7. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ (IMPROVER), consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  8. Verification of transport equations in a general purpose commercial CFD code.

    NASA Astrophysics Data System (ADS)

    Melot, Matthieu; Nennemann, Bernd; Deschênes, Claire

    2016-11-01

    In this paper, the Verification and Validation methodology is presented. This method aims to increase the reliability and the trust that can be placed in complex CFD simulations. The first step of this methodology, code verification, is presented in greater detail. The CFD transport equations in steady-state, transient, and Arbitrary Lagrangian-Eulerian (ALE, used for transient moving-mesh problems) formulations in Ansys CFX are verified. It is shown that the expected spatial and temporal orders of convergence are achieved for the steady-state and transient formulations. Unfortunately, this is not completely the case for the ALE formulation. As with many other commercial and in-house CFD codes, the temporal convergence of the velocity is limited to first order where second order would have been expected.
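
    The code-verification step rests on measuring an observed order of convergence and comparing it to the formal order of the scheme. A minimal sketch of that calculation (the error values are made up; the formula is the standard two-grid estimate):

        import math

        def observed_order(e_coarse, e_fine, refinement_ratio):
            """Observed order p from errors on two grids:
            e_coarse / e_fine = r**p  =>  p = log(e_coarse / e_fine) / log(r)."""
            return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

        # Hypothetical errors after halving the time step (r = 2): a first-order
        # scheme roughly halves the error; a second-order scheme quarters it.
        print(observed_order(4.0e-3, 1.0e-3, 2.0))  # ~2.0 => second-order behavior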

  9. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
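
    To make the idea of a runtime monitor concrete, here is a toy online monitor for the response property G(req -> F ack) ('every request is eventually acknowledged') in Python; it illustrates the general technique only, not the SystemC/PSL monitors generated in the paper:

        class ResponseMonitor:
            """Online monitor for G(req -> F ack): after any 'req', an 'ack' must
            eventually occur (a later 'ack' discharges all earlier requests)."""
            def __init__(self):
                self.pending = False
            def step(self, event):
                if event == "req":
                    self.pending = True
                elif event == "ack":
                    self.pending = False
            def verdict_at_end(self):
                return not self.pending  # True = no violation observed

        m = ResponseMonitor()
        for e in ["req", "ack", "req"]:
            m.step(e)
        print(m.verdict_at_end())  # False: the second request was never acknowledged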

  10. Dosimetry for audit and clinical trials: challenges and requirements

    NASA Astrophysics Data System (ADS)

    Kron, T.; Haworth, A.; Williams, I.

    2013-06-01

    Many important dosimetry audit networks for radiotherapy have their roots in clinical trial quality assurance (QA). In both scenarios it is essential to test two issues: does the treatment plan conform to the clinical requirements, and is the plan a reasonable representation of what is actually delivered to a patient throughout their course of treatment. Part of a sound quality program would be an external audit of these issues, with verification of the equivalence of plan and treatment typically referred to as a dosimetry audit. The increasing complexity of radiotherapy planning and delivery makes audits challenging. While verification of the absolute dose delivered at a reference point was the standard of external dosimetry audits two decades ago, this is often deemed inadequate for verification of treatment approaches such as Intensity Modulated Radiation Therapy (IMRT) and Volumetric Modulated Arc Therapy (VMAT). As such, most dosimetry audit networks have successfully introduced more complex tests of dose delivery using anthropomorphic phantoms that can be imaged, planned and treated as a patient would be. The new challenge is to adapt this approach to ever more diversified radiotherapy procedures, with image guided/adaptive radiotherapy, motion management and brachytherapy being the focus of current research.

  11. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
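
    The equal error rate quoted above is the operating point at which the false acceptance and false rejection rates coincide. A rough sketch of locating an EER from genuine and impostor match scores (synthetic scores; a higher-is-more-similar convention is assumed):

        import numpy as np

        def equal_error_rate(genuine, impostor):
            """Sweep a decision threshold and return the rate where FAR ~= FRR."""
            best_gap, eer = 1.0, None
            for t in np.sort(np.concatenate([genuine, impostor])):
                far = np.mean(impostor >= t)  # impostors wrongly accepted
                frr = np.mean(genuine < t)    # genuine users wrongly rejected
                if abs(far - frr) < best_gap:
                    best_gap, eer = abs(far - frr), (far + frr) / 2.0
            return eer

        rng = np.random.default_rng(0)
        print(equal_error_rate(rng.normal(0.7, 0.1, 1000),   # genuine scores
                               rng.normal(0.4, 0.1, 1000)))  # impostor scores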

  12. Simple Verification of the Parabolic Shape of a Rotating Liquid and a Boat on Its Surface

    ERIC Educational Resources Information Center

    Sabatka, Z.; Dvorak, L.

    2010-01-01

    This article describes a simple and inexpensive way to create and to verify the parabolic surface of a rotating liquid. The liquid is water. The second part of the article deals with the problem of a boat on the surface of a rotating liquid. (Contains 1 table, 10 figures and 5 footnotes.)
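
    For reference, the parabolic profile being verified follows from requiring the free surface to be an equipotential of gravity plus the centrifugal term in the rotating frame (a textbook result, stated here for context rather than taken from the article):

        % Free surface of a liquid in rigid-body rotation at angular velocity \omega:
        % g z - \tfrac{1}{2}\omega^2 r^2 = \text{const} on the surface, hence
        z(r) = z_0 + \frac{\omega^2 r^2}{2g}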

  13. Chemical studies of elements with Z ⩾ 104 in gas phase

    NASA Astrophysics Data System (ADS)

    Türler, Andreas; Eichler, Robert; Yakushev, Alexander

    2015-12-01

    Chemical investigations of superheavy elements in the gas-phase, i.e. elements with Z ≥ 104, allow assessing the influence of relativistic effects on their chemical properties. Furthermore, for some superheavy elements and their compounds quite unique gas-phase chemical properties were predicted. The experimental verification of these properties yields supporting evidence for a firm assignment of the atomic number. Prominent examples are the high volatility observed for HsO4 or the very weak interaction of Cn with gold surfaces. The unique properties of HsO4 were exploited to discover the doubly-magic even-even nucleus 270Hs and the new isotope 271Hs. The combination of kinematic pre-separation and gas-phase chemistry allowed gaining access to a new class of relatively fragile compounds, the carbonyl complexes of elements Sg through Mt. A not yet resolved issue concerns the interaction of Fl with gold surfaces. While competing experiments agree on the fact that Fl is a volatile element, there are discrepancies concerning its adsorption on gold surfaces with respect to its daughter Cn. The elucidation of these and other questions accounts for the fascination that gas-phase chemical investigations exert on current research at the extreme limits of chemistry today.

  14. Environmental and Sustainable Technology Evaluations (ESTE): Verification of Microbial Resistant Building Materials

    EPA Science Inventory

    This is an ESTE project summary brief. Many of the finished interior surfaces of homes and buildings are composed of materials that are prone to mold growth. These surfaces include gypsum board, wood flooring, insulation, and components of the heating and air conditioning system...

  15. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  16. Deformable structure registration of bladder through surface mapping.

    PubMed

    Xiong, Li; Viswanathan, Akila; Stewart, Alexandra J; Haker, Steven; Tempany, Clare M; Chin, Lee M; Cormack, Robert A

    2006-06-01

    Cumulative dose distributions in fractionated radiation therapy depict the dose to normal tissues and therefore may permit an estimation of the risk of normal tissue complications. However, calculation of these distributions is highly challenging because of interfractional changes in the geometry of patient anatomy. This work presents an algorithm for deformable structure registration of the bladder and the verification of the accuracy of the algorithm using phantom and patient data. In this algorithm, the registration process involves conformal mapping of genus-zero surfaces using finite element analysis, guided by three control landmarks. The registration produces a correspondence between fractions of the triangular meshes used to describe the bladder surface. For validation of the algorithm, two types of balloons were inflated gradually to three times their original size, and several computerized tomography (CT) scans were taken during the process. The registration algorithm yielded a local accuracy of 4 mm along the balloon surface. The algorithm was then applied to CT data of patients receiving fractionated high-dose-rate brachytherapy to the vaginal cuff, with the vaginal cylinder in situ. The patients' bladder filling status was intentionally different for each fraction. The three required control landmark points were identified for the bladder based on anatomy. Out of an Institutional Review Board (IRB) approved study of 20 patients, 3 had radiographically identifiable points near the bladder surface that were used for verification of the accuracy of the registration. The verification point as seen in each fraction was compared with its predicted location based on affine as well as deformable registration. Despite the variation in bladder shape and volume, the deformable registration was accurate to 5 mm, consistently outperforming the affine registration. We conclude that the structure registration algorithm presented works with reasonable accuracy and provides a means of calculating cumulative dose distributions.
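
    As context for the affine baseline that the deformable method is compared against, a toy sketch of fitting an affine transform to corresponding landmarks by least squares and predicting a held-out verification point (synthetic coordinates; not the authors' conformal-mapping code):

        import numpy as np

        def fit_affine_3d(src, dst):
            """Least-squares affine map with dst ~= src @ A.T + b,
            from (N,3) arrays of corresponding landmark coordinates."""
            X = np.hstack([src, np.ones((src.shape[0], 1))])  # homogeneous coords
            params, *_ = np.linalg.lstsq(X, dst, rcond=None)
            return params[:3].T, params[3]                    # A, b

        src = np.array([[0., 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
        dst = src * 1.2 + np.array([1.0, -2.0, 0.5])          # synthetic deformation
        A, b = fit_affine_3d(src, dst)
        print(np.array([5., 5, 5]) @ A.T + b)  # predicted location in the other fraction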

  17. Comments on the Synergism Between the Analytic Planetary Boundary-Layer Model and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Brown, R. A.

    2005-08-01

    This paper is adapted from a presentation at the session of the European Geophysical Society meeting in 2002 honouring Joost Businger. It documents the interaction of the non-linear planetary boundary-layer (PBL) model (UW-PBL) and satellite remote sensing of marine surface winds from verification and calibration studies for the sensor model function to the current state of verification of the model by satellite data. It is also a personal history where Joost Businger had seminal input to this research at several critical junctures. The first scatterometer in space was on SeaSat in 1978, while currently in orbit there are the QuikSCAT and ERS-2 scatterometers and the WindSat radiometer. The volume and detail of data from the scatterometers during the past decade are unprecedented, though the value of these data depends on a careful interpretation of the PBL dynamics. The model functions (algorithms) that relate surface wind to sensor signal have evolved from straight empirical correlation with simple surface-layer 10-m winds to satellite sensor model functions for surface pressure fields. A surface stress model function is also available. The validation data for the satellite model functions depended crucially on the PBL solution. The non-linear solution for the flow of fluid in the boundary layer of a rotating coordinate system was completed in 1969. The implications for traditional ways of measuring and modelling the PBL were huge and continue to this day. Unfortunately, this solution replaced an elegant one by Ekman with a stability/finite perturbation equilibrium solution. Consequently, there has been great reluctance to accept this solution. The verification of model predictions has been obtained from the satellite data.

  18. Model-Based Engineering for Supply Chain Risk Management

    DTIC Science & Technology

    2015-09-30

    ...Expanded use of commercial components has increased the complexity of system assurance verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture. Architecture Analysis...

  19. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.

  20. Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets]

    NASA Technical Reports Server (NTRS)

    Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.

    1978-01-01

    A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
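
    The core loop the abstract describes is an output-error parameter fit: adjust model parameters until the predicted response matches the measured response. A toy sketch with a one-degree-of-freedom oscillator standing in for the mixed system (synthetic data; the paper's method works on general first-order matrix equations):

        import numpy as np
        from scipy.optimize import least_squares

        def predicted_response(params, omega, m=1.0):
            """Frequency-response magnitude of a 1-DOF oscillator (stiffness k, damping c)."""
            k, c = params
            return 1.0 / np.sqrt((k - m * omega**2) ** 2 + (c * omega) ** 2)

        omega = np.linspace(0.1, 5.0, 200)
        measured = predicted_response([4.0, 0.3], omega)   # synthetic "test data"
        fit = least_squares(lambda p: predicted_response(p, omega) - measured,
                            x0=[3.0, 0.5])                 # initial model guess
        print(fit.x)  # ~[4.0, 0.3]: parameters adjusted to minimize response differences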

  1. Model Checking for Verification of Interactive Health IT Systems

    PubMed Central

    Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui

    2015-01-01

    Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We apply the method to a patient contact system, demonstrating that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166
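
    For readers unfamiliar with the underlying machinery, model checking exhaustively explores a system's state space for property violations. A toy explicit-state reachability check in Python (the workflow model is invented; real checkers use specialized modeling languages and far better algorithms):

        from collections import deque

        # Toy workflow state: (consent_recorded, message_sent). The safety property
        # is "no message is sent without recorded consent".
        def bad(state):
            consent, sent = state
            return sent and not consent

        def successors(state):
            consent, sent = state
            yield (True, sent)     # step: record consent
            yield (consent, True)  # step: send message (unguarded -- the defect)

        def violates_safety(initial):
            seen, queue = {initial}, deque([initial])
            while queue:
                s = queue.popleft()
                if bad(s):
                    return True
                for t in successors(s):
                    if t not in seen:
                        seen.add(t)
                        queue.append(t)
            return False

        print(violates_safety((False, False)))  # True: the checker finds the defect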

  2. Automation bias and verification complexity: a systematic review.

    PubMed

    Lyell, David; Coiera, Enrico

    2017-03-01

    While potentially reducing decision errors, decision support systems can introduce new types of errors. Automation bias (AB) happens when users become overreliant on decision support, which reduces vigilance in information seeking and processing. Most research originates from the human factors literature, where the prevailing view is that AB occurs only in multitasking environments. This review seeks to compare the human factors and health care literature, focusing on the apparent association of AB with multitasking and task complexity. Searches covered EMBASE, Medline, Compendex, Inspec, IEEE Xplore, Scopus, Web of Science, PsycINFO, and Business Source Premiere, from 1983 to 2015. Evaluation studies where task execution was assisted by automation and resulted in errors were included. Participants needed to be able to verify automation correctness and perform the task manually. Tasks were identified and grouped. Task and automation type and presence of multitasking were noted. Each task was rated for its verification complexity. Of 890 papers identified, 40 met the inclusion criteria; 6 were in health care. Contrary to the prevailing human factors view, AB was found in single tasks, typically involving diagnosis rather than monitoring, and with high verification complexity. The literature is fragmented, with large discrepancies in how AB is reported. Few studies reported the statistical significance of AB compared to a control condition. AB appears to be associated with the degree of cognitive load experienced in decision tasks, and appears not to be uniquely associated with multitasking. Strategies to minimize AB might focus on cognitive load reduction.

  3. Surface inspection: Research and development

    NASA Technical Reports Server (NTRS)

    Batchelder, J. S.

    1987-01-01

    Surface inspection techniques are used for process learning, quality verification, and postmortem analysis in manufacturing for a spectrum of disciplines. First, trends in surface analysis are summarized for integrated circuits, high density interconnection boards, and magnetic disks, emphasizing on-line applications as opposed to off-line or development techniques. Then, a closer look is taken at microcontamination detection from both a patterned defect and a particulate inspection point of view.

  4. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  5. Definition and verification of a complex aircraft for aerodynamic calculations

    NASA Technical Reports Server (NTRS)

    Edwards, T. A.

    1986-01-01

    Techniques are reviewed which are of value in CAD/CAM CFD studies of the geometries of new fighter aircraft. In order to refine the computations of the flows to take advantage of the computing power available from supercomputers, it is often necessary to interpolate the geometry of the mesh selected for the numerical analysis of the aircraft shape. Interpolating the geometry permits a higher level of detail in calculations of the flow past specific regions of a design. A microprocessor-based mathematics engine is described for fast image manipulation and rotation to verify that the interpolated geometry will correspond to the design geometry in order to ensure that the flow calculations will remain valid through the interpolation. Applications of the image manipulation system to verify geometrical representations with wire-frame and shaded-surface images are described.

  6. Environmental Verification Experiment for the Explorer Platform (EVEEP)

    NASA Technical Reports Server (NTRS)

    Norris, Bonnie; Lorentson, Chris

    1992-01-01

    Satellites and long-life spacecraft require effective contamination control measures to ensure data accuracy and maintain overall system performance margins. Satellite and spacecraft contamination can occur from either molecular or particulate matter. Some of the sources of the molecular species are as follows: mass loss from nonmetallic materials; venting of confined spacecraft or experiment volumes; exhaust effluents from attitude control systems; integration and test activities; and improper cleaning of surfaces. Some of the sources of particulates are as follows: leaks or purges which condense upon vacuum exposure; abrasion of movable surfaces; and micrometeoroid impacts. The Environmental Verification Experiment for the Explorer Platform (EVEEP) was designed to investigate the following aspects of spacecraft contamination control: materials selection; contamination modeling of existing designs; and thermal vacuum testing of a spacecraft with contamination monitors.

  7. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. Finally, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
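
    Operationally, sequential verification amounts to a tight regression comparison between consecutive code versions. A minimal sketch of such a harness (hypothetical CSV layout; production harnesses compare full plot and restart records):

        import numpy as np

        def sequential_verify(baseline_csv, candidate_csv, rel_tol=1e-12):
            """Compare a new version's results against the prior version's stored
            baseline; return the variables that drift beyond a tight tolerance."""
            base = np.genfromtxt(baseline_csv, delimiter=",", names=True)
            cand = np.genfromtxt(candidate_csv, delimiter=",", names=True)
            return [name for name in base.dtype.names
                    if not np.allclose(base[name], cand[name], rtol=rel_tol, atol=0.0)]

        # An empty list means no unintended changes were introduced, e.g.:
        # sequential_verify("v4.3_plot.csv", "v4.4_plot.csv")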

  8. Lagrangian particle method for compressible fluid dynamics

    DOE PAGES

    Samulyak, Roman; Wang, Xingyu; Chen, Hsin-Chiang

    2018-02-09

    A new Lagrangian particle method for solving Euler equations for compressible inviscid fluid or gas flows is proposed. Similar to smoothed particle hydrodynamics (SPH), the method represents fluid cells with Lagrangian particles and is suitable for the simulation of complex free surface / multi-phase flows. The main contributions of our method, which is different from SPH in all other aspects, are (a) significant improvement of the approximation of differential operators based on a polynomial fit via weighted least squares approximation and convergence of prescribed order, (b) a second-order particle-based algorithm that reduces to the first-order upwind method at local extremal points, providing accuracy and long-term stability, and (c) more accurate resolution of entropy discontinuities and states at free interfaces. While the method is consistent and convergent to a prescribed order, the conservation of momentum and energy is not exact and depends on the convergence order. The method is generalizable to coupled hyperbolic-elliptic systems. Numerical verification tests demonstrating the convergence order are presented, as well as examples of complex multiphase flows.
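
    A sketch of the weighted-least-squares idea behind contribution (a): estimate a derivative at a particle by fitting a local polynomial to scattered neighbor values (1D linear fit for illustration; the method uses higher orders and multi-dimensional stencils):

        import numpy as np

        def wls_derivative(x0, xs, fs, h):
            """Estimate f'(x0) from scattered neighbors by weighted least squares:
            fit f(x) ~ a0 + a1*(x - x0) with Gaussian-type weights of width h."""
            dx = xs - x0
            w = np.exp(-(dx / h) ** 2)                  # closer neighbors weigh more
            A = np.column_stack([np.ones_like(dx), dx])
            W = np.diag(w)
            a = np.linalg.solve(A.T @ W @ A, A.T @ W @ fs)
            return a[1]                                 # slope term approximates f'

        xs = np.array([-0.21, -0.09, 0.07, 0.18, 0.33])    # irregular particle positions
        print(wls_derivative(0.0, xs, np.sin(xs), h=0.2))  # ~cos(0) = 1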

  9. Verification of forecast ensembles in complex terrain including observation uncertainty

    NASA Astrophysics Data System (ADS)

    Dorninger, Manfred; Kloiber, Simon

    2017-04-01

    Traditionally, verification means comparing a forecast (ensemble) against the truth as represented by observations. The observation errors are quite often neglected, the argument being that they are small compared to the forecast error. In this study, part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project, it is shown that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles which are compared to the forecast ensemble. Throughout the study, results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as the forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated concerning their distribution. Several tests have been executed (Kolmogorov-Smirnov, Finkelstein-Schafer, chi-square, etc.), none of which identified an exact parametric distribution. So the main focus is on non-parametric statistics (e.g. kernel density estimation, boxplots, etc.) and also on the deviation between "forced" normally distributed data and the kernel density estimates. In a next step, the observational deviations due to the analysis ensembles are analysed. In a first approach, scores are calculated multiple times, with every single member of the analysis ensemble in turn regarded as the "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, the bootstrapping method is applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.
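
    A compact sketch of the 'every analysis member as truth' approach described above (synthetic numbers; any verification score could stand in for the RMSE used here):

        import numpy as np

        rng = np.random.default_rng(1)
        forecast = rng.normal(15.0, 1.0, size=50)    # forecast at 50 stations
        analysis = forecast + rng.normal(0.5, 0.8, size=(20, 50))  # 20 analysis members

        def rmse(f, o):
            return float(np.sqrt(np.mean((f - o) ** 2)))

        # Score the same forecast against each analysis member in turn; the spread
        # of the scores reflects the observation (analysis) uncertainty.
        scores = [rmse(forecast, member) for member in analysis]
        print(np.percentile(scores, [25, 50, 75]))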

  10. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes.

  11. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  12. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected for all possible circumstances, the fact that FLCs cannot be verified to meet such requirements poses limitations on the applications for such technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of a FLC is proposed. Main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, though the analysis was inconclusive for Lyapunov stability.
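
    To give a flavor of such a check (a toy illustration, not the authors' tooling), the following satisfiability-modulo-theories query asks whether any input to an invented piecewise-polynomial controller violates the negative-feedback sign condition; an 'unsat' answer constitutes a proof over the whole input domain. Uses the z3-solver Python bindings:

        from z3 import Real, Solver, Or, And, unsat

        e = Real("e")   # error input to the controller
        u = Real("u")   # controller output

        s = Solver()
        s.add(e >= -1, e <= 1)                           # bounded input domain
        # Hypothetical piecewise-polynomial abstraction of a fuzzy controller:
        s.add(Or(And(e < 0,  u == -e - e * e * e),       # piece on [-1, 0)
                 And(e >= 0, u == -e - 2 * e * e * e)))  # piece on [0, 1]
        s.add(e * u > 0)  # look for a counterexample where u and e share a sign
        print("negative feedback verified" if s.check() == unsat else s.model())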

  13. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  14. Bar codes and intrinsic-surface-roughness tag: Accurate and low-cost accountability for CFE. [Conventional force equipment (CFE)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeVolpi, A.; Palm, R.

    CFE poses a number of verification challenges that could be met in part by an accurate and low-cost means of aiding in accountability of treaty-limited equipment. Although the treaty as signed does not explicitly call for the use of tags, there is a provision for recording 'serial numbers' and placing 'special marks' on equipment subject to reduction. There are approximately 150,000 residual items to be tracked for CFE-I, about half for each alliance of state parties. These highly mobile items are subject to complex treaty limitations: deployment limits and zones, ceilings and subceilings, holdings and allowances. There are controls and requirements for storage, conversion, and reduction. In addition, there are national security concerns regarding modernization and mobilization capability. As written into the treaty, a heavy reliance has been placed on human inspectors for CFE verification. Inspectors will mostly make visual observations and photographs as the means of monitoring compliance; these observations can be recorded by handwriting or keyed into a laptop computer. CFE is now less a treaty between two alliances than a treaty among 22 state parties, with inspection data and reports to be shared with each party in the official languages designated by CSCE. One of the potential roles for bar-coded tags would be to provide a universal, exchangeable, computer-compatible language for tracking TLE.

  15. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    NASA Technical Reports Server (NTRS)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. Key attributes are (1) a range of satellite data products and surface observations used to generate the land analysis products, (2) global 1/4-degree spatial resolution, and (3) model analyses generated every 3 hours. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Tools (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.

  16. KSC-04PD-2180

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. At Astrotech Space Operations in Titusville, Fla., Joe Galamback mounts a bracket on a solar panel on the Deep Impact spacecraft. Galamback is a lead mechanic technician with Ball Aerospace and Technologies Corp. in Boulder, Colo. The spacecraft is undergoing verification testing after its long road trip from Colorado. A NASA Discovery mission, Deep Impact will probe beneath the surface of Comet Tempel 1 on July 4, 2005, when the comet is 83 million miles from Earth, and reveal the secrets of its interior. After releasing a 3- by 3-foot projectile to crash onto the surface, Deep Impact's flyby spacecraft will collect pictures and data of how the crater forms, measuring the crater's depth and diameter, as well as the composition of the interior of the crater and any material thrown out, and determining the changes in natural outgassing produced by the impact. It will send the data back to Earth through the antennas of the Deep Space Network. The spacecraft is scheduled to launch Dec. 30, 2004, aboard a Boeing Delta II rocket from Launch Complex 17 at Cape Canaveral Air Force Station, Fla.

  17. Pore-scale modeling of moving contact line problems in immiscible two-phase flow

    NASA Astrophysics Data System (ADS)

    Kucala, Alec; Noble, David; Martinez, Mario

    2016-11-01

    Accurate modeling of moving contact line (MCL) problems is imperative in predicting capillary pressure vs. saturation curves, permeability, and preferential flow paths for a variety of applications, including geological carbon storage (GCS) and enhanced oil recovery (EOR). Here, we present a model for the moving contact line using pore-scale computational fluid dynamics (CFD) which solves the full, time-dependent Navier-Stokes equations using the Galerkin finite-element method. The MCL is modeled as a surface traction force proportional to the surface tension, dependent on the static properties of the immiscible fluid/solid system. We present a variety of verification test cases for simple two- and three-dimensional geometries to validate the current model, including threshold pressure predictions in flows through pore-throats for a variety of wetting angles. Simulations involving more complex geometries are also presented to be used in future simulations for GCS and EOR problems. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
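
    For context on the threshold-pressure verification cases mentioned above, the standard Young-Laplace entry-pressure relation for a cylindrical throat (a textbook result, not quoted from the abstract):

        % Threshold (entry) capillary pressure of a cylindrical pore throat of
        % radius r, for interfacial tension \sigma and static contact angle \theta:
        \Delta P_{\mathrm{threshold}} = \frac{2\,\sigma\cos\theta}{r}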

  18. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    NASA Technical Reports Server (NTRS)

    Jonathan L. Case; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally low in both experiments using traditional verification methodologies. Output from object-based verification within NCAR's Model Evaluation Tools (MET) reveals that the WRF runs initialized with LIS+MODIS data consistently generated precipitation objects that better matched observed precipitation objects, especially at higher precipitation intensities. The LIS+MODIS runs produced on average a 4% increase in matched precipitation areas and a simultaneous 4% decrease in unmatched areas during three months of daily simulations.
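
    A toy sketch of the object-based idea: label contiguous precipitation features in the forecast field and measure what fraction of forecast area overlaps an observed feature (synthetic binary fields; MET's object-based mode matches objects on many attributes, not just overlap):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        fcst = ndimage.binary_dilation(rng.random((50, 50)) > 0.97, iterations=2)
        obs = np.roll(fcst, shift=2, axis=1)   # synthetic "observed" field, displaced

        fcst_objects, n_fcst = ndimage.label(fcst)  # connected precipitation features
        matched = sum((fcst_objects == i).sum()
                      for i in range(1, n_fcst + 1)
                      if np.any(obs[fcst_objects == i]))
        print("matched forecast area fraction:", matched / fcst.sum())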

  20. Verification of Methods for Assessing the Sustainability of Monitored Natural Attenuation (MNA)

    DTIC Science & Technology

    2013-01-01

    Acronym-list excerpt from the report: ... surface; CVOC, chlorinated volatile organic compound; DCE, cis-1,2-dichloroethylene; DNAPL, dense non-aqueous phase liquid; DO, dissolved oxygen; DOC ... The indexed text continues: "...considered detailed representations of aquifer heterogeneity, DNAPL distributions, and interfacial surface area. Thus, the upscaled SZD function considers the effects of decreases in interfacial surface area with time as NAPL mass depletes, but not in an explicit manner. Likewise, the upscaled model is ..."

  1. Development and Experimental Verification of Surface Effects in a Fluidic Model

    DTIC Science & Technology

    2006-01-01

    Figure-caption excerpts from the report: "...from a He plasma inside a polystyrene microchannel"; "Figure 30: the emission spectra from a mixed hexafluoroethylene/He plasma inside the ... microchannel"; "Figure 35: the adsorption of glucose oxidase to different polymer surfaces was shown to have a significant effect on electroosmotic flow". The indexed text continues: "...approach involves neglecting non-ideal (convective-diffusive) effects by assuming well-mixed protein in contact with an idealized surface. Coupled ..."

  2. Orion GN&C Fault Management System Verification: Scope And Methodology

    NASA Technical Reports Server (NTRS)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
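
    A minimal sketch of the Monte Carlo half of this approach is given below, assuming a hypothetical single-fault scenario generator; the team's actual method uses rare-event sequential Monte Carlo (and a swarm search to find failure cases), which plain Monte Carlo only approximates here.

```python
import random

def estimate_failure_probability(simulate, n_trials=100_000, seed=0):
    """Crude Monte Carlo estimate of P(catastrophic outcome).
    `simulate` draws one random fault scenario and returns True on loss
    of crew/vehicle. Plain MC shown for clarity; rare-event sequential
    methods (e.g., splitting) are far more sample-efficient."""
    rng = random.Random(seed)
    failures = sum(simulate(rng) for _ in range(n_trials))
    return failures / n_trials

def toy_scenario(rng):
    # Hypothetical: loss occurs only if a thruster fault occurs AND the
    # detection latency exceeds a threshold (all numbers invented).
    fault = rng.random() < 0.01
    latency = rng.gauss(0.2, 0.1)
    return fault and latency > 0.45

print(estimate_failure_probability(toy_scenario))
```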

  3. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    PubMed

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32,446 websites, identifying 980 IEVs, and selected the 281 most popular for content analysis. This methodology yielded 31,239 websites for screening in 2014, identifying 3,096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (p<0.01), with a corresponding significant decrease in US-based vendors (71.9% in 2013 and 65% in 2014). Most vendors did little to prevent youth access in either year, with 67.6% in 2013 and 63.2% in 2014 employing no age verification or relying exclusively on strategies that cannot effectively verify age. Effective age verification strategies such as online age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, p<0.01) or age verification at delivery (6.4% in 2013 and 8.1% in 2014) were rarely advertised on IEV websites. Nearly all vendors advertised accepting credit cards, and about three-quarters advertised shipping via the United States Postal Service, similar to the internet cigarette industry prior to federal bans. The number of IEVs grew sharply from 2013 to 2014, with poor age verification practices. New and expanded regulations for online e-cigarette sales are needed, including strict age and identity verification requirements.

  4. MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)

    DOT National Transportation Integrated Search

    1997-11-01

    The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...

  5. Flight Testing the Rotor Systems Research Aircraft (RSRA)

    NASA Technical Reports Server (NTRS)

    Hall, G. W.; Merrill, R. K.

    1983-01-01

    In the late 1960s, efforts to advance the state-of-the-art in rotor systems technology indicated a significant gap existed between our ability to accurately predict the characteristics of a complex rotor system and the results obtained through flight verification. Even full scale wind tunnel efforts proved inaccurate because of the complex nature of a rotating, maneuvering rotor system. The key element missing, which prevented significant advances, was our inability to precisely measure the exact rotor state as a function of time and flight condition. Two Rotor Systems Research Aircraft (RSRA) were designed as pure research aircraft and dedicated rotor test vehicles whose function is to fill the gap between theory, wind tunnel testing, and flight verification. This paper describes the two aircraft, the development of the piloting techniques required to safely fly the compound helicopter, the government flight testing accomplished to date, and proposed future research programs.

  6. vvtools v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, Richard R.

    Vvtools is a suite of testing tools with a focus on reproducible verification and validation. They are written in pure Python and contain a test harness and an automated process-management tool. Users of vvtools can develop suites of verification and validation tests and run them on small to large high-performance computing resources in an automated and reproducible way. The test harness enables complex processes to be performed in each test and even supports a one-level parent/child dependency between tests. It includes a built-in capability to manage workloads requiring multiple processors and platforms that use batch queueing systems.

  7. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
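
    One check at the heart of this style of analysis can be sketched as follows: an interface abstraction is inadequate if two machine states displayed identically respond differently to the same user action, a potential source of mode confusion. The toy machine, its display map, and all names are hypothetical.

```python
# Minimal sketch of an interface-adequacy check. The machine model maps
# (state, action) -> next state; the display maps machine states to what
# the operator sees. Ambiguity = one displayed mode, divergent reactions.
from collections import defaultdict

machine = {
    ("armed", "press"): "firing",
    ("safe", "press"): "safe",
    ("firing", "press"): "safe",
}
display = {"armed": "READY", "safe": "READY", "firing": "ACTIVE"}

def find_ambiguities(machine, display):
    reactions = defaultdict(set)
    for (state, action), nxt in machine.items():
        reactions[(display[state], action)].add(display[nxt])
    return {k: v for k, v in reactions.items() if len(v) > 1}

print(find_ambiguities(machine, display))
# {('READY', 'press'): {'ACTIVE', 'READY'}} -> the READY display hides
# two machine states that react differently to the same action.
```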

  8. Highly efficient simulation environment for HDTV video decoder in VLSI design

    NASA Astrophysics Data System (ADS)

    Mao, Xun; Wang, Wei; Gong, Huimin; He, Yan L.; Lou, Jian; Yu, Lu; Yao, Qingdong; Pirsch, Peter

    2002-01-01

    With the increasing complexity of VLSI designs, especially SoC (System on Chip) implementations of MPEG-2 video decoders with HDTV scalability, simulation and verification of the full design, even at the behavioral level in HDL, often prove very slow and costly, and full verification is difficult to perform until late in the design process. These tasks therefore become the bottleneck of the HDTV video decoder design flow and strongly influence its time to market. In this paper, the architecture of the hardware/software interface of an HDTV video decoder is studied, and a Hardware-Software Mixed Simulation (HSMS) platform is proposed to detect and correct errors in the early design stages, based on the MPEG-2 video decoding algorithm. Several approaches are introduced for applying the HSMS to the target system; these approaches speed up the simulation and verification task without degrading performance.

  9. A model-based design and validation approach with OMEGA-UML and the IF toolset

    NASA Astrophysics Data System (ADS)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent, embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies supported by (1) high-level design concepts for mastering the design complexity, (2) concepts for expressing non-functional requirements and (3) analysis tools for verifying or refuting that the system under development can conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  10. Fast regional readout CMOS Image Sensor for dynamic MLC tracking

    NASA Astrophysics Data System (ADS)

    Zin, H.; Harris, E.; Osmond, J.; Evans, P.

    2014-03-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype complementary metal-oxide-semiconductor image sensor (CIS) for tracking these complex treatments by utilising its fast, region-of-interest (ROI) readout functionality. An automatic edge-tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangular field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion to within 1.0 mm accuracy. This demonstrates the feasibility of using the sensor to verify treatment deliveries involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPIDs). The CIS provides the basis for an essential real-time verification tool, useful in assessing the accurate delivery of complex high-energy radiation to the tumour and ultimately in achieving better cure rates for cancer patients.
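
    A minimal sketch of gradient-based leaf-edge localization on a 1-D intensity profile, with sub-pixel refinement by a parabolic fit, is shown below; the record does not specify the paper's algorithm, so the profile, pixel pitch, and method details are assumptions.

```python
import numpy as np

def leaf_edge_position(profile, pixel_pitch_mm=0.05):
    """Locate a leaf edge along a 1-D intensity profile as the position
    of the steepest gradient, refined to sub-pixel accuracy by fitting
    a parabola through the three samples around the gradient peak."""
    g = np.abs(np.gradient(profile.astype(float)))
    i = int(np.argmax(g[1:-1])) + 1          # skip the boundary samples
    denom = g[i - 1] - 2.0 * g[i] + g[i + 1]
    shift = 0.0 if denom == 0 else 0.5 * (g[i - 1] - g[i + 1]) / denom
    return (i + shift) * pixel_pitch_mm

# Hypothetical profile: open field on the left, leaf blocking the right.
x = np.arange(200)
profile = 1000.0 / (1.0 + np.exp((x - 80.3) / 2.0))
print(leaf_edge_position(profile))  # ~= 80.3 * 0.05 mm
```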

  11. Evolution of Software-Only-Simulation at NASA IV and V

    NASA Technical Reports Server (NTRS)

    McCarty, Justin; Morris, Justin; Zemerick, Scott

    2014-01-01

    Software-Only-Simulations have been an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification and Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from low-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is presented, with focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.

  12. Validation and Verification (V&V) of Safety-Critical Systems Operating Under Off-Nominal Conditions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2012-01-01

    Loss of control (LOC) remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft LOC accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. Research is underway at the National Aeronautics and Space Administration (NASA) in the development of advanced onboard system technologies for preventing or recovering from loss of vehicle control and for assuring safe operation under off-nominal conditions associated with aircraft LOC accidents. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses highly significant technical challenges and is the subject of a parallel research effort at NASA. This chapter summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft LOC accidents. A summary of recent research accomplishments in this effort is referenced.

  13. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  14. Calibration of Ge gamma-ray spectrometers for complex sample geometries and matrices

    NASA Astrophysics Data System (ADS)

    Semkow, T. M.; Bradt, C. J.; Beach, S. E.; Haines, D. K.; Khan, A. J.; Bari, A.; Torres, M. A.; Marrantino, J. C.; Syed, U.-F.; Kitto, M. E.; Hoffman, T. J.; Curtis, P.

    2015-11-01

    A comprehensive study of the efficiency calibration and calibration verification of Ge gamma-ray spectrometers was performed using semi-empirical, computational Monte-Carlo (MC), and transfer methods. The aim of this study was to evaluate the accuracy of the quantification of gamma-emitting radionuclides in complex matrices normally encountered in environmental and food samples. A wide range of gamma energies from 59.5 to 1836.0 keV and geometries from a 10-mL jar to a 1.4-L Marinelli beaker were studied on four Ge spectrometers with relative efficiencies between 102% and 140%. Density and coincidence summing corrections were applied. Innovative techniques were developed for the preparation of artificial complex matrices from materials such as acidified water, polystyrene, ethanol, sugar, and sand, resulting in densities ranging from 0.3655 to 2.164 g cm-3. They were spiked with gamma activity traceable to international standards and used for calibration verifications. A quantitative method of tuning MC calculations to experiment was developed based on a multidimensional chi-square paraboloid.
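
    A common semi-empirical form for the full-energy-peak efficiency curve fits the logarithm of efficiency as a low-order polynomial in the logarithm of energy; the sketch below illustrates that form with made-up calibration points, not the paper's data.

```python
import numpy as np

# Illustrative (not measured) calibration points: energy in keV and
# full-energy-peak efficiency for a hypothetical Ge detector/geometry.
energies = np.array([59.5, 122.1, 661.7, 1173.2, 1332.5, 1836.0])
efficiencies = np.array([0.012, 0.055, 0.021, 0.013, 0.0118, 0.0090])

# Semi-empirical fit: ln(eff) as a cubic polynomial in ln(E).
coeffs = np.polyfit(np.log(energies), np.log(efficiencies), deg=3)

def peak_efficiency(energy_kev):
    return float(np.exp(np.polyval(coeffs, np.log(energy_kev))))

print(peak_efficiency(500.0))  # interpolated efficiency at 500 keV
```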

  15. LIVVkit 2: An extensible land ice verification and validation toolkit for comparing observations and models

    NASA Astrophysics Data System (ADS)

    Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.

    2016-12-01

    Accurate representation of ice sheets and glaciers is essential for robust predictions of Arctic climate within Earth system models. Verification and validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to integrate easily into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible Python package for V&V, with components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, surface mass balance, etc. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework that lets ice-sheet modelers easily extend the model V&V analyses. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and the Earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website, allowing evaluations to be easily shared, published, and analysed throughout the Arctic and Earth system communities.

  16. Engineering support activities for the Apollo 17 Surface Electrical Properties Experiment.

    NASA Technical Reports Server (NTRS)

    Cubley, H. D.

    1972-01-01

    Description of the engineering support activities which were required to ensure fulfillment of objectives specified for the Apollo 17 SEP (Surface Electrical Properties) Experiment. Attention is given to procedural steps involving verification of hardware acceptability to the astronauts, computer simulation of the experiment hardware, field trials, receiver antenna pattern measurements, and the qualification test program.

  17. Verification, Validation and Accreditation using AADL

    DTIC Science & Technology

    2011-05-03

    Slide excerpt: context-specific constraints on printed wiring board (PWB) component height, including maximum surface-relative height (hsr) and maximum absolute height (ha), for an ABC_9230 warning-module PWB with a digital oscillator; the accompanying equations are garbled in the record.

  18. Surface contamination analysis technology team overview

    NASA Astrophysics Data System (ADS)

    Burns, H. Dewitt, Jr.

    1996-11-01

    The surface contamination analysis technology (SCAT) team originated as a working group of NASA civil service, Space Shuttle contractor, and university groups. Participating members of the SCAT team have included personnel from NASA Marshall Space Flight Center's Materials and Processes Laboratory and Langley Research Center's Instrument Development Group; contractors, including Thiokol Corporation's Inspection Technology Group, the AC Engineering support contractor, Aerojet, SAIC, the Lockheed Martin/Oak Ridge Y-12 support contractor, and the Shuttle External Tank prime contractor; and the University of Alabama in Huntsville's Center for Robotics and Automation. The goal of the SCAT team as originally defined was to develop and integrate a multi-purpose inspection head for robotic application to in-process inspection of contamination-sensitive surfaces. One area of interest was replacement of the ozone-depleting solvents currently used for surface cleanliness verification. The team approach brought together the appropriate personnel to determine which surface inspection techniques were applicable to multi-program surface cleanliness inspection. Major substrates of interest were chosen to simulate Space Shuttle critical bonding surfaces or surfaces sensitive to contamination, such as fuel system component surfaces. Inspection techniques evaluated include optically stimulated electron emission (photoelectron emission); Fourier transform infrared spectroscopy; near-infrared fiber optic spectroscopy; and ultraviolet fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years of the initiation of this effort in 1992. Instrumentation specifications and designs developed under this effort include a portable diffuse-reflectance FTIR system built by Surface Optics Corporation and a third-generation optically stimulated electron emission system built by LaRC. This paper discusses the evaluation of the various techniques on a number of substrate materials contaminated with hydrocarbons, silicones, and fluorocarbons. Discussion also includes standards development for instrument calibration and testing.

  19. Validation and verification of a virtual environment for training naval submarine officers

    NASA Astrophysics Data System (ADS)

    Zeltzer, David L.; Pioch, Nicholas J.

    1996-04-01

    A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation and the derivation of application requirements are briefly described. This is followed by a discussion of the implementation of the prototype. The implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.

  20. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
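
    The TOC-to-NVR conversion described above reduces to a division by the contaminant-specific sensitivity factor; a minimal sketch, with illustrative numbers only:

```python
def nvr_mg_per_ft2(toc_ppm_carbon, sensitivity):
    """NVR level (mg/sq ft) inferred from a TOC reading (ppm carbon)
    and a contaminant-specific sensitivity factor (ppm carbon per
    mg/sq ft), per the correlation described above."""
    return toc_ppm_carbon / sensitivity

# Hypothetical: 2.4 ppm C measured; sensitivity 1.6 ppm C per mg/sq ft.
print(nvr_mg_per_ft2(2.4, 1.6))  # 1.5 mg/sq ft
```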

  1. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m(exp 2).

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Xiong; Viswanathan, Akila; Stewart, Alexandra J.

    Cumulative dose distributions in fractionated radiation therapy depict the dose to normal tissues and therefore may permit an estimation of the risk of normal tissue complications. However, calculation of these distributions is highly challenging because of interfractional changes in the geometry of patient anatomy. This work presents an algorithm for deformable structure registration of the bladder and the verification of the accuracy of the algorithm using phantom and patient data. In this algorithm, the registration process involves conformal mapping of genus zero surfaces using finite element analysis, guided by three control landmarks. The registration produces a correspondence between fractions of the triangular meshes used to describe the bladder surface. For validation of the algorithm, two types of balloons were inflated gradually to three times their original size, and several computerized tomography (CT) scans were taken during the process. The registration algorithm yielded a local accuracy of 4 mm along the balloon surface. The algorithm was then applied to CT data of patients receiving fractionated high-dose-rate brachytherapy to the vaginal cuff, with the vaginal cylinder in situ. The patients' bladder filling status was intentionally different for each fraction. The three required control landmark points were identified for the bladder based on anatomy. Out of an Institutional Review Board (IRB) approved study of 20 patients, 3 had radiographically identifiable points near the bladder surface that were used for verification of the accuracy of the registration. The verification point as seen in each fraction was compared with its predicted location based on affine as well as deformable registration. Despite the variation in bladder shape and volume, the deformable registration was accurate to 5 mm, consistently outperforming the affine registration. We conclude that the structure registration algorithm presented works with reasonable accuracy and provides a means of calculating cumulative dose distributions.
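
    The verification step described above amounts to measuring the distance between each verification point's predicted and observed positions; a minimal sketch with hypothetical coordinates:

```python
import numpy as np

def target_registration_error(predicted, observed):
    """Euclidean distance between where a registration maps a
    verification point and where that point is actually seen in the
    next fraction; the 4 mm / 5 mm figures above are errors of this
    kind."""
    return float(np.linalg.norm(np.asarray(predicted) - np.asarray(observed)))

print(target_registration_error([10.2, -3.1, 45.0], [10.6, -2.8, 44.7]))
# ~0.58, in the same units as the coordinates (e.g., mm)
```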

  3. Explicit Pharmacokinetic Modeling: Tools for Documentation, Verification, and Portability

    EPA Science Inventory

    Quantitative estimates of tissue dosimetry of environmental chemicals due to multiple exposure pathways require the use of complex mathematical models, such as physiologically-based pharmacokinetic (PBPK) models. The process of translating the abstract mathematics of a PBPK mode...

  4. The mark of vegetation change on Earth's surface energy balance: data-driven diagnostics and model validation

    NASA Astrophysics Data System (ADS)

    Cescatti, A.; Duveiller, G.; Hooker, J.

    2017-12-01

    Changing vegetation cover not only affects the atmospheric concentration of greenhouse gases but also alters the radiative and non-radiative properties of the surface. The result of competing biophysical processes on Earth's surface energy balance varies spatially and seasonally, and can lead to warming or cooling depending on the specific vegetation change and on the background climate. To date these effects are not accounted for in land-based climate policies because of the complexity of the phenomena, contrasting model predictions and the lack of global data-driven assessments. To overcome the limitations of available observation-based diagnostics and of the on-going model inter-comparison, here we present a new benchmarking dataset derived from satellite remote sensing. This global dataset provides the potential changes induced by multiple vegetation transitions on the single terms of the surface energy balance. We used this dataset for two major goals: 1) Quantify the impact of actual vegetation changes that occurred during the decade 2000-2010, showing the overwhelming role of tropical deforestation in warming the surface by reducing evapotranspiration despite the concurrent brightening of the Earth. 2) Benchmark a series of ESMs against data-driven metrics of the land cover change impacts on the various terms of the surface energy budget and on the surface temperature. We anticipate that the dataset could be also used to evaluate future scenarios of land cover change and to develop the monitoring, reporting and verification guidelines required for the implementation of mitigation plans that account for biophysical land processes.

  5. Surface contamination analysis technology team overview

    NASA Technical Reports Server (NTRS)

    Burns, H. Dewitt

    1995-01-01

    A team was established which consisted of representatives from NASA (Marshall Space Flight Center and Langley Research Center), Thiokol Corporation, the University of Alabama in Huntsville, AC Engineering, SAIC, Martin Marietta, and Aerojet. The team's purpose was to bring together the appropriate personnel to determine what surface inspection techniques were applicable to multiprogram bonding surface cleanliness inspection. In order to identify appropriate techniques and their sensitivity to various contaminant families, calibration standards were developed. Producing standards included development of consistent low level contamination application techniques. Oxidation was also considered for effect on inspection equipment response. Ellipsometry was used for oxidation characterization. Verification testing was then accomplished to show that selected inspection techniques could detect subject contaminants at levels found to be detrimental to critical bond systems of interest. Once feasibility of identified techniques was shown, selected techniques and instrumentation could then be incorporated into a multipurpose inspection head and integrated with a robot for critical surface inspection. Inspection techniques currently being evaluated include optically stimulated electron emission (OSEE); near infrared (NIR) spectroscopy utilizing fiber optics; Fourier transform infrared (FTIR) spectroscopy; and ultraviolet (UV) fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years from initiation of this effort in 1992 assuming appropriate funding levels are maintained. This paper gives an overview of work accomplished by the team and future plans.

  6. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
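
    The extension mechanism described above can be pictured as a registry of analysis plug-ins that a configured suite runs; the sketch below uses hypothetical names and analyses, not LIVVkit's actual API.

```python
# Toy plug-in registry: analyses register themselves, and the toolkit
# runs whichever ones the configuration lists.
REGISTRY = {}

def register(name):
    def wrap(fn):
        REGISTRY[name] = fn
        return fn
    return wrap

@register("bit_for_bit")
def bit_for_bit(model, reference):
    return {"passed": model == reference}

@register("thickness_rmse")
def thickness_rmse(model, reference):
    rmse = (sum((m - r) ** 2 for m, r in zip(model, reference))
            / len(model)) ** 0.5
    return {"rmse": rmse}

def run_suite(config, model, reference):
    return {name: REGISTRY[name](model, reference) for name in config}

print(run_suite(["thickness_rmse"], [1.0, 2.0], [1.1, 1.9]))
```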

  7. Three-dimensional surface contouring of macroscopic objects by means of phase-difference images.

    PubMed

    Velásquez Prieto, Daniel; Garcia-Sucerquia, Jorge

    2006-09-01

    We report a technique to determine the 3D contour of objects with dimensions at least 4 orders of magnitude larger than the illumination optical wavelength. Our proposal is based on the numerical reconstruction of the optical wave field of digitally recorded holograms. The modulo-2pi phase map required in any contouring process is obtained by direct subtraction of two phase-contrast images acquired under different illumination angles, creating a phase-difference image of a still object. Obtaining the phase-difference images is only possible by using the capability of numerical reconstruction of the complex optical field provided by digital holography. This unique characteristic leads to a robust, reliable, and fast procedure that requires only two images. A theoretical analysis of the contouring system is shown, with verification by means of numerical and experimental results.
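
    The core numerical step, forming the modulo-2pi phase-difference map from two reconstructed complex fields, can be sketched as follows: computing angle(U1 * conj(U2)) wraps the difference into (-pi, pi] in a single operation. The synthetic fields below are illustrative.

```python
import numpy as np

def wrapped_phase_difference(field1, field2):
    """Modulo-2*pi phase map from two numerically reconstructed complex
    optical fields (two illumination angles): the product with the
    conjugate subtracts the phases, and angle() wraps the result."""
    return np.angle(field1 * np.conj(field2))

# Toy example: the two reconstructions differ by a tilted-plane phase.
ny, nx = 4, 6
y, x = np.mgrid[0:ny, 0:nx]
u1 = np.exp(1j * 0.9 * x)
u2 = np.exp(1j * (0.9 * x - 0.4 * y))   # second illumination angle
print(wrapped_phase_difference(u1, u2))  # rows increase by 0.4 rad
```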

  8. Alloy-assisted deposition of three-dimensional arrays of atomic gold catalyst for crystal growth studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Yin; Jiang, Yuanwen; Cherukara, Mathew J.

    Large-scale assembly of individual atoms over smooth surfaces is difficult to achieve. A configuration of an atom reservoir, in which individual atoms can be readily extracted, may successfully address this challenge. In this work, we demonstrate that a liquid gold-silicon alloy established in classical vapor-liquid-solid growth can deposit ordered and three-dimensional rings of isolated gold atoms over silicon nanowire sidewalls. Here, we perform ab initio molecular dynamics simulation and unveil a surprising single atomic gold-catalyzed chemical etching of silicon. Experimental verification of this catalytic process in silicon nanowires yields dopant-dependent, massive and ordered 3D grooves with spacing down to ~5 nm. Finally, we use these grooves as self-labeled and ex situ markers to resolve several complex silicon growths, including the formation of nodes, kinks, scale-like interfaces, and curved backbones.

  9. The BEFWM system for detection and phase conjugation of a weak laser beam

    NASA Astrophysics Data System (ADS)

    Khizhnyak, Anatoliy; Markov, Vladimir

    2007-09-01

    Real environmental conditions, such as atmospheric turbulence and aero-optics effects, make practical implementation of the target-in-the-loop (TIL) algorithm a very difficult task, especially when the system must operate with a signal from a diffuse-surface, image-resolved object. The problem becomes even more complex because, for a remote object, the intensity of the returned signal is extremely low. This presentation discusses the results of an analysis and experimental verification of a thresholdless coherent signal-receiving system, capable not only of high-sensitivity detection of ultra-weak object-scattered light but also of its high-gain amplification and phase conjugation. Coherent detection using Brillouin Enhanced Four Wave Mixing (BEFWM) enables retrieval of complete information on the received signal, including accurate measurement of its wavefront. This information can be used for direct real-time control of the adaptive mirror.

  10. Experimental Verification of Entanglement Generated in a Plasmonic System.

    PubMed

    Dieleman, F; Tame, M S; Sonnefraud, Y; Kim, M S; Maier, S A

    2017-12-13

    A core process in many quantum tasks is the generation of entanglement. It is being actively studied in a variety of physical settings, from simple bipartite systems to complex multipartite systems. In this work we experimentally study the generation of bipartite entanglement in a nanophotonic system. Entanglement is generated via the quantum interference of two surface plasmon polaritons in a beamsplitter structure, i.e., utilizing the Hong-Ou-Mandel (HOM) effect, and its presence is verified using quantum state tomography. The amount of entanglement is quantified by the concurrence, and we find values of up to 0.77 ± 0.04. Verifying entanglement in the output state from HOM interference is a nontrivial task and cannot be inferred from the visibility alone. The techniques we use to verify entanglement could be applied to other types of photonic systems and therefore may be useful for the characterization of a range of different nanophotonic quantum devices.
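
    The concurrence quoted above is the standard Wootters measure, computable directly from a reconstructed two-qubit density matrix; a sketch, verified here on a Bell state rather than the experiment's tomographic data:

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho:
    C = max(0, l1 - l2 - l3 - l4), where l_i are the square roots of
    the eigenvalues of rho * (sy x sy) * conj(rho) * (sy x sy),
    sorted in decreasing order."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    r = rho @ yy @ rho.conj() @ yy
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(r))))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Maximally entangled Bell state |phi+> = (|00> + |11>)/sqrt(2): C = 1.
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(round(concurrence(rho), 6))  # 1.0
```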

  11. Alloy-assisted deposition of three-dimensional arrays of atomic gold catalyst for crystal growth studies

    DOE PAGES

    Fang, Yin; Jiang, Yuanwen; Cherukara, Mathew J.; ...

    2017-12-08

    Large-scale assembly of individual atoms over smooth surfaces is difficult to achieve. A configuration of an atom reservoir, in which individual atoms can be readily extracted, may successfully address this challenge. In this work, we demonstrate that a liquid gold-silicon alloy established in classical vapor-liquid-solid growth can deposit ordered and three-dimensional rings of isolated gold atoms over silicon nanowire sidewalls. Here, we perform ab initio molecular dynamics simulation and unveil a surprising single atomic gold-catalyzed chemical etching of silicon. Experimental verification of this catalytic process in silicon nanowires yields dopant-dependent, massive and ordered 3D grooves with spacing down to similarmore » to 5 nm. Finally, we use these grooves as self-labeled and ex situ markers to resolve several complex silicon growths, including the formation of nodes, kinks, scale-like interfaces, and curved backbones.« less

  12. Finite element simulation and experimental verification of ultrasonic non-destructive inspection of defects in additively manufactured materials

    NASA Astrophysics Data System (ADS)

    Taheri, H.; Koester, L.; Bigelow, T.; Bond, L. J.

    2018-04-01

    Industrial applications of additively manufactured components are increasing quickly. Adequate quality control of the parts is necessary to ensure safety when using these materials. Base material properties, surface conditions, and the location and size of defects are some of the main targets for nondestructive evaluation of additively manufactured parts, and the problem of adequate characterization is compounded by the challenges of complex part geometry. Numerical modeling allows the interplay of the various factors to be studied, which can lead to improved measurement design. This paper presents a finite element simulation, verified by experimental results, of ultrasonic waves scattering from flat-bottom holes (FBHs) in additively manufactured materials. A focused-beam immersion ultrasound transducer was used for both the modeling and the experimental measurements on the additively manufactured samples. The samples were 17-4 PH stainless steel, made by laser sintering in a powder bed.

  13. Comprehensive analytical model for locally contacted rear surface passivated solar cells

    NASA Astrophysics Data System (ADS)

    Wolf, Andreas; Biro, Daniel; Nekarda, Jan; Stumpp, Stefan; Kimmerle, Achim; Mack, Sebastian; Preu, Ralf

    2010-12-01

    For optimum performance of solar cells featuring a locally contacted rear surface, the metallization fraction as well as the size and distribution of the local contacts are crucial, since Ohmic and recombination losses have to be balanced. In this work we present a set of equations that enables this trade-off to be calculated without the need for numerical simulations. Our model combines established analytical and empirical equations to predict the energy conversion efficiency of a locally contacted device. For experimental verification, we fabricate devices from float-zone silicon wafers of different resistivity, using the laser-fired contact technology to form the local rear contacts. The detailed characterization of test structures enables the determination of important physical parameters, such as the surface recombination velocity at the contacted area and the spreading resistance of the contacts. Our analytical model reproduces the experimental results very well and correctly predicts the optimum contact spacing without the use of free fitting parameters. We use our model to estimate the optimum bulk resistivity for locally contacted devices fabricated from conventional Czochralski-grown silicon material. These calculations use literature values for the stable minority carrier lifetime to account for the bulk recombination caused by the formation of boron-oxygen complexes under carrier injection.
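
    The trade-off the model captures can be illustrated with a toy loss function in which Ohmic spreading loss grows with contact pitch while contact recombination shrinks; the functional forms and all numbers below are illustrative stand-ins, not the paper's equations.

```python
import numpy as np

def total_loss(pitch_cm, a_cm=15e-4, rho=1.0, w=0.018,
               s_pass=10.0, s_cont=1e4):
    """Toy loss model for a locally contacted rear surface: a spreading
    resistance term that grows with pitch plus an area-weighted
    recombination term that grows with metallization fraction.
    Weights and forms are illustrative only."""
    f = np.pi * a_cm**2 / pitch_cm**2                       # metal fraction
    r_spread = rho * pitch_cm**2 / (2 * np.pi * a_cm * w)   # Ohmic term
    s_eff = s_pass * (1 - f) + s_cont * f                   # recombination
    return 1e-3 * r_spread + 1e-3 * s_eff                   # arbitrary weights

pitches = np.linspace(0.02, 0.30, 200)          # 200 um to 3 mm
losses = [total_loss(p) for p in pitches]
print(f"optimum pitch ~ {pitches[int(np.argmin(losses))] * 1e4:.0f} um")
```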

  14. WRF Simulation over the Eastern Africa by use of Land Surface Initialization

    NASA Astrophysics Data System (ADS)

    Sakwa, V. N.; Case, J.; Limaye, A. S.; Zavodsky, B.; Kabuchanga, E. S.; Mungai, J.

    2014-12-01

    The East Africa region experiences severe weather events associated with hazards of varying magnitude. It receives heavy precipitation that leads to widespread flooding, while a lack of sufficient rainfall in some areas results in drought. Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). The supply of heat and moisture depends on the state of the land surface, which interacts with the atmospheric boundary layer to produce excessive precipitation or, in its absence, severe drought. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and subtropics. These daytime heat and moisture fluxes can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Improved modeling capabilities within the region have the potential to enhance forecast guidance in support of daily operations and high-impact weather over East Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Non-hydrostatic Mesoscale Model (NMM) dynamical core. It makes use of the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over Eastern Africa. SPoRT and SERVIR provide land surface initialization datasets and a model verification tool. The NASA Land Information System (LIS) provides real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Model verification is done using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. These MET tools enable KMS to monitor model forecast accuracy in near real time. This study highlights verification results of WRF runs over East Africa using the LIS land surface initialization.

  15. Verification of fluid-structure-interaction algorithms through the method of manufactured solutions for actuator-line applications

    NASA Astrophysics Data System (ADS)

    Vijayakumar, Ganesh; Sprague, Michael

    2017-11-01

    Demonstrating expected convergence rates with spatial- and temporal-grid refinement is the "gold standard" of code and algorithm verification. However, the lack of analytical solutions, and the difficulty of generating manufactured solutions, presents challenges for verifying codes for complex systems. The application of the method of manufactured solutions (MMS) to verification of coupled multi-physics phenomena like fluid-structure interaction (FSI) has only seen recent investigation. While many FSI algorithms for aeroelastic phenomena have focused on boundary-resolved CFD simulations, the actuator-line representation of the structure is widely used for FSI simulations in wind-energy research. In this work, we demonstrate the verification of an FSI algorithm using MMS for actuator-line CFD simulations with a simplified structural model. We use a manufactured solution for the fluid velocity field and the displacement of the spring-mass-damper (SMD) system. We demonstrate the convergence of both the fluid and structural solvers to second-order accuracy with grid and time-step refinement. This work was funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Wind Energy Technologies Office, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.
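
    The convergence demonstration reduces to computing the observed order of accuracy from errors on successively refined grids; a minimal sketch with hypothetical error values:

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order of accuracy from errors on two grids in an MMS
    study: p = log(e_coarse / e_fine) / log(r). Second-order convergence
    gives p ~= 2 under refinement by a factor r."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Hypothetical L2 errors from three successive grid halvings:
errors = [4.0e-3, 1.01e-3, 2.52e-4]
for e0, e1 in zip(errors, errors[1:]):
    print(f"p = {observed_order(e0, e1):.2f}")   # both ~2.0
```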

  16. The role of the real-time simulation facility, SIMFAC, in the design, development and performance verification of the Shuttle Remote Manipulator System (SRMS) with man-in-the-loop

    NASA Technical Reports Server (NTRS)

    Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.

    1980-01-01

    The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flightlike crew station patterned after the orbiter aft flight deck with all necessary man-machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those which support SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.

  17. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.
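
    A standard metric for this kind of test/model correlation is the Modal Assurance Criterion (MAC); the abstract does not state which metric the Galileo program used, so the sketch below is illustrative of the general practice.

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal Assurance Criterion matrix between test and analytical
    mode shapes (one mode per column). Diagonal entries near 1 with
    small off-diagonal entries indicate good test/model correlation."""
    num = np.abs(phi_test.T @ phi_fem) ** 2
    den = np.outer(np.sum(phi_test**2, axis=0), np.sum(phi_fem**2, axis=0))
    return num / den

# Hypothetical 3-DOF shapes: two test modes vs. two analytical modes.
phi_t = np.array([[1.0, 0.9], [2.0, -1.1], [3.0, 0.2]])
phi_a = np.array([[1.1, 1.0], [2.1, -1.0], [2.9, 0.3]])
print(np.round(mac(phi_t, phi_a), 3))  # near-identity => good correlation
```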

  18. Preliminary report on the Black Thunder, Wyoming CTBT R and D experiment quicklook report: LLNL input from regional stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P.E.; Glenn, L.A.

    This report presents a preliminary summary of the data recorded at three regional seismic stations from surface blasting at the Black Thunder Coal Mine in northeast Wyoming. The regional stations are part of a larger effort that includes many more seismic stations in the immediate vicinity of the mine. The overall purpose of this effort is to characterize the source function and propagation characteristics of large typical surface mine blasts. A detailed study of source and propagation features of conventional surface blasts is a prerequisite to attempts at discriminating this type of blasting activity from other sources of seismic events. The Black Thunder Seismic experiment is a joint verification effort to determine seismic source and path effects that result from very large, but routine ripple-fired surface mining blasts. Studies of the data collected will be for the purpose of understanding how the near-field and regional seismic waveforms from these surface mining blasts are similar to, and different from, point shot explosions and explosions at greater depth. The Black Hills Station is a Designated Seismic Station that was constructed for temporary occupancy by the Former Soviet Union seismic verification scientists in accordance with the Threshold Test Ban Treaty protocol.

  19. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    NASA Technical Reports Server (NTRS)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept, ground-based damage detection and inspection system.

  20. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  1. Development of automated optical verification technologies for control systems

    NASA Astrophysics Data System (ADS)

    Volegov, Peter L.; Podgornov, Vladimir A.

    1999-08-01

    The report considers optical techniques for the automated verification of an object's identity, designed for control systems at nuclear facilities. Results are presented of experimental research and of the development of pattern recognition techniques, carried out under ISTC project number 772, with the purpose of identifying unique features of the surface structure of a controlled object and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of US and Russian laboratories' lab-to-lab cooperation, including the development of up-to-date systems for nuclear material control and accounting, are examined.

  2. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes into new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor, which covers rare events for which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort load transmission, leading to the unknowing acceptance of submarginal structures or the premature fracture of reliable ones. A technique was developed, using measured strains from back-to-back surface-mounted gauges, to analyze, define, and monitor induced moments and in-plane forces through progressive material changes from totally elastic to totally inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment to axial load.

  4. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1982-01-01

    The use of superminicomputers for solving a series of increasingly complex thermal analysis problems is investigated. The approach involved (1) installation and verification of the SPAR thermal analyzer software on superminicomputers at Langley Research Center and Goddard Space Flight Center, (2) solution of six increasingly complex thermal problems on this equipment, and (3) comparison of solutions (accuracy, CPU time, turnaround time, and cost) with those obtained on large mainframe computers.

  5. An approach for modelling snowcover ablation and snowmelt runoff in cold region environments

    NASA Astrophysics Data System (ADS)

    Dornes, Pablo Fernando

    Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determining the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work applies a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data, while retaining integrity in the process representations. The modelling strategy is based on the incorporation of detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomena; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models: a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff, whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation. Application of the same modelling approach at a larger scale using the same landscape-based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape-based parameters are a feasible regionalisation framework for distributed and physically based models. In summary, the proposed modelling philosophy, based on the combination of inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.

  6. Verification of the Icarus Material Response Tool

    NASA Technical Reports Server (NTRS)

    Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre

    2017-01-01

    Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing is critical, it is of the utmost importance that design tools be extensively verified and validated before use. Verification tests aim to ensure that the numerical schemes and equations are implemented correctly, by comparison to analytical solutions and through grid convergence tests.
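
    A grid convergence test of the kind mentioned above usually reduces to computing an observed order of accuracy from solutions on three systematically refined grids. A minimal sketch of that standard calculation (generic, not Icarus-specific code) is:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three grids refined by a constant
    ratio r (h_coarse = r * h_medium = r^2 * h_fine):
        p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
    """
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Toy check: a quantity converging at second order on grids h = 0.4, 0.2, 0.1
f = [1.0 + 0.5 * h**2 for h in (0.4, 0.2, 0.1)]
print(observed_order(*f, r=2))  # -> 2.0
```

    An observed order that matches the scheme's formal order is the usual pass criterion for such a test.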

  7. Circulation of spoof surface plasmon polaritons: Implementation and verification

    NASA Astrophysics Data System (ADS)

    Pan, Junwei; Wang, Jiafu; Qiu, Tianshuo; Pang, Yongqiang; Li, Yongfeng; Zhang, Jieqiu; Qu, Shaobo

    2018-05-01

    In this letter, we present the implementation and experimental verification of a broadband circulator for spoof surface plasmon polaritons (SSPPs). For ease of fabrication, a circulator operating in X band was first designed. Comb-like transmission lines (CL-TLs), a typical SSPP structure, are adopted as the three branches of the Y-junction. To enable broadband coupling of SSPPs, a transition section is added at each end of the CL-TLs. Through this design, the circulator can operate in the sub-wavelength SSPP mode over a broad band. Simulation results show that the insertion loss is less than 0.5 dB while the isolation and return loss are higher than 20 dB over 9.4-12.0 GHz. A prototype was fabricated and measured. The experimental results are consistent with the simulation results and verify the broadband circulation performance in X band.
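
    For readers outside microwave engineering, the quoted figures of merit follow the usual S-parameter conventions; the small sketch below, with hypothetical linear magnitudes, shows how the 0.5 dB and 20 dB thresholds are checked:

```python
import math

def to_db(mag):
    """Convert a linear S-parameter magnitude to dB."""
    return 20 * math.log10(mag)

# Hypothetical magnitudes for a circulator with ports 1 -> 2 -> 3 -> 1
s21, s11, s31 = 0.945, 0.09, 0.09
insertion_loss = -to_db(s21)   # ~0.49 dB, below the 0.5 dB spec
return_loss    = -to_db(s11)   # ~20.9 dB, above the 20 dB spec
isolation      = -to_db(s31)   # ~20.9 dB, reverse-path leakage
print(insertion_loss, return_loss, isolation)
```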

  8. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.

    PubMed

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-06-27

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
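
    The S/V calculation described above needs only the area and the enclosed volume of the scanned triangle mesh. A minimal, generic sketch for a closed, consistently oriented mesh (not the authors' software) uses the divergence theorem for the volume:

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed, consistently oriented
    triangle mesh. vertices: (n, 3) floats; faces: (m, 3) vertex indices.
    Volume is the absolute sum of signed tetrahedra to the origin."""
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces)
    a, b, c = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
    cross = np.cross(b - a, c - a)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    volume = abs(np.einsum('ij,ij->', a, cross)) / 6.0
    return area, volume

# For a scanned plant model the S/V ratio is then simply area / volume;
# a triangulated unit cube (area 6, volume 1) makes a quick sanity check.
```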

  9. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    PubMed Central

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  10. System engineering of the Atacama Large Millimeter/submillimeter Array

    NASA Astrophysics Data System (ADS)

    Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel

    2012-09-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high-precision antennas located at 5000 meters altitude in northern Chile. This paper presents the methodology, tools, and processes adopted to system-engineer a project of high technical complexity, carried out by system engineering teams that are remotely located and come from different cultures, under a demanding schedule and tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation, and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed tools to analyze system performance, incorporating key parameters that contribute to the ultimate performance, modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptance of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads the troubleshooting efforts during the testing phases of the construction project. Finally, the team is conducting system-level verification and diagnostics activities to assess the overall performance of the observatory. This paper also shares lessons learned from these system engineering and verification approaches.

  11. The integration of a mesh reflector to a 15-foot box truss structure. Task 3: Box truss analysis and technology development

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Thiemet, W. F.; Morosow, G.

    1987-01-01

    To demonstrate the design and integration of a reflective mesh surface on a deployable truss structure, a mesh reflector was installed on a 15-foot box truss cube. The specific features demonstrated include: (1) sewing seams in reflective mesh; (2) stretching the mesh to the desired preload; (3) installation of surface tie cords; (4) installation of the reflective surface on the truss; (5) setting of the reflective surface; (6) verification of surface shape/accuracy; (7) storage and deployment; (8) repeatability of the reflector surface; and (9) comparison of the surface with the shape predicted using analytical methods developed under a previous task.
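
    Surface accuracy verification of this kind is commonly reported as an RMS deviation of measured points from the predicted shape. A simplified sketch, assuming an ideal paraboloid z = (x² + y²)/(4F) and points already expressed in the reflector frame (a rigorous treatment would also best-fit the pose and focal length), is:

```python
import numpy as np

def surface_rms_error(points, focal_length):
    """RMS axial deviation of measured (x, y, z) points from an ideal
    paraboloid z = (x^2 + y^2) / (4 F) -- a simplified accuracy metric."""
    p = np.asarray(points, dtype=float)
    z_ideal = (p[:, 0]**2 + p[:, 1]**2) / (4.0 * focal_length)
    return np.sqrt(np.mean((p[:, 2] - z_ideal)**2))
```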

  12. Cleanup Verification Package for the 118-H-6:2, 105-H Reactor Ancillary Support Areas, Below-Grade Structures, and Underlying Soils; the 118-H-6:3, 105-H Reactor Fuel Storage Basin and Underlying Soils; The 118-H-6:3 Fuel Storage Basin Deep Zone Side Slope Soils; the 100-H-9, 100-H-10, and 100-H-13 French Drains; the 100-H-11 and 100-H-12 Expansion Box French Drains; and the 100-H-14 and 100-H-31 Surface Contamination Zones

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Appel

    2006-06-29

    This cleanup verification package documents completion of removal actions for the 105-H Reactor Ancillary Support Areas, Below-Grade Structures, and Underlying Soils (subsite 118-H-6:2); the 105-H Reactor Fuel Storage Basin and Underlying Soils (118-H-6:3); and the Fuel Storage Basin Deep Zone Side Slope Soils. This CVP also documents remedial actions for the following seven additional waste sites: French Drain C (100-H-9), French Drain D (100-H-10), Expansion Box French Drain E (100-H-11), Expansion Box French Drain F (100-H-12), French Drain G (100-H-13), Surface Contamination Zone H (100-H-14), and the Polychlorinated Biphenyl Surface Contamination Zone (100-H-31).

  13. Numerical study on 3D composite morphing actuators

    NASA Astrophysics Data System (ADS)

    Oishi, Kazuma; Saito, Makoto; Anandan, Nishita; Kadooka, Kevin; Taya, Minoru

    2015-04-01

    A number of actuators exploit the deformation of electroactive polymers (EAPs), but fewer papers have focused on the performance of 3D morphing actuators based on an analytical approach, due mainly to their complexity. The present paper introduces a numerical analysis of the large-scale deformation and motion of a 3D half-dome-shaped actuator composed of a thin soft membrane (passive material) and EAP strip actuators (EAP active coupons with electrodes on both surfaces), where the placement of the active EAP strips is a key parameter. The Simulia/Abaqus static and implicit analysis codes, whose main feature is high-precision contact analysis among structures, are used, focusing on the whole process of the membrane touching and wrapping around an object. The unidirectional properties of the EAP coupon actuator are used as the input material data set for the simulation, and our numerical model is verified by comparison with an existing 2D solution. The numerical results demonstrate the whole deformation process of the membrane wrapping around not only smoothly shaped objects like a sphere or an egg, but also irregularly shaped objects. A parametric study reveals the proper placement of the EAP coupon actuators, with modification of the dome shape to induce the relevant large-scale deformation. The numerical simulation of the 3D soft actuators shown in this paper could be applied to a wider range of soft 3D morphing actuators.

  14. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a "defense in depth" system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system and collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
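
    The detectability metric defined above is simple to compute once each incident is mapped to the verification layers that could catch it. A sketch with hypothetical incident data shows how single-layer and combined defense-in-depth coverage follow:

```python
# Hypothetical incident log: (failure mode, verification steps that catch it)
incidents = [
    ("patient positioning", {"in_vivo_epid"}),
    ("wrong prescription",  {"rules_based", "bayesian_network"}),
    ("patient positioning", set()),          # caught by no modeled layer
    ("documentation",       {"rules_based"}),
]

def detectability(incidents, steps):
    """Fraction of incidents detectable by at least one of the given steps."""
    caught = sum(1 for _, detectors in incidents if detectors & steps)
    return caught / len(incidents)

print(detectability(incidents, {"in_vivo_epid"}))      # one layer alone
print(detectability(incidents, {"in_vivo_epid", "rules_based",
                                "bayesian_network"}))  # defense in depth
```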

  15. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory performing the test. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. Activities related to the verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The importance of promoting the use of reference strains as controls in microbiology and the use of standard controls is stressed, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
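
    As one common convention for the accuracy and precision parameters mentioned above (the SEIMC procedure remains the authoritative definition), accuracy can be expressed as relative bias against a reference value and precision as the coefficient of variation of replicates:

```python
import statistics

def accuracy_and_precision(measurements, reference):
    """Accuracy as percent bias to a reference value; precision as the
    percent coefficient of variation (CV) of replicate measurements."""
    mean = statistics.fmean(measurements)
    bias_pct = 100 * (mean - reference) / reference
    cv_pct = 100 * statistics.stdev(measurements) / mean
    return bias_pct, cv_pct

# Replicate counts of a reference strain against its certified value
bias, cv = accuracy_and_precision([98, 103, 101, 97, 100], reference=100)
print(f"bias = {bias:+.1f} %, CV = {cv:.1f} %")
```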

  16. SU-F-T-269: Preliminary Experience of Kuwait Cancer Control Center (KCCC) On IMRT Treatment Planning and Pre-Treatment Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, TKR; Sherif, M; Subramanian, N

    Purpose: The complexity of IMRT delivery requires pre-treatment quality assurance and plan verification. KCCC has implemented IMRT clinically at a few sites and will extend it to all sites. Recently, our Varian linear accelerator and Eclipse planning system were upgraded from the Millennium 80 to the 120 Multileaf Collimator (MLC) and from v8.6 to v11.0, respectively. Our preliminary experience with pre-treatment quality assurance verification is discussed. Methods: Eight breast, three prostate, and one hypopharynx cancer patients were planned with step-and-shoot IMRT. All breast cases were planned before the upgrade, with 60% of cases treated. The ICRU 83 recommendations were followed for the dose prescription and OAR constraints for all cases. Point dose measurement was done with a CIRS cylindrical phantom and a PTW 0.125 cc ionization chamber, and the measured dose was compared with the calculated dose at the point of measurement. A MapCHECK diode array phantom was used for plan verification. Planned and measured doses were compared by applying a gamma index of 3% (dose difference) / 3 mm DTA (distance to agreement). For all cases, a plan is considered successful if more than 95% of the tested diodes pass the gamma test. A prostate case was chosen to compare plan verification before and after the upgrade. Results: Point dose measurements were in agreement with the calculated doses; the maximum deviation observed was 2.3%. The average gamma index passing rate was higher than 97% for plan verification of all cases. A similar result was observed for plan verification of the chosen prostate case before and after the upgrade. Conclusion: Our preliminary experience validates the accuracy of our QA process and provides confidence to extend IMRT to all sites in Kuwait.
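
    The 3%/3 mm gamma test quoted above combines a dose-difference criterion with a distance-to-agreement criterion. A deliberately simplified one-dimensional sketch of the metric (commercial QA systems work on 2D diode arrays with interpolation) is:

```python
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
    """Simplified 1-D global gamma index: dd is the fractional dose
    difference criterion, dta the distance-to-agreement in mm."""
    d_norm = dd * dose_ref.max()               # global dose normalization
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta) ** 2
        dose2 = ((dose_eval - di) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

x = np.arange(0.0, 100.0, 1.0)                 # measurement grid, mm
ref = 100 * np.exp(-((x - 50) / 20) ** 2)      # toy reference profile
ev = 100 * np.exp(-((x - 51) / 20) ** 2)       # measurement shifted by 1 mm
g = gamma_1d(x, ref, ev)
print(f"pass rate: {100 * np.mean(g <= 1):.1f} %")  # >= 95 % would pass here
```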

  17. Quantification of improvements in an operational global-scale ocean thermal analysis system. (Reannouncement with new availability information)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, R.M.; Harding, J.M.; Pollak, K.D.

    1992-02-01

    Global-scale analyses of ocean thermal structure produced operationally at the U.S. Navy's Fleet Numerical Oceanography Center are verified, along with an ocean thermal climatology, against unassimilated bathythermograph (bathy), satellite multichannel sea surface temperature (MCSST), and ship sea surface temperature (SST) data. Verification statistics are calculated from the three types of data for February-April of 1988 and February-April of 1990 in nine verification areas covering most of the open ocean in the Northern Hemisphere. The analyzed thermal fields were produced by version 1.0 of the Optimum Thermal Interpolation System (OTIS 1.0) in 1988, but by an upgraded version of this model, referred to as OTIS 1.1, in 1990. OTIS 1.1 employs exactly the same analysis methodology as OTIS 1.0. The principal difference is that OTIS 1.1 has twice the spatial resolution of OTIS 1.0 and consequently uses smaller spatial decorrelation scales and noise-to-signal ratios. As a result, OTIS 1.1 is able to represent more horizontal detail in the ocean thermal fields than its predecessor. Verification statistics for the SST fields derived from bathy and MCSST data are consistent with each other, showing similar trends and error levels. These data indicate that the analyzed SST fields are more accurate in 1990 than in 1988, and generally more accurate than climatology for both years. Verification statistics for the SST fields derived from ship data are inconsistent with those derived from the bathy and MCSST data, and show much higher error levels indicative of observational noise.

  18. To thine own self be true? Clarifying the effects of identity discrepancies on psychological distress and emotions.

    PubMed

    Kalkhoff, Will; Marcussen, Kristen; Serpe, Richard T

    2016-07-01

    After many years of research across disciplines, it remains unclear whether people are more motivated to seek appraisals that accurately match self-views (self-verification) or are as favorable as possible (self-enhancement). Within sociology, mixed findings in identity theory have fueled the debate. A problem here is that a commonly employed statistical approach does not take into account the direction of a discrepancy between how we see ourselves and how we think others see us in terms of a given identity, yet doing so is critical for determining which self-motive is at play. We offer a test of three competing models of identity processes, including a new "mixed motivations" model where self-verification and self-enhancement operate simultaneously. We compare the models using the conventional statistical approach versus response surface analysis. The latter method allows us to determine whether identity discrepancies involving over-evaluation are as distressing as those involving under-evaluation. We use nationally representative data and compare results across four different identities and multiple outcomes. The two statistical approaches lead to the same conclusions more often than not and mostly support identity theory and its assumption that people seek self-verification. However, response surface tests reveal patterns that are mistaken as evidence of self-verification by conventional procedures, especially for the spouse identity. We also find that identity discrepancies have different effects on distress and self-conscious emotions (guilt and shame). Our findings have implications not only for research on self and identity across disciplines, but also for many other areas of research that incorporate these concepts and/or use difference scores as explanatory variables. Copyright © 2016 Elsevier Inc. All rights reserved.
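
    A minimal sketch of the response surface approach on simulated data: fit a second-order polynomial of distress on self-view and reflected appraisal, then read off the slopes along the congruence (o = s) and incongruence (o = -s) lines. All coefficients and data here are illustrative only:

```python
import numpy as np

# Simulated data with a self-verification pattern: distress rises with the
# squared discrepancy between self-view (s) and reflected appraisal (o)
rng = np.random.default_rng(1)
s, o = rng.normal(size=500), rng.normal(size=500)
z = 0.4 * (o - s) ** 2 + rng.normal(scale=0.5, size=500)

# Response surface: z = b0 + b1*s + b2*o + b3*s^2 + b4*s*o + b5*o^2
X = np.column_stack([np.ones_like(s), s, o, s**2, s * o, o**2])
b, *_ = np.linalg.lstsq(X, z, rcond=None)

a1, a2 = b[1] + b[2], b[3] + b[4] + b[5]   # congruence line: linear, quadratic
a3, a4 = b[1] - b[2], b[3] - b[4] + b[5]   # incongruence line: linear, quadratic
print(a3, a4)  # a4 > 0 with a3 near 0: distress grows with discrepancy in
               # either direction -- the self-verification signature
```

    Unlike a simple difference score, the four surface slopes preserve the direction of the discrepancy, which is what allows over-evaluation and under-evaluation to be distinguished.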

  19. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limit set by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors, by excluding false ones, is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, making it too difficult to review and correct all of them; this can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows a varying bias that depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. We thus suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying an etch model.

  20. DISCOVER-AQ: a unique acoustic propagation verification and validation data set

    DOT National Transportation Integrated Search

    2015-08-09

    In 2013, the National Aeronautics and Space Administration conducted a month-long flight test for the Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality research effort in Houston...

  1. Applications of HCMM satellite data to the study of urban heating patterns

    NASA Technical Reports Server (NTRS)

    Carlson, T. N. (Principal Investigator)

    1980-01-01

    A research summary is presented and is divided into two major areas, one developmental and the other basic science. In the first, three sub-categories are discussed: image processing techniques, especially the method whereby surface temperature images are converted to images of surface energy budget, moisture availability, and thermal inertia; model development; and model verification. Basic science includes the use of a method to further the understanding of the urban heat island and anthropogenic modification of surface heating, evaporation over vegetated surfaces, and the effect of surface heat flux on plume spread.

  2. A Survey of Measurement, Mitigation, and Verification Field Technologies for Carbon Sequestration Geologic Storage

    NASA Astrophysics Data System (ADS)

    Cohen, K. K.; Klara, S. M.; Srivastava, R. D.

    2004-12-01

    The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's Program, carbon sequestration MM&V has numerous instrumental roles: measurement of a site's characteristics and capability for sequestration; monitoring of the site to ensure storage integrity; verification that the CO2 is safely stored; and protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and the frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below-ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role in MM&V. It is likely that successful MM&V programs will incorporate multiple technologies, including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; surface/near-surface methods such as soil gas monitoring and infrared sensors; and aerial and satellite imagery. This abstract describes results, similarities, and contrasts for funded studies from the U.S. DOE's Carbon Sequestration Program, including examples from the Sleipner North Sea Project, the Canadian Weyburn Field/Dakota Gasification Plant Project, the Frio Formation Texas Project, and the Yolo County Bioreactor Landfill Project. The abstract also addresses the following: How are the terms "measurement," "mitigation," and "verification" defined in the Program? What is the U.S. DOE's Carbon Sequestration Program Roadmap and what are the Roadmap goals for MM&V? What is the current status of MM&V technologies?

  3. Photogrammetric Verification of Fiber Optic Shape Sensors on Flexible Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Moore, Jason P.; Rogge, Matthew D.; Jones, Thomas W.

    2012-01-01

    Multi-core fiber (MCF) optic shape sensing offers the possibility of providing in-flight shape measurements of highly flexible aerospace structures and control surfaces for such purposes as gust load alleviation, flutter suppression, general flight control, and structural health monitoring. Photogrammetric measurements of a surface-mounted MCF shape sensing cable can be used to quantify the MCF installation path and verify measurement methods.
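
    The sensing principle being verified here is that bend strain in an off-axis fiber core is proportional to local curvature (strain = curvature × core offset). A heavily simplified planar sketch, with made-up parameters, reconstructs shape by integrating curvature into heading and then into position:

```python
import numpy as np

def shape_from_strain(strain, core_offset, ds):
    """Planar shape from distributed differential bend strain along a fiber.
    kappa = strain / core_offset; heading and position follow by cumulative
    integration (a 2-D simplification of true 3-D multi-core sensing)."""
    kappa = np.asarray(strain) / core_offset
    theta = np.cumsum(kappa) * ds          # bend angle along the fiber
    x = np.cumsum(np.cos(theta)) * ds
    y = np.cumsum(np.sin(theta)) * ds
    return x, y

# Constant-curvature check: a 1 m fiber with a 35 um core offset bent to a
# 2 m radius should trace a circular arc of radius ~2 m
n, ds, radius, offset = 1000, 1e-3, 2.0, 35e-6
x, y = shape_from_strain(np.full(n, offset / radius), offset, ds)
```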

  4. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  5. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
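
    Conceptually, the bit-for-bit evaluation mentioned above is the strictest possible regression check: any difference at all is a failure. A small sketch of the idea (not LIVVkit's actual API) also reports the difference magnitude, to help triage roundoff-level changes from real answer changes:

```python
import numpy as np

def bit_for_bit(test, reference):
    """Return (identical, max_abs_diff): identical is True only if the two
    arrays match exactly; otherwise the maximum absolute difference is
    reported for triage."""
    test, reference = np.asarray(test), np.asarray(reference)
    if test.shape != reference.shape:
        return False, None
    if np.array_equal(test, reference):
        return True, 0.0
    return False, float(np.max(np.abs(test - reference)))
```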

  6. Principles of Sterilization of Mars Descent Vehicle Elements

    NASA Astrophysics Data System (ADS)

    Trofimov, Vladislav; Deshevaya, Elena; Khamidullina, N.; Kalashnikov, Viktor

    Due to COSPAR's severe requirements on the permissible microbiological contamination of elements of a Mars descent spacecraft, as well as the complexity of their chemical composition and structure, the antimicrobial treatment (sterilization) of such S/C elements during integration requires a wide set of methods: chemical, ultraviolet, and radiation. The report analyzes all aspects of the applicable treatment methods for cleaning element surfaces and inner contents of microbiota. The analysis showed that the most important, predictable, and controllable method is radiation processing (for elements that do not change their properties after effective treatment). Experience with the application of ionizing radiation for the sterilization of medical and other products shows that, depending on the initial microbial contamination of lander elements, the required absorbed dose can be within the range of 12-35 kGy. The effect of irregularity of radiation absorption in elements of complex structure on the choice of radiation methodology was analyzed, and an algorithm was suggested for choosing effective radiation treatment conditions and controlling sterilization efficiency. An important phase in establishing the effective treatment conditions for each structural element is experimental verification of the actual microbiological contamination under S/C integration conditions, maximum reduction of contamination using other cleaning procedures (mechanical, chemical, ultraviolet), and determination of the radiation resistance of spore-forming microorganisms typical of the shops where space hardware is manufactured and assembled. Proceeding from three parameters (irregularity of radiation absorption in a given element, its initial microbial contamination, and the resistance of the microorganisms to radiation), the sterilization condition of the packaged object is chosen such that it prevents secondary contamination and ensures the required treatment reliability without final experimental microbiological verification, using only simple control of the absorbed dose at critical points. All phases of the process (from the choice of treatment conditions to provision of procedure safety) are strictly regulated by Russian legislation in accordance with international standards.

  7. Software Testing and Verification in Climate Model Development

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance of systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
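
    As a flavor of the fine-grained testing advocated above, a unit test for a small numerical kernel can pin a tabulated reference value and a qualitative property. The kernel and tolerances below are illustrative, not drawn from any particular climate model:

```python
import unittest
import numpy as np

def saturation_vapor_pressure(t_kelvin):
    """Toy kernel: Tetens-style saturation vapor pressure over water, in Pa."""
    t_c = np.asarray(t_kelvin) - 273.15
    return 610.78 * np.exp(17.27 * t_c / (t_c + 237.3))

class TestSaturationVaporPressure(unittest.TestCase):
    def test_reference_value(self):
        # ~2339 Pa at 20 C; the tolerance reflects the formula's accuracy
        self.assertAlmostEqual(saturation_vapor_pressure(293.15), 2339.0,
                               delta=5.0)

    def test_monotonic_in_temperature(self):
        t = np.linspace(230.0, 310.0, 81)
        self.assertTrue(np.all(np.diff(saturation_vapor_pressure(t)) > 0))

if __name__ == "__main__":
    unittest.main()
```

    Tests of this kind run in milliseconds and localize a defect to a single routine, in contrast to the full-simulation regression tests the passage describes.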

  8. JPRS Report, Soviet Union, World Economy and International Relations, Number 2, February 1990.

    DTIC Science & Technology

    1990-06-21

    of the simplistic line of reasoning was too strong, and the new concept was too unusual. The main reason, however, was the exceptional complexity ...many other problems which were previously considered strictly internal. This also applies to participation in the joint resolution of global problems...would give rise to the need to resolve the complex issues of the verification of the elimination itself and, possibly, of the production of fissionable

  9. METHANOGENESIS AND SULFATE REDUCTION IN CHEMOSTATS: II. MODEL DEVELOPMENT AND VERIFICATION

    EPA Science Inventory

    A comprehensive dynamic model is presented that simulates methanogenesis and sulfate reduction in a continuously stirred tank reactor (CSTR). This model incorporates the complex chemistry of anaerobic systems. A salient feature of the model is its ability to predict the effluent ...

  10. Structural Element Testing in Support of the Design of the NASA Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Wade C.; Thesken, John C.; Schleicher, Eric; Wagner, Perry; Kirsch, Michael T.

    2012-01-01

    In January 2007, the NASA Administrator and the Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). For the design and manufacturing of the CCM, the team adopted the building-block approach, in which design and manufacturing risks were mitigated through manufacturing trials and structural testing at various levels of complexity. Following NASA's Structural Design Verification Requirements, a further objective was the verification of design analysis methods and the provision of design data for critical structural features. Test articles increasing in complexity, from basic material characterization coupons through structural feature elements and large structural components to full-scale structures, were evaluated. This paper discusses only four element tests, three of which include joints and one of which includes a tapering honeycomb core detail. For each test series, specimen details, instrumentation, test results, a brief analysis description, test-analysis correlation, and conclusions are included.

  11. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust, enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking in searching for solutions to planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.

  12. Towards Verification of Operational Procedures Using Auto-Generated Diagnostic Trees

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Lutz, Robyn; Patterson-Hine, Ann

    2009-01-01

    The design, development, and operation of complex space, lunar, and planetary exploration systems require the development of general procedures that capture, in a detailed set of instructions, how mission tasks are performed. For both crewed and uncrewed NASA systems, mission safety and the accomplishment of scientific mission objectives are highly dependent on the correctness of procedures. In this paper, we describe how to use diagnostic trees auto-generated from existing diagnostic models to improve the verification of standard operating procedures. Specifically, we introduce a systematic method, the Diagnostic Tree for Verification (DTV), developed with the goal of leveraging the information contained within auto-generated diagnostic trees in order to check the correctness of procedures, to streamline procedures by reducing the number of steps or the resources they use, and to propose alternative procedural steps adaptive to changing operational conditions. The application of the DTV method to a spacecraft electrical power system shows the feasibility of the approach and its range of capabilities.

  13. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline: software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk-based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  14. Beyond the Caster Semenya controversy: the case of the use of genetics for gender testing in sport.

    PubMed

    Wonkam, Ambroise; Fieggen, Karen; Ramesar, Raj

    2010-12-01

    Caster Semenya won the eight-hundred-meter title at the Berlin World Athletics Championships in 2009. A few hours later, Caster was at the center of a harsh contestation of her gender. The International Association of Athletics Federations started an investigation that was not respectful of her privacy. Caster's case highlights the need to improve awareness of genetic counseling principles among professionals, the public, and various stakeholders. We critically examine the historical steps of gender verification in the Olympics and the violation of genetic counseling principles in Caster's case, and outline some reflections on the complexity of the genetics of disorders of sex development (DSD). Variability in both genotypes and phenotypes in DSD may not allow, at this point in time, any etiological or functional classification that could permit uncontroversial gender verification for fairer sport participation. We strongly suggest revisiting the pertinence of gender verification, and the process whereby this is done.

  15. Safeguardability of the vitrification option for disposal of plutonium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillay, K.K.S.

    1996-05-01

    Safeguardability of the vitrification option for plutonium disposition is rather complex, and there is no experience base in either domestic or international safeguards for this approach. In the present treaty regime between the US and the states of the former Soviet Union, bilateral verifications are considered more likely, with the potential for third-party verification of safeguards. There are serious technological limitations to applying conventional bulk-handling-facility safeguards techniques to achieve independent verification of plutonium in borosilicate glass. If vitrification is the final disposition option chosen, maintaining continuity of knowledge of plutonium in glass matrices, especially those containing boron and those spiked with high-level wastes or ¹³⁷Cs, is beyond the capability of present-day safeguards technologies and nondestructive assay techniques. The alternative to quantitative measurement of fissile content is to maintain continuity of knowledge through a combination of containment and surveillance, which is not the international norm for bulk handling facilities.

  16. Formal verification of a microcoded VIPER microprocessor using HOL

    NASA Technical Reports Server (NTRS)

    Levitt, Karl; Arora, Tejkumar; Leung, Tony; Kalvala, Sara; Schubert, E. Thomas; Windley, Philip; Heckman, Mark; Cohen, Gerald C.

    1993-01-01

    The Royal Signals and Radar Establishment (RSRE) and members of the Hardware Verification Group at Cambridge University conducted a joint effort to prove the correspondence between the electronic block model and the top level specification of Viper. Unfortunately, the proof became too complex and unmanageable within the given time and funding constraints, and is thus incomplete as of the date of this report. This report describes an independent attempt to use the HOL (Cambridge Higher Order Logic) mechanical verifier to verify Viper. Deriving from recent results in hardware verification research at UC Davis, the approach has been to redesign the electronic block model to make it microcoded and to structure the proof in a series of decreasingly abstract interpreter levels, the lowest being the electronic block level. The highest level is the RSRE Viper instruction set. Owing to the new approach and some results on the proof of generic interpreters as applied to simple microprocessors, this attempt required an effort approximately an order of magnitude less than the previous one.

  17. The space shuttle launch vehicle aerodynamic verification challenges

    NASA Technical Reports Server (NTRS)

    Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.

    1985-01-01

    The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance by flight measurements. Historically, launch vehicle flight test programs that faced these same challenges were unmanned, instrumented flights of aerodynamically simple vehicles. The manned SSV flight test program, however, made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRB). The analyses of flight data did not verify the preflight aerodynamic or performance predictions for the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community was also challenged to understand the discrepancy between the wind tunnel and flight-derived aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analysis challenges that led to the SSV system performance verification, and that will lead to the verification of the operational ascent aerodynamics database, are presented.

  18. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study explores the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative, and community verification. Results show that the verification processes are complex, costly, and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also raise the question of whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

  19. 4D cone beam CT phase sorting using high frequency optical surface measurement during image guided radiotherapy

    NASA Astrophysics Data System (ADS)

    Price, G. J.; Marchant, T. E.; Parkhurst, J. M.; Sharrock, P. J.; Whitfield, G. A.; Moore, C. J.

    2011-03-01

    In image guided radiotherapy (IGRT) two of the most promising recent developments are four dimensional cone beam CT (4D CBCT) and dynamic optical metrology of patient surfaces. 4D CBCT is now becoming commercially available and finds use in treatment planning and verification, and whilst optical monitoring is a young technology, its ability to measure during treatment delivery without dose consequences has led to its uptake in many institutes. In this paper, we demonstrate the use of dynamic patient surfaces, simultaneously captured during CBCT acquisition using an optical sensor, to phase sort projection images for 4D CBCT volume reconstruction. The dual modality approach we describe means that in addition to 4D volumetric data, the system provides correlated wide field measurements of the patient's skin surface with high spatial and temporal resolution. As well as the value of such complementary data in verification and motion analysis studies, it introduces flexibility into the acquisition of the signal required for phase sorting. The specific technique used may be varied according to individual patient circumstances and the imaging target. We give details of three different methods of obtaining a suitable signal from the optical surfaces: simply following the motion of triangulation spots used to calibrate the surfaces' absolute height; monitoring the surface height in a single, arbitrarily selected, camera pixel; and tracking, in three dimensions, the movement of a surface feature. In addition to describing the system and methodology, we present initial results from a case study oesophageal cancer patient.
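
    A minimal sketch of the retrospective phase-sorting step this record describes, assuming a one-dimensional respiratory surrogate (for example, the surface height in a single camera pixel) sampled alongside the projection acquisition times; the function name, peak-detection settings, and ten-bin choice below are illustrative, not the authors' implementation.

```python
# Hedged sketch: phase-sort CBCT projections from a 1-D respiratory surrogate.
import numpy as np
from scipy.signal import find_peaks

def phase_sort(signal, t_signal, t_proj, n_bins=10):
    """Assign each projection (acquired at times t_proj) a breathing-phase bin."""
    # 1. Locate end-inhale peaks in the surrogate trace.
    peaks, _ = find_peaks(signal, distance=20)
    t_peaks = t_signal[peaks]
    # 2. Linear phase: 0 at one peak, approaching 1 just before the next.
    phase = np.interp(t_proj, t_peaks, np.arange(len(t_peaks))) % 1.0
    # 3. Bin projections by phase for per-phase volume reconstruction.
    return np.minimum((phase * n_bins).astype(int), n_bins - 1)

# Example: 30 s surrogate trace at 25 Hz, projections at ~5.5 Hz.
t_sig = np.arange(0, 30, 0.04)
sig = np.sin(2 * np.pi * t_sig / 4.0)      # 4 s breathing period
t_proj = np.arange(0, 30, 0.18)
bins = phase_sort(sig, t_sig, t_proj)
print(np.bincount(bins))                    # projections per phase bin
```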

  20. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  1. Continuous-variable quantum homomorphic signature

    NASA Astrophysics Data System (ADS)

    Li, Ke; Shang, Tao; Liu, Jian-wei

    2017-10-01

    Quantum cryptography is believed to be unconditionally secure because its security is ensured by physical laws rather than computational complexity. According to their spectral characteristics, quantum information can be classified into two categories, namely discrete variables and continuous variables. Continuous-variable quantum protocols have gained much attention for their ability to transmit more information at lower cost. To verify the identities of different data sources in a quantum network, we propose a continuous-variable quantum homomorphic signature scheme. It is based on continuous-variable entanglement swapping and provides additive and subtractive homomorphism. Security analysis shows the proposed scheme is secure against replay, forgery and repudiation. Even under nonideal conditions, it supports effective verification within a certain verification threshold.

  2. Verification Failures: What to Do When Things Go Wrong

    NASA Astrophysics Data System (ADS)

    Bertacco, Valeria

    Every integrated circuit is released with latent bugs. The damage and risk implied by an escaped bug range from almost imperceptible to potentially tragic; unfortunately, it is impossible to discern where in this range a bug falls before it has been exposed and analyzed. While the past few decades have witnessed significant efforts to improve verification methodology for hardware systems, these efforts have been far outstripped by the massive complexity of modern digital designs, leading to product releases in which an ever smaller fraction of the system's states has been verified. News of escaped bugs in large-market designs and/or safety-critical domains is alarming because of the safety and cost implications (due to replacements, lawsuits, etc.).

  3. An Update on the Role of Systems Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Muheim, Danniella; Menzel, Michael; Mosier, Gary; Irish, Sandra; Maghami, Peiman; Mehalick, Kimberly; Parrish, Keith

    2010-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2014. System-level verification of critical performance requirements will rely on integrated observatory models that predict the wavefront error accurately enough to verify that the allocated top-level wavefront error of 150 nm root-mean-square (rms) through to the wavefront sensor focal plane is met. The assembled models themselves are complex and require the insight of technical experts to assess their ability to meet their objectives. This paper describes the systems engineering and modeling approach used on JWST through the detailed design phase.

  4. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, verification of the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSNs' requirements, structure and behaviour. It then translates the SysML elements to an analytic model, specifically a Deterministic Stochastic Petri Net. The proposed approach makes it possible to design WSNs and to study their behaviour and energy performance.
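
    As a rough illustration of the kind of analytic model this methodology targets, the sketch below simulates a toy deterministic and stochastic Petri net for a single WSN node; the places, transitions, rates, and the simple race-based firing policy are all assumptions made for illustration, not the paper's translation rules.

```python
# Toy DSPN token game for one WSN node (idle -> sensing -> transmitting).
import random

marking = {"idle": 1, "sensing": 0, "transmitting": 0}

# (name, input place, output place, kind, parameter)
transitions = [
    ("start_sense", "idle", "sensing", "exp", 0.5),   # exponential, rate 0.5/s
    ("send", "sensing", "transmitting", "det", 0.1),  # deterministic, 0.1 s
    ("done", "transmitting", "idle", "exp", 2.0),     # exponential, rate 2.0/s
]

def sample_delay(kind, p):
    return random.expovariate(p) if kind == "exp" else p

t, horizon, tx_time = 0.0, 1000.0, 0.0
while t < horizon:
    enabled = [tr for tr in transitions if marking[tr[1]] > 0]
    if not enabled:
        break
    # Race policy: sample one delay per enabled transition; the shortest fires.
    delays = [(sample_delay(k, p), (src, dst)) for _, src, dst, k, p in enabled]
    d, (src, dst) = min(delays, key=lambda e: e[0])
    if marking["transmitting"] > 0:
        tx_time += d        # time the radio is on: a crude proxy for energy use
    t += d
    marking[src] -= 1
    marking[dst] += 1

print(f"fraction of time transmitting ~ {tx_time / t:.3f}")
```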

  5. Assessment of Potential Location of High Arsenic Contamination Using Fuzzy Overlay and Spatial Anisotropy Approach in Iron Mine Surrounding Area

    PubMed Central

    Wirojanagud, Wanpen; Srisatit, Thares

    2014-01-01

    Fuzzy overlay of three raster maps (land slope, soil type, and distance to stream) can be used to identify the locations with the highest potential for arsenic contamination in soils. Verification of high arsenic contamination was carried out by collecting samples, analysing their arsenic content, and interpolating a surface with the spatial anisotropic method. A total of 51 soil samples were collected at the potentially contaminated locations identified by the fuzzy overlay approach. At each location, soil samples were taken at a depth of 0.00-1.00 m below ground level. The interpolated surface of the analysed arsenic content, produced with the spatial anisotropic method, was used to verify the potentially contaminated locations obtained from the fuzzy overlay outputs. The outputs of the spatial anisotropic surface and the fuzzy overlay mapping showed significant spatial conformity. Three contaminated areas with arsenic concentrations of 7.19 ± 2.86, 6.60 ± 3.04, and 4.90 ± 2.67 mg/kg exceeded 3.9 mg/kg, the maximum concentration level (MCL) for agricultural soils designated by the Office of the National Environment Board of Thailand. It is concluded that fuzzy overlay mapping can be employed to identify potentially contaminated areas, with verification by the surface anisotropic approach, including intensive sampling and analysis of the substances of interest. PMID:25110751
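
    The fuzzy overlay step can be pictured with a short sketch; the membership break-points and the fuzzy AND (minimum) combination below are illustrative assumptions, not the paper's calibrated functions.

```python
# Illustrative fuzzy overlay of three raster criteria on synthetic grids.
import numpy as np

rng = np.random.default_rng(0)
slope = rng.uniform(0, 30, (100, 100))           # land slope, degrees
dist_stream = rng.uniform(0, 2000, (100, 100))   # distance to stream, m

def fuzzy_decreasing(x, lo, hi):
    """Membership 1 below lo, 0 above hi, linear in between."""
    return np.clip((hi - x) / (hi - lo), 0.0, 1.0)

m_slope = fuzzy_decreasing(slope, 2.0, 15.0)            # flatter -> more likely
m_stream = fuzzy_decreasing(dist_stream, 100.0, 1000.0) # nearer -> more likely
m_soil = rng.choice([0.3, 0.7, 1.0], (100, 100))        # per-soil-type membership

# Fuzzy AND overlay: a cell is a candidate only if all criteria agree.
potential = np.minimum.reduce([m_slope, m_stream, m_soil])
candidates = np.argwhere(potential > 0.8)   # candidate sampling locations
print(len(candidates), "candidate sampling cells")
```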

  6. Analysis of particulate contamination on tape lift samples from the VETA optical surfaces

    NASA Technical Reports Server (NTRS)

    Germani, Mark S.

    1992-01-01

    Particulate contamination analysis was carried out on samples taken from the Verification Engineering Test Article (VETA) x-ray detection system. A total of eighteen tape lift samples were taken from the VETA optical surfaces. The samples were first examined using a scanning electron microscope, and particle composition was then determined by energy-dispersive x-ray spectrometry. Results are presented in terms of particle loading per sample.

  7. Verification of satellite radar remote sensing based estimates of boreal and subalpine growing seasons using an ecosystem process model and surface biophysical measurement network information

    NASA Technical Reports Server (NTRS)

    McDonald, K. C.; Kimball, J. S.; Zimmerman, R.

    2002-01-01

    We employ daily surface radar backscatter data from the SeaWinds Ku-band scatterometer on board QuikSCAT to estimate landscape freeze-thaw state and the associated length of the seasonal non-frozen period, as a surrogate for determining the annual growing season across boreal and subalpine regions of North America for 2000 and 2001.

  8. Investigation of optical/infrared sensor techniques for application satellites

    NASA Technical Reports Server (NTRS)

    Kaufman, I.

    1972-01-01

    A method of scanning an optical sensor array by acoustic surface waves is discussed. Data cover a detailed computer-based analysis of the operation of a multielement acoustic surface-wave-scanned optical sensor; the development of design and operating techniques used to show the feasibility of an integrated array and to design several such arrays; and experimental verification of a number of the calculations with discrete sensor devices.

  9. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent, and produced comparable solutions in temperature and velocity, supporting proper implementation of the new capability.
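
    The mesh refinement study mentioned here rests on a standard code-verification calculation: the observed order of accuracy estimated from solutions on successively refined grids. A minimal sketch, with made-up values in place of Aria output:

```python
# Observed order of accuracy from three uniformly refined grids
# (Richardson-style estimate, refinement ratio r).
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Estimate the convergence order p from three grid solutions."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Example: outlet temperatures from three pipe-flow grids (made-up numbers).
p = observed_order(351.20, 350.42, 350.22)
print(f"observed order of accuracy ~ {p:.2f}")  # ~2 for a 2nd-order scheme
```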

  10. Laparoscopic management of a large ovarian cyst in the neonate.

    PubMed

    Mahomed, A; Jibril, A; Youngson, G

    1998-10-01

    Laparotomy has become the preferred approach to the excision of large, complex abdominal cysts in the neonate. We describe a laparoscopic-assisted decapsulation of an antenatally diagnosed abdominal cyst that was noted on postnatal ultrasound scan to have a complex echo pattern. This limited procedure allows for accurate verification of the diagnosis, institution of appropriate therapy, and organ salvage. It represents a superior management option that obviates the significant complications associated with conservative management.

  11. DEVELOPMENT AND VERIFICATION OF A SCREENING MODEL FOR SURFACE SPREADING OF PETROLEUM

    EPA Science Inventory

    Overflows and leakage from aboveground storage tanks and pipelines carrying crude oil and petroleum products occur frequently. The spilled hydrocarbons pose environmental threats by contaminating the surrounding soil and the underlying ground water. Predicting the fate and transp...

  12. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    NASA Technical Reports Server (NTRS)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.
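
    One ingredient the abstract describes, fitting phenomenological contributions to a measured surface figure error, reduces in its simplest form to a linear least-squares fit of basis shapes; the sketch below uses random placeholder shapes and invented coefficients rather than real FEM outputs or the package's actual optimizers.

```python
# Hedged sketch: decompose a measured surface figure error into
# phenomenological contributions by least squares.
import numpy as np

rng = np.random.default_rng(1)
n_pix = 5000                     # pixels in the surface map
n_shapes = 4                     # phenomenological basis shapes from FEM

A = rng.normal(size=(n_pix, n_shapes))           # basis: one column per phenomenon
true_coeffs = np.array([12.0, -3.0, 0.5, 7.0])   # nm of each contribution (invented)
measured = A @ true_coeffs + rng.normal(scale=2.0, size=n_pix)  # + sensor noise

coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
residual = measured - A @ coeffs
print("fitted contributions [nm]:", np.round(coeffs, 2))
print(f"unexplained figure error (rms): {residual.std():.2f} nm")
```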

  13. Precipitation Discrimination from Satellite Infrared Temperatures over the CCOPE Mesonet Region.

    NASA Astrophysics Data System (ADS)

    Weiss, Mitchell; Smith, Eric A.

    1987-06-01

    A quantitative investigation of the relationship between satellite-derived cloud-top temperature parameters and the detection of intense convective rainfall is described. The area of study is that of the Cooperative Convective Precipitation Experiment (CCOPE), which was held near Miles City, Montana during the summer of 1981. Cloud-top temperatures, derived from the GOES-West operational satellite, were used to calculate a variety of parameters for objectively quantifying the convective intensity of a storm. A dense network of rain gauges provided verification of surface rainfall. The cloud-top temperature field and surface rainfall data were processed into equally sized grid domains in order to best depict individual samples of instantaneous precipitation. The technique of statistical discriminant analysis was used to determine which combinations of cloud-top temperature parameters best classify rain versus no-rain occurrence using three different rain-rate cutoffs: 1, 4, and 10 mm h^-1. Time lags within the 30 min rainfall verification were tested to determine the optimum time delay associated with rainfall reaching the ground. A total of six storm cases were used to develop and test the statistical models. Discrimination of rain events was found to be most accurate when using the 10 mm h^-1 rain-rate cutoff. The parameters designated as the coldest cloud-top temperature, the spatial mean of the coldest cloud-top temperature, and the change over time of the mean coldest cloud-top temperature were found to be the best classifiers of rainfall in this study. Combining a 10-min time lag (in terms of surface verification) with the 10 mm h^-1 rain-rate threshold resulted in classifying over 60% of all rain and no-rain cases correctly.
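
    The discriminant-analysis step can be sketched compactly; the synthetic features below only mimic the study's cloud-top temperature parameters and are not the CCOPE data.

```python
# Illustrative two-class discriminant analysis: rain vs. no-rain grid cells
# from cloud-top temperature features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n = 400
# Features: coldest cloud-top T (K), spatial mean of coldest T (K), dT/dt (K/step)
rain = np.column_stack([rng.normal(210, 8, n), rng.normal(220, 8, n),
                        rng.normal(-3, 2, n)])
no_rain = np.column_stack([rng.normal(235, 10, n), rng.normal(245, 10, n),
                           rng.normal(0, 2, n)])
X = np.vstack([rain, no_rain])
y = np.array([1] * n + [0] * n)   # 1 = rain rate above the chosen cutoff

lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"training accuracy: {lda.score(X, y):.2f}")
```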

  14. Pore-scale modeling of moving contact line problems in immiscible two-phase flow.

    NASA Astrophysics Data System (ADS)

    Kucala, A.; Noble, D.; Martinez, M. J.

    2016-12-01

    Two immiscible fluids in static equilibrium form a common interface along a solid surface, characterized by the static contact (wetting) angle, which is a function of surface geometry, intermolecular forces, and interfacial surface energies manifested as interfacial tension. This static configuration may become perturbed by external force imbalances (mass injection, pressure gradients, buoyancy, etc.), making the contact line location and interface curvature dynamic. Accurate modeling of moving contact line (MCL) problems is imperative in predicting capillary pressure vs. saturation curves, permeability, and preferential flow paths for a variety of applications, including geological carbon storage (GCS) and enhanced oil recovery (EOR). Here, we present a model for the moving contact line using pore-scale computational fluid dynamics (CFD) which solves the full, time-dependent Navier-Stokes equations using the Galerkin finite-element method. The MCL is modeled as a surface traction force proportional to the surface tension, dependent on the static properties of the immiscible fluid/solid system. The moving two-phase interface is tracked using the level set method and discretized with the conformal decomposition finite element method (CDFEM), allowing surface tension effects to be computed at the exact interface location. We present a variety of verification test cases for simple two- and three-dimensional geometries to validate the current model, including threshold pressure predictions in flows through pore-throats for a variety of wetting angles. Simulations involving more complex geometries are also presented, to be used in future simulations for GCS and EOR problems. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000
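
    The threshold-pressure check mentioned above has a closed-form reference in the simplest geometry: the Young-Laplace pressure, ΔP = 2γ cos(θ)/r, for a cylindrical pore throat. A small sketch with generic water/air values, not the paper's test cases:

```python
# Young-Laplace threshold capillary pressure for a cylindrical pore throat,
# evaluated at several wetting angles.
import math

gamma = 0.072          # interfacial tension, N/m (generic water/air)
r = 10e-6              # throat radius, m

for theta_deg in (0, 30, 60, 80):
    dp = 2 * gamma * math.cos(math.radians(theta_deg)) / r
    print(f"theta = {theta_deg:2d} deg -> threshold dP = {dp / 1000:.1f} kPa")
```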

  15. Field Study for Remote Sensing: An instructor's manual

    NASA Technical Reports Server (NTRS)

    Wake, W. H. (Editor); Hull, G. A. (Editor)

    1981-01-01

    The need for and value of field work (surface truthing) in the verification of image identification from high altitude infrared and multispectral space sensor images are discussed in this handbook, which presents guidelines for developing instructional and research procedures in remote sensing of the environment.

  16. Investigation of Cleanliness Verification Techniques for Rocket Engine Hardware

    NASA Technical Reports Server (NTRS)

    Fritzemeier, Marilyn L.; Skowronski, Raymund P.

    1994-01-01

    Oxidizer propellant systems for liquid-fueled rocket engines must meet stringent cleanliness requirements for particulate and nonvolatile residue. These requirements were established to limit residual contaminants which could block small orifices or ignite in the oxidizer system during engine operation. Limiting organic residues in high pressure oxygen systems, such as in the Space Shuttle Main Engine (SSME), is particularly important. The current method of cleanliness verification for the SSME uses an organic solvent flush of the critical hardware surfaces. The solvent is filtered and analyzed for particulate matter, followed by gravimetric determination of the nonvolatile residue (NVR) content of the filtered solvent. The organic solvents currently specified for use (1,1,1-trichloroethane and CFC-113) are ozone-depleting chemicals slated for elimination by December 1995. A test program is in progress to evaluate alternative methods for cleanliness verification that do not require the use of ozone-depleting chemicals and that minimize or eliminate the use of solvents regulated as hazardous air pollutants or smog precursors. Initial results from the laboratory test program to evaluate aqueous-based methods and organic solvent flush methods for NVR verification are provided and compared with results obtained using the current method. Evaluation of the alternative methods was conducted using a range of contaminants encountered in the manufacture of rocket engine hardware.

  17. An overview of the NSCAT/N-ROSS program

    NASA Technical Reports Server (NTRS)

    Martin, B. D.; Freilich, Michael H.; Li, F. K.; Callahan, Phillip S.

    1986-01-01

    The NASA Scatterometer (NSCAT), to fly on the U.S. Navy Remote Ocean Sensing System (N-ROSS) mission, is presented. The overall N-ROSS mission, the NSCAT flight instrument and ground-based data processing/distribution system, and NASA-supported science and verification activities are described. The N-ROSS system is designed to provide measurements of near-surface wind, ocean topography, wave height, sea-surface temperature, and atmospheric water content over the global oceans. NSCAT is an improved version of the Seasat scatterometer. It will measure near-surface vector winds.

  18. Experimental verification of ‘waveguide’ plasmonics

    NASA Astrophysics Data System (ADS)

    Prudêncio, Filipa R.; Costa, Jorge R.; Fernandes, Carlos A.; Engheta, Nader; Silveirinha, Mário G.

    2017-12-01

    Surface plasmon polaritons are collective excitations of an electron gas that occur at an interface between negative-ɛ and positive-ɛ media. Here, we report the experimental observation of such surface waves using simple waveguide metamaterials filled only with readily available positive-ɛ media at microwave frequencies. In contrast to optical designs, in our setup the propagation length of the surface plasmons can be rather long, because low-loss conventional dielectrics are chosen to avoid the typical losses of negative-ɛ media. Plasmonic phenomena have potential applications in enhancing light-matter interactions, implementing nanoscale photonic circuits and integrated photonics.

  19. Verification of the ideal magnetohydrodynamic response at rational surfaces in the VMEC code

    DOE PAGES

    Lazerson, Samuel A.; Loizu, Joaquim; Hirshman, Steven; ...

    2016-01-13

    The VMEC nonlinear ideal MHD equilibrium code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)] is compared against analytic linear ideal MHD theory in a screw-pinch-like configuration. The focus of the analysis is to verify the ideal MHD response at magnetic surfaces whose magnetic transform (ι) is resonant with spectral values of the perturbed boundary harmonics. A large aspect ratio, circular cross section, zero-beta equilibrium is considered. This equilibrium possesses a rational surface with safety factor q = 2 at a normalized flux value of 0.5. A small resonant boundary perturbation is introduced, exciting a response at the resonant rational surface. The code is found to capture the plasma response as predicted by a newly developed analytic theory that ensures the existence of nested flux surfaces by allowing for a jump in rotational transform (ι = 1/q). The VMEC code satisfactorily reproduces these theoretical results without the necessity of an explicit transform discontinuity (Δι) at the rational surface. It is found that the response across the rational surfaces depends upon both radial grid resolution and local shear (dι/dΦ, where ι is the rotational transform and Φ the enclosed toroidal flux). Calculations of an implicit Δι suggest that it does not arise from numerical artifacts (attributed to radial finite differences in VMEC) or from the existence conditions for flux surfaces predicted by linear theory (minimum values of Δι). Scans of the rotational transform profile indicate that for experimentally relevant levels of transform shear the response becomes increasingly localised. Furthermore, careful examination of a large experimental tokamak equilibrium with applied resonant fields indicates that this shielding response is present, suggesting the phenomenon is not limited to this verification exercise.

  20. Verification and validation of an advanced model of heat and mass transfer in the protective clothing

    NASA Astrophysics Data System (ADS)

    Łapka, Piotr; Furmański, Piotr

    2018-04-01

    The paper presents verification and validation of an advanced numerical model of heat and moisture transfer in multi-layer protective clothing and in components of the experimental stand, subjected to either a high surrounding temperature or a high radiative heat flux emitted by hot objects. The developed model included conductive-radiative heat transfer in the hygroscopic porous fabrics and air gaps, as well as conductive heat transfer in components of the stand. Additionally, water vapour diffusion in the pores and air spaces and phase transition of the bound water in the fabric fibres (sorption and desorption) were accounted for. All optical phenomena at internal or external walls were modelled, and the thermal radiation was treated in a rigorous way, i.e., semi-transparent absorbing, emitting and scattering fabrics with non-grey properties were assumed. The air was treated as transparent. Complex energy and mass balances as well as optical conditions at internal or external interfaces were formulated in order to find values of temperatures, vapour densities and radiation intensities at these interfaces. The resulting highly non-linear coupled system of discrete equations was solved by an in-house iterative finite-volume algorithm. The developed model passed discretisation convergence tests and was successfully verified against results obtained with commercial software for simplified cases. Validation was then carried out using experimental measurements collected during exposure of the protective clothing to a high radiative heat flux emitted by an IR lamp. Satisfactory agreement between the simulated and measured temporal variations of temperature at the external and internal surfaces of the multi-layer clothing was attained.
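
    The code-to-analytic verification described here can be illustrated on the simplest sub-problem: an explicit one-dimensional finite-volume conduction solver checked against the semi-infinite solid solution T = Ts + (T0 - Ts)·erf(x/(2·sqrt(α·t))). The grid, material, and boundary values below are illustrative, not those of the clothing model.

```python
# Verification-style check: explicit 1-D FV conduction vs. analytic solution.
import numpy as np
from math import erf, sqrt

alpha, L, n = 1e-7, 0.01, 200          # diffusivity (m^2/s), domain (m), cells
dx = L / n
dt = 0.4 * dx**2 / alpha               # explicit stability: dt <= 0.5 dx^2/alpha
T0, Ts = 300.0, 400.0                  # initial and hot-face temperatures (K)

T = np.full(n, T0)
t = 0.0
while t < 5.0:                         # march to t ~ 5 s
    ghost = 2.0 * Ts - T[0]            # mirror ghost cell pins T = Ts at x = 0
    lap = np.empty(n)
    lap[0] = (ghost - 2.0 * T[0] + T[1]) / dx**2
    lap[1:-1] = (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2
    lap[-1] = (T[-2] - T[-1]) / dx**2  # zero-flux far boundary
    T = T + alpha * dt * lap
    t += dt

x = (np.arange(n) + 0.5) * dx          # cell-centre coordinates
exact = np.array([Ts + (T0 - Ts) * erf(xi / (2.0 * sqrt(alpha * t))) for xi in x])
print(f"max |T - exact| = {np.abs(T - exact).max():.3f} K")
```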

  1. Development Of Metallic Thermal Protection System For The Expert Re-Entry Vehicle: Design Verification

    NASA Astrophysics Data System (ADS)

    Fatemi, Javad

    2011-05-01

    The thermal protection system of the EXPERT re-entry vehicle is subjected to accelerations, vibrations, and acoustic and shock loads during launch, and to aero-heating loads and aerodynamic forces during re-entry. To fully understand the structural and thermomechanical performance of the TPS, heat transfer, thermal stress, and thermal buckling analyses must be performed. This requires complex three-dimensional thermal and structural models of the entire TPS, including the insulation and sensors. Finite element (FE) methods are employed to assess the thermal and structural response of the TPS to the mechanical and aerothermal loads. The FE analysis results are used for the design verification and design improvement of the EXPERT thermal protection system.

  2. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably well understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems encountered and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  3. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  4. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  5. Thermal/structural design verification strategies for large space structures

    NASA Technical Reports Server (NTRS)

    Benton, David

    1988-01-01

    Requirements for space structures of increasing size, complexity, and precision have engendered a search for thermal design verification methods that do not impose unreasonable costs, that fit within the capabilities of existing facilities, and that still adequately reduce technical risk. This calls for a combination of analytical and test methods, pursued through two approaches. The first is to limit thermal testing to sub-elements of the total system, and only in a compact configuration (i.e., not fully deployed). The second is to use a simplified environment to correlate analytical models with test results; these models can then be used to predict flight performance. In practice, a combination of these approaches is needed to verify the thermal/structural design of future very large space systems.

  6. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers.

    PubMed

    Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C

    2016-01-01

    We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
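
    Of the six descriptors, the Bandt-Pompe based permutation (Shannon) entropy is the easiest to sketch; the embedding dimension and test signals below are illustrative choices, not the paper's settings.

```python
# Bandt-Pompe symbolization and normalized permutation entropy of a 1-D series.
import math
import random
from collections import Counter

def permutation_entropy(x, d=4):
    """Normalized Shannon entropy of ordinal patterns of length d."""
    patterns = Counter(
        tuple(sorted(range(d), key=lambda k: x[i + k]))   # ordinal pattern
        for i in range(len(x) - d + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(d))   # normalize to [0, 1]

# Example: a smooth pen stroke vs. white noise.
smooth = [math.sin(0.05 * i) for i in range(1000)]
noise = [random.random() for _ in range(1000)]
print(f"smooth: {permutation_entropy(smooth):.3f}  noise: {permutation_entropy(noise):.3f}")
```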

  7. Atmospheric verification mission for the TSS/STARFAC tethered satellite

    NASA Technical Reports Server (NTRS)

    Wood, George M., Jr.; Stuart, Thomas D.; Crouch, Donald S.; Deloach, Richard; Brown, Kenneth G.

    1991-01-01

    Two types of tethered satellite system (TSS) - a basic 1.8-m-diameter spherical spacecraft and the Shuttle Tethered Aerothermodynamic Research Facility (STARFAC) - are considered. Issues related to the deployment and retrieval of a large satellite with exceedingly long tethers are discussed, and the objectives of an Atmospheric Verification Mission (AVM) are outlined. Focus is concentrated on the AVM satellite, which will fly after TSS-1 and before the fully instrumented and costlier TSS-2. The differences between the AVM and TSS-2, including the configuration of the aerodynamic stabilizers, instrumentation, and the materials of construction, are outlined. The basic Kevlar tether defined for the TSS-2 is being considered for use with the AVM; however, a complex tether is under consideration as well.

  8. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, T. A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. This paper presents a procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  9. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, Thomas A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. A procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system is presented. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  10. Leak Mitigation in Mechanically Pumped Fluid Loops for Long Duration Space Missions

    NASA Technical Reports Server (NTRS)

    Miller, Jennifer R.; Birur, Gajanana; Bame, David; Mastropietro, A. J.; Bhandari, Pradeep; Lee, Darlene; Karlmann, Paul; Liu, Yuanming

    2013-01-01

    Mechanically pumped fluid loops (MPFLs) are increasingly considered for spacecraft thermal control. A concern for long-duration space missions is fluid leakage, which can lead to performance degradation or loop failure. Understanding leak rates through analysis, as well as through destructive and non-destructive testing, provides a verifiable means to quantify them. The system can then be appropriately designed to maintain safe operating pressures and temperatures throughout the mission. Two MPFLs on the Mars Science Laboratory spacecraft, launched November 26, 2011, maintain the temperature of sensitive electronics and science instruments within a -40 deg C to 50 deg C range during launch, cruise, and Mars surface operations. With over 100 meters of complex tubing, fittings, joints, flex lines, and pumps, the system must maintain a minimum pressure through all phases of the mission to provide appropriate performance. This paper describes the process of design, qualification, test, verification, and validation of the components and assemblies employed to minimize the risks associated with excessive fluid leaks from pumped fluid loop systems.

  11. Verification of the numerical model of insert-type joint of scaffolding in relation to experimental research

    NASA Astrophysics Data System (ADS)

    Pieńko, Michał; Błazik-Borowa, Ewa

    2018-01-01

    This paper addresses the problem of comparing the results of computer simulations with those of laboratory tests. The subject of the study was an insert-type scaffolding joint loaded with a bending moment. The research was carried out on real scaffolding elements. Due to the complexity of the connection, different friction coefficients and depths of wedge insertion were taken into account in the analysis. The aim of the series of analyses was to determine the sensitivity of the model to these characteristics. Since the laboratory tests were carried out on real samples, the surfaces involved in load transfer were not specially prepared. This made it difficult to define clearly how the individual joint elements behave under load. The analysis consists of two stages: a connection stage, in which the wedge is inserted into the rosette, and a loading stage, in which the node is loaded by the bending moment.

  12. Geomorphic Identification and Verification of Recent Sedimentation Patterns in the Woonasquatucket River, North Providence, Rhode Island

    DTIC Science & Technology

    2007-03-01

    Recoverable fragments only: a historical section on The Centre Cotton Manufacturing Company (1812); data collection; and a figure caption noting sedimentation surfaces derived from the CF:CS 210Pb model, with data not available for vibracores CMS-SD-4210 and CMS-SD-4213 in 1938 and 1958.

  13. Climate Verification Using Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A robust method previously used to detect observed intra- to multi-decadal (IMD) climate regimes was adapted to test whether climate models could reproduce IMD variations in U.S. surface temperatures during 1919-2008. This procedure, called the running Mann Whitney Z (MWZ) method, samples data ranki...
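
    Although the record is truncated, a running Mann-Whitney Z statistic is straightforward to sketch: slide a split point along the series and compare the ranks of the preceding and following windows. The window length and synthetic data below are assumptions, and the USDA procedure's exact windowing may differ.

```python
# Running Mann-Whitney Z over a time series to flag regime shifts.
import numpy as np
from scipy.stats import mannwhitneyu

def running_mwz(series, w=10):
    z = np.full(len(series), np.nan)
    for i in range(w, len(series) - w):
        before, after = series[i - w:i], series[i:i + w]
        u = mannwhitneyu(after, before, alternative="two-sided").statistic
        mu = w * w / 2.0                              # mean of U under H0
        sigma = np.sqrt(w * w * (2 * w + 1) / 12.0)   # std of U under H0
        z[i] = (u - mu) / sigma
    return z

rng = np.random.default_rng(3)
# Synthetic annual temperatures with a step change halfway through.
temps = np.concatenate([rng.normal(9.0, 0.5, 45), rng.normal(9.8, 0.5, 45)])
z = running_mwz(temps)
print(f"peak |Z| = {np.nanmax(np.abs(z)):.2f} at index {int(np.nanargmax(np.abs(z)))}")
```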

  14. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to implement modifications to the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details of the ATLAS Onboard Receiver Algorithms and databases development, respectively.
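
    The histogramming step the abstract describes can be sketched as follows; the event rates, bin width, and 5-sigma threshold are illustrative assumptions, not ATLAS flight parameters.

```python
# Histogram-based signal finding: flag height bins whose photon counts are
# statistically improbable under a Poisson background.
import numpy as np

rng = np.random.default_rng(4)
window = (0.0, 500.0)                         # DEM min/max search heights, m
noise = rng.uniform(*window, 2000)            # background photon heights
signal = rng.normal(210.0, 1.5, 300)          # surface echoes near 210 m
heights = np.concatenate([noise, signal])

bin_w = 5.0
edges = np.arange(window[0], window[1] + bin_w, bin_w)
counts, _ = np.histogram(heights, edges)

lam = np.median(counts)                       # robust background estimate
thresh = lam + 5.0 * np.sqrt(lam)             # ~5-sigma Poisson threshold
for b in np.flatnonzero(counts > thresh):
    print(f"signal bin: {edges[b]:.0f}-{edges[b + 1]:.0f} m, {counts[b]} events")
```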

  15. Simulating CO2 Leakage and Seepage From Geologic Carbon Sequestration Sites: Implications for Near-Surface Monitoring

    NASA Astrophysics Data System (ADS)

    Oldenburg, C. M.; Lewicki, J. L.; Zhang, Y.

    2003-12-01

    The injection of CO2 into deep geologic formations for the purpose of carbon sequestration entails the risk that CO2 will leak upward from the target formation and ultimately seep out of the ground surface. We have developed a coupled subsurface and atmospheric surface layer modeling capability based on TOUGH2 to simulate CO2 leakage and seepage. Simulation results for representative subsurface and surface layer conditions are used to specify the requirements of potential near-surface monitoring strategies, relevant both to health, safety, and environmental risk assessment and to sequestration verification. The coupled model makes use of the standard multicomponent and multiphase framework of TOUGH2 and extends the model domain to include an atmospheric surface layer. In the atmospheric surface layer, we assume a logarithmic velocity profile for the time-averaged wind and make use of Pasquill-Gifford and Smagorinsky dispersion coefficients to model surface layer dispersion. Results for the unsaturated zone and surface layer show that the vadose zone pore space can become filled with pure CO2 even for small leakage fluxes, but that CO2 concentrations above the ground surface are very low due to the strong effects of dispersion caused by surface winds. Ecological processes such as plant photosynthesis and root respiration, as well as biodegradation in soils, strongly affect near-surface CO2 concentrations and fluxes. The challenge for geologic carbon sequestration verification is to discern the leakage and seepage signal from the ecological signal. Our simulations point to the importance of subsurface monitoring and the need for geochemical (e.g., isotopic) analyses to distinguish leaking injected fossil CO2 from natural ecological CO2. This work was supported by the Office of Science, U.S. Department of Energy under contract No. DE-AC03-76SF00098.
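
    The assumed logarithmic surface-layer profile is u(z) = (u*/κ)·ln(z/z0); a small sketch with generic friction velocity and roughness length, not values from the study:

```python
# Logarithmic surface-layer wind profile u(z) = (u*/kappa) * ln(z / z0).
import math

u_star = 0.35      # friction velocity, m/s (illustrative)
z0 = 0.05          # roughness length, m (illustrative)
kappa = 0.4        # von Karman constant

for z in (0.5, 2.0, 10.0, 50.0):
    u = (u_star / kappa) * math.log(z / z0)
    print(f"z = {z:5.1f} m -> u = {u:4.1f} m/s")
```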

  16. Hand Grasping Synergies As Biometrics.

    PubMed

    Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana

    2017-01-01

    Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. These include iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement, which integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies take the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies: postural synergies. In this proof-of-concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
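
    The synergy-extraction step is, at its core, PCA over trial-by-sample matrices of joint angular velocities; the sketch below uses synthetic stand-in data with two planted synergies rather than the study's dataset.

```python
# Extract kinematic synergies by PCA: rows are grasp trials (flattened joint
# angular-velocity profiles for 10 joints), columns are time-joint samples.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
n_trials, n_joints, n_time = 120, 10, 50

# Two latent synergies mixed with trial-specific weights, plus noise.
synergies = rng.normal(size=(2, n_joints * n_time))
weights = rng.normal(size=(n_trials, 2))
X = weights @ synergies + 0.1 * rng.normal(size=(n_trials, n_joints * n_time))

pca = PCA(n_components=5).fit(X)
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
# Per-trial weights (pca.transform(X)) are candidate biometric features.
```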

  17. Design for Verification: Using Design Patterns to Build Reliable Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)

    2003-01-01

    Components so far have been mainly used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of a more rigid testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles leading to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.

  18. Verification of cardiac mechanics software: benchmark problems and solutions for testing active and passive material behaviour.

    PubMed

    Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A

    2015-12-08

    Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry, anisotropic and spatially varying material properties similar to those seen in the left ventricle, and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions of approximately 0.5%, and consistent results between linear, quadratic and cubic finite elements as well as between different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be used effectively in the verification of future cardiac mechanics software.

  19. Case-Study of the High School Student's Family Values Formation

    ERIC Educational Resources Information Center

    Valeeva, Roza A.; Korolyeva, Natalya E.; Sakhapova, Farida Kh.

    2016-01-01

    The aim of the research is the theoretical justification and experimental verification of content, complex forms and methods to ensure effective development of the high school students' family values formation. 93 lyceum students from Kazan took part in the experiment. To study students' family values we have applied method of studying personality…

  20. Complex VLSI Feature Comparison for Commercial Microelectronics Verification

    DTIC Science & Technology

    2014-03-27

    Recoverable fragments only: Volume is a significant factor in constraining the technology limit for defense circuits; respondents surveyed in a 2010 Department of Commerce report found counterfeit chips difficult to identify due to improved fabrication quality in overseas counterfeits.

  1. PUNCHED CARD SYSTEM NEEDN'T BE COMPLEX TO GIVE COMPLETE CONTROL.

    ERIC Educational Resources Information Center

    BEMIS, HAZEL T.

    AT WORCESTER JUNIOR COLLEGE, MASSACHUSETTS, USE OF A MANUALLY OPERATED PUNCHED CARD SYSTEM HAS RESULTED IN (1) SIMPLIFIED REGISTRATION PROCEDURES, (2) QUICK ANALYSIS OF CONFLICTS AND PROBLEMS IN CLASS SCHEDULING, (3) READY ACCESS TO STATISTICAL INFORMATION, (4) DIRECTORY INFORMATION IN A WIDE RANGE OF CLASSIFICATIONS, (5) EASY VERIFICATION OF…

  2. [Implication of inverse-probability weighting method in the evaluation of diagnostic test with verification bias].

    PubMed

    Kang, Leni; Zhang, Shaokai; Zhao, Fanghui; Qiao, Youlin

    2014-03-01

    The objective was to evaluate and adjust for the verification bias present in screening or diagnostic tests. The inverse-probability weighting method was used to adjust the sensitivity and specificity of diagnostic tests, with a cervical cancer screening example used to introduce the Compare Tests package in R software, with which the adjustment can be implemented. Sensitivity and specificity calculated by the traditional method and by the maximum likelihood estimation method were compared to the results from the inverse-probability weighting method in the randomly sampled example. The true sensitivity and specificity of the HPV self-sampling test were 83.53% (95%CI: 74.23-89.93) and 85.86% (95%CI: 84.23-87.36). In the analysis of data with randomly missing gold-standard verification, the sensitivity and specificity calculated by the traditional method were 90.48% (95%CI: 80.74-95.56) and 71.96% (95%CI: 68.71-75.00), respectively. The adjusted sensitivity and specificity under the inverse-probability weighting method were 82.25% (95%CI: 63.11-92.62) and 85.80% (95%CI: 85.09-86.47), respectively, whereas they were 80.13% (95%CI: 66.81-93.46) and 85.80% (95%CI: 84.20-87.41) under the maximum likelihood estimation method. The inverse-probability weighting method can effectively adjust the sensitivity and specificity of a diagnostic test when verification bias exists, especially under complex sampling.
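
    The inverse-probability weighting adjustment itself is simple to sketch: each verified subject is weighted by the reciprocal of its verification probability given the test result, so frequently verified test-positives are weighted back down. The counts and verification probabilities below are invented for illustration (the abstract's analysis used R).

```python
# Numeric sketch of IPW-adjusted sensitivity and specificity under
# verification bias: weight each verified subject by 1 / Pr(verified | test).
# Illustrative verification probabilities: 90% of test-positives, 30% of
# test-negatives receive the gold standard.
p_ver = {1: 0.9, 0: 0.3}
verified = [  # (test result, disease status by gold standard, count)
    (1, 1, 162), (1, 0, 90),     # verified test-positives
    (0, 1, 12),  (0, 0, 246),    # verified test-negatives
]

tp = fn = fp = tn = 0.0
for test, disease, count in verified:
    w = count / p_ver[test]          # weight counts back up to the full cohort
    if disease:
        tp, fn = (tp + w, fn) if test else (tp, fn + w)
    else:
        fp, tn = (fp + w, tn) if test else (fp, tn + w)

print(f"IPW sensitivity = {tp / (tp + fn):.3f}")   # 0.818 with these counts
print(f"IPW specificity = {tn / (tn + fp):.3f}")   # 0.891 with these counts
```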

  3. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification.

    PubMed

    Palmer, Antony L; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H

    2015-11-21

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiotherapy (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200-2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS-plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison.

  5. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of "Going to Zero". Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geopolitical security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking past New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper focuses on these three threshold reduction levels: 1000, hundreds, and tens of warheads. For each, the issues and challenges are discussed, potential solutions are identified, and the verification technologies and chain-of-custody measures that address these solutions are surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper explores new or novel technologies that could be applied. These technologies draw from research and development ongoing throughout the national laboratory complex, and from technologies utilized in other areas of industry, for their application to arms control verification.

  6. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  7. Characterization of a novel two dimensional diode array the "magic plate" as a radiation detector for radiation therapy treatment.

    PubMed

    Wong, J H D; Fuduli, I; Carolan, M; Petasecca, M; Lerch, M L F; Perevertaylo, V L; Metcalfe, P; Rosenfeld, A B

    2012-05-01

    Intensity modulated radiation therapy (IMRT) utilizes the technology of multileaf collimators to deliver highly modulated and complex radiation treatment. Dosimetric verification of the IMRT treatment requires the verification of the delivered dose distribution. Two-dimensional ion chamber or diode arrays are gaining popularity as dosimeters of choice due to their real-time feedback compared to film dosimetry. This paper describes the characterization of a novel 2D diode array, which has been named the "magic plate" (MP). It was designed to function as a 2D transmission detector as well as a planar detector for dose distribution measurements in a solid water phantom for the dosimetric verification of IMRT treatment delivery. The prototype MP is an 11 × 11 detector array based on thin (50 μm) epitaxial diode technology mounted on a 0.6 mm thick Kapton substrate using a proprietary "drop-in" technology developed by the Centre for Medical Radiation Physics, University of Wollongong. A full characterization of the detector was performed, including a radiation damage study, dose-per-pulse effect, percent-depth-dose comparison with a CC13 ion chamber, build-up characteristics compared with parallel-plate ion chamber measurements, dose linearity, energy response, and angular response. Post-irradiation, the magic plate diodes showed a reproducibility of 2.1%. The MP dose-per-pulse response decreased at higher dose rates, while at lower dose rates the MP appeared to be dose-rate independent. The depth dose measurement of the MP agrees with ion chamber depth dose measurements to within 0.7%, while dose linearity was excellent. The MP showed an angular response dependence due to the anisotropy of the silicon diode, with a maximum variation in angular response of 10.8% at a gantry angle of 180°. Angular dependence was within 3.5% for gantry angles within ±75°. The field size dependence of the MP at isocenter agrees with ion chamber measurement to within 1.1%. In the beam perturbation study, the surface dose increased by 12.1% for a 30 × 30 cm² field size at a source-to-detector distance (SDD) of 80 cm, whilst the transmission of the MP was 99%. The radiation response of the magic plate was successfully characterized. The array of epitaxial silicon based detectors with "drop-in" packaging showed properties suitable for use as a simplified multipurpose and nonperturbing 2D radiation detector for radiation therapy dosimetric verification.

  8. Comparison of Orbiter STS-2 development flight instrumentation data with thermal math model predictions

    NASA Technical Reports Server (NTRS)

    Norman, I.; Rochelle, W. C.; Kimbrough, B. S.; Ritrivi, C. A.; Ting, P. C.; Dotts, R. L.

    1982-01-01

    Thermal performance verification of Reusable Surface Insulation (RSI) has been accomplished by comparisons of STS-2 Orbiter Flight Test (OFT) data with Thermal Math Model (TMM) predictions. The OFT data were obtained from Development Flight Instrumentation RSI plug and gap thermocouples. Quarter-tile RSI TMMs were developed using measured flight data for surface temperature and pressure environments. Reference surface heating rates, derived from surface temperature data, were multiplied by gap heating ratios to obtain tile sidewall heating rates. This TMM analysis resulted in good agreement of predicted temperatures with flight data for thermocouples located in the RSI, Strain Isolation Pad, filler bar, and structure.

  9. Measurement of Surface Interfacial Tension as a Function of Temperature Using Pendant Drop Images

    NASA Astrophysics Data System (ADS)

    Yakhshi-Tafti, Ehsan; Kumar, Ranganathan; Cho, Hyoung J.

    2011-10-01

    Accurate and reliable measurements of surface tension at the interface of immiscible phases are crucial to understanding the various physico-chemical reactions taking place between them. Based on the pendant drop method, an optical (graphical)-numerical procedure was developed to determine surface tension and its dependence on the surrounding temperature. For modeling and experimental verification, chemically inert and thermally stable perfluorocarbon (PFC) oil and water were used. Starting with a geometrical force balance, governing equations were derived to provide non-dimensional parameters which were later used to extract values for surface tension. A comparative study verified the accuracy and reliability of the proposed method.
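
    In the standard pendant-drop treatment, the governing equations mentioned above are the axisymmetric Young-Laplace equations. A sketch of how a theoretical profile can be generated for a trial surface tension and compared against the imaged drop follows; the arc-length formulation is the classical Bashforth-Adams one, while the function name and parameter choices are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pendant_profile(gamma, b, drho, g=9.81, s_max=0.03):
    """Theoretical pendant-drop profile from the axisymmetric Young-Laplace
    equation in arc-length form (Bashforth-Adams):
        dx/ds = cos(phi),  dz/ds = sin(phi),
        dphi/ds = 2/b + (drho*g/gamma)*z - sin(phi)/x
    gamma : trial surface tension (N/m); b : apex radius of curvature (m);
    drho  : density difference between the two phases (kg/m^3)."""
    c = drho * g / gamma

    def rhs(s, y):
        x, z, phi = y
        return [np.cos(phi), np.sin(phi), 2.0 / b + c * z - np.sin(phi) / x]

    # Start a tiny arc length off the apex to avoid the 0/0 in sin(phi)/x,
    # using the series expansion x ~ s, z ~ s^2/(2b), phi ~ s/b near s = 0.
    s0 = 1e-6
    sol = solve_ivp(rhs, (s0, s_max), [s0, s0**2 / (2 * b), s0 / b],
                    max_step=1e-4)
    return sol.y[0], sol.y[1]    # x(s), z(s): compare with the imaged profile
```

    The trial value of gamma can then be adjusted, for example by bisection, until the computed profile reproduces the measured drop dimensions, which is the fitting step of the optical-numerical procedure.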

  10. TransFit: Finite element analysis data fitting software

    NASA Technical Reports Server (NTRS)

    Freeman, Mark

    1993-01-01

    The Advanced X-Ray Astrophysics Facility (AXAF) mission support team has made extensive use of geometric ray tracing to analyze the performance of AXAF developmental and flight optics. One important aspect of this performance modeling is the incorporation of finite element analysis (FEA) data into the surface deformations of the optical elements. TransFit is software designed for the fitting of FEA data of Wolter I optical surface distortions with a continuous surface description which can then be used by SAO's analytic ray tracing software, currently OSAC (Optical Surface Analysis Code). The improved capabilities of TransFit over previous methods include bicubic spline fitting of FEA data to accommodate higher spatial frequency distortions, fitted data visualization for assessing the quality of fit, the ability to accommodate input data from three FEA codes plus other standard formats, and options for alignment of the model coordinate system with the ray trace coordinate system. TransFit uses the AnswerGarden graphical user interface (GUI) to edit input parameters and then access routines written in PV-WAVE, C, and FORTRAN to allow the user to interactively create, evaluate, and modify the fit. The topics covered include an introduction to TransFit: requirements, design philosophy, and implementation; design specifics: modules, parameters, fitting algorithms, and data displays; a procedural example; verification of performance; future work; and appendices on online help and ray trace results of the verification section.
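
    The core fitting idea, representing scattered FEA surface-distortion data with a smooth bicubic spline that a ray tracer can evaluate anywhere, can be roughly illustrated as follows. This is not TransFit's actual code (which used PV-WAVE, C, and FORTRAN); the coordinates and displacement values below are synthetic stand-ins.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Scattered FEA nodal data: axial and azimuthal coordinates plus the
# surface-normal displacement of a deformed optical surface (synthetic).
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)            # normalised axial position
y = rng.uniform(0.0, 2.0 * np.pi, 200)    # azimuthal angle (rad)
dz = 1e-6 * np.sin(3.0 * y) * x**2        # displacement (m), invented shape

# Bicubic smoothing spline: a continuous surface description that can be
# evaluated at arbitrary ray-intercept points during the trace.
spline = SmoothBivariateSpline(x, y, dz, kx=3, ky=3)
print(spline.ev(0.5, np.pi))              # distortion at one (x, y) location
```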

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PCB DETECTION TECHNOLOGY, DEXSIL CORPORATION L2000DX ANALYZER

    EPA Science Inventory

    The L2000DX Analyzer (dimensions: 9 x 9.5 x 4.25 in.) is a field-portable ion-specific electrode instrument, weighing approximately 5 lb 12 oz, designed to quantify concentrations of PCBs, chlorinated solvents, and pesticides in soils, water, transformer oils, and surface wipes. ...

  12. Bistatic radar sea state monitoring system design

    NASA Technical Reports Server (NTRS)

    Ruck, G. T.; Krichbaum, C. K.; Everly, J. O.

    1975-01-01

    Remote measurement of the two-dimensional surface wave height spectrum of the ocean by the use of bistatic radar techniques was examined. Potential feasibility and experimental verification by field experiment are suggested. The required experimental hardware is defined along with the designing, assembling, and testing of several required experimental hardware components.

  13. Inexpensive Eddy-Current Standard

    NASA Technical Reports Server (NTRS)

    Berry, Robert F., Jr.

    1985-01-01

    Radial crack replicas serve as evaluation standards. Technique entails intimately joining two pieces of appropriate aluminum alloy stock and centering drilled hole through and along interface. Bore surface of hole presents two vertical stock interface lines 180 degrees apart. These lines serve as radial crack defect replicas during eddy-current technique setup and verification.

  14. An Experimental High-Resolution Forecast System During the Vancouver 2010 Winter Olympic and Paralympic Games

    NASA Astrophysics Data System (ADS)

    Mailhot, J.; Milbrandt, J. A.; Giguère, A.; McTaggart-Cowan, R.; Erfani, A.; Denis, B.; Glazer, A.; Vallée, M.

    2014-01-01

    Environment Canada ran an experimental numerical weather prediction (NWP) system during the Vancouver 2010 Winter Olympic and Paralympic Games, consisting of nested high-resolution (down to 1-km horizontal grid-spacing) configurations of the GEM-LAM model, with improved geophysical fields, cloud microphysics and radiative transfer schemes, and several new diagnostic products such as density of falling snow, visibility, and peak wind gust strength. The performance of this experimental NWP system has been evaluated in these winter conditions over complex terrain using the enhanced mesoscale observing network in place during the Olympics. As compared to the forecasts from the operational regional 15-km GEM model, objective verification generally indicated significant added value of the higher-resolution models for near-surface meteorological variables (wind speed, air temperature, and dewpoint temperature) with the 1-km model providing the best forecast accuracy. Appreciable errors were noted in all models for the forecasts of wind direction and humidity near the surface. Subjective assessment of several cases also indicated that the experimental Olympic system was skillful at forecasting meteorological phenomena at high-resolution, both spatially and temporally, and provided enhanced guidance to the Olympic forecasters in terms of better timing of precipitation phase change, squall line passage, wind flow channeling, and visibility reduction due to fog and snow.

  15. A coupled vegetation/sediment transport model for dryland environments

    NASA Astrophysics Data System (ADS)

    Mayaud, Jerome R.; Bailey, Richard M.; Wiggs, Giles F. S.

    2017-04-01

    Dryland regions are characterized by patchy vegetation, erodible surfaces, and erosive aeolian processes. Understanding how these constituent factors interact and shape landscape evolution is critical for managing potential environmental and anthropogenic impacts in drylands. However, modeling wind erosion on partially vegetated surfaces is a complex problem that has remained challenging for researchers. We present the new, coupled cellular automaton Vegetation and Sediment TrAnsport (ViSTA) model, which is designed to address fundamental questions about the development of arid and semiarid landscapes in a spatially explicit way. The technical aspects of the ViSTA model are described, including a new method for directly imposing oblique wind and transport directions onto a cell-based domain. Verification tests for the model are reported, including stable state solutions, the impact of drought and fire stress, wake flow dynamics, temporal scaling issues, and the impact of feedbacks between sediment movement and vegetation growth on landscape morphology. The model is then used to simulate an equilibrium nebkha dune field, and the resultant bed forms are shown to have very similar size and spacing characteristics to nebkhas observed in the Skeleton Coast, Namibia. The ViSTA model is a versatile geomorphological tool that could be used to predict threshold-related transitions in a range of dryland ecogeomorphic systems.
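
    As a toy illustration of the kind of cellular-automaton coupling the record describes (vegetation shelters sediment from wind erosion, and vegetation in turn responds to the local sediment balance), consider the sketch below; the update rules, parameter values, and periodic domain are invented for illustration and are not the ViSTA model's rules.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
veg = rng.random((N, N)) * 0.3        # fractional vegetation cover
sand = np.ones((N, N))                # erodible sediment depth (arbitrary units)

def step(veg, sand, p_erode=0.2, growth=0.01, unit=0.1):
    """One toy update on a periodic grid with wind blowing along axis 1."""
    # Bare cells erode with higher probability than vegetated ones.
    erode = (rng.random(veg.shape) < p_erode * (1.0 - veg)) & (sand >= unit)
    moved = np.where(erode, unit, 0.0)
    deposited = np.roll(moved, 1, axis=1)      # transport one cell downwind
    sand = sand - moved + deposited
    # Vegetation grows logistically but is suppressed by sediment disturbance.
    balance = deposited - moved
    veg = np.clip(veg + growth * (1.0 - veg) - 0.5 * np.abs(balance), 0.0, 1.0)
    return veg, sand

for _ in range(500):
    veg, sand = step(veg, sand)
```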

  16. Operational prediction of air quality for the United States: applications of satellite observations

    NASA Astrophysics Data System (ADS)

    Stajner, Ivanka; Lee, Pius; Tong, Daniel; Pan, Li; McQueen, Jeff; Huang, Jianping; Huang, Ho-Chun; Draxler, Roland; Kondragunta, Shobha; Upadhayay, Sikchya

    2015-04-01

    Operational predictions of ozone and wildfire smoke over the United States (U.S.) and predictions of airborne dust over the contiguous 48 states are provided by NOAA at http://airquality.weather.gov/. North American Mesoscale (NAM) weather predictions are combined with inventory-based emissions estimates from the U.S. Environmental Protection Agency (EPA) and chemical processes within the Community Multiscale Air Quality (CMAQ) model to produce ozone predictions. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model is used to produce wildfire smoke and dust storm predictions. Routine verification of ozone predictions relies on the AIRNow compilation of observations from surface monitors. Retrievals of smoke column integrals from GOES satellites and dust column integrals from MODIS satellite instruments are used for verification of smoke and dust predictions. Recent updates of NOAA's operational air quality predictions have focused on mobile emissions using the projections of mobile sources for 2012. Since emission inventories are complex and take years to assemble and evaluate, causing a lag in information, we recently began combining inventory information with projections of mobile sources. In order to evaluate this emission update, the changes in projected NOx emissions from 2005 to 2012 were compared with observed changes in Ozone Monitoring Instrument (OMI) NO2 observations and NOx measured by surface monitors over large U.S. cities over the same period. Comparisons indicate that projected decreases in NOx emissions from 2005 to 2012 are similar to, but not as strong as, the decreases in the observed NOx concentrations and in OMI NO2 retrievals. Nevertheless, the use of projected mobile NOx emissions in the predictions reduced biases in predicted NOx concentrations, with the largest improvement in urban areas. Ozone biases are reduced as well, with the largest improvement seen in rural areas. Recent testing of PM2.5 predictions relies on emissions inventories augmented by real-time sources from wildfires and dust storms. The evaluation of these test predictions relies on surface monitor data, but efforts are in progress to include comparisons with satellite-observed aerosol optical depth (AOD) products. Testing of PM2.5 predictions continues to exhibit seasonal biases: overprediction in the winter and underprediction in the summer. Current efforts focus on bias correction and development of linkages with global atmospheric composition predictions.

  17. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    NASA Technical Reports Server (NTRS)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2011-01-01

    The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. The land analysis products draw on a range of satellite data products and surface observations, are global at 1/4-degree spatial resolution, and are generated at 3-hour intervals.

  18. Modeling interfacial fracture in Sierra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang

    2013-09-01

    This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.
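
    The record does not give the specific traction-separation law used in these simulations; a bilinear law is a common choice for cohesive surface elements, and a minimal sketch of one follows (the names, and the omission of unloading/history handling, are simplifications).

```python
import numpy as np

def bilinear_traction(delta, delta0, delta_f, t_max):
    """Bilinear cohesive traction-separation law (monotonic loading only):
    linear rise to peak traction t_max at separation delta0, then linear
    softening to zero traction at the failure separation delta_f."""
    delta = np.asarray(delta, dtype=float)
    rise = t_max * delta / delta0
    soften = t_max * (delta_f - delta) / (delta_f - delta0)
    return np.where(delta <= delta0, rise, np.clip(soften, 0.0, None))

# Fracture energy released at full failure is the area under the curve:
# G_c = 0.5 * t_max * delta_f
```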

  19. Cassini's RTGs undergo mechanical and electrical verification testing in the PHSF

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Jet Propulsion Laboratory (JPL) engineers examine the interface surface on the Cassini spacecraft prior to installation of the third radioisotope thermoelectric generator (RTG). The other two RTGs, at left, already are installed on Cassini. The three RTGs will be used to power Cassini on its mission to the Saturnian system. They are undergoing mechanical and electrical verification testing in the Payload Hazardous Servicing Facility. RTGs use heat from the natural decay of plutonium to generate electric power. The generators enable spacecraft to operate far from the Sun where solar power systems are not feasible. The Cassini mission is scheduled for an Oct. 6 launch aboard a Titan IVB/Centaur expendable launch vehicle. Cassini is built and managed for NASA by JPL.

  20. Heterogeneity of activated carbons in adsorption of phenols from aqueous solutions—Comparison of experimental isotherm data and simulation predictions

    NASA Astrophysics Data System (ADS)

    Podkościelny, P.; Nieszporek, K.

    2007-01-01

    Surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting an integral equation. The experimental data on phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) have been taken from the literature. AED functions for phenol adsorption, generated by application of the regularization method, have been verified, with the Grand Canonical Monte Carlo (GCMC) simulation technique used as the verification tool. The definitive stage of verification was a comparison of the experimental adsorption data with those obtained from the GCMC simulations. The information necessary for performing the simulations was provided by the parameters of the AED functions calculated by the regularization method.
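
    The inversion step can be illustrated with a discretized form of the adsorption integral equation, theta(c) = integral of theta_local(c, E) f(E) dE. The sketch below assumes a Langmuir local isotherm and adds Tikhonov regularization with non-negativity, one standard way to stabilize the ill-posed inversion; the kernel, the units of concentration, and the regularization parameter alpha are illustrative.

```python
import numpy as np
from scipy.optimize import nnls

R, T = 8.314, 298.0   # gas constant (J/mol/K) and temperature (K)

def solve_aed(c, theta_exp, E_grid, alpha=1e-2):
    """Tikhonov-regularised inversion of the adsorption integral equation
    with a Langmuir local isotherm; non-negativity enforced via NNLS.

    c         : concentrations (in units making K dimensionless)
    theta_exp : measured fractional coverages at those concentrations
    E_grid    : uniform grid of adsorption energies (J/mol)
    """
    dE = E_grid[1] - E_grid[0]
    K = np.exp(E_grid / (R * T))            # affinity constant ~ exp(E/RT)
    A = (c[:, None] * K[None, :]) / (1.0 + c[:, None] * K[None, :]) * dE
    # Augment with alpha*I: minimises ||A f - theta||^2 + alpha^2 ||f||^2, f >= 0
    A_aug = np.vstack([A, alpha * np.eye(len(E_grid))])
    b_aug = np.concatenate([theta_exp, np.zeros(len(E_grid))])
    f, _ = nnls(A_aug, b_aug)
    return f                                # discretised AED values f(E_j)
```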

  1. Array automated assembly task, phase 2. Low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Rhee, S. S.; Jones, G. T.; Allison, K. T.

    1978-01-01

    Several modifications instituted in the wafer surface preparation process served to significantly reduce the process cost, to 1.55 cents per peak watt in 1975 cents. Performance verification tests of a laser scanning system showed a limited capability to detect hidden cracks or defects, but with potential equipment modifications this cost-effective system could be rendered suitable for the application. Installation of an electroless nickel plating system was completed, along with optimization of the wafer plating process. The solder coating and flux removal process verification test was completed. An optimum temperature range of 500-550 C was found to produce uniform solder coating, with the restriction that a modified dipping procedure be utilized. Finally, construction of the spray-on dopant equipment was completed.

  2. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    NASA Technical Reports Server (NTRS)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.

  3. High-Assurance Spiral

    DTIC Science & Technology

    2017-11-01

    Cyber-physical systems … physical processes that interact in intricate manners. This makes verification of the software complex and unwieldy. In this report, an approach towards … resulting implementations. Subject terms: cyber-physical systems; formal guarantees; code generation.

  4. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  5. Gender identity and sport: is the playing field level?

    PubMed

    Reeser, J C

    2005-10-01

    This review examines gender identity issues in competitive sports, focusing on the evolution of policies relating to female gender verification and transsexual participation in sport. The issues are complex and continue to challenge sport governing bodies, including the International Olympic Committee, as they strive to provide a safe environment in which female athletes may compete fairly and equitably.

  6. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    A standard approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. … Applying model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite …

  7. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    NASA Astrophysics Data System (ADS)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  8. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers

    PubMed Central

    Ospina, Raydonal; Frery, Alejandro C.

    2016-01-01

    We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. PMID:27907014
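
    The Bandt-Pompe symbolization underlying these features maps a signal to a distribution over ordinal patterns. A sketch of the resulting normalized permutation entropy for one coordinate trace follows; the embedding order and delay are illustrative defaults, and the statistical complexity and Fisher information used in the paper, computed from the same pattern distribution, are omitted here.

```python
import math
from itertools import permutations
import numpy as np

def permutation_entropy(x, order=4, delay=1):
    """Normalised Shannon entropy of the Bandt-Pompe ordinal-pattern
    distribution of a 1D signal, e.g. a pen x- or y-coordinate trace."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        pattern = tuple(int(k) for k in np.argsort(window))  # ordinal pattern
        counts[pattern] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(order)))
```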

  9. Comparison of fingerprint and facial biometric verification technologies for user access and patient identification in a clinical environment

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Zhang, Yu; Documet, Jorge; Liu, Brent; Lee, Jasper; Shrestha, Rasu; Wang, Kevin; Huang, H. K.

    2007-03-01

    As clinical imaging and informatics systems continue to be integrated across the healthcare enterprise, the need to prevent patient mis-identification and unauthorized access to clinical data becomes more apparent, especially under the Health Insurance Portability and Accountability Act (HIPAA) mandate. Last year, we presented a system to track and verify patients and staff within a clinical environment. This year, we further address the biometric verification component in order to determine which biometric system is the optimal solution for given applications in the complex clinical environment. We installed two biometric identification systems, fingerprint and facial recognition, at an outpatient imaging facility, the Healthcare Consultation Center II (HCCII). We evaluated each solution and documented the advantages and pitfalls of each biometric technology in this clinical environment.

  10. Mission Control Center (MCC) System Specification for the Shuttle Orbital Flight Test (OFT) Timeframe

    NASA Technical Reports Server (NTRS)

    1976-01-01

    System specifications to be used by the mission control center (MCC) for the shuttle orbital flight test (OFT) time frame were described. The three support systems discussed are the communication interface system (CIS), the data computation complex (DCC), and the display and control system (DCS), all of which may interface with, and share processing facilities with, other applications processing supporting current MCC programs. The MCC shall provide centralized control of the space shuttle OFT from launch through orbital flight, entry, and landing until the Orbiter comes to a stop on the runway. This control shall include the functions of vehicle management in the areas of hardware configuration (verification), flight planning, communication and instrumentation configuration management, trajectory, software and consumables, payloads management, flight safety, and verification of test conditions/environment.

  11. Interpretation of Passive Microwave Imagery of Surface Snow and Ice: Harding Lake, Alaska

    DTIC Science & Technology

    1991-06-01

    … conditions in microwave imagery depend on the characteristics of the sensor system (local oscillator frequency 33.6 GHz; IF bandwidth greater than 500 MHz; video bandwidth 1.7 kHz) … using passive microwave sensors (IEEE Transactions on Geo…) … The lake is roughly circular in shape and has a … (Fig. 1) … cracks in the ice sheet. The incursion process is de… … surface snow had occurred on these similarly sized lakes. Additional field verifications …

  12. Field Verification Program (Upland Disposal): Prediction of Surface Runoff Water Quality from Black Rock Harbor Dredged Material Placed in an Upland Disposal Site.

    DTIC Science & Technology

    1987-03-01

    The simulator was similar to the original rotating disk-type rainfall simulator but had several important design modifications (Westerdahl and Skogerboe …) … existing vegetation on the soil surface (Westerdahl and Skogerboe 1982). A multiple-peaked natural storm event was selected from field data and … (Westerdahl and Skogerboe 1982) and has been used as a standard storm event for comparison to natural storm events (Laws and Parsons 1943). Similar …

  13. Development of a Response Surface Thermal Model for Orion Mated to the International Space Station

    NASA Technical Reports Server (NTRS)

    Miller, Stephen W.; Meier, Eric J.

    2010-01-01

    A study was performed to determine if a Design of Experiments (DOE)/Response Surface Methodology approach could be applied to on-orbit thermal analysis and produce a set of Response Surface Equations (RSEs) that accurately predict vehicle temperatures. The study used an integrated thermal model of the International Space Station and the Orion outer mold line model. Five separate factors were identified for study: yaw, pitch, roll, beta angle, and the environmental parameters. Twenty external Orion temperatures were selected as the responses. A DOE case matrix of 110 runs was developed. The data from these cases were analyzed to produce an RSE for each of the temperature responses. The initial agreement between the engineering data and the RSE predictions was encouraging, although many RSEs had large uncertainties on their predictions. Fourteen verification cases were developed to test the predictive powers of the RSEs. The verification showed mixed results, with some RSEs predicting temperatures matching the engineering data within the uncertainty bands, while others had very large errors. While this study does not irrefutably prove that the DOE/RSM approach can be applied to on-orbit thermal analysis, it does demonstrate that the technique has the potential to predict temperatures. Additional work is needed to better identify the cases needed to produce the RSEs.

  14. [Optimization of vacuum belt drying process of Gardeniae Fructus in Reduning injection by Box-Behnken design-response surface methodology].

    PubMed

    Huang, Dao-sheng; Shi, Wei; Han, Lei; Sun, Ke; Chen, Guang-bo; Wu, Jian-xiong; Xu, Gui-hong; Bi, Yu-an; Wang, Zhen-zhong; Xiao, Wei

    2015-06-01

    To optimize the vacuum belt drying process of Gardeniae Fructus extract from Reduning injection by Box-Behnken design-response surface methodology, a three-factor, three-level Box-Behnken experimental design was employed on the basis of single-factor experiments. With drying temperature, drying time, and feeding speed as independent variables and the content of geniposide as the dependent variable, the experimental data were fitted to a second-order polynomial equation, establishing the mathematical relationship between the content of geniposide and the respective variables. With the experimental data analyzed in Design-Expert 8.0.6, the optimal drying parameters were as follows: drying temperature 98.5 °C, drying time 89 min, and feeding speed 99.8 r·min⁻¹. Three verification experiments were conducted under these conditions, and the measured average content of geniposide was 564.108 mg·g⁻¹, close to the model prediction of 563.307 mg·g⁻¹. According to the verification tests, the Gardeniae Fructus belt drying process is stable and feasible, so single-factor experiments combined with response surface methodology (RSM) can be used to optimize the drying technology of the Gardenia extract in Reduning injection.
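
    For illustration, a three-factor Box-Behnken design in coded units, and the least-squares fit of the full second-order polynomial it supports, can be sketched as follows; the design generator, the number of centre replicates, and the fitting routine are generic choices, not those of Design-Expert.

```python
import numpy as np
from itertools import combinations

def box_behnken_3():
    """Coded Box-Behnken design for 3 factors: the midpoints of the cube
    edges (12 runs) plus centre-point replicates."""
    pts = []
    for i, j in combinations(range(3), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                p = [0, 0, 0]
                p[i], p[j] = a, b
                pts.append(p)
    pts += [[0, 0, 0]] * 3                 # centre replicates
    return np.array(pts, dtype=float)

def fit_quadratic(X, y):
    """Least-squares fit of the full second-order polynomial
    y = b0 + sum b_i x_i + sum b_ij x_i x_j + sum b_ii x_i^2."""
    n = X.shape[1]
    cols = [np.ones(len(X))] + [X[:, i] for i in range(n)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(n), 2)]
    cols += [X[:, i] ** 2 for i in range(n)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta
```

    Here the coded levels -1/0/+1 would map to the low/mid/high settings of drying temperature, drying time, and feeding speed; the stationary point of the fitted surface gives the optimum in coded units, to be mapped back to physical settings.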

  15. A Three-Dimensional Target Depth-Resolution Method with a Single-Vector Sensor

    PubMed Central

    Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin

    2018-01-01

    This paper mainly studies and verifies the target number category-resolution method in multi-target cases and the target depth-resolution method of aerial targets. Firstly, target depth resolution is performed by using the sign distribution of the reactive component of the vertical complex acoustic intensity; the target category and the number resolution in multi-target cases is realized with a combination of the bearing-time recording information; and the corresponding simulation verification is carried out. The algorithm proposed in this paper can distinguish between the single-target multi-line spectrum case and the multi-target multi-line spectrum case. This paper presents an improved azimuth-estimation method for multi-target cases, which makes the estimation results more accurate. Using the Monte Carlo simulation, the feasibility of the proposed target number and category-resolution algorithm in multi-target cases is verified. In addition, by studying the field characteristics of the aerial and surface targets, the simulation results verify that there is only amplitude difference between the aerial target field and the surface target field under the same environmental parameters, and an aerial target can be treated as a special case of a surface target; the aerial target category resolution can then be realized based on the sign distribution of the reactive component of the vertical acoustic intensity so as to realize three-dimensional target depth resolution. By processing data from a sea experiment, the feasibility of the proposed aerial target three-dimensional depth-resolution algorithm is verified. PMID:29649173
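
    The depth-resolution cue described above is the sign of the reactive (imaginary) part of the vertical complex acoustic intensity. A sketch of its cross-spectral estimate from collocated pressure and vertical-velocity channels of a vector sensor is given below; the segmentation, window, and variable names are illustrative.

```python
import numpy as np

def vertical_complex_intensity(p, vz, nfft=1024):
    """Segment-averaged cross-spectral estimate of the vertical complex
    acoustic intensity, I_z(f) = P(f) * conj(Vz(f)), from collocated
    pressure and vertical-velocity records. The real part is the active
    intensity; the imaginary part is the reactive component whose sign
    distribution is used for depth resolution."""
    n_seg = len(p) // nfft
    win = np.hanning(nfft)
    acc = np.zeros(nfft // 2 + 1, dtype=complex)
    for k in range(n_seg):
        seg = slice(k * nfft, (k + 1) * nfft)
        P = np.fft.rfft(p[seg] * win)
        V = np.fft.rfft(vz[seg] * win)
        acc += P * np.conj(V)
    I_z = acc / n_seg
    reactive_sign = np.sign(I_z.imag)   # +/- pattern across line-spectrum bins
    return I_z, reactive_sign
```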

  17. Verification, Validation and Accreditation

    DTIC Science & Technology

    2011-05-03

  18. Model Verification and Validation Using Graphical Information Systems Tools

    DTIC Science & Technology

    2013-07-31

    … Geomorphic Measurements … to a model. Ocean flows, which are organized current systems, transport heat and salinity and cause water to pile up as a water surface …

  19. Evaluating aerosol impacts on Numerical Weather Prediction in two extreme dust and biomass-burning events

    NASA Astrophysics Data System (ADS)

    Remy, Samuel; Benedetti, Angela; Jones, Luke; Razinger, Miha; Haiden, Thomas

    2014-05-01

    The WMO-sponsored Working Group on Numerical Experimentation (WGNE) set up a project aimed at understanding the importance of aerosols for numerical weather prediction (NWP). Three cases are being investigated by several NWP centres with aerosol capabilities: a severe dust case that affected Southern Europe in April 2012, a biomass-burning case in South America in September 2012, and an extreme pollution event in Beijing (China) which took place in January 2013. At ECMWF these cases are being studied using the MACC-II system with radiatively interactive aerosols. Some preliminary results related to the dust and fire events are presented here. A preliminary verification of the impact of the aerosol-radiation direct interaction on surface meteorological parameters, such as 2-m temperature and surface winds over the region of interest, will be presented. Aerosol optical depth (AOD) verification using AERONET data will also be discussed. For the biomass-burning case, the impact of using injection heights estimated by a Plume Rise Model (PRM) for the biomass-burning emissions will be presented.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  1. Studies of the net surface radiative flux from satellite radiances during FIFE

    NASA Technical Reports Server (NTRS)

    Frouin, Robert

    1993-01-01

    Studies of the net surface radiative flux from satellite radiances during First ISLSCP Field Experiment (FIFE) are presented. Topics covered include: radiative transfer model validation; calibration of VISSR and AVHRR solar channels; development and refinement of algorithms to estimate downward solar and terrestrial irradiances at the surface, including photosynthetically available radiation (PAR) and surface albedo; verification of these algorithms using in situ measurements; production of maps of shortwave irradiance, surface albedo, and related products; analysis of the temporal variability of shortwave irradiance over the FIFE site; development of a spectroscopy technique to estimate atmospheric total water vapor amount; and study of optimum linear combinations of visible and near-infrared reflectances for estimating the fraction of PAR absorbed by plants.

  2. Freeform surface measurement and characterisation using a toolmakers microscope

    NASA Astrophysics Data System (ADS)

    Seung-yin Wong, Francis; Chauh, Kong-Bieng; Venuvinod, Patri K.

    2014-03-01

    Current freeform surface (FFS) characterization systems mainly cover aspects related to computer-aided design/manufacture (CAD/CAM). This paper describes a new approach that extends into computer-aided inspection (CAI). The following novel features are addressed: feature recognition and extraction from surface data; characterisation of properties of the surface's M and N vectors at individual vertices; development of a measuring plan using a toolmaker's microscope for the inspection of the FFS; inspection of the actual FFS produced by CNC milling; and verification of the measurement results and comparison with the CAD design data. Tests have shown that the deviations between the CAI and CAD data were within the estimated uncertainty limits.

  3. Geometric Verification of Dynamic Wave Arc Delivery With the Vero System Using Orthogonal X-ray Fluoroscopic Imaging.

    PubMed

    Burghelea, Manuela; Verellen, Dirk; Poels, Kenneth; Gevaert, Thierry; Depuydt, Tom; Tournel, Koen; Hung, Cecilia; Simon, Viorica; Hiraoka, Masahiro; de Ridder, Mark

    2015-07-15

    The purpose of this study was to define an independent verification method based on on-board orthogonal fluoroscopy to determine the geometric accuracy of synchronized gantry-ring (G/R) rotations during dynamic wave arc (DWA) delivery available on the Vero system. A verification method for DWA was developed to calculate G/R positional information from ball-bearing positions retrieved from fluoroscopic images of a cubic phantom acquired during DWA delivery. Different noncoplanar trajectories were generated in order to investigate the influence of path complexity on delivery accuracy. The G/R positions detected from the fluoroscopy images (DetPositions) were benchmarked against the G/R angulations retrieved from the control points (CP) of the DWA RT plan and the DWA log files recorded by the treatment console during DWA delivery (LogActed). The G/R rotational accuracy was quantified as the mean absolute deviation ± standard deviation. The maximum G/R absolute deviation was calculated as the maximum 3-dimensional distance between the CP and the closest DetPositions. In the CP versus DetPositions comparison, an overall mean G/R deviation of 0.13°/0.16° ± 0.16°/0.16° was obtained, with a maximum G/R deviation of 0.6°/0.2°. For the LogActed versus DetPositions evaluation, the overall mean deviation was 0.08°/0.15° ± 0.10°/0.10°, with a maximum G/R of 0.3°/0.4°. The largest decoupled deviations registered for gantry and ring were 0.6° and 0.4°, respectively. No directional dependence was observed between clockwise and counterclockwise rotations. Doubling the dose resulted in double the number of detected points around each CP, and a reduction in angular deviation in all cases. An independent geometric quality assurance approach was developed for DWA delivery verification and was successfully applied on diverse trajectories. Results showed that the Vero system is capable of following complex G/R trajectories, with maximum deviations during DWA below 0.6°.
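
    A sketch of the kind of comparison described, matching each planned control point to the nearest detected gantry/ring position and reporting deviation statistics, follows. It works in the two angular coordinates only (the record's distance metric may include additional dimensions), and all names are illustrative.

```python
import numpy as np

def gr_deviations(planned, detected):
    """Match each planned (gantry, ring) control point to the nearest
    detected position and report deviation statistics.

    planned, detected : arrays of shape (n, 2) and (m, 2), in degrees.
    Returns per-axis mean and std of absolute deviations, plus the
    maximum over control points of the distance to the closest detection."""
    def wrap(d):                        # wrap angle differences to [-180, 180)
        return (d + 180.0) % 360.0 - 180.0
    diffs = wrap(planned[:, None, :] - detected[None, :, :])
    dist = np.hypot(diffs[..., 0], diffs[..., 1])   # angular distance per pair
    nearest = dist.argmin(axis=1)
    dev = np.abs(wrap(planned - detected[nearest]))
    return dev.mean(axis=0), dev.std(axis=0), dist.min(axis=1).max()
```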

  4. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in earth surface, atmospheric flight, and space flight environments. Three twentieth-century technological developments (1) high-altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid-state micro-electronics systems) have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g., ground-based test methods as well as high-energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems, as well as effects on human health and safety. The effects of primary cosmic ray particles, and secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and controls systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude, and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  5. Toward Improved Land Surface Initialization in Support of Regional WRF Forecasts at the Kenya Meteorological Department

    NASA Technical Reports Server (NTRS)

    Case, Jonathan; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.

    2014-01-01

    Flooding and drought are two key forecasting challenges for the Kenya Meteorological Department (KMD). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the boundary layer of the atmosphere providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-end events over east Africa. KMD currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Nonhydrostatic Mesoscale Model (NMM) dynamical core. They make use of the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over eastern Africa. Two organizations at the National Aeronautics and Space Administration Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMD for enhancing its regional modeling capabilities. To accomplish this goal, SPoRT and SERVIR will provide experimental land surface initialization datasets and model verification capabilities to KMD. To produce a land-surface initialization more consistent with the resolution of the KMD-WRF runs, the NASA Land Information System (LIS) will be run at a comparable resolution to provide real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Additionally, real-time green vegetation fraction data from the Visible Infrared Imaging Radiometer Suite will be incorporated into the KMD-WRF runs, once it becomes publicly available from the National Environmental Satellite Data and Information Service. Finally, model verification capabilities will be transitioned to KMD using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. The transition of these MET tools will enable KMD to monitor model forecast accuracy in near real time. This presentation will highlight preliminary verification results of WRF runs over east Africa using the LIS land surface initialization.

  6. Critique of Macro Flow/Damage Surface Representations for Metal Matrix Composites Using Micromechanics

    NASA Technical Reports Server (NTRS)

    Lissenden, Cliff J.; Arnold, Steven M.

    1996-01-01

    Guidance for the formulation of robust, multiaxial, constitutive models for advanced materials is provided by addressing theoretical and experimental issues using micromechanics. The multiaxial response of metal matrix composites, depicted in terms of macro flow/damage surfaces, is predicted at room and elevated temperatures using an analytical micromechanical model that includes viscoplastic matrix response as well as fiber-matrix debonding. Macro flow/damage surfaces (i.e., debonding envelopes, matrix threshold surfaces, macro 'yield' surfaces, surfaces of constant inelastic strain rate, and surfaces of constant dissipation rate) are determined for silicon carbide/titanium in three stress spaces. Residual stresses are shown to offset the centers of the flow/damage surfaces from the origin and their shape is significantly altered by debonding. The results indicate which type of flow/damage surfaces should be characterized and what loadings applied to provide the most meaningful experimental data for guiding theoretical model development and verification.

  7. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.

  8. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and to determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
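
    Mesh-resolution studies of this kind are often summarized with the observed order of convergence and a Richardson-extrapolated estimate, e.g. via Roache's grid convergence index (GCI). The record does not state which metric was used, so the sketch below is a generic version assuming three systematically refined meshes, a constant refinement ratio, and monotone convergence of the quantity of interest.

```python
import numpy as np

def observed_order(f_coarse, f_medium, f_fine, r=2.0, fs=1.25):
    """Observed order of convergence, Richardson-extrapolated estimate,
    and fine-grid GCI from solutions on three systematically refined
    meshes (refinement ratio r, safety factor fs). Assumes the
    differences (f_coarse - f_medium) and (f_medium - f_fine) share a sign."""
    p = np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    gci_fine = fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)
    return p, f_exact, gci_fine
```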

  9. Verification on spray simulation of a pintle injector for liquid rocket engine

    NASA Astrophysics Data System (ADS)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used in liquid rocket engines is an injection system that has recently attracted renewed interest, known for its wide throttling ability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in a cold-flow experiment using water and air, a numerical simulation was adopted and a verification of the numerical model was subsequently conducted. For the verification process, three types of experimental data, including velocity distributions of gas flows, spray angles, and liquid distribution, were compared with simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum velocities of gas were within the acceptable range of agreement; however, the spray angles showed up to 25% error when the momentum ratios were increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study specific flow characteristics of the pintle injector, despite the limitations of two-dimensional and coarse grids.

  10. Firefly: an optical lithographic system for the fabrication of holographic security labels

    NASA Astrophysics Data System (ADS)

    Calderón, Jorge; Rincón, Oscar; Amézquita, Ricardo; Pulido, Iván; Amézquita, Sebastián; Bernal, Andrés; Romero, Luis; Agudelo, Viviana

    2016-03-01

    This paper introduces Firefly, an optical lithography origination system developed to produce high-quality holographic masters. This mask-less lithography system has a resolution of 418 nm half-pitch and generates holographic masters with the optical characteristics required for security applications at level 1 (visual verification), level 2 (pocket-reader verification) and level 3 (forensic verification). The holographic master constitutes the core of the manufacturing process for security holographic labels used to authenticate products and documents worldwide. Additionally, Firefly is equipped with a software tool that allows holograms to be designed from graphics stored as bitmaps. The software can generate and configure basic optical effects such as animation and color, as well as effects of high complexity such as Fresnel lenses, engravings and encrypted images, among others. The Firefly technology brings together optical lithography, digital image processing and advanced control systems, making possible competitive equipment that challenges the best technologies in the holographic origination industry worldwide. In this paper, a general description of the origination system is provided, along with some examples of its capabilities.

  11. Monitoring/Verification using DMS: TATP Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan Weeks, Kevin Kyle, Manuel Manard

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  12. Hand Grasping Synergies As Biometrics

    PubMed Central

    Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K.; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana

    2017-01-01

    Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies—postural synergies. In this proof of concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security. PMID:28512630
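
    A proof-of-concept sketch of the synergy-based pipeline described above, assuming scikit-learn is available: principal components of joint angular velocity profiles serve as synergies, each grasp is summarized by its synergy weights, and identity claims are scored by distance in weight space. The data here are random placeholders, so the equal error rate comes out near chance; the reported 8.19% is a property of real grasp data.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 10 * 50))   # 200 grasps, 10 joints x 50 time samples
        labels = rng.integers(0, 20, size=200)    # 20 hypothetical subjects

        W = PCA(n_components=8).fit_transform(X)  # synergy weights per grasp

        # genuine scores: distances between grasps of the same subject; impostor: different subjects
        dists = np.linalg.norm(W[:, None, :] - W[None, :, :], axis=-1)
        iu = np.triu_indices(len(W), k=1)
        same = (labels[:, None] == labels[None, :])[iu]
        genuine, impostor = dists[iu][same], dists[iu][~same]

        def eer(genuine, impostor, n_thresholds=200):
            """Equal error rate: operating point where false accepts ~= false rejects."""
            ts = np.quantile(np.concatenate([genuine, impostor]), np.linspace(0, 1, n_thresholds))
            far = np.array([np.mean(impostor <= t) for t in ts])  # impostors accepted
            frr = np.array([np.mean(genuine > t) for t in ts])    # genuine attempts rejected
            return np.min(np.maximum(far, frr))

        print(f"EER on random data ~ {eer(genuine, impostor):.2f} (chance level)")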

  13. Monitoring/Verification Using DMS: TATP Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Kyle; Stephan Weeks

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  14. Cleaning and Cleanliness Measurement of Additive Manufactured Parts

    NASA Technical Reports Server (NTRS)

    Mitchell, Mark A.; Edwards, Kevin; Fox, Eric; Boothe, Richard

    2017-01-01

    Additive manufacturing processes allow for the manufacture of complex three-dimensional components that could not otherwise be manufactured. Post-treatment processes require the removal of any remnant bulk powder that may become entrapped within small cavities and channels of a component. This project focuses on several gross cleaning methods and the verification metrics associated with additively manufactured parts for oxygen propulsion usage.

  15. Runtime verification of embedded real-time systems.

    PubMed

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability, thus facilitating applications of the framework in both the prototyping and the post-deployment phases of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
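
    A software-only sketch (not the paper's hardware observer blocks) of a time-bounded ptMTL observer, shown for the simpler operator "phi held at least once within the last b ticks"; storing only the most recent tick at which phi was true gives constant-time, constant-space updates.

        class OnceWithin:
            """Discrete-time observer for the ptMTL operator P_[0,b] phi:
            at tick n, the property holds iff phi held at some tick in [n-b, n]."""

            def __init__(self, bound):
                self.bound = bound
                self.last_true = None   # most recent tick at which phi held

            def step(self, n, phi):
                if phi:
                    self.last_true = n
                return self.last_true is not None and n - self.last_true <= self.bound

        obs = OnceWithin(bound=3)
        trace = [False, True, False, False, False, False, True]
        print([obs.step(n, v) for n, v in enumerate(trace)])
        # [False, True, True, True, True, False, True]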

  16. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
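
    One common way to quantify range agreement between a measured and a simulated activity depth profile, sketched below with synthetic profiles, is to compare the depths at which each profile falls to 50% of its distal falloff; whether this is the exact metric of the study is not stated in the abstract.

        import numpy as np

        def distal_50_depth(depth, activity):
            """Depth (mm) of the distal 50% level, found by scanning from deep to shallow."""
            a = activity / activity.max()
            for i in range(len(a) - 1, 0, -1):
                if a[i - 1] >= 0.5 > a[i]:
                    # linear interpolation between the bracketing samples
                    f = (0.5 - a[i]) / (a[i - 1] - a[i])
                    return depth[i] - f * (depth[i] - depth[i - 1])
            return np.nan

        depth = np.linspace(0.0, 120.0, 241)                  # mm
        measured = 1.0 / (1.0 + np.exp((depth - 85.0) / 2))   # synthetic falloff near 85 mm
        simulated = 1.0 / (1.0 + np.exp((depth - 86.2) / 2))  # shifted stand-in for the MC result
        shift = distal_50_depth(depth, simulated) - distal_50_depth(depth, measured)
        print(f"range shift = {shift:.2f} mm")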

  17. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  18. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    NASA Astrophysics Data System (ADS)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.
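
    The memory-saving idea mentioned above (an iterative rather than a direct solver for the large FEM system) can be sketched with a toy one-dimensional stand-in for the stiffness matrix, assuming SciPy is available:

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import cg

        n = 2000                      # toy problem size; the 3D FBG mesh is far larger
        K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
        f = np.ones(n)

        # conjugate gradients need only matrix-vector products, so the sparse
        # matrix is never factorized or densified
        u, info = cg(K, f, atol=1e-8)
        print("converged" if info == 0 else f"cg stopped with info={info}", u[n // 2])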

  19. Aircraft geometry verification with enhanced computer generated displays

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.

    1982-01-01

    A method for visual verification of aerodynamic geometries using computer generated, color shaded images is described. The mathematical models representing aircraft geometries are created for use in theoretical aerodynamic analyses and in computer aided manufacturing. The aerodynamic shapes are defined using parametric bi-cubic splined patches. This mathematical representation is then used as input to an algorithm that generates a color shaded image of the geometry. A discussion of the techniques used in the mathematical representation of the geometry and in the rendering of the color shaded display is presented. The results include examples of color shaded displays, which are contrasted with wire frame type displays. The examples also show the use of mapped surface pressures in terms of color shaded images of V/STOL fighter/attack aircraft and advanced turboprop aircraft.
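
    A minimal sketch of the two primitives such a renderer needs from the patch representation: evaluating a surface point at (u, v) and shading from the surface normal. A bicubic Bezier patch is used here for concreteness; the report's bi-cubic splined patches may use a different basis.

        import numpy as np

        def de_casteljau(ctrl, t):
            """Evaluate a cubic Bezier curve from 4 control points."""
            pts = np.asarray(ctrl, dtype=float)
            while len(pts) > 1:
                pts = (1.0 - t) * pts[:-1] + t * pts[1:]
            return pts[0]

        def bicubic_point(P, u, v):
            """Evaluate a bicubic Bezier patch P (4x4x3 control net) at (u, v)."""
            rows = np.array([de_casteljau(P[i], v) for i in range(4)])
            return de_casteljau(rows, u)

        def shaded_intensity(P, u, v, light, h=1e-4):
            """Lambertian shade from a normal built out of finite-difference partials."""
            du = (bicubic_point(P, u + h, v) - bicubic_point(P, u - h, v)) / (2 * h)
            dv = (bicubic_point(P, u, v + h) - bicubic_point(P, u, v - h)) / (2 * h)
            n = np.cross(du, dv)
            n /= np.linalg.norm(n)
            return max(0.0, float(n @ light))

        # a gently curved patch: a bump raised over a unit square
        P = np.zeros((4, 4, 3))
        P[..., 0], P[..., 1] = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4), indexing="ij")
        P[1:3, 1:3, 2] = 0.3
        print(f"intensity at patch center: {shaded_intensity(P, 0.5, 0.5, np.array([0.0, 0.0, 1.0])):.3f}")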

  20. Intervalence transfer of ferrocene moieties adsorbed on electrode surfaces by a conjugated linkage

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Brown, Lauren E.; Konopelski, Joseph P.; Chen, Shaowei

    2009-03-01

    Effective intervalence transfer occurred between the metal centers of ferrocene moieties adsorbed onto a ruthenium thin-film surface through ruthenium-carbene π bonds, a direct verification of Hush's four-decade-old prediction. Electrochemical measurements showed two pairs of voltammetric peaks whose separation of formal potentials suggested Class II behavior. Additionally, the potential spacing increased with increasing ferrocene surface coverage, most probably as a consequence of the enhanced contribution from through-space electronic interactions between the metal centers. In contrast, incorporation of an sp³ carbon spacer into the ferrocene-ruthenium linkage markedly diminished the interfacial electronic communication.

  1. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  2. RIACS Workshop on the Verification and Validation of Autonomous and Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Visser, Willem; Simmons, Reid

    2001-01-01

    The long-term future of space exploration at NASA is dependent on the full exploitation of autonomous and adaptive systems: careful monitoring of missions from earth, as is the norm now, will be infeasible due to the sheer number of proposed missions and the communication lag for deep-space missions. Mission managers are however worried about the reliability of these more intelligent systems. The main focus of the workshop was to address these worries and hence we invited NASA engineers working on autonomous and adaptive systems and researchers interested in the verification and validation (V&V) of software systems. The dual purpose of the meeting was to: (1) make NASA engineers aware of the V&V techniques they could be using; and (2) make the V&V community aware of the complexity of the systems NASA is developing.

  3. European Train Control System: A Case Study in Formal Verification

    NASA Astrophysics Data System (ADS)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
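
    As a worked illustration of the kind of parameter constraint being verified (all numbers invented for the example): a train at speed v with maximum service braking deceleration b must begin braking at least v^2/(2b) plus a safety margin s before the end of its movement authority.

        def start_braking_distance(v, b, s):
            """Minimal distance before the movement-authority end at which braking
            must begin so the train stops with margin s to spare."""
            return v * v / (2.0 * b) + s

        v = 83.0   # m/s (~300 km/h)
        b = 0.7    # m/s^2, maximum service braking deceleration
        s = 200.0  # m, safety margin
        print(f"SB >= {start_braking_distance(v, b, s):.0f} m")  # ~5121 m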

  4. VINCI: the VLT Interferometer commissioning instrument

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre; Coudé du Foresto, Vincent; Glindemann, Andreas; Hofmann, Reiner

    2000-07-01

    The Very Large Telescope Interferometer (VLTI) is a complex system made of a large number of separate elements. To prepare for early, successful operation, it will require a period of extensive testing and verification to ensure that the many devices involved work properly together and can produce meaningful data. This paper describes the concept chosen for the VLTI commissioning instrument, LEONARDO da VINCI, and details its functionality: it is a fiber-based two-way beam combiner, associated with an artificial star and an alignment verification unit. The technical commissioning of the VLTI is foreseen as a stepwise process: fringes will first be obtained with the commissioning instrument in an autonomous mode (no other parts of the VLTI involved); then the VLTI telescopes and optical trains will be tested in autocollimation; finally, fringes will be observed on the sky.

  5. Formal semantics for a subset of VHDL and its use in analysis of the FTPP scoreboard circuit

    NASA Technical Reports Server (NTRS)

    Bickford, Mark

    1994-01-01

    In the first part of the report, we give a detailed description of an operational semantics for a large subset of VHDL, the VHSIC Hardware Description Language. The semantics is written in the functional language Caliban, similar to Haskell, used by the theorem prover Clio. We also describe a translator from VHDL into Caliban semantics and give some examples of its use. In the second part of the report, we describe our experience in using the VHDL semantics to try to verify a large VHDL design. We were not able to complete the verification due to certain complexities of VHDL which we discuss. We propose a VHDL verification method that addresses the problems we encountered but which builds on the operational semantics described in the first part of the report.

  6. Graphics enhanced computer emulation for improved timing-race and fault tolerance control system analysis. [of Centaur liquid-fuel booster

    NASA Technical Reports Server (NTRS)

    Szatkowski, G. P.

    1983-01-01

    A computer simulation system has been developed for the Space Shuttle's advanced Centaur liquid fuel booster rocket, in order to conduct systems safety verification and flight operations training. This simulation utility is designed to analyze functional system behavior by integrating control avionics with mechanical and fluid elements, and is able to emulate any system operation, from simple relay logic to complex VLSI components, with wire-by-wire detail. A novel graphics data entry system offers a pseudo-wire wrap data base that can be easily updated. Visual subsystem operations can be selected and displayed in color on a six-monitor graphics processor. System timing and fault verification analyses are conducted by injecting component fault modes and min/max timing delays, and then observing system operation through a red line monitor.

  7. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems were compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
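
    The glare-removal step lends itself to a short sketch: deconvolving a measured glare kernel from the raw CCD image. A Wiener filter in the Fourier domain is one standard, noise-tolerant way to do this; the Gaussian kernel and regularization constant below are assumptions for illustration, not the thesis's measured glare kernel.

        import numpy as np

        def wiener_deconvolve(image, kernel, k=1e-3):
            """Invert a convolution in the Fourier domain, regularized by k."""
            H = np.fft.rfft2(kernel)
            G = np.fft.rfft2(image)
            F = G * np.conj(H) / (np.abs(H) ** 2 + k)
            return np.fft.irfft2(F, s=image.shape)

        # synthetic test: a point-like dose spot blurred by a broad glare kernel
        y, x = np.mgrid[-64:64, -64:64]
        glare = np.exp(-(x**2 + y**2) / (2 * 12.0**2))
        glare = np.fft.ifftshift(glare / glare.sum())   # center the kernel at the origin
        spot = np.zeros((128, 128)); spot[64, 64] = 1.0
        raw = np.fft.irfft2(np.fft.rfft2(spot) * np.fft.rfft2(glare), s=spot.shape)
        restored = wiener_deconvolve(raw, glare)
        print(f"peak: raw {raw.max():.4f} -> restored {restored.max():.4f}")  # spot re-sharpened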

  8. Optimum surface roughness prediction for titanium alloy by adopting response surface methodology

    NASA Astrophysics Data System (ADS)

    Yang, Aimin; Han, Yang; Pan, Yuhang; Xing, Hongwei; Li, Jinze

    Titanium alloy has been widely applied in industrial engineering products due to its advantages of great corrosion resistance and high specific strength. This paper investigates the processing parameters for finish turning of titanium alloy TC11. First, a three-factor central composite design of experiments, considering cutting speed, feed rate and depth of cut, is conducted on titanium alloy TC11 and the corresponding surface roughness values are obtained. A mathematical model is then constructed by response surface methodology to fit the relationship between the process parameters and the surface roughness, and its prediction accuracy is verified by one-way ANOVA. Finally, contour lines of the surface roughness under different combinations of process parameters are obtained and used to predict the optimum surface roughness. Verification experiments demonstrated that the material removal rate (MRR) at the obtained optimum can be significantly improved without sacrificing surface roughness.
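
    A minimal sketch of the response-surface step: fit a full quadratic model in (cutting speed, feed rate, depth of cut) to measured roughness by least squares, then scan the fitted model on a grid for its minimum. The data below are invented placeholders, not the TC11 measurements.

        import numpy as np

        def quad_features(X):
            """Full quadratic basis: 1, linear, interaction and squared terms."""
            v, f, d = X.T
            return np.column_stack([np.ones(len(X)), v, f, d, v*f, v*d, f*d, v**2, f**2, d**2])

        rng = np.random.default_rng(1)
        X = rng.uniform([40, 0.05, 0.2], [120, 0.25, 1.0], size=(20, 3))   # v, f, d
        Ra = 0.3 + 4.0*X[:, 1] + 0.5*X[:, 2] + 10*(X[:, 1] - 0.1)**2 + rng.normal(0, 0.02, 20)

        beta, *_ = np.linalg.lstsq(quad_features(X), Ra, rcond=None)

        grid = np.stack(np.meshgrid(np.linspace(40, 120, 30),
                                    np.linspace(0.05, 0.25, 30),
                                    np.linspace(0.2, 1.0, 30)), -1).reshape(-1, 3)
        best = grid[np.argmin(quad_features(grid) @ beta)]
        print(f"predicted optimum: v={best[0]:.0f} m/min, f={best[1]:.3f} mm/rev, d={best[2]:.2f} mm")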

  9. RF verification tasks underway at the Harris Corporation for multiple aperture reflector system

    NASA Technical Reports Server (NTRS)

    Gutwein, T. A.

    1982-01-01

    Mesh effects on gain and patterns and adjacent-aperture coupling effects for "pie" and circular apertures are discussed. Wire effects for the Harris model were determined, with Langley scale-model results included for assessing D/λ effects, along with wire effects in the presence of adjacent-aperture coupling. Reflector surface distortion effects (pillows and manufacturing roughness) were also studied.

  10. Crystal structure of the DNA polymerase III β subunit (β-clamp) from the extremophile Deinococcus radiodurans.

    PubMed

    Niiranen, Laila; Lian, Kjersti; Johnson, Kenneth A; Moe, Elin

    2015-02-27

    Deinococcus radiodurans is an extremely radiation- and desiccation-resistant bacterium which can tolerate radiation doses up to 5,000 Gray without losing viability. We are studying the role of DNA repair and replication proteins in this unusual phenotype by a structural biology approach. The DNA polymerase III β subunit (β-clamp) acts as a sliding clamp on DNA, promoting the binding and processivity of many DNA-acting proteins, and here we report the crystal structure of the D. radiodurans β-clamp (Drβ-clamp) at 2.0 Å resolution. The sequence verification process revealed that, at the time of the study, the gene encoding Drβ-clamp was wrongly annotated in the genome database, encoding a protein of 393 instead of 362 amino acids. The short protein was successfully expressed, purified and used for crystallisation in complex with Cy5-labeled DNA. The structure, obtained from blue crystals, shows a typical ring-shaped bacterial β-clamp formed of two monomers, each with three domains of identical topology, but with no DNA visible in the electron density. A visualisation of the electrostatic surface potential reveals a highly negatively charged outer surface, while the inner surface and the dimer-forming interface have a more even charge distribution. The structure of Drβ-clamp was thus determined to 2.0 Å resolution and shows an evenly distributed electrostatic surface charge on the DNA-interacting side. We hypothesise that this charge distribution may facilitate efficient movement on encircled DNA and help ensure efficient DNA metabolism in D. radiodurans upon exposure to high doses of ionizing irradiation or desiccation.

  11. Towards SMOS: The 2006 National Airborne Field Experiment Plan

    NASA Astrophysics Data System (ADS)

    Walker, J. P.; Merlin, O.; Panciera, R.; Kalma, J. D.

    2006-05-01

    The 2006 National Airborne Field Experiment (NAFE) is the second in a series of two intensive experiments to be conducted in different parts of Australia. The NAFE'05 experiment was undertaken in the Goulburn River catchment during November 2005, with the objective to provide high resolution data for process level understanding of soil moisture retrieval, scaling and data assimilation. The NAFE'06 experiment will be undertaken in the Murrumbidgee catchment during November 2006, with the objective to provide data for SMOS (Soil Moisture and Ocean Salinity) level soil moisture retrieval, downscaling and data assimilation. To meet this objective, PLMR (Polarimetric L-band Multibeam Radiometer) and supporting instruments (TIR and NDVI) will be flown at an altitude of 10,000 ft AGL to provide 1 km resolution passive microwave data (and 20 m TIR) across a 50 km x 50 km area every 2-3 days. This will both simulate a SMOS pixel and provide the 1 km soil moisture data required for downscaling verification, allowing downscaling and near-surface soil moisture assimilation techniques to be tested with remote sensing data consistent with that from current (MODIS) and planned (SMOS) satellite sensors. Additionally, two transects will be flown across the area to provide both 1 km multi-angular passive microwave data for SMOS algorithm development and, on the same day, 50 m resolution passive microwave data for algorithm verification. The study area contains a total of 13 soil moisture profile and rainfall monitoring sites for assimilation verification, and the transect flight lines are planned to go through 5 of these. Ground monitoring of surface soil moisture and vegetation for algorithm verification will be targeted at these 5 focus farms, with soil moisture measurements made at 250 m spacing for 1 km resolution flights and 50 m spacing for 50 m resolution flights. While this experiment has a particular emphasis on the remote sensing of soil moisture, it is open for collaboration from interested scientists from all disciplines of environmental remote sensing and its application. See www.nafe.unimelb.edu.au for more detailed information on these experiments.

  12. SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jinfeng; Cao, Ruifen; Dai, Yumei

    Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model in the DPM code, extending DPM's ability to calculate arbitrary incident angles and irregular, inhomogeneous fields. Methods: The virtual source and the energy spectrum unfolded from accelerator measurement data were combined with optimized intensity maps to calculate the dose distribution of an irregular, inhomogeneous irradiation field. The accelerator's irradiation source model was replaced by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of each emitter was set by the grid intensity, and its direction by the combination of the virtual source and the emitter's emitting position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating for the contaminant electron source. For verification, measured data and a realistic clinical IMRT plan were compared with DPM dose calculations. Results: The regular field was verified against measured data, with acceptable differences (<2% inside the field, 2-3 mm in the penumbra). The dose calculation of an irregular field by DPM simulation was also compared with that of an FSPB (Finite-Size Pencil Beam) algorithm, and the gamma-analysis passing rate was 95.1% for a peripheral lung cancer case. Both the regular field and the irregular rotational field were within the permitted error range. The computing time for regular fields was less than 2 h, and 160 min for the peripheral lung cancer test; through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted, parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and speed satisfy clinical requirements, and it is expected to serve as a Monte Carlo dose verification tool for IMRT plans. Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03040000); National Natural Science Foundation of China (81101132)
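
    The gamma analysis quoted in the results can be sketched in one dimension (clinical tools work on 2D/3D dose grids, but the acceptance logic is the same); the profiles and the 3%/3 mm criteria below are illustrative.

        import numpy as np

        def gamma_pass_rate(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
            """Fraction of reference points with gamma <= 1 (dd: dose criterion
            relative to the maximum dose; dta: distance-to-agreement in mm)."""
            d_max = dose_ref.max()
            passed = []
            for xi, di in zip(x, dose_ref):
                # gamma at this point: minimum combined dose/distance metric
                g2 = ((dose_eval - di) / (dd * d_max)) ** 2 + ((x - xi) / dta) ** 2
                passed.append(np.sqrt(g2.min()) <= 1.0)
            return np.mean(passed)

        x = np.linspace(0, 100, 201)                 # mm
        ref = np.exp(-((x - 50) / 18) ** 2)          # synthetic beam profile
        ev = np.exp(-((x - 50.8) / 18) ** 2) * 1.01  # slightly shifted, rescaled copy
        print(f"gamma pass rate: {100 * gamma_pass_rate(x, ref, ev):.1f}%")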

  13. Estimation of Ecosystem Parameters of the Community Land Model with DREAM: Evaluation of the Potential for Upscaling Net Ecosystem Exchange

    NASA Astrophysics Data System (ADS)

    Hendricks Franssen, H. J.; Post, H.; Vrugt, J. A.; Fox, A. M.; Baatz, R.; Kumbhar, P.; Vereecken, H.

    2015-12-01

    Estimation of net ecosystem exchange (NEE) by land surface models is strongly affected by uncertain ecosystem parameters and initial conditions. A possible approach is the estimation of plant functional type (PFT)-specific parameters at sites with measurement data like NEE, and application of those parameters at other sites with the same PFT and no measurements. This upscaling strategy was evaluated in this work for sites in Germany and France. Ecosystem parameters and initial conditions were estimated with NEE time series of one year in length, or of only one season. The DREAM(zs) algorithm was used for the estimation of parameters and initial conditions; DREAM(zs) is not limited to Gaussian distributions and can condition on long time series of measurement data simultaneously. DREAM(zs) was used in combination with the Community Land Model (CLM) v4.5. Parameter estimates were evaluated by model predictions at the same site for an independent verification period. In addition, the parameter estimates were evaluated at other, independent sites situated >500 km away with the same PFT. The main conclusions are: i) simulations with estimated parameters reproduced the NEE measurement data better in the verification periods, including the annual NEE sum (23% improvement), the annual NEE cycle and the average diurnal NEE course (error reduction by a factor of 1.6); ii) estimated parameters based on seasonal NEE data outperformed estimated parameters based on yearly data; iii) those seasonal parameters were also often significantly different from their yearly equivalents; iv) estimated parameters were significantly different if initial conditions were estimated together with the parameters. We conclude that estimated PFT-specific parameters improve land surface model predictions significantly at independent verification sites and for independent verification periods, so that their potential for upscaling is demonstrated. However, simulation results also indicate that the estimated parameters possibly mask other model errors, which would imply that their application at climatic time scales would not improve model predictions. A central question is whether the integration of many different data streams (e.g., biomass, remotely sensed LAI) could solve the problems indicated here.

  14. Supersonic gas-liquid cleaning system

    NASA Technical Reports Server (NTRS)

    Caimi, Raoul E. B.; Thaxton, Eric A.

    1994-01-01

    A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.
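
    The converging-diverging nozzle behavior can be illustrated with the isentropic area-Mach relation: given an exit-to-throat area ratio, solve for the supersonic exit Mach number. Treating the gas-liquid mixture as a perfect gas is a strong simplification, and the gamma and area ratio here are illustrative only.

        def area_ratio(M, g=1.4):
            """A/A* as a function of Mach number for a perfect gas."""
            return (1.0 / M) * ((2.0 / (g + 1.0)) * (1.0 + 0.5 * (g - 1.0) * M**2)) ** ((g + 1.0) / (2.0 * (g - 1.0)))

        def supersonic_mach(ar, g=1.4, lo=1.0, hi=10.0):
            """Bisection on the supersonic branch, where A/A* grows with M."""
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if area_ratio(mid, g) < ar:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        print(f"A/A* = 2.0 -> M = {supersonic_mach(2.0):.2f}")  # ~2.20 for gamma = 1.4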

  15. Supersonic gas-liquid cleaning system

    NASA Astrophysics Data System (ADS)

    Caimi, Raoul E. B.; Thaxton, Eric A.

    1994-02-01

    A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mokhov, N. V.; Eidelman, Yu. I.; Rakhno, I. L.

    Comprehensive studies with the MARS15(2016) Monte Carlo code are described on the evaluation of prompt and residual radiation levels induced by nominal and accidental beam losses in the 5-MW, 2-GeV European Spallation Source (ESS) Linac. These are to provide a basis for radiation shielding design verification throughout the accelerator complex. The calculation model is based on the latest engineering design and includes a sophisticated algorithm for particle tracking in the machine RF cavities as well as a well-established model of the beam loss. Substantial effort was put into solving the deep-penetration problem for the thick shielding around the tunnel with its numerous complex penetrations. This allowed us to study in detail not only the prompt dose but also component and air activation, radiation loads on the soil outside the tunnel, and skyshine for the complicated 3-D surface above the machine. Among other things, the newest features in MARS15(2016), such as a ROOT-based beamline builder and a TENDL-based event generator for nuclear interactions below 100 MeV, were very useful in this challenging application.

  17. Combining Mechanistic Approaches for Studying Eco-Hydro-Geomorphic Coupling

    NASA Astrophysics Data System (ADS)

    Francipane, A.; Ivanov, V.; Akutina, Y.; Noto, V.; Istanbullouglu, E.

    2008-12-01

    Vegetation interacts with the hydrology and the geomorphic form and processes of a river basin in profound ways. Despite recent advances in hydrological modeling, the dynamic coupling between these processes is yet to be adequately captured at the basin scale to elucidate key features of process interaction and their role in the organization of vegetation and landscape morphology. In this study, we present a blueprint for integrating a geomorphic component into the physically based, spatially distributed ecohydrological model tRIBS-VEGGIE, which reproduces essential water and energy processes over the complex topography of a river basin and links them to the basic plant life regulatory processes. We present a preliminary design of the integrated modeling framework, in which hillslope and channel erosion processes at the catchment scale will be coupled with vegetation-hydrology dynamics. We evaluate the developed framework by applying the integrated model to Lucky Hills basin, a sub-catchment of the Walnut Gulch Experimental Watershed (Arizona). The evaluation is carried out by comparing sediment yields at the basin outlet, following a detailed verification of the simulated land-surface energy partition, biomass dynamics, and soil moisture states.

  18. Empirical Model for Predicting Rockfall Trajectory Direction

    NASA Astrophysics Data System (ADS)

    Asteriou, Pavlos; Tsiambaos, George

    2016-03-01

    A methodology for the experimental investigation of rockfall in three-dimensional space is presented in this paper, aiming to assist on-going research of the complexity of a block's response to impact during a rockfall. An extended laboratory investigation was conducted, consisting of 590 tests with cubical and spherical blocks made of an artificial material. The effects of shape, slope angle and the deviation of the post-impact trajectory are examined as a function of the pre-impact trajectory direction. Additionally, an empirical model is proposed that estimates the deviation of the post-impact trajectory as a function of the pre-impact trajectory with respect to the slope surface and the slope angle. This empirical model is validated by 192 small-scale field tests, which are also presented in this paper. Some important aspects of the three-dimensional nature of rockfall phenomena are highlighted that have been hitherto neglected. The 3D space data provided in this study are suitable for the calibration and verification of rockfall analysis software that has become increasingly popular in design practice.

  19. Extension of the XGC code for global gyrokinetic simulations in stellarator geometry

    NASA Astrophysics Data System (ADS)

    Cole, Michael; Moritaka, Toseo; White, Roscoe; Hager, Robert; Ku, Seung-Hoe; Chang, Choong-Seock

    2017-10-01

    In this work, the total-f, gyrokinetic particle-in-cell code XGC is extended to treat stellarator geometries. Improvements to meshing tools and the code itself have enabled the first physics studies, including single particle tracing and flux surface mapping in the magnetic geometry of the heliotron LHD and quasi-isodynamic stellarator Wendelstein 7-X. These have provided the first successful test cases for our approach. XGC is uniquely placed to model the complex edge physics of stellarators. A roadmap to such a global confinement modeling capability will be presented. Single particle studies will include the physics of energetic particles' global stochastic motions and their effect on confinement. Good confinement of energetic particles is vital for a successful stellarator reactor design. These results can be compared in the core region with those of other codes, such as ORBIT3d. In subsequent work, neoclassical transport and turbulence can then be considered and compared to results from codes such as EUTERPE and GENE. After sufficient verification in the core region, XGC will move into the stellarator edge region including the material wall and neutral particle recycling.

  20. Electro-thermal modelling of anode and cathode in micro-EDM

    NASA Astrophysics Data System (ADS)

    Yeo, S. H.; Kurnia, W.; Tan, P. C.

    2007-04-01

    Micro-electrical discharge machining (micro-EDM) is an evolution of conventional EDM used for fabricating complex three-dimensional micro-components and microstructures with high precision. However, due to the stochastic nature of the process, it has not been fully understood. This paper proposes an analytical model based on electro-thermal theory to estimate the geometrical dimensions of the micro-crater. The model incorporates voltage, current and pulse-on-time during material removal to predict the temperature distribution on the workpiece resulting from single discharges in micro-EDM. It is assumed that the entire superheated area is ejected from the workpiece surface while only a small fraction of the molten area is expelled. For verification purposes, single-discharge experiments using an RC pulse generator were performed with pure tungsten as the electrode and AISI 4140 alloy steel as the workpiece. For pulse-on-times up to 1000 ns, the experimental and theoretical results are found to be in close agreement, with average volume approximation errors of 2.7% and 6.6% for the anode and cathode, respectively.
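
    A heavily simplified stand-in for such an electro-thermal model, shown only to illustrate the shape of the calculation: treat the discharge as a steady point heat source on a semi-infinite workpiece, so T(r) = T0 + P/(2 pi k r), and take the melt-front radius as the melting-temperature isotherm. The paper's model is richer (pulse duration, superheated ejection), and the heat-partition fraction and material constants below are rough assumed values for steel.

        import numpy as np

        def melt_radius(V, I, f_c, k, T_melt, T0=300.0):
            """Radius (m) of the melting isotherm for a cathode heat input f_c * V * I."""
            P_c = f_c * V * I                       # fraction of discharge power into the cathode
            return P_c / (2.0 * np.pi * k * (T_melt - T0))

        # assumed values: 100 V, 0.5 A, 18% heat partition, steel k = 42 W/m/K, T_melt = 1790 K
        r = melt_radius(V=100.0, I=0.5, f_c=0.18, k=42.0, T_melt=1790.0)
        print(f"melt radius ~ {r * 1e6:.1f} um")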

  1. Toward Improved Land Surface Initialization in Support of Regional WRF Forecasts at the Kenya Meteorological Service (KMS)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.

    2014-01-01

    SPoRT/SERVIR/RCMRD/KMS collaboration: builds on the strengths of each organization. SPoRT: transition of satellite, modeling and verification capabilities; SERVIR-Africa/RCMRD: international capacity-building expertise; KMS: operational organization with regional weather forecasting expertise in East Africa. Hypothesis: improved land-surface initialization over Eastern Africa can lead to better temperature, moisture, and ultimately precipitation forecasts in NWP models. KMS currently initializes the Weather Research and Forecasting (WRF) model with NCEP Global Forecast System (GFS) 0.5-deg initial/boundary condition data. The NASA Land Information System (LIS) will provide much higher-resolution land-surface data at a scale more representative of the regional WRF configuration. Future implementation of real-time NESDIS/VIIRS vegetation fraction will further improve land-surface representativeness.

  2. USB environment measurements based on full-scale static engine ground tests. [Upper Surface Blowing for YC-14

    NASA Technical Reports Server (NTRS)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.

  3. Thermal Pollution Mathematical Model. Volume 6; Verification of Three-Dimensional Free-Surface Model at Anclote Anchorage; [environment impact of thermal discharges from power plants

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1980-01-01

    The free-surface model presented is for tidal estuaries and coastal regions where ambient tidal forces play an important role in the dispersal of heated water. The model is time dependent, three dimensional, and can handle irregular bottom topography. A vertically stretched coordinate is adopted for better treatment of the kinematic condition at the water surface. The results include surface elevation, velocity, and temperature. The model was verified at the Anclote Anchorage site of the Florida Power Company. Two data bases at four tidal stages for winter and summer conditions were used to verify the model. Differences between measured and predicted temperatures average less than 1 °C.

  4. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  5. TH-AB-202-02: Real-Time Verification and Error Detection for MLC Tracking Deliveries Using An Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zwan, B J (Central Coast Cancer Centre, Gosford, NSW); Colvill, E

    2016-06-15

    Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real time: (1) field size, (2) field location and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm² (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors and individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection and the system has been shown to be sensitive to a range of delivery errors.
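
    A sketch of the three frame-by-frame comparison metrics, assuming a binary MLC-aperture mask has already been extracted from each EPID frame; the pixel size and the tolerances below are assumptions loosely based on the accuracies reported above.

        import numpy as np

        def aperture_metrics(mask, mm_per_px=1.0):
            """Field size (cm^2) and centroid (mm) of a binary aperture mask."""
            ys, xs = np.nonzero(mask)
            area_cm2 = mask.sum() * (mm_per_px / 10.0) ** 2
            return area_cm2, (xs.mean() * mm_per_px, ys.mean() * mm_per_px)

        def check_frame(measured, planned, size_tol_cm2=1.25, loc_tol_mm=3.0, shape_tol=0.05):
            """Flag a frame whose size, location or shape drifts beyond tolerance."""
            a_m, c_m = aperture_metrics(measured)
            a_p, c_p = aperture_metrics(planned)
            size_ok = abs(a_m - a_p) <= size_tol_cm2
            loc_ok = np.hypot(c_m[0] - c_p[0], c_m[1] - c_p[1]) <= loc_tol_mm
            shape_ok = np.logical_xor(measured, planned).sum() / planned.sum() <= shape_tol
            return size_ok and loc_ok and shape_ok

        planned = np.zeros((100, 100), dtype=bool)
        planned[30:70, 25:75] = True              # 20 cm^2 rectangular aperture
        measured = np.roll(planned, 4, axis=1)    # leaf bank tracking 4 mm off in X
        print(check_frame(measured, planned))     # False: location error exceeds 3 mm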

  6. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

    Today, implanted medical devices are increasingly used for many patients and for diverse health problems. However, several runtime problems and errors are reported by the relevant organizations, some even resulting in patient death. One such device is the pacemaker, which helps the patient regulate the heartbeat by connecting to the cardiac vessels. The device is directed by its software, so any failure in this software causes a serious malfunction. This study therefore seeks a better way to monitor the device's software behavior in order to decrease the failure risk. Accordingly, we supervise the runtime function and status of the software. Software verification here means examining the limitations and needs of the system's users against the running software. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified with a hierarchical Fuzzy Colored Petri-net (HFCPN) formed from the software's limits. Building on our previous studies, which used 1) fuzzy Petri-nets (FPN) to verify insulin pumps, 2) colored Petri-nets (CPN) to verify the pacemaker, and 3) a software agent with Petri-net-based knowledge to verify the pacemaker, the runtime behavior of the pacemaker software is examined here with HFCPN. This is a step forward compared to the earlier work: the HFCPN reduces the complexity of the FPN and CPN used in our previous studies, and presenting the Petri-net (PN) in hierarchical form decreased the verification runtime by 90.61% compared to the earlier work. Since runtime verification requires an inference engine, we used the HFCPN to enhance the performance of the inference engine.

  7. Crewed Space Vehicle Battery Safety Requirements

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.; Darcy, Eric C.

    2014-01-01

    This requirements document is applicable to all batteries on crewed spacecraft, including vehicle, payload, and crew equipment batteries. It defines the specific provisions required to design a battery that is safe for ground personnel and crew members to handle and/or operate during all applicable phases of crewed missions, safe for use in the enclosed environment of a crewed space vehicle, and safe for use in launch vehicles, as well as in unpressurized spaces adjacent to the habitable portion of a space vehicle. The required provisions encompass hazard controls, design evaluation, and verification. The extent of the hazard controls and verification required depends on the applicability and credibility of the hazard to the specific battery design and applicable missions under review. Evaluation of the design and verification program results shall be completed prior to certification for flight and ground operations. This requirements document is geared toward the designers of battery systems to be used in crewed vehicles, crew equipment, crew suits, or batteries to be used in crewed vehicle systems and payloads (or experiments). This requirements document also applies to ground handling and testing of flight batteries. Specific design and verification requirements for a battery are dependent upon the battery chemistry, capacity, complexity, charging, environment, and application. The variety of battery chemistries available, combined with the variety of battery-powered applications, results in each battery application having specific, unique requirements pertinent to the specific battery application. However, there are basic requirements for all battery designs and applications, which are listed in section 4. Section 5 includes a description of hazards and controls and also includes requirements.

  8. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of the fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
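
    The ETAS-style smoothing can be sketched directly: each simulated (on-fault) event spreads its rate over the whole test grid with a power-law decay in epicentral distance, so the resulting rate map can be scored against off-fault observed epicenters. The exponent, distance scale and grid below are assumptions, not the paper's calibrated values.

        import numpy as np

        def smoothed_rate_map(sim_xy, grid_xy, q=1.5, d0=5.0):
            """Sum a power-law kernel (1 + r/d0)^(-q) from each simulated event."""
            rates = np.zeros(len(grid_xy))
            for ex, ey in sim_xy:
                r = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
                rates += (1.0 + r / d0) ** (-q)
            return rates / rates.sum()

        # illustrative: events on a straight "fault", scored on a 50 km x 50 km grid
        sim_xy = np.column_stack([np.linspace(5, 45, 9), np.full(9, 25.0)])
        gx, gy = np.meshgrid(np.arange(0.5, 50, 1.0), np.arange(0.5, 50, 1.0))
        rates = smoothed_rate_map(sim_xy, np.column_stack([gx.ravel(), gy.ravel()]))
        # one ROC-style point: forecast rate captured by the top 10% highest-rate cells
        top = np.argsort(rates)[::-1][: len(rates) // 10]
        print(f"top 10% of cells carry {100 * rates[top].sum():.0f}% of forecast rate")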

  9. Development of an inpatient operational pharmacy productivity model.

    PubMed

    Naseman, Ryan W; Lopez, Ben R; Forrey, Ryan A; Weber, Robert J; Kipp, Kris M

    2015-02-01

    An innovative model for measuring the operational productivity of medication order management in inpatient settings is described. Order verification within a computerized prescriber order-entry system was chosen as the pharmacy workload driver. To account for inherent variability in the tasks involved in processing different types of orders, pharmaceutical products were grouped by class, and each class was assigned a time standard, or "medication complexity weight," reflecting the intensity of pharmacist and technician activities (verification of drug indication, verification of appropriate dosing, adverse-event prevention and monitoring, medication preparation, product checking, product delivery, returns processing, nurse/provider education, and problem-order resolution). The resulting "weighted verifications" (WV) model allows productivity monitoring by job function (pharmacist versus technician) to guide hiring and staffing decisions. A 9-month historical sample of verified medication orders was analyzed using the WV model, and the calculations were compared with values derived from two established models: one based on the Case Mix Index (CMI) and the other based on the proprietary Pharmacy Intensity Score (PIS). Evaluation of Pearson correlation coefficients indicated that values calculated using the WV model were highly correlated with those derived from the CMI- and PIS-based models (r = 0.845 and 0.886, respectively). Relative to the comparator models, the WV model offered the advantage of less period-to-period variability. The WV model yielded productivity data that correlated closely with values calculated using two validated workload management models. The model may be used as an alternative measure of pharmacy operational productivity. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
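
    The arithmetic of the WV model reduces to a weighted sum of verified orders per worked hour. Below is a hedged sketch of that computation; the product classes, weights, and shift data are hypothetical, not the published time standards.

        # Assumed complexity weights (arbitrary units of effort per verified order).
        MEDICATION_COMPLEXITY_WEIGHT = {
            "oral_solid": 1.0,
            "iv_admixture": 3.5,
            "chemotherapy": 6.0,
        }

        def weighted_verifications(verified_orders, worked_hours):
            """Productivity = weighted verification workload per worked hour."""
            wv = sum(MEDICATION_COMPLEXITY_WEIGHT[cls] * n
                     for cls, n in verified_orders.items())
            return wv / worked_hours

        # e.g. one shift: 400 oral solids, 80 IV admixtures, 10 chemo orders, 24 h worked
        print(weighted_verifications(
            {"oral_solid": 400, "iv_admixture": 80, "chemotherapy": 10}, 24.0))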

  10. Verification of Algebra Step Problems: A Chronometric Study of Human Problem Solving. Technical Report No. 253. Psychology and Education Series.

    ERIC Educational Resources Information Center

    Matthews, Paul G.; Atkinson, Richard C.

    This paper reports an experiment designed to test theoretical relations among fast problem solving, more complex and slower problem solving, and research concerning fundamental memory processes. Using a cathode ray tube, subjects were presented with propositions of the form "Y is in list X" which they memorized. In later testing they were asked to…

  11. Assimilation of GOES Land Surface Data into a Mesoscale Model

    NASA Technical Reports Server (NTRS)

    Lapenta, William M.; Suggs, Ron; McNider, Richard T.; Jedlovec, Gary; Dembek, Scott; Goodman, H. Michael (Technical Monitor)

    2001-01-01

    A technique has been developed for assimilating Geostationary Operational Environmental Satellite (GOES)-derived skin temperature tendencies and insolation into the surface energy budget equation of a mesoscale model so that the simulated rate of temperature change closely agrees with the satellite observations. A critical assumption of the technique is that the availability of moisture (either from the soil or vegetation) is the least known term in the model's surface energy budget. Therefore, the simulated latent heat flux, which is a function of surface moisture availability, is adjusted based upon differences between the modeled and satellite-observed skin temperature tendencies. An advantage of this technique is that satellite temperature tendencies are assimilated in an energetically consistent manner that avoids energy imbalances and surface stability problems that arise from direct assimilation of surface shelter temperatures. The fact that the rate of change of the satellite skin temperature is used, rather than the absolute temperature, means that sensor calibration is not as critical. The assimilation technique has been applied to the Oklahoma-Kansas region during the spring-summer 2000 time period, when dynamic changes in vegetation cover occur. In April, central Oklahoma is characterized by large NDVI associated with winter wheat, while surrounding areas are primarily rangeland with lower NDVI. In July the vegetation pattern reverses as the central wheat area changes to low NDVI due to harvesting and the surrounding rangeland is greener than it was in April. The goal of this study is to determine if assimilating satellite land surface data can improve simulation of the complex spatial distribution of surface energy and water fluxes across this region. The PSU/NCAR MM5 V3 system is used in this study. The grid configuration consists of a 36-km CONUS domain and a 12-km nest over the area of interest. Bulk verification statistics (bias and RMSE) of surface air temperature and dewpoint indicate that assimilation of the satellite data reduces both the bias and RMSE for both state variables. In addition, comparison of model data with ARM/CART EBBR flux observations reveals that the assimilation technique adjusts the Bowen ratio in a realistic fashion.
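
    The core of the assimilation technique is a nudge of the model's surface moisture availability in proportion to the mismatch between modeled and GOES-observed skin temperature tendencies. A minimal sketch follows; the gain and clipping bounds are illustrative assumptions, not values from the study.

        def adjust_moisture_availability(m, dtdt_model, dtdt_goes, gain=0.05):
            """If the model warms faster than the satellite observes, raise moisture
            availability (more latent and less sensible heat flux); if it warms
            more slowly, lower it. m is kept in the physical range [0, 1]."""
            m_new = m + gain * (dtdt_model - dtdt_goes)   # tendencies in K/h
            return min(max(m_new, 0.0), 1.0)

        m = 0.3
        m = adjust_moisture_availability(m, dtdt_model=2.4, dtdt_goes=1.8)  # model too warm, m rises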

  12. Handling Qualities of Model Reference Adaptive Controllers with Varying Complexity for Pitch-Roll Coupled Failures

    NASA Technical Reports Server (NTRS)

    Schaefer, Jacob; Hanson, Curt; Johnson, Marcus A.; Nguyen, Nhan

    2011-01-01

    Three model reference adaptive controllers (MRACs) with varying levels of complexity were evaluated on a high performance jet aircraft and compared along with a baseline nonlinear dynamic inversion controller. The handling qualities and performance of the controllers were examined during failure conditions that induce coupling between the pitch and roll axes. Results from flight tests showed that, with a roll-to-pitch input coupling failure, the handling qualities went from Level 2 with the baseline controller to Level 1 with the most complex MRAC tested. A failure scenario with the left stabilator frozen also showed improvement with the MRAC. Improvement in performance and handling qualities was generally seen as complexity was incrementally added; however, added complexity usually corresponds to increased verification and validation effort required for certification. The tradeoff between complexity and performance is thus important to a control system designer when implementing an adaptive controller on an aircraft. This paper investigates this relation through flight testing of several controllers of varying complexity.
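
    For readers unfamiliar with MRAC, the sketch below shows the simplest (scalar, Lyapunov-rule) form of the idea; it is a textbook illustration, not any of the flight-tested controllers above. Even this minimal form introduces nonlinear, state-dependent gains, which is one reason each added layer of adaptive complexity raises the verification and validation burden noted in the abstract.

        import numpy as np

        a, b = 1.0, 3.0          # unknown unstable plant: xdot = a*x + b*u (sign of b assumed known)
        am, bm = -4.0, 4.0       # stable reference model: xmdot = am*xm + bm*r
        gamma, dt = 2.0, 0.001   # adaptation gain and integration step

        x = xm = kx = kr = 0.0
        for i in range(20000):
            t = i * dt
            r = 1.0 if (t % 4.0) < 2.0 else -1.0          # square-wave reference
            u = kx * x + kr * r                           # adaptive control law
            e = x - xm                                    # model-tracking error
            kx += dt * (-gamma * x * e * np.sign(b))      # Lyapunov-based update laws
            kr += dt * (-gamma * r * e * np.sign(b))
            x += dt * (a * x + b * u)                     # forward-Euler integration
            xm += dt * (am * xm + bm * r)
        print(f"final tracking error {e:+.4f}, kx={kx:+.3f}, kr={kr:+.3f}")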

  13. Evaluation and clinical implementation of in vivo dosimetry for kV radiotherapy using radiochromic film and micro-silica bead thermoluminescent detectors.

    PubMed

    Palmer, Antony L; Jafari, Shakardokht M; Mone, Ioanna; Muscat, Sarah

    2017-10-01

    kV radiotherapy treatment calculations are based on flat, homogeneous, full-scatter reference conditions. However, clinical treatments often include surface irregularities and inhomogeneities, causing uncertainty. Therefore, confirmation of the actual delivered doses in vivo is valuable. The current study evaluates and implements radiochromic film and micro-silica bead TLDs for in vivo kV dosimetry. The kV energy and dose responses of EBT3 film and silica bead TLDs were established and uncertainty budgets determined. In vivo dosimetry measurements were made for a consecutive series of 30 patients using the two dosimetry systems. Energy-dependent calibration factors were required for both dosimetry systems. The standard uncertainty estimate for in vivo measurement with film was 1.7% and for beads was 1.5%. The mean measured dose deviation from prescription was -2.1% for film and -2.6% for beads. Deviations up to -9% were found in cases of large surface irregularity, or with underlying air cavities or bone. Dose shielding by beads could be clinically relevant at low kV energies and superficial depths. Both film and beads may be used to provide in vivo verification of delivered doses in kV radiotherapy, particularly for complex situations that are not well represented by standard reference condition calculations. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  14. Verification and Analysis of Formulation 4 of Langley for the Study of Noise From High Speed Surfaces

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Farris, Mark

    1999-01-01

    There are several approaches to the prediction of the noise from sources on high speed surfaces. Two of these are the Kirchhoff and the Ffowcs Williams-Hawkings methods. It can be shown that both of these methods depend on the solution of the wave equation with mathematically similar inhomogeneous source terms. Two subsonic solutions, known as Formulations 1 and 1A of Langley, are simple and efficient for noise prediction. The supersonic solution known as Formulation 3 is very complicated and difficult to code. Because of the complexity of the result, the computation time is longer than for the subsonic formulas. Furthermore, it is difficult to assess the accuracy of noise prediction. We have been searching for a new and simpler supersonic formulation without these shortcomings. At the last AIAA Aeroacoustics Conference in Toulouse, Farassat, Dunn and Brentner presented a paper in which such a result was presented and called Formulation 4 of Langley. In this paper we present two analytic tests of the validity of this formulation: 1) the noise from a dipole distribution on the unit circle whose strength varies radially with the square of the distance from the center, and 2) the noise from a dipole distribution on the unit sphere whose strength varies with the cosine of the angle from the polar axis. We also discuss the question of singularities of Formulation 4.
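
    Analytic test cases of this kind can be cross-checked by direct quadrature of the elementary dipole-sheet integral. The sketch below does this for a configuration like test 1, assuming a stationary, z-oriented dipole sheet on the unit disc with strength q(r) = r², radiating at a single wavenumber; it illustrates only the benchmark integral such cases are checked against, not Formulation 4 itself.

        import numpy as np

        def dipole_disc_pressure(obs, k=2 * np.pi, nr=200, nth=200):
            """Pressure at obs from a z-oriented dipole sheet on the unit disc."""
            r = (np.arange(nr) + 0.5) / nr                  # radial quadrature nodes
            th = 2 * np.pi * (np.arange(nth) + 0.5) / nth   # azimuthal nodes
            R_, T_ = np.meshgrid(r, th)
            src = np.stack([R_ * np.cos(T_), R_ * np.sin(T_), np.zeros_like(R_)], -1)
            d = obs - src
            R = np.linalg.norm(d, axis=-1)
            # d/dz of the free-space Green's function e^{ikR}/(4 pi R)
            g_n = (1j * k - 1.0 / R) * np.exp(1j * k * R) / (4 * np.pi * R) * d[..., 2] / R
            q = R_ ** 2                                     # dipole strength ~ r^2
            dS = R_ * (1.0 / nr) * (2 * np.pi / nth)        # polar area element
            return np.sum(q * g_n * dS)

        p = dipole_disc_pressure(np.array([0.0, 0.0, 10.0]))  # on-axis observer
        print(abs(p))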

  15. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  16. Fault management for the Space Station Freedom control center

    NASA Technical Reports Server (NTRS)

    Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet

    1992-01-01

    This paper describes model-based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and amenability to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
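
    The digraph failure-model idea can be illustrated with a small reachability sketch: edges point from a failure source to the effects it can propagate to, and candidate root causes are the nodes that reach every observed fault indication. The model below is a hypothetical toy, not the NASA tool.

        def reachable(graph, start):
            """All nodes reachable from start, including start itself."""
            seen, stack = set(), [start]
            while stack:
                n = stack.pop()
                if n not in seen:
                    seen.add(n)
                    stack.extend(graph.get(n, ()))
            return seen

        def candidate_sources(graph, indications):
            """Nodes whose propagation cone covers every fault indication."""
            nodes = set(graph) | {v for vs in graph.values() for v in vs}
            return {n for n in nodes if set(indications) <= reachable(graph, n)}

        # toy failure model: a pump or valve fault propagates to low flow, then high temperature
        g = {"pump": ["low_flow"], "valve": ["low_flow"],
             "low_flow": ["high_temp"], "sensor_bias": ["high_temp"]}
        print(candidate_sources(g, ["low_flow", "high_temp"]))  # {'pump', 'valve', 'low_flow'}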

  17. Rendezvous Integration Complexities of NASA Human Flight Vehicles

    NASA Technical Reports Server (NTRS)

    Brazzel, Jack P.; Goodman, John L.

    2009-01-01

    Propellant-optimal trajectories, relative sensors and navigation, and docking/capture mechanisms are rendezvous disciplines that receive much attention in the technical literature. However, other areas must be considered. These include absolute navigation, maneuver targeting, attitude control, power generation, software development and verification, redundancy management, thermal control, avionics integration, robotics, communications, lighting, human factors, crew timeline, procedure development, orbital debris risk mitigation, structures, plume impingement, logistics, and in some cases extravehicular activity. While current and future spaceflight programs will introduce new technologies and operations concepts, the complexity of integrating multiple systems on multiple spacecraft will remain. The systems integration task may become more difficult as increasingly complex software is used to meet current and future automation, autonomy, and robotic operation requirements.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanz Rodrigo, Javier; Chávez Arroyo, Roberto Aurelio; Moriarty, Patrick

    The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer structure with unique physics. This poses significant challenges to traditional wind engineering models that rely on surface-layer theories and engineering wind farm models to simulate the flow in and around wind farms. However, adopting an ABL approach offers the opportunity to better integrate wind farm design tools and meteorological models. The challenge is how to build the bridge between atmospheric and wind engineering model communities and how to establish a comprehensive evaluation process that identifies relevant physical phenomena for wind energy applications with modeling and experimental requirements. A framework for model verification, validation, and uncertainty quantification is established to guide this process by a systematic evaluation of the modeling system at increasing levels of complexity. In terms of atmospheric physics, 'building the bridge' means developing models for the so-called 'terra incognita,' a term used to designate the turbulent scales that transition from mesoscale to microscale. This range of scales within atmospheric research deals with the transition from parameterized to resolved turbulence and the improvement of surface boundary-layer parameterizations. The coupling of meteorological and wind engineering flow models and the definition of a formal model evaluation methodology are strong areas of research for the next generation of wind conditions assessment and wind farm and wind turbine design tools. Some fundamental challenges are identified in order to guide future research in this area.

  19. VERIFICATION OF SURFACE LAYER OZONE FORECASTS IN THE NOAA/EPA AIR QUALITY FORECAST SYSTEM IN DIFFERENT REGIONS UNDER DIFFERENT SYNOPTIC SCENARIOS

    EPA Science Inventory

    An air quality forecast (AQF) system has been established at NOAA/NCEP since 2003 as a collaborative effort of NOAA and EPA. The system is based on NCEP's Eta mesoscale meteorological model and EPA's CMAQ air quality model (Davidson et al, 2004). The vision behind this system is ...

  20. EMC MODEL FORECAST VERIFICATION STATS

    Science.gov Websites

    Forecast verification statistics pages with loops from 48-h to 84-h forecasts: 500 mb height bias and RMSE over CONUS (with sub-regions), valid at 00Z, and surface wind vector bias and RMSE by region, valid at 00Z and 12Z (still and loop), for the GMC (Gulf of Mexico Coast), SEC (Southeast Coast), and NEC (Northeast Coast) regions.

  1. 40 CFR 1066.225 - Roll runout and diameter verification procedure.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... section. (2) Measure roll diameter using a Pi Tape®. Orient the Pi Tape® to the marker line at the desired measurement location with the Pi Tape® hook pointed outward. Temporarily secure the Pi Tape® to the roll near the hook end with adhesive tape. Slowly turn the roll, wrapping the Pi Tape® around the roll surface...

  2. 40 CFR 1066.225 - Roll runout and diameter verification procedure.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Measure roll diameter using a Pi Tape®. Orient the Pi Tape® to the marker line at the desired measurement location with the Pi Tape® hook pointed outward. Temporarily secure the Pi Tape® to the roll near the hook end with adhesive tape. Slowly turn the roll, wrapping the Pi Tape® around the roll surface. Ensure...

  3. 40 CFR 1066.225 - Roll runout and diameter verification procedure.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... section. (2) Measure roll diameter using a Pi Tape®. Orient the Pi Tape® to the marker line at the desired measurement location with the Pi Tape® hook pointed outward. Temporarily secure the Pi Tape® to the roll near the hook end with adhesive tape. Slowly turn the roll, wrapping the Pi Tape® around the roll surface...

  4. Supersonic Gas-Liquid Cleaning System

    NASA Technical Reports Server (NTRS)

    Kinney, Frank

    1996-01-01

    The Supersonic Gas-Liquid Cleaning System Research Project consisted mainly of a feasibility study, including theoretical and engineering analysis, of a proof-of-concept prototype of this particular cleaning system developed by NASA-KSC. The cleaning system utilizes gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the device to be cleaned. The cleaning fluid being accelerated to these high velocities may consist of any solvent or liquid, including water. Compressed air or any inert gas is used to provide the conveying medium for the liquid, as well as substantially reduce the total amount of liquid needed to perform adequate surface cleaning and cleanliness verification. This type of aqueous cleaning system is considered to be an excellent way of conducting cleaning and cleanliness verification operations as replacements for the use of CFC 113 which must be discontinued by 1995. To utilize this particular cleaning system in various cleaning applications for both the Space Program and the commercial market, it is essential that the cleaning system, especially the supersonic nozzle, be characterized for such applications. This characterization consisted of performing theoretical and engineering analysis, identifying desirable modifications/extensions to the basic concept, evaluating effects of variations in operating parameters, and optimizing hardware design for specific applications.

  5. Figures of Merit for Control Verification

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.
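
    The central computation, sizing the largest uncertainty set for which the specifications hold, can be sketched as a bisection on a scale factor applied to a parameter box. The plant, the damping specification, and the corner-sampling scheme below are illustrative assumptions, not the paper's metrics; corner checks suffice here because the example spec degrades monotonically toward one corner, but they are not sufficient in general.

        import itertools
        import numpy as np

        def spec_ok(a, b, kp=4.0, kd=2.0):
            """Closed loop s^2 + (a+kd)s + (b+kp): require stability and damping >= 0.4."""
            c1, c0 = a + kd, b + kp
            return c1 > 0 and c0 > 0 and c1 / (2 * np.sqrt(c0)) >= 0.4

        def largest_box(nominal, half_widths, tol=1e-3):
            """Bisect on the homothetic scale factor of the uncertainty box."""
            lo, hi = 0.0, 4.0
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                corners = itertools.product(*[(n - mid * w, n + mid * w)
                                              for n, w in zip(nominal, half_widths)])
                if all(spec_ok(a, b) for a, b in corners):
                    lo = mid          # spec holds at all sampled corners: grow the box
                else:
                    hi = mid
            return lo

        print(largest_box(nominal=(1.0, 1.0), half_widths=(0.5, 0.5)))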

  6. On the use of advanced numerical models for the evaluation of dosimetric parameters and the verification of exposure limits at workplaces.

    PubMed

    Catarinucci, L; Tarricone, L

    2009-12-01

    With the forthcoming transposition of the 2004/40/EC Directive, employers will become responsible for the electromagnetic field levels at the workplace. To make this task easier, the scientific community is compiling practical guidelines to be followed. This work aims at enriching such guidelines, especially for dosimetric issues. More specifically, some critical aspects related to the application of numerical dosimetric techniques for the verification of safety limit compliance have been highlighted. In particular, three different aspects have been considered: the dependence of the dosimetric parameters on the shape and inner characterisation of the exposed subject, their dependence on the numerical algorithm used, and the correlation between reference limits and basic restrictions. Results and discussions demonstrate how, even when using sophisticated numerical techniques, in some cases a complex interpretation of the results is mandatory.

  7. The selected reaction monitoring/multiple reaction monitoring-based mass spectrometry approach for the accurate quantitation of proteins: clinical applications in the cardiovascular diseases.

    PubMed

    Gianazza, Erica; Tremoli, Elena; Banfi, Cristina

    2014-12-01

    Selected reaction monitoring, also known as multiple reaction monitoring, is a powerful targeted mass spectrometry approach for confident quantitation of proteins/peptides in complex biological samples. In recent years, its optimization and application have become pivotal and of great interest in clinical research to derive useful outcomes for patient care. Thus, selected reaction monitoring/multiple reaction monitoring is now used as a highly sensitive and selective method for the evaluation of protein abundances and biomarker verification, with potential applications in medical screening. This review describes technical aspects of the development of a robust multiplex assay and discusses its recent applications in cardiovascular proteomics: verification of promising disease candidates to select only the highest quality peptides/proteins for preclinical validation, as well as quantitation of protein isoforms and post-translational modifications.

  8. EVA Design, Verification, and On-Orbit Operations Support Using Worksite Analysis

    NASA Technical Reports Server (NTRS)

    Hagale, Thomas J.; Price, Larry R.

    2000-01-01

    The International Space Station (ISS) design is a very large and complex orbiting structure with thousands of Extravehicular Activity (EVA) worksites. These worksites are used to assemble and maintain the ISS. The challenge facing EVA designers was how to design, verify, and operationally support such a large number of worksites within cost and schedule. This has been solved through the practical use of computer aided design (CAD) graphical techniques that have been developed and used with a high degree of success over the past decade. The EVA design process allows analysts to work concurrently with hardware designers so that EVA equipment can be incorporated and structures configured to allow for EVA access and manipulation. Compliance with EVA requirements is strictly enforced during the design process. These techniques and procedures, coupled with neutral buoyancy underwater testing, have proven most valuable in the development, verification, and on-orbit support of planned or contingency EVA worksites.

  9. First Order Reliability Application and Verification Methods for Semistatic Structures

    NASA Technical Reports Server (NTRS)

    Verderaime, Vincent

    1994-01-01

    Escalating risks of aerostructures, stimulated by increasing size, complexity, and cost, should no longer be ignored by conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments, its stress audits are shown to be arbitrary and incomplete, and it compromises the performance of high-strength materials. A reliability method is proposed which combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulated and propagated design uncertainty errors are defined and appropriately implemented into the classical safety index expression. The application is reduced to solving for a factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the pace of semistatic structural designs.
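
    A minimal sketch of the central step, assuming normally distributed stress and strength, shows how a specified reliability (safety index β) can be converted into a deterministic factor used in place of the conventional safety factor; the scatter values are illustrative.

        from math import sqrt

        def required_factor(beta, cov_strength, cov_stress):
            """Solve beta = (k - 1) / sqrt((k*cR)^2 + cS^2) for the central factor k."""
            cR, cS = cov_strength, cov_stress
            # beta^2 (k^2 cR^2 + cS^2) = (k - 1)^2  ->  quadratic in k; take the root > 1
            a = 1.0 - beta**2 * cR**2
            c = 1.0 - beta**2 * cS**2
            return (2.0 + sqrt(4.0 - 4.0 * a * c)) / (2.0 * a)

        # e.g. safety index 3.72 (~0.9999 reliability), 5% strength and 10% stress scatter
        print(required_factor(3.72, 0.05, 0.10))   # ~1.46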

  10. On Crowd-verification of Biological Networks

    PubMed Central

    Ansari, Sam; Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Hayes, William; Hoeng, Julia; Iskandar, Anita; Kleiman, Robin; Norel, Raquel; O’Neel, Bruce; Peitsch, Manuel C.; Poussin, Carine; Pratt, Dexter; Rhrissorrakrai, Kahn; Schlage, Walter K.; Stolovitzky, Gustavo; Talikka, Marja

    2013-01-01

    Biological networks with a structured syntax are a powerful way of representing biological information generated from high-density data; however, they can become unwieldy to manage as their size and complexity increase. This article presents a crowd-verification approach for the visualization and expansion of biological networks. Web-based graphical interfaces allow visualization of causal and correlative biological relationships represented using Biological Expression Language (BEL). Crowdsourcing principles enable participants to communally annotate these relationships based on literature evidence. Gamification principles are incorporated to further engage domain experts throughout biology to gather robust peer-reviewed information from which relationships can be identified and verified. The resulting network models will represent the current status of biological knowledge within the defined boundaries, here processes related to human lung disease. These models are amenable to computational analysis. For some period following the conclusion of the challenge, the published models will remain available for continuous use and expansion by the scientific community. PMID:24151423

  11. LIHE Spectral Dynamics and Jaguar Data Acquisition System Measurement Assurance Results 2014.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covert, Timothy T.; Willis, Michael David; Radtke, Gregg Arthur

    2015-06-01

    The Light Initiated High Explosive (LIHE) facility performs high rigor, high consequence impulse testing for the nuclear weapons (NW) community. To support the facility mission, LIHE's extensive data acquisition system (DAS) is comprised of several discrete components as well as a fully integrated system. Due to the high consequence and high rigor of the testing performed at LIHE, a measurement assurance plan (MAP) was developed in collaboration with NW system customers to meet their data quality needs and to provide assurance of the robustness of the LIHE DAS. While individual components of the DAS have been calibrated by the SNL Primary Standards Laboratory (PSL), the integrated nature of this complex system requires verification of the complete system, from end to end. This measurement assurance plan (MAP) report documents the results of verification and validation procedures used to ensure that the data quality meets customer requirements.

  12. Gender verification: a term whose time has come and gone.

    PubMed

    Hercher, Laura

    2010-12-01

    The process of testing to determine gender in putatively female athletes was developed in order to prevent cheating, but has devolved instead into a clumsy mechanism for detecting disorders of sexual development (DSDs). In over thirty years of compulsory testing, individuals with DSDs have been stigmatized and some have been denied the right to compete, although frequently their condition provided no competitive advantage. More recent guidelines require testing only on a case-by-case basis; the South African runner Caster Semenya was the first major test of this policy, and her experience points to the need for a more sensitive and confidential process. In addition, her case dramatizes the inadequacy of the term "gender verification." Gender identity is a complex entity and resists simple classification. Sports authorities may set guidelines for who can compete, but they should refrain from taking on themselves the authority to decide who is and who is not a female.

  13. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, final technical report, March 2016, covering June 2012 - September 2015 (contract FA8750...). Abstract: Over the more than three years of the project Verification Games: Crowd-sourced...

  14. Anatomy-corresponding method of IMRT verification.

    PubMed

    Winiecki, Janusz; Zurawski, Zbigniew; Drzewiecka, Barbara; Slosarek, Krzysztof

    2010-01-01

    Even when dMLC plans are executed properly, an undesired but frequent effect occurs in which the dose locally accumulated by tissue differs significantly from that expected. Conventional dosimetric QA procedures give only a partial picture of the quality of IMRT treatment, because their solely quantitative outcomes usually correspond more to the total area of the detector than to the actually irradiated volume. The aim of this investigation was to develop a procedure of dynamic plan verification able to visualize potential anomalies of dose distribution and specify which tissue they refer to. The paper presents a method developed and clinically examined in our department. It is based on the Gamma Evaluation concept and allows accurate localization of deviations between predicted and acquired dose distributions, registered by both portal and film dosimetry. All calculations were performed with the in-house software GammaEval; γ-images (2-dimensional distributions of γ-values) and γ-histograms were created as quantitative outcomes of verification. Over 150 maps of dose distribution have been analyzed, and cross-examination of the γ-images with DRRs was performed. It seems that complex monitoring of treatment would be possible owing to the images obtained by cross-examining γ-images with the corresponding DRRs.
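
    The underlying Gamma Evaluation can be condensed to a short routine. The following is a generic 2D gamma-index sketch of the concept GammaEval implements, not that program's code; the 3%/3 mm criteria and the search window are the common defaults, used here as assumptions.

        import numpy as np

        def gamma_map(ref, meas, pixel_mm=1.0, dd=0.03, dta_mm=3.0, search_mm=9.0):
            """Per-pixel gamma index between reference and measured dose maps."""
            w = int(search_mm / pixel_mm)
            ny, nx = ref.shape
            out = np.zeros_like(meas, dtype=float)
            norm = dd * ref.max()                    # global dose-difference criterion
            for j in range(ny):
                for i in range(nx):
                    j0, j1 = max(j - w, 0), min(j + w + 1, ny)
                    i0, i1 = max(i - w, 0), min(i + w + 1, nx)
                    jj, ii = np.mgrid[j0:j1, i0:i1]
                    r2 = ((jj - j) ** 2 + (ii - i) ** 2) * pixel_mm ** 2
                    d2 = (ref[j0:j1, i0:i1] - meas[j, i]) ** 2
                    out[j, i] = np.sqrt((r2 / dta_mm ** 2 + d2 / norm ** 2).min())
            return out

        # toy check: the "measured" field is the reference shifted by one pixel
        y, x = np.mgrid[0:64, 0:64]
        ref = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
        meas = np.exp(-((x - 33) ** 2 + (y - 32) ** 2) / 200.0)
        g = gamma_map(ref, meas)
        print(f"gamma pass rate: {100 * (g <= 1).mean():.1f}%")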

  15. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    NASA Astrophysics Data System (ADS)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. The seven forecast variables (temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate) are treated as categorical variables for verifying the integrated weighted forecasts. Verification of forecasts from INTW and the NWP models at 15 sites showed that the integrated weighted model produced more accurate forecasts for the seven selected variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
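
    The multi-categorical Heidke skill score used for this comparison is computed from a contingency table of forecast versus observed categories. A minimal sketch follows; the three-category table is invented for illustration.

        import numpy as np

        def heidke_skill_score(table):
            """table[i, j] = count of cases forecast in category i and observed in j."""
            n = table.sum()
            pc = np.trace(table) / n                                    # proportion correct
            pe = (table.sum(axis=0) * table.sum(axis=1)).sum() / n**2   # chance agreement
            return (pc - pe) / (1.0 - pe)

        # three wind-speed categories: rows = forecast, columns = observed
        t = np.array([[50, 10, 2],
                      [12, 30, 8],
                      [3, 9, 26]])
        print(f"HSS = {heidke_skill_score(t):.3f}")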

  16. Strand-specific Recognition of DNA Damages by XPD Provides Insights into Nucleotide Excision Repair Substrate Versatility*

    PubMed Central

    Buechner, Claudia N.; Heil, Korbinian; Michels, Gudrun; Carell, Thomas; Kisker, Caroline; Tessmer, Ingrid

    2014-01-01

    Recognition and removal of DNA damages is essential for cellular and organismal viability. Nucleotide excision repair (NER) is the sole mechanism in humans for the repair of carcinogenic UV irradiation-induced photoproducts in the DNA, such as cyclobutane pyrimidine dimers. The broad substrate versatility of NER further includes, among others, various bulky DNA adducts. It has been proposed that the 5′-3′ helicase XPD (xeroderma pigmentosum group D) protein plays a decisive role in damage verification. However, despite recent advances such as the identification of a DNA-binding channel and central pore in the protein, through which the DNA is threaded, as well as a dedicated lesion recognition pocket near the pore, the exact process of target site recognition and verification in eukaryotic NER still remained elusive. Our single molecule analysis by atomic force microscopy reveals for the first time that XPD utilizes different recognition strategies to verify structurally diverse lesions. Bulky fluorescein damage is preferentially detected on the translocated strand, whereas the opposite strand preference is observed for a cyclobutane pyrimidine dimer lesion. Both states, however, lead to similar conformational changes in the resulting specific complexes, indicating a merge to a “final” verification state, which may then trigger the recruitment of further NER proteins. PMID:24338567

  17. Property-Based Monitoring of Analog and Mixed-Signal Systems

    NASA Astrophysics Data System (ADS)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential for inserting errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated into the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from the digital to the analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter approach to formal verification, where the system is seen as a "black box" that generates sets of traces whose correctness is checked against a property, that is, its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
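
    As a concrete illustration of the monitoring view, the sketch below checks a sampled "black-box" trace against a bounded-response property: whenever the signal exceeds a threshold, it must return below it within a fixed number of samples. Both the property and the trace are invented for illustration; an industrial flow would phrase the property in PSL/SVA-style assertions instead.

        def check_bounded_response(trace, v_max, horizon):
            """Return (ok, index of first violation or None)."""
            over = 0                            # consecutive samples above threshold
            for i, v in enumerate(trace):
                over = over + 1 if v > v_max else 0
                if over > horizon:
                    return False, i
            return True, None

        # analog-style trace sampled from a settling oscillation
        trace = [0.2, 1.4, 1.3, 0.9, 0.7, 1.2, 1.1, 1.05, 0.8, 0.6]
        print(check_bounded_response(trace, v_max=1.0, horizon=2))  # (False, 7): above 1.0 for 3 samples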

  18. Surface plasmon resonances in liquid metal nanoparticles

    NASA Astrophysics Data System (ADS)

    Ershov, A. E.; Gerasimov, V. S.; Gavrilyuk, A. P.; Karpov, S. V.

    2017-06-01

    We have shown significant suppression of resonant properties of metallic nanoparticles at the surface plasmon frequency during the phase transition "solid-liquid" in the basic materials of nanoplasmonics (Ag, Au). Using experimental values of the optical constants of liquid and solid metals, we have calculated nanoparticle plasmonic absorption spectra. The effect was demonstrated for single particles, dimers and trimers, as well as for the large multiparticle colloidal aggregates. Experimental verification was performed for single Au nanoparticles heated to the melting temperature and above up to full suppression of the surface plasmon resonance. It is emphasized that this effect may underlie the nonlinear optical response of composite materials containing plasmonic nanoparticles and their aggregates.

  19. USB environment measurements based on full-scale static engine ground tests

    NASA Technical Reports Server (NTRS)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.

  20. First-Principles Framework to Compute Sum-Frequency Generation Vibrational Spectra of Semiconductors and Insulators.

    PubMed

    Wan, Quan; Galli, Giulia

    2015-12-11

    We present a first-principles framework to compute sum-frequency generation (SFG) vibrational spectra of semiconductors and insulators. The method is based on density functional theory and the use of maximally localized Wannier functions to compute the response to electric fields, and it includes the effect of electric field gradients at surfaces. In addition, it includes quadrupole contributions to SFG spectra, thus enabling the verification of the dipole approximation, whose validity determines the surface specificity of SFG spectroscopy. We compute the SFG spectra of ice Ih basal surfaces and identify which spectral components are affected by bulk contributions. Our results are in good agreement with experiments at low temperature.

  1. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instruments and assessing the accuracy of the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added data products for the ERS-1 mission (Phases A through G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly, and bathymetry models, and to a study of Antarctic mass balance, which was published in Science in 1998.

  2. Evaluating the performance of the two-phase flow solver interFoam

    NASA Astrophysics Data System (ADS)

    Deshpande, Suraj S.; Anumolu, Lakshman; Trujillo, Mario F.

    2012-01-01

    The performance of the open source multiphase flow solver, interFoam, is evaluated in this work. The solver is based on a modified volume of fluid (VoF) approach, which incorporates an interfacial compression flux term to mitigate the effects of numerical smearing of the interface. It forms a part of the C++ libraries and utilities of OpenFOAM and is gaining popularity in the multiphase flow research community. However, to the best of our knowledge, the evaluation of this solver is confined to the validation tests of specific interest to the users of the code and the extent of its applicability to a wide range of multiphase flow situations remains to be explored. In this work, we have performed a thorough investigation of the solver performance using a variety of verification and validation test cases, which include (i) verification tests for pure advection (kinematics), (ii) dynamics in the high Weber number limit and (iii) dynamics of surface tension-dominated flows. With respect to (i), the kinematics tests show that the performance of interFoam is generally comparable with the recent algebraic VoF algorithms; however, it is noticeably worse than the geometric reconstruction schemes. For (ii), the simulations of inertia-dominated flows with large density ratios ~O(10³) yielded excellent agreement with analytical and experimental results. In regime (iii), where surface tension is important, consistency of the pressure-surface tension formulation and accuracy of curvature are important, as established by Francois et al (2006 J. Comput. Phys. 213 141-73). Several verification tests were performed along these lines and the main findings are: (a) the algorithm of interFoam ensures a consistent formulation of pressure and surface tension; (b) the curvatures computed by the solver converge to a value slightly (10%) different from the analytical value and a scope for improvement exists in this respect. To reduce the disruptive effects of spurious currents, we followed the analysis of Galusinski and Vigneaux (2008 J. Comput. Phys. 227 6140-64) and arrived at the following criterion for stable capillary simulations with interFoam: Δt ≤ max(10 τ_μ, 0.1 τ_ρ), where τ_μ = μ Δx/σ and τ_ρ = √(ρ Δx³/σ). Finally, some capillary flows relevant to atomization were simulated, resulting in good agreement with the results from the literature.
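
    The quoted time-step criterion is easy to apply in practice. The helper below evaluates Δt ≤ max(10 τ_μ, 0.1 τ_ρ) for given fluid properties and mesh size; the water-like property values are just an example.

        from math import sqrt

        def max_capillary_dt(mu, rho, sigma, dx):
            tau_mu = mu * dx / sigma               # viscous capillary time scale
            tau_rho = sqrt(rho * dx**3 / sigma)    # inertial capillary time scale
            return max(10.0 * tau_mu, 0.1 * tau_rho)

        # water-like liquid on a 10-micron mesh
        print(max_capillary_dt(mu=1.0e-3, rho=1000.0, sigma=0.072, dx=1.0e-5))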

  3. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E.

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  4. Development and Verification of a Mobile Shelter Assessment System "Rapid Assessment System of Evacuation Center Condition Featuring Gonryo and Miyagi (RASECC-GM)" for Major Disasters.

    PubMed

    Ishii, Tadashi; Nakayama, Masaharu; Abe, Michiaki; Takayama, Shin; Kamei, Takashi; Abe, Yoshiko; Yamadera, Jun; Amito, Koichiro; Morino, Kazuma

    2016-10-01

    Introduction: There were 5,385 deceased and 710 missing in the Ishinomaki medical zone following the Great East Japan Earthquake that occurred in Japan on March 11, 2011. The Ishinomaki Zone Joint Relief Team (IZJRT) was formed to unify the relief teams of all organizations joining in support of the Ishinomaki area. The IZJRT expanded relief activity as they continued to manually collect and analyze assessments of essential information for maintaining health in all 328 shelters using a paper-type survey. However, the IZJRT spent an enormous amount of time and effort entering and analyzing these data because the work was vastly complex. Therefore, an assessment system must be developed that can tabulate shelter assessment data correctly and efficiently. The objective of this report was to describe the development and verification of a system to rapidly assess evacuation centers in preparation for the next major disaster. Report: Based on experiences with the complex work during the disaster, software called the "Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi" (RASECC-GM) was developed to enter, tabulate, and manage the shelter assessment data. Further, a verification test was conducted during a large-scale Self-Defense Force (SDF) training exercise to confirm its feasibility, usability, and accuracy. The RASECC-GM comprises three screens: (1) the "Data Entry screen," allowing for quick entry on tablet devices of 19 assessment items, including shelter administrator, living and sanitary conditions, and a tally of the injured and sick; (2) the "Relief Team/Shelter Management screen," for registering information on relief teams and shelters; and (3) the "Data Tabulation screen," which allows tabulation of the data entered for each shelter, as well as viewing and sorting from a disaster headquarters' computer. During the verification test, data of mock shelters entered online were tabulated quickly and accurately on a mock disaster headquarters' computer. Likewise, data entered offline were also tabulated quickly on the mock disaster headquarters' computer when the tablet device was moved into an online environment. The RASECC-GM, a system for rapidly assessing the condition of evacuation centers, was developed. Tests verify that users of the system would be able to easily, quickly, and accurately assess vast quantities of data from multiple shelters in a major disaster and immediately manage the inputted data at the disaster headquarters. Ishii T, Nakayama M, Abe M, Takayama S, Kamei T, Abe Y, Yamadera J, Amito K, Morino K. Development and verification of a mobile shelter assessment system "Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi (RASECC-GM)" for major disasters. Prehosp Disaster Med. 2016;31(5):539-546.

  5. Plasma Metamaterials for Arbitrary Complex-Amplitude Wave Filters

    DTIC Science & Technology

    2013-09-10

    plasmas as reflectors [4], absorbers [4,5], and antennae [6] of electromagnetic waves. In contrast with the other materials in these devices, parameters ... are controlled using launching antennas and high-power wave sources. One of the fundamental facts we have learned in microwave plasmas is that ... "metamaterials." [29] In this report, we demonstrate the functional composites of plasmas and metamaterials, and the focusing point is verification of ...

  6. Assessing the Potential of Societal Verification by Means of New Media

    DTIC Science & Technology

    2014-01-01

    the Defense Advanced Research Projects Agency (DARPA) "Red Balloon Challenge" in 2009 by finding 10 tethered weather balloons scattered across the ... Institute of Technology (MIT) managed to locate 10 weather balloons tethered at undisclosed locations across the continental United States in less than ... suited for complex problem solving, and the 2009 Defense Advanced Research Projects Agency (DARPA) "Red Balloon Challenge" has already demonstrated ...

  7. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    This NASA Engineering and Safety Center (NESC) assessment was established to develop a set of time histories for the flight behavior of increasingly complex example aerospacecraft that could be used to partially validate various simulation frameworks. The assessment was conducted by representatives from several NASA Centers and an open-source simulation project. This document contains details on models, implementation, and results.

  8. A Study on Run Time Assurance for Complex Cyber Physical Systems

    DTIC Science & Technology

    2013-04-18

    safety verification approach was applied to synchronization of distributed local clocks of the nodes on a CAN bus by Jiang et al. [36]. The class of ... mode of interaction between the instrumented system and the checker, we distinguish between synchronous and asynchronous monitoring. In synchronous ... occurred. Synchronous monitoring may deliver a higher degree of assurance than the asynchronous one, because it can block a dangerous action. However ...

  9. CFD and ventilation research.

    PubMed

    Li, Y; Nielsen, P V

    2011-12-01

    There has been a rapid growth of scientific literature on the application of computational fluid dynamics (CFD) in the research of ventilation and indoor air science. With a 1000-10,000 times increase in computer hardware capability in the past 20 years, CFD has become an integral part of scientific research and engineering development of complex air distribution and ventilation systems in buildings. This review discusses the major and specific challenges of CFD in terms of turbulence modelling, numerical approximation, and boundary conditions relevant to building ventilation. We emphasize the growing need for CFD verification and validation, suggest ongoing needs for analytical and experimental methods to support the numerical solutions, and discuss the growing capacity of CFD in opening up new research areas. We suggest that CFD has not become a replacement for experiment and theoretical analysis in ventilation research; rather, it has become an increasingly important partner. We believe that an effective scientific approach for ventilation studies is still to combine experiments, theory, and CFD. We argue that CFD verification and validation are becoming more crucial than ever as more complex ventilation problems are solved. It is anticipated that ventilation problems at the city scale will be tackled by CFD in the next 10 years. © 2011 John Wiley & Sons A/S.

  10. Power System Test and Verification at Satellite Level

    NASA Astrophysics Data System (ADS)

    Simonelli, Giulio; Mourra, Olivier; Tonicello, Ferdinando

    2008-09-01

    Most of the articles on Power Systems deal with the architecture and technical solutions related to the functionalities of the power system and their performances. Few articles, if any, address integration and verification aspects of the Power System at satellite level and the related issues with the Power EGSE (Electrical Ground Support Equipment), which also has to support the AIT/AIV (Assembly Integration Test and Verification) program of the satellite and, eventually, the launch campaign. In recent years a more complex development and testing concept based on MDVE (Model Based Development and Verification Environment) has been introduced. In the MDVE approach the simulation software is used to simulate the satellite environment and, in the early stages, the satellite units. This approach changed the Power EGSE requirements significantly. Power EGSEs or, better, Power SCOEs (Special Check Out Equipment) are now requested to provide the instantaneous power generated by the solar array throughout the orbit. To achieve that, the Power SCOE interfaces to the RTS (Real Time Simulator) of the MDVE. The RTS provides the instantaneous settings belonging to that point along the orbit to the Power SCOE, so that the Power SCOE generates the instantaneous {I,V} curve of the SA (Solar Array). That means a real-time test for the power system, which is even more valuable for EO (Earth Observation) satellites, where the solar array aspect angle to the sun is rarely fixed and the power load profile can be particularly complex (for example, in radar applications). In this article the major issues related to integration and testing of Power Systems will be discussed, taking into account different power system topologies (i.e. regulated bus, unregulated bus, battery bus, based on MPPT or S3R…). Aspects of Power System AIT I/Fs (interfaces), Umbilical I/Fs with the launcher, and the Power SCOE I/Fs will also be addressed. Last but not least, the protection strategy of the Power System during the AIT/AIV program will also be discussed. The objective of this discussion is also to provide the Power System Engineer with a checklist of key aspects linked to the satellite AIT/AIV program that have to be considered in the early phases of a new power system development.
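
    In essence, at every orbit point the real-time simulator hands the Power SCOE a set of illumination and temperature settings, and the solar-array simulator produces the corresponding instantaneous {I,V} curve. The sketch below shows one hedged way to compute such a curve with a single-diode model; all parameter values are illustrative, not those of any specific SAS or mission.

        import numpy as np

        def sa_iv_curve(isc, voc, illum, t_cell, n_cells=80, t_ref=25.0,
                        alpha_i=0.0005, beta_v=-0.0022, n_ideal=1.3):
            """Return (V, I) samples of the instantaneous array I-V curve."""
            isc_t = isc * illum * (1.0 + alpha_i * (t_cell - t_ref))  # current scales with sun
            voc_t = voc * (1.0 + beta_v * (t_cell - t_ref))           # voltage drops when hot
            vt = n_ideal * 0.02585 * n_cells            # thermal voltage of the cell string
            i0 = isc_t / (np.exp(voc_t / vt) - 1.0)     # diode saturation current from Voc
            v = np.linspace(0.0, voc_t, 200)
            i = np.clip(isc_t - i0 * (np.exp(v / vt) - 1.0), 0.0, None)
            return v, i

        v, i = sa_iv_curve(isc=8.0, voc=100.0, illum=0.95, t_cell=60.0)
        print(f"max power point ~ {max(v * i):.1f} W")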

  11. An Investigation of the Compatibility of Radiation and Convection Heat Flux Measurements

    NASA Technical Reports Server (NTRS)

    Liebert, Curt H.

    1996-01-01

    A method for determining time-resolved absorbed surface heat flux and surface temperature in radiation and convection environments is described. The method is useful for verification of aerodynamic, heat transfer and durability models. A practical heat flux gage fabrication procedure and a simple one-dimensional inverse heat conduction model and calculation procedure are incorporated in this method. The model provides an estimate of the temperature and heat flux gradient in the direction of heat transfer through the gage. This paper discusses several successful time-resolved tests of this method in hostile convective heating and cooling environments.

  12. Development of optical ground verification method for μm to sub-mm reflectors

    NASA Astrophysics Data System (ADS)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2017-11-01

    Large reflectors and antennas for the IR to mm wavelength range are being planned for many Earth observation and astronomical space missions and for commercial communication satellites as well. Scientific observatories require large telescopes with precisely shaped reflectors for collecting the electro-magnetic radiation from faint sources. The challenging tasks of on-ground testing are to achieve the required accuracy in the measurement of the reflector shapes and antenna structures and to verify their performance under simulated space conditions (vacuum, low temperatures). Due to the specific surface characteristics of reflectors operating in these spectral regions, standard optical metrology methods employed in the visible spectrum do not provide useful measurement results. The current state-of-the-art commercial metrology systems are not able to measure these types of reflectors because they have to face the measurement of shape and waviness over relatively large areas with a large deformation dynamic range and encompassing a wide range of spatial frequencies. 3-D metrology (tactile coordinate measurement) machines are generally used during the manufacturing process. Unfortunately, these instruments cannot be used in the operational environmental conditions of the reflector. The application of standard visible wavelength interferometric methods is very limited or impossible due to the large relative surface roughnesses involved. A small number of infrared interferometers have been commercially developed over the last 10 years, but their applications have also been limited due to poor dynamic range and the restricted spatial resolution of their detectors. These restrictions also affect the surface error slopes that can be captured and make their application to surfaces manufactured using CFRP honeycomb technologies rather difficult or impossible. It has therefore been considered essential, from the viewpoint of supporting future ESA exploration missions, to develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performances under simulated space conditions. Two methods and techniques are developed at CSL. The first one is an IR phase-shifting interferometer with high spatial resolution. This interferometer shall be used specifically for the verification of high precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal vacuum conditions. The second one, presented hereafter, is a holographic method for relative shape measurement. The holographic solution proposed makes use of a home-built vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot. An iterative process allows the measurement of a total of up to several mm of deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Appel

    This cleanup verification package documents completion of remedial action for the 118-F-3, Minor Construction Burial Ground waste site. This site was an open field covered with cobbles, with no vegetation growing on the surface. The site received irradiated reactor parts that were removed during conversion of the 105-F Reactor from the Liquid 3X to the Ball 3X Project safety systems and received mostly vertical safety rod thimbles and step plugs.

  14. Experimental verification of the spectral shift between near- and far-field peak intensities of plasmonic infrared nanoantennas.

    PubMed

    Alonso-González, P; Albella, P; Neubrech, F; Huck, C; Chen, J; Golmar, F; Casanova, F; Hueso, L E; Pucci, A; Aizpurua, J; Hillenbrand, R

    2013-05-17

    Theory predicts a distinct spectral shift between the near- and far-field optical response of plasmonic antennas. Here we combine near-field optical microscopy and far-field spectroscopy of individual infrared-resonant nanoantennas to verify experimentally this spectral shift. Numerical calculations corroborate our experimental results. We furthermore discuss the implications of this effect in surface-enhanced infrared spectroscopy.

  15. Microwave scattering models and basic experiments

    NASA Technical Reports Server (NTRS)

    Fung, Adrian K.

    1989-01-01

    Progress is summarized which has been made in four areas of study: (1) scattering model development for sparsely populated media, such as a forested area; (2) scattering model development for dense media, such as a sea ice medium or a snow covered terrain; (3) model development for randomly rough surfaces; and (4) design and conduct of basic scattering and attenuation experiments suitable for the verification of theoretical models.

  16. Establish an Agent-Simulant Technology Relationship (ASTR)

    DTIC Science & Technology

    2017-04-14

    for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs...Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative...methodology report. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT

  17. Atmospheric observations for STS-1 landing

    NASA Technical Reports Server (NTRS)

    Turner, R. E.; Arnold, J. E.; Wilson, G. S.

    1981-01-01

    A summary of synoptic weather conditions existing over the western United States is given for the time of shuttle descent into Edwards Air Force Base, California. The techniques and methods used to furnish synoptic atmospheric data at the surface and aloft for flight verification of the STS-1 orbiter during its descent into Edwards Air Force Base are specified. Examples of the upper level data set are given.

  18. Climate modeling. [for use in understanding earth's radiation budget

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The requirements for radiation measurements suitable for the understanding, improvement, and verification of models used in performing climate research are considered. Both zonal energy balance models and three dimensional general circulation models are considered, and certain problems are identified as common to all models. Areas of emphasis include regional energy balance observations, spectral band observations, cloud-radiation interaction, and the radiative properties of the earth's surface.

  19. Modeling the urban boundary layer

    NASA Technical Reports Server (NTRS)

    Bergstrom, R. W., Jr.

    1976-01-01

    A summary and evaluation are given of the Workshop on Modeling the Urban Boundary Layer, held in Las Vegas on May 5, 1975. Edited summaries from each of the session chairpersons are also given. The sessions were: (1) formulation and solution techniques, (2) K-theory versus higher-order closure, (3) surface heat and moisture balance, (4) initialization and boundary problems, (5) nocturnal boundary layer, and (6) verification of models.

  20. The use of robots for arms control treaty verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalowski, S.J.

    1991-01-01

    Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted; in fact, it was encountered only once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control; his practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.

  1. Ramifications of the Children's Surgery Verification Program for Patients and Hospitals.

    PubMed

    Baxter, Katherine J; Gale, Bonnie F; Travers, Curtis D; Heiss, Kurt F; Raval, Mehul V

    2018-05-01

    The American College of Surgeons in 2015 instituted the Children's Surgery Verification program delineating requirements for hospitals providing pediatric surgical care. Our purpose was to examine possible effects of the Children's Surgery Verification program by evaluating neonates undergoing high-risk operations. Using the Kids' Inpatient Database 2009, we identified infants undergoing operations for 5 high-risk neonatal conditions. We considered all children's hospitals and children's units to be Level I centers and considered all others Level II/III. We estimated the number of neonates requiring relocation and the additional distance traveled. We used propensity score adjusted logistic regression to model mortality at Level I vs Level II/III hospitals. Overall, 7,938 neonates were identified across 21 states at 91 Level I and 459 Level II/III hospitals. Based on our classifications, 2,744 (34.6%) patients would need to relocate to Level I centers. The median additional distance traveled was 6.6 miles. The maximum distance traveled varied by state, from <55 miles (New Jersey and Rhode Island) to >200 miles (Montana, Oregon, Colorado, and California). The adjusted odds of mortality at Level II/III vs Level I centers was 1.67 (95% CI 1.44 to 1.93). We estimate 1 life would be saved for every 32 neonates moved. Although this conservative estimate demonstrates that more than one-third of complex surgical neonates in 2009 would have needed to relocate under the Children's Surgery Verification program, the additional distance traveled is relatively short for most but not all, and this program might improve mortality. Local level ramifications of this novel national program require additional investigation. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
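
    The "1 life saved per 32 neonates moved" figure is a number-needed-to-treat calculation. As a hedged illustration (the 5% baseline Level I mortality used below is a hypothetical placeholder, not a figure reported by the study), converting an adjusted odds ratio into an absolute risk reduction and NNT looks like:

      def nnt_from_odds_ratio(or_level23, p_level1):
          """Convert an adjusted odds ratio (Level II/III vs Level I) and an
          assumed baseline mortality p_level1 into absolute risk reduction
          and number-needed-to-treat (neonates relocated per life saved)."""
          odds1 = p_level1 / (1 - p_level1)
          odds23 = odds1 * or_level23
          p_level23 = odds23 / (1 + odds23)
          arr = p_level23 - p_level1     # absolute risk reduction from relocation
          return arr, 1.0 / arr

      # Hypothetical 5% baseline mortality at Level I centers, OR = 1.67:
      arr, nnt = nnt_from_odds_ratio(1.67, 0.05)
      print(f"ARR = {arr:.3f}, NNT = {nnt:.0f}")   # lands near the reported 1-in-32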

  2. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Beard, Bernard B.

    2010-01-01

    This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
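
    As a sketch of the order-statistics argument (our illustration, not text from the paper): if a requirement must hold for at least y% of cases with confidence c, the smallest number of runs n for which zero observed failures demonstrates compliance satisfies 1 - y^n >= c; allowing k failures, one inverts the binomial tail instead.

      from scipy.stats import binom

      def samples_needed(y, conf, k=0):
          """Smallest n such that observing at most k failures in n Monte Carlo
          runs demonstrates P(requirement met) >= y with confidence conf.
          Condition: Binom(n, 1-y) CDF at k failures <= 1 - conf."""
          n = k + 1
          while binom.cdf(k, n, 1 - y) > 1 - conf:
              n += 1
          return n

      print(samples_needed(0.90, 0.90))        # 22 runs, zero failures allowed
      print(samples_needed(0.90, 0.90, k=1))   # more runs needed if one failure is tolerated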

  3. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
    In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the KLEE symbolic virtual machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
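
    As an illustrative sketch (not the paper's toolchain), checking functional equivalence of two straight-line program versions with an off-the-shelf decision procedure such as Z3's Python bindings reduces to asking whether any input makes their outputs differ:

      from z3 import Int, Solver, unsat

      x = Int('x')
      # Version 1 and a refactored version 2 of the same computation.
      v1 = x * 2 + 4
      v2 = (x + 2) * 2

      s = Solver()
      s.add(v1 != v2)   # search for a distinguishing input
      print("equivalent" if s.check() == unsat else "not equivalent")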

  4. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
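
    To make the Horn-clause encoding concrete, here is a minimal sketch (our illustration, not SeaHorn output) of a verification condition for a counting loop, expressed with Z3's fixedpoint engine; inv is the loop-invariant relation and the query asks whether the error state is reachable:

      from z3 import Fixedpoint, Function, Ints, IntSort, BoolSort

      # Program: x = 0; while (x < 10) x++; assert x <= 10;
      x, xp = Ints('x xp')
      inv = Function('inv', IntSort(), BoolSort())   # loop invariant relation
      err = Function('err', BoolSort())              # error state

      fp = Fixedpoint()
      fp.set(engine='spacer')
      fp.register_relation(inv, err)
      fp.declare_var(x, xp)

      fp.rule(inv(0))                                  # initialization
      fp.rule(inv(xp), [inv(x), x < 10, xp == x + 1])  # loop body
      fp.rule(err(), [inv(x), x > 10])                 # assertion violation

      print(fp.query(err()))   # unsat: the error state is unreachable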

  5. Superradiance Transition and Nonphotochemical Quenching in Photosynthetic Complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berman, Gennady Petrovich; Nesterov, Alexander; Lopez, Gustavo

    2015-04-23

    Photosynthetic organisms have evolved protective strategies to allow them to survive in cases of intense sunlight fluctuation with the development of nonphotochemical quenching (NPQ). This process allows light harvesting complexes to transfer the excess sunlight energy to non-damaging quenching channels. This report compares the NPQ process with the superradiance transition (ST). We demonstrated that the maximum of the NPQ efficiency is caused by the ST to the sink associated with the CTS. However, experimental verifications are required in order to determine whether or not the NPQ regime is associated with the ST transition for real photosynthetic complexes. Indeed, it canmore » happen that, in the photosynthetic apparatus, the NPQ regime occurs in the “non-optimal” region of parameters, and it could be independent of the ST.« less

  6. Forecasting of cyanobacterial density in Torrão reservoir using artificial neural networks.

    PubMed

    Torres, Rita; Pereira, Elisa; Vasconcelos, Vítor; Teles, Luís Oliva

    2011-06-01

    The ability of general regression neural networks (GRNN) to forecast the density of cyanobacteria in the Torrão reservoir (Tâmega river, Portugal) over a period of 15 days, based on three years of collected physical and chemical data, was assessed. Several models were developed and 176 were selected based on their correlation values for the verification series. A time lag of 11 was used, equivalent to one sample (periods of 15 days in the summer and 30 days in the winter). Several combinations of the series were used. Input and output data collected from three depths of the reservoir were applied (surface, euphotic zone limit and bottom). The model with the highest average correlation value achieved correlations of 0.991, 0.843, and 0.978 for the training, verification, and test series, respectively. This model had the three series independent in time: first the test series, then the verification series and, finally, the training series. Only six input variables were considered significant to the performance of this model: ammonia, phosphates, dissolved oxygen, water temperature, pH and water evaporation, physical and chemical parameters referring to the three depths of the reservoir. These variables are common to the next four best models produced and, although these included other input variables, their performance was not better than that of the selected best model.
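
    For readers unfamiliar with the model class: a GRNN (Specht, 1991) is a Gaussian-kernel-weighted average of the training targets, so a minimal sketch looks like the following (variable names are ours; the study's actual inputs were ammonia, phosphates, dissolved oxygen, water temperature, pH and evaporation at three depths):

      import numpy as np

      def grnn_predict(X_train, y_train, X_query, sigma=1.0):
          """General regression neural network: Nadaraya-Watson style
          Gaussian-kernel average of the training targets."""
          preds = []
          for xq in X_query:
              d2 = np.sum((X_train - xq) ** 2, axis=1)     # squared distances
              w = np.exp(-d2 / (2.0 * sigma ** 2))         # kernel weights
              preds.append(np.dot(w, y_train) / w.sum())   # weighted average
          return np.array(preds)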

  7. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management program (IVHM) is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
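
    As background for the vibrational techniques mentioned (a textbook sketch, not the report's Matlab code): the natural frequencies of a uniform Euler-Bernoulli cantilever follow from the roots of 1 + cos(lambda)*cosh(lambda) = 0, with f_n = (lambda_n^2 / 2*pi) * sqrt(E*I / (rho*A*L^4)); a shift in these frequencies or in mode-shape curvature is the usual fault indicator.

      import numpy as np

      # First three roots of 1 + cos(lam)*cosh(lam) = 0 (cantilever beam)
      LAMBDAS = np.array([1.8751, 4.6941, 7.8548])

      def cantilever_frequencies(E, I, rho, A, L):
          """Natural frequencies (Hz) of a uniform Euler-Bernoulli cantilever."""
          return (LAMBDAS ** 2 / (2 * np.pi)) * np.sqrt(E * I / (rho * A * L ** 4))

      # Example: 1 m x 50 mm x 5 mm aluminum strip
      E, rho = 70e9, 2700.0
      b, h, L = 0.05, 0.005, 1.0
      A, I = b * h, b * h ** 3 / 12
      print(cantilever_frequencies(E, I, rho, A, L))  # ~[4.1, 25.8, 72.2] Hz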

  8. Verification and implementation of microburst day potential index (MDPI) and wind INDEX (WINDEX) forecasting tools at Cape Canaveral Air Station

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark

    1996-01-01

    This report details the research, development, utility, verification, and transition of the wet-microburst forecasting and detection work the Applied Meteorology Unit (AMU) did in support of ground and launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS). The unforecasted wind event of 33.5 m/s (65 knots) at the Shuttle Landing Facility on 16 August 1994 raised the issue of wet microburst detection and forecasting. The AMU researched and analyzed the downburst wind event and determined it was a wet microburst. A program was developed for operational use on the Meteorological Interactive Data Display System (MIDDS) weather system to analyze, compute, and display equivalent potential temperature (theta-e) profiles, the microburst day potential index (MDPI), and the wind index (WINDEX) maximum wind gust value. Key microburst nowcasting signatures using the WSR-88D data were highlighted. Verification of the data sets indicated that the MDPI has good potential for alerting the duty forecaster to the possibility of wet microbursts, and the WINDEX values computed from the hourly surface data show a useful trend for the maximum gust potential. WINDEX should help fill the temporal hole between the MDPI on the last Cape Canaveral rawinsonde and the nowcasting radar data tools.
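
    A rough sketch of the MDPI idea (our paraphrase, not the AMU's operational code; the layer bounds and the 30 K scaling constant below are assumptions): the index compares the maximum theta-e low in the sounding with the minimum theta-e aloft, so that MDPI >= 1 flags high wet-microburst potential.

      import numpy as np

      def mdpi(theta_e, p, ct=30.0):
          """Microburst Day Potential Index from an equivalent-potential-
          temperature profile theta_e (K) on pressure levels p (hPa).
          Layer bounds and the scaling constant ct are assumptions here."""
          low = (p >= 850.0)                   # near-surface layer
          mid = (p <= 800.0) & (p >= 600.0)    # mid-level dry layer
          return (theta_e[low].max() - theta_e[mid].min()) / ct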

  9. Development and verification of a novel device for dental intra-oral 3D scanning using chromatic confocal technology

    NASA Astrophysics Data System (ADS)

    Zint, M.; Stock, K.; Graser, R.; Ertl, T.; Brauer, E.; Heyninck, J.; Vanbiervliet, J.; Dhondt, S.; De Ceuninck, P.; Hibst, R.

    2015-03-01

    The presented work describes the development and verification of a novel optical, powder-free intra-oral scanner based on chromatic confocal technology combined with a multifocal approach. The proof of concept for a chromatic confocal area scanner for intra-oral scanning is given. Several prototype scanners passed a verification process showing an average accuracy (distance deviation on flat surfaces) of less than 31 μm ± 21 μm and a reproducibility of less than 4 μm ± 3 μm. Compared to a tactile measurement on a full-jaw model fitted with 4 mm ceramic spheres, the measured average distance deviation between the spheres was 49 μm ± 12 μm for scans of up to 8 teeth (3-unit bridge, single quadrant) and 104 μm ± 82 μm for larger scans and full jaws. The average deviation of the measured sphere diameter compared to the tactile measurement was 27 μm ± 14 μm. Compared to μCT scans of plaster models equipped with human teeth, the average standard deviation on up to 3 units was less than 55 μm ± 49 μm, whereas the reproducibility of the scans was better than 22 μm ± 10 μm.

  10. Category V Compliant Container for Mars Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.

    2000-01-01

    A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on the mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants on an atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to Category V requirements, the proposed container presents a surface that is clean from any, even nonviable organisms, and any molecular fragments of biological origin that are unique to Mars or any other celestial body other than Earth.

  11. Holographic aids for internal combustion engine flow studies

    NASA Technical Reports Server (NTRS)

    Regan, C.

    1984-01-01

    Worldwide interest in improving the fuel efficiency of internal combustion (I.C.) engines has sparked research efforts designed to learn more about the flow processes of these engines. The flow fields must be understood prior to fuel injection in order to design efficient valves, piston geometries, and fuel injectors. Knowledge of the flow field is also necessary to determine the heat transfer to combustion chamber surfaces. Computational codes can predict velocity and turbulence patterns, but experimental verification is mandatory to justify their basic assumptions. Due to their nonintrusive nature, optical methods are ideally suited to provide the necessary velocity verification data. Optical systems such as Schlieren photography, laser velocimetry, and illuminated particle visualization are used in I.C. engines, and now their versatility is improved by employing holography. These holographically enhanced optical techniques are described with emphasis on their applications in I.C. engines.

  12. Coherent Lidar Design and Performance Verification

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1996-01-01

    This final report summarizes the investigative results from the three complete years of funding; the corresponding publications are listed. The first year saw the verification of beam alignment for coherent Doppler lidar in space by using the surface return. The second year saw the analysis and computerized simulation of using heterodyne efficiency as an absolute measure of performance of coherent Doppler lidar. A new method was proposed to determine the estimation error for Doppler lidar wind measurements without the need for an independent wind measurement. Coherent Doppler lidar signal covariance, including wind shear and turbulence, was derived and calculated for typical atmospheric conditions. The effects of wind turbulence defined by Kolmogorov spatial statistics were investigated theoretically and with simulations. The third year saw the performance of coherent Doppler lidar in the weak-signal regime determined by computer simulations using the best velocity estimators. Improved algorithms for extracting the performance of velocity estimators with wind turbulence included were also produced.

  13. Tests of high-resolution simulations over a region of complex terrain in Southeast coast of Brazil

    NASA Astrophysics Data System (ADS)

    Chou, Sin Chan; Luís Gomes, Jorge; Ristic, Ivan; Mesinger, Fedor; Sueiro, Gustavo; Andrade, Diego; Lima-e-Silva, Pedro Paulo

    2013-04-01

    The Eta Model has been used operationally by INPE at the Centre for Weather Forecasts and Climate Studies (CPTEC) to produce weather forecasts over South America since 1997, and has gone through upgrades over the years. In order to prepare the model for operational higher-resolution forecasts, the model is configured and tested over a region of complex topography located near the coast of Southeast Brazil. The model domain includes the two Brazilian cities Rio de Janeiro and São Paulo, urban areas, preserved tropical forest, pasture fields, and complex terrain rising from sea level up to about 1000 m. Accurate near-surface wind direction and magnitude are needed for the power plant emergency plan. In addition, the region suffers from frequent events of floods and landslides; therefore, accurate local forecasts are required for disaster warnings. The objective of this work is to carry out a series of numerical experiments to test and evaluate high-resolution simulations in this complex area. Verification of model runs uses observations taken from the nuclear power plant and higher-resolution reanalysis data. The runs were tested in a period when the flow was predominantly forced by local conditions and in a period forced by a frontal passage. The Eta Model was configured initially with 2-km horizontal resolution and 50 layers. The Eta-2km is a second nest: it is driven by the Eta-15km, which in turn is driven by ERA-Interim reanalyses. The series of experiments consists of replacing the surface layer stability function, adjusting cloud microphysics scheme parameters, and further increasing vertical and horizontal resolutions. Replacing the stability function for stable conditions substantially increased the katabatic winds, which verified better against the tower wind data. Precipitation produced by the model was excessive in the region. Increasing the vertical resolution to 60 layers caused a further increase in precipitation production. This excessive precipitation was reduced by adjusting some parameters in the cloud microphysics scheme. Precipitation is still overestimated, and further tests are necessary. The increase of horizontal resolution to 1 km required adjusting model diffusion parameters and refining divergence calculations. The limited availability of observations in the region is a major constraint on a thorough evaluation.

  14. High Speed PC Based Data Acquisition and Instrumentation for Measurement of Simulated Low Earth Orbit Thermally Induced Disturbances

    NASA Technical Reports Server (NTRS)

    Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize the responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A Windows PC-based data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC-based data acquisition systems provide. Two PC-based data acquisition systems were assembled to acquire, process, distribute, and provide real-time processing for several types of transducers used in the SA3 DVT. A high-sample-rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.

  15. Characterizing flow in oil reservoir rock using SPH: absolute permeability

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Williams, John R.; Tilke, Peter; Leonardi, Christopher R.

    2016-04-01

    In this paper, a three-dimensional smooth particle hydrodynamics (SPH) simulator for modeling grain-scale fluid flow in porous rock is presented. The versatility of the SPH method has driven its use in increasingly complex areas of flow analysis, including flows related to permeable rock for both groundwater and petroleum reservoir research. While previous approaches to such problems using SPH have involved the use of idealized pore geometries (cylinder/sphere packs, etc.), in this paper we detail the characterization of flow in models with geometries taken from 3D X-ray microtomographic imaging of actual porous rock; specifically, 25.12% porosity dolomite. This particular rock type has been well characterized experimentally and described in the literature, thus providing a practical `real world' means of verification of SPH that will be key to its acceptance by industry as a viable alternative to traditional reservoir modeling tools. The true advantages of SPH are realized when adding the complexity of multiple fluid phases; however, the accuracy of SPH for single-phase flow is as yet underdeveloped in the literature and is the primary focus of this paper. Flow in reservoir rock will typically occur in the range of low Reynolds numbers, making the enforcement of no-slip boundary conditions an important factor in simulation. To this end, we detail the development of a new, robust, and numerically efficient method for implementing no-slip boundary conditions in SPH that can handle the degree of complexity of boundary surfaces characteristic of an actual permeable rock sample. A study of the effect of particle density is carried out, and simulation results for absolute permeability are presented and compared to those from experimentation, showing good agreement and validating the method for such applications.
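
    For context on the method itself (a generic sketch, not the authors' simulator): SPH interpolates field quantities with a compactly supported kernel, and the common 3-D cubic spline kernel is:

      import numpy as np

      def cubic_spline_W(r, h):
          """Standard 3-D cubic spline SPH kernel (compact support 2h)."""
          q = np.asarray(r) / h
          sigma = 1.0 / (np.pi * h ** 3)   # 3-D normalization constant
          w = np.where(q < 1.0,
                       1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
                       np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
          return sigma * w

      # Density estimate at particle i: rho_i = sum_j m_j * W(|x_i - x_j|, h)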

  16. Space shuttle main engine controller assembly, phase C-D. [with lagging system design and analysis

    NASA Technical Reports Server (NTRS)

    1973-01-01

    System design and system analysis and simulation are slightly behind schedule, while design verification testing has improved. Input/output circuit design has improved, but digital computer unit (DCU) and mechanical design continue to lag. Part procurement was impacted by delays in printed circuit board and assembly drawing releases. These are the result of problems in generating suitable printed circuit artwork for the very complex and high-density multilayer boards.

  17. A20 Functional Domains Regulate Subcellular Localization and NF-Kappa B Activation

    DTIC Science & Technology

    2013-08-15

    that the first function to be described for A20 was that of an anti-apoptotic protein (55). They based their choice of experiments and preliminary...mediated apoptosis (55). After positive selection of the resulting clones with neomycin and verification of A20 expression, they compared the...Kaposi sarcoma herpesvirus (KSHV) mediated cell transformation (72). K13 can directly activate NF-κB by interacting with the IKK complex and is

  18. Verification of Disarmament or Limitation of Armaments: Instruments, Negotiations, Proposals

    DTIC Science & Technology

    1992-05-01

    explosions and may complicate the process of detection. An even greater difficulty faced by seismologists is the ambient background of seismic "noise"...suspected event would be a complex operation. It would consist of surveys of the area of the presumed nuclear explosion in order to measure ambient...Draft Resolution to the OAS General Assembly, June 1991 and OAS Resolution "Cooperación para la seguridad en el hemisferio. Limitación de la

  19. Evaluation of Nonlinear Constitutive Properties of Concrete

    DTIC Science & Technology

    1990-02-01

    This report describes the development of a methodology that allows for...(Continued). The method of evaluation, as developed herein, consists of the following steps: 1. The design and execution of a series of material...developed in Step 1. 3. Design and execution of the series of verification tests which provide data sufficient for defining key complex material

  20. Advanced Spectroscopic and Thermal Imaging Instrumentation for Shock Tube and Ballistic Range Facilities

    DTIC Science & Technology

    2010-04-01

    the development process, increase its quality and reduce development time through automation of synthesis, analysis or verification. For this purpose...made of time-non-deterministic systems, improving efficiency and reducing complexity of formal analysis. We also show how our theory relates to, and...of the most recent investigations for Earth and Mars atmospheres will be discussed in the following sections. 2.4.1 Earth: lunar return NASA's

  1. Development of 1-m primary mirror for a spaceborne camera

    NASA Astrophysics Data System (ADS)

    Kihm, Hagyong; Yang, Ho-Soon; Rhee, Hyug-Gyo; Lee, Yun-Woo

    2015-09-01

    We present the development of a 1-m lightweight mirror system for a spaceborne electro-optical camera. The mirror design was optimized to satisfy the performance requirements under launch loads and the space environment. The mirror, made of Zerodur®, has pockets at the back surface and three square bosses at the rim. Metallic bipod flexures support the mirror at the bosses and adjust the mirror's surface distortion due to gravity. We also show an analytical formulation of the bipod flexure, where compliance and stiffness matrices of the bipod flexure are derived to estimate theoretical performance and to provide initial design guidelines. Optomechanical performance, such as surface distortion due to gravity, is explained. Environmental verification of the mirror is achieved by vibration tests.

  2. Development of the 15 meter diameter hoop column antenna

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The building of a deployable 15-meter engineering model of the 100-meter antenna based on the point design of an earlier task of this contract, complete with an RF-capable surface, is described. The 15-meter diameter was selected so that the model could be tested in existing manufacturing, near-field RF, thermal vacuum, and structural dynamics facilities. The antenna was designed with four offset paraboloidal reflector surfaces with a focal length of 366.85 in and a primary surface accuracy goal of 0.069 in rms. Surface adjustment capability was provided by manually resetting the length of 96 surface control cords which emanated from the lower column extremity. A detailed description of the 15-meter Hoop/Column Antenna, major subassemblies, and a history of its fabrication, assembly, deployment testing, and verification measurements are given. The deviation for one aperture surface (except the outboard extremity) was measured after adjustments in follow-on tests at the Martin Marietta Near-field Facility to be 0.061 in; thus the primary surface goal was achieved.

  3. The process development of laser surface modification of commercially pure titanium (Grade 2) with rhenium

    NASA Astrophysics Data System (ADS)

    Kobiela, K.; Smolina, I.; Dziedzic, R.; Szymczyk, P.; Kurzynowski, T.; Chlebus, E.

    2016-12-01

    The paper presents the results of the process development of laser surface modification of commercially pure titanium with rhenium. The criterion for a successful/optimal process is repeatable surface geometry, characterized by a predictable and repeatable chemical composition over the entire surface as well as special mechanical properties (hardness and wear resistance). The analysis of surface geometry included measurements of the laser penetration depth and heat-affected zone (HAZ), the width of a single track, and the width of a clad. The laser treatment was carried out with a diode laser installed on an industrial robot; this solution made possible the continuous supply of powder to the substrate during the process. The aim of the investigation is to find out whether the tribological characteristics of the surface can be improved by rhenium alloying. The verification of the surface (tribological) properties included geometry measurements, microstructure observation, hardness tests, and evaluation of wear resistance.

  4. Multijunction Solar Cell Technology for Mars Surface Applications

    NASA Technical Reports Server (NTRS)

    Stella, Paul M.; Mardesich, Nick; Ewell, Richard C.; Mueller, Robert L.; Endicter, Scott; Aiken, Daniel; Edmondson, Kenneth; Fetze, Chris

    2006-01-01

    Solar cells used for Mars surface applications have been commercial, space-qualified, AM0-optimized devices. Due to the Martian atmosphere, these cells are not optimized for the Mars surface and as a result operate at reduced efficiency. A multi-year program, MOST (Mars Optimized Solar Cell Technology), managed by JPL and funded by NASA Code S, was initiated in 2004 to develop tools to modify commercial AM0 cells for the Mars surface solar spectrum and to fabricate Mars-optimized devices for verification. This effort required defining the surface incident spectrum, developing an appropriate laboratory solar simulator measurement capability, and developing and testing commercial cells modified for the Mars surface spectrum. This paper discusses the program, including results for the initial modified cells. Simulated Mars surface measurements of MER cells and Phoenix Lander cells (2007 launch) are provided to characterize the performance loss for those missions. In addition, the performance of the MER rover solar arrays is updated to reflect their more than two (2) years of operation.

  5. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  6. On Calculation Methods and Results for Straight Cylindrical Roller Bearing Deflection, Stiffness, and Stress

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2011-01-01

    The purpose of this study was to assess some calculation methods for quantifying the relationships of bearing geometry, material properties, load, deflection, stiffness, and stress. The scope of the work was limited to two-dimensional modeling of straight cylindrical roller bearings. Preparations for studies of dynamic response of bearings with damaged surfaces motivated this work. Studies were selected to exercise and build confidence in the numerical tools. Three calculation methods were used in this work. Two of the methods were numerical solutions of the Hertz contact approach. The third method used was a combined finite element surface integral method. Example calculations were done for a single roller loaded between an inner and outer raceway for code verification. Next, a bearing with 13 rollers and all-steel construction was used as an example to do additional code verification, including an assessment of the leading order of accuracy of the finite element and surface integral method. Results from that study show that the method is at least first-order accurate. Those results also show that the contact grid refinement has a more significant influence on precision as compared to the finite element grid refinement. To explore the influence of material properties, the 13-roller bearing was modeled as made from Nitinol 60, a material with very different properties from steel and showing some potential for bearing applications. The codes were exercised to compare contact areas and stress levels for steel and Nitinol 60 bearings operating at equivalent power density. As a step toward modeling the dynamic response of bearings having surface damage, static analyses were completed to simulate a bearing with a spall or similar damage.
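
    For reference (a textbook Hertz line-contact sketch under the standard assumptions, not the study's combined finite element surface integral code): the contact half-width and peak pressure for a roller pressed against a raceway follow from the effective modulus and effective radius.

      import numpy as np

      def hertz_line_contact(F, L, R1, R2, E1, nu1, E2, nu2):
          """Hertzian line contact: half-width b (m) and peak pressure p_max (Pa)
          for a cylinder of radius R1 against a surface of radius R2
          (use a negative R2 for a conforming, concave raceway)."""
          w = F / L                                  # load per unit length
          E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
          R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)
          b = np.sqrt(4.0 * w * R_eff / (np.pi * E_star))
          p_max = 2.0 * w / (np.pi * b)
          return b, p_max

      # Steel roller (r = 5 mm) on a steel inner raceway (r = 20 mm), 1 kN over 10 mm:
      print(hertz_line_contact(1e3, 0.01, 0.005, 0.02, 210e9, 0.3, 210e9, 0.3))
      # b ~ 66 micrometres, p_max ~ 0.96 GPa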

  7. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  8. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...

  9. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. A clearer understanding of how verification is itself a political problem, and of how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  10. Processing of positive-causal and negative-causal coherence relations in primary school children and adults: a test of the cumulative cognitive complexity approach in German.

    PubMed

    Knoepke, Julia; Richter, Tobias; Isberner, Maj-Britt; Naumann, Johannes; Neeb, Yvonne; Weinert, Sabine

    2017-03-01

    Establishing local coherence relations is central to text comprehension. Positive-causal coherence relations link a cause and its consequence, whereas negative-causal coherence relations add a contrastive meaning (negation) to the causal link. According to the cumulative cognitive complexity approach, negative-causal coherence relations are cognitively more complex than positive-causal ones. Therefore, they require greater cognitive effort during text comprehension and are acquired later in language development. The present cross-sectional study tested these predictions for German primary school children from Grades 1 to 4 and adults in reading and listening comprehension. Accuracy data in a semantic verification task support the predictions of the cumulative cognitive complexity approach. Negative-causal coherence relations are cognitively more demanding than positive-causal ones. Moreover, our findings indicate that children's comprehension of negative-causal coherence relations continues to develop throughout the course of primary school. Findings are discussed with respect to the generalizability of the cumulative cognitive complexity approach to German.

  11. Soil actinomycetes in the National Forest Park in northeastern China

    NASA Astrophysics Data System (ADS)

    Shirokikh, I. G.; Shirokikh, A. A.

    2017-01-01

    The taxonomic and functional structure of actinomycete complexes in the litters and upper soil horizons under an artificial coniferous-broad-leaved forest located around the city of Changchun (Jilin Province, PRC) was studied. The complex of actinomycetes included representatives of the Streptomyces, Micromonospora, Streptosporangium, and Streptoverticillium genera and oligosporous forms. In the actinomycete complexes, streptomycetes prevailed in abundance (61-95%) and frequency of occurrence (100%). In the parcels of Korean pine (Pinus koraiensis) and Mongolian oak (Quercus mongolica), streptomycetes of 19 species from 8 series and 4 sections were isolated. The most representative, as in European forest biomes, was the Cinereus Achromogenes series. A distinguishing feature of the streptomycete complex in the biomes studied was the high proportion of species from the Imperfectus series. The verification of the functional activity of natural isolates made it possible to reveal strains with high antagonistic and cellulolytic abilities. A high similarity of actinomycete complexes was found in Eurasian forest ecosystems remote from each other, probably due to the similarity of the plant polymers decomposed by actinomycetes.

  12. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high-level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  13. Satellite and aerial data as a tool for digs localisation and their verification using geophysical methods

    NASA Astrophysics Data System (ADS)

    Pavelka, Karel; Faltynova, Martina; Bila, Zdenka

    2013-04-01

    Central Europe, like other world cultural centres, has been inhabited by humans for tens of thousands of years. In the last ten years, new methods have been introduced into archaeology: sensitive geophysical methods, very high resolution remote sensing, and Airborne Laser Scanning (ALS). This contribution reports on new technological possibilities for archaeology in the Czech Republic using two project examples. VHR satellite data or aerial image data can be used for searching for potential archaeological sites. In some cases, an orthophoto mosaic is very useful; nowadays, several aerial orthophoto mosaic layers are available in the Czech Republic (2002-3, 2006 and 2009) with a pixel resolution of 25 cm. Archaeological findings in the Czech Republic are best visible through vegetation indices; for this reason, the best time for data acquisition is mid-spring, during rapid vegetation growth. Another option is soil indices, for which the best time is early spring or autumn, after the crop. A new, progressive method is ALS, which can be used for spatial indices. Since autumn 2009 the entire area of the Czech Republic has been mapped by ALS. The aim of the mapping is to obtain an authentic and detailed digital terrain model (DTM) of the Czech Republic; about 80% of the Czech territory (autumn 2012) is currently covered by the ALS-based DTM. The standard deviation of model points in altitude is better than 20 cm. The DTM displayed in an appropriate form (as a shaded surface) can be used as a data source for searching for and describing archaeological sites, mainly in forested areas. Using the above-mentioned methods, many interesting historical sites have been discovered. The logical next step is verification of these findings using terrestrial methods, in this case geophysical instruments. At CTU Prague, the walking gradiometer GSM-19 and the georadar SIR-3000 are available. In the first example, a former Prussian-Austrian fortification, normally not visible from the surface, was localized on the orthophoto mosaic and QuickBird satellite data, then located on the shaded relief by spatial indices, and finally verified with the walking magnetometer. The second example joins magnetometer and GPR data; these technologies and 3D modelling were used for localisation and verification of an unknown tomb in the neighbourhood of the church ruins in Panensky Tynec.
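
    Since shaded relief is the key visualization here, a minimal sketch of computing it from a DTM grid (standard slope/aspect shading; the illumination angles below are arbitrary defaults, not values from the project) could read:

      import numpy as np

      def hillshade(dtm, cellsize, azimuth_deg=315.0, altitude_deg=45.0):
          """Shaded-relief image of a DTM (2-D array of elevations in metres)."""
          az = np.radians(360.0 - azimuth_deg + 90.0)   # to math convention
          alt = np.radians(altitude_deg)
          dz_dy, dz_dx = np.gradient(dtm, cellsize)
          slope = np.arctan(np.hypot(dz_dx, dz_dy))
          aspect = np.arctan2(-dz_dx, dz_dy)
          shade = (np.sin(alt) * np.cos(slope)
                   + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
          return np.clip(shade, 0.0, 1.0)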

  14. Joint Estimation of Source Range and Depth Using a Bottom-Deployed Vertical Line Array in Deep Water

    PubMed Central

    Li, Hui; Yang, Kunde; Duan, Rui; Lei, Zhixiong

    2017-01-01

    This paper presents a joint estimation method of source range and depth using a bottom-deployed vertical line array (VLA). The method utilizes the information on the arrival angle of the direct (D) path in the space domain and the interference characteristic of the D and surface-reflected (SR) paths in the frequency domain. The former uses a ray-tracing technique to backpropagate the rays and produces an ambiguity surface for source range. The latter utilizes Lloyd's mirror principle to obtain an ambiguity surface for source depth. The acoustic transmission duct is the well-known reliable acoustic path (RAP). The ambiguity surface of the combined estimation is a dimensionless ad hoc function. Numerical efficiency and experimental verification show that the proposed method is a good candidate for initial coarse estimation of source position. PMID:28590442
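
    The Lloyd's mirror ingredient can be sketched as follows (our illustration, with an isovelocity assumption that the paper's ray tracing does not need): the D and SR paths differ in length by roughly 2*z_s*sin(theta) for source depth z_s and arrival (grazing) angle theta, so interference nulls are spaced Delta_f = c / (2 * z_s * sin(theta)) in frequency, which can be inverted for depth.

      import numpy as np

      def source_depth_from_nulls(delta_f, theta_rad, c=1500.0):
          """Estimate source depth (m) from the frequency spacing delta_f (Hz)
          of Lloyd's-mirror interference nulls observed at arrival angle
          theta_rad, assuming an isovelocity channel with sound speed c (m/s)."""
          return c / (2.0 * delta_f * np.sin(theta_rad))

      # Nulls every 50 Hz at a 10-degree arrival angle -> ~86 m source depth
      print(source_depth_from_nulls(50.0, np.radians(10.0)))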

  15. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  16. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  17. Verification of the NWP models operated at ICM, Poland

    NASA Astrophysics Data System (ADS)

    Melonek, Malgorzata

    2010-05-01

    The Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM) started its activity in the field of NWP in May 1997. Since that time, numerical weather forecasts covering Central Europe have been routinely published on our publicly available website. The first NWP model used at ICM was the hydrostatic Unified Model developed by the UK Meteorological Office, in a mesoscale version with a horizontal resolution of 17 km and 31 vertical levels. At present two non-hydrostatic NWP models are running in quasi-operational regime. The main new UM model, with 4 km horizontal resolution, 38 vertical levels and a forecast range of 48 hours, is run four times a day. The second, the COAMPS model (Coupled Ocean/Atmosphere Mesoscale Prediction System) developed by the US Naval Research Laboratory, configured with three nested grids (with corresponding resolutions of 39 km, 13 km and 4.3 km, and 30 vertical levels), is run twice a day (for 00 and 12 UTC). The second grid covers Central Europe and has a forecast range of 84 hours. Results of both NWP models, i.e. COAMPS computed on the 13 km mesh and UM, are verified against observations from the Polish synoptic stations. Verification uses surface observations and nearest-grid-point forecasts. The following meteorological elements are verified: air temperature at 2 m, mean sea level pressure, wind speed and direction at 10 m, and 12-hour accumulated precipitation. Different statistical indices are presented. For continuous variables, Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) are computed in 6-hour intervals. For precipitation, contingency tables for different thresholds are computed and some of the verification scores, such as FBI, ETS, POD and FAR, are graphically presented. The verification sample covers nearly one year.
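
    For readers unfamiliar with these categorical scores (a generic sketch, not ICM's code; the abbreviations follow standard forecast-verification usage: FBI = frequency bias index, POD = probability of detection, FAR = false alarm ratio, ETS = equitable threat score):

      def categorical_scores(hits, misses, false_alarms, correct_negatives):
          """Standard 2x2 contingency-table verification scores."""
          n = hits + misses + false_alarms + correct_negatives
          hits_random = (hits + misses) * (hits + false_alarms) / n
          return {
              "FBI": (hits + false_alarms) / (hits + misses),
              "POD": hits / (hits + misses),
              "FAR": false_alarms / (hits + false_alarms),
              "ETS": (hits - hits_random)
                     / (hits + misses + false_alarms - hits_random),
          }

      print(categorical_scores(hits=42, misses=18, false_alarms=25,
                               correct_negatives=280))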

  18. Dose distribution verification for GYN brachytherapy using EBT Gafchromic film and TG-43 calculation.

    PubMed

    Gholami, Somayeh; Mirzaei, Hamid Reza; Jabbary Arfaee, Ali; Jaberi, Ramin; Nedaie, Hassan Ali; Rabi Mahdavi, Seied; Rajab Bolookat, Eftekhar; Meigooni, Ali S

    2016-01-01

    This study verifies dose distributions for gynecological (GYN) brachytherapy implants using EBT Gafchromic film. One major challenge in brachytherapy is verifying the accuracy of dose distributions calculated by a treatment planning system. A new phantom was designed and fabricated using 90 slabs of 18 cm × 16 cm × 0.2 cm Perspex to accommodate a tandem-and-ovoid assembly of the kind normally used for GYN brachytherapy treatment. This phantom design allows EBT Gafchromic films to be used for dosimetric verification of GYN implants with an HDR cobalt-60 system or an LDR Cs-137 system. Gafchromic films were exposed using a plan designed to deliver 1.5 Gy at 0.5 cm from the lateral surface of the ovoids of a paired-ovoid assembly, as used for treating the vaginal cuff. For a quantitative analysis of the results for both the LDR and HDR systems, the measured dose values at several points of interest were compared with the calculated data from a commercially available treatment planning system, which uses the TG-43 formalism and parameters to calculate dose distributions around a brachytherapy implant. The results of these investigations indicated that the differences between the calculated and measured data at the different points ranged from 2.4% to 3.8% for the LDR Cs-137 and HDR Co-60 systems, respectively. The EBT Gafchromic films combined with the newly designed phantom can thus be used to verify dose distributions around different GYN implants treated with either LDR or HDR brachytherapy procedures.
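
    For context, the TG-43 point-source formalism the planning system relies on reduces to a product of the air-kerma strength, a dose-rate constant, inverse-square falloff from the 1 cm reference distance, a radial dose function, and an anisotropy factor. The sketch below shows that structure; the tabulated values and source parameters are illustrative placeholders, not consensus data for any real source model.

```python
import numpy as np

# Hypothetical radial dose function and anisotropy factor tables; real TG-43
# parameters must come from the consensus data for the specific source model.
r_tab   = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 7.0])        # distance (cm)
g_tab   = np.array([1.01, 1.00, 0.98, 0.95, 0.89, 0.83])  # g_P(r), illustrative
phi_tab = np.array([0.97, 0.97, 0.96, 0.96, 0.95, 0.94])  # phi_an(r), illustrative

def dose_rate_point(r_cm, sk=40000.0, dose_rate_const=1.11):
    """TG-43 point-source dose rate (cGy/h):
    Sk * Lambda * (r0/r)^2 * g_P(r) * phi_an(r), with r0 = 1 cm.
    sk (air-kerma strength, U) and Lambda (cGy/(h*U)) are placeholders."""
    g = np.interp(r_cm, r_tab, g_tab)
    phi = np.interp(r_cm, r_tab, phi_tab)
    return sk * dose_rate_const * (1.0 / r_cm) ** 2 * g * phi

print("dose rate at 2 cm (cGy/h):", dose_rate_point(2.0))
```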

  19. Verification Study - Wah Wah Valley, Utah. Volume I. Synthesis.

    DTIC Science & Technology

    1981-03-24

    Paleozoic limestone and dolomite, with lesser amounts of Precambrian and Cambrian quartzites and phyllites. Tertiary volcanic rocks, consisting of...of fracture along which there has been displacement. FAULT BLOCK MOUNTAINS - Mountains that are formed by normal faulting in which the surface crust...sample (ASTM D 2850-70). To conduct the test, a cylindrical specimen of soil is surrounded by a fluid in a pressure chamber and subjected to an

  20. Lunar Analog Feasibility Study Results

    NASA Technical Reports Server (NTRS)

    Cromwell, Ronita L.; Neigut, Joe

    2009-01-01

    This slide presentation reviews a study designed to determine the feasibility of using a 9.5 deg head-up tilt bed rest model to simulate the effects on the human body of the 1/6 g load that exists on the lunar surface. The effects of different types of compression stockings, the pre-bed-rest diet, and the use of a specific exercise program were reviewed for comfort, force verification, and plasma volume shift.

  1. An Extended Objective Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program

    NASA Technical Reports Server (NTRS)

    Nutter, Paul; Manobianco, John

    1998-01-01

    This report describes the Applied Meteorology Unit's objective verification of the National Centers for Environmental Prediction 29-km Eta model during separate warm and cool season periods from May 1996 through January 1998. The verification of surface and upper-air point forecasts was performed at three selected stations important to the operational weather concerns of the 45th Weather Squadron, the Spaceflight Meteorology Group, and the National Weather Service office in Melbourne. The statistical evaluation identified model biases that may result from inadequate parameterization of physical processes. Since model biases are relatively small compared to the random error component, most of the total model error results from day-to-day variability in the forecasts and/or observations. To some extent, these nonsystematic errors reflect the variability in point observations that sample spatial and temporal scales of atmospheric phenomena that cannot be resolved by the model. On average, Meso-Eta point forecasts provide useful guidance for predicting the evolution of the larger-scale environment. A more substantial challenge facing model users in real time is the discrimination of nonsystematic errors that tend to inflate the total forecast error. It is important that model users maintain awareness of ongoing model changes, since such changes are likely to modify the basic error characteristics, particularly near the surface.
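
    The report's distinction between systematic bias and nonsystematic (day-to-day) error corresponds to the standard decomposition MSE = bias^2 + var(error). A minimal NumPy sketch of that split, assuming paired point forecasts and observations:

```python
import numpy as np

def error_decomposition(forecast, observed):
    """Split mean-square error into systematic and nonsystematic parts:
    MSE = bias**2 + var(err), with var() in its population form."""
    err = np.asarray(forecast) - np.asarray(observed)
    bias = err.mean()
    return bias, bias ** 2, err.var()  # bias, systematic MSE, nonsystematic MSE

print(error_decomposition([2.0, 3.0, 4.0], [1.0, 3.5, 4.5]))
```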

  2. Verification of National Weather Service spot forecasts using surface observations

    NASA Astrophysics Data System (ADS)

    Lammers, Matthew Robert

    Software has been developed to evaluate National Weather Service spot forecasts issued to support prescribed burns and early-stage wildfires. Fire management officials request spot forecasts from National Weather Service Weather Forecast Offices to provide detailed guidance as to atmospheric conditions in the vicinity of planned prescribed burns as well as wildfires that do not have incident meteorologists on site. This open source software with online display capabilities is used to examine an extensive set of spot forecasts of maximum temperature, minimum relative humidity, and maximum wind speed from April 2009 through November 2013 nationwide. The forecast values are compared to the closest available surface observations at stations installed primarily for fire weather and aviation applications. The accuracy of the spot forecasts is compared to those available from the National Digital Forecast Database (NDFD). Spot forecasts for selected prescribed burns and wildfires are used to illustrate issues associated with the verification procedures. Cumulative statistics for National Weather Service County Warning Areas and for the nation are presented. Basic error and accuracy metrics for all available spot forecasts and the entire nation indicate that the skill of the spot forecasts is higher than that available from the NDFD, with the greatest improvement for maximum temperature and the least improvement for maximum wind speed.

  3. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. The shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, i.e., determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has largely been determined through ad hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even when a formal model and formalized requirements exist. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements, together with existing model coverage metrics such as Modified Condition and Decision Coverage (MC/DC), used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
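
    MC/DC, the model coverage metric mentioned above, requires that each boolean condition be shown to independently affect the decision's outcome. A small, self-contained Python sketch that enumerates such independence pairs for a hypothetical decision (the decision itself is an invented example, not one from the paper):

```python
from itertools import product

def decision(a, b, c):
    # Hypothetical decision under test: a AND (b OR c)
    return a and (b or c)

# MC/DC (unique-cause form): for every condition there must exist a pair of
# tests that differ only in that condition and flip the decision outcome.
tests = list(product([False, True], repeat=3))
for i, name in enumerate("abc"):
    pairs = [
        (t, u)
        for t in tests for u in tests
        if not t[i] and u[i]                               # condition flips F -> T
        and all(t[j] == u[j] for j in range(3) if j != i)  # other conditions fixed
        and decision(*t) != decision(*u)                   # outcome flips
    ]
    print(f"condition {name}: {len(pairs)} independence pair(s), e.g. {pairs[0]}")
```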

  4. Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.

    PubMed

    Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M

    2013-05-21

    This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS uses region-of-interest readout on the chip to allow fast imaging at up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic techniques but retains its fast imaging speed through regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm/s with an automatic edge-tracking algorithm to an accuracy better than 1.0 mm, even at the fastest imaging speed. The measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse-rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for the delivery of complex treatments such as VMAT.
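
    The core of such edge tracking is locating, frame by frame, where each leaf's penumbra crosses the half-maximum level of a 1-D intensity profile. The following NumPy sketch shows one plausible version on synthetic ramp-edge frames; the edge model, pixel pitch, and field size are assumptions, with the frame rate and leaf speed taken from the numbers quoted above.

```python
import numpy as np

def leaf_edge_mm(profile, pitch_mm):
    """Locate the 50%-level edge in a 1-D profile (open -> blocked) by
    linear interpolation between the two pixels straddling half-maximum."""
    half = 0.5 * (profile.max() + profile.min())
    i = int(np.argmax(profile < half))        # first pixel below half-maximum
    frac = (profile[i - 1] - half) / (profile[i - 1] - profile[i])
    return (i - 1 + frac) * pitch_mm

# Synthetic frames: an edge moving at ~18 mm/s sampled at 403.6 f/s (the
# paper's numbers); the linear ramp edge and 0.1 mm pitch are assumptions.
pitch, fps, speed = 0.1, 403.6, 18.0
x = np.arange(540) * pitch                    # 54 mm profile, 0.1 mm pixels
edges_true = 10.0 + speed * np.arange(50) / fps
frames = [np.clip((e - x) / 0.5 + 0.5, 0.0, 1.0) for e in edges_true]

edges_est = np.array([leaf_edge_mm(f, pitch) for f in frames])
print("max tracking error (mm):", np.max(np.abs(edges_est - edges_true)))
```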

  5. From model conception to verification and validation, a global approach to multiphase Navier-Stoke models with an emphasis on volcanic explosive phenomenology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurements: hence, little to no real-time data exist to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors on many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become ever more critical as more and more volcanologists use numerical data for the assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control volcanic clouds, namely, the momentum-driven supersonic jet and the buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, each of which uniquely and unambiguously represents one of the key phenomenologies.

  6. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
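
    As a reference point for the reduction methods discussed, Classical Guyan Reduction condenses the stiffness and mass matrices onto a set of master DOFs through the static transformation T = [I; -Kss^-1 Ksm]. A minimal NumPy sketch of CGR, the baseline that MGR and HR improve upon (the matrix partitioning scheme and the demo matrices are generic, not SLS models):

```python
import numpy as np

def guyan_reduce(K, M, master):
    """Classical Guyan Reduction: condense K and M onto the master DOFs
    using the static map u = T u_m, with u_s = -Kss^-1 Ksm u_m."""
    n = K.shape[0]
    master = np.asarray(master)
    slave = np.setdiff1d(np.arange(n), master)
    T = np.zeros((n, master.size))
    T[master, np.arange(master.size)] = 1.0          # identity on master DOFs
    T[np.ix_(slave, np.arange(master.size))] = -np.linalg.solve(
        K[np.ix_(slave, slave)], K[np.ix_(slave, master)])
    return T.T @ K @ T, T.T @ M @ T

# Demo: a 4-DOF spring-mass chain reduced to end DOFs 0 and 3.
K = 2.0 * np.eye(4) - np.eye(4, k=1) - np.eye(4, k=-1)
M = np.eye(4)
Kr, Mr = guyan_reduce(K, M, [0, 3])
print("reduced stiffness:\n", Kr)
```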

  7. Research on registration algorithm for check seal verification

    NASA Astrophysics Data System (ADS)

    Wang, Shuang; Liu, Tiegen

    2008-03-01

    Nowadays seals play an important role in China. With the development of the social economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification, using image processing and pattern recognition theory. First, the complex characteristics of check seals are analyzed. To eliminate differences in production conditions and the disturbance caused by background and writing in the check image, several methods are used in the pre-processing stage of check seal verification, such as color-component transformation, linear gray-scale transformation, median filtering, Otsu thresholding, and the closing and labeling operations of mathematical morphology. After these processes, a clean binary seal image is obtained. Building on traditional registration algorithms, a two-level registration method comprising rough and precise stages is proposed. The precise stage resolves the rotation angle to 0.1°. This paper also introduces the concepts of inside difference and outside difference and uses their percentages to judge whether a seal is genuine or forged. Experimental results on a large set of check seals are satisfactory. They show that the presented methods and algorithms are robust to noisy sealing conditions and tolerate within-class differences well.
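
    A minimal sketch of the two-level (rough, then precise) rotation search, assuming the seal images are already binarized (e.g., by Otsu thresholding) and roughly centered; a simple pixel-overlap count stands in for whatever similarity measure the paper actually uses.

```python
import numpy as np
from scipy.ndimage import rotate

def best_rotation(reference, probe, angles):
    """Return the angle that best aligns 'probe' to 'reference' (two binary
    images of equal shape) by maximizing pixel agreement after rotation."""
    scores = []
    for a in angles:
        r = rotate(probe.astype(float), a, reshape=False, order=1) > 0.5
        scores.append(np.sum(r == reference))
    return angles[int(np.argmax(scores))]

def register(reference, probe):
    """Two-level search mirroring the rough/precise scheme: a coarse 1-degree
    sweep, then a 0.1-degree sweep around the coarse optimum."""
    coarse = best_rotation(reference, probe, np.arange(-30.0, 30.0, 1.0))
    return best_rotation(reference, probe,
                         np.arange(coarse - 1.0, coarse + 1.0, 0.1))
```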

  8. Numerical verification of two-component dental implant in the context of fatigue life for various load cases.

    PubMed

    Szajek, Krzysztof; Wierszycki, Marcin

    2016-01-01

    Dental implant design is a complex process that must consider many limitations, both biological and mechanical in nature. In earlier studies, a complete procedure for improving a two-component dental implant was proposed. However, the optimization tasks carried out required an assumption about a representative load case, which raised doubts about optimality for other load cases. This paper deals with verification of the optimal design in the context of fatigue life, and its main goal is to answer the question of whether the assumed load scenario (a solely horizontal occlusal load) leads to a design that is also "safe" for oblique occlusal loads, regardless of the angle from the implant axis. The verification is carried out with a series of finite element analyses over a wide spectrum of physiologically justified loads. A design-of-experiments methodology with a full factorial technique is used. All computations are performed in the Abaqus suite. The maximum von Mises stress and the normalized effective stress amplitude for the various load cases are discussed and compared with the assumed "safe" limit (equivalent to a fatigue life of 5e6 cycles). The results show that the coronal-apical load component should be taken into consideration when the fatigue life of a two-component dental implant is optimized. However, its influence in the analyzed case is small and does not change the finding that the fatigue life improvement is observed for all components over the whole range of analyzed loads.
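
    The verification loop itself is straightforward: enumerate the full factorial grid of load magnitudes and angles, evaluate the fatigue response for each case, and compare it with the safe limit. The sketch below shows that scaffolding with a made-up closed-form stress response standing in for the Abaqus finite element solve; every number in it is hypothetical.

```python
import itertools
import math

# Full factorial design over occlusal load magnitude and angle from the
# implant axis; levels are illustrative, not the paper's.
magnitudes_N = [50.0, 100.0, 150.0, 200.0]
angles_deg = [0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0]
SAFE_LIMIT = 1.0   # normalized effective stress amplitude for 5e6 cycles

def normalized_stress_amplitude(force_N, angle_deg):
    # Hypothetical response: the horizontal (bending) component dominates,
    # the axial component contributes weakly. Stand-in for an FE solve.
    h = force_N * math.sin(math.radians(angle_deg))
    ax = force_N * math.cos(math.radians(angle_deg))
    return (0.9 * h + 0.1 * ax) / 160.0

for f, a in itertools.product(magnitudes_N, angles_deg):
    amp = normalized_stress_amplitude(f, a)
    status = "ok" if amp <= SAFE_LIMIT else "EXCEEDS LIMIT"
    print(f"F={f:6.1f} N, angle={a:4.1f} deg -> amplitude {amp:.3f} ({status})")
```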

  9. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong; Andrs, David; Martineau, Richard Charles

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method for solving compressible fluid flow problems is presented next, as is a Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration. The multi-fluid formulation is still being developed; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Owing to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification and validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data demonstrating the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.

  10. Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit

    NASA Technical Reports Server (NTRS)

    Sartori, John

    2005-01-01

    The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rules out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction, which allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.

  11. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  12. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  13. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  14. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  15. Low-Computation Strategies for Extracting CO2 Emission Trends from Surface-Level Mixing Ratio Observations

    NASA Astrophysics Data System (ADS)

    Shusterman, A.; Kim, J.; Lieschke, K.; Newman, C.; Cohen, R. C.

    2017-12-01

    Global momentum is building for drastic, regulated reductions in greenhouse gas emissions over the coming decade. With this increasing regulation comes a clear need for increasingly sophisticated monitoring, reporting, and verification (MRV) strategies capable of enforcing and optimizing emissions-related policy, particularly as it applies to urban areas. Remote sensing and/or activity-based emission inventories can offer MRV insights for entire sectors or regions, but are not yet sophisticated enough to resolve unexpected trends in specific emitters. Urban surface monitors can offer the desired proximity to individual greenhouse gas sources, but due to the densely packed nature of typical urban landscapes, surface observations are rarely representative of a single source. Most previous efforts to decompose these complex signals into their contributing emission processes have involved inverse atmospheric modeling techniques, which are computationally intensive and believed to depend heavily on poorly understood a priori estimates of error covariance. Here we present a number of transparent, low-computation approaches for extracting source-specific emission estimates from signals with a variety of near-field influences. Using observations from the first several years of the BErkeley Atmospheric CO2 Observation Network (BEACO2N), we demonstrate how to exploit strategic pairings of monitoring "nodes," anomalous wind conditions, and well-understood temporal variations to home in on specific CO2 sources of interest. When evaluated against conventional, activity-based bottom-up emission inventories, these strategies are seen to generate quantitatively rigorous emission estimates. With continued application as the BEACO2N data set grows in time and space, these approaches offer a promising avenue for optimizing greenhouse gas mitigation strategies into the future.
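
    One of the simplest of these strategies can be stated in a few lines: difference a downwind node against an upwind partner, keeping only the hours when the wind actually blows along the pair's axis. The sketch below is a generic illustration of that node-pairing idea, not BEACO2N code; the wind sector, units, and variable names are invented.

```python
import numpy as np

def sector_enhancement(downwind_ppm, upwind_ppm, wind_dir_deg,
                       sector=(225.0, 315.0)):
    """Mean CO2 enhancement (ppm) of a downwind node over its upwind partner,
    keeping only hours when the wind direction falls in the pair's sector."""
    wind_dir_deg = np.asarray(wind_dir_deg)
    mask = (wind_dir_deg >= sector[0]) & (wind_dir_deg <= sector[1])
    return np.mean(np.asarray(downwind_ppm)[mask] -
                   np.asarray(upwind_ppm)[mask])

# Toy hourly data: the last hour's wind direction falls outside the sector.
winds = np.array([250.0, 270.0, 300.0, 100.0])
print(sector_enhancement([412.0, 415.0, 418.0, 405.0],
                         [405.0, 406.0, 407.0, 404.0], winds))
```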

  16. Multidecadal climate variability of global lands and oceans

    USGS Publications Warehouse

    McCabe, G.J.; Palecki, M.A.

    2006-01-01

    Principal components analysis (PCA) and singular value decomposition (SVD) are used to identify the primary modes of decadal and multidecadal variability in annual global Palmer Drought Severity Index (PDSI) values and sea-surface temperatures (SSTs). The PDSI and SST data for 1925-2003 were detrended and smoothed (with a 10-year moving average) to isolate the decadal and multidecadal variability. The first two principal components (PCs) of the PDSI PCA explained almost 38% of the decadal and multidecadal variance in the detrended and smoothed global annual PDSI data. The first two PCs of detrended and smoothed global annual SSTs explained nearly 56% of the decadal variability in global SSTs. The PDSI PCs and the SST PCs are directly correlated in a pairwise fashion. The first PDSI and SST PCs reflect variability of the detrended and smoothed annual Pacific Decadal Oscillation (PDO), as well as detrended and smoothed annual Indian Ocean SSTs. The second set of PCs is strongly associated with the Atlantic Multidecadal Oscillation (AMO). The SVD analysis of the cross-covariance of the PDSI and SST data confirmed the close link between the PDSI and SST modes of decadal and multidecadal variation and provided a verification of the PCA results. These findings indicate that the major modes of multidecadal variations in SSTs and land-surface climate conditions are highly interrelated through a small number of spatially complex but slowly varying teleconnections. Therefore, these relations may be adaptable to providing improved baseline conditions for seasonal climate forecasting. Published in 2006 by John Wiley & Sons, Ltd.
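
    The processing chain described (detrend, 10-year moving average, then PCA) is compact enough to sketch. The following NumPy outline applies those steps to a generic (time × grid) field; it illustrates the method under simplifying assumptions and is not the authors' code.

```python
import numpy as np

def decadal_modes(field, window=10, n_modes=2):
    """Leading PCA modes of a detrended, moving-average-smoothed
    (time x grid) field; a minimal stand-in for the PDSI/SST analysis."""
    t = np.arange(field.shape[0], dtype=float)
    # Remove the linear trend at every grid point.
    A = np.vstack([t, np.ones_like(t)]).T
    coef, *_ = np.linalg.lstsq(A, field, rcond=None)
    detrended = field - A @ coef
    # Centered moving average to isolate decadal variability.
    kernel = np.ones(window) / window
    smooth = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, detrended)
    # PCA via SVD of the centered anomaly matrix.
    anomalies = smooth - smooth.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    return u[:, :n_modes] * s[:n_modes], vt[:n_modes], explained[:n_modes]

# Demo on random data shaped like 79 years x 500 grid points.
rng = np.random.default_rng(1)
pcs, patterns, frac = decadal_modes(rng.standard_normal((79, 500)))
print("explained variance fractions:", frac)
```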

  17. An IoT-Enabled Stroke Rehabilitation System Based on Smart Wearable Armband and Machine Learning.

    PubMed

    Yang, Geng; Deng, Jia; Pang, Gaoyang; Zhang, Hao; Li, Jiayi; Deng, Bin; Pang, Zhibo; Xu, Juan; Jiang, Mingzhe; Liljeberg, Pasi; Xie, Haibo; Yang, Huayong

    2018-01-01

    Surface electromyography (sEMG) signals play an important role in hand function recovery training. In this paper, an IoT-enabled stroke rehabilitation system is introduced that is based on a smart wearable armband (SWA), machine learning (ML) algorithms, and a 3-D printed dexterous robot hand. User comfort is one of the key issues that should be addressed for wearable devices. The SWA was developed by integrating a low-power, tiny-sized IoT sensing device with textile electrodes, which can measure, pre-process, and wirelessly transmit bio-potential signals. By evenly distributing surface electrodes over the user's forearm, the drawback of poor classification accuracy can be mitigated. A new method was put forward to find the optimal feature set. ML algorithms were leveraged to analyze and discriminate features of different hand movements, and their performance was appraised by classification-complexity estimation algorithms and principal component analysis. According to the verification results, all nine gestures can be successfully identified with an average accuracy of up to 96.20%. In addition, a 3-D printed five-finger robot hand was implemented for hand rehabilitation training purposes. Correspondingly, the user's hand movement intentions were extracted and converted into a series of commands used to drive motors assembled inside the dexterous robot hand. As a result, the dexterous robot hand can mimic the user's gesture in real time, which shows that the proposed system can be used as a training tool to facilitate the rehabilitation process for patients after stroke.
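
    A generic version of this pipeline (windowed time-domain features, then a trained classifier evaluated by cross-validation) can be sketched as follows. The features are common sEMG choices and the classifier is a stand-in; the paper's own feature-selection method and ML algorithms are not reproduced here, and the data below are random placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def time_domain_features(window):
    """Common sEMG time-domain features per channel: mean absolute value,
    waveform length, RMS, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, rms, zc])

# Placeholder data: 90 windows of 200 samples x 8 channels, 9 gesture labels.
rng = np.random.default_rng(0)
windows = rng.standard_normal((90, 200, 8))
labels = np.repeat(np.arange(9), 10)

X = np.array([time_domain_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
print("CV accuracy:", cross_val_score(clf, X, labels, cv=3).mean())
```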

  18. Long-term safety assessment of trench-type surface repository at Chernobyl, Ukraine - computer model and comparison with results from simplified models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    2013-07-01

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, the surface repository for solid low- and intermediate-level waste (LILW) is still being operated, but its maximum capacity is nearly reached. Long-existing plans for increasing the capacity of the facility are to be implemented in the framework of the European Commission INSC Programme (Instrument for Nuclear Safety Co-operation). Within the first phase of this project, DBE Technology GmbH prepared a safety analysis report of the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) for a future extended facility based on the planned enlargement. In addition to a detailed mathematical model, simplified models were developed to verify the results of the former and enhance confidence in them. Comparison of the results shows that, depending on the boundary conditions, simplifications such as modeling the multi-trench repository as one generic trench may have very limited influence on the overall results compared with the general uncertainties associated with such long-term calculations. Besides their value for verifying more complex models, which is important for increasing confidence in the overall results, such simplified models also make it possible to carry out time-consuming calculations, such as probabilistic calculations or detailed sensitivity analyses, in an economic manner. (authors)

  19. SEU System Analysis: Not Just the Sum of All Parts

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth

    2014-01-01

    Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component-level partitioning and then either the most dominant SEU cross-sections are used in system error rate calculations, or the partition cross-sections are summed to obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system-level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved in the current scheme of SEU analysis for complex systems, and to provide alternative methods for improvement.
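
    The numerical point is easy to see in a toy calculation: summing raw partition cross-sections ignores how often an upset in each partition is architecturally masked. The sketch below contrasts the dominant-only, summed, and derated estimates; all cross-sections, derating factors, and the particle flux are hypothetical.

```python
# Toy comparison of the bounding approaches the talk contrasts, plus a
# derated estimate. Every number here is an invented placeholder.
partitions = {
    # name: (SEU cross-section [cm^2], system-level derating factor)
    "control_fsm": (2.0e-8, 0.9),   # nearly always architecturally visible
    "datapath":    (5.0e-8, 0.3),   # many upsets masked downstream
    "debug_logic": (1.0e-8, 0.01),  # almost never functionally used
}
flux = 1.0e-3  # particles / (cm^2 * s), hypothetical environment

dominant = max(sigma for sigma, _ in partitions.values()) * flux
summed = sum(sigma for sigma, _ in partitions.values()) * flux
derated = sum(sigma * d for sigma, d in partitions.values()) * flux
print(f"dominant-only: {dominant:.3e}/s  "
      f"summed: {summed:.3e}/s  derated: {derated:.3e}/s")
```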

  20. On improving the performance of nonphotochemical quenching in CP29 light-harvesting antenna complex

    NASA Astrophysics Data System (ADS)

    Berman, Gennady P.; Nesterov, Alexander I.; Sayre, Richard T.; Still, Susanne

    2016-03-01

    We model and simulate the performance of charge-transfer in nonphotochemical quenching (NPQ) in the CP29 light-harvesting antenna-complex associated with photosystem II (PSII). The model consists of five discrete excitonic energy states and two sinks, responsible for the potentially damaging processes and charge-transfer channels, respectively. We demonstrate that by varying (i) the parameters of the chlorophyll-based dimer, (ii) the resonant properties of the protein-solvent environment interaction, and (iii) the energy transfer rates to the sinks, one can significantly improve the performance of the NPQ. Our analysis suggests strategies for improving the performance of the NPQ in response to environmental changes, and may stimulate experimental verification.
