Sample records for experimental verification techniques

  1. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification effort by up to 20% for a complex vision chip design while reducing simulation and debugging overheads.

  2. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  3. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
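
    To make the Horn-clause encoding concrete, the following is a minimal sketch (ours, not SeaHorn's own output) of a loop-safety verification condition expressed as Horn clauses and discharged with Z3's Spacer engine through the z3py API; the program, invariant relation, and bound are hypothetical.

      from z3 import Fixedpoint, Function, IntSort, BoolSort, Int, Bool, And

      fp = Fixedpoint()
      fp.set(engine='spacer')                       # PDR/IC3-style Horn-clause solver

      inv = Function('inv', IntSort(), BoolSort())  # invariant relation over one Int state
      err = Bool('err')                             # "bad state reachable" relation
      x, xn = Int('x'), Int('xn')
      fp.register_relation(inv, err.decl())
      fp.declare_var(x, xn)

      fp.rule(inv(x), x == 0)                              # init: x = 0
      fp.rule(inv(xn), And(inv(x), x < 10, xn == x + 1))   # loop: while (x < 10) x++
      fp.rule(err, And(inv(x), x > 10))                    # assert(x <= 10)

      print(fp.query(err))   # unsat: the assertion can never fail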

  4. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  5. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    NASA Technical Reports Server (NTRS)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.
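
    For orientation, the balance solved at each computing station in a streamline curvature method reduces, when streamline slope and curvature terms are neglected (a simplification made only for this illustration), to the simple radial equilibrium relation

      \[
        \frac{1}{\rho}\,\frac{\partial p}{\partial r} = \frac{V_\theta^{2}}{r}
      \]

    where p is the static pressure, ρ the density, V_θ the tangential velocity, and r the radius from the machine axis.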

  6. Bistatic radar sea state monitoring system design

    NASA Technical Reports Server (NTRS)

    Ruck, G. T.; Krichbaum, C. K.; Everly, J. O.

    1975-01-01

    Remote measurement of the two-dimensional surface wave height spectrum of the ocean by the use of bistatic radar techniques was examined. The potential feasibility of the approach and its experimental verification by field experiment are discussed. The required experimental hardware is defined, along with the design, assembly, and testing of several required experimental hardware components.

  7. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  8. Heterogeneity of activated carbons in adsorption of phenols from aqueous solutions—Comparison of experimental isotherm data and simulation predictions

    NASA Astrophysics Data System (ADS)

    Podkościelny, P.; Nieszporek, K.

    2007-01-01

    Surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting the underlying integral equation. The experimental data on phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) have been taken from the literature. AED functions for phenol adsorption, generated by application of the regularization method, have been verified. The Grand Canonical Monte Carlo (GCMC) simulation technique has been used as the verification tool. The definitive stage of verification was a comparison of the experimental adsorption data with those obtained from the GCMC simulations. The information necessary for performing the simulations was provided by the parameters of the AED functions calculated by the regularization method.
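
    The inversion step described above can be sketched in a few lines of numpy; the grids, the Langmuir-type local isotherm, and the synthetic "measured" isotherm are illustrative assumptions, not the paper's data or its regularization code.

      import numpy as np

      # Discretize theta(p_i) = integral K(p_i, E) F(E) dE on hypothetical grids.
      p = np.logspace(-3, 1, 40)                # relative pressures
      E = np.linspace(5.0, 25.0, 60)            # adsorption energies, kJ/mol
      dE = E[1] - E[0]
      RT = 2.5                                  # ~RT in kJ/mol near room temperature
      KpE = p[:, None] * np.exp(E[None, :] / RT)
      K = KpE / (1.0 + KpE)                     # Langmuir-type local isotherm kernel

      F_true = np.exp(-(E - 15.0) ** 2 / 8.0)   # synthetic Gaussian AED
      theta = (K * dE) @ F_true                 # synthetic "measured" isotherm

      # Tikhonov regularization: min ||K F dE - theta||^2 + lam ||F||^2
      lam = 1e-2
      A = np.vstack([K * dE, np.sqrt(lam) * np.eye(len(E))])
      b = np.concatenate([theta, np.zeros(len(E))])
      F_est, *_ = np.linalg.lstsq(A, b, rcond=None)   # smoothed estimate of F(E)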

  9. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

    A traditional binding acceptance criterion for polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design value at the maximum expected operational limit. In this extreme strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. The test may result in erroneously accepting a submarginal design or rejecting a reliable one. A technique was developed to identify, monitor, and assess the load transmission error through two back-to-back surface-measured strain signals. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.
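
    In the elastic range, the back-to-back strain idea reduces to splitting the two surface strains into membrane and bending parts; a hypothetical sketch follows (the paper's technique extends this through inelastic zones):

      def section_loads(eps_front, eps_back, E_mod, area, inertia, thickness):
          # Split the two surface strains into membrane and bending parts.
          eps_membrane = 0.5 * (eps_front + eps_back)   # uniform (axial) component
          eps_bending = 0.5 * (eps_front - eps_back)    # linear (bending) component
          N = E_mod * area * eps_membrane               # axial force
          M = E_mod * inertia * eps_bending / (0.5 * thickness)  # bending moment
          return N, M, M / N                            # moment-to-axial ratio to monitor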

  10. Application of additive laser technologies in the gas turbine blades design process

    NASA Astrophysics Data System (ADS)

    Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.

    2017-11-01

    The emergence of modern innovative technologies requires both the delivery of new design and production processes and the modernization of existing ones. This is especially relevant for designing the high-temperature turbines of gas turbine engines, whose development is characterized by a transition to higher parameters of the working medium in order to improve efficiency. A design technique for gas turbine blades, based on predictive verification of the thermal and hydraulic models of their cooling systems by testing a blade prototype fabricated using selective laser melting technology, is presented in this article. The technique was proven during development of the first-stage blade cooling system for a high-pressure turbine. An experimental procedure for verification of a thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat, was developed. The technique makes it possible to obtain an experimentally tested blade version and to exclude its experimental adjustment after the start of mass production.

  11. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  12. Development of automated optical verification technologies for control systems

    NASA Astrophysics Data System (ADS)

    Volegov, Peter L.; Podgornov, Vladimir A.

    1999-08-01

    The report considers optical techniques for the automated verification of an object's identity, designed for control systems of nuclear facilities. Results are presented of experimental research and of the development of pattern recognition techniques carried out under ISTC project number 772, aimed at identifying unique features of the surface structure of a controlled object and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of lab-to-lab cooperation between US and Russian laboratories, including the development of up-to-date systems for nuclear material control and accounting, are examined.

  13. Heat-straightening effects on the behavior of plates and rolled shapes : volume 2 : second interim report of phase 1.

    DOT National Transportation Integrated Search

    1987-08-01

    One of the primary reasons that highway departments are hesitant to use heat-straightening techniques to repair damaged steel girders is the lack of experimental verification of the process. A comprehensive experimental program on the subject has bee...

  14. Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.

    ERIC Educational Resources Information Center

    Kaya, Azmi

    1982-01-01

    Discusses analytical design and experimental verification of a PID control value for a temperature controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…

  15. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  16. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques for experimental verification, in which only spacecraft components are vibrated and the modes and frequencies of the complete vehicle are deduced from the results of the component tests.

  17. Optical detection of random features for high security applications

    NASA Astrophysics Data System (ADS)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features, in combination with digital signatures based on public key codes, for recognizing counterfeit objects is discussed. Objects are protected against counterfeiting without the application of expensive production techniques. Verification is done off-line by optical means, without a central authority. The method is applied to protecting banknotes, and experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.

  18. Behavioral biometrics for verification and recognition of malicious software agents

    NASA Astrophysics Data System (ADS)

    Yampolskiy, Roman V.; Govindaraju, Venu

    2008-04-01

    Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques developed by us for recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate feasibility of such methods for both artificial agent verification and even for recognition purposes.

  19. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
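
    As a toy illustration of compiling a state machine in continuation-passing style (a hypothetical two-state chart, far simpler than the paper's compiler):

      # Each state handler consumes an event and passes the next handler to its
      # continuation k, instead of returning it directly.
      def state_off(event, k):
          return k(state_on if event == 'switch' else state_off)

      def state_on(event, k):
          return k(state_off if event == 'switch' else state_on)

      def step(state, event):
          return state(event, lambda next_state: next_state)

      s = state_off
      for ev in ['switch', 'tick', 'switch']:
          s = step(s, ev)
      print(s.__name__)   # state_off: the two switch events cancel out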

  2. A novel method for determining calibration and behavior of PVDF ultrasonic hydrophone probes in the frequency range up to 100 MHz.

    PubMed

    Bleeker, H J; Lewin, P A

    2000-01-01

    A new calibration technique for PVDF ultrasonic hydrophone probes is described. Current implementation of the technique allows determination of hydrophone frequency response between 2 and 100 MHz and is based on the comparison of theoretically predicted and experimentally determined pressure-time waveforms produced by a focused, circular source. The simulation model was derived from the time domain algorithm that solves the nonlinear KZK (Khokhlov-Zabolotskaya-Kuznetsov) equation describing acoustic wave propagation. The calibration technique data were experimentally verified using independent calibration procedures in the frequency range from 2 to 40 MHz using a combined time delay spectrometry and reciprocity approach or calibration data provided by the National Physical Laboratory (NPL), UK. The results of verification indicated good agreement between the results obtained using KZK and the above-mentioned independent calibration techniques from 2 to 40 MHz, with the maximum discrepancy of 18% at 30 MHz. The frequency responses obtained using different hydrophone designs, including several membrane and needle probes, are presented, and it is shown that the technique developed provides a desirable tool for independent verification of primary calibration techniques such as those based on optical interferometry. Fundamental limitations of the presented calibration method are also examined.

  3. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    NASA Astrophysics Data System (ADS)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of dynamics managed according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback-controlled quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at the microscale with current technology in a variety of experimental platforms.
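
    For context, relations of this kind refine the well-known Sagawa-Ueda integral fluctuation relation for feedback-controlled processes (a standard result quoted here for orientation; the paper's two detailed relations are not reproduced):

      \[
        \bigl\langle e^{-\beta (W - \Delta F) - I} \bigr\rangle = 1
      \]

    where W is the work, ΔF the free-energy difference, β the inverse temperature, and I the mutual information gained by the measurement.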

  4. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  5. Fog dispersion. [charged particle technique

    NASA Technical Reports Server (NTRS)

    Christensen, L. S.; Frost, W.

    1980-01-01

    The concept of using the charged particle technique to disperse warm fog at airports is investigated and compared with other techniques. The charged particle technique shows potential for warm fog dispersal, but experimental verification of several significant parameters, such as particle mobility and charge density, is needed. Seeding and helicopter downwash techniques are also effective for warm fog dispersal, but are presently not believed to be viable techniques for routine airport operations. Thermal systems are currently used at a few overseas airports; however, they are expensive and pose potential environmental problems.

  6. Experimental verification of a new laminar airfoil: A project for the graduate program in aeronautics

    NASA Technical Reports Server (NTRS)

    Nicks, Oran W.; Korkan, Kenneth D.

    1991-01-01

    Two reports on student activities to determine the properties of a new laminar airfoil, delivered at a conference on soaring technology, are presented. The papers discuss a wind tunnel investigation and analysis of the SM701 airfoil and verification of the SM701 airfoil aerodynamic characteristics utilizing theoretical techniques. The papers are based on a combination of analytical design, hands-on model fabrication, wind tunnel calibration and testing, data acquisition and analysis, and comparison of test results and theory.

  7. Verification of elastic-wave static displacement in solids. [using ultrasonic techniques on Ge single crystals

    NASA Technical Reports Server (NTRS)

    Cantrell, J. H., Jr.; Winfree, W. P.

    1980-01-01

    The solution of the nonlinear differential equation which describes an initially sinusoidal finite-amplitude elastic wave propagating in a solid contains a static-displacement term in addition to the harmonic terms. The static-displacement amplitude is theoretically predicted to be proportional to the product of the squares of the driving-wave amplitude and the driving-wave frequency. The first experimental verification of the elastic-wave static displacement in a solid (the [111] direction of single-crystal germanium) is reported, and agreement is found with the theoretical predictions.
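
    In compact form, the scaling verified by the experiment reads

      \[
        u_{\mathrm{static}} \propto A^{2}\,\omega^{2}
      \]

    with A the driving-wave amplitude and ω its angular frequency.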

  8. Experimental investigation of practical unforgeable quantum money

    NASA Astrophysics Data System (ADS)

    Bozzio, Mathieu; Orieux, Adeline; Trigo Vidarte, Luis; Zaquine, Isabelle; Kerenidis, Iordanis; Diamanti, Eleni

    2018-01-01

    Wiesner's unforgeable quantum money scheme is widely celebrated as the first quantum information application. Based on the no-cloning property of quantum mechanics, this scheme allows for the creation of credit cards used in authenticated transactions offering security guarantees impossible to achieve by classical means. However, despite its central role in quantum cryptography, its experimental implementation has remained elusive because of the lack of quantum memories and of practical verification techniques. Here, we experimentally implement a quantum money protocol relying on classical verification that rigorously satisfies the security condition for unforgeability. Our system exploits polarization encoding of weak coherent states of light and operates under conditions that ensure compatibility with state-of-the-art quantum memories. We derive working regimes for our system using a security analysis taking into account all practical imperfections. Our results constitute a major step towards a real-world realization of this milestone protocol.

  10. Experimental evidence for a new single-event upset (SEU) mode in a CMOS SRAM obtained from model verification

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Lo, R. Y.

    1987-01-01

    Modeling of SEU has been done in a CMOS static RAM containing 1-micron-channel-length transistors fabricated from a p-well epilayer process using both circuit-simulation and numerical-simulation techniques. The modeling results have been experimentally verified with the aid of heavy-ion beams obtained from a three-stage tandem van de Graaff accelerator. Experimental evidence for a novel SEU mode in an ON n-channel device is presented.

  11. Investigation of optical/infrared sensor techniques for application satellites

    NASA Technical Reports Server (NTRS)

    Kaufman, I.

    1972-01-01

    A method of scanning an optical sensor array by acoustic surface waves is discussed. The data cover a detailed computer-based analysis of the operation of a multielement acoustic surface-wave-scanned optical sensor; the development of design and operation techniques that were used to show the feasibility of an integrated array and to design several such arrays; and experimental verification of a number of the calculations with discrete sensor devices.

  12. A method of atmospheric density measurements during space shuttle entry using ultraviolet-laser Rayleigh scattering

    NASA Technical Reports Server (NTRS)

    Mckenzie, Robert L.

    1988-01-01

    An analytical study and its experimental verification are described which show the performance capabilities and the hardware requirements of a method for measuring atmospheric density along the Space Shuttle flightpath during entry. Using onboard instrumentation, the technique relies on Rayleigh scattering of light from a pulsed ArF excimer laser operating at a wavelength of 193 nm. The method is shown to be capable of providing density measurements with an uncertainty of less than 1 percent and with a spatial resolution along the flightpath of 1 km, over an altitude range from 50 to 90 km. Experimental verification of the signal linearity and the expected signal-to-noise ratios is demonstrated in a simulation facility at conditions that duplicate the signal levels of the flight environment.

  13. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
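
    As a minimal illustration of a low-overhead runtime monitor (a hypothetical two-state automaton for the LTL-style property "always (req implies eventually grant)"; the paper's generated monitors and encodings are more involved):

      WAITING, PENDING = 0, 1

      class Monitor:
          def __init__(self):
              self.state = WAITING
          def step(self, req, grant):
              # One transition per event keeps the per-event overhead to a few branches.
              if self.state == WAITING and req and not grant:
                  self.state = PENDING
              elif self.state == PENDING and grant:
                  self.state = WAITING
          def at_end(self):
              # On a finite trace, a request still pending at the end is a violation.
              return self.state == WAITING

      m = Monitor()
      for req, grant in [(1, 0), (0, 0), (0, 1)]:
          m.step(req, grant)
      print(m.at_end())   # True: the single request was eventually granted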

  14. Bullying in School: Case Study of Prevention and Psycho-Pedagogical Correction

    ERIC Educational Resources Information Center

    Ribakova, Laysan A.; Valeeva, Roza A.; Merker, Natalia

    2016-01-01

    The purpose of the study was the theoretical justification and experimental verification of content, complex forms and methods to ensure effective prevention and psycho-pedagogical correction of bullying in school. 53 teenage students from Kazan took part in the experiment. A complex of diagnostic techniques for the detection of violence and…

  15. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing.

    PubMed

    Yassin, Ali A

    2014-01-01

    Now, the security of digital images is considered more and more essential, and the fingerprint plays a main role in the world of imaging. Fingerprint recognition is a biometric verification scheme that applies pattern recognition techniques to individual fingerprint images. In the cloud environment, an adversary has the ability to intercept information, so it must be secured from eavesdroppers. Unluckily, encryption and decryption functions are slow and often costly. Fingerprint techniques require extra hardware and software and can be masqueraded by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new fingerprint verification scheme. Our proposed scheme overcomes these problems: it requires little extra cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme performs well in user fingerprint verification.
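
    A rough sketch of the partial-encryption idea using PyWavelets: transform the image and protect only the approximation sub-band, which carries most of the energy. The image, the keystream, and the additive masking are stand-in assumptions, not the paper's cipher.

      import numpy as np
      import pywt  # PyWavelets

      img = np.random.rand(128, 128)              # stand-in for a fingerprint image
      cA, (cH, cV, cD) = pywt.dwt2(img, 'haar')   # single-level 2-D DWT

      rng = np.random.default_rng(seed=42)        # keystream derived from a key (hypothetical)
      cA_enc = cA + rng.random(cA.shape)          # mask only the approximation band

      # Detail bands stay in the clear; without the key the reconstruction is useless.
      protected = pywt.idwt2((cA_enc, (cH, cV, cD)), 'haar')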

  16. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing

    PubMed Central

    Yassin, Ali A.

    2014-01-01

    Now, the security of digital images is considered more and more essential, and the fingerprint plays a main role in the world of imaging. Fingerprint recognition is a biometric verification scheme that applies pattern recognition techniques to individual fingerprint images. In the cloud environment, an adversary has the ability to intercept information, so it must be secured from eavesdroppers. Unluckily, encryption and decryption functions are slow and often costly. Fingerprint techniques require extra hardware and software and can be masqueraded by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new fingerprint verification scheme. Our proposed scheme overcomes these problems: it requires little extra cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme performs well in user fingerprint verification. PMID:27355051

  17. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modernized society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise method is proposed for modeling, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and has demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004, and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62%, and 3.91% on skilled forgeries, respectively. PMID:24696797
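
    A simplified sketch of deriving stable per-stroke features from the velocity signal via the discrete cosine transform; the leading DCT coefficients stand in here for the paper's pole-zero model parameters.

      import numpy as np
      from scipy.fft import dct

      def stroke_features(vx, vy, n_coeffs=12):
          speed = np.hypot(vx, vy)          # scalar pen-velocity signal
          c = dct(speed, norm='ortho')      # energy compacts into low-order terms
          return c[:n_coeffs]               # compact, person-stable feature vector

      vx, vy = np.random.randn(200), np.random.randn(200)   # hypothetical stroke
      print(stroke_features(vx, vy).shape)  # (12,)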

  18. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 °C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.

  19. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m². Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 °C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft² of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m².

  20. Experimental verification of PSM polarimetry: monitoring polarization at 193nm high-NA with phase shift masks

    NASA Astrophysics Data System (ADS)

    McIntyre, Gregory; Neureuther, Andrew; Slonaker, Steve; Vellanki, Venu; Reynolds, Patrick

    2006-03-01

    The initial experimental verification of a polarization monitoring technique is presented. A series of phase shifting mask patterns produce polarization dependent signals in photoresist and are capable of monitoring the Stokes parameters of any arbitrary illumination scheme. Experiments on two test reticles have been conducted. The first reticle consisted of a series of radial phase gratings (RPG) and employed special apertures to select particular illumination angles. Measurement sensitivities of about 0.3 percent of the clear field per percent change in polarization state were observed. The second test reticle employed the more sensitive proximity effect polarization analyzers (PEPA), a more robust experimental setup, and a backside pinhole layer for illumination angle selection and to enable characterization of the full illuminator. Despite an initial complication with the backside pinhole alignment, the results correlate with theory. Theory suggests that, once the pinhole alignment is corrected in the near future, the second reticle should achieve a measurement sensitivity of about 1 percent of the clear field per percent change in polarization state. This corresponds to a measurement of the Stokes parameters after test mask calibration, to within about 0.02 to 0.03. Various potential improvements to the design, fabrication of the mask, and experimental setup are discussed. Additionally, to decrease measurement time, a design modification and double exposure technique is proposed to enable electrical detection of the measurement signal.

  1. Palmprint verification using Lagrangian decomposition and invariant interest points

    NASA Astrophysics Data System (ADS)

    Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.

    2011-06-01

    This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. We employ SIFT for feature extraction from palmprint images, where the region of interest (ROI), extracted from the wide palm texture at the preprocessing stage, is considered for invariant point extraction. Finally, identity is established by finding the permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features. The permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.

  2. Arithmetic Circuit Verification Based on Symbolic Computer Algebra

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo

    This paper presents a formal approach to verify arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit description can be effectively verified by polynomial reduction techniques using Gröbner Bases. In this paper, we describe how the symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantageous effects of the proposed approach are demonstrated through experimental verification of some arithmetic circuits such as multiply-accumulator and FIR filter. The result shows that the proposed approach has a definite possibility of verifying practical arithmetic circuits.
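
    A small sympy sketch of the Gröbner-basis reduction idea on a half adder (our toy instance, not the paper's weighted-number-system machinery): the arithmetic specification must reduce to zero modulo the ideal generated by the gate polynomials and the Boolean constraints.

      from sympy import symbols, groebner

      a, b, s, c = symbols('a b s c')
      gates = [s - (a + b - 2*a*b),   # XOR gate as a polynomial
               c - a*b,               # AND gate
               a**2 - a, b**2 - b]    # inputs are Boolean
      G = groebner(gates, a, b, s, c, order='lex')
      spec = (a + b) - (2*c + s)      # arithmetic specification: a + b = 2c + s
      _, remainder = G.reduce(spec)
      print(remainder)                # 0 -> circuit matches the specification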

  3. Experimental verification of arm-locking for LISA using electronic phase delay

    NASA Astrophysics Data System (ADS)

    Thorpe, J. I.; Mueller, G.

    2005-07-01

    We present results of an electronic model of arm-locking, a proposed technique for reducing the laser phase noise in the laser interferometer space antenna (LISA). The model is based on a delay of 500 ms, achieved using the electronic phase delay (EPD) method. The observed behavior is consistent with predictions.

  4. Multichannel forward scattering meter for oceanography

    NASA Technical Reports Server (NTRS)

    Mccluney, W. R.

    1974-01-01

    An instrument was designed and built that measures the light scattered at several angles in the forward direction simultaneously. The instrument relies on an optical multiplexing technique for frequency encoding of the different channels suitable for detection by a single photodetector. A Mie theory computer program was used to calculate the theoretical volume scattering function for a suspension of polystyrene latex spheres. The agreement between the theoretical and experimental volume scattering functions is taken as a verification of the calibration technique used.

  5. Propeller flow visualization techniques

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.

    1982-01-01

    Propeller flow visualization techniques were tested to establish the actual operating blade shape, which determines actual propeller performance and noise. The ability to photographically determine advanced propeller blade tip deflections and local flow field conditions, and to gain insight into aeroelastic instability, is demonstrated. The analytical prediction methods being developed can be compared with experimental data; these comparisons contribute to the verification of the improved methods and give improved capability for designing future advanced propellers with enhanced performance and noise characteristics.

  6. Manipulation strategies for massive space payloads

    NASA Technical Reports Server (NTRS)

    Book, Wayne J.

    1989-01-01

    Control for the bracing strategy is being examined. It was concluded earlier that trajectory planning must be improved to best achieve the bracing motion. Very interesting results were achieved which enable the inverse dynamics of flexible arms to be calculated for linearized motion in a more efficient manner than previously published. The desired motion of the end point, beginning at t = 0 and ending at t = t_f, is used to calculate the required torque at the joint. The solution is separated into a causal function that is zero for t < 0 and an acausal function that is zero for t > t_f. A number of alternative end point trajectories were explored in terms of the peak torque required, the amount of anticipatory action, and other issues. The single link case is the immediate subject, and an experimental verification of that case is being performed. Modeling with experimental verification of closed chain dynamics continues. The modeling effort has pointed out inaccuracies that result from the choice of numerical techniques used to incorporate the closed chain constraints when modeling our experimental prototype RALF (Robotic Arm Large and Flexible). Results were compared to TREETOPS, a multibody code. The experimental verification work is suggesting new ways to make comparisons with systems having structural linearity and joint and geometric nonlinearity. The generation of inertial forces was studied with a small arm that will damp the large arm's vibration.

  7. Random technique to encode complex valued holograms with on axis reconstruction onto phase-only displays.

    PubMed

    Luis Martínez Fuentes, Jose; Moreno, Ignacio

    2018-03-05

    A new technique for encoding the amplitude and phase of diffracted fields in digital holography is proposed. It is based on a random spatial multiplexing of two phase-only diffractive patterns. The first one is the phase information of the intended pattern, while the second one is a diverging optical element whose purpose is the control of the amplitude. A random number determines the choice between these two diffractive patterns at each pixel, and the amplitude information of the desired field governs its discrimination threshold. This proposed technique is computationally fast and does not require iterative methods, and the complex field reconstruction appears on axis. We experimentally demonstrate this new encoding technique with holograms implemented onto a flicker-free phase-only spatial light modulator (SLM), which allows the axial generation of such holograms. The experimental verification includes the phase measurement of generated patterns with a phase-shifting polarization interferometer implemented in the same experimental setup.
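
    A numpy sketch of the random spatial multiplexing rule described above: each pixel shows the signal phase with probability equal to the target amplitude, and a diverging-lens phase otherwise. Grid size, pixel pitch, wavelength, and focal length are illustrative assumptions.

      import numpy as np

      N, lam, f, pitch = 512, 633e-9, -0.5, 8e-6   # pixels, wavelength (m), lens focus (m), pitch (m)
      x = (np.arange(N) - N / 2) * pitch
      X, Y = np.meshgrid(x, x)

      amp = np.clip(np.random.rand(N, N), 0, 1)    # target amplitude, normalized to [0, 1]
      phi = 2 * np.pi * np.random.rand(N, N)       # target phase

      lens = np.pi * (X**2 + Y**2) / (lam * f)     # diverging-lens phase (f < 0)
      pick_signal = np.random.rand(N, N) < amp     # amplitude sets the per-pixel threshold
      slm_phase = np.where(pick_signal, phi, lens) # phase-only pattern for the SLM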

  8. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.

  9. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large, high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.

  10. Square wave voltammetry at the dropping mercury electrode: Experimental

    USGS Publications Warehouse

    Turner, J.A.; Christie, J.H.; Vukovic, M.; Osteryoung, R.A.

    1977-01-01

    Experimental verification of earlier theoretical work for square wave voltammetry at the dropping mercury electrode is given. Experiments using ferric oxalate and cadmium(II) in HCl confirm excellent agreement with theory. Experimental peak heights and peak widths are found to be within 2% of calculated results. An example of trace analysis using square wave voltammetry at the DME is presented. The technique is shown to have the same order of sensitivity as differential pulse polarography but is much faster to perform. A detection limit for cadmium in 0.1 M HCl for the system used here was 7 × 10⁻⁸ M.

  11. Experimental verification of distributed piezoelectric actuators for use in precision space structures

    NASA Technical Reports Server (NTRS)

    Crawley, E. F.; De Luis, J.

    1986-01-01

    An analytic model for structures with distributed piezoelectric actuators is experimentally verified for the cases of both surface-bonded and embedded actuators. A technique for the selection of such piezoelectric actuators' location has been developed, and is noted to indicate that segmented actuators are always more effective than continuous ones, since the output of each can be individually controlled. Manufacturing techniques for the bonding or embedding of segmented piezoelectric actuators are also developed which allow independent electrical contact to be made with each actuator. Static tests have been conducted to determine how the elastic properties of the composite are affected by the presence of an embedded actuator, for the case of glass/epoxy laminates.

  12. Zone plate method for electronic holographic display using resolution redistribution technique.

    PubMed

    Takaki, Yasuhiro; Nakamura, Junya

    2011-07-18

    The resolution redistribution (RR) technique can increase the horizontal viewing-zone angle and screen size of electronic holographic display. The present study developed a zone plate method that would reduce hologram calculation time for the RR technique. This method enables calculation of an image displayed on a spatial light modulator by performing additions of the zone plates, while the previous calculation method required performing the Fourier transform twice. The derivation and modeling of the zone plate are shown. In addition, the look-up table approach was introduced for further reduction in computation time. Experimental verification using a holographic display module based on the RR technique is presented.
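
    The zone-plate summation can be sketched as follows: each object point contributes a Fresnel zone-plate pattern, so hologram synthesis becomes a sum of (table-lookupable) zone plates rather than two Fourier transforms. Geometry and wavelength below are illustrative assumptions.

      import numpy as np

      N, lam, pitch = 512, 532e-9, 8e-6
      x = (np.arange(N) - N / 2) * pitch
      X, Y = np.meshgrid(x, x)

      points = [(0.0, 0.0, 0.10), (1e-3, -5e-4, 0.12)]   # object points (x, y, z) in meters
      field = np.zeros((N, N), dtype=complex)
      for xj, yj, zj in points:
          r2 = (X - xj)**2 + (Y - yj)**2
          field += np.exp(1j * np.pi * r2 / (lam * zj))  # add this point's zone plate

      hologram = np.angle(field)   # phase pattern for the spatial light modulator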

  13. Experimental verification of a GPC-LPV method with RLS and P1-TS fuzzy-based estimation for limiting the transient and residual vibration of a crane system

    NASA Astrophysics Data System (ADS)

    Smoczek, Jaroslaw

    2015-10-01

    The paper deals with the problem of reducing the residual vibration and limiting the transient oscillations of a flexible and underactuated system with respect to the variation of operating conditions. A comparative study of generalized predictive control (GPC) and a fuzzy scheduling scheme, developed on the basis of P1-TS fuzzy theory, a local pole placement method, and interval analysis of closed-loop system polynomial coefficients, is addressed to the problem of flexible crane control. Two alternatives of a GPC-based method are proposed that enable the technique to be realized either with or without a payload deflection sensor. The first control technique is based on the recursive least squares (RLS) method, applied to estimate on-line the parameters of a linear parameter varying (LPV) model of the crane dynamic system. The second GPC-based approach relies on payload deflection feedback estimated using a pendulum model whose parameters are interpolated using the P1-TS fuzzy system. The feasibility and applicability of the developed methods were confirmed through experimental verification performed on a laboratory-scaled overhead crane.
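
    A minimal recursive least squares (RLS) estimator of the kind used for the on-line LPV model update; the dimensions, forgetting factor, and regressor layout are illustrative assumptions, not the paper's tuning.

      import numpy as np

      class RLS:
          def __init__(self, n, lam=0.98, p0=1e3):
              self.theta = np.zeros(n)     # parameter estimates
              self.P = p0 * np.eye(n)      # inverse-correlation (covariance) matrix
              self.lam = lam               # forgetting factor tracks slow LPV variation

          def update(self, phi, y):
              Pphi = self.P @ phi
              k = Pphi / (self.lam + phi @ Pphi)         # gain vector
              self.theta += k * (y - phi @ self.theta)   # prediction-error correction
              self.P = (self.P - np.outer(k, Pphi)) / self.lam
              return self.theta

      est = RLS(n=3)   # e.g. regressor phi = [y[k-1], y[k-2], u[k-1]]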

  14. Neutron Scattering from Polymers: Five Decades of Developing Possibilities.

    PubMed

    Higgins, J S

    2016-06-07

    The first three decades of my research career closely map the development of neutron scattering techniques for the study of molecular behavior. At the same time, the theoretical understanding of organization and motion of polymer molecules, especially in the bulk state, was developing rapidly and providing many predictions crying out for experimental verification. Neutron scattering is an ideal technique for providing the necessary evidence. This autobiographical essay describes the applications by my research group and other collaborators of increasingly sophisticated neutron scattering techniques to observe and understand molecular behavior in polymeric materials. It has been a stimulating and rewarding journey.

  15. A fingerprint key binding algorithm based on vector quantization and error correction

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

    In recent years, the seamless combination of cryptosystems with biometric technologies, e.g., fingerprint recognition, has been studied by many researchers. In this paper, we propose an algorithm for binding a fingerprint template with a cryptographic key, so that the key is protected and can be accessed only through fingerprint verification. To accommodate the intrinsic fuzziness of varying fingerprints, vector quantization and error correction techniques are introduced: after fingerprint registration and extraction of the fingerprint's global ridge pattern, the template is transformed and then bound with the key. The key itself is secure because only its hash value is stored, and the key is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our ideas.
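    For orientation, a toy sketch of this bind/release pattern in the style of a fuzzy commitment scheme: an error-correcting code absorbs the fingerprint's fuzziness, and only a hash of the key is stored. A repetition code stands in for the paper's vector quantization and error correction, and all bit strings are synthetic.

        import hashlib
        import numpy as np

        R = 5  # repetition factor (toy stand-in for a real error-correcting code)

        def ecc_encode(bits):
            return np.repeat(bits, R)

        def ecc_decode(code):
            return (code.reshape(-1, R).sum(axis=1) > R // 2).astype(np.uint8)  # majority vote

        def bind(key_bits, template_bits):
            helper = ecc_encode(key_bits) ^ template_bits            # helper data; stored
            digest = hashlib.sha256(key_bits.tobytes()).hexdigest()  # only the hash is stored
            return helper, digest

        def release(helper, query_bits, digest):
            key_bits = ecc_decode(helper ^ query_bits)
            ok = hashlib.sha256(key_bits.tobytes()).hexdigest() == digest
            return key_bits if ok else None

        rng = np.random.default_rng(1)
        key = rng.integers(0, 2, 128, dtype=np.uint8)
        template = rng.integers(0, 2, 128 * R, dtype=np.uint8)  # synthetic quantized features
        helper, digest = bind(key, template)

        noisy = template.copy()
        noisy[np.arange(40) * R] ^= 1   # one bit error in each of 40 groups: correctable
        print(release(helper, noisy, digest) is not None)   # True: key released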

  16. Simulation, Model Verification and Controls Development of Brayton Cycle PM Alternator: Testing and Simulation of 2 KW PM Generator with Diode Bridge Output

    NASA Technical Reports Server (NTRS)

    Stankovic, Ana V.

    2003-01-01

    Professor Stankovic will be developing and refining Simulink-based models of the PM alternator and comparing the simulation results with experimental measurements taken from the unit. Her first task is to validate the models using the experimental data. Her next task is to develop alternative control techniques for the application of the Brayton Cycle PM alternator in a nuclear electric propulsion vehicle. The control techniques will first be simulated using the validated models and then tried experimentally with hardware available at NASA. Testing and simulation of a 2 kW PM synchronous generator with diode bridge output are described. The parameters of a synchronous PM generator have been measured and used in simulation. Test procedures have been developed to verify the PM generator model with diode bridge output. Experimental and simulation results are in excellent agreement.

  17. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  18. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    NASA Astrophysics Data System (ADS)

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

    High-range measuring equipment like laser trackers needs large calibrated reference artifacts for its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points, and of the reference lengths derived from them, is linked to the concept of the indexed metrology platform and to knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. The measuring instrument, together with the indexed metrology platform, remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker on the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker proved the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for the definition of reference distances in these procedures.
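    To make the idea concrete, a small sketch of how indexed rotations of a platform turn one physical reflector position into a mesh of virtual points whose pairwise distances serve as reference lengths; the reflector coordinates and the six 60-degree index positions are assumed values, not the platform's actual geometry.

        from itertools import combinations
        import numpy as np

        def rotz(theta):
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

        # reflector fixed to the rotating platform, in platform coordinates (assumed, metres)
        p = np.array([0.20, 0.10, 0.30])

        # each indexed position of the platform generates a new virtual point
        angles = np.deg2rad([0, 60, 120, 180, 240, 300])
        pts = [rotz(a) @ p for a in angles]

        # virtual reference distances between every pair of generated points
        for (i, a), (j, b) in combinations(enumerate(pts), 2):
            print(f"d({i},{j}) = {np.linalg.norm(a - b):.6f} m")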

  19. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was... Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was... two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  20. Experimental Technique and Assessment for Measuring the Convective Heat Transfer Coefficient from Natural Ice Accretions

    NASA Technical Reports Server (NTRS)

    Masiulaniec, K. Cyril; Vanfossen, G. James, Jr.; Dewitt, Kenneth J.; Dukhan, Nihad

    1995-01-01

    A technique was developed to cast frozen ice shapes that had been grown on a metal surface. This technique was applied to a series of ice shapes that were grown in the NASA Lewis Icing Research Tunnel on flat plates. Nine flat plates, 18 inches square, were obtained, from which aluminum castings were made that gave good ice shape characterizations. Test strips taken from these plates were outfitted with heat flux gages such that, when placed in a dry wind tunnel, they can be used to experimentally map out the convective heat transfer coefficient in the direction of flow from the roughened surfaces. The effects on the heat transfer coefficient of both parallel and accelerating flow will be studied. The smooth-plate model verification baseline data as well as one ice-roughened test case are presented.

  1. Design Methodology and Experimental Verification of Serpentine/Folded Waveguide TWTs

    DTIC Science & Technology

    2016-03-17

    FW), oscillation, serpentine, stopband, traveling-wave tube (TWT), vacuum electronics. I. INTRODUCTION: Development of high-power broadband vacuum electron devices (VEDs) beyond Ka-band using conventional coupled-cavity and helix traveling-wave tube (TWT) RF circuit fabrication techniques is... between the two positions is simply ks times the relative distance along the waveguide axis. However, from the beam–wave interaction standpoint, the

  2. An assessment of transient hydraulics phenomena and its characterization

    NASA Technical Reports Server (NTRS)

    Mortimer, R. W.

    1974-01-01

    A systematic search of the open literature was performed with the purpose of identifying the causes, effects, and characterization (modelling and solution techniques) of transient hydraulics phenomena. The governing partial differential equations that were found to be used most often in the literature are presented. Detailed survey sheets are shown which contain the type of hydraulics problem, the cause, the modelling, the solution technique utilized, and the experimental verification used for each paper. References and source documents are listed, and a discussion of the purpose and accomplishments of the study is presented.

  3. Optically Pumped Coherent Mechanical Oscillators: The Laser Rate Equation Theory and Experimental Verification

    DTIC Science & Technology

    2012-10-23

    Naeini A H, Hill J T, Krause A, Groblacher S, Aspelmeyer M and Painter O 2011 Nature 478 89 [14] Siegman A E 1986 Lasers (Sausalito, CA: University... laser rate equation theory and experimental verification... Optically pumped coherent mechanical oscillators: the laser rate equation theory and experimental verification. J B Khurgin, M W Pruessner, T H Stievater and W S

  4. Mathematical Modeling of Ni/H2 and Li-Ion Batteries

    NASA Technical Reports Server (NTRS)

    Weidner, John W.; White, Ralph E.; Dougal, Roger A.

    2001-01-01

    The modelling effort outlined in this viewgraph presentation encompasses the following topics: 1) Electrochemical Deposition of Nickel Hydroxide; 2) Deposition rates of thin films; 3) Impregnation of porous electrodes; 4) Experimental Characterization of Nickel Hydroxide; 5) Diffusion coefficients of protons; 6) Self-discharge rates (i.e., oxygen-evolution kinetics); 7) Hysteresis between charge and discharge; 8) Capacity loss on cycling; 9) Experimental Verification of the Ni/H2 Battery Model; 10) Mathematical Modeling of Li-Ion Batteries; 11) Experimental Verification of the Li-Ion Battery Model; 12) Integrated Power System Models for Satellites; and 13) Experimental Verification of the Integrated-Systems Model.

  5. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+ activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges on scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  6. Experimental quantum verification in the presence of temporally correlated noise

    NASA Astrophysics Data System (ADS)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion-qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution for rapidly and slowly varying noise, respectively. Similarly we find a strong gate-set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or depolarising noise processes, highlighting the critical interplay of the selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.

  7. Status on the Verification of Combustion Stability for the J-2X Engine Thrust Chamber Assembly

    NASA Technical Reports Server (NTRS)

    Casiano, Matthew; Hinerman, Tim; Kenny, R. Jeremy; Hulka, Jim; Barnett, Greg; Dodd, Fred; Martin, Tom

    2013-01-01

    Development is underway of the J-2X engine, a liquid oxygen/liquid hydrogen rocket engine for use on the Space Launch System. Engine E10001 began hot-fire testing in June 2011, and testing will continue with subsequent engines. The J-2X engine main combustion chamber contains both acoustic cavities and baffles. These stability aids are intended to dampen the acoustics in the main combustion chamber. Verification of the engine thrust chamber stability is determined primarily by examining experimental data using a dynamic stability rating technique; however, additional requirements were included to guard against any spontaneous instability or rough combustion. Startup and shutdown chug oscillations are also characterized for this engine. This paper details the stability requirements and verification including low- and high-frequency dynamics, a discussion on sensor selection and sensor port dynamics, and the process developed to assess combustion stability. A status on the stability results is also provided and discussed.

  8. Very high power THz radiation sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, G.L.; Martin, Michael C.; McKinney, Wayne R.

    2002-10-31

    We report the production of high-power (20 W average, ≈1 MW peak) broadband THz light based on coherent emission from relativistic electrons. Such sources are ideal for imaging, for high-power damage studies, and for studies of non-linear phenomena in this spectral range. We describe the source, presenting theoretical calculations and their experimental verification. For clarity we compare this source to one based on ultrafast laser techniques.

  9. National Centers for Environmental Prediction

    Science.gov Websites

    Website index: NCEP operational model forecast graphics; parallel/experimental model graphics; developmental air quality forecasts and verification. Includes a parallel/experimental graphics verification (grid vs. obs) web page (NCEP experimental page, internal use only), an interactive web page tool for

  10. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve the scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition the program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
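    To give a flavor of the final equivalence check, a minimal sketch of discharging an equivalence query with an off-the-shelf SMT solver (here Z3's Python bindings); the two toy program versions are invented for illustration and stand in for the extracted impact summaries.

        from z3 import Ints, Solver, sat

        def version1(x, y):   # original program fragment (toy example)
            return x * 2 + y

        def version2(x, y):   # refactored version (toy example)
            return x + x + y

        x, y = Ints("x y")
        s = Solver()
        s.add(version1(x, y) != version2(x, y))   # ask the solver for a distinguishing input
        if s.check() == sat:
            print("not equivalent, counterexample:", s.model())
        else:
            print("equivalent for all inputs")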

  11. Measurement of Plastic Stress and Strain for Analytical Method Verification (MSFC Center Director's Discretionary Fund Project No. 93-08)

    NASA Technical Reports Server (NTRS)

    Price, J. M.; Steeve, B. E.; Swanson, G. R.

    1999-01-01

    The analytical prediction of stress, strain, and fatigue life at locations experiencing local plasticity is full of uncertainties. Much of this uncertainty arises from the material models and their use in the numerical techniques used to solve plasticity problems. Experimental measurements of actual plastic strains would allow the validity of these models and solutions to be tested. This memorandum describes how experimental plastic residual strain measurements were used to verify the results of a thermally induced plastic fatigue failure analysis of a space shuttle main engine fuel pump component.

  12. An experimental/analytical program to assess the utility of lidar for pollution monitoring

    NASA Technical Reports Server (NTRS)

    Mills, F. S.; Allen, R. J.; Butler, C. F.; Kindle, E. C.

    1978-01-01

    The development and demonstration of lidar techniques for the remote measurement of atmospheric constituents and transport processes in the lower troposphere was carried out. Particular emphasis was given to techniques for monitoring SO2 and particulates, the principal pollutants in power plant and industrial plumes. Data from a plume dispersion study conducted in Maryland during September and October 1976 were reduced, and a data base was assembled which is available to the scientific community for plume model verification. A UV Differential Absorption Lidar (DIAL) was built, and preliminary testing was done.
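    For reference, the standard two-wavelength DIAL retrieval underlying such SO2 measurements (not spelled out in the abstract): with backscattered powers P measured at an absorbed (on) and a nearby non-absorbed (off) wavelength, the mean number density over a range cell ΔR follows from the ratio of the returns,

        N(R) = \frac{1}{2\,(\sigma_{\mathrm{on}} - \sigma_{\mathrm{off}})\,\Delta R}
               \ln\!\left[ \frac{P_{\mathrm{on}}(R)\, P_{\mathrm{off}}(R + \Delta R)}
                                {P_{\mathrm{on}}(R + \Delta R)\, P_{\mathrm{off}}(R)} \right],

    where σ_on and σ_off are the absorption cross sections of the target gas at the two wavelengths.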

  13. Bistatic radar sea state monitoring

    NASA Technical Reports Server (NTRS)

    Ruck, G. T.; Barrick, D. E.; Kaliszewski, T.

    1972-01-01

    Bistatic radar techniques were examined for remote measurement of the two-dimensional surface wave height spectrum of the ocean. One technique operates at high frequencies (HF), 3-30 MHz, and the other at ultrahigh frequencies (UHF), approximately 1 GHz. Only a preliminary theoretical examination of the UHF technique was performed; however, the principle underlying the HF technique was demonstrated experimentally, with results indicating that an HF bistatic system using a surface transmitter and an orbital receiver would be capable of measuring the two-dimensional wave height spectrum in the vicinity of the transmitter. An HF bistatic system could also be used with an airborne receiver for ground-truth ocean wave spectrum measurements. Preliminary system requirements and hardware configurations are discussed for both an orbital system and an aircraft verification experiment.

  14. Verification of Internal Dose Calculations.

    NASA Astrophysics Data System (ADS)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which kept the question of verification of the MIRD data still unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated the existence of an agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements, and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data gave also an indication that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, at least it could be said that the use of the MIRD method in internal dosimetry was shown to lead to no unnecessary exposure to radiation that could be caused by underestimating the absorbed dose. The experimental and the theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous phantoms, such as the MIRD phantom and its physical representation, Mr. ADAM. The results indicated that the Reciprocity Theorem is valid within an average range of uncertainty of 8%.

  15. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate those uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impacts on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is a part of nuclear reactor models. We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes' formula. We also perform similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ energy statistics tests [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs.
We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
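    As a minimal illustration of the sampling machinery being verified, a plain random-walk Metropolis sampler for a one-parameter Gaussian model (far simpler than DRAM or DREAM, and run on synthetic data so the posterior it should recover is known in advance):

        import numpy as np

        rng = np.random.default_rng(0)

        # synthetic data from a known parameter, so the sampler's answer can be checked
        theta_true, sigma = 2.0, 0.5
        data = theta_true + sigma * rng.normal(size=50)

        def log_post(theta):
            # flat prior; Gaussian likelihood
            return -0.5 * np.sum((data - theta) ** 2) / sigma**2

        # random-walk Metropolis
        chain, theta, step = [], 0.0, 0.2
        lp = log_post(theta)
        for _ in range(20000):
            prop = theta + step * rng.normal()
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
                theta, lp = prop, lp_prop
            chain.append(theta)

        samples = np.array(chain[5000:])               # discard burn-in
        print(f"posterior mean ~ {samples.mean():.3f}, sd ~ {samples.std():.3f}")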

  16. Artificial tektites: an experimental technique for capturing the shapes of spinning drops

    NASA Astrophysics Data System (ADS)

    Baldwin, Kyle A.; Butler, Samuel L.; Hill, Richard J. A.

    2015-01-01

    Determining the shapes of a rotating liquid droplet bound by surface tension is an archetypal problem in the study of the equilibrium shapes of a spinning and charged droplet, a problem that unites models of the stability of the atomic nucleus with the shapes of astronomical-scale, gravitationally-bound masses. The shapes of highly deformed droplets and their stability must be calculated numerically. Although the accuracy of such models has increased with the use of progressively more sophisticated computational techniques and increases in computing power, direct experimental verification is still lacking. Here we present an experimental technique for making wax models of these shapes using diamagnetic levitation. The wax models resemble splash-form tektites, glassy stones formed from molten rock ejected from asteroid impacts. Many tektites have elongated or `dumb-bell' shapes due to their rotation mid-flight before solidification, just as we observe here. Measurements of the dimensions of our wax `artificial tektites' show good agreement with equilibrium shapes calculated by our numerical model, and with previous models. These wax models provide the first direct experimental validation for numerical models of the equilibrium shapes of spinning droplets, of importance to fundamental physics and also to studies of tektite formation.

  17. Verification of passive cooling techniques in the Super-FRS beam collimators

    NASA Astrophysics Data System (ADS)

    Douma, C. A.; Gellanki, J.; Najafi, M. A.; Moeini, H.; Kalantar-Nayestanaki, N.; Rigollet, C.; Kuiken, O. J.; Lindemulder, M. F.; Smit, H. A. J.; Timersma, H. J.

    2016-08-01

    The Super FRagment Separator (Super-FRS) at the FAIR facility will be the largest in-flight separator of heavy ions in the world. One of the essential steps in the separation procedure is to stop the unwanted ions with beam collimators. In one of the most common situations, the heavy ions are produced by a fission reaction of a primary 238U-beam (1.5 GeV/u) hitting a 12C target (2.5 g/cm2). In this situation, some of the produced ions are highly charged states of 238U. These ions can reach the collimators with energies of up to 1.3 GeV/u and a power of up to 500 W. Under these conditions, a cooling system is required to prevent damage to the collimators and to the corresponding electronics. Due to the highly radioactive environment, both the collimators and the cooling system must be suitable for robot handling. Therefore, an active cooling system is undesirable because of the increased possibility of malfunctioning and other complications. By using thermal simulations (performed with NX9 of Siemens PLM), the possibility of passive cooling is explored. The validity of these simulations is tested by independent comparison with other simulation programs and by experimental verification. The experimental verification is still under analysis, but preliminary results indicate that the explored passive cooling option provides sufficient temperature reduction.

  18. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    NASA Technical Reports Server (NTRS)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  19. Study on verifying the angle measurement performance of the rotary-laser system

    NASA Astrophysics Data System (ADS)

    Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui

    2018-04-01

    A method to verify the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on verifying the angle measuring uncertainty of the rotary-laser system, it still has some limitations. High-precision reference angles are used in the study of the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also probes the error that has the biggest influence on the verification system. Some errors of the verification system are avoided via the experimental method, and some are compensated for through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirement for coordinate measurement. The verification platform can efficiently evaluate the uncertainty of angle measurement for the rotary-laser system.

  20. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on Al for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., " Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence., 12, 2000, pp3 3 1-3 4 0 . [30] Gaschnig

  1. An Extension of the Split Window Technique for the Retrieval of Precipitable Water: Experimental Verification

    DTIC Science & Technology

    1988-09-23

    AFGL-TR-88-0237... cloud contamination, aerosol problems, collocation... Collocations were performed on launch sites of the 1200 UT radiosondes on 25 Aug 1987. Statistics were... al (1987) and Thomason, 1987). In this imagery, opaque clouds appear white; low clouds and fog appear bright red against a brown

  2. Development of analysis technique to predict the material behavior of blowing agent

    NASA Astrophysics Data System (ADS)

    Hwang, Ji Hoon; Lee, Seonggi; Hwang, So Young; Kim, Naksoo

    2014-11-01

    In order to numerically simulate the foaming behavior of a mastic sealer containing a blowing agent, foaming and driving-force models that incorporate the foaming characteristics are needed. An elastic stress model is also required to represent the material behavior of the co-existing phase of the liquid state and the cured polymer. It is important to determine thermal properties such as thermal conductivity and specific heat, because the foaming behavior is heavily influenced by temperature change. In this study, three models are proposed to explain the foaming process and the material behavior during and after the process. To obtain the material parameters in each model, the following experiments and corresponding numerical simulations are performed: a thermal test, a simple shear test, and a foaming test. Error functions are defined as the differences between the experimental measurements and the numerical simulation results, and the parameters are then determined by minimizing the error functions. To ensure the validity of the obtained parameters, a confirmation simulation for each model is conducted by applying the determined parameters. Cross-verification is performed by measuring the foaming/shrinkage force; the cross-verification results tended to follow the experimental results. Interestingly, it was possible to estimate the micro-deformation occurring in an automobile roof surface by applying the proposed model to an oven process analysis. The application of the developed analysis technique will contribute to designs with minimized micro-deformation.
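    A minimal sketch of the identification step described above, minimizing an error function between measurement and model; the exponential foaming-height curve and the data are invented placeholders, not the paper's models or measurements.

        import numpy as np
        from scipy.optimize import minimize

        def model(t, k, h_max):
            # hypothetical foaming-height curve h(t) = h_max * (1 - exp(-k t))
            return h_max * (1.0 - np.exp(-k * t))

        # stand-in "experimental" data (synthetic; the paper's measurements are not reproduced)
        rng = np.random.default_rng(0)
        t_exp = np.linspace(0.0, 10.0, 11)
        h_exp = model(t_exp, 0.4, 3.0) + 0.05 * rng.normal(size=t_exp.size)

        def error_fn(p):
            # sum-of-squares difference between measurement and simulation
            return np.sum((h_exp - model(t_exp, *p)) ** 2)

        res = minimize(error_fn, x0=[1.0, 1.0], method="Nelder-Mead")
        print("identified parameters (k, h_max):", res.x)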

  3. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physicsmore » models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.« less

  4. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  5. Quantum money with nearly optimal error tolerance

    NASA Astrophysics Data System (ADS)

    Amiri, Ryan; Arrazola, Juan Miguel

    2017-06-01

    We present a family of quantum money schemes with classical verification which display a number of benefits over previous proposals. Our schemes are based on hidden matching quantum retrieval games and they tolerate noise up to 23%, which we conjecture reaches 25% asymptotically as the dimension of the underlying hidden matching states is increased. Furthermore, we prove that 25% is the maximum tolerable noise for a wide class of quantum money schemes with classical verification, meaning our schemes are almost optimally noise tolerant. We use methods in semidefinite programming to prove security in a substantially different manner to previous proposals, leading to two main advantages: first, coin verification involves only a constant number of states (with respect to coin size), thereby allowing for smaller coins; second, the reusability of coins within our scheme grows linearly with the size of the coin, which is known to be optimal. Last, we suggest methods by which the coins in our protocol could be implemented using weak coherent states and verified using existing experimental techniques, even in the presence of detector inefficiencies.

  6. Investigation of high-strength bolt-tightening verification techniques.

    DOT National Transportation Integrated Search

    2016-03-01

    The current means and methods of verifying that high-strength bolts have been properly tightened are very laborious and time-consuming. In some cases, the techniques require special equipment and, in other cases, the verification itself may be some...

  7. An Alternative Approach to "Identification of Unknowns": Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria.

    PubMed

    Martinez-Vaz, Betsy M; Denny, Roxanne; Young, Nevin D; Sadowsky, Michael J

    2015-12-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students' knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students' verification protocols and presentations. Posttest scores were higher than pretest scores at or below p = 0.001. Normalized learning gains (G) showed an improvement of students' knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001.

  8. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for the protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. With some model simplification strategies, several protocols can be modeled efficiently and the state space of the model reduced. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and useful for other authentication protocols.
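    To illustrate what a model checker does under the hood, a tiny explicit-state reachability search over a Peterson-style two-process protocol, asserting a safety property in every reachable state; this is a generic sketch in the spirit of SPIN's search, not SPIN or the PKM model itself.

        from collections import deque

        # Toy two-process mutual-exclusion protocol; a state is (pc0, pc1, turn).
        def successors(s):
            pc, turn = list(s[:2]), s[2]
            for i in (0, 1):
                p = list(pc)
                if pc[i] == "idle":
                    p[i] = "waiting"; yield (*p, i ^ 1)   # request entry, cede priority
                elif pc[i] == "waiting" and (pc[i ^ 1] == "idle" or turn == i):
                    p[i] = "critical"; yield (*p, turn)   # enter critical section
                elif pc[i] == "critical":
                    p[i] = "idle"; yield (*p, turn)       # leave critical section

        init = ("idle", "idle", 0)
        seen, frontier = {init}, deque([init])
        while frontier:                                    # breadth-first state-space search
            s = frontier.popleft()
            assert not (s[0] == "critical" and s[1] == "critical"), f"mutex violated in {s}"
            for t in successors(s):
                if t not in seen:
                    seen.add(t); frontier.append(t)
        print(f"explored {len(seen)} states; mutual exclusion holds")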

  9. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  10. Experimental verification of the Neuber relation at room and elevated temperatures. M.S. Thesis; [to predict stress-strain behavior in notched specimens of hastelloy x

    NASA Technical Reports Server (NTRS)

    Lucas, L. J.

    1982-01-01

    The accuracy of the Neuber equation at room temperature and 1,200 F was experimentally determined under cyclic load conditions with hold times. All strains were measured with an interferometric technique at both the local and remote regions of notched specimens. At room temperature, strains were obtained for the initial response at one load level and for cyclically stable conditions at four load levels. Stresses in notched members were simulated by subjecting smooth specimens to the same strains as were recorded on the notched specimen. Local stress-strain response was then predicted with excellent accuracy by subjecting a smooth specimen to limits established by the Neuber equation. Data at 1,200 F were obtained with the same experimental techniques but only in the cyclically stable conditions. The Neuber prediction at this temperature gave relatively accurate results in terms of predicting stress and strain points.
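    For context, Neuber's rule equates the product of local stress and strain at the notch to the elastically computed value, sigma * eps = (Kt * S)^2 / E. A small sketch solving it against a Ramberg-Osgood cyclic curve; the material constants below are illustrative placeholders, not the Hastelloy X values from the report.

        from scipy.optimize import brentq

        # Illustrative constants (psi): modulus E, cyclic strength K', hardening exponent n'
        E, Kp, n = 28.5e6, 180e3, 0.12
        Kt, S = 2.5, 40e3     # elastic stress concentration factor, nominal stress

        def eps(sigma):
            # Ramberg-Osgood strain: elastic part plus plastic part
            return sigma / E + (sigma / Kp) ** (1.0 / n)

        def neuber(sigma):
            # zero at the local stress satisfying Neuber's rule
            return sigma * eps(sigma) - (Kt * S) ** 2 / E

        sigma_local = brentq(neuber, 1.0, Kt * S)   # bracketed root find
        print(f"local stress ~ {sigma_local / 1e3:.1f} ksi, local strain ~ {eps(sigma_local):.4f}")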

  11. Experimental verification of a model of a two-link flexible, lightweight manipulator. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Huggins, James David

    1988-01-01

    Experimental verification is presented for an assumed-modes model of a large, two-link, flexible manipulator designed and constructed in the School of Mechanical Engineering at the Georgia Institute of Technology. The structure was designed to have typical characteristics of a lightweight manipulator.

  12. Thermal noise in space-charge-limited hole current in silicon

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Golder, J.; Nicolet, M.

    1972-01-01

    Present theories on noise in single-carrier space-charge-limited currents in solids have not been quantitatively substantiated by experimental evidence. To obtain such experimental verification, the noise in specially fabricated silicon structures is being measured and analyzed. The first results of this verification effort are reported.

  13. Accelerated testing of space mechanisms

    NASA Technical Reports Server (NTRS)

    Murray, S. Frank; Heshmat, Hooshang

    1995-01-01

    This report contains a review of various existing life prediction techniques used for a wide range of space mechanisms. Life prediction techniques utilized in other non-space fields such as turbine engine design are also reviewed for applicability to many space mechanism issues. The development of new concepts on how various tribological processes are involved in the life of the complex mechanisms used for space applications are examined. A 'roadmap' for the complete implementation of a tribological prediction approach for complex mechanical systems including standard procedures for test planning, analytical models for life prediction and experimental verification of the life prediction and accelerated testing techniques are discussed. A plan is presented to demonstrate a method for predicting the life and/or performance of a selected space mechanism mechanical component.

  14. Comparison of dual and single exposure techniques in dual-energy chest radiography.

    PubMed

    Ho, J T; Kruger, R A; Sorenson, J A

    1989-01-01

    Conventional chest radiography is the most effective tool for lung cancer detection and diagnosis; nevertheless, a high percentage of lung cancer tumors are missed because of the overlap of lung nodule image contrast with bone image contrast in a chest radiograph. Two different energy subtraction strategies, dual exposure and single exposure techniques, were studied for decomposing a radiograph into bone-free and soft tissue-free images to address this problem. For comparing the efficiency of these two techniques in lung nodule detection, the performances of the techniques were evaluated on the basis of residual tissue contrast, energy separation, and signal-to-noise ratio. The evaluation was based on both computer simulation and experimental verification. The dual exposure technique was found to be better than the single exposure technique because of its higher signal-to-noise ratio and greater residual tissue contrast. However, x-ray tube loading and patient motion are problems.
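    Schematically, both strategies end in the same weighted log-subtraction step; a small sketch of that decomposition follows, where the weights depend on the spectra and calibration and the values used here are assumptions.

        import numpy as np

        def dual_energy_decompose(low, high, w_soft, w_bone):
            # weighted log subtraction of low-/high-energy images:
            # w_soft cancels soft-tissue contrast (leaving bone), w_bone cancels bone
            L, H = np.log(low), np.log(high)
            bone_only = H - w_soft * L
            soft_only = H - w_bone * L
            return soft_only, bone_only

        # synthetic detected intensities (illustrative; not clinical data)
        rng = np.random.default_rng(0)
        low = rng.uniform(0.2, 1.0, (64, 64))
        high = low ** 0.6   # crude stand-in for energy-dependent attenuation
        soft, bone = dual_energy_decompose(low, high, w_soft=0.55, w_bone=0.75)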

  15. Investigation of high-strength bolt-tightening verification techniques : tech transfer summary.

    DOT National Transportation Integrated Search

    2016-03-01

    The primary objective of this project was to explore the current state-of-practice and the state-of-the-art techniques for high-strength bolt tightening and verification in structural steel connections. This project was completed so that insight coul...

  16. Quantified Event Automata: Towards Expressive and Efficient Runtime Monitors

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Falcone, Ylies; Havelund, Klaus; Reger, Giles; Rydeheard, David

    2012-01-01

    Runtime verification is the process of checking a property on a trace of events produced by the execution of a computational system. Runtime verification techniques have recently focused on parametric specifications where events take data values as parameters. These techniques exist on a spectrum inhabited by both efficient and expressive techniques. These characteristics are usually shown to be conflicting: in state-of-the-art solutions, efficiency is obtained at the cost of expressiveness and vice versa. To seek a solution to this conflict we explore a new point on the spectrum by defining an alternative runtime verification approach. We introduce a new formalism for concisely capturing expressive specifications with parameters. Our technique is more expressive than the currently most efficient techniques while at the same time allowing for optimizations.
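    As a toy illustration of a parametric monitor (far simpler than quantified event automata), a checker for the property "every open(f) is matched by a later close(f), with no double open", where f is the data parameter carried by each event:

        def monitor(trace):
            open_files = set()
            for event, f in trace:               # each event carries a data parameter f
                if event == "open":
                    if f in open_files:
                        return f"violation: {f} opened twice"
                    open_files.add(f)
                elif event == "close":
                    if f not in open_files:
                        return f"violation: {f} closed while not open"
                    open_files.discard(f)
            return f"violation: {open_files} never closed" if open_files else "trace accepted"

        print(monitor([("open", "a"), ("open", "b"), ("close", "a"), ("close", "b")]))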

  17. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    ERIC Educational Resources Information Center

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on a sheet of paper, which closes from below a cylinder completely or partially filled with water, and the hydrostatic pressure of the water column acting against the atmospheric pressure. First of all this paper solves a theoretical…

  18. Experimental Verification of Boyle's Law and the Ideal Gas Law

    ERIC Educational Resources Information Center

    Ivanov, Dragia Trifonov

    2007-01-01

    Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…

  19. In-line phase contrast micro-CT reconstruction for biomedical specimens.

    PubMed

    Fu, Jian; Tan, Renbo

    2014-01-01

    X-ray phase contrast micro computed tomography (micro-CT) can non-destructively provide the internal structure information of soft tissues and low atomic number materials. It has become an invaluable analysis tool for biomedical specimens. Here an in-line phase contrast micro-CT reconstruction technique is reported, which consists of a projection extraction method and the conventional filtered back-projection (FBP) reconstruction algorithm. The projection extraction is implemented by applying the Fourier transform to the forward projections of in-line phase contrast micro-CT. This work comprises a numerical study of the method and its experimental verification using a biomedical specimen dataset measured at an X-ray tube source micro-CT setup. The numerical and experimental results demonstrate that the presented technique can improve the imaging contrast of biomedical specimens. It will be of interest for a wide range of in-line phase contrast micro-CT applications in medicine and biology.
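    A minimal sketch of the conventional FBP step that the extracted projections feed into, using scikit-image's Radon transform utilities on a standard phantom in place of real specimen data:

        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        image = shepp_logan_phantom()                     # stand-in for the specimen slice
        theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
        sinogram = radon(image, theta=theta)              # simulated forward projections
        recon = iradon(sinogram, theta=theta, filter_name="ramp")   # filtered back-projection
        print("RMS reconstruction error:", np.sqrt(np.mean((recon - image) ** 2)))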

  20. Experimental verification of a radiofrequency power model for Wi-Fi technology.

    PubMed

    Fang, Minyu; Malone, David

    2010-04-01

    When assessing the power emitted from a Wi-Fi network, it has been observed that these networks operate at a relatively low duty cycle. In this paper, we extend a recently introduced model of emitted power in Wi-Fi networks to cover conditions where devices do not always have packets to transmit. We present experimental results to validate the original model and its extension by developing approximate, but practical, testbed measurement techniques. The accuracy of the models is confirmed, with small relative errors: less than 5-10%. Moreover, we confirm that the greatest power is emitted when the network is saturated with traffic. Using this, we give a simple technique to quickly estimate power output based on traffic levels and give examples showing how this might be used in practice to predict current or future power output from a Wi-Fi network.
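    A back-of-the-envelope version of this duty-cycle reasoning (a simplification of the paper's model; the per-frame overhead figure and traffic numbers are assumptions):

        def mean_emitted_power(tx_power_mw, frames_per_s, bytes_per_frame,
                               phy_rate_mbps, overhead_us=50.0):
            # airtime per frame = payload time + assumed fixed overhead (preamble, headers, ACK)
            airtime_s = bytes_per_frame * 8 / (phy_rate_mbps * 1e6) + overhead_us * 1e-6
            duty_cycle = min(frames_per_s * airtime_s, 1.0)   # saturates when channel is busy
            return tx_power_mw * duty_cycle

        # e.g. a 100 mW radio sending 500 frames/s of 1500-byte packets at 54 Mb/s
        print(f"{mean_emitted_power(100, 500, 1500, 54):.1f} mW mean emitted power")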

  1. Theory, simulation and experiments for precise deflection control of radiotherapy electron beams.

    PubMed

    Figueroa, R; Leiva, J; Moncada, R; Rojas, L; Santibáñez, M; Valente, M; Velásquez, J; Young, H; Zelada, G; Yáñez, R; Guillen, Y

    2018-03-08

    Conventional radiotherapy is mainly applied by linear accelerators. Although linear accelerators provide dual (electron/photon) radiation beam modalities, both of them are intrinsically produced by a megavoltage electron current. Modern radiotherapy treatment techniques are based on suitable devices inserted or attached to conventional linear accelerators. Thus, precise control of the delivered beam becomes a key issue. This work presents an integral description of electron beam deflection control as required for a novel radiotherapy technique based on convergent photon beam production. Theoretical and Monte Carlo approaches were initially used for designing and optimizing the device's components. Then, dedicated instrumentation was developed for experimental verification of electron beam deflection due to the designed magnets. Both Monte Carlo simulations and experimental results support the reliability of the electrodynamics models used to predict megavoltage electron beam control. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Poster - 16: Time-resolved diode dosimetry for in vivo proton therapy range verification: calibration through numerical modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toltz, Allison; Hoesl, Michaela; Schuemann, Jan

    Purpose: A method to refine the implementation of an in vivo, adaptive proton therapy range verification methodology was investigated. Simulation experiments and in-phantom measurements were compared to validate the calibration procedure of a time-resolved diode dosimetry technique. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification by correlating properties of the detector signal to the water equivalent path length (WEPL). The implementation of this system requires a set of calibration measurements to establish a beam-specific diode response to WEPL fit for the selected 'scout' beam in a solid water phantom. This process is both tedious, as it necessitates a separate set of measurements for every 'scout' beam that may be appropriate to the clinical case, and inconvenient due to limited access to the clinical beamline. The diode response to WEPL relationship for a given 'scout' beam may instead be determined within a simulation environment, facilitating the applicability of this dosimetry technique. Measurements for three 'scout' beams were compared against detector response simulated with Monte Carlo methods using the Tool for Particle Simulation (TOPAS). Results: Detector response in water equivalent plastic was successfully validated against simulation for spread out Bragg peaks of range 10 cm, 15 cm, and 21 cm (168 MeV, 177 MeV, and 210 MeV) with an adjusted R² of 0.998. Conclusion: Feasibility has been shown for performing calibration of detector response for a given 'scout' beam through simulation for the time-resolved diode dosimetry technique.
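
    As a hypothetical illustration of the calibration step (the quadratic response model and the data points below are invented, not the paper's), one can fit a diode-response-versus-WEPL curve to simulated calibration points and then invert it for a measured signal:

        # Fit a smooth response-vs-WEPL curve, then invert it to estimate
        # the WEPL corresponding to a measured diode signal.
        import numpy as np

        wepl_cm = np.array([2.0, 4.0, 6.0, 8.0, 10.0])        # calibration depths
        response = np.array([0.95, 0.88, 0.78, 0.63, 0.41])   # simulated diode signal

        fit = np.poly1d(np.polyfit(wepl_cm, response, deg=2))  # response = f(WEPL)

        # Invert by locating the nearest point on the fitted curve.
        grid = np.linspace(wepl_cm.min(), wepl_cm.max(), 1000)
        measured = 0.70
        estimated_wepl = grid[np.argmin(np.abs(fit(grid) - measured))]
        print(f"estimated WEPL: {estimated_wepl:.2f} cm")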

  3. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
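
    As a rough illustration of self-composition (the paper's setting is deductive proof, not testing), the sketch below checks a 2-safety property, non-interference, by pairing runs of a program on inputs that agree on public data; the toy program and all names are invented.

        # Self-composition idea: a program is non-interfering if two runs
        # that agree on public inputs always agree on outputs. Here the
        # "proof" is a bounded exhaustive check over small input sets.
        def prog(public, secret):
            # insecure toy example: the output depends on the secret
            return public + (1 if secret > 0 else 0)

        def non_interference(prog, publics, secrets):
            for p in publics:
                outs = {prog(p, s) for s in secrets}
                if len(outs) > 1:        # same public input, different outputs
                    return False         # the secret influences observable output
            return True

        print(non_interference(prog, publics=range(3), secrets=[-1, 1]))  # False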

  4. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  5. Holographic aids for internal combustion engine flow studies

    NASA Technical Reports Server (NTRS)

    Regan, C.

    1984-01-01

    Worldwide interest in improving the fuel efficiency of internal combustion (I.C.) engines has sparked research efforts designed to learn more about the flow processes of these engines. The flow fields must be understood prior to fuel injection in order to design efficient valves, piston geometries, and fuel injectors. Knowledge of the flow field is also necessary to determine the heat transfer to combustion chamber surfaces. Computational codes can predict velocity and turbulence patterns, but experimental verification is mandatory to justify their basic assumptions. Due to their nonintrusive nature, optical methods are ideally suited to provide the necessary velocity verification data. Optical systems such as Schlieren photography, laser velocimetry, and illuminated particle visualization are used in I.C. engines, and now their versatility is improved by employing holography. These holographically enhanced optical techniques are described with emphasis on their applications in I.C. engines.

  6. Verification of spatial and temporal pressure distributions in segmented solid rocket motors

    NASA Technical Reports Server (NTRS)

    Salita, Mark

    1989-01-01

    A wide variety of analytical tools are in use today to predict the history and spatial distributions of pressure in the combustion chambers of solid rocket motors (SRMs). Experimental and analytical methods are presented here that allow the verification of many of these predictions. These methods are applied to the redesigned space shuttle booster (RSRM). Girth strain-gage data is compared to the predictions of various one-dimensional quasisteady analyses in order to verify the axial drop in motor static pressure during ignition transients as well as quasisteady motor operation. The results of previous modeling of radial flows in the bore, slots, and around grain overhangs are supported by approximate analytical and empirical techniques presented here. The predictions of circumferential flows induced by inhibitor asymmetries, nozzle vectoring, and propellant slump are compared to each other and to subscale cold air and water tunnel measurements to ascertain their validity.

  7. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
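
    The iterative invariant discovery itself is beyond a short sketch, but the core check such frameworks repeat, namely that a candidate loop invariant is inductive and implies the postcondition, can be illustrated with an SMT solver. The sketch below uses the z3 Python bindings (the z3-solver package, an assumption; the paper's framework is built on Java PathFinder) on a toy loop that is not from the paper.

        # Discharge the standard verification conditions for a candidate
        # loop invariant with z3 (pip install z3-solver).
        from z3 import Ints, Solver, And, Implies, Not, unsat

        i, n, s = Ints("i n s")

        # Loop under verification:  s = 0; i = 0; while (i < n) { s += 2; i += 1; }
        def inv(i, s, n):                               # candidate loop invariant
            return And(s == 2 * i, 0 <= i, i <= n)

        vcs = [
            Implies(n >= 0, inv(0, 0, n)),                            # initiation
            Implies(And(inv(i, s, n), i < n), inv(i + 1, s + 2, n)),  # consecution
            Implies(And(inv(i, s, n), Not(i < n)), s == 2 * n),       # exit implies post
        ]

        for k, vc in enumerate(vcs):
            solver = Solver()
            solver.add(Not(vc))          # vc is valid iff its negation is unsat
            assert solver.check() == unsat, f"VC {k} is not valid"
        print("invariant is inductive and establishes s == 2*n")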

  8. Kinetic Model of Growth of Arthropoda Populations

    NASA Astrophysics Data System (ADS)

    Ershov, Yu. A.; Kuznetsov, M. A.

    2018-05-01

    Kinetic equations were derived for calculating the growth of crustacean populations (Crustacea) based on the biological growth model suggested earlier using shrimp (Caridea) populations as an example. The development cycle of successive stages for populations can be represented in the form of quasi-chemical equations. The kinetic equations that describe the development cycle of crustaceans allow quantitative prediction of the development of populations depending on conditions. In contrast to extrapolation-simulation models, in the developed kinetic model of biological growth the kinetic parameters are the experimental characteristics of population growth. Verification and parametric identification of the developed model on the basis of the experimental data showed agreement with experiment within the error of the measurement technique.
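
    As a schematic example of such stage-structured, quasi-chemical kinetics (the stage names and rate constants below are invented, not the paper's), successive life stages can be integrated as a linear reaction chain:

        # Stage chain E -> L -> A with first-order transition and mortality
        # rates, integrated as an ODE system.
        import numpy as np
        from scipy.integrate import solve_ivp

        k1, k2, mu = 0.3, 0.2, 0.05   # transition and mortality rates, 1/day

        def rhs(t, y):
            egg, larva, adult = y
            return [-k1 * egg,
                    k1 * egg - k2 * larva - mu * larva,
                    k2 * larva - mu * adult]

        sol = solve_ivp(rhs, (0, 60), [1000, 0, 0], t_eval=np.linspace(0, 60, 7))
        print(sol.y[2].round(1))   # adult population over time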

  9. Experimental verification of vapor deposition rate theory in high velocity burner rigs

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.; Santoro, Gilbert J.

    1985-01-01

    The main objective has been the experimental verification of the corrosive vapor deposition theory in high-temperature, high-velocity environments. Towards this end a Mach 0.3 burner-rig apparatus was built to measure deposition rates from salt-seeded (mostly Na salts) combustion gases on the internally cooled cylindrical collector. Deposition experiments are underway.

  10. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence.

    PubMed

    Vavrek, Jayson R; Henderson, Brian S; Danagoulian, Areg

    2018-04-24

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618-8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy "genuine" and "hoax" objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
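
    As a toy illustration of the statistical template comparison described above (not the paper's actual analysis), two binned spectra can be compared with a chi-square statistic; the counts below are synthetic.

        # Compare candidate vs. template binned counts with a chi-square
        # statistic appropriate for two Poisson histograms.
        import numpy as np
        from scipy.stats import chi2

        template = np.array([120, 340, 560, 410, 150])   # counts per energy bin
        candidate = np.array([130, 325, 548, 405, 162])

        stat = np.sum((candidate - template) ** 2 / (template + candidate))
        p_value = chi2.sf(stat, df=len(template))
        print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")   # high p -> consistent with template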

  11. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  12. Detecting special nuclear material using muon-induced neutron emission

    NASA Astrophysics Data System (ADS)

    Guardincerri, Elena; Bacon, Jeffrey; Borozdin, Konstantin; Matthew Durham, J.; Fabritius, Joseph, II; Hecht, Adam; Milner, Edward C.; Miyadera, Haruo; Morris, Christopher L.; Perry, John; Poulson, Daniel

    2015-07-01

    The penetrating ability of cosmic ray muons makes them an attractive probe for imaging dense materials. Here, we describe experimental results from a new technique that uses neutrons generated by cosmic-ray muons to identify the presence of special nuclear material (SNM). Neutrons emitted from SNM are used to tag muon-induced fission events in actinides and laminography is used to form images of the stopping material. This technique allows the imaging of SNM-bearing objects tagged using muon tracking detectors located above or to the side of the objects, and may have potential applications in warhead verification scenarios. During the experiment described here we did not attempt to distinguish the type or grade of the SNM.

  13. Validation of scramjet exhaust simulation technique at Mach 6

    NASA Technical Reports Server (NTRS)

    Hopkins, H. B.; Konopka, W.; Leng, J.

    1979-01-01

    Current design philosophy for hydrogen-fueled, scramjet-powered hypersonic aircraft results in configurations with strong couplings between the engine plume and vehicle aerodynamics. The experimental verification of the scramjet exhaust simulation is described. The scramjet exhaust was reproduced for the Mach 6 flight condition by the detonation tube simulator. The exhaust flow pressure profiles, and to a large extent the heat transfer rate profiles, were then duplicated by cool gas mixtures of Argon and Freon 13B1 or Freon 12. The results of these experiments indicate that a cool gas simulation of the hot scramjet exhaust is a viable simulation technique except for phenomena which are dependent on the wall temperature relative to flow temperature.

  14. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser employed with two-external-cavity feedback structure for laser self-mixing interference (SMI) phenomenon is investigated and analyzed. The SMI model with two directions based on F-P cavity is deduced, and numerical simulation and experimental verification were conducted. Experimental results show that the SMI with the structure of two-external-cavity feedback under weak light feedback is similar to the sum of two SMIs.

  15. Assessment of the adequacy of the monitoring method in the activity of a verification laboratory

    NASA Astrophysics Data System (ADS)

    Ivanov, R. N.; Grinevich, V. A.; Popov, A. A.; Shalay, V. V.; Malaja, L. D.

    2018-04-01

    The paper considers the assessment of the adequacy of a risk monitoring technique for verification laboratory operations with respect to conformity with the accreditation criteria, aimed at decision-making on the advisability of a verification laboratory's activities in the declared area of accreditation.

  16. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latty, Drew, E-mail: drew.latty@health.nsw.gov.au; Stuart, Kirsty E; Westmead Breast Cancer Institute, Sydney, New South Wales

    Radiation treatment to the left breast is associated with increased cardiac morbidity and mortality. The deep inspiration breath-hold (DIBH) technique can decrease the radiation dose delivered to the heart, and this may facilitate the treatment of the internal mammary chain nodes. The aim of this review is to critically analyse the literature available in relation to breath-hold methods, implementation, utilisation, patient compliance, planning methods and treatment verification of the DIBH technique. Despite variation in the literature regarding the DIBH delivery method, patient coaching, visual feedback mechanisms and treatment verification, all methods of DIBH delivery reduce radiation dose to the heart. Further research is required to determine optimum protocols for patient training and treatment verification to ensure the technique is delivered successfully.

  18. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.

  19. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors sensing the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove is presented as an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested and its performance is shown to be able to recognize forged signatures with a false acceptance rate of less than 1.2%.
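
    A minimal sketch of the matching idea under stated assumptions (random stand-in data, an invented 22-channel glove, an illustrative threshold): reduce each recording to its r-dimensional principal subspace and compare signatures by the principal angles between subspaces.

        # SVD principal subspace per recording; match via principal angles.
        import numpy as np
        from scipy.linalg import subspace_angles

        rng = np.random.default_rng(0)

        def principal_subspace(A, r=3):
            U, _, _ = np.linalg.svd(A, full_matrices=False)
            return U[:, :r]                 # r directions of maximal energy

        enrolled = principal_subspace(rng.standard_normal((22, 200)))  # 22 glove channels
        attempt = principal_subspace(rng.standard_normal((22, 200)))

        angle = np.max(subspace_angles(enrolled, attempt))  # radians
        print("accept" if angle < 0.5 else "reject")        # illustrative threshold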

  20. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm(3) ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 +/- 1.2% and 0.5 +/- 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 +/- 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
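
    The acceptance policy stated at the end of the abstract is easy to capture in code; the function below follows the quoted limits (3% or 6 cGy, relaxed to 5% or 10 cGy for off-axis and low-dose points), though the interface itself is an invented illustration.

        # Flag a verification point when the independent calculation deviates
        # from the TPS beyond the stated confidence limits.
        def imrt_check(tps_cgy, muv_cgy, prescribed_cgy, off_axis_or_low_dose=False):
            pct_limit, abs_limit = (5.0, 10.0) if off_axis_or_low_dose else (3.0, 6.0)
            dev = abs(tps_cgy - muv_cgy)
            return dev <= pct_limit / 100.0 * prescribed_cgy or dev <= abs_limit

        print(imrt_check(tps_cgy=198.0, muv_cgy=203.5, prescribed_cgy=200.0))  # True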

  1. The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.

    ERIC Educational Resources Information Center

    National Evaluation Systems, Inc., Amherst, MA.

    National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…

  2. Alternative Nonvolatile Residue Analysis with Contaminant Identification Project

    NASA Technical Reports Server (NTRS)

    Loftin, Kathleen (Compiler); Summerfield, Burton (Compiler); Thompson, Karen (Compiler); Mullenix, Pamela (Compiler); Zeitlin, Nancy (Compiler)

    2015-01-01

    Cleanliness verification is required in numerous industries, including spaceflight ground support, electronics, medical and aerospace. Current cleanliness verification requirements at KSC are met using solvents that are environmentally unfriendly. The goal of this project is to produce an alternative cleanliness verification technique that is both environmentally friendly and more cost effective.

  3. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  4. Evaluation of Mesoscale Model Phenomenological Verification Techniques

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred

    2006-01-01

    Forecasters at the Spaceflight Meteorology Group, 45th Weather Squadron, and National Weather Service in Melbourne, FL use mesoscale numerical weather prediction model output in creating their operational forecasts. These models aid in forecasting weather phenomena that could compromise the safety of launch, landing, and daily ground operations and must produce reasonable weather forecasts in order for their output to be useful in operations. Considering the importance of model forecasts to operations, their accuracy in forecasting critical weather phenomena must be verified to determine their usefulness. The currently-used traditional verification techniques involve an objective point-by-point comparison of model output and observations valid at the same time and location. The resulting statistics can unfairly penalize high-resolution models that make realistic forecasts of certain phenomena but are offset from the observations in small time and/or space increments. Manual subjective verification can provide a more valid representation of model performance, but is time-consuming and prone to personal biases. An objective technique that verifies specific meteorological phenomena, much in the way a human would in a subjective evaluation, would likely produce a more realistic assessment of model performance. Such techniques are being developed in the research community. The Applied Meteorology Unit (AMU) was tasked to conduct a literature search to identify phenomenological verification techniques being developed, determine if any are ready to use operationally, and outline the steps needed to implement any operationally-ready techniques into the Advanced Weather Information Processing System (AWIPS). The AMU conducted a search of all literature on the topic of phenomenological-based mesoscale model verification techniques and found 10 different techniques in various stages of development. Six of the techniques were developed to verify precipitation forecasts, one to verify sea breeze forecasts, and three were capable of verifying several phenomena. The AMU also determined the feasibility of transitioning each technique into operations and rated the operational capability of each technique on a subjective 1-10 scale: (1) 1 indicates that the technique is only in the initial stages of development, (2) 2-5 indicates that the technique is still undergoing modifications and is not ready for operations, (3) 6-8 indicates a higher probability of integrating the technique into AWIPS with code modifications, and (4) 9-10 indicates that the technique was created for AWIPS and is ready for implementation. Eight of the techniques were assigned a rating of 5 or below. The other two received ratings of 6 and 7, and none of the techniques received a rating of 9-10. At the current time, there are no phenomenological model verification techniques ready for operational use. However, several of the techniques described in this report may become viable techniques in the future and should be monitored for updates in the literature. The desire to use a phenomenological verification technique is widespread in the modeling community, and it is likely that other techniques besides those described herein are being developed, but the work has not yet been published. Therefore, the AMU recommends that the literature continue to be monitored for updates to the techniques described in this report and for new techniques being developed whose results have not yet been published.

  5. The Multiple Doppler Radar Workshop, November 1979.

    NASA Astrophysics Data System (ADS)

    Carbone, R. E.; Harris, F. I.; Hildebrand, P. H.; Kropfli, R. A.; Miller, L. J.; Moninger, W.; Strauch, R. G.; Doviak, R. J.; Johnson, K. W.; Nelson, S. P.; Ray, P. S.; Gilet, M.

    1980-10-01

    The findings of the Multiple Doppler Radar Workshop are summarized by a series of six papers. Part I of this series briefly reviews the history of multiple Doppler experimentation, fundamental concepts of Doppler signal theory, and organization and objectives of the Workshop. Invited presentations by dynamicists and cloud physicists are also summarized.Experimental design and procedures (Part II) are shown to be of critical importance. Well-defined and limited experimental objectives are necessary in view of technological limitations. Specified radar scanning procedures that balance temporal and spatial resolution considerations are discussed in detail. Improved siting for suppression of ground clutter as well as scanning procedures to minimize errors at echo boundaries are discussed. The need for accelerated research using numerically simulated proxy data sets is emphasized.New technology to eliminate various sampling limitations is cited as an eventual solution to many current problems in Part III. Ground clutter contamination may be curtailed by means of full spectral processing, digital filters in real time, and/or variable pulse repetition frequency. Range and velocity ambiguities also may be minimized by various pulsing options as well as random phase transmission. Sidelobe contamination can be reduced through improvements in radomes, illumination patterns, and antenna feed types. Radar volume-scan time can be sharply reduced by means of wideband transmission, phased array antennas, multiple beam antennas, and frequency agility.Part IV deals with synthesis of data from several radars in the context of scientific requirements in cumulus clouds, widespread precipitation, and severe convective storms. The important temporal and spatial scales are examined together with the accuracy required for vertical air motion in each phenomenon. Factors that introduce errors in the vertical velocity field are identified and synthesis techniques are discussed separately for the dual Doppler and multiple Doppler cases. Various filters and techniques, including statistical and variational approaches, are mentioned. Emphasis is placed on the importance of experiment design and procedures, technological improvements, incorporation of all information from supporting sensors, and analysis priority for physically simple cases. Integrated reliability is proposed as an objective tool for radar siting.Verification of multiple Doppler-derived vertical velocity is discussed in Part V. Three categories of verification are defined as direct, deductive, and theoretical/numerical. Direct verification consists of zenith-pointing radar measurements (from either airborne or ground-based systems), air motion sensing aircraft, instrumented towers, and tracking of radar chaff. Deductive sources include mesonetworks, aircraft (thermodynamic and microphysical) measurements, satellite observations, radar reflectivity, multiple Doppler consistency, and atmospheric soundings. Theoretical/numerical sources of verification include proxy data simulation, momentum checking, and numerical cloud models. New technology, principally in the form of wide bandwidth radars, is seen as a development that may reduce the need for extensive verification of multiple Doppler-derived vertical air motions. Airborne Doppler radar is perceived as the single most important source of verification within the bounds of existing technology.Nine stages of data processing and display are identified in Part VI. 
The stages are identified as field checks, archival, selection, editing, coordinate transformation, synthesis of Cartesian fields, filtering, display, and physical analysis. Display of data is considered to be a problem critical to assimilation of data at all stages. Interactive computing systems and software are concluded to be very important, particularly for the editing stage. Three- and 4-dimensional displays are considered essential for data assimilation, particularly at the physical analysis stage. The concept of common data tape formats is approved both for data in radar spherical space as well as for synthesized Cartesian output.

  6. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) applications has inadvertently forced the move towards higher complexity of the integrated circuits supporting a SoC. Such a spontaneous increase in complexity calls for correspondingly sophisticated validation strategies, and has led researchers to develop various methodologies to overcome this problem, in essence bringing about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs in the infancy of the verification process of a SoC in order to reduce time consumption and achieve a fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level of a SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, averting the bottleneck in the validation platform.

  7. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 briefly introduces background information on the PSPFP technique, including its measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of the absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.

  8. Geometry-constraint-scan imaging for in-line phase contrast micro-CT.

    PubMed

    Fu, Jian; Yu, Guangyuan; Fan, Dekai

    2014-01-01

    X-ray phase contrast computed tomography (CT) uses the phase shift that x-rays undergo when passing through matter, rather than their attenuation, as the imaging signal and may provide better image quality in soft-tissue and biomedical materials with low atomic number. Here a geometry-constraint-scan imaging technique for in-line phase contrast micro-CT is reported. It consists of two circular-trajectory scans with the x-ray detector at different positions, a phase projection extraction method based on the Fresnel free-propagation theory, and the filtered back-projection reconstruction algorithm. This method removes the contact-detector scan and the pure-phase-object assumption of classical in-line phase contrast micro-CT. Consequently it relaxes the experimental conditions and improves the image contrast. This work comprises a numerical study of this technique and its experimental verification using a biomedical composite dataset measured at an x-ray tube source micro-CT setup. The numerical and experimental results demonstrate the validity of the presented method. It will be of interest for a wide range of in-line phase contrast micro-CT applications in biology and medicine.
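
    As background for the Fresnel free-propagation model invoked above, the sketch below propagates a weak phase object through free space with a Fourier-domain Fresnel kernel; the grid, wavelength, distance, and sign convention are illustrative assumptions, not the paper's implementation.

        # Fresnel free-space propagation of a complex wavefield by the
        # angular-spectrum (Fourier transfer function) method.
        import numpy as np

        def fresnel_propagate(field, wavelength, z, pixel):
            ny, nx = field.shape
            fx = np.fft.fftfreq(nx, d=pixel)
            fy = np.fft.fftfreq(ny, d=pixel)
            FX, FY = np.meshgrid(fx, fy)
            H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))  # Fresnel kernel
            return np.fft.ifft2(np.fft.fft2(field) * H)

        field = np.ones((256, 256), dtype=complex)
        field[100:156, 100:156] *= np.exp(1j * 0.1)   # weak phase object
        out = fresnel_propagate(field, wavelength=5e-11, z=0.5, pixel=1e-6)
        print(np.abs(out).std())                       # edge-enhanced intensity contrast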

  9. Posttest calculation of the PBF LOC-11B and LOC-11C experiments using RELAP4/MOD6. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, C.E.

    Comparisons between RELAP4/MOD6, Update 4 code-calculated and measured experimental data are presented for the PBF LOC-11C and LOC-11B experiments. Independent code verification techniques are now being developed and this study represents a preliminary effort applying structured criteria for developing computer models, selecting code input, and performing base-run analyses. Where deficiencies are indicated in the base-case representation of the experiment, methods of code and criteria improvement are developed and appropriate recommendations are made.

  10. Measurements of VLF polarization and wave normal direction on OGO-F

    NASA Technical Reports Server (NTRS)

    Helliwell, R. A.

    1973-01-01

    A major achievement of the F-24 experiment on OGO 6 was a verification of the theory of the polarization of proton whistlers. As predicted, the electron whistler was found to be right-hand polarized and the proton whistler left hand polarized. The transition from right- to left-hand polarization was found to occur very rapidly. Thus it appears that the experimental technique may allow great accuracy in the measurement of the cross-over frequency, a frequency that provides information on the ionic composition of the ionosphere.

  11. A study of the dynamics of rotating space stations with elastically connected counterweight and attached flexible appendages. Volume 1: Theory

    NASA Technical Reports Server (NTRS)

    Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.

    1973-01-01

    The formulation of a mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was conducted. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.

  12. Single-Event Upset (SEU) model verification and threshold determination using heavy ions in a bipolar static RAM

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.

    1985-01-01

    Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.

  13. Hyperproperties

    DTIC Science & Technology

    2016-01-14

    ...a safety hyperproperty and a liveness hyperproperty. A verification technique for safety hyperproperties is given and is shown to generalize prior techniques for... liveness properties are affiliated with specific verification methods. An analogous theory for security policies would be appealing. The fact that security... verified by using invariance arguments. Our verification methodology generalizes prior work on using invariance arguments to verify information-flow...

  14. Dosimetric Verification of IMRT Treatment Plans Using an Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruszyna, Marta

    This paper presents the procedures and results of dosimetric verification using an Electronic Portal Imaging Device as a tool for pre-treatment dosimetry in the IMRT technique at the Greater Poland Cancer Centre in Poznan, Poland. The evaluation of dosimetric verification for various organs over a 2-year period is given.

  15. On verifying a high-level design. [cost and error analysis

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  16. An elementary tutorial on formal specification and verification using PVS

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1993-01-01

    A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of specific example - an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided and thus it can be effectively used while one is sitting at a computer terminal.

  17. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error and the difference in the results is used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
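
    A minimal sketch of the proposed recipe under stated assumptions (the five heat transfer coefficients below are synthetic placeholders for input-perturbed CFD runs): collect the outputs of the varied runs and form a Student-t confidence interval around the prediction.

        # Student-t interval for a small sample of perturbed-input CFD results.
        import numpy as np
        from scipy import stats

        h = np.array([51.2, 49.8, 50.6, 52.1, 50.0])   # W/m^2-K from perturbed runs
        mean, sem = h.mean(), stats.sem(h)
        lo, hi = stats.t.interval(0.95, df=len(h) - 1, loc=mean, scale=sem)
        print(f"h = {mean:.1f} W/m^2-K, 95% CI [{lo:.1f}, {hi:.1f}]")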

  18. Experimental verification of the shape of the excitation depth distribution function for AES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tougaard, S.; Jablonski, A.; Institute of Physical Chemistry, Polish Academy of Sciences, ul. Kasprzaka 44/52, 01-224 Warsaw

    2011-09-15

    In the common formalism of AES, it is assumed that the in-depth distribution of ionizations is uniform. There are experimental indications that this assumption may not be true for certain primary electron energies and solids. The term "excitation depth distribution function" (EXDDF) has been introduced to describe the distribution of ionizations at energies used in AES. This function is conceptually equivalent to the Phi-rho-z function of electron microprobe analysis (EPMA). There are, however, experimental difficulties in determining this function, in particular for energies below ~10 keV. In the present paper, we investigate the possibility of determining the shape of the EXDDF from the background of inelastically scattered electrons on the low energy side of the Auger electron features in the electron energy spectra. The experimentally determined EXDDFs are compared with EXDDFs determined from Monte Carlo simulations of electron trajectories in solids. It is found that this technique is useful for the experimental determination of the EXDDF function.

  19. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution non-destructive contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces. It is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. This utilized sensing technique does not require a physical or chemical visibility enhancement of the fingerprint residue, thus the original trace remains unaltered for further investigations. No particular feature extraction and verification techniques have been applied to such data, yet. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.

  20. The effect of A teacher questioning strategy training program on teaching behavior, student achievement, and retention

    NASA Astrophysics Data System (ADS)

    Otto, Paul B.; Schuck, Robert F.

    The use of questions in the classroom has been employed throughout the recorded history of teaching. One still hears the term Socratic method during discussions of questioning procedures. The use of teacher questions is presently viewed as a viable procedure for effective instruction. This study was conducted to investigate the feasibility of training teachers in the use of a questioning technique and the resultant effect upon student learning. The Post-Test Only Control Group Design was used in randomly assigning teachers and students to experimental and control groups. A group of teachers was trained in the use of a specific questioning technique. Follow-up periodic observations were made of questioning technique behavior while teaching science units to groups of students. Post-unit achievement tests were administered to the student groups to obtain evidence of a relationship between the implementation of specific types of teacher questions and student achievement and retention. Analysis of observation data indicated a higher use of managerial and rhetorical questions by the control group than the experimental group. The experimental group employed a greater number of recall and data gathering questions as well as higher order data processing and data verification type questions. The student posttest achievement scores for both units of instruction were greater for the experimental groups than for the control groups. The retention scores for both units were greater for the experimental groups than for the control groups.

  1. A variable resolution x-ray detector for computed tomography: I. Theoretical basis and experimental verification.

    PubMed

    DiBianca, F A; Gupta, V; Zeman, H D

    2000-08-01

    A computed tomography imaging technique called variable resolution x-ray (VRX) detection provides detector resolution ranging from that of clinical body scanning to that of microscopy (1 cy/mm to 100 cy/mm). The VRX detection technique is based on a new principle denoted as "projective compression" that allows the detector resolution element to scale proportionally to the image field size. Two classes of VRX detector geometry are considered. Theoretical aspects related to x-ray physics and data sampling are presented. Measured resolution parameters (line-spread function and modulation-transfer function) are presented and discussed. A VRX image that resolves a pair of 50 micron tungsten hairs spaced 30 microns apart is shown.

  2. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vavrek, Jayson R.; Henderson, Brian S.; Danagoulian, Areg

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618–8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. In this paper we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy “genuine” and “hoax” objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.

  3. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence

    DOE PAGES

    Vavrek, Jayson R.; Henderson, Brian S.; Danagoulian, Areg

    2018-04-10

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618–8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. In this paper we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy “genuine” and “hoax” objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.

  4. A new technique for measuring listening and reading literacy in developing countries

    NASA Astrophysics Data System (ADS)

    Greene, Barbara A.; Royer, James M.; Anzalone, Stephen

    1990-03-01

    One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.

  5. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  6. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures captured on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely the horizontal and vertical pen-tip position (x-y position), the pen-tip pressure, and the pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
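
    A minimal sketch of an n-tuple (WISARD-style) verifier with Gaussian thresholding follows; it assumes the four pen signals have already been binarized into a fixed-length bit vector, and every parameter is illustrative rather than the authors' setting.

        import numpy as np
        rng = np.random.default_rng(0)

        class NTupleVerifier:
            """WISARD-style n-tuple memory over a binarized feature vector (the
            pen-position, pressure, and altitude-angle traces are assumed to be
            binarized upstream)."""
            def __init__(self, n_bits, n=8, n_tuples=50):
                self.tuples = [rng.choice(n_bits, size=n, replace=False)
                               for _ in range(n_tuples)]
                self.memory = [set() for _ in range(n_tuples)]

            def train(self, x):                  # x: 0/1 numpy vector
                for mem, idx in zip(self.memory, self.tuples):
                    mem.add(tuple(x[idx]))

            def score(self, x):                  # fraction of tuples recognized
                return np.mean([tuple(x[idx]) in mem
                                for mem, idx in zip(self.memory, self.tuples)])

        def gaussian_threshold(genuine_scores, k=2.0):
            """Gaussian thresholding: accept a signature whose score exceeds
            mean - k*std of the enrolled genuine signatures' scores."""
            return np.mean(genuine_scores) - k * np.std(genuine_scores)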

  7. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  8. Modeling and prediction of extraction profile for microwave-assisted extraction based on absorbed microwave energy.

    PubMed

    Chan, Chung-Hung; Yusoff, Rozita; Ngoh, Gek-Cheng

    2013-09-01

    A modeling technique based on absorbed microwave energy was proposed to model microwave-assisted extraction (MAE) of antioxidant compounds from cocoa (Theobroma cacao L.) leaves. By adapting a suitable extraction model on the basis of the microwave energy absorbed during extraction, the model can predict the extraction profile of MAE at various microwave irradiation powers (100-600 W) and solvent loadings (100-300 ml). Verification against experimental data confirmed that the prediction accurately captured the extraction profile of MAE (R-squared values greater than 0.87). Moreover, the predicted yields from the model showed good agreement with the experimental results, with less than 10% deviation observed. Furthermore, suitable extraction times to ensure high extraction yield at various MAE conditions can be estimated from the absorbed microwave energy. The estimation is feasible, as more than 85% of the active compounds can be extracted when compared with the conventional extraction technique. Copyright © 2013 Elsevier Ltd. All rights reserved.
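
    The core idea, prediction driven by absorbed energy rather than by power and time separately, can be illustrated with a first-order saturating model; the functional form and constants below are a hedged stand-in, not the fitted model from the paper.

        import numpy as np

        def predicted_yield(p_abs_watts, t_seconds, y_inf=100.0, k=2e-4):
            """Extraction yield as a saturating function of absorbed microwave
            energy E = P_abs * t (a minimal first-order stand-in; y_inf and k
            are illustrative constants, not fitted values)."""
            energy = p_abs_watts * t_seconds      # J absorbed
            return y_inf * (1.0 - np.exp(-k * energy))

        # Equal absorbed energy gives equal predicted yield at any power setting:
        print(predicted_yield(300.0, 60.0))       # 300 W for 60 s
        print(predicted_yield(100.0, 180.0))      # 100 W for 180 s (same energy)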

  9. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. Rather than simply proposing a new planning language, we discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints. Additionally, we provide an initial assessment of the efficiency of model checking in searching for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.

  10. Unravelling the electrochemical double layer by direct probing of the solid/liquid interface

    PubMed Central

    Favaro, Marco; Jeong, Beomgyun; Ross, Philip N.; Yano, Junko; Hussain, Zahid; Liu, Zhi; Crumlin, Ethan J.

    2016-01-01

    The electrochemical double layer plays a critical role in electrochemical processes. Whilst there have been many theoretical models predicting structural and electrical organization of the electrochemical double layer, the experimental verification of these models has been challenging due to the limitations of available experimental techniques. The induced potential drop in the electrolyte has never been directly observed and verified experimentally, to the best of our knowledge. In this study, we report the direct probing of the potential drop as well as the potential of zero charge by means of ambient pressure X-ray photoelectron spectroscopy performed under polarization conditions. By analyzing the spectra of the solvent (water) and a spectator neutral molecule with numerical simulations of the electric field, we discern the shape of the electrochemical double layer profile. In addition, we determine how the electrochemical double layer changes as a function of both the electrolyte concentration and applied potential. PMID:27576762

  11. Ethylene Decomposition Initiated by Ultraviolet Radiation from Low Pressure Mercury Lamps: Kinetics Model Prediction and Experimental Verification.

    NASA Astrophysics Data System (ADS)

    Jozwiak, Zbigniew Boguslaw

    1995-01-01

    Ethylene is an important auto-catalytic plant growth hormone. Removal of ethylene from the atmosphere surrounding ethylene-sensitive horticultural products may be very beneficial, allowing an extended period of storage and preventing or delaying the induction of disorders. Various ethylene removal techniques have been studied and put into practice. One technique is based on using low pressure mercury ultraviolet lamps as a source of photochemical energy to initiate chemical reactions that destroy ethylene. Although previous research showed that ethylene disappeared in experiments with mercury ultraviolet lamps, the reactions were not described and the actual cause of ethylene disappearance remained unknown. Proposed causes for this disappearance were the direct action of ultraviolet rays on ethylene, reaction of ethylene with ozone (which is formed when air or gas containing molecular oxygen is exposed to radiation emitted by this type of lamp), or reactions with atomic oxygen leading to the formation of ozone. The objective of the present study was to determine the set of physical and chemical actions leading to the disappearance of ethylene from an artificial storage atmosphere under conditions of ultraviolet irradiation. The goal was achieved by developing a static chemical model based on the physical properties of a commercially available ultraviolet lamp, the photochemistry of gases, and the kinetics of chemical reactions. The model was used to perform computer simulations predicting time-dependent concentrations of the chemical species included in the model. Development of the model was accompanied by the design of a reaction chamber used for experimental verification. The model provided a good prediction of the general behavior of the species involved in the chemistry under consideration; however, the model predicted a lower rate of ethylene disappearance than was measured. Possible reasons for the model-experiment disagreement are radiation intensity averaging, the experimental technique, mass transfer in the chamber, and incompleteness of the set of chemical reactions included in the model. The work concludes with guidelines for the development of a more complex mathematical model that includes elements of mass transfer inside the reaction chamber and uses a three-dimensional approach to distribute radiation from the low pressure mercury ultraviolet tube.
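
    The structure of such a static kinetics model can be sketched as a small ODE system; the three-reaction mechanism and rate constants below are deliberately toy values, not the thesis's mechanism.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy 3-reaction scheme (rate constants are placeholders):
        #   O2 --uv--> 2 O ;  O + O2 -> O3 ;  O3 + C2H4 -> products
        j_uv, k_o3, k_eth = 1e-6, 1e-3, 5e-4

        def rhs(t, y):
            o2, o, o3, c2h4 = y
            r1 = j_uv * o2
            r2 = k_o3 * o * o2
            r3 = k_eth * o3 * c2h4
            return [-r1 - r2, 2 * r1 - r2, r2 - r3, -r3]

        y0 = [2.1e5, 0.0, 0.0, 10.0]              # arbitrary concentration units
        sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA")
        print("ethylene remaining:", sol.y[3, -1])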

  12. Verification of kinetic schemes of hydrogen ignition and combustion in air

    NASA Astrophysics Data System (ADS)

    Fedorov, A. V.; Fedorova, N. N.; Vankova, O. S.; Tropin, D. A.

    2018-03-01

    Three chemical kinetic models for hydrogen combustion in oxygen and three gas-dynamic models for reactive mixture flow behind the initiating SW front were analyzed. The calculated results were compared with experimental data on the dependences of the ignition delay on the temperature and the dilution of the mixture with argon or nitrogen. Based on detailed kinetic mechanisms of nonequilibrium chemical transformations, a mathematical technique for describing the ignition and combustion of hydrogen in air was developed using the ANSYS Fluent code. The problem of ignition of a hydrogen jet fed coaxially into supersonic flow was solved numerically. The calculations were carried out using the Favre-averaged Navier-Stokes equations for a multi-species gas taking into account chemical reactions combined with the k-ω SST turbulence model. The problem was solved in several steps. In the first step, verification of the calculated and experimental data for the three kinetic schemes was performed without considering the conicity of the flow. In the second step, parametric calculations were performed to determine the influence of the conicity of the flow on the mixing and ignition of hydrogen in air using a kinetic scheme consisting of 38 reactions. Three conical supersonic nozzles for a Mach number M = 2 with different expansion angles β = 4°, 4.5°, and 5° were considered.

  13. Offline signature verification using convolution Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

    This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework that combines the two stages and can be trained end-to-end. Experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
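
    A generic formulation of the metric-learning half of this setup is the contrastive loss on the twin branches' embeddings, sketched below in plain NumPy (the network architecture itself is not reproduced here).

        import numpy as np

        def contrastive_loss(f1, f2, same, margin=1.0):
            """Contrastive loss on a pair of embeddings f1, f2 from the twin
            branches: pull genuine pairs together, push forgeries apart.
            (Generic formulation, not the paper's exact training setup.)"""
            d = np.linalg.norm(f1 - f2)
            if same:                                   # genuine-genuine pair
                return 0.5 * d**2
            return 0.5 * max(0.0, margin - d)**2       # genuine-forgery pair

        # At test time a pair is accepted as genuine when d < threshold.
        f_a, f_b = np.array([0.1, 0.9]), np.array([0.8, 0.2])
        print(contrastive_loss(f_a, f_b, same=False))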

  14. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  15. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  16. Numerical simulation and experimental verification of extended source interferometer

    NASA Astrophysics Data System (ADS)

    Hou, Yinlong; Li, Lin; Wang, Shanshan; Wang, Xiao; Zang, Haijun; Zhu, Qiudong

    2013-12-01

    Extended source interferometers, compared with classical point source interferometers, can suppress coherent noise from the environment and system, decrease dust-scattering effects, and reduce the high-frequency error of the reference surface. Numerical simulation and experimental verification of an extended source interferometer are discussed in this paper. In order to provide guidance for the experiment, the extended source interferometer was modeled using the optical design software Zemax. Matlab code was written to adjust the field parameters of the optical system automatically and to collect a series of interferometric data conveniently; Dynamic Data Exchange (DDE) was used to connect Zemax and Matlab. The visibility of the interference fringes can then be calculated by summing the collected interferometric data. Alongside the simulation, an experimental platform for the extended source interferometer was established, consisting of an extended source, an interference cavity, and an image collection system. The decrease of the high-frequency error of the reference surface and of the coherent noise of the environment is verified. The relation between the spatial coherence and the size, shape, and intensity distribution of the extended source is also verified through analysis of the visibility of the interference fringes. The simulation results are in line with the results given by the real extended source interferometer, showing that the model simulates the actual optical interference of the extended source interferometer quite well. Therefore, the simulation platform can be used to guide experiments on interferometers based on various extended sources.
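
    The fringe-visibility calculation at the heart of the analysis is straightforward; the sketch below computes Michelson visibility from a simulated two-beam pattern with an assumed degree of coherence.

        import numpy as np

        def fringe_visibility(intensity):
            """Michelson visibility V = (Imax - Imin) / (Imax + Imin) of a
            recorded fringe pattern (1-D cut through the interferogram)."""
            i_max, i_min = np.max(intensity), np.min(intensity)
            return (i_max - i_min) / (i_max + i_min)

        # Two-beam interference with partial coherence |gamma| (illustrative):
        x = np.linspace(0.0, 4.0 * np.pi, 1000)
        gamma = 0.6                                  # assumed coherence degree
        pattern = 1.0 + gamma * np.cos(x)
        print(fringe_visibility(pattern))            # ~0.6, recovering |gamma|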

  17. Finger vein recognition using local line binary pattern.

    PubMed

    Rosdi, Bakhtiar Affendi; Shing, Chai Wuh; Suandi, Shahrel Azmin

    2011-01-01

    In this paper, a personal verification method using finger vein is presented. Finger vein can be considered more secured compared to other hands based biometric traits such as fingerprint and palm print because the features are inside the human body. In the proposed method, a new texture descriptor called local line binary pattern (LLBP) is utilized as feature extraction technique. The neighbourhood shape in LLBP is a straight line, unlike in local binary pattern (LBP) which is a square shape. Experimental results show that the proposed method using LLBP has better performance than the previous methods using LBP and local derivative pattern (LDP).
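
    A simplified LLBP operator, thresholding a horizontal and a vertical line of pixels against the centre and combining the two codes into a magnitude, might look as follows; the line length and border handling are illustrative choices.

        import numpy as np

        def llbp(img, n=13):
            """Local line binary pattern (simplified): threshold the n pixels
            along a horizontal and a vertical line against the centre pixel,
            pack the bits into two codes, and combine them into a magnitude.
            Border pixels are skipped for brevity."""
            h = n // 2
            rows, cols = img.shape
            out = np.zeros((rows, cols))
            for r in range(h, rows - h):
                for c in range(h, cols - h):
                    centre = img[r, c]
                    line_h = np.delete(img[r, c - h:c + h + 1], h)  # horizontal
                    line_v = np.delete(img[r - h:r + h + 1, c], h)  # vertical
                    bits_h = (line_h >= centre).astype(int)
                    bits_v = (line_v >= centre).astype(int)
                    code_h = sum(b << i for i, b in enumerate(bits_h))
                    code_v = sum(b << i for i, b in enumerate(bits_v))
                    out[r, c] = np.hypot(code_h, code_v)            # magnitude
            return out

        img = np.random.default_rng(1).random((20, 20))   # stand-in vein image
        codes = llbp(img)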

  18. Electromagnetic Modelling of MMIC CPWs for High Frequency Applications

    NASA Astrophysics Data System (ADS)

    Sinulingga, E. P.; Kyabaggu, P. B. K.; Rezazadeh, A. A.

    2018-02-01

    Realising the theoretical electrical characteristics of components through modelling can be carried out using computer-aided design (CAD) simulation tools. If the simulation model provides the expected characteristics, the fabrication process of Monolithic Microwave Integrated Circuit (MMIC) can be performed for experimental verification purposes. Therefore improvements can be suggested before mass fabrication takes place. This research concentrates on development of MMIC technology by providing accurate predictions of the characteristics of MMIC components using an improved Electromagnetic (EM) modelling technique. The knowledge acquired from the modelling and characterisation process in this work can be adopted by circuit designers for various high frequency applications.

  19. A simulation technique for predicting thickness of thermal sprayed coatings

    NASA Technical Reports Server (NTRS)

    Goedjen, John G.; Miller, Robert A.; Brindley, William J.; Leissler, George W.

    1995-01-01

    The complexity of many of the components being coated today using the thermal spray process makes the trial and error approach traditionally followed in depositing a uniform coating inadequate, thereby necessitating a more analytical approach to developing robotic trajectories. A two dimensional finite difference simulation model has been developed to predict the thickness of coatings deposited using the thermal spray process. The model couples robotic and component trajectories and thermal spraying parameters to predict coating thickness. Simulations and experimental verification were performed on a rotating disk to evaluate the predictive capabilities of the approach.
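
    The accumulation step of such a simulation can be sketched in one dimension: a Gaussian spray footprint swept along discrete robot positions, with thickness summed per pass. All parameters below are illustrative, not the calibrated model.

        import numpy as np

        # 1-D analogue of the deposition model: a Gaussian spray footprint swept
        # along a robot path; thickness accumulates over the dwell positions.
        x = np.linspace(-50.0, 50.0, 501)                # surface coordinate, mm
        sigma, rate = 8.0, 0.5                           # footprint width, um/pass
        torch_positions = np.arange(-30.0, 31.0, 5.0)    # robot trajectory, mm

        thickness = np.zeros_like(x)
        for xc in torch_positions:
            thickness += rate * np.exp(-(x - xc)**2 / (2.0 * sigma**2))

        print("max thickness (um):", thickness.max())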

  20. Fracture modes in notched angleplied composite laminates

    NASA Technical Reports Server (NTRS)

    Irvine, T. B.; Ginty, C. A.

    1984-01-01

    The Composite Durability Structural Analysis (CODSTRAN) computer code is used to determine composite fracture. Fracture modes in solid and notched, unidirectional and angleplied graphite/epoxy composites were determined by using CODSTRAN. Experimental verification included both nondestructive (ultrasonic C-Scanning) and destructive (scanning electron microscopy) techniques. The fracture modes were found to be a function of ply orientations and whether the composite is notched or unnotched. Delaminations caused by stress concentrations around notch tips were also determined. Results indicate that the composite mechanics, structural analysis, laminate analysis, and fracture criteria modules embedded in CODSTRAN are valid for determining composite fracture modes.

  1. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
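
    The MMS workflow the paper applies can be illustrated on a much smaller problem: manufacture a solution, derive its source term, solve, and confirm the observed order of accuracy. The 1-D Poisson example below is a stand-in, not the LAVA equations.

        import numpy as np

        def solve_poisson(n):
            """Second-order FD solve of -u'' = f on (0,1), u(0)=u(1)=0, with the
            manufactured solution u = sin(pi x), hence f = pi^2 sin(pi x)."""
            x = np.linspace(0.0, 1.0, n + 1)
            h = 1.0 / n
            f = np.pi**2 * np.sin(np.pi * x[1:-1])
            A = (np.diag(2.0 * np.ones(n - 1))
                 - np.diag(np.ones(n - 2), 1)
                 - np.diag(np.ones(n - 2), -1)) / h**2
            u = np.zeros(n + 1)
            u[1:-1] = np.linalg.solve(A, f)
            return np.max(np.abs(u - np.sin(np.pi * x)))  # discretization error

        e1, e2 = solve_poisson(32), solve_poisson(64)
        print("observed order:", np.log2(e1 / e2))  # ~2 confirms the scheme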

  2. Nonlinear earthquake analysis of reinforced concrete frames with fiber and Bernoulli-Euler beam-column element.

    PubMed

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained by the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames, and a predictor-corrector form of the Bossak-α method is applied as the dynamic integration scheme. A comparison of experimental data for an RC column element with numerical results obtained from the proposed solution technique is presented for verification of the numerical solutions. Furthermore, nonlinear cyclic analysis results for a reinforced concrete portal frame are obtained to compare the proposed solution technique with a fibre element based on the flexibility method. Finally, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load, and the damage regions, propagation, and intensities according to both approaches are examined.
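
    As a simplified stand-in for the predictor-corrector Bossak-α integrator, the sketch below implements the closely related average-acceleration Newmark scheme for a linear single-degree-of-freedom system.

        import numpy as np

        def newmark_sdof(m, c, k, f, dt, beta=0.25, gamma=0.5):
            """Average-acceleration Newmark integration of m*u'' + c*u' + k*u = f(t)
            (a standard linear stand-in for the Bossak-alpha scheme)."""
            n = len(f)
            u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
            a[0] = (f[0] - c * v[0] - k * u[0]) / m
            keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
            for i in range(n - 1):
                dp = (f[i + 1] - f[i]
                      + m * (v[i] / (beta * dt) + a[i] / (2.0 * beta))
                      + c * (gamma * v[i] / beta
                             + dt * a[i] * (gamma / (2.0 * beta) - 1.0)))
                du = dp / keff
                u[i + 1] = u[i] + du
                v[i + 1] = (gamma / (beta * dt)) * du \
                           + (1.0 - gamma / beta) * v[i] \
                           + dt * a[i] * (1.0 - gamma / (2.0 * beta))
                a[i + 1] = (f[i + 1] - c * v[i + 1] - k * u[i + 1]) / m
            return u, v, a

        dt = 0.01
        t = np.arange(0.0, 10.0, dt)
        f = np.sin(2.0 * np.pi * 1.0 * t)         # harmonic load (illustrative)
        u, v, a = newmark_sdof(m=1.0, c=0.6, k=400.0, f=f, dt=dt)
        print("peak displacement:", np.abs(u).max())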

  3. An experimental investigation of nacelle-pylon installation on an unswept wing at subsonic and transonic speeds

    NASA Technical Reports Server (NTRS)

    Carlson, J. R.; Compton, W. B., III

    1984-01-01

    A wind tunnel investigation was conducted to determine the aerodynamic interference associated with the installation of a long duct, flow-through nacelle on a straight unswept untapered supercritical wing. Experimental data was obtained for the verification of computational prediction techniques. The model was tested in the 16-Foot Transonic Tunnel at Mach numbers from 0.20 to 0.875 and at angles of attack from about 0 deg to 5 deg. The results of the investigation show that strong viscous and compressibility effects are present at the transonic Mach numbers. Numerical comparisons show that linear theory is adequate for subsonic Mach number flow prediction, but is inadequate for prediction of the extreme flow conditions that exist at the transonic Mach numbers.

  4. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  5. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  6. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  7. Experimental technologies comparison for strain measurement of a composite main landing gear bay specimen

    NASA Astrophysics Data System (ADS)

    Viscardi, Massimo; Arena, Maurizio; Ciminello, Monica; Guida, Michele; Meola, Carosena; Cerreta, Pietro

    2018-03-01

    The development of advanced monitoring systems for strain measurements on aeronautical components remains an important target, both for optimizing the lead time and cost of part validation, allowing earlier entry into service, and for implementing advanced health-monitoring systems dedicated to in-service parameter verification and early-stage detection of structural problems. The paper deals with the experimental testing of a set of composite samples of the main landing gear bay for a CS-25 category aircraft, realized through an innovative design and production process. The tests offered a good opportunity for direct comparison of different strain measurement techniques: strain gauges (SG) and fibre Bragg gratings (FBG) were used, as well as non-contact techniques, specifically digital image correlation (DIC) and infrared (IR) thermography, applied where possible to highlight possible hot spots during the tests. The identification of the crucial points on the specimens was supported by advanced finite element simulations, aimed at assessing structural strength and deformation as well as ensuring the best performance and the global safety of the whole experimental campaign.

  8. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  9. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
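
    A minimal computerized screening routine of the kind described, checking observations against range and rate-of-change criteria from a screen file, could look like this; the thresholds and data are hypothetical.

        import numpy as np

        def screen(values, vmin, vmax, max_step):
            """Flag observations that violate simple verification criteria
            (allowable range and maximum step change); thresholds would come
            from the screen file in a real system."""
            v = np.asarray(values, dtype=float)
            flags = (v < vmin) | (v > vmax)
            flags[1:] |= np.abs(np.diff(v)) > max_step
            return flags

        stage_ft = [4.2, 4.3, 4.25, 9.8, 4.3]     # hypothetical gage heights
        print(screen(stage_ft, vmin=0.0, vmax=8.0, max_step=1.0))
        # [False False False  True  True] -> the spike and its recovery are flagged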

  10. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes to use built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time; that is, to ensure that the necessary authentication steps and algorithms running on the mobile device application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
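
    The PIN-seeded integrity signature can be sketched with standard primitives: derive a key from the PIN and compute a keyed digest over the verification routine's code image. The construction below (PBKDF2 plus HMAC) is a generic assumption, since the SecurePhone internals are not given in the abstract.

        import hashlib, hmac

        def integrity_tag(pin: str, code_bytes: bytes,
                          salt: bytes = b"device-salt") -> bytes:
            """Derive a key from the user's PIN and compute an HMAC over the
            verification routine's image (generic built-in-self-test sketch,
            not the project's actual construction)."""
            key = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
            return hmac.new(key, code_bytes, hashlib.sha256).digest()

        code = b"\x7fELF...verification-routine-image"  # stand-in for BIST target
        reference = integrity_tag("1234", code)         # stored at enrolment
        assert hmac.compare_digest(reference, integrity_tag("1234", code))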

  11. The Inhibition of the Rayleigh-Taylor Instability by Rotation.

    PubMed

    Baldwin, Kyle A; Scase, Matthew M; Hill, Richard J A

    2015-07-01

    It is well-established that the Coriolis force that acts on fluid in a rotating system can act to stabilise otherwise unstable flows. Chandrasekhar considered theoretically the effect of the Coriolis force on the Rayleigh-Taylor instability, which occurs at the interface between a dense fluid lying on top of a lighter fluid under gravity, concluding that rotation alone could not stabilise this system indefinitely. Recent numerical work suggests that rotation may, nevertheless, slow the growth of the instability. Experimental verification of these results using standard techniques is problematic, owing to the practical difficulty in establishing the initial conditions. Here, we present a new experimental technique for studying the Rayleigh-Taylor instability under rotation that side-steps the problems encountered with standard techniques by using a strong magnetic field to destabilize an otherwise stable system. We find that rotation about an axis normal to the interface acts to retard the growth rate of the instability and stabilise long wavelength modes; the scale of the observed structures decreases with increasing rotation rate, asymptoting to a minimum wavelength controlled by viscosity. We present a critical rotation rate, dependent on Atwood number and the aspect ratio of the system, for stabilising the most unstable mode.

  12. The Inhibition of the Rayleigh-Taylor Instability by Rotation

    PubMed Central

    Baldwin, Kyle A.; Scase, Matthew M.; Hill, Richard J. A.

    2015-01-01

    It is well-established that the Coriolis force that acts on fluid in a rotating system can act to stabilise otherwise unstable flows. Chandrasekhar considered theoretically the effect of the Coriolis force on the Rayleigh-Taylor instability, which occurs at the interface between a dense fluid lying on top of a lighter fluid under gravity, concluding that rotation alone could not stabilise this system indefinitely. Recent numerical work suggests that rotation may, nevertheless, slow the growth of the instability. Experimental verification of these results using standard techniques is problematic, owing to the practical difficulty in establishing the initial conditions. Here, we present a new experimental technique for studying the Rayleigh-Taylor instability under rotation that side-steps the problems encountered with standard techniques by using a strong magnetic field to destabilize an otherwise stable system. We find that rotation about an axis normal to the interface acts to retard the growth rate of the instability and stabilise long wavelength modes; the scale of the observed structures decreases with increasing rotation rate, asymptoting to a minimum wavelength controlled by viscosity. We present a critical rotation rate, dependent on Atwood number and the aspect ratio of the system, for stabilising the most unstable mode. PMID:26130005

  13. 76 FR 81991 - National Spectrum Sharing Research Experimentation, Validation, Verification, Demonstration and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... non-federal community, including the academic, commercial, and public safety sectors, to implement a..., Verification, Demonstration and Trials: Technical Workshop II on Coordinating Federal Government/Private Sector Spectrum Innovation Testing Needs AGENCY: The National Coordination Office (NCO) for Networking and...

  14. Bayesian truthing as experimental verification of C4ISR sensors

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew

    2015-05-01

    In this paper, the general methodology for experimental verification/validation of the performance of C4ISR and other sensors is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C3ISR, QC, ATR (Automatic Target Recognition), terrorism-related events, and many others. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
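
    The core of Bayesian truthing for a binary sensor is the posterior probability of a true event given an alarm; the sketch below computes it from sensitivity, specificity, and prevalence, with illustrative numbers.

        def posterior_probability(sensitivity, specificity, prevalence):
            """Bayesian 'truthing' of a binary sensor: probability that a target
            is present given an alarm (positive predictive value)."""
            p_alarm = (sensitivity * prevalence
                       + (1.0 - specificity) * (1.0 - prevalence))
            return sensitivity * prevalence / p_alarm

        # A 95%-sensitive, 95%-specific sensor on a rare (1%) threat:
        print(posterior_probability(0.95, 0.95, 0.01))  # ~0.16: most alarms false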

  15. Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification

    NASA Technical Reports Server (NTRS)

    Melton, D. M.

    1998-01-01

    Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a supersonic gas-liquid cleaning system; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-dichloroethylene), and HFE 7100DE (HFE/1,2-dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.

  16. Application of a Fuzzy Verification Technique for Assessment of the Weather Running Estimate-Nowcast (WRE-N) Model

    DTIC Science & Technology

    2016-10-01

    ...comes when considering numerous scores and statistics during a preliminary evaluation of the applicability of the fuzzy-verification minimum coverage... The selection of thresholds with which to generate categorical-verification scores and statistics from the application of both traditional and... of statistically significant numbers of cases; the latter presents a challenge of limited application for assessment of the forecast models' ability...

  17. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only for knowing which scores best quantify a particular skill of a model but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration, and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
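
    For the dichotomous part of such a verification, the usual scores follow directly from the contingency table, as in the sketch below (the counts are hypothetical).

        def categorical_scores(hits, misses, false_alarms):
            """Standard dichotomous QPF scores from a contingency table built by
            thresholding forecast and observed (radar-derived) rain amounts."""
            pod = hits / (hits + misses)                 # probability of detection
            far = false_alarms / (hits + false_alarms)   # false alarm ratio
            csi = hits / (hits + misses + false_alarms)  # critical success index
            return pod, far, csi

        # Hypothetical 24-h verification at the 10 mm threshold:
        print(categorical_scores(hits=42, misses=18, false_alarms=25))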

  18. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we define the concept space of a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  19. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper reports recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and experiments using fission-spectrum neutron sources to assess neutron transmission through composite low-Z attenuators.
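
    The transmission analysis can be caricatured as a small linear inversion: measured transmissions linearize to -ln T = S n, solvable for areal densities under a non-negativity constraint. The cross sections and energies below are placeholders, not evaluated nuclear data.

        import numpy as np
        from scipy.optimize import nnls

        # Transmission through a composite attenuator: T(E) = exp(-sum_i n_i*sigma_i(E)).
        # Given T at several energies and per-element cross sections (columns of S),
        # recover the areal densities n_i.
        energies = np.array([0.8, 1.5, 2.4, 3.7, 4.5])        # MeV (illustrative)
        S = np.array([[2.1, 1.4], [1.6, 2.2], [1.2, 0.9],
                      [0.8, 1.7], [0.6, 1.1]])                # barns (illustrative)
        n_true = np.array([0.30, 0.15])                       # atoms/barn
        T = np.exp(-S @ n_true)                               # synthetic measurement

        n_fit, _ = nnls(S, -np.log(T))   # linearize: -ln T = S n, with n >= 0
        print(n_fit)                     # recovers ~[0.30, 0.15]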

  20. Experimental Verification Of The Osculating Cones Method For Two Waverider Forebodies At Mach 4 and 6

    NASA Technical Reports Server (NTRS)

    Miller, Rolf W.; Argrow, Brian M.; Center, Kenneth B.; Brauckmann, Gregory J.; Rhode, Matthew N.

    1998-01-01

    The NASA Langley Research Center Unitary Plan Wind Tunnel and the 20-Inch Mach 6 Tunnel were used to test two osculating cones waverider models. The Mach-4 and Mach-6 shapes were generated using the interactive design tool WIPAR. WIPAR performance predictions are compared to the experimental results. Vapor screen results for the Mach-4 model at the on-design Mach number provide visual verification that the shock is attached along the entire leading edge, within the limits of observation. WIPAR predictions of pressure distributions and aerodynamic coefficients show general agreement with the corresponding experimental values.

  1. Homolytic Cleavage of a B-B Bond by the Cooperative Catalysis of Two Lewis Bases: Computational Design and Experimental Verification.

    PubMed

    Wang, Guoqiang; Zhang, Honglin; Zhao, Jiyang; Li, Wei; Cao, Jia; Zhu, Chengjian; Li, Shuhua

    2016-05-10

    Density functional theory (DFT) investigations revealed that 4-cyanopyridine was capable of homolytically cleaving the B-B σ bond of diborane via cooperative coordination to the two boron atoms of the diborane to generate pyridine boryl radicals. Our experimental verification provides supportive evidence for this new B-B activation mode. With this novel activation strategy, we have experimentally realized the catalytic reduction of azo-compounds to hydrazine derivatives, deoxygenation of sulfoxides to sulfides, and reduction of quinones with B2(pin)2 under mild conditions. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Intermediate Experimental Vehicle (IXV): Avionics and Software of the ESA Reentry Demonstrator

    NASA Astrophysics Data System (ADS)

    Malucchi, Giovanni; Dussy, Stephane; Camuffo, Fabrizio

    2012-08-01

    The IXV project is conceived as a technology platform that takes a step forward with respect to the Atmospheric Reentry Demonstrator (ARD) by increasing the system maneuverability and verifying the critical technology performances against a wider re-entry corridor. The main objective is to design, develop and perform an in-flight verification of an autonomous lifting and aerodynamically controlled (by a combined use of thrusters and aerodynamic surfaces) reentry system. The project also includes the verification and experimentation of a set of critical reentry technologies and disciplines: Thermal Protection System (TPS), for verification and characterization of thermal protection technologies in a representative operational environment; Aerodynamics-Aerothermodynamics (AED-ATD), for understanding and validation of aerodynamic and aerothermodynamic phenomena with improvement of design tools; Guidance, Navigation and Control (GNC), for verification of guidance, navigation and control techniques in a representative operational environment (i.e., reentry from Low Earth Orbit); and flight dynamics, to update and validate the vehicle model during actual flight, focused on stability and control derivatives. The above activities are being performed through the implementation of a strict system design-to-cost approach with a proto-flight model development philosophy. In 2008 and 2009, the IXV project activities reached the successful completion of the project Phase-B, including the System PDR, and early project Phase-C. In 2010, following a re-organization of the industrial consortium, the IXV project successfully completed a design consolidation leading to an optimization of the technical baseline, including the GNC, avionics (i.e., power, data handling, radio frequency and telemetry), measurement sensors, hot and cold composite structures, and thermal protections and control, with significant improvements of the main system budgets. The project successfully closed the System CDR during 2011 and is currently running Phase-D, with the target of being launched with Vega from Kourou in 2014. The paper will provide an overview of the IXV design and mission objectives in the frame of the overall atmospheric reentry activities, focusing on the avionics and software architecture and design.

  3. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
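
    The flavor of such a verification can be suggested by making the binary-search loop invariant executable, as below; this is a runtime-checked sketch, not the Penelope machine-assisted proof.

        def binary_search(a, key):
            """Binary search with its correctness argument made executable: the
            loop invariant (key, if present, lies in a[lo:hi]) is asserted on
            every iteration. a must be sorted ascending."""
            lo, hi = 0, len(a)
            while lo < hi:
                assert all(a[i] < key for i in range(lo)), "left part < key"
                assert all(a[i] > key for i in range(hi, len(a))), "right part > key"
                mid = (lo + hi) // 2
                if a[mid] < key:
                    lo = mid + 1
                elif a[mid] > key:
                    hi = mid
                else:
                    return mid
            return -1

        assert binary_search([1, 3, 5, 7, 9], 7) == 3
        assert binary_search([1, 3, 5, 7, 9], 4) == -1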

  4. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
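
    The property being confirmed can be stated very compactly for a single voter: every single corrupted copy is masked. The sketch below verifies that exhaustively for a one-bit majority voter; real TMR verification must of course establish this across a full netlist, which is where the complexity the abstract describes arises.

        def voter(a, b, c):
            """Majority voter inserted by TMR: the output agrees with at least
            two of the three redundant copies."""
            return (a & b) | (b & c) | (a & c)

        # Exhaustive confirmation of single-fault masking: for every input bit
        # and every single corrupted copy, the voted output is unchanged.
        for bit in (0, 1):
            for faulty in range(3):
                copies = [bit, bit, bit]
                copies[faulty] ^= 1              # inject one upset
                assert voter(*copies) == bit
        print("single-fault masking verified for all cases")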

  5. Cleanup Verification Package for the 100-F-20, Pacific Northwest Laboratory Parallel Pits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Appel

    2007-01-22

    This cleanup verification package documents completion of remedial action for the 100-F-20, Pacific Northwest Laboratory Parallel Pits waste site. This waste site consisted of two earthen trenches thought to have received both radioactive and nonradioactive material related to the 100-F Experimental Animal Farm.

  6. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex, owing to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  7. A Vibration-Based Strategy for Health Monitoring of Offshore Pipelines' Girth-Welds

    PubMed Central

    Razi, Pejman; Taheri, Farid

    2014-01-01

    This study presents numerical simulations and experimental verification of a vibration-based damage detection technique. Health monitoring of a submerged pipe's girth-weld against an advancing notch is attempted. Piezoelectric transducers are bonded on the pipe for sensing or actuation purposes. Vibration of the pipe is excited by two means: (i) an impulsive force; (ii) using one of the piezoelectric transducers as an actuator to propagate chirp waves into the pipe. The methodology adopts the empirical mode decomposition (EMD), which processes vibration data to establish energy-based damage indices. The results obtained from both the numerical and experimental studies confirm the integrity of the approach in identifying the existence and progression of the advancing notch. The study also discusses and compares the performance of the two vibration excitation means in damage detection. PMID:25225877

  8. Verification of rain-flow reconstructions of a variable amplitude load history. M.S. Thesis, 1990 Final Report

    NASA Technical Reports Server (NTRS)

    Clothiaux, John D.; Dowling, Norman E.

    1992-01-01

    The suitability of using rain-flow reconstructions as an alternative to an original loading spectrum for component fatigue life testing is investigated. A modified helicopter maneuver history is used for the rain-flow cycle counting and history regenerations. Experimental testing on a notched test specimen over a wide range of loads produces similar lives for the original history and the reconstructions. The test lives also agree with a simplified local strain analysis performed on the specimen utilizing the rain-flow cycle count. The rain-flow reconstruction technique is shown to be a viable test spectrum alternative to storing the complete original load history, especially in saving computer storage space and processing time. A description of the regeneration method, the simplified life prediction analysis, and the experimental methods are included in the investigation.
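
    The counting step underlying the reconstructions can be sketched with the classic three-point rainflow rule; the version below counts full cycles only and leaves the unclosed residue, a simplification of the standard algorithm.

        def rainflow_cycles(reversals):
            """Three-point rainflow counting (simplified): returns full-cycle
            ranges extracted from a sequence of peak/valley reversals; the
            unclosed residue remains in the stack."""
            stack, cycles = [], []
            for point in reversals:
                stack.append(point)
                while len(stack) >= 3:
                    x = abs(stack[-1] - stack[-2])   # most recent range
                    y = abs(stack[-2] - stack[-3])   # previous range
                    if x < y:
                        break
                    cycles.append(y)                 # close the inner cycle
                    del stack[-3:-1]                 # drop the two closed reversals
            return cycles, stack                     # cycle ranges, residue

        print(rainflow_cycles([0, 5, 1, 4, 2, 6]))   # ([2, 4], [0, 6])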

  9. Development of a Tomography Technique for Assessment of the Material Condition of Concrete Using Optimized Elastic Wave Parameters.

    PubMed

    Chai, Hwa Kian; Liu, Kit Fook; Behnia, Arash; Yoshikazu, Kobayashi; Shiotani, Tomoki

    2016-04-16

    Concrete is the most ubiquitous construction material. Apart from the fresh and early-age properties of concrete, its condition during the structure's life span affects the overall structural performance. Therefore, techniques such as non-destructive testing, which enable investigation of the material condition, are in great demand. The tomography technique has become an increasingly popular non-destructive evaluation technique for civil engineers to assess the condition of concrete structures. In the present study, this technique is investigated by developing reconstruction procedures utilizing different parameters of elastic waves, namely the travel time, wave amplitude, wave frequency, and Q-value. In the development of the algorithms, a ray-tracing feature was adopted to take into account the actual non-linear propagation of elastic waves in concrete containing defects. Numerical simulation accompanied by experimental verification of wave motion was conducted to obtain wave propagation profiles in concrete containing honeycomb as a defect and in assessing the tendon duct filling of pre-stressed concrete (PC) elements. The detection of defects by the developed tomography reconstruction procedures was evaluated and discussed.
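
    The travel-time side of such a reconstruction reduces to an algebraic inversion of ray-path lengths against measured times; the sketch below uses straight rays and Kaczmarz (ART) sweeps on a 2x2 toy model, whereas the paper's procedures trace curved ray paths.

        import numpy as np

        def art(L, t, n_iter=200, lam=0.5):
            """Algebraic reconstruction (Kaczmarz sweeps) for travel-time
            tomography: L[i, j] is the length of ray i in cell j, t[i] the
            measured travel time; the unknowns are cell slownesses."""
            s = np.zeros(L.shape[1])
            for _ in range(n_iter):
                for Li, ti in zip(L, t):
                    s += lam * (ti - Li @ s) / (Li @ Li) * Li
            return s

        # 2x2 cell model probed by two row rays and two column rays (unit cells):
        L = np.array([[1, 1, 0, 0], [0, 0, 1, 1],
                      [1, 0, 1, 0], [0, 1, 0, 1]], dtype=float)
        s_true = np.array([1.0, 1.0, 1.0, 2.0])   # slow cell = possible defect
        s_rec = art(L, L @ s_true)
        print(s_rec.round(2))   # ~[0.75 1.25 1.25 1.75]: the four rays leave the
                                # system underdetermined, but the slow cell stands out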

  10. A Review of Transmission Diagnostics Research at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

Zakrajsek, James J.

    1994-01-01

    This paper presents a summary of the transmission diagnostics research work conducted at NASA Lewis Research Center over the last four years. In 1990, the Transmission Health and Usage Monitoring Research Team at NASA Lewis conducted a survey to determine the critical needs of the diagnostics community. Survey results indicated that experimental verification of gear and bearing fault detection methods, improved fault detection in planetary systems, and damage magnitude assessment and prognostics research were all critical to a highly reliable health and usage monitoring system. In response to this, a variety of transmission fault detection methods were applied to experimentally obtained fatigue data. Failure modes of the fatigue data include a variety of gear pitting failures, tooth wear, tooth fracture, and bearing spalling failures. Overall results indicate that, of the gear fault detection techniques, no one method can successfully detect all possible failure modes. The more successful methods need to be integrated into a single more reliable detection technique. A recently developed method, NA4, in addition to being one of the more successful gear fault detection methods, was also found to exhibit damage magnitude estimation capabilities.

  11. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  12. Finger Vein Recognition Using Local Line Binary Pattern

    PubMed Central

    Rosdi, Bakhtiar Affendi; Shing, Chai Wuh; Suandi, Shahrel Azmin

    2011-01-01

    In this paper, a personal verification method using finger veins is presented. Finger vein features can be considered more secure than other hand-based biometric traits such as fingerprints and palm prints because they lie inside the human body. In the proposed method, a new texture descriptor called the local line binary pattern (LLBP) is utilized as the feature extraction technique. The neighbourhood shape in LLBP is a straight line, unlike the square neighbourhood of the local binary pattern (LBP). Experimental results show that the proposed method using LLBP performs better than previous methods using LBP and the local derivative pattern (LDP). PMID:22247670
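
    The line-shaped neighbourhood is the essential difference from LBP, and a hedged sketch of the horizontal LLBP component makes it concrete: each pixel is compared with its neighbours along a straight horizontal line and the comparison bits are packed into a code. The line length and border handling are illustrative choices.

```python
# Hedged sketch of the horizontal LLBP component (line length 7, replicate-padded).
import numpy as np

def llbp_horizontal(img, half_len=3):
    h, w = img.shape
    codes = np.zeros((h, w), dtype=np.uint32)
    offsets = [d for d in range(-half_len, half_len + 1) if d != 0]
    for bit, d in enumerate(offsets):
        xs = np.clip(np.arange(w) + d, 0, w - 1)          # replicate-pad at the borders
        codes |= (img[:, xs] >= img).astype(np.uint32) << bit
    return codes

img = np.random.default_rng(1).integers(0, 256, (8, 8))   # stand-in vein image
print(llbp_horizontal(img))
```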

  13. Experimental verification of a tuned inertial mass electromagnetic transducer

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuta; Sugiura, Keita; Asai, Takehiko

    2018-03-01

    This research reports on the design and experimental verification of a tuned inertial mass electromagnetic transducer (TIMET) for energy harvesting from vibrating large structures and structural vibration control devices. The TIMET consists of a permanent-magnet synchronous motor (PMSM), a rotational mass, and a tuning spring. The PMSM and the rotational mass are connected to a ball screw mechanism so that the rotation of the PMSM is synchronized with the rotational mass. The tuning spring, interfaced to the shaft of the ball screw mechanism, is connected to the vibrating structure. Through this ball screw mechanism, the translational vibration motion of the structure is converted to rotational motion, and mechanical energy is absorbed as electrical energy by the PMSM. Moreover, an amplified equivalent inertial mass effect is obtained by rotating relatively small physical masses. Therefore, when the stiffness of the tuning spring is chosen so that the inertial mass resonates at the natural frequency of the vibrating structure, the PMSM rotates more effectively and the energy it generates increases. The authors design a prototype of the TIMET and carry out experiments using sine and sine sweep waves to show the effectiveness of the tuned inertial mass mechanism. An analytical model of the proposed device is also developed using a curve fitting technique to simulate the behavior of the TIMET.
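
    A hedged back-of-envelope of the tuning condition may clarify the design: the ball screw converts a small rotary inertia into a large equivalent inertial mass, and the tuning spring stiffness is chosen so that this mass resonates at the structure's natural frequency. All numbers below are illustrative, not the prototype's.

```python
# Hedged tuning-condition arithmetic for a tuned inertial mass device.
import math

lead = 0.01                  # ball screw lead [m/rev] (illustrative)
J = 2.0e-4                   # rotary inertia of PMSM + rotational mass [kg m^2]
f_n = 1.0                    # natural frequency of the vibrating structure [Hz]

m_eq = J * (2 * math.pi / lead) ** 2      # amplified equivalent inertial mass [kg]
k = m_eq * (2 * math.pi * f_n) ** 2       # tuning spring stiffness [N/m]
print(f"equivalent mass {m_eq:.0f} kg, tuning stiffness {k:.0f} N/m")
```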

  14. Evaluating DFT for Transition Metals and Binaries: Developing the V/DM-17 Test Set

    NASA Astrophysics Data System (ADS)

    Decolvenaere, Elizabeth; Mattsson, Ann

    We have developed the V/DM-17 test set to evaluate the accuracy of DFT calculations for transition metals against experiment. When simulation and experiment disagree, the disconnect in length scales and temperatures makes determining "who is right" difficult. However, methods to evaluate the experimental accuracy of functionals in the context of solid-state materials science, especially for transition metals, are lacking. As DFT shifts from a descriptive to a predictive tool, these issues of verification become increasingly important. With undertakings like the Materials Project leading the way in high-throughput predictions and discoveries, the development of a one-size-fits-most approach to verification is critical. Our test set evaluates 26 transition metal elements and 80 transition metal alloys across three physical observables: lattice constants, elastic coefficients, and formation energies of alloys. Whether or not the formation energy can be reproduced measures whether the relevant physics is captured in a calculation. This is an especially important question for transition metals, where active d-electrons can thwart commonly used techniques. In testing the V/DM-17 test set, we offer new views into the performance of existing functionals. Sandia National Labs is a multi-mission laboratory managed and operated by Sandia Corp., a wholly owned subsidiary of Lockheed Martin Corp., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  15. Visualization study of counterflow in superfluid 4He using metastable helium molecules.

    PubMed

    Guo, W; Cahn, S B; Nikkel, J A; Vinen, W F; McKinsey, D N

    2010-07-23

    Heat is transferred in superfluid 4He via a process known as thermal counterflow. It has been known for many years that above a critical heat current the superfluid component in this counterflow becomes turbulent. It has been suspected that the normal-fluid component may become turbulent as well, but experimental verification is difficult without a technique for visualizing the flow. Here we report a series of visualization studies on the normal-fluid component in a thermal counterflow performed by imaging the motion of seeded metastable helium molecules using a laser-induced-fluorescence technique. We present evidence that the flow of the normal fluid is indeed turbulent at relatively large velocities. Thermal counterflow in which both components are turbulent presents us with a theoretically challenging type of turbulent behavior that is new to physics.

  16. An algebraic iterative reconstruction technique for differential X-ray phase-contrast computed tomography.

    PubMed

    Fu, Jian; Schleede, Simone; Tan, Renbo; Chen, Liyuan; Bech, Martin; Achterhold, Klaus; Gifford, Martin; Loewen, Rod; Ruth, Ronald; Pfeiffer, Franz

    2013-09-01

    Iterative reconstruction has a wide spectrum of proven advantages in the field of conventional X-ray absorption-based computed tomography (CT). In this paper, we report on an algebraic iterative reconstruction technique for grating-based differential phase-contrast CT (DPC-CT). Due to the differential nature of DPC-CT projections, a differential operator and a smoothing operator are added to the iterative reconstruction, compared to the one commonly used for absorption-based CT data. This work comprises a numerical study of the algorithm and its experimental verification using a dataset measured at a two-grating interferometer setup. Since the algorithm is easy to implement and allows for the extension to various regularization possibilities, we expect a significant impact of the method for improving future medical and industrial DPC-CT applications. Copyright © 2012. Published by Elsevier GmbH.
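
    To make the added operators tangible, here is a hedged, heavily simplified sketch of the idea: because DPC-CT measures the derivative of the projections, a finite-difference operator is composed with the projector inside an algebraic (Landweber-style) loop, and a mild smoothing step tames the noise that differentiation amplifies. The toy projector and operators are illustrative, not the paper's implementation.

```python
# Hedged sketch: algebraic iteration on differential projection data.
import numpy as np

def reconstruct(A, p_diff, n_iter=300, step=0.1):
    m, n = A.shape
    D = np.eye(m) - np.eye(m, k=-1)                   # forward-difference operator
    M = D @ A                                         # differential projector
    x = np.zeros(n)
    for _ in range(n_iter):
        x += step * M.T @ (p_diff - M @ x)            # algebraic (Landweber) update
        x[1:-1] = (x[:-2] + 2 * x[1:-1] + x[2:]) / 4  # smoothing operator
    return x

A = np.tril(np.ones((6, 6)))                           # toy projector (cumulative sum)
x_true = np.array([0, 1, 2, 2, 1, 0], float)
p_diff = (np.eye(6) - np.eye(6, k=-1)) @ (A @ x_true)  # differential data
print(reconstruct(A, p_diff).round(2))                 # near x_true, up to smoothing bias
```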

  17. Experimental Verification of the Use of Metal Filled Via Hole Fences for Crosstalk Control of Microstrip Lines in LTCC Packages

    NASA Technical Reports Server (NTRS)

    Ponchak, George E.; Chun, Donghoon; Katehi, Linda P. B.; Yook, Jong-Gwan

    1999-01-01

    Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior 3D-FEM electromagnetic simulations have shown that metal filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3D-FEM simulations is demonstrated for commercially fabricated LTCC packages.

  18. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.

  19. Development of Novel Treatment Plan Verification Techniques for Prostate Intensity Modulation Arc Therapy

    DTIC Science & Technology

    2010-03-01

    ... is to develop a novel, clinically useful delivered-dose verification protocol for modern prostate VMAT using an Electronic Portal Imaging Device (EPID) and onboard Cone Beam Computed Tomography (CBCT). A number of important milestones have been accomplished, including (i) a calibrated CBCT HU vs. electron density curve; (ii) ...

  20. FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.

    PubMed

    Gu, Ming; Chakrabartty, Shantanu

    2013-08-01

    This paper presents a computer aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).

  1. The species translation challenge—A systems biology perspective on human and rat bronchial epithelial cells

    PubMed Central

    Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2014-01-01

    Understanding the biological responses to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, but these responses cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to ‘translate’ the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated that comprised phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of the multi-layer omics dataset accessible in public repositories for further intra- and inter-species translation studies. PMID:25977767

  2. The species translation challenge-a systems biology perspective on human and rat bronchial epithelial cells.

    PubMed

    Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2014-01-01

    Understanding the biological responses to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, but these responses cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to 'translate' the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated that comprised phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of the multi-layer omics dataset accessible in public repositories for further intra- and inter-species translation studies.

  3. 77 FR 64596 - Proposed Information Collection (Income Verification) Activity: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-22

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0518] Proposed Information Collection (Income... to income- dependent benefits. DATES: Written comments and recommendations on the proposed collection... techniques or the use of other forms of information technology. Title: Income Verification, VA Form 21-0161a...

  4. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

    Remote sensing satellite camera imaging simulation technology is broadly used to evaluate satellite imaging quality and to test data application systems, but the simulation precision is hard to assess. In this paper, we propose an experimental simulation verification method based on comparison under test parameter variation. For a simulation model based on ray tracing, the experiment verifies the model precision by changing the types of devices that correspond to the parameters of the model. The experimental results show that the similarity between the image produced by the ray-tracing model and the experimental image is 91.4%, indicating that the model simulates the remote sensing satellite imaging system well.

  5. Single-molecule experiments in biological physics: methods and applications.

    PubMed

    Ritort, F

    2006-08-16

    I review single-molecule experiments (SMEs) in biological physics. Recent technological developments have provided the tools to design and build scientific instruments of high enough sensitivity and precision to manipulate and visualize individual molecules and measure microscopic forces. Using SMEs it is possible to manipulate molecules one at a time and measure distributions describing molecular properties, characterize the kinetics of biomolecular reactions and detect molecular intermediates. SMEs provide additional information about thermodynamics and kinetics of biomolecular processes. This complements information obtained in traditional bulk assays. In SMEs it is also possible to measure small energies and detect large Brownian deviations in biomolecular reactions, thereby offering new methods and systems to scrutinize the basic foundations of statistical mechanics. This review is written at a very introductory level, emphasizing the importance of SMEs to scientists interested in knowing the common playground of ideas and the interdisciplinary topics accessible by these techniques. The review discusses SMEs from an experimental perspective, first exposing the most common experimental methodologies and later presenting various molecular systems where such techniques have been applied. I briefly discuss experimental techniques such as atomic-force microscopy (AFM), laser optical tweezers (LOTs), magnetic tweezers (MTs), biomembrane force probes (BFPs) and single-molecule fluorescence (SMF). I then present several applications of SME to the study of nucleic acids (DNA, RNA and DNA condensation) and proteins (protein-protein interactions, protein folding and molecular motors). Finally, I discuss applications of SMEs to the study of the nonequilibrium thermodynamics of small systems and the experimental verification of fluctuation theorems. I conclude with a discussion of open questions and future perspectives.

  6. TOPICAL REVIEW: Single-molecule experiments in biological physics: methods and applications

    NASA Astrophysics Data System (ADS)

    Ritort, F.

    2006-08-01

    I review single-molecule experiments (SMEs) in biological physics. Recent technological developments have provided the tools to design and build scientific instruments of high enough sensitivity and precision to manipulate and visualize individual molecules and measure microscopic forces. Using SMEs it is possible to manipulate molecules one at a time and measure distributions describing molecular properties, characterize the kinetics of biomolecular reactions and detect molecular intermediates. SMEs provide additional information about thermodynamics and kinetics of biomolecular processes. This complements information obtained in traditional bulk assays. In SMEs it is also possible to measure small energies and detect large Brownian deviations in biomolecular reactions, thereby offering new methods and systems to scrutinize the basic foundations of statistical mechanics. This review is written at a very introductory level, emphasizing the importance of SMEs to scientists interested in knowing the common playground of ideas and the interdisciplinary topics accessible by these techniques. The review discusses SMEs from an experimental perspective, first exposing the most common experimental methodologies and later presenting various molecular systems where such techniques have been applied. I briefly discuss experimental techniques such as atomic-force microscopy (AFM), laser optical tweezers (LOTs), magnetic tweezers (MTs), biomembrane force probes (BFPs) and single-molecule fluorescence (SMF). I then present several applications of SME to the study of nucleic acids (DNA, RNA and DNA condensation) and proteins (protein-protein interactions, protein folding and molecular motors). Finally, I discuss applications of SMEs to the study of the nonequilibrium thermodynamics of small systems and the experimental verification of fluctuation theorems. I conclude with a discussion of open questions and future perspectives.

  7. Unravelling the electrochemical double layer by direct probing of the solid/liquid interface

    DOE PAGES

    Favaro, Marco; Jeong, Beomgyun; Ross, Philip N.; ...

    2016-08-31

    The electrochemical double layer plays a critical role in electrochemical processes. Whilst there have been many theoretical models predicting structural and electrical organization of the electrochemical double layer, the experimental verification of these models has been challenging due to the limitations of available experimental techniques. The induced potential drop in the electrolyte has never been directly observed and verified experimentally, to the best of our knowledge. In this study, we report the direct probing of the potential drop as well as the potential of zero charge by means of ambient pressure X-ray photoelectron spectroscopy performed under polarization conditions. By analyzing the spectra of the solvent (water) and a spectator neutral molecule with numerical simulations of the electric field, we discern the shape of the electrochemical double layer profile. In addition, we determine how the electrochemical double layer changes as a function of both the electrolyte concentration and applied potential.

  8. Experimental Observation of Thin-shell Instability in a Collisionless Plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed, H.; Doria, D.; Sarri, G.

    We report on the experimental observation of the instability of a plasma shell, which formed during the expansion of a laser-ablated plasma into a rarefied ambient medium. By means of a proton radiography technique, the evolution of the instability is temporally and spatially resolved on a timescale much shorter than the hydrodynamic one. The density of the thin shell exceeds that of the surrounding plasma, which lets electrons diffuse outward. An ambipolar electric field grows on both sides of the thin shell that is antiparallel to the density gradient. Ripples in the thin shell result in a spatially varying balance between the thermal pressure force mediated by this field and the ram pressure force that is exerted on it by the inflowing plasma. This mismatch amplifies the ripples by the same mechanism that drives the hydrodynamic nonlinear thin-shell instability (NTSI). Our results thus constitute the first experimental verification that the NTSI can develop in colliding flows.

  9. Experimental Observation of Thin-shell Instability in a Collisionless Plasma

    NASA Astrophysics Data System (ADS)

    Ahmed, H.; Doria, D.; Dieckmann, M. E.; Sarri, G.; Romagnani, L.; Bret, A.; Cerchez, M.; Giesecke, A. L.; Ianni, E.; Kar, S.; Notley, M.; Prasad, R.; Quinn, K.; Willi, O.; Borghesi, M.

    2017-01-01

    We report on the experimental observation of the instability of a plasma shell, which formed during the expansion of a laser-ablated plasma into a rarefied ambient medium. By means of a proton radiography technique, the evolution of the instability is temporally and spatially resolved on a timescale much shorter than the hydrodynamic one. The density of the thin shell exceeds that of the surrounding plasma, which lets electrons diffuse outward. An ambipolar electric field grows on both sides of the thin shell that is antiparallel to the density gradient. Ripples in the thin shell result in a spatially varying balance between the thermal pressure force mediated by this field and the ram pressure force that is exerted on it by the inflowing plasma. This mismatch amplifies the ripples by the same mechanism that drives the hydrodynamic nonlinear thin-shell instability (NTSI). Our results thus constitute the first experimental verification that the NTSI can develop in colliding flows.

  10. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    PubMed

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined to be sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed to obtain the necessary signal enhancement for the identification of chemical species at concentrations of 1 to 10 ng/cm², which are relevant to cleaning verification. The SERS approach was combined with principal component analysis to identify drug compounds recovered from a contaminated steel surface.
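
    As a rough illustration of the chemometric step, the sketch below runs PCA by singular value decomposition on synthetic "spectra": two fake compounds separate into clusters in the space of the leading principal components. The bands and noise level are invented for the example.

```python
# Minimal PCA-by-SVD sketch for separating spectra of two compounds (illustrative data).
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(400, 1800, 300)
peak = lambda c, w: np.exp(-0.5 * ((wavenumbers - c) / w) ** 2)

drug_a = peak(1000, 15) + peak(1450, 20)          # two fake Raman bands
drug_b = peak(800, 15) + peak(1600, 20)
X = np.vstack([d + 0.05 * rng.normal(size=300)
               for d in [drug_a] * 5 + [drug_b] * 5])

Xc = X - X.mean(axis=0)                           # mean-centre the spectra
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * S[:2]                         # projection onto the first 2 PCs
print(scores.round(2))                            # rows 0-4 cluster apart from rows 5-9
```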

  11. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project was assessed. Software verification and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and it is impossible to test all potential interactions, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" approach to addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification; they group into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  12. Nondestructive analysis and development

    NASA Technical Reports Server (NTRS)

    Moslehy, Faissal A.

    1993-01-01

    This final report summarizes the achievements of project #4 of the NASA/UCF Cooperative Agreement from January 1990 to December 1992. The objectives of this project are to review NASA's NDE program at Kennedy Space Center (KSC) and recommend means for enhancing the present testing capabilities through the use of improved or new technologies. During the period of the project, extensive development of a reliable nondestructive, non-contact vibration technique to determine and quantify the bond condition of the thermal protection system (TPS) tiles of the Space Shuttle Orbiter was undertaken. Experimental modal analysis (EMA) is used as a non-destructive technique for the evaluation of Space Shuttle thermal protection system (TPS) tile bond integrity. Finite element (FE) models for tile systems were developed and were used to generate their vibration characteristics (i.e. natural frequencies and mode shapes). Various TPS tile assembly configurations as well as different bond conditions were analyzed. Results of finite element analyses demonstrated a drop in natural frequencies and a change in mode shapes which correlate with both size and location of disbond. Results of experimental testing of tile panels correlated with FE results and demonstrated the feasibility of EMA as a viable technique for tile bond verification. Finally, testing performed on the Space Shuttle Columbia using a laser doppler velocimeter demonstrated the application of EMA, when combined with FE modeling, as a non-contact, non-destructive bond evaluation technique.
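
    A hedged toy model conveys the physical basis of the tile test described above: if the tile-on-bond system is idealized as a mass-spring chain, a disbond appears as a loss of bond stiffness, which lowers the natural frequencies the modal test measures. The masses and stiffnesses below are illustrative, not Shuttle tile properties.

```python
# Toy 2-DOF model: structure -- bond stiffness -- pad mass -- pad stiffness -- tile mass.
import numpy as np

def natural_frequencies(k_bond, m_pad=0.1, m_tile=0.5, k_pad=2.0e5):
    M = np.diag([m_pad, m_tile])
    K = np.array([[k_bond + k_pad, -k_pad],
                  [-k_pad,          k_pad]])
    evals = np.linalg.eigvals(np.linalg.inv(M) @ K)
    return np.sqrt(np.sort(evals.real)) / (2 * np.pi)    # natural frequencies [Hz]

print("bonded  :", natural_frequencies(k_bond=1.0e6).round(1))
print("disbond :", natural_frequencies(k_bond=2.5e5).round(1))   # frequencies drop
```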

  13. Finite element simulation and Experimental verification of Incremental Sheet metal Forming

    NASA Astrophysics Data System (ADS)

    Kaushik Yanamundra, Krishna; Karthikeyan, R., Dr.; Naranje, Vishal, Dr

    2018-04-01

    Incremental sheet metal forming is now a proven manufacturing technique that can produce application-specific, customized, symmetric or asymmetric shapes required by the automobile or biomedical industries, such as car body parts, dental implants, or knee implants. Finite element simulation of the metal forming process is performed successfully using the explicit dynamics analysis of commercial FE software. The simulation is useful mainly for optimizing the process and designing the final product. This paper focuses on simulating the incremental sheet metal forming process in ABAQUS and validating the results experimentally. The test shapes are trapezoidal, dome, and elliptical; their G-codes are written and fed into a CNC milling machine fitted with a forming tool that has a hemispherical tip. The same pre-generated coordinates are used to simulate similar machining conditions in ABAQUS, and the tool forces and the stresses and strains in the workpiece are obtained as output. The forces were recorded experimentally using a dynamometer. The experimental and simulated results were then compared and conclusions drawn.

  14. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Derivations and Verification of Plans. Volume 1

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K, Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
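
    A minimal sketch of the acceptance decision in a variables plan (the "k-method" for an upper specification limit) may help; the plan parameters n and k below are illustrative, not the NESC-recommended values.

```python
# Variables acceptance sampling, k-method: accept the lot when the margin from
# the sample mean to the specification limit is at least k sample standard deviations.
import numpy as np

def accept_lot(measurements, usl, k):
    x = np.asarray(measurements, dtype=float)
    return (usl - x.mean()) / x.std(ddof=1) >= k

rng = np.random.default_rng(2)
lot = rng.normal(9.0, 0.5, size=30)          # n = 30 measured units (synthetic)
print(accept_lot(lot, usl=10.0, k=1.72))     # True if the lot meets the plan
```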

  15. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  16. Cluster man/system design requirements and verification. [for Skylab program

    NASA Technical Reports Server (NTRS)

    Watters, H. H.

    1974-01-01

    Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing man/system design verification. The foremost lesson learned from this experience of anticipating design needs and verifying the design is the necessity of allowing for human capabilities of in-flight maintenance and repair. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events which were carried out in spite of poor design provisions for maintenance.

  17. Optimization of magnetization transfer measurements: statistical analysis by stochastic simulation. Application to creatine kinase kinetics.

    PubMed

    Rydzy, M; Deslauriers, R; Smith, I C; Saunders, J K

    1990-08-01

    A systematic study was performed to optimize the accuracy of kinetic parameters derived from magnetization transfer measurements. Three techniques were investigated: time-dependent saturation transfer (TDST), saturation recovery (SRS), and inversion recovery (IRS). In the last two methods, one of the resonances undergoing exchange is saturated throughout the experiment. The three techniques were compared with respect to the accuracy of the kinetic parameters derived from experiments performed in a given, fixed amount of time. Stochastic simulation of magnetization transfer experiments was performed to optimize experimental design. General formulas for the relative accuracies of the unidirectional rate constant (k) were derived for each of the three experimental methods. It was calculated that for k values between 0.1 and 1.0 s⁻¹, T1 values between 1 and 10 s, and relaxation delays appropriate for the creatine kinase reaction, the SRS method yields more accurate values of k than does the IRS method. The TDST method is more accurate than the SRS method for reactions where T1 is long and k is large, within the range of k and T1 values examined. Experimental verification of the method was carried out on a solution in which the forward (PCr → ATP) rate constant (kf) of the creatine kinase reaction was measured.
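
    The arithmetic behind the saturation methods is compact enough to sketch (hedged; a textbook Forsen-Hoffman treatment with illustrative values rather than the paper's): with the partner resonance saturated, the observed magnetization relaxes to a steady state at an apparent rate R = 1/T1 + k, and k falls out of the steady-state ratio.

```python
# Saturation-transfer arithmetic (two-site exchange, partner resonance saturated).
import numpy as np

T1, k_true, M0 = 3.0, 0.4, 1.0           # T1 [s], exchange rate k [1/s], equilibrium Mz
R = 1.0 / T1 + k_true                    # apparent relaxation rate under saturation
Mss = M0 / (1.0 + k_true * T1)           # steady-state magnetization

t = np.linspace(0, 10, 6)
Mz = Mss + (M0 - Mss) * np.exp(-R * t)   # saturation-transfer time course

k_est = R * (1.0 - Mss / M0)             # recover the rate constant
print(Mz.round(3), f"k = {k_est:.2f} 1/s")
```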

  18. Nonlinear Earthquake Analysis of Reinforced Concrete Frames with Fiber and Bernoulli-Euler Beam-Column Element

    PubMed Central

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames. A predictor-corrector form of the Bossak-α method is applied as the dynamic integration scheme. To verify the numerical solutions, experimental data for an RC column element are compared with numerical results obtained from the proposed solution technique. Furthermore, nonlinear cyclic analysis results for a reinforced concrete portal frame are used to compare the proposed solution technique with a fibre element based on the flexibility method. Finally, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load. The damage regions, propagation, and intensities according to both approaches are examined. PMID:24578667

  19. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    PubMed

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, the computational predictions achieved 40% accuracy, and three new incednine-binding proteins were identified. This study revealed that the proposed protocol of predicting target proteins by combining in silico screening and experimental verification is useful, and it provides new insight into strategies for identifying the target proteins of small molecules.

  20. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part II—Experimental Implementation

    PubMed Central

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

    Coordinate measuring machines (CMM) are the main instruments of measurement in laboratories and in industrial quality control. A compensation error model was formulated in Part I, integrating error and uncertainty in the feature measurement model. Experimental verification of this model is carried out through direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table, and roundness of a precision glass hemisphere are presented under repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as is its practical use and its capability to contribute to the improvement of current standard CMM measuring capabilities. PMID:27754441

  1. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave number uncertainty. The uncertainty in position is taken to be the slit width. For the…

  2. Simulation of Laboratory Tests of Steel Arch Support

    NASA Astrophysics Data System (ADS)

    Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof

    2017-03-01

    The total load-bearing capacity of steel arch yielding roadway supports is among their most important characteristics. These values can be obtained in two ways: experimental measurement in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming; however, a computer model must be properly tuned, and experimental verification is very valuable for this. In the cooperating workplaces of GIG Katowice, VSB-Technical University of Ostrava, and the Institute of Geonics ASCR, this verification was successful. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used in further calculations to obtain the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the material properties of the steel, torque values, and friction coefficient values, can then be determined relatively quickly by changing the properties of the modelled steel arch supports.

  3. Experimental measurement-device-independent verification of quantum steering

    NASA Astrophysics Data System (ADS)

    Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.

    2015-01-01

    Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  4. Experimental measurement-device-independent verification of quantum steering.

    PubMed

    Kocsis, Sacha; Hall, Michael J W; Bennet, Adam J; Saunders, Dylan J; Pryde, Geoff J

    2015-01-07

    Bell non-locality between distant quantum systems--that is, joint correlations which violate a Bell inequality--can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  5. Target Capturing Control for Space Robots with Unknown Mass Properties: A Self-Tuning Method Based on Gyros and Cameras.

    PubMed

    Li, Zhenyu; Wang, Bin; Liu, Hong

    2016-08-30

    Satellite capturing with free-floating space robots is still a challenging task due to the non-fixed base and unknown mass property issues. In this paper gyro and eye-in-hand camera data are adopted as an alternative choice for solving this problem. For this improved system, a new modeling approach that reduces the complexity of system control and identification is proposed. With the newly developed model, the space robot is equivalent to a ground-fixed manipulator system. Accordingly, a self-tuning control scheme is applied to handle such a control problem including unknown parameters. To determine the controller parameters, an estimator is designed based on the least-squares technique for identifying the unknown mass properties in real time. The proposed method is tested with a credible 3-dimensional ground verification experimental system, and the experimental results confirm the effectiveness of the proposed control scheme.
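
    To illustrate the estimation step, the following is a hedged sketch of a recursive least-squares (RLS) estimator of the kind the paper describes: unknown parameters entering linearly as y = phi . theta are refined sample by sample in real time. The regression model, dimensions, and noise are generic stand-ins, not the paper's dynamics.

```python
# Generic recursive least-squares (RLS) parameter estimator (illustrative).
import numpy as np

class RLS:
    def __init__(self, n, lam=0.99):
        self.theta = np.zeros(n)          # parameter estimate
        self.P = 1e4 * np.eye(n)          # estimate covariance
        self.lam = lam                    # forgetting factor

    def update(self, phi, y):
        g = self.P @ phi / (self.lam + phi @ self.P @ phi)   # gain vector
        self.theta += g * (y - phi @ self.theta)
        self.P = (self.P - np.outer(g, phi @ self.P)) / self.lam
        return self.theta

rng = np.random.default_rng(3)
true_theta = np.array([12.0, 0.8])        # e.g. a mass and a first-moment term
est = RLS(2)
for _ in range(200):
    phi = rng.normal(size=2)              # regressor built from gyro/camera data
    y = phi @ true_theta + 0.01 * rng.normal()
    est.update(phi, y)
print(est.theta.round(3))                 # converges near [12.0, 0.8]
```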

  6. Target Capturing Control for Space Robots with Unknown Mass Properties: A Self-Tuning Method Based on Gyros and Cameras

    PubMed Central

    Li, Zhenyu; Wang, Bin; Liu, Hong

    2016-01-01

    Satellite capturing with free-floating space robots is still a challenging task due to the non-fixed base and unknown mass property issues. In this paper gyro and eye-in-hand camera data are adopted as an alternative choice for solving this problem. For this improved system, a new modeling approach that reduces the complexity of system control and identification is proposed. With the newly developed model, the space robot is equivalent to a ground-fixed manipulator system. Accordingly, a self-tuning control scheme is applied to handle such a control problem including unknown parameters. To determine the controller parameters, an estimator is designed based on the least-squares technique for identifying the unknown mass properties in real time. The proposed method is tested with a credible 3-dimensional ground verification experimental system, and the experimental results confirm the effectiveness of the proposed control scheme. PMID:27589748

  7. Experimental Verification of Entanglement Generated in a Plasmonic System.

    PubMed

    Dieleman, F; Tame, M S; Sonnefraud, Y; Kim, M S; Maier, S A

    2017-12-13

    A core process in many quantum tasks is the generation of entanglement. It is being actively studied in a variety of physical settings, from simple bipartite systems to complex multipartite systems. In this work we experimentally study the generation of bipartite entanglement in a nanophotonic system. Entanglement is generated via the quantum interference of two surface plasmon polaritons in a beamsplitter structure, i.e., utilizing the Hong-Ou-Mandel (HOM) effect, and its presence is verified using quantum state tomography. The amount of entanglement is quantified by the concurrence and we find values of up to 0.77 ± 0.04. Verifying entanglement in the output state from HOM interference is a nontrivial task, as its presence cannot be inferred from the visibility alone. The techniques we use to verify entanglement could be applied to other types of photonic system and therefore may be useful for the characterization of a range of different nanophotonic quantum devices.
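
    The concurrence quoted above has a closed form that is easy to check numerically: for a two-qubit density matrix rho, C = max(0, l1 - l2 - l3 - l4), where the l_i are the decreasing square roots of the eigenvalues of rho times its spin-flipped counterpart. A minimal check on a singlet state:

```python
# Wootters concurrence of a two-qubit density matrix.
import numpy as np

def concurrence(rho):
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy                 # spin-flipped state
    lam = np.sqrt(np.clip(np.linalg.eigvals(rho @ rho_tilde).real, 0, None))
    lam = np.sort(lam)[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

psi = np.array([0, 1, -1, 0]) / np.sqrt(2)           # singlet Bell state
rho = np.outer(psi, psi.conj())
print(concurrence(rho))                              # ~1.0: maximally entangled
```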

  8. Note: Ultrasonic gas flowmeter based on optimized time-of-flight algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, X. F.; Tang, Z. A.

    2011-04-15

    A new digital signal processor based single-path ultrasonic gas flowmeter is designed, constructed, and experimentally tested. To achieve high measurement accuracy, the transmitter is driven using an optimized scheme that combines amplitude modulation and phase modulation of the transmit-receive technique. Based on regularities among the zero-crossings of the received envelope, different signal-to-noise ratio conditions of the received signal are discriminated, and the appropriate time-of-flight algorithm is selected for the flow rate calculation. Experimental results from the dry calibration indicate that the designed flowmeter prototype meets the zero-flow verification test requirements of the American Gas Association Report No. 9. Furthermore, the flow calibration results show that the prototype measures flow rate accurately in practical experiments, with nominal errors after FWME adjustment below 0.8% throughout the calibration range.
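
    The transit-time principle behind such a meter reduces to a short calculation (hedged; the path geometry below is illustrative): for a single acoustic path inclined to the pipe axis, the upstream/downstream time difference yields the axial flow velocity independently of the sound speed.

```python
# Transit-time (time-of-flight) flow velocity for a single inclined path.
import math

L = 0.20                       # acoustic path length [m]
theta = math.radians(45)       # path angle to the pipe axis
c, v_true = 343.0, 5.0         # sound speed and axial flow velocity [m/s]

t_down = L / (c + v_true * math.cos(theta))    # pulse travelling with the flow
t_up   = L / (c - v_true * math.cos(theta))    # pulse travelling against the flow

v = L * (t_up - t_down) / (2 * math.cos(theta) * t_up * t_down)
print(f"recovered velocity: {v:.3f} m/s")      # 5.000, independent of c
```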

  9. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification, which involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  10. Determining potential 30/20 GHZ domestic satellite system concepts and establishment of a suitable experimental configuration

    NASA Technical Reports Server (NTRS)

    Stevens, G. H.; Anzic, G.

    1979-01-01

    NASA is conducting a series of millimeter-wave satellite communication system and market studies to: (1) determine potential domestic 30/20 GHz satellite concepts and market potential, and (2) establish the requirements for a suitable technology verification payload which, although intended to be modest in capacity, would sufficiently demonstrate key technologies and experimentally address key operational issues. Preliminary results and critical issues of the current contracted effort are described. Also included is a description of a NASA-developed multibeam satellite payload configuration which may be representative of concepts utilized in a technology flight verification program.

  11. The experimental verification of wall movement influence coefficients for an adaptive walled test section

    NASA Technical Reports Server (NTRS)

    Neal, G.

    1988-01-01

    Flexible walled wind tunnels have for some time been used to reduce wall interference effects at the model. A necessary part of the 3-D wall adjustment strategy being developed for the Transonic Self-Streamlining Wind Tunnel (TSWT) of Southampton University is the use of influence coefficients. The influence of a wall bump on the centerline flow in TSWT has been calculated theoretically using a streamline curvature program. This report details the experimental verification of these influence coefficients and concludes that it is valid to use the theoretically determined values in 3-D model testing.

  12. Analysis and discussion on the experimental data of electrolyte analyzer

    NASA Astrophysics Data System (ADS)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    During the subsequent verification of electrolyte analyzers, we found that an instrument can achieve good repeatability and stability in repeated measurements within a short period of time, in line with the verification regulation requirements for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of different manufacturers differ greatly. In order to identify and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.

  13. Combining Space Geodesy, Seismology, and Geochemistry for Monitoring Verification and Accounting of CO2 in Sequestration Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swart, Peter K.; Dixon, Tim

    2014-09-30

    A series of surface geophysical and geochemical techniques are tested in order to demonstrate and validate low-cost approaches for Monitoring, Verification and Accounting (MVA) of the integrity of deep reservoirs for CO2 storage. These techniques are (i) surface deformation by GPS; (ii) surface deformation by InSAR; (iii) passive source seismology via broadband seismometers; and (iv) soil gas monitoring with a cavity ring-down spectrometer for measurement of CO2 concentration and carbon isotope ratio. The techniques were tested at an active EOR (Enhanced Oil Recovery) site in Texas. Each approach has demonstrated utility. Assuming Carbon Capture, Utilization and Storage (CCUS) activities become operational in the future, these techniques can be used to augment more expensive down-hole techniques.

  14. The use of positron emission tomography in pion radiotherapy.

    PubMed

    Goodman, G B; Lam, G K; Harrison, R W; Bergstrom, M; Martin, W R; Pate, B D

    1986-10-01

    The radioactive debris produced by pion radiotherapy can be imaged by positron emission tomography (PET) as a method of non-invasive, in situ verification of the pion treatment. This paper presents the first visualization of the pion stopping distribution within a tumor in a human brain using PET. Together with the tissue functional information provided by standard PET scans using radiopharmaceuticals, combining pion therapy with the PET technique can provide a much better form of radiotherapy than conventional radiation, in both treatment planning and verification.

  15. A High-Level Language for Modeling Algorithms and Their Properties

    NASA Astrophysics Data System (ADS)

    Akhtar, Sabina; Merz, Stephan; Quinson, Martin

    Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker TLC.

  16. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed, and the technical issues related to proof methodologies were examined and summarized.

  17. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, no solutions have yet been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and because developing a valid FLC does not require a mathematical model of the system, it is quite difficult to prove controller stability using conventional techniques. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the inability to verify FLCs to such requirements limits the applications of this technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the fuzzy logic controller was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.

  18. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  19. Cross-checking of Large Evaluated and Experimental Nuclear Reaction Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeydina, O.; Koning, A.J.; Soppera, N.

    2014-06-15

    Automated methods are presented for the verification of large experimental and evaluated nuclear reaction databases (e.g. EXFOR, JEFF, TENDL). These methods allow an assessment of the overall consistency of the data and detect aberrant values in both evaluated and experimental databases.
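
    As an illustration of the kind of aberrant-value screen such cross-checking can involve (a generic robust-statistics sketch, not the authors' published algorithm; the data are invented):

      import numpy as np

      def flag_aberrant(values, k=4.0):
          """Flag points more than k robust sigmas from the median."""
          v = np.asarray(values, dtype=float)
          med = np.median(v)
          mad = np.median(np.abs(v - med)) or 1e-12   # robust scale estimate
          return np.abs(v - med) / (1.4826 * mad) > k

      cross_sections = [1.02, 0.98, 1.01, 0.99, 5.70, 1.00]   # invented demo data
      print(flag_aberrant(cross_sections))                    # flags only the 5.70 entry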

  20. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
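
    A minimal sketch of the shift-estimation step, assuming a plain FFT cross-correlation in place of the paper's optimized core-detection template (images and the permissible level are invented):

      import numpy as np

      def estimate_shift(img_a, img_b):
          """Estimate the translation between two equal-size images from
          the peak of their FFT-based cross-correlation."""
          corr = np.fft.ifft2(np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))).real
          dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
          h, w = corr.shape
          return (dy if dy <= h // 2 else dy - h, dx if dx <= w // 2 else dx - w)

      img = np.random.rand(64, 64)
      shifted = np.roll(img, (3, -5), axis=(0, 1))
      dy, dx = estimate_shift(shifted, img)        # recovers ~(3, -5)
      if max(abs(dy), abs(dx)) > 8:                # assumed permissible level
          print("shift too large: ask the user to input the fingerprint again")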

  1. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used in safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). PCA is a well-established technique with a wide range of applications, including feature analysis, outlier detection, and gamma-ray spectral analysis. The results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results also showed that difficulties associated with the UF6 filling profile, observed in other unattended passive neutron measurements, can possibly be overcome using the presented approach.
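
    A hedged sketch of the spectral feature-analysis step: principal components of a set of simulated spectra serve as low-dimensional features (random stand-in spectra and labels, not the study's MCNP5/Geant4 data):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      spectra = rng.random((40, 128))     # stand-in: 40 spectra, 128 energy bins
      enrichment = rng.random(40) * 5.0   # assumed U-235 enrichment labels, %

      scores = PCA(n_components=3).fit_transform(spectra)
      # Correlation of the leading score with the labels (meaningless for
      # random data; with real spectra it indicates feature usefulness).
      print(np.corrcoef(scores[:, 0], enrichment)[0, 1])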

  2. Color separation in forensic image processing using interactive differential evolution.

    PubMed

    Mushtaq, Harris; Rahnamayan, Shahryar; Siddiqi, Areeb

    2015-01-01

    Color separation is an image processing technique that has often been used in forensic applications to differentiate among variant colors and to remove unwanted image interference. This process can reveal important information such as covered text or fingerprints in forensic investigation procedures. However, several limitations prevent users from selecting the appropriate parameters pertaining to the desired and undesired colors. This study proposes the hybridization of interactive differential evolution (IDE) and a color separation technique that no longer requires users to guess the required control parameters. The IDE algorithm optimizes these parameters in an interactive manner by utilizing human visual judgment to uncover desired objects. A comprehensive experimental verification has been conducted on various sample test images, including heavily obscured texts, texts with subtle color variations, and fingerprint smudges. The advantage of IDE is apparent as it effectively optimizes the color separation parameters at a level indiscernible to the naked eye. © 2014 American Academy of Forensic Sciences.
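
    A minimal sketch of the optimization loop, assuming scipy's differential evolution and stubbing the interactive human judgment with a placeholder score (parameter names and bounds are invented):

      from scipy.optimize import differential_evolution

      def separation_score(params):
          target_hue, tolerance = params
          # Placeholder: the interactive system would ask a user to rate the
          # separated image; here a smooth preference function stands in.
          return (target_hue - 0.35) ** 2 + (tolerance - 0.10) ** 2

      bounds = [(0.0, 1.0), (0.01, 0.5)]   # hypothetical parameter ranges
      result = differential_evolution(separation_score, bounds, seed=1)
      print(result.x)                      # best (target_hue, tolerance)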

  3. Multi-Mounted X-Ray Computed Tomography.

    PubMed

    Fu, Jian; Liu, Zhenzhong; Wang, Jingzheng

    2016-01-01

    Most existing X-ray computed tomography (CT) techniques work in single-mounted mode and need to scan the inspected objects one by one. This is time-consuming and unacceptable for large-scale inspection. In this paper, we report a multi-mounted CT method and its first engineering implementation. It consists of a multi-mounted scanning geometry and a corresponding algebraic iterative reconstruction algorithm. This approach permits CT rotation scanning of multiple objects simultaneously without an increase in penetration thickness or signal crosstalk. Compared with conventional single-mounted methods, it has the potential to improve imaging efficiency and suppress artifacts from beam hardening and scatter. This work comprises a numerical study of the method and its experimental verification using a dataset measured with a developed multi-mounted X-ray CT prototype system. We believe that this technique is of particular interest for pushing the engineering applications of X-ray CT.
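
    A minimal sketch of the algebraic iterative flavor of such reconstruction (a Kaczmarz-style sweep on an invented 3x3 toy system, not the paper's scanner geometry):

      import numpy as np

      def art(A, b, sweeps=50, relax=0.5):
          """Kaczmarz-style algebraic reconstruction: row-by-row updates."""
          x = np.zeros(A.shape[1])
          for _ in range(sweeps):
              for i in range(A.shape[0]):
                  a = A[i]
                  x += relax * (b[i] - a @ x) / (a @ a) * a
          return x

      # Toy system matrix and projections from a known 3-pixel "image".
      A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]])
      print(art(A, A @ np.array([1.0, 2.0, 3.0])))   # recovers ~[1, 2, 3]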

  4. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked and verified against these specifications, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.

  5. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self-test. The results of the first subtask, the definition of simulation hardware, are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self-test techniques are presented. The data sources considered in the search for current techniques are reviewed, and the results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications: readiness tests, fault isolation tests, and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  6. Audio-visual imposture

    NASA Astrophysics Data System (ADS)

    Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard

    2006-05-01

    A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of both audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.
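
    A hedged sketch of the GMM verification step, assuming scikit-learn and random stand-ins for the DCT/speech features: accept when the claimed client model outscores a background model.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      client_train = rng.normal(0.0, 1.0, (500, 12))   # invented client features
      world_train = rng.normal(0.5, 1.5, (500, 12))    # invented background features

      client = GaussianMixture(8, covariance_type="diag").fit(client_train)
      world = GaussianMixture(8, covariance_type="diag").fit(world_train)

      test = rng.normal(0.0, 1.0, (50, 12))
      llr = client.score(test) - world.score(test)     # mean log-likelihood ratio
      print("accept" if llr > 0.0 else "reject")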

  7. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.

  8. Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. D. Habel

    2008-05-20

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.

  9. Spectroscopic and chemical reactivity analysis of D-Myo-Inositol using quantum chemical approach and its experimental verification

    NASA Astrophysics Data System (ADS)

    Mishra, Devendra P.; Srivastava, Anchal; Shukla, R. K.

    2017-07-01

    This paper describes the spectroscopic (¹H and ¹³C NMR, FT-IR and UV-Visible), chemical, nonlinear optical and thermodynamic properties of D-Myo-Inositol using a quantum chemical technique and its experimental verification. The structural parameters of the compound are determined from the optimized geometry by the B3LYP method with the 6-311++G(d,p) basis set. It was found that the optimized parameters thus obtained are almost in agreement with the experimental ones. A detailed interpretation of the infrared spectra of D-Myo-Inositol is also reported in the present work. After optimization, the proton and carbon NMR chemical shifts of the studied compound are calculated using GIAO and the 6-311++G(d,p) basis set. The search for organic materials with improved charge transfer properties requires precise quantum chemical calculations of space-charge density distribution, state and transition dipole moments and HOMO-LUMO states. The nature of the transitions in the observed UV-Visible spectrum of the compound has been studied by time-dependent density functional theory (TD-DFT). The global reactivity descriptors, such as chemical potential, electronegativity, hardness, softness and electrophilicity index, have been calculated using DFT. The thermodynamic calculation related to the title compound was also performed at the B3LYP/6-311++G(d,p) level of theory. The standard statistical thermodynamic functions, namely heat capacity at constant pressure, entropy and enthalpy change, were obtained from the theoretical harmonic frequencies of the optimized molecule. It is observed that the values of heat capacity, entropy and enthalpy increase with increase in temperature from 100 to 1000 K, which is attributed to the enhancement of molecular vibration with increasing temperature.
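
    For reference, the global reactivity descriptors mentioned here follow standard Koopmans-type relations; a short sketch with invented frontier-orbital energies (the paper's values come from its B3LYP/6-311++G(d,p) calculation):

      E_homo, E_lumo = -6.95, -0.52      # eV, illustrative values only

      I, A = -E_homo, -E_lumo            # ionization energy, electron affinity
      chi = (I + A) / 2                  # electronegativity
      mu = -chi                          # chemical potential
      eta = (I - A) / 2                  # hardness
      S = 1 / (2 * eta)                  # softness (one common convention)
      omega = mu ** 2 / (2 * eta)        # electrophilicity index
      print(f"mu={mu:.2f} eV, eta={eta:.2f} eV, S={S:.3f} 1/eV, omega={omega:.2f} eV")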

  10. Isocenter verification for linac‐based stereotactic radiation therapy: review of principles and techniques

    PubMed Central

    Sabet, Mahsheed; O'Connor, Daryl J.; Greer, Peter B.

    2011-01-01

    There have been several manual, semi‐automatic and fully‐automatic methods proposed for verification of the position of the mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator‐based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine.

  11. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.
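
    As an illustration of the kind of requirement involved (a hypothetical property, not one taken from the paper), a liveness specification such as "every inserted card is eventually returned" can be written in LTL as

      $\square\,(\mathit{cardInserted} \rightarrow \lozenge\,\mathit{cardReturned})$

    which SPIN accepts in its ASCII LTL syntax as [] (cardInserted -> <> cardReturned).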

  12. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field regarding its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.

  13. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center, a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
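
    A minimal sketch of the 2x2 contingency-table scores used for such CME arrival verification (the counts are invented):

      # hits (a), false alarms (b), misses (c), correct rejections (d)
      a, b, c, d = 18, 7, 5, 30

      pod = a / (a + c)   # probability of detection
      far = b / (a + b)   # false alarm ratio
      hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))  # Heidke skill score
      print(f"POD={pod:.2f} FAR={far:.2f} HSS={hss:.2f}")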

  14. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  15. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in specifically skilled human resources with sufficient theoretical knowledge in those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process, and promotes a discussion of the main difficulties that can be found together with a possibility for handling those difficulties. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which nowadays has become a common delivery model for many applications because SaaS is typically accessed by users via internet access.

  16. A critique of the hypothesis, and a defense of the question, as a framework for experimentation.

    PubMed

    Glass, David J

    2010-07-01

    Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being: inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goals of experimental science, which is verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework-the query/model approach-which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.

  17. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation.

    PubMed

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

    This paper describes the process of developing a microgripper that is capable of high precision and fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges on its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, pseudorigid body model and finite element analysis was utilized to expedite the prototyping procedure, which leads to the establishment of a high performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages within each individual configuration while mutually compensating the limitations inherent between them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high grade aluminum alloy (Al 7075T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate high level of compliance in comparison to the computational results. A high amplification characteristic and maximum achievable stroke of 100 μm can be achieved.

  18. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation

    NASA Astrophysics Data System (ADS)

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

    This paper describes the process of developing a microgripper that is capable of high precision and fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges on its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, pseudorigid body model and finite element analysis was utilized to expedite the prototyping procedure, which leads to the establishment of a high performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages within each individual configuration while mutually compensating the limitations inherent between them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high grade aluminum alloy (Al 7075T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate high level of compliance in comparison to the computational results. A high amplification characteristic and maximum achievable stroke of 100 μm can be achieved.

  19. Development of eddy current probe for fiber orientation assessment in carbon fiber composites

    NASA Astrophysics Data System (ADS)

    Wincheski, Russell A.; Zhao, Selina

    2018-04-01

    Measurement of the fiber orientation in a carbon fiber composite material is crucial in understanding the load carrying capability of the structure. As manufacturing conditions including resin flow and molding pressures can alter fiber orientation, verification of the as-designed fiber layup is necessary to ensure optimal performance of the structure. In this work, the development of an eddy current probe and data processing technique for analysis of fiber orientation in carbon fiber composites is presented. A proposed directional eddy current probe is modeled and its response to an anisotropic multi-layer conductor simulated. The modeling results are then used to finalize specifications of the eddy current probe. Experimental testing of the fabricated probe is presented for several samples including a truncated pyramid part with complex fiber orientation draped to the geometry for resin transfer molding. The inductively coupled single sided measurement enables fiber orientation characterization through the thickness of the part. The fast and cost-effective technique can be applied as a spot check or as a surface map of the fiber orientations across the structure. This paper will detail the results of the probe design, computer simulations, and experimental results.

  20. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
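
    A minimal sketch of the hop-by-hop idea described in the patent (the topology and the attestation check are invented): each verified node checks its neighbor, and verification downstream of a failed node halts.

      def verify_network(graph, start, attest):
          verified, failed = set(), set()

          def verify(node):
              if node in verified or node in failed:
                  return                       # avoid testing a node twice
              if not attest(node):             # time-based software signature check
                  failed.add(node)
                  return                       # halt verification downstream
              verified.add(node)
              for neighbor in graph.get(node, []):
                  verify(neighbor)

          verify(start)
          return verified, failed

      graph = {"A": ["B"], "B": ["C"], "C": []}
      print(verify_network(graph, "A", lambda n: n != "C"))  # C fails; A, B verified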

  1. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    NASA Astrophysics Data System (ADS)

    Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.

    2017-05-01

    The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes were verified using experimental data such as that from the MEXICO experiment; often, however, the verification against other codes was made only on a very broad scale. Therefore this study attempts first a validation by comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and also a verification by comparing against experimental results from the MEXICO and NEW MEXICO experiments.

  2. Flow Friction or Spontaneous Ignition?

    NASA Technical Reports Server (NTRS)

    Stoltzfus, Joel M.; Gallus, Timothy D.; Sparks, Kyle

    2012-01-01

    "Flow friction," a proposed ignition mechanism in oxygen systems, has proved elusive in attempts at experimental verification. In this paper, the literature regarding flow friction is reviewed and the experimental verification attempts are briefly discussed. Another ignition mechanism, a form of spontaneous combustion, is proposed as an explanation for at least some of the fire events that have been attributed to flow friction in the literature. In addition, the results of a failure analysis performed at NASA Johnson Space Center White Sands Test Facility are presented, and the observations indicate that spontaneous combustion was the most likely cause of the fire in this 2000 psig (14 MPa) oxygen-enriched system.

  3. Non-Equilibrium Fermi Gases

    DTIC Science & Technology

    2016-02-02

    ...understanding is the experimental verification of a new model of light-induced loss spectra, employing continuum-dressed basis states, which agrees in shape and magnitude with all of our...

  4. Cleanup Verification Package for the 118-F-6 Burial Ground

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    H. M. Sulloway

    2008-10-02

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.

  5. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100-500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days), due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e. transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.

  6. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE PAGES

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    2017-10-16

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100-500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days), due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e. transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.

  7. Wind tunnel seeding particles for laser velocimeter

    NASA Technical Reports Server (NTRS)

    Ghorieshi, Anthony

    1992-01-01

    The design of an optimal airfoil has been a major challenge for the aerospace industry. The main objective is to reduce the drag force while increasing the lift force in various environmental air conditions. Experimental verification of theoretical and computational results is a crucial part of the analysis because of errors buried in the solutions due to the assumptions made in theoretical work. Experimental studies are an integral part of a good design procedure; however, empirical data are not always error free, due to environmental obstacles, poor execution, etc. The reduction of errors in empirical data is a major challenge in wind tunnel testing. One recent advance of particular interest is the use of a non-intrusive measurement technique known as laser velocimetry (LV), which allows quantitative flow data to be obtained without introducing flow-disturbing probes. The laser velocimeter is based on measurement of the light scattered by particles present in the flow: it measures the velocity of the seeding particles, not of the flow itself. Therefore, for accurate flow velocity measurement with laser velocimeters, two criteria are investigated: (1) how well the particles track the local flow field, and (2) the light-scattering efficiency required to obtain signals with the LV. In order to demonstrate the concept of predicting the flow velocity by velocity measurement of particle seeding, the theoretical velocity of the gas flow is computed and compared with the experimentally obtained velocity of the particle seeding.
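
    A rough sketch of criterion (1), particle tracking fidelity, via the standard small-particle Stokes response time tau_p = rho_p d_p^2 / (18 mu); all numbers are illustrative assumptions:

      rho_p = 912.0        # seed particle density, kg/m^3 (oil-droplet-like)
      d_p = 1.0e-6         # seed particle diameter, m
      mu = 1.81e-5         # dynamic viscosity of air, Pa*s

      tau_p = rho_p * d_p ** 2 / (18 * mu)   # particle response time
      flow_time = 1.0e-4                     # assumed characteristic flow time, s
      print(f"tau_p = {tau_p:.2e} s, Stokes number = {tau_p / flow_time:.3f}")

    A Stokes number well below one suggests the seed particles follow the local flow faithfully.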

  8. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-11-30

    We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata...embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous...real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.

  9. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed trace semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and to assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.

  10. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  11. Action-based verification of RTCP-nets with CADP

    NASA Astrophysics Data System (ADS)

    Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin

    2015-12-01

    The paper presents an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into the Aldebaran format. The approach makes automatic verification of RTCP-nets possible using the model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled, and several of its crucial properties have been verified to demonstrate the usability of the approach.

  12. Design and Mechanical Evaluation of a Capacitive Sensor-Based Indexed Platform for Verification of Portable Coordinate Measuring Instruments

    PubMed Central

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-01

    In recent years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue required to carry out these types of procedures.

  13. Design and mechanical evaluation of a capacitive sensor-based indexed platform for verification of portable coordinate measuring instruments.

    PubMed

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-02

    In recent years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue required to carry out these types of procedures.

  14. Control designs for low-loss active magnetic bearings: Theory and implementation

    NASA Astrophysics Data System (ADS)

    Wilson, Brian Christopher David

    Active Magnetic Bearings (AMB) have been proposed for use in Electromechanical Flywheel Batteries. In these devices, kinetic energy is stored in a magnetically levitated flywheel which spins in a vacuum. The AMB eliminates all mechanical losses; however, electrical loss, which is proportional to the square of the magnetic flux, is still significant. For efficient operation, the flux bias, which is typically introduced into the electromagnets to improve the AMB stiffness, must be reduced, preferably to zero. This zero-bias (ZB) mode of operation cripples the classical control techniques that are customarily used, and nonlinear control is required. As a compromise between AMB stiffness and efficiency, a new flux bias scheme is proposed, called the generalized complementary flux condition (gcfc). A flux-bias-dependent trade-off exists between AMB stiffness, power consumption, and power loss. This work theoretically develops and experimentally verifies new low-loss AMB control designs which employ the gcfc condition. Particular attention is paid to the removal of the singularity present in the standard nonlinear control techniques when operating in ZB. Experimental verification is conducted on a 6-DOF AMB reaction wheel. Practical aspects of the gcfc implementation, such as flux measurement and flux-bias implementation with voltage-mode amplifiers using IR compensation, are investigated. Comparisons are made between the gcfc bias technique and the standard constant-flux-sum (cfs) bias method. Under typical operating circumstances, theoretical analysis and experimental data show that the new gcfc bias scheme is more efficient in producing the control flux required for rotor stabilization than the ordinary cfs bias strategy.

  15. Direct and full-scale experimental verifications towards ground-satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei

    2013-05-01

    Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method for communication based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we have carried out three independent experiments with a decoy-state QKD system, overcoming all the demanding conditions. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.

  16. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. Furthermore, these simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.

  17. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    NASA Astrophysics Data System (ADS)

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun; Lane, J. Matthew D.

    2018-05-01

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. These simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.

  18. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    DOE PAGES

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun; ...

    2018-05-04

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. Furthermore, these simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.

  19. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures even when they are composed of special unconstrained cursive characters that are superimposed and embellished. This algorithm extends the character-based signature verification technique. The experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  20. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
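
    A minimal sketch of the solution-verification step via Richardson extrapolation, with invented results from three grids refined by a constant ratio r:

      import math

      f_coarse, f_medium, f_fine = 1.0480, 1.0120, 1.0030   # invented grid results
      r = 2.0                                               # grid refinement ratio

      p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
      f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)  # extrapolated estimate
      print(f"observed order p = {p:.2f}, extrapolated solution = {f_exact:.4f}")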

  1. Rudolph A. Marcus and His Theory of Electron Transfer Reactions

    Science.gov Websites

    ...early 1950s and soon discovered ... a strong experimental program at Brookhaven on electron transfer ... experimental work provided the first verification of several of the predictions of his theory. This, in turn ... Marcus theory, namely, experimental evidence for the so-called "inverted region" where rates ...

  2. Experimental verification of radial magnetic levitation force on the cylindrical magnets in ferrofluid dampers

    NASA Astrophysics Data System (ADS)

    Yang, Wenming; Wang, Pengkai; Hao, Ruican; Ma, Buchuan

    2017-03-01

    Analytical and numerical calculation methods for the radial magnetic levitation force on cylindrical magnets in cylindrical vessels filled with ferrofluid were reviewed. An experimental apparatus to measure this force was designed and fabricated, capable of measuring forces in a range of 0-2.0 N with an accuracy of 0.001 N. After calibration, this apparatus was used to study the radial magnetic levitation force experimentally. The results showed that the numerical method overestimates this force, while the analytical ones underestimate it. The maximum deviation between the numerical results and the experimental ones was 18.5%, while that between the experimental results and the analytical ones reached 68.5%. The latter deviation narrowed with the lengthening of the magnets. With the aid of the experimental verification of the radial magnetic levitation force, the effect of the eccentric distance of the magnets on the viscous energy dissipation in ferrofluid dampers could be assessed. It was shown that ignoring the eccentricity of the magnets during the estimation could overestimate the viscous dissipation in ferrofluid dampers.

  3. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  4. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions, and data in the resonance region for both isotopes and natural elements. Finally, the resonance integrals compiled in the EXFOR database are compared with those derived from the evaluated libraries.
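
    The resonance-integral cross-check in the final step can be sketched compactly. Assuming pointwise cross-section data already parsed from an EXFOR entry or an evaluated file, the conventional dilute resonance integral is the integral of sigma(E) dE/E above the cadmium cutoff; the data below are synthetic stand-ins, not actual library values.

        import numpy as np

        def resonance_integral(energy_eV, sigma_barns, e_low=0.5, e_high=1.0e5):
            """Dilute resonance integral RI = integral of sigma(E) dE/E between
            the 0.5 eV cadmium cutoff and an upper bound, computed from
            pointwise data by trapezoidal integration on a log-energy grid."""
            mask = (energy_eV >= e_low) & (energy_eV <= e_high)
            e, s = energy_eV[mask], sigma_barns[mask]
            return np.trapz(s, np.log(e))  # dE/E = d(ln E)

        # Synthetic pointwise data: a 1/v component plus one resonance at 6.7 eV
        e = np.logspace(np.log10(0.5), 5, 2000)                          # eV
        sigma = 10.0 / np.sqrt(e) + 50.0 / (1 + ((e - 6.7) / 0.05)**2)   # barns
        print(f"RI = {resonance_integral(e, sigma):.2f} b")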

  5. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  6. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair relates to item (2) above and permits verification of electrical contact between mating parts.

  7. Experimental Verification of Bayesian Planet Detection Algorithms with a Shaped Pupil Coronagraph

    NASA Astrophysics Data System (ADS)

    Savransky, D.; Groff, T. D.; Kasdin, N. J.

    2010-10-01

    We evaluate the feasibility of applying Bayesian detection techniques to discovering exoplanets using high contrast laboratory data with simulated planetary signals. Background images are generated at the Princeton High Contrast Imaging Lab (HCIL), with a coronagraphic system utilizing a shaped pupil and two deformable mirrors (DMs) in series. Estimates of the electric field at the science camera are used to correct for quasi-static speckle and produce symmetric high contrast dark regions in the image plane. Planetary signals are added in software, or via a physical star-planet simulator which adds a second off-axis point source before the coronagraph with a beam recombiner, calibrated to a fixed contrast level relative to the source. We produce a variety of images, with varying integration times and simulated planetary brightness. We then apply automated detection algorithms such as matched filtering to attempt to extract the planetary signals. This allows us to evaluate the efficiency of these techniques in detecting planets in a high noise regime and eliminating false positives, as well as to test existing algorithms for calculating the required integration times for these techniques to be applicable.
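
    The matched-filtering step has a compact form: cross-correlate the image with a unit-energy template of the off-axis point-spread function (PSF) and read the output as a per-pixel signal-to-noise map. The sketch below is a minimal version under that assumption, with a Gaussian stand-in for the shaped-pupil PSF and synthetic white noise in place of real HCIL frames.

        import numpy as np
        from scipy.signal import fftconvolve

        def matched_filter_snr(image, psf, noise_sigma):
            """Correlate with a unit-energy PSF template; for white noise of
            standard deviation noise_sigma the output is a per-pixel SNR map."""
            template = psf / np.sqrt(np.sum(psf**2))
            score = fftconvolve(image, template[::-1, ::-1], mode="same")
            return score / noise_sigma

        rng = np.random.default_rng(0)
        img = rng.normal(0.0, 1.0, (128, 128))      # residual-noise stand-in
        y, x = np.mgrid[-3:4, -3:4]
        psf = np.exp(-(x**2 + y**2) / 2.0)          # Gaussian stand-in for the PSF
        img[64:71, 80:87] += 4.0 * psf              # injected planetary signal
        snr = matched_filter_snr(img, psf, noise_sigma=1.0)
        print(np.argwhere(snr > 5.0))               # 5-sigma detection candidates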

  8. A numerical study of axisymmetric compressible non-isothermal and reactive swirling flow

    NASA Astrophysics Data System (ADS)

    Tavernetti, William E.; Hafez, Mohamed M.

    2017-09-01

    Non-linear dynamical phenomena in combustion processes are an active area of experimental and theoretical research, in large part because of increasingly strict environmental pressure to make gas turbine engines and industrial burners more efficient. Using numerical methods for steady and unsteady, confined and unconfined compressible flow, this study examines the modeling influence of compressibility for axisymmetric swirling flow. The compressible reactive Navier-Stokes equations in terms of stream function, vorticity, and circulation are used. Results, details of the numerical algorithms, and numerical verification techniques with validation against sources from the literature will be presented. The main goal of this study is to understand how vortex breakdown phenomena are affected when reactant consumption is modeled together with compressibility effects.

  9. Coherent optimal control of photosynthetic molecules

    NASA Astrophysics Data System (ADS)

    Caruso, F.; Montangero, S.; Calarco, T.; Huelga, S. F.; Plenio, M. B.

    2012-04-01

    We demonstrate theoretically that open-loop quantum optimal control techniques can provide efficient tools for the verification of various quantum coherent transport mechanisms in natural and artificial light-harvesting complexes under realistic experimental conditions. To assess the feasibility of possible biocontrol experiments, we introduce the main settings and derive optimally shaped and robust laser pulses that allow for the faithful preparation of specified initial states (such as localized excitation or coherent superposition, i.e., propagating and nonpropagating states) of the photosystem and probe efficiently the subsequent dynamics. With these tools, different transport pathways can be discriminated, which should facilitate the elucidation of genuine quantum dynamical features of photosystems and therefore enhance our understanding of the role that coherent processes may play in actual biological complexes.

  10. Control technology development

    NASA Astrophysics Data System (ADS)

    Schaechter, D. B.

    1982-03-01

    The main objectives of the control technology development task are as follows. The first is to develop control design techniques based on flexible structural models, rather than simple rigid-body models. Since large space structures are distributed-parameter systems, a new degree of freedom, that of sensor/actuator placement, may be exercised to improve control system performance. Another characteristic of large space structures is numerous oscillatory modes within the control bandwidth. Reduced-order controller design models must be developed that produce stable closed-loop systems when combined with the full-order system. Since the date of an actual large-space-structure flight is rapidly approaching, it is vitally important that theoretical developments be tested in actual hardware. Experimental verification is a vital counterpart of all current theoretical developments.

  11. Elastic suspension of a wind tunnel test section

    NASA Technical Reports Server (NTRS)

    Hacker, R.; Rock, S.; Debra, D. B.

    1982-01-01

    Experimental verification of the theory describing arbitrary motions of an airfoil is reported. The experimental apparatus is described. A mechanism was designed to provide two separate degrees of freedom without friction or backlash to mask the small but important aerodynamic effects of interest.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toltz, A; Seuntjens, J; Hoesl, M

    Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification in a prostate treatment case by correlating properties of the detector signal with the water equivalent path length (WEPL). We propose to extend the methodology to verify the range distal to the vertebral body for pediatric CSI cases by placing this small-volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time-signal to WEPL fit for a “scout” beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread-out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage. A Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290).
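
    The calibration step in the Methods amounts to a simple curve fit. A minimal sketch, assuming a handful of scout-beam readings behind known solid-water thicknesses and a low-order polynomial for the signal-to-WEPL mapping (all numbers are hypothetical):

        import numpy as np

        # Hypothetical calibration pairs: a detector time-signal property
        # measured behind known water-equivalent thicknesses of solid water
        signal = np.array([0.92, 0.81, 0.69, 0.55, 0.41, 0.27])    # arbitrary units
        wepl_mm = np.array([50.0, 80.0, 110.0, 140.0, 170.0, 200.0])

        # Low-order polynomial maps a measured signal to WEPL
        to_wepl = np.poly1d(np.polyfit(signal, wepl_mm, deg=2))

        # In treatment: convert an in-vivo diode reading into a range estimate
        measured = 0.62
        print(f"estimated WEPL = {to_wepl(measured):.1f} mm")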

  13. Experimental verification of nanofluid shear-wave reconversion in ultrasonic fields.

    PubMed

    Forrester, Derek Michael; Huang, Jinrui; Pinfield, Valerie J; Luppé, Francine

    2016-03-14

    Here we present the verification of shear-mediated contributions to multiple scattering of ultrasound in suspensions. Acoustic spectroscopy was carried out on suspensions of silica of differing particle sizes and concentrations in water to find the attenuation across a broad range of frequencies. As the particle sizes approach the nanoscale, commonly used multiple scattering models fail to match experimental results. We develop a new model taking into account shear-mediated contributions and find excellent agreement with the attenuation spectra obtained using two types of spectrometer. The results show that shear-wave phenomena must be considered in the ultrasound characterisation of nanofluids even at relatively low concentrations of scatterers smaller than one micrometre in diameter.

  14. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    NASA Astrophysics Data System (ADS)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.
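
    The final torque evaluation can be illustrated numerically. A minimal sketch, assuming the radial and tangential flux-density components are already available on a circular contour in the air gap; the harmonic amplitudes and dimensions below are invented, not the paper's Halbach-array solution.

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability [H/m]

        def maxwell_stress_torque(B_r, B_t, radius, stack_length):
            """Torque from the Maxwell stress tensor on a circular air-gap
            contour: T = (L * r**2 / mu0) * integral of B_r * B_theta dtheta."""
            theta = np.linspace(0.0, 2.0 * np.pi, B_r.size, endpoint=False)
            integrand = B_r * B_t
            return stack_length * radius**2 / MU0 * np.trapz(
                np.append(integrand, integrand[0]), np.append(theta, 2.0 * np.pi))

        # Invented air-gap field of a 4-pole coupling at the mid-gap radius
        theta = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
        B_r = 0.8 * np.sin(2 * theta) + 0.05 * np.sin(6 * theta)   # radial [T]
        B_t = 0.12 * np.sin(2 * theta)                             # tangential [T]
        print(f"T = {maxwell_stress_torque(B_r, B_t, 0.03, 0.04):.2f} N m")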

  15. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

    Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval, and dispute protocols, analyzing their prerequisites, advantages, and disadvantages in relation to security requirements.
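
    The AR construction and signing can be sketched in a few lines. The version below assembles the record as described (biometric hash, validity timestamp, document hash) but, to stay self-contained, substitutes a keyed MAC for the TCPA's asymmetric private-key signature; all key material and input data are hypothetical.

        import hashlib
        import hmac
        import json
        import time

        def make_authentication_record(bio_hash: bytes, doc_hash: bytes,
                                       validity_days: int) -> bytes:
            """Assemble the authentication record (AR): biometric hash,
            validity timestamp, and document hash."""
            record = {
                "bio": bio_hash.hex(),
                "doc": doc_hash.hex(),
                "expires": int(time.time()) + validity_days * 86400,
            }
            return json.dumps(record, sort_keys=True).encode()

        def sign_record(ar: bytes, tcpa_key: bytes) -> bytes:
            # Stand-in for the TCPA signature: a keyed MAC. A real deployment
            # would sign with the TCPA's private key so that verification
            # stations need only the public key.
            return hmac.new(tcpa_key, ar, hashlib.sha256).digest()

        # Enrollment station side (all values hypothetical)
        bio = hashlib.sha256(b"handwritten-signature-features").digest()
        doc = hashlib.sha256(b"id-card-document-data").digest()
        ar = make_authentication_record(bio, doc, validity_days=365)
        sig = sign_record(ar, b"tcpa-secret")   # AR + sig embedded as watermark

        # Verification station side: recompute and compare
        assert hmac.compare_digest(sig, sign_record(ar, b"tcpa-secret"))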

  16. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  17. High-speed autoverifying technology for printed wiring boards

    NASA Astrophysics Data System (ADS)

    Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi

    1996-10-01

    We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to verify pattern features on surface-mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination; this helps to search for patterns around a defect and to apply defect-definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory, where it verified 1,500 defective samples and detected all significant defects with only a 0.1 percent false-alarm rate.

  18. Verification of ANSYS Fluent and OpenFOAM CFD platforms for prediction of impact flow

    NASA Astrophysics Data System (ADS)

    Tisovská, Petra; Peukert, Pavel; Kolář, Jan

    The main goal of the article is verification of the heat transfer coefficient numerically predicted by two CFD platforms, ANSYS Fluent and OpenFOAM, on the problem of impinging flow from a 2D nozzle. Various mesh parameters and solver settings were tested under several boundary conditions and compared to known experimental results. The best solver setting, suitable for further optimization of more complex geometries, is identified.

  19. Experimental Verification of the Theory of Oscillating Airfoils

    NASA Technical Reports Server (NTRS)

    Silverstein, Abe; Joyner, Upshur T

    1939-01-01

    Measurements have been made of the lift on an airfoil in pitching oscillation with a continuous-recording, instantaneous-force balance. The experimental values for the phase difference between the angle of attack and the lift are shown to be in close agreement with theory.

  20. On the assessment of biological life support system operation range

    NASA Astrophysics Data System (ADS)

    Bartsev, Sergey

    Biological life support systems (BLSS) can be used in long-term space missions only if a well-thought-out assessment of the allowable operating range is available. The range has to account both for the permissible working parameters of the BLSS and for the critical level of perturbation of the BLSS stationary state. A direct approach, outlining the range by statistical treatment of experimental data on BLSS destruction, is not applicable for ethical, economic, and time reasons. A mathematical model is the only tool for generalizing experimental data and extrapolating the revealed regularities beyond empirical experience. The problem is that the quality of extrapolation depends on the adequacy of the corresponding model verification, but good verification requires a wide range of experimental data for fitting, which is not achievable for manned experimental BLSS. A possible way to improve the extrapolation quality of inevitably poorly verified models of manned BLSS is to extrapolate the general tendencies obtained from theoretical and experimental investigations of unmanned LSS. The possibilities and limitations of such an approach are discussed.

  1. Performance of high area ratio nozzles for a small rocket thruster

    NASA Technical Reports Server (NTRS)

    Kushida, R. O.; Hermel, J.; Apfel, S.; Zydowicz, M.

    1986-01-01

    Theoretical estimates of supersonic nozzle performance have been compared to experimental test data for nozzles with a 100:1 area-ratio conical contour, a 300:1 optimum contour, and 300:1 nozzles cut off at 200:1 and 100:1. These tests were done on a Hughes Aircraft Company 5-lbf monopropellant hydrazine thruster with chamber pressures ranging from 25 to 135 psia. The analytic method used is the conventional inviscid method of characteristics with corrections for laminar boundary-layer displacement and drag. Replacing the 100:1 conical nozzle with the 300:1 contoured nozzle improved thrust performance by 0.74 percent at a chamber pressure of 25 psia and by 2.14 percent at 135 psia. The data are significant because they provide experimental verification that conventional nozzle design techniques are applicable even where the boundary layer is laminar and displaces as much as 35 percent of the flow at the nozzle exit plane.

  2. Experimental and computational study and development of the bituminous coal entrained-flow air-blown gasifier for IGCC

    NASA Astrophysics Data System (ADS)

    Abaimov, N. A.; Osipov, P. V.; Ryzhkov, A. F.

    2016-10-01

    The paper considers the development of an advanced bituminous coal entrained-flow air-blown gasifier for a high-power integrated gasification combined cycle. Computational fluid dynamics is used as the basic development tool. An experiment on a pressurized entrained-flow gasifier was performed by “NPO CKTI” JSC for verification of the thermochemical processes submodel. The kinetic constants for Kuznetsk bituminous coal (flame coal), obtained by thermal gravimetric analysis, are used in the model. The calculation results obtained with the CFD model are in satisfactory agreement with the experimental data. On the basis of the verified model, an advanced gasifier structure was suggested that permits an increase in the hydrogen content of the synthesis gas and consequently an improvement in gas turbine efficiency. In order to meet the specified requirements, vapor is added at the second stage of the MHI-type gasifier, and the heat necessary for air gasification is compensated by supplemental heating of the blast air.

  3. Experimental Verification of Guided-Wave Lumped Circuits Using Waveguide Metamaterials

    NASA Astrophysics Data System (ADS)

    Li, Yue; Zhang, Zhijun

    2018-04-01

    Through construction and characterization at microwave frequencies, we experimentally demonstrate our recently developed theory of waveguide lumped circuits, i.e., waveguide metatronics [Sci. Adv. 2, e1501790 (2016), 10.1126/sciadv.1501790], as a method to design subwavelength-scaled analog circuits. In the waveguide metatronics paradigm, numbers of lumped inductors and capacitors can be functionally integrated inside the waveguide, a transmission line that is irreplaceable in millimeter-wave and terahertz systems owing to its low radiation loss and low crosstalk. An example of a multiple-order metatronic filter with layered structures is fabricated using the technique of substrate integrated waveguides, which can easily be constructed by the printed-circuit-board process. The materials used in the construction are typical microwave materials with positive permittivity, low loss, and negligible dispersion, imitating the plasmonic materials with negative permittivity in the optical domain. The results verify the theory of waveguide metatronics, which provides an efficient platform for functional lumped-circuit design for guided-wave processing.

  4. Crack growth induced by thermal-mechanical loading

    NASA Astrophysics Data System (ADS)

    John, R.; Hartman, G. A.; Gallagher, J. P.

    1992-06-01

    Advanced aerospace structures are often subjected to combined thermal and mechanical loads. The fracture-mechanics behavior of the structures may be altered by the thermal state existing around the crack. Hence, design of critical structural elements requires the knowledge of stress-intensity factors under both thermal and mechanical loads. This paper describes the development of an experimental technique to verify the thermal-stress-intensity factor generated by a temperature gradient around the crack. Thin plate specimens of a model material (AISI-SAE 1095 steel) were used for the heat transfer and thermal-mechanical fracture tests. Rapid thermal loading was achieved using high-intensity focused infrared spot heaters. These heaters were also used to generate controlled temperature rates for heat-transfer verification tests. The experimental results indicate that thermal loads can generate stress-intensity factors large enough to induce crack growth. The proposed thermal-stress-intensity factors appear to have the same effect as the conventional mechanical-stress-intensity factors with respect to fracture.

  5. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE PAGES

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...

    2017-09-01

    Here, we present a verification study of three simulation techniques for fluid–particle flows: an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model (TFM). We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  6. Safeguardability of the vitrification option for disposal of plutonium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillay, K.K.S.

    1996-05-01

    Safeguardability of the vitrification option for plutonium disposition is rather complex, and there is no experience base in either domestic or international safeguards for this approach. In the present treaty regime between the US and the states of the former Soviet Union, bilateral verifications are considered more likely, with potential for third-party verification of safeguards. There are serious technological limitations to applying conventional bulk-handling-facility safeguards techniques to achieve independent verification of plutonium in borosilicate glass. If vitrification is the final disposition option chosen, maintaining continuity of knowledge of plutonium in glass matrices, especially those containing boron and those spiked with high-level wastes or 137Cs, is beyond the capability of present-day safeguards technologies and nondestructive assay techniques. The alternative to quantitative measurement of fissile content is to maintain continuity of knowledge through a combination of containment and surveillance, which is not the international norm for bulk handling facilities.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marleau, Peter; Brubaker, Erik; Deland, Sharon M.

    This report summarizes the discussion and conclusions reached during a tabletop exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP), presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach, as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but it requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, and one the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.

  9. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method; rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small-satellite software and to mention the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small-satellite software verification and validation. These methods need to be further advanced to deal with the state-explosion problem and made more usable, so that small-satellite software engineers can apply them to software verification routinely. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or that are produced after launch through the effects of ionizing radiation.
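
    Run-time monitoring, the last approach in the survey, is simple to picture: a monitor evaluates state invariants every control cycle and hands violations to the fault-tolerant layer. A minimal sketch in which the invariants and state fields are invented for illustration:

        import math

        class RuntimeMonitor:
            """Checks state invariants each control cycle and records
            violations for a fault-handling layer to act upon."""
            def __init__(self):
                self.violations = []

            def check(self, t, state):
                if not 0.0 <= state["battery_soc"] <= 1.0:
                    self.violations.append((t, "battery state of charge out of [0, 1]"))
                if abs(state["spin_rate_dps"]) > 10.0:
                    self.violations.append((t, "spin rate exceeds 10 deg/s limit"))
                if any(math.isnan(v) for v in state.values()):
                    self.violations.append((t, "NaN in state vector"))

        monitor = RuntimeMonitor()
        monitor.check(t=42.0, state={"battery_soc": 0.87, "spin_rate_dps": 12.5})
        print(monitor.violations)   # [(42.0, 'spin rate exceeds 10 deg/s limit')]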

  12. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  13. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means of providing greater autonomy to automated systems, making them capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state-space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem of a prototype space vehicle.

  14. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  15. Recent literature on structural modeling, identification, and analysis

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.

    1990-01-01

    The literature on the mathematical modeling of large space structures is first reviewed, with attention given to continuum models, model order reduction, substructuring, and computational techniques. System identification and mode verification are then discussed with reference to the verification of mathematical models of large space structures. In connection with analysis, the paper surveys recent research on eigensolvers and dynamic response solvers for large-order finite-element-based models.

  16. Navier-Stokes simulation with constraint forces: finite-difference method for particle-laden flows and complex geometries.

    PubMed

    Höfler, K; Schwarzer, S

    2000-06-01

    Building on an idea of Fogelson and Peskin [J. Comput. Phys. 79, 50 (1988)], we describe the implementation and verification of a simulation technique for systems of non-Brownian particles in fluids at Reynolds numbers up to about 20 on the particle scale. This direct simulation technique fills a gap between simulations in the viscous regime and high-Reynolds-number modeling. It combines sufficient computational accuracy with numerical efficiency and allows studies of several thousand, in principle arbitrarily shaped, extended and hydrodynamically interacting particles on regular workstations. We verify the algorithm in two and three dimensions for (i) single falling particles and (ii) a fluid flowing through a bed of fixed spheres. In the context of sedimentation, we compute the volume-fraction dependence of the mean sedimentation velocity. The results are compared with experimental and other numerical results, both in the viscous and the inertial regime, and we find very satisfactory agreement.
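
    The constraint-force idea can be conveyed with a one-dimensional toy: inside the cells covered by a rigid particle, apply the body force that drives the local fluid velocity to the rigid-body velocity within one time step. This direct-forcing sketch is only in the spirit of the technique above, not the authors' actual scheme.

        import numpy as np

        def constraint_force(u_fluid, particle_mask, u_particle, dt, rho=1.0):
            """Body force that would bring the fluid velocity inside the
            particle footprint to the rigid-body velocity in one time step."""
            f = np.zeros_like(u_fluid)
            f[particle_mask] = rho * (u_particle - u_fluid[particle_mask]) / dt
            return f

        # Tiny 1D illustration on a 64-cell grid
        u = np.linspace(0.0, 1.0, 64)                        # fluid velocity field
        mask = (np.arange(64) > 20) & (np.arange(64) < 28)   # cells under the particle
        f = constraint_force(u, mask, u_particle=0.5, dt=1e-3)
        u_new = u + 1e-3 * f    # forcing step only; advection/diffusion omitted
        assert np.allclose(u_new[mask], 0.5)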

  17. Cancer Bioinformatics for Updating Anticancer Drug Developments and Personalized Therapeutics.

    PubMed

    Lu, Da-Yong; Qu, Rong-Xin; Lu, Ting-Ren; Wu, Hong-Ying

    2017-01-01

    Over the last two to three decades, the world has witnessed rapid progress in biomarker and bioinformatics technologies. Cancer bioinformatics is one important omics branch for experimental and clinical studies and applications. Like other biological techniques or systems, bioinformatics techniques will be widely used, but they are presently not all-powerful. Despite great popularity and improvement, cancer bioinformatics has its own limitations and shortcomings at this stage of technical advancement. This article offers a panorama of bioinformatics in cancer research and clinical therapeutic applications, covering possible advantages and limitations relating to cancer therapeutics. A number of beneficial capabilities and outcomes are described. A successful new era for cancer bioinformatics awaits if we adhere to scientific study of cancer bioinformatics in malignant-origin mining, medical verification, and clinical diagnostic applications. Cancer bioinformatics is of great significance for disease diagnosis and therapeutic prediction. Many creative ideas and future perspectives are highlighted.

  18. Multi-Mounted X-Ray Computed Tomography

    PubMed Central

    Fu, Jian; Liu, Zhenzhong; Wang, Jingzheng

    2016-01-01

    Most existing X-ray computed tomography (CT) techniques work in single-mounted mode and need to scan inspected objects one at a time, which is time-consuming and unacceptable for large-scale inspection. In this paper, we report a multi-mounted CT method and its first engineering implementation. It consists of a multi-mounted scanning geometry and a corresponding algebraic iterative reconstruction algorithm. This approach permits CT rotation scanning of multiple objects simultaneously without increased penetration thickness or signal crosstalk. Compared with conventional single-mounted methods, it has the potential to improve imaging efficiency and suppress artifacts from beam hardening and scatter. This work comprises a numerical study of the method and its experimental verification using a dataset measured with a multi-mounted X-ray CT prototype system developed for the purpose. We believe this technique is of particular interest for advancing the engineering applications of X-ray CT. PMID:27073911
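
    Algebraic iterative reconstruction of the kind used here is often built on Kaczmarz sweeps: the image estimate is projected onto the hyperplane of each measured ray in turn. A minimal dense-matrix sketch on a toy 2x2 image follows; the paper's actual system matrix and multi-mounted geometry are not reproduced.

        import numpy as np

        def art_reconstruct(A, b, n_iter=200, relax=0.5):
            """Kaczmarz-style algebraic reconstruction: A is the system matrix
            (rays x voxels), b the measured line integrals."""
            x = np.zeros(A.shape[1])
            row_norms = np.einsum("ij,ij->i", A, A)
            for _ in range(n_iter):
                for i in range(A.shape[0]):
                    if row_norms[i] > 0.0:
                        x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
            return x

        # Toy 2x2 "image" probed by four rays (two rows, two columns)
        A = np.array([[1., 1., 0., 0.],
                      [0., 0., 1., 1.],
                      [1., 0., 1., 0.],
                      [0., 1., 0., 1.]])
        x_true = np.array([1.0, 2.0, 3.0, 4.0])
        print(art_reconstruct(A, A @ x_true).round(2))   # recovers the toy image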

  19. Development of neural network techniques for finger-vein pattern classification

    NASA Astrophysics Data System (ADS)

    Wu, Jian-Da; Liu, Chiung-Tsiung; Tsai, Yi-Jang; Liu, Jun-Ching; Chang, Ya-Wen

    2010-02-01

    A personal identification system using finger-vein patterns and neural network techniques is proposed in the present study. In the proposed system, the finger-vein patterns are captured by a device that transmits near-infrared light through the finger and records the patterns for signal analysis and classification. The biometric verification system combines feature extraction using principal component analysis with pattern classification using both a back-propagation network and an adaptive neuro-fuzzy inference system. Finger-vein features are first extracted by principal component analysis, which reduces the computational burden and removes noise residing in the discarded dimensions. The features are then used in pattern classification and identification. To verify the effectiveness of the proposed adaptive neuro-fuzzy inference system in pattern classification, it is compared with a back-propagation network. The experimental results indicate that the proposed system using the adaptive neuro-fuzzy inference system outperforms the back-propagation network for personal identification using finger-vein patterns.
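
    The PCA front end of such a system fits in a few lines. A minimal sketch with synthetic data standing in for flattened vein images, and a nearest-class-mean rule standing in for the back-propagation / neuro-fuzzy classifiers:

        import numpy as np

        def pca_fit(X, n_components):
            """PCA via SVD: returns the data mean and the leading components
            used to project vein images to low-dimensional feature vectors."""
            mean = X.mean(axis=0)
            _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
            return mean, Vt[:n_components]

        def pca_transform(X, mean, components):
            return (X - mean) @ components.T

        # Synthetic stand-in: 200 flattened 32x32 images from 10 subjects
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 32 * 32))
        labels = np.repeat(np.arange(10), 20)
        mean, comps = pca_fit(X, n_components=40)
        feats = pca_transform(X, mean, comps)    # 200 x 40 classifier inputs

        # Nearest-class-mean stand-in for the neural / neuro-fuzzy classifier
        means = np.stack([feats[labels == c].mean(axis=0) for c in range(10)])
        pred = np.argmin(((feats[:, None, :] - means) ** 2).sum(-1), axis=1)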

  20. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a newer molecular biology technique, has been gaining popularity. Among many other applications, this technique can be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform; however, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of the maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification" (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed as well as or better than the qPCR methods, and the optimized ddPCR methods confirmed their suitability for GMO determination in food and feed.
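
    The quantification behind ddPCR is a Poisson correction on the fraction of positive droplets. A minimal sketch; the droplet volume is instrument-specific and the counts below are invented:

        import numpy as np

        def ddpcr_copies_per_ul(positive, total, droplet_vol_ul=0.85e-3):
            """From the positive-droplet fraction p, the mean copies per droplet
            is lambda = -ln(1 - p); dividing by droplet volume gives copies/uL.
            The ~0.85 nL droplet volume is an assumption."""
            lam = -np.log(1.0 - positive / total)
            return lam / droplet_vol_ul

        # Event-specific and taxon-reference assays on the same sample
        gm = ddpcr_copies_per_ul(positive=2100, total=18000)
        ref = ddpcr_copies_per_ul(positive=9500, total=18500)
        print(f"GM target: {gm:.0f} copies/uL, reference: {ref:.0f} copies/uL")
        print(f"GM content = {100.0 * gm / ref:.2f} % (copy-number ratio)")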

  1. Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker

    PubMed Central

    Aguilar, Juan José

    2014-01-01

    This paper presents a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for one particular machine, it presents a methodology that can be applied to any machine type. The schema and kinematic model of a machine with three axes of movement, two linear and one rotary, are presented, including the measurement system and the nominal rotation matrix of the rotary axis. Using these, the machine tool volumetric error is obtained, and nonlinear optimization techniques are employed to improve the accuracy of the machine tool. The verification provides a mathematical, rather than physical, compensation, obtained in less time than other verification methods, by means of indirect measurement of the geometric errors of the machine's linear and rotary axes. The paper presents an extensive study of the appropriateness and drawbacks of the regression function employed, depending on the types of movement of the axes of a machine. Likewise, the strengths and weaknesses of measurement methods and optimization techniques are presented, depending on the space available to place the measurement system. These studies provide the most appropriate strategies to verify each machine tool, taking into consideration its configuration and available workspace. PMID:25202744
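
    The nonlinear-optimization step can be illustrated with a toy error model: fit a small rigid rotation plus translation that best maps commanded positions onto laser-tracker measurements. This is only a sketch; a real volumetric model parameterizes the geometric errors of each linear and rotary axis separately.

        import numpy as np
        from scipy.optimize import least_squares

        def predicted_points(params, nominal_xyz):
            """Toy model: small rotation (rx, ry, rz) plus translation."""
            rx, ry, rz, tx, ty, tz = params
            Rx = np.array([[1, 0, 0],
                           [0, np.cos(rx), -np.sin(rx)],
                           [0, np.sin(rx), np.cos(rx)]])
            Ry = np.array([[np.cos(ry), 0, np.sin(ry)],
                           [0, 1, 0],
                           [-np.sin(ry), 0, np.cos(ry)]])
            Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                           [np.sin(rz), np.cos(rz), 0],
                           [0, 0, 1]])
            return nominal_xyz @ (Rz @ Ry @ Rx).T + np.array([tx, ty, tz])

        def residuals(params, nominal_xyz, measured_xyz):
            return (predicted_points(params, nominal_xyz) - measured_xyz).ravel()

        # Hypothetical data: commanded positions vs. tracker measurements (mm)
        rng = np.random.default_rng(2)
        nominal = rng.uniform(0, 500, size=(30, 3))
        true = predicted_points([2e-4, -1e-4, 3e-4, 0.05, -0.02, 0.01], nominal)
        measured = true + rng.normal(0, 0.005, nominal.shape)  # 5 um noise

        fit = least_squares(residuals, x0=np.zeros(6), args=(nominal, measured))
        print("identified error parameters:", fit.x.round(5))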

  2. On-Line Monitoring and Diagnostics of the Integrity of Nuclear Plant Steam Generators and Heat Exchangers, Volumes 1, 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyaya, Belle R.; Hines, J. Wesley; Lu, Baofu

    2005-06-03

    The overall purpose of this Nuclear Engineering Education Research (NEER) project was to integrate new, innovative, and existing technologies to develop a fault diagnostics and characterization system for nuclear plant steam generators (SG) and heat exchangers (HX). Issues related to system-level degradation of SG and HX tubing, including tube fouling, performance under reduced heat transfer area, and the damage caused by stress corrosion cracking, are important factors that influence overall plant operation, maintenance, and the economic viability of nuclear power systems. The research at The University of Tennessee focused on the development of techniques for monitoring the process and structural integrity of steam generators and heat exchangers. All project objectives were accomplished during the project period. This report summarizes the research and development activities, results, and accomplishments from June 2001 to September 2004: development and testing of a high-fidelity nodal model of a U-tube steam generator (UTSG) to simulate the effects of fouling and to generate a database representing normal and degraded process conditions; application of the group method of data handling (GMDH) for process variable prediction; development of a laboratory test module to simulate particulate fouling of HX tubes and its effect on overall thermal resistance; application of the GMDH technique to predict HX fluid temperatures and comparison with the calculated thermal resistance; development of a hybrid modeling technique for process diagnosis and its evaluation using laboratory heat exchanger test data; development and testing of a sensor suite using piezoelectric devices for monitoring the structural integrity of both flat plates (beams) and tubing, with experiments performed in air, and in water with and without bubbly flow; development of advanced signal processing methods using wavelet transforms and image processing techniques for isolating flaw types; development and implementation of a new nonlinear and non-stationary signal processing method, the Hilbert-Huang transform (HHT), for flaw detection and location, a more robust and adaptive approach than the wavelet transform; implementation of a moving-window technique in the time domain for detecting and quantifying flaw types in tubular structures, together with a window-zooming technique for flaw location in tubes; theoretical study of elastic wave propagation (longitudinal and shear waves) in metallic flat plates and tubing with and without flaws; and simulation of Lamb wave propagation using the finite-element code ABAQUS, which enabled verification of the experimental results. The research tasks included both analytical research and experimental studies; the experimental results helped to enhance the robustness of the fault monitoring methods and provided a systematic verification of the analytical results. The results of this research were disseminated in scientific meetings. The journal manuscript "Structural Integrity Monitoring of Steam Generator Tubing Using Transient Acoustic Signal Analysis" was published in IEEE Transactions on Nuclear Science, Vol. 52, No. 1, February 2005. The new findings of this research have potential applications in aerospace and civil structures. The report contains a complete bibliography developed during the course of the project.

  3. An Optimized Online Verification Imaging Procedure for External Beam Partial Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willis, David J., E-mail: David.Willis@petermac.or; Royal Melbourne Institute of Technology University, Melbourne, Victoria; Kron, Tomas

    2011-07-01

    The purpose of this study was to evaluate the capabilities of a kilovoltage (kV) on-board imager (OBI)-equipped linear accelerator for online verification imaging in external-beam partial breast irradiation. Available imaging techniques were optimized and assessed for image quality using a modified anthropomorphic phantom; imaging dose was also assessed. Imaging techniques were assessed for physical clearance between patient and treatment machine using a volunteer. Nonorthogonal kV image pairs were identified as optimal in terms of image quality, clearance, and dose. After institutional review board approval, this approach was used for 17 patients receiving accelerated partial breast irradiation. Imaging was performed before every fraction, with online correction of setup deviations >5 mm (170 imaging sessions in total). Treatment staff rated the risk of collision and the visibility of tumor-bed surgical clips where present. Imaging session duration and detected setup deviations were recorded. For all cases, both image projections (n = 34) had low collision risk. Surgical clips were rated as well visualized in all cases where they were present (n = 5). The average imaging session time was 6 min 16 s, and session duration decreased as staff became familiar with the technique. Setup deviations of up to 1.3 cm were detected before treatment and subsequently confirmed offline. Nonorthogonal kV image pairs allowed effective and efficient online verification for partial breast irradiation. The approach has yet to be tested in a multicenter study to determine whether it depends on skilled treatment staff.

  4. Analysis of Fade Detection and Compensation Experimental Results in a Ka-Band Satellite System. Degree awarded by Akron Univ., May 2000

    NASA Technical Reports Server (NTRS)

    Johnson, Sandra

    2001-01-01

    The frequency bands being used for new satellite communication systems are constantly increasing to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communication Technology Satellite (ACTS), launched in September 1993, is the first US communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including onboard baseband processing, multiple beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this thesis is to describe and validate the method used by the ACTS Very Small Aperture Terminal (VSAT) ground stations in detecting the presence of fade in the communication signal and to adaptively compensate for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program is used to validate the compensation technique. In this thesis, models in MATLAB are developed to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are also presented.
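
    The fade detection and compensation loop amounts to a thresholded mode switch with hysteresis. A minimal sketch of such a controller; the threshold and margin values are illustrative, not the ACTS flight settings:

        def select_mode(fade_db, compensated, threshold_db=5.0, hysteresis_db=1.5):
            """Return True when the protected mode (reduced burst rate plus
            forward error correction) should be active."""
            if not compensated and fade_db > threshold_db:
                return True                   # fade detected: add link margin
            if compensated and fade_db < threshold_db - hysteresis_db:
                return False                  # fade cleared: restore full rate
            return compensated                # otherwise hold the current mode

        # Sweep a rain-fade event through the controller
        mode = False
        for fade in [1.0, 3.0, 6.2, 7.5, 4.8, 3.2, 1.0]:
            mode = select_mode(fade, mode)
            print(f"fade {fade:4.1f} dB -> compensation {'on' if mode else 'off'}")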

  5. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, previously suggested secondary criteria, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures for conducting it are not standardized across the literature, and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g., intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.

  6. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials, and verification practice was conducted, and shortcomings of the methodology in terms of analysis, inspection, and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed, and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection, and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes were compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout the project.

  7. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods achieve satisfactory verification on analytical or numerical structures, many encounter problems when applied to real-world structures under varying environments. Damage detection methods that extract damage features directly from periodically sampled dynamic time-history response measurements are desirable, but relevant research and field verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure, proposed in the first part, are investigated using two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification focuses on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage in full-scale structures experiencing ambient excitation and varying environmental conditions.

  8. Delay compensation in integrated communication and control systems. II - Implementation and verification

    NASA Technical Reports Server (NTRS)

    Luck, Rogelio; Ray, Asok

    1990-01-01

    The implementation and verification of the delay-compensation algorithm are addressed. The delay compensator has been experimentally verified at an IEEE 802.4 network testbed for velocity control of a DC servomotor. The performance of the delay-compensation algorithm was also examined by combined discrete-event and continuous-time simulation of the flight control system of an advanced aircraft that uses the SAE (Society of Automotive Engineers) linear token passing bus for data communications.

  9. FY2017 Final Report: Power of the People: A technical, ethical, and experimental examination of the use of crowdsourcing to support international nuclear safeguards verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire

    Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the “power of the people” harnessed via online games, communities of interest, and other platforms to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.

  10. Experimental Verification of the Use of Metal Filled Via Hole Fences for Crosstalk Control of Microstrip Lines in LTCC Packages

    NASA Technical Reports Server (NTRS)

    Ponchak, George E.; Chun, Donghoon; Yook, Jong-Gwan; Katehi, Linda P. B.

    2001-01-01

    Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior three-dimensional finite element method (3-D FEM) electromagnetic simulations have shown that metal filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3-D FEM simulations is demonstrated for commercially fabricated low temperature cofired ceramic (LTCC) packages. In addition, measured attenuation of microstrip lines surrounded by the shielding structures is presented and shows that the shielding structures do not change the attenuation characteristics of the line.

  11. Multiparticle imaging technique for two-phase fluid flows using pulsed laser speckle velocimetry. Final report, September 1988--November 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, T.A.

    1992-12-01

    The practical use of Pulsed Laser Velocimetry (PLV) requires fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two tracking methods are presented. One method tracks a particle through multiple sequential images (a minimum of four is required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic binary spatial cross-correlation technique. The algorithms were tested on computer-generated synthetic data and on experimental data obtained with traditional PLV methods, allowing error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable manual operator assistance in removing erroneous vectors; it uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. An experimental investigation of a two-phase, horizontal, stratified flow regime was performed to determine the interface drag force and, correspondingly, the drag coefficient. A horizontal stratified-flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results when the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.
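
    The two-frame matching step lends itself to a compact illustration. The Python sketch below shows a minimal binary spatial cross-correlation search of the kind described above: a binary template around a particle in the first frame is slid over a search region of the second frame, and the displacement with the highest overlap count wins. The function name, window sizes, and synthetic test data are illustrative assumptions, not the report's actual implementation.

        import numpy as np

        def binary_xcorr_displacement(bin_a, bin_b, center, half=8, search=12):
            """Displacement of the particle at `center` between two binarized
            frames, found by maximizing a binary cross-correlation score."""
            y, x = center
            tpl = bin_a[y - half:y + half + 1, x - half:x + half + 1]
            best_score, best_shift = -1, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    if y + dy - half < 0 or x + dx - half < 0:
                        continue  # shift falls off the image edge
                    sub = bin_b[y + dy - half:y + dy + half + 1,
                                x + dx - half:x + dx + half + 1]
                    if sub.shape != tpl.shape:
                        continue
                    score = np.count_nonzero(tpl & sub)  # binary overlap count
                    if score > best_score:
                        best_score, best_shift = score, (dy, dx)
            return best_shift

        # Synthetic check: a 3x3 "particle" displaced by (2, 5) pixels.
        a = np.zeros((64, 64), dtype=bool); a[30:33, 30:33] = True
        b = np.zeros((64, 64), dtype=bool); b[32:35, 35:38] = True
        print(binary_xcorr_displacement(a, b, (31, 31)))  # -> (2, 5)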

  13. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  15. Experimental Investigations of Non-Stationary Properties In Radiometer Receivers Using Measurements of Multiple Calibration References

    NASA Technical Reports Server (NTRS)

    Racette, Paul; Lang, Roger; Zhang, Zhao-Nan; Zacharias, David; Krebs, Carolyn A. (Technical Monitor)

    2002-01-01

    Radiometers must be periodically calibrated because the receiver response fluctuates. Many techniques exist to correct for the time-varying response of a radiometer receiver. An analytical technique has been developed that uses generalized least squares regression (LSR) to predict the performance of a wide variety of calibration algorithms. The total measurement uncertainty, including the uncertainty of the calibration, can be computed using LSR. The uncertainties of the calibration samples used in the regression are based upon treating the receiver fluctuations as non-stationary processes. Signals originating from the different sources of emission are treated as simultaneously existing random processes; the radiometer output is thus a series of samples obtained from these random processes. The samples are treated as random variables, but because the underlying processes are non-stationary, the statistics of the samples are treated as non-stationary as well. The statistics of the calibration samples depend upon the time for which the samples are to be applied: the statistics of the random variables are equated to the mean statistics of the non-stationary processes over the interval defined by the time of the calibration sample and the time at which it is applied. This analysis opens the opportunity for experimental investigation into the underlying properties of receiver non-stationarity through the use of multiple calibration references. In this presentation we discuss the application of LSR to the analysis of various calibration algorithms, requirements for experimental verification of the theory, and preliminary results from analyzing experimental measurements.
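
    The regression idea can be illustrated with a stripped-down two-reference calibration. The Python sketch below fits a linear gain and offset to counts collected on two references over a window, then inverts the fit for a scene sample; it is a minimal ordinary-least-squares stand-in for the generalized LSR described above, which additionally weights calibration samples by their non-stationary covariances. The reference temperatures, counts, and window length are made-up values.

        import numpy as np

        # Hypothetical reference brightness temperatures (K) and receiver counts.
        T_hot, T_cold = 300.0, 77.0
        c_hot  = np.array([912.0, 915.0, 910.0, 918.0])   # counts on hot load
        c_cold = np.array([301.0, 305.0, 299.0, 303.0])   # counts on cold load

        # Linear radiometer model: counts = gain * T + offset.
        # Stack all calibration samples in the window and solve by least squares.
        A = np.column_stack([
            np.concatenate([np.full_like(c_hot, T_hot), np.full_like(c_cold, T_cold)]),
            np.ones(c_hot.size + c_cold.size),
        ])
        y = np.concatenate([c_hot, c_cold])
        (gain, offset), *_ = np.linalg.lstsq(A, y, rcond=None)

        # Invert the fit to calibrate a scene measurement.
        c_scene = 640.0
        T_scene = (c_scene - offset) / gain
        print(round(T_scene, 1))  # scene brightness temperature estimate (K)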

  16. Acoustic time-of-flight for proton range verification in water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Kevin C.; Avery, Stephen

    2016-09-15

    Purpose: Measurement of the arrival times of thermoacoustic waves induced by pulsed proton dose depositions (protoacoustics) may provide a proton range verification method. The goal of this study is to characterize the required dose and protoacoustic proton range (distance) verification accuracy in a homogeneous water medium at a hospital-based clinical cyclotron. Methods: Gaussian-like proton pulses with 17 μs widths and instantaneous currents of 480 nA (5.6 × 10^7 protons/pulse, 3.4 cGy/pulse at the Bragg peak) were generated by modulating the cyclotron proton source with a function generator. After energy degradation, the 190 MeV proton pulses irradiated a water phantom, and the generated protoacoustic emissions were measured by a hydrophone. The detector position and proton pulse characteristics were varied. The experimental results were compared to simulations. Different arrival time metrics derived from acoustic waveforms were compared, and the accuracy of protoacoustic time-of-flight distance calculations was assessed. Results: A 27 mPa noise level was observed in the treatment room during irradiation. At 5 cm from the proton beam, an average maximum pressure of 5.2 mPa/1 × 10^7 protons (6.1 mGy at the Bragg peak) was measured after irradiation with a proton pulse with 10%–90% rise time of 11 μs. Simulation and experiment arrival times agreed well, and the observed 2.4 μs delay between simulation and experiment is attributed to the difference between the hydrophone’s acoustic and geometric centers. Based on protoacoustic arrival times, the beam axis position was measured to within (x, y) = (−2.0, 0.5) ± 1 mm. After deconvolution of the exciting proton pulse, the protoacoustic compression peak provided the most consistent measure of the distance to the Bragg peak, with an error distribution with mean = −4.5 mm and standard deviation = 2.0 mm. Conclusions: Based on water tank measurements at a clinical hospital-based cyclotron, protoacoustics is a potential method for measuring the beam’s position (x and y within 2.0 mm) and Bragg peak range (2.0 mm standard deviation), although range verification will require simulation or experimental calibration to remove systematic error. Based on extrapolation, a protoacoustic arrival time reproducibility of 1.5 μs (2.2 mm) is achievable with 2 Gy of total deposited dose. Of the compared methods, deconvolution of the excitation proton pulse is the best technique for extracting protoacoustic arrival times, particularly if there is variation in the proton pulse shape.
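
    The time-of-flight conversion at the heart of the method is simple enough to state in a few lines of Python. The sketch below assumes a homogeneous water path and a nominal sound speed of 1480 m/s (a value that in practice depends on water temperature); it only illustrates the unit bookkeeping, and the pulse-center referencing and deconvolution steps of the actual study are not reproduced here.

        # Protoacoustic range from time of flight in homogeneous water.
        C_WATER = 1480.0  # m/s, nominal; temperature-dependent in practice

        def source_distance(arrival_time_s, emission_time_s, c=C_WATER):
            """Distance from the detector to the acoustic source (Bragg peak)."""
            return c * (arrival_time_s - emission_time_s)

        # A 1.5 us timing uncertainty maps to ~2.2 mm of range uncertainty,
        # consistent with the extrapolation quoted in the abstract.
        print(source_distance(1.5e-6, 0.0) * 1e3, "mm")  # -> ~2.22 mm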

  18. Forecast Verification: Identification of small changes in weather forecasting skill

    NASA Astrophysics Data System (ADS)

    Weatherhead, E. C.; Jensen, T. L.

    2017-12-01

    Global and regional weather forecasts have improved over the past seven decades, most often through small, incremental improvements. The identification and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth, and the statistical techniques used. For continuous variables, such as temperature, wind and humidity, the skill of a forecast can be directly compared using a pairwise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks or icing events, the challenge is to reduce the false alarm rate while improving the rate of correctly identifying the discrete event. For both continuous and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component of the many small decisions needed to make state-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing but also scientific judgment to assure that the choices are appropriate for improvements in today's forecasting capabilities while allowing improvements that will come in the future.
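
    For the continuous-variable case, the pairwise comparison described above can be sketched as a paired test on error differences with the sample size deflated for lag-1 autocorrelation. The Python snippet below is one common form of that adjustment; it illustrates the statistical idea and is not the verification code used in the study.

        import numpy as np

        def paired_skill_test(err_a, err_b):
            """Test whether forecast A's absolute errors beat forecast B's.

            Uses paired differences of absolute errors and an effective sample
            size n_eff = n * (1 - r1) / (1 + r1), where r1 is the lag-1
            autocorrelation of the difference series.
            """
            d = np.abs(np.asarray(err_a)) - np.abs(np.asarray(err_b))
            n = d.size
            dm = d - d.mean()
            r1 = np.dot(dm[:-1], dm[1:]) / np.dot(dm, dm)   # lag-1 autocorrelation
            n_eff = n * (1.0 - r1) / (1.0 + r1)
            t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
            return t, n_eff   # compare t against a t-distribution, df ~ n_eff - 1

        rng = np.random.default_rng(0)
        base = rng.normal(size=500)
        print(paired_skill_test(0.9 * base, base))  # A slightly better than B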

  19. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
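
    The core mechanic of such a QA suite, comparing a numerical solution against a closed-form one and checking the observed convergence rate, can be sketched briefly. The Python helpers below are a generic illustration, not part of PFLOTRAN's actual test harness; the tolerance and refinement ratio are assumptions.

        import numpy as np

        def l2_error(numeric, exact, dx):
            """Discrete L2 norm of the pointwise error on a uniform grid."""
            return np.sqrt(np.sum((numeric - exact) ** 2) * dx)

        def observed_order(err_coarse, err_fine, r):
            """Observed convergence order from errors on two grids (ratio r)."""
            return np.log(err_coarse / err_fine) / np.log(r)

        def check(err_coarse, err_fine, r=2.0, expected=2.0, tol=0.2):
            """Pass if the scheme converges at (close to) its formal order."""
            p = observed_order(err_coarse, err_fine, r)
            return abs(p - expected) <= tol, p

        # e.g. halving dx cut the error from 4.1e-3 to 1.0e-3 -> p ~ 2.04: pass.
        print(check(4.1e-3, 1.0e-3))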

  20. A high sensitivity heterodyne interferometer as a possible optical readout for the LISA gravitational reference sensor and its application to technology verification

    NASA Astrophysics Data System (ADS)

    Gohlke, Martin; Schuldt, Thilo; Weise, Dennis; Cordero, Jorge; Peters, Achim; Johann, Ulrich; Braxmaier, Claus

    2017-11-01

    The gravitational wave detector LISA utilizes as current baseline a high sensitivity Optical Readout (ORO) for measuring the relative position and tilt of a free flying proof mass with respect to the satellite housing. The required sensitivities in the frequency band from 30 μHz to 1 Hz are on the order of pm/√Hz for the translation measurement and nrad/√Hz for the tilt measurement. EADS Astrium, in collaboration with the Humboldt University Berlin and the University of Applied Sciences Konstanz, has realized a prototype ORO over the past years. The interferometer is based on a highly symmetric design where both the measurement and the reference beam have a similar optical pathlength and the same frequency and polarization. The technique of differential wavefront sensing (DWS) for tilt measurement is implemented. With our setup, noise levels below 5 pm/√Hz for translation and below 10 nrad/√Hz for tilt measurements - both for frequencies above 10 mHz - were demonstrated. We give an overview of the experimental setup, its current performance and the planned improvements. We also discuss the application to first verification of critical LISA aspects. As an example, we present measurements of the coefficient of thermal expansion (CTE) of various carbon fiber reinforced plastic (CFRP) structures, including a "near-zero-CTE" tube.

  1. Physical property measurements on analog granites related to the joint verification experiment

    NASA Astrophysics Data System (ADS)

    Martin, Randolph J., III; Coyner, Karl B.; Haupt, Robert W.

    1990-08-01

    A key element in the JVE (Joint Verification Experiment), conducted jointly by the United States and the USSR, is the analysis of the geology and physical properties of the rocks at the respective test sites. A study was initiated to examine unclassified crystalline rock specimens obtained from areas near the Soviet site at Semipalatinsk, together with appropriate analog samples selected from Mt. Katahdin, Maine. These rocks were also compared to Sierra White and Westerly Granite, which have been studied in great detail. The measurements performed to characterize these rocks were: (1) uniaxial strain with simultaneous compressional and shear wave velocities; (2) hydrostatic compression to 150 MPa with simultaneous compressional and shear wave velocities; and (3) attenuation measurements as a function of frequency and strain amplitude for both dry and water-saturated conditions. Elastic moduli determined from the hydrostatic compression and uniaxial strain tests show that the rock matrix/mineral properties were comparable, with magnitudes varying within 25 percent from sample to sample. These properties appear to be approximately isotropic, especially at high pressures. However, anisotropy evident for certain samples at pressures below 35 MPa is attributed to dominant pre-existing microcrack populations and their alignments. The dependence of extensional attenuation and Young's modulus on strain amplitude was experimentally determined for intact Sierra White granite using the hysteresis loop technique.

  2. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  3. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.

  4. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-07-31

    real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.

  5. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  6. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described whose ultimate purpose is to ensure that adequate Expert System (ES) Verification and Validation (V&V) tools and techniques are available for Space Station Freedom Program knowledge-based systems development. The purpose of this phase is to recommend modifications to current software V&V requirements which will extend the applicability of the requirements to NASA ESs.

  7. Secure Image Hash Comparison for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.

    2014-06-06

    The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
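
    To make the notion concrete, here is a toy perceptual hash of the "average hash" family in Python: block-average the image, threshold at the mean, and compare hashes by Hamming distance. Its defining property, that visually similar images map to nearby hashes, is precisely why such functions cannot offer cryptographic collision resistance, which is the tension the paper formalizes. The block size and threshold rule are illustrative choices, not the scheme analyzed in the paper.

        import numpy as np

        def average_hash(img, n=8):
            """Toy perceptual hash: n*n block means thresholded at their mean."""
            h, w = img.shape
            img = img[:h - h % n, :w - w % n]  # crop to a multiple of n
            blocks = img.reshape(n, img.shape[0] // n,
                                 n, img.shape[1] // n).mean(axis=(1, 3))
            return (blocks > blocks.mean()).ravel()  # n*n hash bits

        def hamming(h1, h2):
            return int(np.count_nonzero(h1 != h2))

        rng = np.random.default_rng(1)
        scene = rng.random((64, 64))
        noisy = scene + 0.01 * rng.random((64, 64))  # perceptually identical
        # Small Hamming distance for near-identical inputs: useful for template
        # comparison, fatal for cryptographic collision resistance.
        print(hamming(average_hash(scene), average_hash(noisy)))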

  9. Ionoacoustic characterization of the proton Bragg peak with submillimeter accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Assmann, W., E-mail: walter.assmann@lmu.de; Reinhardt, S.; Lehrack, S.

    2015-02-15

    Purpose: Range verification in ion beam therapy relies to date on nuclear imaging techniques, which require complex and costly detector systems. A different approach is the detection of thermoacoustic signals that are generated due to localized energy loss of ion beams in tissue (ionoacoustics). The aim of this work was to study experimentally the achievable position resolution of ionoacoustics under idealized conditions using high frequency ultrasonic transducers and a specifically selected probing beam. Methods: A water phantom was irradiated by a pulsed 20 MeV proton beam with varying pulse intensity and length. The acoustic signal of single proton pulses was measured by different PZT-based ultrasound detectors (3.5 and 10 MHz central frequencies). The proton dose distribution in water was calculated by Geant4 and used as input for simulation of the generated acoustic wave by the MATLAB toolbox k-Wave. Results: In measurements from this study, a clear signal of the Bragg peak was observed for an energy deposition as low as 10^12 eV. The signal amplitude showed a linear increase with particle number per pulse and thus dose. Bragg peak position measurements were reproducible within ±30 μm and agreed with Geant4 simulations to better than 100 μm. The ionoacoustic signal pattern allowed for a detailed analysis of the Bragg peak and could be well reproduced by k-Wave simulations. Conclusions: The authors have studied the ionoacoustic signal of the Bragg peak in experiments using a 20 MeV proton beam with its correspondingly localized energy deposition, demonstrating submillimeter position resolution and providing a deep insight into the correlation between the acoustic signal and the Bragg peak shape. These results, together with earlier experiments and new simulations (including the results in this study) at higher energies, suggest ionoacoustics as a technique for range verification in particle therapy at locations where the tumor can be localized by ultrasound imaging. This acoustic range verification approach could offer the possibility of combining anatomical ultrasound and Bragg peak imaging, but further studies are required for translation of these findings to clinical application.

  10. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault-tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  11. B-scan technique for localization and characterization of fatigue cracks around fastener holes in multi-layered structures

    NASA Astrophysics Data System (ADS)

    Hopkins, Deborah; Datuin, Marvin; Aldrin, John; Warchol, Mark; Warchol, Lyudmila; Forsyth, David

    2018-04-01

    The work presented here aims to develop and transition angled-beam shear-wave inspection techniques for crack localization at fastener sites in multi-layer aircraft structures. This requires moving beyond detection to achieve reliable crack location and size, thereby providing invaluable information for maintenance actions and service-life management. The technique presented is based on imaging cracks in "True" B-scans (depth view projected in the sheets along the beam path). The crack traces that contribute to localization in the True B-scans depend on small, diffracted signals from the crack edges and tips that are visible in simulations and experimental data acquired with sufficient gain. The most recent work shows that cracks rotated toward and away from the central ultrasonic beam also yield crack traces in True B-scans that allow localization in simulations, even for large obtuse angles where experimental and simulation results show very small or no indications in the C-scans. Similarly, for two sheets joined by sealant, simulations show that cracks in the second sheet can be located in True B-scans for all locations studied: cracks that intersect the front or back wall of the second sheet, as well as relatively small mid-bore cracks. These results are consistent with previous model verification and sensitivity studies that demonstrate crack localization in True B-scans for a single sheet and cracks perpendicular to the ultrasonic beam.

  12. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is a standing challenge in the biometric verification field. Recently, it has become popular to improve AFVS performance by using ensemble learning to fuse related fingerprint information. In this article, we propose a novel framework for fingerprint verification based on the multitemplate ensemble method. The framework consists of three stages. In the first stage, the enrollment stage, we adopt an effective template selection method to select those fingerprints which best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage, the verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. Experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework in fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER of these four databases also decreases from 14.58 to 2.51.
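
    One way to picture the enrollment and verification stages is in "score space": each enrolled template is a vertex described by its match scores against the other templates, the virtual centroid is their mean, and a query is accepted when its score vector lies close enough to that centroid. The Python sketch below is a loose rendering of that idea under assumed score matrices and threshold; the article's actual polyhedron construction and fusion rule are richer.

        import numpy as np

        def enroll(score_matrix):
            """score_matrix[i, j] = match score of template i against template j.
            Each row is one polyhedron vertex; return the virtual centroid."""
            return np.asarray(score_matrix, dtype=float).mean(axis=0)

        def verify(query_scores, centroid, threshold):
            """Accept if the query's score vector is close to the centroid."""
            dist = np.linalg.norm(np.asarray(query_scores, dtype=float) - centroid)
            return dist <= threshold, dist

        # Three enrolled templates, pairwise scores on a 0-100 scale (made up).
        scores = [[100, 72, 65],
                  [72, 100, 70],
                  [65, 70, 100]]
        centroid = enroll(scores)
        print(verify([80, 75, 68], centroid, threshold=25.0))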

  13. A Readout Integrated Circuit (ROIC) employing self-adaptive background current compensation technique for Infrared Focal Plane Array (IRFPA)

    NASA Astrophysics Data System (ADS)

    Zhou, Tong; Zhao, Jian; He, Yong; Jiang, Bo; Su, Yan

    2018-05-01

    A novel self-adaptive background current compensation circuit for infrared focal plane arrays is proposed in this paper; it can compensate the background current generated under different conditions. A double-threshold detection strategy is designed to estimate and eliminate the background currents, which significantly reduces the hardware overhead and improves uniformity among pixels. In addition, the circuit is compatible with various categories of infrared thermo-sensitive materials. Testing results of a 4 × 4 experimental chip show that the proposed circuit achieves high precision, wide applicability, and a high degree of self-adaptivity. Tape-out of the 320 × 240 readout circuit, as well as the bonding, encapsulation and imaging verification of the uncooled infrared focal plane array, has also been completed.

  14. Automatic extraction of building boundaries using aerial LiDAR data

    NASA Astrophysics Data System (ADS)

    Wang, Ruisheng; Hu, Yong; Wu, Huayi; Wang, Jian

    2016-01-01

    Building extraction is one of the main research topics of the photogrammetry community. This paper presents automatic algorithms for building boundary extraction from aerial LiDAR data. First, by segmenting height information generated from the LiDAR data, the outer boundaries of aboveground objects are expressed as closed chains of oriented edge pixels. Then, building boundaries are distinguished from nonbuilding ones by evaluating their shapes. The candidate building boundaries are reconstructed as rectangles or regular polygons by applying new algorithms, following the hypothesis verification paradigm. These algorithms include constrained searching in Hough space, an enhanced Hough transformation, and a sequential linking technique. The experimental results show that the proposed algorithms successfully extract building boundaries at rates of 97%, 85%, and 92% for three LiDAR datasets with varying scene complexities.

  15. Neutron spectroscopy with scintillation detectors using wavelets

    NASA Astrophysics Data System (ADS)

    Hartman, Jessica

    The purpose of this research was to study neutron spectroscopy using the EJ-299-33A plastic scintillator. This scintillator material provides a novel means of detection for fast neutrons without the disadvantages of traditional liquid scintillation materials. EJ-299-33A is more durable than those materials, making it less likely to be damaged during handling. Unlike liquid scintillators, this plastic scintillator is manufactured from a non-toxic material, making it safer to use as well as easier to design detectors around. The material also has inherent pulse shape discrimination capability, making it suitable for neutron detection. The neutron spectral unfolding technique was developed in two stages. Initial detector response function modeling was carried out with the MCNPX Monte Carlo code. The response functions were developed for a monoenergetic neutron flux, and wavelets were then applied to smooth each response function. The spectral unfolding technique was implemented through polynomial fitting and optimization in MATLAB. Verification of the unfolding technique was carried out using experimentally determined response functions, measured at the neutron source based on the Van de Graaff accelerator at the University of Kentucky. This machine provides a range of monoenergetic neutron beams between 0.1 MeV and 24 MeV, making it possible to measure the set of response functions of the EJ-299-33A plastic scintillator detector to neutrons of specific energies. The response of a plutonium-beryllium (PuBe) source was measured using the source available at the University of Nevada, Las Vegas, and the neutron spectrum reconstruction was carried out using the experimentally measured response functions. Experimental data were collected in the list mode of the waveform digitizer. Post-processing of these data focused on pulse shape discrimination analysis of the recorded response functions to remove the effects of photons and allow for source characterization based solely on the neutron response. The unfolding provided an energy spectrum for the PuBe source.
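
    The wavelet-smoothing step can be illustrated with PyWavelets, shown below in Python as a stand-in for the thesis's MATLAB processing: decompose a simulated response function, soft-threshold the detail coefficients with the universal threshold, and reconstruct. The wavelet family, decomposition level, and noise model are assumptions made for the sketch.

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_smooth(signal, wavelet="db4", level=4):
            """Soft-threshold detail coefficients (universal threshold, MAD noise)."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # noise scale from finest details
            thr = sigma * np.sqrt(2.0 * np.log(signal.size))  # universal threshold
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: signal.size]

        # Noisy stand-in for a monoenergetic detector response function.
        x = np.linspace(0.0, 1.0, 1024)
        rng = np.random.default_rng(2)
        response = np.exp(-((x - 0.6) ** 2) / 0.005) + 0.05 * rng.normal(size=x.size)
        smooth = wavelet_smooth(response)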

  16. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and related to the fundamental theoretical concepts.
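
    The vibrational principle behind such methods reduces to the classical cantilever formula: each natural frequency scales with sqrt(E*I / (rho*A*L^4)), so a stiffness loss from a crack pulls every frequency down. The Python sketch below evaluates the first three bending modes for a nominal beam and a degraded one; the beam dimensions and the 10% stiffness loss are illustrative numbers, and real detection, as in the report, works from measured mode shapes and strain data rather than a uniform reduction of E.

        import numpy as np

        # Roots of 1 + cos(x)cosh(x) = 0 for a cantilever (first three modes).
        LAMBDA = np.array([1.8751, 4.6941, 7.8548])

        def cantilever_frequencies(E, I, rho, A, L):
            """Natural frequencies (Hz) of an Euler-Bernoulli cantilever beam."""
            return (LAMBDA ** 2 / (2.0 * np.pi)) * np.sqrt(E * I / (rho * A * L ** 4))

        # Hypothetical aluminum strip: 500 x 30 x 5 mm.
        b, h, L = 0.03, 0.005, 0.5
        A, I = b * h, b * h ** 3 / 12.0
        E, rho = 69e9, 2700.0

        healthy = cantilever_frequencies(E, I, rho, A, L)
        cracked = cantilever_frequencies(0.9 * E, I, rho, A, L)  # 10% stiffness loss
        print(healthy)            # ~[16.3, 102.4, 286.6] Hz
        print(cracked / healthy)  # uniform ~0.949 shift = sqrt(0.9)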

  17. Verification of intravenous catheter placement by auscultation--a simple, noninvasive technique.

    PubMed

    Lehavi, Amit; Rudich, Utay; Schechtman, Moshe; Katz, Yeshayahu Shai

    2014-01-01

    Verification of proper placement of an intravenous catheter may not always be simple. We evaluated the auscultation technique for this purpose. Twenty healthy volunteers were randomized to have an 18G catheter inserted intravenously in either the right (12) or left (8) arm, and subcutaneously in the opposite arm. A standard stethoscope was placed over an area approximately 3 cm proximal to the tip of the catheter, in the presumed direction of the vein, to grade on a 0-6 scale the murmur heard while rapidly injecting 2 mL of NaCl 0.9% solution. The auscultation was evaluated by a blinded staff anesthesiologist. All 20 intravenous injections were evaluated as flow murmurs and were graded at an average of 5.65 (±0.98), whereas all 20 subcutaneous injections were evaluated as either crackles or no sound and were graded at an average of 2.00 (±1.38), without negative results. Sensitivity was calculated as 95%. Specificity and kappa could not be calculated due to an empty false-positive group. Being simple, handy and noninvasive, the auscultation technique is recommended for verification of the proper placement of an intravenous catheter when its position is uncertain. Data obtained in our limited sample of healthy subjects need to be confirmed in the clinical setting.

  18. Technical review of SRT-CMA-930058 revalidation studies of Mark 16 experiments: J70

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, R.L.

    1993-10-25

    This study is a reperformance of a set of MGBS-TGAL criticality safety code validation calculations previously reported by Clark. The reperformance was needed because the records of the previous calculations could not be located in current APG files and records. As noted by the author, preliminary attempts to reproduce the Clark results by direct modeling in MGBS and TGAL were unsuccessful. Consultation with Clark indicated that the MGBS-TGAL (EXPT) option within the KOKO system should be used to set up the MGBS and TGAL input data records. The results of the study indicate that the technique used by Clark has been established and that the technique is now documented for future use. File records of the calculations have also been established in APG files. The review was performed per QAP 11-14 of 1Q34. Since the reviewer was involved in developing the procedural technique used for this study, this review cannot be considered a fully independent review, but it should be considered a verification that the document contains adequate information to allow a new user to perform similar calculations, a verification of the procedure by performing several calculations independently with results identical to those reported, and a verification of the readability of the report.

  19. Operational verification of a 40-MHz annular array transducer

    PubMed Central

    Ketterling, Jeffrey A.; Ramachandran, Sarayu; Aristizäbal, Orlando

    2006-01-01

    An experimental system to take advantage of the imaging capabilities of a 5-ring polyvinylidene fluoride (PVDF) based annular array is presented. The array has a 6 mm total aperture and a 12 mm geometric focus. The experimental system is designed to pulse a single element of the array and then digitize the received data of all array channels simultaneously. All transmit/receive pairs are digitized and then the data are post-processed with a synthetic focusing technique to achieve an enhanced depth of field (DOF). The performance of the array is experimentally tested with a wire phantom consisting of 25-μm diameter wires diagonally spaced at 1 mm by 1 mm intervals. The phantom permitted the efficacy of the synthetic focusing algorithm to be tested and was also used for two-way beam characterization. Experimental results are compared to a spatial impulse response method beam simulation. After synthetic focusing, the two-way echo amplitude was enhanced over the range of 8 to 19 mm and the 6-dB DOF spanned from 9 to 15 mm. For a wire at a fixed axial depth, the relative time delays between transmit/receive ring pairs agreed with theoretical predictions to within ± 2 ns. To further test the system, B-mode images of an excised bovine eye are rendered. PMID:16555771
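
    The synthetic-focusing geometry behind those timing checks is compact: for a ring at radius r and a focal depth z, the one-way delay relative to the aperture center is (sqrt(r^2 + z^2) - z)/c, and a transmit-ring/receive-ring pair sums two such delays. The Python sketch below evaluates this for a 5-ring, 6 mm aperture like the one described; the individual ring radii and the 1480 m/s water sound speed are assumptions for illustration.

        import numpy as np

        C = 1480.0  # m/s, water (assumed)

        def one_way_delay(r, z, c=C):
            """Extra path delay (s) of a ring at radius r vs. the axis, focus depth z."""
            return (np.sqrt(r ** 2 + z ** 2) - z) / c

        # Hypothetical mean radii of 5 annuli spanning a 6 mm aperture.
        radii = np.linspace(0.6e-3, 2.7e-3, 5)
        z = 12e-3  # geometric focus of the array

        # Two-way delay for each transmit/receive ring pair (i, j); synthetic
        # focusing time-shifts each recorded pair by -delay before summation.
        delay = one_way_delay(radii, z)
        pair_delays_ns = (delay[:, None] + delay[None, :]) * 1e9
        print(np.round(pair_delays_ns, 1))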

  20. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information, which can be used to assess model performance. Automated information on code performance allows for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes: the Method of Manufactured Solutions (MMS), the Method of Exact Solutions (MES), cross-code verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI) and to handle adaptive meshes and models that implement mixed-order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification: code verification of a mixed-order compact difference heat transport solver; solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
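
    The Richardson-extrapolation machinery that such a package automates rests on two standard formulas: the observed order p = ln((f3 - f2)/(f2 - f1))/ln(r) from three systematically refined solutions, and the fine-grid convergence index GCI = Fs*|e21|/(r^p - 1). A minimal Python version is sketched below; the sample values and the safety factor Fs = 1.25 are conventional illustrative choices, not VAVUQ output.

        import numpy as np

        def observed_order(f1, f2, f3, r):
            """Observed accuracy order from fine (f1), medium (f2), and coarse
            (f3) solutions at a constant grid refinement ratio r."""
            return np.log((f3 - f2) / (f2 - f1)) / np.log(r)

        def gci_fine(f1, f2, r, p, Fs=1.25):
            """Grid Convergence Index on the fine grid (relative error bound)."""
            e21 = abs((f1 - f2) / f1)
            return Fs * e21 / (r ** p - 1.0)

        f1, f2, f3, r = 0.9713, 0.9700, 0.9648, 2.0   # made-up grid triplet
        p = observed_order(f1, f2, f3, r)             # -> 2.0 (second order)
        print(p, gci_fine(f1, f2, r, p))              # order and error bound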

  1. Model Checking Satellite Operational Procedures

    NASA Astrophysics Data System (ADS)

    Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri

    2011-08-01

    We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a system as complex as a satellite is a hard task; we overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs that observe the simulator telemetry and send telecommands to the simulator. To assess the feasibility of our approach we present experimental results on a simple meaningful scenario. Our results show that we can save up to 90% of verification time.
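
    The essence of driving a simulator from an explicit-state model checker is an exhaustive breadth-first sweep over simulator states, checking a property in each state and branching on every nondeterministic choice the OP or environment allows. The Python skeleton below conveys that loop; CMurphi and SIMSAT expose far richer interfaces, and `step`, `invariant`, and the hashable-state assumption are simplifications for the sketch.

        from collections import deque

        def explore(init, step, invariant):
            """Breadth-first exhaustive exploration of reachable states.

            step(s)      -> iterable of successor states (one per choice)
            invariant(s) -> True if the checked property holds in state s
            Returns None if the invariant holds in every reachable state,
            otherwise a violating state (a counterexample for the OP).
            """
            seen, frontier = {init}, deque([init])
            while frontier:
                s = frontier.popleft()
                if not invariant(s):
                    return s
                for nxt in step(s):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return None

        # Toy stand-in for simulator dynamics: a counter with two commands.
        print(explore(0,
                      lambda s: [(s + 1) % 16, (s * 2) % 16],
                      lambda s: s != 13))  # prints the violating state, if any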

  2. The influence of verification jig on framework fit for nonsegmented fixed implant-supported complete denture.

    PubMed

    Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje

    2012-05-01

    The purpose of this retrospective study was to assess whether there is a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. The investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients, a verification jig was made (group J), while for the remaining 14 patients a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit, while when a verification jig was not used (group NJ, n = 14), only two frameworks fit. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete dentures. © 2011 Wiley Periodicals, Inc.

  3. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.

  4. Verification and Validation of Residual Stresses in Bi-Material Composite Rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Stacy Michelle; Hanson, Alexander Anthony; Briggs, Timothy

    Process-induced residual stresses commonly occur in composite structures composed of dissimilar materials. These residual stresses form due to differences in the composite materials’ coefficients of thermal expansion and the shrinkage upon cure exhibited by polymer matrix materials. Depending upon the specific geometric details of the composite structure and the materials’ curing parameters, it is possible that these residual stresses could result in interlaminar delamination or fracture within the composite. Therefore, the consideration of potential residual stresses is important when designing composite parts and their manufacturing processes. However, the experimental determination of residual stresses in prototype parts can be time and cost prohibitive. As an alternative to physical measurement, computational tools can be used to quantify potential residual stresses in composite prototype parts. Therefore, the objectives of the presented work are to demonstrate a simple method for simulating residual stresses in composite parts, as well as the potential value of sensitivity and uncertainty quantification techniques during analyses for which material property parameters are unknown. Specifically, a simplified residual stress modeling approach, which accounts for coefficient of thermal expansion mismatch and polymer shrinkage, is implemented within the SIERRA/SolidMechanics code developed by Sandia National Laboratories. Concurrent with the model development, two simple bi-material structures composed of a carbon fiber/epoxy composite and aluminum, a flat plate and a cylinder, were fabricated and the residual stresses quantified through measurement of deformation. Then, in the process of validating the developed modeling approach against the experimental residual stress data, manufacturing process simulations of the two simple structures were developed and underwent a formal verification and validation process, including a mesh convergence study, sensitivity analysis, and uncertainty quantification. The simulations’ final results show adequate agreement with the experimental measurements, indicating the validity of the simple modeling approach, as well as the necessity of including material parameter uncertainty in the final residual stress predictions.

  5. Experimental verification of the spectral shift between near- and far-field peak intensities of plasmonic infrared nanoantennas.

    PubMed

    Alonso-González, P; Albella, P; Neubrech, F; Huck, C; Chen, J; Golmar, F; Casanova, F; Hueso, L E; Pucci, A; Aizpurua, J; Hillenbrand, R

    2013-05-17

    Theory predicts a distinct spectral shift between the near- and far-field optical response of plasmonic antennas. Here we combine near-field optical microscopy and far-field spectroscopy of individual infrared-resonant nanoantennas to experimentally verify this spectral shift. Numerical calculations corroborate our experimental results. We furthermore discuss the implications of this effect for surface-enhanced infrared spectroscopy.

  6. Theoretical verification of experimentally obtained conformation-dependent electronic conductance in a biphenyl molecule

    NASA Astrophysics Data System (ADS)

    Maiti, Santanu K.

    2014-07-01

    The experimentally obtained cosine-squared relation of electronic conductance in a biphenyl molecule (Venkataraman et al. [1]) is verified theoretically within a tight-binding framework. Using the Green's function formalism, we numerically calculate the two-terminal conductance as a function of the relative twist angle between the molecular rings and find that the results are in good agreement with the experimental observation.
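
    That verification is easy to reproduce in miniature. The Python sketch below computes the Green's-function (NEGF) transmission of a two-site model in which the inter-ring hopping scales as t*cos(theta), with wide-band leads of coupling strength gamma; for t much smaller than gamma, the zero-energy transmission tracks cos(theta)^2, which is the experimentally observed relation. The parameter values are illustrative, and the paper's tight-binding model of the full biphenyl rings is more detailed.

        import numpy as np

        def transmission(theta, t=0.05, gamma=1.0, E=0.0):
            """NEGF transmission of a 2-site bridge; hopping = t*cos(theta)."""
            tc = t * np.cos(theta)
            H = np.array([[0.0, tc], [tc, 0.0]], dtype=complex)
            sigma = -0.5j * gamma * np.eye(2)         # wide-band lead self-energies
            G = np.linalg.inv(E * np.eye(2) - H - sigma)
            return gamma ** 2 * np.abs(G[0, 1]) ** 2  # T = Gamma_L * Gamma_R * |G_12|^2

        thetas = np.deg2rad([0, 17, 34, 48, 62, 80])  # ring twist angles to probe
        T = np.array([transmission(th) for th in thetas])
        print(np.round(T / T[0], 3))                  # ~ cos(theta)**2
        print(np.round(np.cos(thetas) ** 2, 3))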

  7. Optical/digital identification/verification system based on digital watermarking technology

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.

    2000-06-01

    This paper presents a new approach for the secure integrity verification of driver licenses, passports and other analogue identification documents. The system embeds (and detects) the reference number of the identification document with DCT watermark technology in (and from) the photo of the document holder. During verification the reference number is extracted and compared with the reference number printed on the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation, and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents used in many European countries.
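
    A toy version of the embed/verify round trip makes the pipeline concrete. The Python sketch below hides a short bit payload in mid-frequency coefficients of a global DCT of the photo and checks it against the printed reference number. Real DCT watermarking, including the system described here, spreads the payload redundantly, operates blockwise, and must survive the print-scan channel, none of which this illustration attempts; the coefficient positions and strength are arbitrary assumptions.

        import numpy as np
        from scipy.fft import dctn, idctn

        # Hypothetical mid-band coefficient slots for a 16-bit payload.
        SLOTS = [(3 + k // 4, 3 + k % 4) for k in range(16)]

        def embed(photo, bits, strength=6.0):
            C = dctn(photo.astype(float), norm="ortho")
            for (u, v), b in zip(SLOTS, bits):
                C[u, v] = strength if b else -strength   # encode bit in the sign
            return idctn(C, norm="ortho")

        def extract(image, n_bits=16):
            C = dctn(image.astype(float), norm="ortho")
            return [bool(C[u, v] > 0.0) for (u, v) in SLOTS[:n_bits]]

        def verify(image, reference_number):
            payload = [bool(reference_number >> k & 1) for k in range(16)]
            return extract(image) == payload             # matches printed number?

        rng = np.random.default_rng(3)
        photo = rng.random((64, 64)) * 255.0
        marked = embed(photo, [bool(1234 >> k & 1) for k in range(16)])
        print(verify(marked, 1234))  # -> True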

  8. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  9. Methods in Symbolic Computation and p-Adic Valuations of Polynomials

    NASA Astrophysics Data System (ADS)

    Guan, Xiao

    Symbolic computation appears widely in many mathematical fields such as combinatorics, number theory and stochastic processes. The techniques created in the area of experimental mathematics provide efficient ways of computing symbolically and of verifying complicated relations. Part I consists of three problems. The first focuses on a unimodal sequence derived from a quartic integral; many of its properties are explored with the help of hypergeometric representations and automatic proofs. The second problem tackles the generating function of the reciprocal of the Catalan numbers, which springs from the closed form given by Mathematica; three methods from the theory of special functions are used to justify this result. The third addresses closed-form solutions for the moments of products of generalized elliptic integrals, combining experimental mathematics and classical analysis. Part II concentrates on the p-adic valuations of polynomials from the perspective of trees. For a given polynomial f(n) indexed by positive integers, the package developed in Mathematica creates a tree structure following a set of rules. The evolution of such trees is studied both rigorously and experimentally from the viewpoints of field extensions, nonparametric statistics and random matrices.
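
    As a minimal sketch of the underlying computation (not the Mathematica package described above), the p-adic valuation v_p(f(n)) can be tabulated in Python; the tree structure groups indices n by residue classes modulo powers of p that share the same valuation behavior.

      def vp(m: int, p: int) -> int:
          """p-adic valuation of a nonzero integer m."""
          v = 0
          while m % p == 0:
              m //= p
              v += 1
          return v

      f = lambda n: n**2 + 1       # example polynomial
      p = 5
      # The periodic pattern of valuations across residues mod p, p^2, ...
      # is what the tree construction encodes level by level.
      print([vp(f(n), p) for n in range(1, 26)])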

  10. Effects of rotation on coolant passage heat transfer. Volume 2: Coolant passages with trips normal and skewed to the flow

    NASA Technical Reports Server (NTRS)

    Johnson, B. V.; Wagner, J. H.; Steuber, G. D.

    1993-01-01

    An experimental program was conducted to investigate the heat transfer and pressure loss characteristics of rotating multipass passages, for configurations and dimensions typical of modern turbine blades. This experimental program is one part of the NASA Hot Section Technology (HOST) Initiative, whose overall objective is the development and verification of improved analysis methods that will form the basis for a design system producing turbine components with improved durability. The objective of this program was the generation of a database of heat transfer and pressure loss data required to develop heat transfer correlations and to assess computational fluid dynamic techniques for rotating coolant passages. The experimental work was divided into two phases. Phase 1 consisted of experiments conducted in a smooth-wall large-scale heat transfer model; a detailed discussion of those results was presented in Volume 1 of this NASA Report. In Phase 2 the large-scale model was modified to investigate the effects of skewed and normal passage turbulators. The results of Phase 2, along with comparisons to Phase 1, are the subject of this Volume 2 NASA Report.

  11. Robotic Spent Fuel Monitoring – It is time to improve old approaches and old techniques!

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tobin, Stephen Joseph; Dasari, Venkateswara Rao; Trellue, Holly Renee

    This report describes various approaches and techniques associated with robotic spent fuel monitoring. The aims of these approaches are to improve the quality of measured signatures, reduce the inspection burden on the IAEA, and provide more frequent verification.

  12. TH-B-204-03: TG-199: Implanted Markers for Radiation Treatment Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z.

    Implanted markers used as target surrogates have been widely adopted for treatment verification, as they provide safe and reliable monitoring of inter- and intra-fractional target motion. The rapid advancement of the technology requires a critical review of, and recommendations for, the use of implanted surrogates in the current field. The symposium, which also reports an update of AAPM TG 199 (Implanted Target Surrogates for Radiation Treatment Verification), will focus on all clinical aspects of using implanted target surrogates for treatment verification and related issues. A wide variety of markers available on the market will first be reviewed, including radiopaque markers, MRI-compatible markers, non-migrating coils, surgical clips, and electromagnetic transponders; the pros and cons of each kind will be discussed. The clinical applications of implanted surrogates will be presented by anatomical site. For the lung, we will discuss gated treatments and 2D or 3D real-time fiducial tracking techniques. For the prostate, we will focus on 2D-3D and 3D-3D matching and electromagnetic-transponder-based localization techniques. For the liver, we will review techniques for patients under gating, shallow-breathing, or free-breathing conditions. We will also review techniques for treating challenging breast cancer cases in which deformation may occur. Finally, we will summarize potential issues related to the use of implanted target surrogates along with the TG 199 recommendations. A review of fiducial migration and fiducial-derived target rotation in different disease sites will be provided. The issue of target deformation, especially near the diaphragm, and related suggestions will also be presented and discussed. Learning Objectives: (1) knowledge of the wide variety of available markers; (2) knowledge of their application to different disease sites; (3) understanding of issues related to these applications. Disclosures: Z. Wang, research funding support from Brainlab AG; Q. Xu, consultant for Accuray planning service.

  13. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and in parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix, resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data by employing newly introduced graphical and coherence metrics.
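
    For reference, the Classical Guyan Reduction baseline that MGR and HR refine is a static condensation of "slave" degrees of freedom onto "master" ones; a minimal numpy sketch is given below (the MGR and HR corrections themselves are not reproduced).

      import numpy as np

      def guyan_reduce(K, M, masters):
          """Classical Guyan Reduction: slaves follow u_s = -Kss^{-1} Ksm u_m."""
          n = K.shape[0]
          masters = np.asarray(masters)
          slaves = np.setdiff1d(np.arange(n), masters)
          T = np.zeros((n, masters.size))
          T[masters, np.arange(masters.size)] = 1.0   # masters map to themselves
          T[np.ix_(slaves, np.arange(masters.size))] = -np.linalg.solve(
              K[np.ix_(slaves, slaves)], K[np.ix_(slaves, masters)])
          # Reduced stiffness and mass matrices
          return T.T @ K @ T, T.T @ M @ T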

  14. National Centers for Environmental Prediction

    Science.gov Websites


  15. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as those used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  16. An Overview and Empirical Comparison of Distance Metric Learning Methods.

    PubMed

    Moutafis, Panagiotis; Leng, Mengjun; Kakadiaris, Ioannis A

    2016-02-16

    In this paper, we first offer an overview of advances in the field of distance metric learning. Then, we empirically compare selected methods using a common experimental protocol. The number of distance metric learning algorithms proposed keeps growing due to their effectiveness and wide application. However, existing surveys are either outdated or focus on only a few methods. As a result, there is an increasing need to summarize the obtained knowledge in a concise yet informative manner. Moreover, existing surveys do not conduct comprehensive experimental comparisons, while individual distance metric learning papers compare the performance of the proposed approach with only a few related methods, often under different settings. This highlights the need for an experimental evaluation using a common and challenging protocol. To this end, we conduct face verification experiments, as this task poses significant challenges due to varying conditions during data acquisition. In addition, face verification is a natural application for distance metric learning because the encountered challenge is to define a distance function that: 1) accurately expresses the notion of similarity for verification; 2) is robust to noisy data; 3) generalizes well to unseen subjects; and 4) scales well with the dimensionality and number of training samples. In particular, we utilize well-tested features to assess the performance of selected methods following the experimental protocol of the state-of-the-art Labeled Faces in the Wild database. A summary of the results is presented along with a discussion of the insights obtained and lessons learned by employing the corresponding algorithms.
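
    The verification decision at the core of these methods reduces to thresholding a learned distance; the sketch below uses a generic Mahalanobis-type metric with placeholder values for the learned matrix and threshold, not parameters from any surveyed method.

      import numpy as np

      def same_subject(x, y, M, threshold):
          """Declare a pair 'same subject' if d_M(x, y)^2 = (x-y)^T M (x-y) is small."""
          d = x - y
          return float(d @ M @ d) < threshold

      rng = np.random.default_rng(0)
      x, y = rng.normal(size=8), rng.normal(size=8)   # stand-ins for face features
      M = np.eye(8)    # identity = Euclidean baseline; learning reweights directions
      print(same_subject(x, y, M, threshold=10.0))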

  17. Category V Compliant Container for Mars Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.

    2000-01-01

    A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on a mock-up of the Mars Sample Return container. The proposed approach uses explosive welding with a sacrificial layer and a cut-through-the-seam technique. The technology produces a container that is free from Martian contaminants at the atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment; no separate biological verification is required. In addition to meeting Category V requirements, the proposed container presents a surface that is free of any organisms, even nonviable ones, and of any molecular fragments of biological origin unique to Mars or any celestial body other than Earth.

  18. Dosimetric changes with computed tomography automatic tube-current modulation techniques.

    PubMed

    Spampinato, Sofia; Gueli, Anna Maria; Milone, Pietro; Raffaele, Luigi Angelo

    2018-04-06

    This study aims to verify dose changes for a computed tomography automatic tube-current modulation (ATCM) technique. For this purpose, an anthropomorphic phantom and Gafchromic ® XR-QA2 films were used. Radiochromic films were cut according to the shape of two thorax regions. The ATCM algorithm is based on a noise index (NI), and three exam protocols with different NI values were chosen, one of which served as the reference. Results were compared with dose values displayed by the console and with Poisson statistics. The information obtained with the radiochromic films was normalized with respect to the reference NI value in order to compare percentage dose variations. Results showed that, on average, the information reported by the CT console and the calculated values coincide with the measurements. The study thus allowed verification of the dose information reported by the CT console for an ATCM technique. Although this evaluation represents an estimate, the method can be a starting point for further studies.

  19. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  20. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y; Yin, F; Ren, L

    Purpose: To develop an adaptive prior-knowledge-based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system was previously proposed to reconstruct 4D volumetric images on the fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam’s eye view (BEV) MV cine images acquired from the treatment beam, together with orthogonally acquired limited-angle kV projections, to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired over a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to exploit the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated under different scenarios, including baseline drift, amplitude change and phase shift. Limited-angle orthogonal kV and BEV MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. The volume percentage difference (VPD) and center-of-mass shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The technique enables LIVE to accurately reconstruct 4D-CBCT images using only an orthogonal 6° angle, which greatly improves the efficiency and reduces the imaging dose of LIVE for intrafraction verification.
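
    The two evaluation metrics quoted in the Results can be computed from binary tumor masks as sketched below; VPD is taken here as the non-overlapping (union minus intersection) volume relative to the ground-truth volume, and COMS as the Euclidean distance between centers of mass. The exact conventions of the study may differ.

      import numpy as np

      def vpd(recon, truth):
          """Volume percentage difference between two binary masks [%]."""
          union = np.logical_or(recon, truth).sum()
          inter = np.logical_and(recon, truth).sum()
          return 100.0 * (union - inter) / truth.sum()

      def coms(recon, truth, voxel_mm=(1.0, 1.0, 1.0)):
          """Center-of-mass shift between two binary masks [mm]."""
          c1 = np.array([c.mean() for c in np.nonzero(recon)])
          c2 = np.array([c.mean() for c in np.nonzero(truth)])
          return np.linalg.norm((c1 - c2) * np.asarray(voxel_mm))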

  1. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that, under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  2. Restricted access processor - An application of computer security technology

    NASA Technical Reports Server (NTRS)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  3. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System-level analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained in an ESA/ESTEC study carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based techniques for system-level functional requirements.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying the dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear, non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low-frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and to investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify the reduction and dismantlement of U.S. and Russian nuclear weapons.

  5. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.

    2016-08-14

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy, commonly eschewed as an ill-posed inverse problem, may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters such as plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have a better-conditioned response matrix than standard hydrogen-based scintillators; a novel data-discretization scheme is proposed that removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty verification challenges.

  6. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    NASA Astrophysics Data System (ADS)

    Lawrence, Chris C.; Febbraro, Michael; Flaska, Marek; Pozzi, Sara A.; Becchetti, F. D.

    2016-08-01

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy, commonly eschewed as an ill-posed inverse problem, may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters such as plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have a better-conditioned response matrix than standard hydrogen-based scintillators; a novel data-discretization scheme is proposed that removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty verification challenges.
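
    As a baseline for the unfolding step discussed in both records above (not the re-parameterization technique the authors introduce), the spectrum can be recovered from the response-matrix model counts = R @ phi with a non-negativity-constrained least-squares solve; the response matrix and spectrum below are synthetic placeholders.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(1)
      n_chan, n_bins = 64, 16
      R = rng.random((n_chan, n_bins))              # placeholder response matrix
      phi_true = np.exp(-np.linspace(0.0, 3.0, n_bins))
      counts = R @ phi_true + rng.normal(0.0, 0.05, n_chan)  # noisy measurement

      # Non-negative least squares: min ||R phi - counts|| subject to phi >= 0
      phi_est, _ = nnls(R, counts)
      print(np.round(phi_est, 2))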

  7. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations, including varying probe dimensions and orientations along with complex probe geometries. This will also enable the creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models correctly captured the probe and conductor interactions and accurately calculated the change in impedance in several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library, giving experimenters easy access to powerful parameter-based eddy current models for other project applications.

  8. NEW DEVELOPMENTS AND APPLICATIONS OF SUPERHEATED EMULSIONS: WARHEAD VERIFICATION AND SPECIAL NUCLEAR MATERIAL INTERDICTION.

    PubMed

    d'Errico, F; Chierici, A; Gattas-Sethi, M; Philippe, S; Goldston, R; Glaser, A

    2018-04-25

    In recent years, neutron detection with superheated emulsions has received renewed attention thanks to improved detector manufacturing and read-out techniques, and thanks to successful applications in warhead verification and special nuclear material (SNM) interdiction. Detectors are currently manufactured with methods allowing high uniformity of the drop sizes, which in turn allows the use of optical read-out techniques based on dynamic light scattering. Small detector cartridges arranged in 2D matrices are developed for the verification of a declared warhead without revealing its design. For this application, the enabling features of the emulsions are that bubbles formed at different times cannot be distinguished from each other, while the passive nature of the detectors avoids the susceptibility to electronic snooping and tampering. Large modules of emulsions are developed to detect the presence of shielded special nuclear materials hidden in cargo containers 'interrogated' with high energy X-rays. In this case, the enabling features of the emulsions are photon discrimination, a neutron detection threshold close to 3 MeV and a rate-insensitive read-out.

  9. The Development of Models for Carbon Dioxide Reduction Technologies for Spacecraft Air Revitalization

    NASA Technical Reports Server (NTRS)

    Swickrath, Michael J.; Anderson, Molly

    2012-01-01

    Through the respiration process, humans consume oxygen (O2) while producing carbon dioxide (CO2) and water (H2O) as byproducts. For long-term space exploration, the CO2 concentration in the atmosphere must be managed to prevent hypercapnia. Moreover, CO2 can be used as a source of oxygen through chemical reduction, serving to minimize the amount of oxygen required at launch. Reduction can be achieved through a number of techniques; NASA is currently exploring the Sabatier reaction, the Bosch reaction, and co-electrolysis of CO2 and H2O for this process. Proof-of-concept experiments and prototype units for all three processes have proven capable of returning useful commodities for space exploration. All three techniques have demonstrated the capacity to reduce CO2 in the laboratory, yet there is interest in understanding how each would perform at a system level within a spacecraft. Consequently, there is an impetus to develop predictive models for these processes that can be readily rescaled and integrated into larger system models. Such analysis tools provide the ability to evaluate each technique on a comparable basis with respect to processing rates. This manuscript describes the current models for the carbon dioxide reduction processes under parallel development efforts. Comparison to experimental data is provided, where available, for verification purposes.
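
    As a back-of-the-envelope illustration of the bookkeeping such system models perform, the Sabatier stoichiometry (CO2 + 4 H2 -> CH4 + 2 H2O) fixes the water yield per kilogram of CO2 reduced, and electrolysis of that water (2 H2O -> 2 H2 + O2) bounds the recoverable oxygen. The sketch below is a mass balance only, not a rate or system model.

      M_CO2, M_H2O, M_O2 = 44.01, 18.02, 32.00   # molar masses [g/mol]

      def sabatier_water_yield(kg_co2):
          """kg of product water per kg of CO2 fully reduced (2 mol H2O per mol CO2)."""
          return 2.0 * (1000.0 * kg_co2 / M_CO2) * M_H2O / 1000.0

      def o2_from_water(kg_h2o):
          """kg of O2 from electrolyzing the product water (0.5 mol O2 per mol H2O)."""
          return 0.5 * (1000.0 * kg_h2o / M_H2O) * M_O2 / 1000.0

      w = sabatier_water_yield(1.0)     # ~0.82 kg H2O per kg CO2
      print(w, o2_from_water(w))        # ~0.73 kg O2 per kg CO2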

  10. Analysis of transitional separation bubbles on infinite swept wings

    NASA Technical Reports Server (NTRS)

    Davis, R. L.; Carter, J. E.

    1986-01-01

    A previously developed two-dimensional local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation), has been extended for the calculation of transitional separation bubbles over infinite swept wings. As part of this effort, Roberts' empirical correlation, which is interpreted as a separated-flow empirical extension of Mack's stability theory for attached flows, has been incorporated into the ALESEP procedure for the prediction of the transition location within the separation bubble. In addition, the viscous procedure used in the ALESEP technique has been modified to allow for wall suction. A series of two-dimensional calculations is presented as a verification of the prediction capability of the interaction technique with Roberts' transition model. Numerical tests have shown that this two-dimensional natural transition correlation may also be applied to transitional separation bubbles over infinite swept wings. Results of the interaction procedure are compared with Horton's detailed experimental data for separated flow over a swept plate, which demonstrates the accuracy of the present technique. Wall suction has been applied to a similar interaction calculation to demonstrate its effect on the separation bubble. The principal conclusion of this paper is that the prediction of transitional separation bubbles over two-dimensional or infinite swept geometries is now possible using the present interacting boundary layer approach.

  11. Constitutive modeling of superalloy single crystals with verification testing

    NASA Technical Reports Server (NTRS)

    Jordan, Eric; Walker, Kevin P.

    1985-01-01

    The goal is the development of constitutive equations to describe the elevated-temperature stress-strain behavior of single-crystal turbine blade alloys. The program includes both the development of a suitable model and verification of the model through elevated-temperature torsion testing. A constitutive model is derived from postulated constitutive behavior on individual crystallographic slip systems; the behavior of the entire single crystal is then arrived at by summing the slip on all the operative crystallographic slip systems. This type of formulation has a number of important advantages, including the prediction of orientation dependence and the ability to represent the constitutive behavior directly in the terms metallurgists use to describe the micromechanisms. Here, the model is briefly described, followed by the experimental setup and some experimental findings to date.

  12. Experimental verification of cleavage characteristic stress vs grain size

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, W.; Li, D.; Yao, M.

    Instead of the accepted cleavage fracture stress σ_f proposed by Knott et al., a new parameter S_co, named the "cleavage characteristic stress," has recently been recommended to characterize the microscopic resistance to cleavage fracture. By definition, S_co is the fracture stress at the brittle/ductile transition temperature of steels in plain tension, below which the yield strength approximately equals the true fracture stress, combined with an abrupt curtailment of ductility. By considering a single-grain microcrack arrested at a boundary, Huang and Yao set up an expression of S_co as a function of grain size. The present work was arranged to provide an experimental verification of S_co vs grain size.

  13. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  14. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y.

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, the CT image voxel grid (with voxels of a few cubic millimeters) was decoupled from the dose bin grid, which has micrometer dimensions in the direction transverse to the microbeams. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two different media. Optimization of the simulation parameters and the use of VR techniques saved a significant amount of computation time. Finally, parallelization of the simulations improved the calculation time even further, bringing it down to 1 day for a typical irradiation case envisaged in the forthcoming clinical trials in MRT. An example of an MRT treatment in a dog's head is presented, showing the performance of the calculation engine. Conclusions: The development of the first MC-based calculation engine for the future TPS devoted to MRT has been accomplished. This will constitute an essential tool for the future clinical trials on pets at the ESRF. The MC engine is able to calculate dose distributions in micrometer-sized bins in complex voxelized CT structures in a reasonable amount of time. Minimization of the computation time through several approaches has led to timings that are adequate for pet radiotherapy at synchrotron facilities. The next step will consist in its integration into a user-friendly graphical front-end.

  15. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy.

    PubMed

    Martinez-Rovira, I; Sempau, J; Prezado, Y

    2012-05-01

    Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, the CT image voxel grid (with voxels of a few cubic millimeters) was decoupled from the dose bin grid, which has micrometer dimensions in the direction transverse to the microbeams. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two different media. Optimization of the simulation parameters and the use of VR techniques saved a significant amount of computation time. Finally, parallelization of the simulations improved the calculation time even further, bringing it down to 1 day for a typical irradiation case envisaged in the forthcoming clinical trials in MRT. An example of an MRT treatment in a dog's head is presented, showing the performance of the calculation engine. The development of the first MC-based calculation engine for the future TPS devoted to MRT has been accomplished. This will constitute an essential tool for the future clinical trials on pets at the ESRF. The MC engine is able to calculate dose distributions in micrometer-sized bins in complex voxelized CT structures in a reasonable amount of time. Minimization of the computation time through several approaches has led to timings that are adequate for pet radiotherapy at synchrotron facilities. The next step will consist in its integration into a user-friendly graphical front-end.
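
    The voxel-grid decoupling described in both versions of this record amounts to scoring dose in fine bins while looking up material data from the coarse CT grid; a minimal sketch of that index mapping is given below, with illustrative grid spacings.

      import numpy as np

      ct_voxel_mm = np.array([1.0, 1.0, 1.0])    # coarse CT voxel spacing
      bin_mm = np.array([0.005, 1.0, 1.0])       # micrometer-wide bins across the microbeams

      def ct_index_of_bin(bin_ijk):
          """Index of the CT voxel containing the center of a given dose bin."""
          center_mm = (np.asarray(bin_ijk) + 0.5) * bin_mm
          return np.floor(center_mm / ct_voxel_mm).astype(int)

      print(ct_index_of_bin([399, 10, 10]))      # -> [ 1 10 10]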

  16. Polarized light microscopy for 3-dimensional mapping of collagen fiber architecture in ocular tissues.

    PubMed

    Yang, Bin; Jan, Ning-Jiun; Brazile, Bryn; Voorhees, Andrew; Lathrop, Kira L; Sigal, Ian A

    2018-04-06

    Collagen fibers play a central role in normal eye mechanics and pathology. In ocular tissues, collagen fibers exhibit a complex 3-dimensional (3D) fiber orientation, with both in-plane (IP) and out-of-plane (OP) orientations. Imaging techniques traditionally applied to the study of ocular tissues quantify only IP fiber orientation, providing little information on OP fiber orientation. Accurate description of the complex 3D fiber microstructures of the eye requires quantifying full 3D fiber orientation. Herein, we present 3dPLM, a technique based on polarized light microscopy developed to quantify both IP and OP collagen fiber orientations of ocular tissues. The performance of 3dPLM was examined by simulation and by experimental verification and validation. The experiments demonstrated excellent agreement between extracted and true 3D fiber orientations. Both IP and OP fiber orientations can be extracted from the sclera and the cornea, providing previously unavailable quantitative 3D measures and insight into the tissue microarchitecture. Together, the results demonstrate that 3dPLM is a powerful imaging technique for the analysis of ocular tissues.

  17. Doppler-broadened NICE-OHMS beyond the cavity-limited weak absorption condition - II: Experimental verification

    NASA Astrophysics Data System (ADS)

    Hausmaninger, Thomas; Silander, Isak; Ma, Weiguang; Axner, Ove

    2016-01-01

    Doppler-broadened (Db) noise-immune cavity-enhanced optical heterodyne molecular spectrometry (NICE-OHMS) is normally described by an expression, here termed the conventional (CONV) description, that is restricted to the conventional cavity-limited weak absorption condition (CCLWA), i.e. when the single-pass absorbance is significantly smaller than the empty-cavity losses, α0L ≪ π/F. To describe NICE-OHMS signals beyond this limit, two simplified extended descriptions (termed the extended locking and extended transmission description, ELET, and the extended locking and full transmission description, ELFT), assumed to be valid under the relaxed cavity-limited weak absorption condition (RCLWA), i.e. when α0L < π/F, and a full description (denoted FULL), presumed to be valid also when the α0L < π/F condition does not hold, have recently been derived in an accompanying work (Ma W, et al. Doppler-broadened NICE-OHMS beyond the cavity-limited weak absorption condition - I. Theoretical description. J Quant Spectrosc Radiat Transfer, 2015, http://dx.doi.org/10.1016/j.jqsrt.2015.09.007). The present work constitutes an experimental verification and assessment of the validity of these descriptions, performed in the Doppler limit for a set of Fα0L/π values (up to 3.5); it is shown under which conditions the various descriptions are valid. It is concluded that for samples with Fα0L/π up to 0.01, all descriptions replicate the data well. The CONV description is adequate and provides accurate assessments of the signal strength (and thereby the analyte concentration) for Fα0L/π up to around 0.1, while the ELET description is accurate for Fα0L/π up to around 0.3. The ELFT description mimics the Db NICE-OHMS signal well for Fα0L/π up to around unity, while the FULL description is adequate for all Fα0L/π values investigated. Access to these descriptions both considerably increases the dynamic range of the technique and facilitates calibration using certified reference gases, which significantly broadens the applicability of the Db NICE-OHMS technique.
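
    The validity limits quoted above can be collected into a small lookup, sketched below: given the cavity finesse F and single-pass absorbance α0L, the quantity Fα0L/π determines which descriptions remain accurate (thresholds taken directly from the abstract; the example values of F and α0L are illustrative).

      import math

      def valid_descriptions(F, a0L):
          x = F * a0L / math.pi
          valid = ["FULL"]              # adequate for all values investigated (up to 3.5)
          if x <= 1.0:
              valid.append("ELFT")
          if x <= 0.3:
              valid.append("ELET")
          if x <= 0.1:
              valid.append("CONV")
          return x, valid

      print(valid_descriptions(F=5000, a0L=1e-4))   # x ~ 0.16 -> FULL, ELFT, ELET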

  18. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baba, H; Tachibana, H; Kamima, T

    2015-06-15

    Purpose: AAPM TG 114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutions to show its feasibility. Methods: 384 IMRT plans for prostate and head and neck (HN) sites were collected from the institutions, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step-and-shoot (S&S) and sliding window (SW). All of the institutions used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement between doses computed in patient CT images using the TPS and using the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT, and the tolerance level would then be within 5%.
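
    The pass/fail comparison at the heart of such independent verification reduces to a percent-difference check against a tolerance; the sketch below uses the 5% level suggested in the conclusion, with illustrative dose values.

      def percent_difference(independent_dose, tps_dose):
          """Signed percent difference of the independent calculation from the TPS dose."""
          return 100.0 * (independent_dose - tps_dose) / tps_dose

      def within_tolerance(independent_dose, tps_dose, tol_percent=5.0):
          return abs(percent_difference(independent_dose, tps_dose)) <= tol_percent

      print(within_tolerance(1.93, 2.00))   # -3.5% difference -> True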

  19. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    NASA Astrophysics Data System (ADS)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully exploit the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge collimators) that image the prompt gammas (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of multi-slit and knife-edge collimators for non-invasive proton beam range verification. PG imaging was simulated with a validated GATE/GEANT4 Monte Carlo code modeling the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10^8 protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. Results indicated that both collimator systems achieve reasonable accuracy and respond well to phantom shifts. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system achieves higher detection efficiency, leading to a smaller deviation in the predicted range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of both systems, especially the multi-slit system; a neutron reduction technique is therefore needed to improve the accuracy of range verification in proton therapy.
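
    A continuous 3-line-segment fit of the kind used for the falloff region can be sketched with a standard least-squares routine; the parameterization below (two breakpoints, three slopes) is a generic choice and not necessarily the one used in the paper, and the profile is synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def three_segments(x, b1, b2, y0, s1, s2, s3):
          """Continuous piecewise-linear curve; assumes b1 < b2 throughout the fit."""
          y = y0 + s1 * np.minimum(x, b1)
          y += s2 * np.clip(x - b1, 0.0, b2 - b1)
          y += s3 * np.maximum(x - b2, 0.0)
          return y

      x = np.linspace(0.0, 60.0, 120)    # depth [mm]
      truth = three_segments(x, 25.0, 40.0, 1.0, 0.0, -0.04, 0.0)
      y = truth + np.random.default_rng(2).normal(0.0, 0.01, x.size)

      popt, _ = curve_fit(three_segments, x, y, p0=(20.0, 45.0, 1.0, 0.0, -0.03, 0.0))
      print(f"falloff segment: {popt[0]:.1f} mm to {popt[1]:.1f} mm")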

  20. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification. 2; Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction (XRD) patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high-resolution transmission electron microscopy (HRTEM), and the stacking error distribution was directly determined. The HRTEM results compare well with those deduced by matching the XRD data to simulated spectra, confirming the hypothesis that the XRD data indicate not only the presence and density of stacking errors but can also yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions; this relation appears similar to the one developed by others to explain the formation of the corresponding polytypes.

  1. Intensity- and energy-modulated electron radiotherapy by means of an xMLC for head and neck shallow tumors

    NASA Astrophysics Data System (ADS)

    Salguero, Francisco Javier; Arráns, Rafael; Atriana Palma, Bianey; Leal, Antonio

    2010-03-01

    The purpose of this paper is to assess the feasibility of delivering intensity- and energy-modulated electron radiation therapy (MERT) with a photon multileaf collimator (xMLC) and to evaluate the improvements obtained for shallow head and neck (HN) tumors. Four HN patient cases covering different clinical situations were planned with MERT, using an in-house treatment planning system based on Monte Carlo (MC) dose calculation. The cases included one oronasal, two parotid, and one middle ear tumor. The resulting dose-volume histograms were compared with those obtained from the conventional photon and electron treatment techniques in our clinic, which included IMRT, electron beams, and mixed beams, most of them using a fixed-thickness bolus. Experimental verification was performed with plane-parallel ionization chambers for absolute dose verification, and with a PTW ionization chamber array and radiochromic film for relative dosimetry. An MC-based treatment planning approach for targets with volumes that are compromised both in depth and laterally has thus been validated, and a quality assurance protocol for individual MERT plans was launched. Relative MC dose distributions showed close agreement with film measurements, and absolute ion chamber dose measurements performed at a reference point agreed with MC calculations within 2% in all cases. Clinically acceptable PTV coverage and organ-at-risk sparing were achieved with the proposed MERT approach. MERT treatment plans, based on the delivery of intensity-modulated electron beams using the xMLC, demonstrated comparable or improved PTV dose homogeneity with significantly lower dose to normal tissues for superficial head and neck tumors. The clinical implementation of this technique will offer a viable alternative for the treatment of shallow head and neck tumors.

  2. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, at both the conceptual and the implementation level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  3. Apprendre a apprendre: L'autocorrection (Learning to Learn: Self-Correction).

    ERIC Educational Resources Information Center

    Noir, Pascal

    1996-01-01

    A technique used in an advanced French writing class to encourage student self-correction is described. The technique focused on correction of verbs and their tenses; reduction of repetition; appropriate use of "on" and "nous;" and verification of possessive adjectives, negatives, personal pronouns, spelling, and punctuation.

  4. Inexpensive Eddy-Current Standard

    NASA Technical Reports Server (NTRS)

    Berry, Robert F., Jr.

    1985-01-01

    Radial crack replicas serve as evaluation standards. Technique entails intimately joining two pieces of appropriate aluminum alloy stock and centering drilled hole through and along interface. Bore surface of hole presents two vertical stock interface lines 180 degrees apart. These lines serve as radial crack defect replicas during eddy-current technique setup and verification.

  5. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
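
    Demonstrating second-order convergence against a manufactured solution reduces to computing the observed order p = ln(e_coarse/e_fine) / ln(h_coarse/h_fine) from errors at successive grid spacings. A minimal sketch of that calculation (the error values are invented):

```python
import math

# Minimal sketch: observed order of convergence from errors measured
# against a manufactured (exact) solution at successive grid spacings.
# Error values below are invented for illustration.

def observed_order(e_coarse, e_fine, h_coarse, h_fine):
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

errors = [(0.10, 4.0e-3), (0.05, 1.0e-3), (0.025, 2.5e-4)]  # (h, error)
for (h1, e1), (h2, e2) in zip(errors, errors[1:]):
    print(f"h {h1} -> {h2}: p = {observed_order(e1, e2, h1, h2):.2f}")
# Each halving of h divides the error by ~4, i.e. p ~ 2 (second order).
```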

  6. CFD modeling and experimental verification of a single-stage coaxial Stirling-type pulse tube cryocooler without either double-inlet or multi-bypass operating at 30-35 K using mixed stainless steel mesh regenerator matrices

    NASA Astrophysics Data System (ADS)

    Dang, Haizheng; Zhao, Yibo

    2016-09-01

    This paper presents the CFD modeling and experimental verification of a single-stage inertance tube coaxial Stirling-type pulse tube cryocooler operating at 30-35 K using mixed stainless steel mesh regenerator matrices without either double-inlet or multi-bypass. A two-dimensional axisymmetric CFD model with the thermal non-equilibrium mode is developed to simulate the internal process, and the underlying mechanism of significantly reducing the regenerator losses with mixed matrices is discussed in detail based on the given six cases. The modeling also indicates that the combination of the given different mesh segments can be optimized to achieve the highest cooling efficiency or the largest exergy ratio; verification experiments were then conducted, in which satisfactory agreement between simulated and measured results was observed. The experiments achieve a no-load temperature of 27.2 K and a cooling power of 0.78 W at 35 K, or 0.29 W at 30 K, with an input electric power of 220 W and a reject temperature of 300 K.

  7. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and alignment, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both the dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single-impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple-impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
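
    The abstract does not spell out how the two similarities are combined; one plausible reading is a simple score-level fusion, sketched below with an assumed weighted-sum rule and an assumed decision threshold (neither is from the paper):

```python
# Minimal sketch of fusing two similarity scores into one match score,
# in the spirit of combining "inside" and "outside" similarity; the
# weighted-sum rule and threshold are assumptions, not from the paper.

def match_score(inside_sim: float, outside_sim: float, w: float = 0.5) -> float:
    """Convex combination of the two similarities (both in [0, 1])."""
    return w * inside_sim + (1.0 - w) * outside_sim

def verify(inside_sim, outside_sim, threshold=0.6):
    return match_score(inside_sim, outside_sim) >= threshold

print(verify(0.82, 0.55))  # True: fused score 0.685 clears the threshold
```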

  8. National Centers for Environmental Prediction

    Science.gov Websites


  9. Structural Margins Assessment Approach

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.

    1988-01-01

    A general approach to the structural design and verification used to determine the structural margins of the space vehicle elements under Marshall Space Flight Center (MSFC) management is described. The Space Shuttle results and organization will be used as illustrations for the techniques discussed. Also given are (1) the system analyses performed or to be performed by MSFC and (2) the element analyses performed by MSFC and its contractors. Analysis approaches and their verification will be addressed. The Shuttle procedures are general in nature and apply to space vehicles other than the Shuttle.

  10. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBSs using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.

  11. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
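
    As a concrete illustration of the MC/DC metric mentioned above: each condition in a decision must be shown to independently affect the outcome. The sketch below brute-forces such independence pairs for an invented decision, a and (b or c):

```python
from itertools import product

# Minimal sketch of the MC/DC idea for the decision `a and (b or c)`:
# for each condition, find a pair of test vectors that differ only in
# that condition and flip the decision outcome (an independence pair).

def decision(a: bool, b: bool, c: bool) -> bool:
    return a and (b or c)

names = ["a", "b", "c"]
for i, name in enumerate(names):
    for vec in product([False, True], repeat=3):
        flipped = list(vec)
        flipped[i] = not flipped[i]
        if decision(*vec) != decision(*flipped):
            print(f"{name}: {vec} / {tuple(flipped)} "
                  f"-> {decision(*vec)} / {decision(*flipped)}")
            break  # one independence pair per condition suffices
```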

  12. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    NASA Astrophysics Data System (ADS)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc.) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC-based SBRT treatment planning in the routine clinical setting.
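
    Agreement criteria of the 2%/1 mm kind are conventionally evaluated with a gamma index: at each measured point, the minimum over the calculated distribution of a combined dose-difference/distance metric, with gamma <= 1 counting as agreement. A compact 1D sketch (the profiles and the wiring below are invented, not taken from the iPlan verification):

```python
import numpy as np

# Minimal 1D gamma-index sketch for dose-comparison criteria of the
# "2% / 1 mm" kind: gamma <= 1 means the measured point agrees with
# the calculation. Profiles and parameter values below are invented.

def gamma_1d(x_m, d_m, x_c, d_c, dose_tol=0.02, dist_tol_mm=1.0):
    """Gamma value at each measured point (global dose normalization)."""
    d_max = d_c.max()
    gammas = np.empty_like(d_m)
    for i, (xm, dm) in enumerate(zip(x_m, d_m)):
        dose_term = ((d_c - dm) / (dose_tol * d_max)) ** 2
        dist_term = ((x_c - xm) / dist_tol_mm) ** 2
        gammas[i] = np.sqrt((dose_term + dist_term).min())
    return gammas

x = np.linspace(0.0, 50.0, 201)                 # positions in mm
calc = np.exp(-((x - 25.0) / 8.0) ** 2)         # calculated profile
meas = 1.01 * np.exp(-((x - 25.2) / 8.0) ** 2)  # slightly shifted/scaled
g = gamma_1d(x, meas, x, calc)
print(f"gamma pass rate: {100.0 * (g <= 1.0).mean():.1f}%")
```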

  13. Turbine Engine Testing.

    DTIC Science & Technology

    1981-01-01

    per-rev, ring weighting factor, etc.) and with compression system design. A detailed description of the SAE methodology is provided in Ref. 1... offers insights into the practical application of experimental aeromechanical procedures and establishes the process of valid design assessment, avoiding... considerations given to the total engine system. Design Verification in the Experimental Laboratory: certain key parameters are influencing the design of modern

  14. Resistivity Correction Factor for the Four-Probe Method: Experiment I

    NASA Astrophysics Data System (ADS)

    Yamashita, Masato; Yamaguchi, Shoji; Enjoji, Hideo

    1988-05-01

    Experimental verification of the theoretically derived resistivity correction factor (RCF) is presented. Resistivity and sheet resistance measurements by the four-probe method are made on three samples: isotropic graphite, ITO film and Au film. It is indicated that the RCF can correct the apparent variations of experimental data to yield reasonable resistivities and sheet resistances.
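
    For orientation, in the ideal limit of a thin, laterally infinite sample the collinear four-probe correction factor is pi/ln 2, approximately 4.532, so the sheet resistance is Rs = (pi/ln 2)·V/I and the resistivity is rho = Rs·t; the RCF studied in the paper generalizes this to finite geometries. A sketch of the ideal-case arithmetic (the sample reading below is hypothetical):

```python
import math

# Minimal sketch of four-probe arithmetic in the ideal thin, laterally
# infinite limit, where the correction factor is pi/ln 2 ~= 4.532.
# Finite-geometry RCFs (the subject of the paper) would replace it.

RCF_IDEAL = math.pi / math.log(2.0)

def sheet_resistance(v_volts: float, i_amps: float, rcf: float = RCF_IDEAL):
    return rcf * v_volts / i_amps          # ohms per square

def resistivity(v_volts, i_amps, thickness_m, rcf=RCF_IDEAL):
    return sheet_resistance(v_volts, i_amps, rcf) * thickness_m  # ohm-m

# Hypothetical thin-film reading: 1.0 mA forced, 2.2 mV measured, 150 nm thick.
print(f"Rs  = {sheet_resistance(2.2e-3, 1.0e-3):.1f} ohm/sq")
print(f"rho = {resistivity(2.2e-3, 1.0e-3, 150e-9):.3e} ohm*m")
```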

  15. ARC-1980-AC80-0512-2

    NASA Image and Video Library

    1980-06-05

    N-231 High Reynolds Number Channel Facility (an example of a versatile wind tunnel). Tunnel 1 is a blowdown facility that utilizes interchangeable test sections and nozzles. The facility provides experimental support for fluid mechanics research, including experimental verification of aerodynamic computer codes and boundary-layer and airfoil studies that require high Reynolds number simulation. (Tunnel 1)

  16. National Centers for Environmental Prediction

    Science.gov Websites


  17. Method for fabrication and verification of conjugated nanoparticle-antibody tuning elements for multiplexed electrochemical biosensors.

    PubMed

    La Belle, Jeffrey T; Fairchild, Aaron; Demirok, Ugur K; Verma, Aman

    2013-05-15

    There is a critical need for more accurate, highly sensitive and specific assays for disease diagnosis and management. A novel, multiplexed, single sensor using a rapid and label-free electrochemical impedance spectroscopy tuning method has been developed. The key challenge when monitoring multiple targets is frequency overlap. Here we describe methods to circumvent the overlap and to tune by use of nanoparticles (NPs), and we discuss the various fabrication and characterization methods used to develop this technique. First, sensors were fabricated using printed circuit board (PCB) technology, and nickel and gold layers were electrodeposited onto the PCB sensors. An off-chip conjugation of gold NPs to molecular recognition elements (with a verification technique) is described as well. A standard covalent immobilization of the molecular recognition elements is also discussed, with quality control techniques. Finally, the use and verification of sensitivity and specificity are also presented. By use of gold NPs of various sizes, we have demonstrated the feasibility of the approach, with little loss of sensitivity and specificity in the molecular recognition of inflammatory markers as "model" targets for our tuning system. By selection of other sized NPs or NPs of various materials, the tuning effect can be further exploited. The novel platform technology developed could be utilized in critical care, clinical management, and at-home health and disease management.

  18. Methode de Calcul du Flutter en Presence de jeu Mecanique et Verification Experimentale (Flutter Analysis Method in Presence of Mechanical Play and Experimental Verification)

    DTIC Science & Technology

    2000-05-01

    Flexible Aircraft Control", held in Ottawa, Canada, 18-20 October 1999, and published in RTO MP-36. INTRODUCTION 2. PRINCIPLES OF THE CALCULATION METHOD... instrumented with a set of 17 strain-gauge bridges, 20 accelerometers, 5... the pressures on the control surface and the hinge moment are overestimated... 130 mm). The calculation-test correlations of strain-gauge responses under 12 static loadings... The calculation, like the tests, allows...

  19. Bounded Parametric Model Checking for Elementary Net Systems

    NASA Astrophysics Data System (ADS)

    Knapik, Michał; Szreter, Maciej; Penczek, Wojciech

    Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has been applied so far to the verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL, a parametric extension of the existential version of CTL. To this end we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
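
    BMC unrolls the transition relation up to a bound k and asks a SAT solver whether a witness of length at most k exists. The toy sketch below conveys the bound-k idea with explicit states instead of a SAT encoding (the net-like system and the property are invented):

```python
# Toy sketch of the bounded-model-checking idea: unroll a transition
# relation up to bound k and search for a witness to an existential
# property (here, reaching a marking where place p2 holds). A real BMC
# encodes this unrolling as a SAT formula; states here are explicit.

def successors(state):
    """Transitions of a tiny elementary-net-like system (invented)."""
    p0, p1, p2 = state
    nxt = []
    if p0:           # t0: consumes p0, produces p1
        nxt.append((False, True, p2))
    if p1:           # t1: consumes p1, produces p2
        nxt.append((p0, False, True))
    return nxt

def bmc_reach(init, goal, k):
    frontier = [init]
    for step in range(k + 1):
        if any(goal(s) for s in frontier):
            return step                      # witness of length <= k
        frontier = [t for s in frontier for t in successors(s)]
    return None                              # no witness within bound k

print(bmc_reach((True, False, False), lambda s: s[2], k=3))  # 2
```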

  20. Verification of an IGBT Fusing Switch for Over-current Protection of the SNS HVCM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benwell, Andrew; Kemp, Mark; Burkhart, Craig

    2010-06-11

    An IGBT-based over-current protection system has been developed to detect faults and limit the damage caused by faults in high voltage converter modulators. During normal operation, an IGBT enables energy to be transferred from storage capacitors to an H-bridge. When a fault occurs, the over-current protection system detects the fault, limits the fault current and opens the IGBT to isolate the remaining stored energy from the fault. This paper presents an experimental verification of the over-current protection system under applicable conditions.
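
    Stripped to its control flow, the protection logic monitors the switch current and, on crossing a trip threshold, commands the IGBT open to isolate the stored energy. A schematic sketch of that logic (the threshold and waveform values are invented):

```python
# Schematic sketch of over-current protection logic: trip when the
# sensed current exceeds a threshold, then open the switch to isolate
# the stored energy from the fault. Threshold and samples are invented.

TRIP_AMPS = 400.0

def protect(current_samples):
    """Return the sample index at which the IGBT is commanded open."""
    for i, amps in enumerate(current_samples):
        if amps > TRIP_AMPS:
            # fault detected: command the gate driver to open the IGBT
            return i
    return None  # no fault seen in this record

waveform = [120.0, 135.0, 150.0, 480.0, 900.0]  # amps, fault at index 3
print(protect(waveform))  # 3
```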

  1. Low level vapor verification of monomethyl hydrazine

    NASA Technical Reports Server (NTRS)

    Mehta, Narinder

    1990-01-01

    The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.

  2. A method for modifying two-dimensional adaptive wind-tunnel walls including analytical and experimental verification

    NASA Technical Reports Server (NTRS)

    Everhart, J. L.

    1983-01-01

    The theoretical development of a simple and consistent method for removing the interference in adaptive-wall wind tunnels is reported. A Cauchy integral formulation of the velocities in an imaginary infinite extension of the real wind-tunnel flow is obtained and evaluated on a closed contour dividing the real and imaginary flow. The contour consists of the upper and lower effective wind-tunnel walls (wall plus boundary-layer displacement thickness) and upstream and downstream boundaries perpendicular to the axial tunnel flow. The resulting integral expressions for the streamwise and normal perturbation velocities on the contour are integrated by assuming a linear variation of the velocities between data-measurement stations along the contour. In an iterative process, the velocity components calculated on the upper and lower boundaries are then used to correct the shape of the wall to remove the interference. Convergence of the technique is shown numerically for the cases of a circular cylinder and a lifting and nonlifting NACA 0012 airfoil in incompressible flow. Experimental convergence at a transonic Mach number is demonstrated by using an NACA 0012 airfoil at zero lift.

  3. Proximity enhanced quantum spin Hall state in graphene

    DOE PAGES

    Kou, Liangzhi; Hu, Feiming; Yan, Binghai; ...

    2015-02-23

    Graphene is the first model system of two-dimensional topological insulator (TI), also known as quantum spin Hall (QSH) insulator. The QSH effect in graphene, however, has eluded direct experimental detection because of its extremely small energy gap due to the weak spin-orbit coupling. Here we predict by ab initio calculations a giant (three orders of magnitude) proximity-induced enhancement of the TI energy gap in the graphene layer that is sandwiched between thin slabs of Sb2Te3 (or MoTe2). This gap (1.5 meV) is accessible by existing experimental techniques, and it can be further enhanced by tuning the interlayer distance via compression. We reveal by a tight-binding study that the QSH state in graphene is driven by the Kane-Mele interaction in competition with Kekulé deformation and symmetry breaking. As a result, the present work identifies a new family of graphene-based TIs with an observable and controllable bulk energy gap in the graphene layer, thus opening a new avenue for direct verification and exploration of the long-sought QSH effect in graphene.
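
    For context, in the Kane-Mele model the intrinsic spin-orbit coupling lambda_SO opens a bulk gap at the K and K' points; the commonly quoted magnitude (our addition, not stated in the paper) is:

```latex
% Kane--Mele model: the intrinsic spin--orbit coupling \lambda_{SO}
% opens a topological bulk gap at the K and K' points of graphene:
\[
  E_{\mathrm{gap}} \;=\; 6\sqrt{3}\,\lambda_{SO},
\]
% so a proximity-induced enhancement of the effective \lambda_{SO}
% translates directly into an enhanced bulk gap of the kind reported.
```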

  4. On the possible high +Gz tolerance increase by multimodal brain imaging controlled respiratory AFTE biofeedback training exercise

    NASA Astrophysics Data System (ADS)

    Smietanowski, Maciej; Achimowicz, Jerzy; Lorenc, Kamil; Nowicki, Grzegorz; Zalewska, Ewa; Truszczynski, Olaf

    The experimental data related to Valsalva maneuvers and short-term voluntary apnea available in the literature suggest that an increase in cerebral blood flow and a reduction in peripheral flow may be expected if specific AFTE-based respiratory training is performed. The authors verified this hypothesis by studying the relations between EEG-measured subject relaxation combined with voluntary apnea using multimodal brain imaging techniques (EEG mapping, Neuroscan and fMRI) in a group of healthy volunteers. The SPM analysis of respiratory-related changes in cortical and subcortical BOLD signal has partially confirmed the hypothesis. The mechanism of this effect is probably based on the simultaneous increase in blood pressure and total peripheral resistance. However, the question remains open for further experimental verification whether AFTE can be treated as a tool to increase pilot/astronaut situation awareness in the extreme environments typical of aerospace operations, where highly variable accelerations due to liftoff, rapid maneuvers, and vibrations can be expected in the critical phases of the mission.

  5. Circular dichroism of magnetically induced transitions for D2 lines of alkali atoms

    NASA Astrophysics Data System (ADS)

    Tonoyan, A.; Sargsyan, A.; Klinger, E.; Hakhumyan, G.; Leroy, C.; Auzinsh, M.; Papoyan, A.; Sarkisyan, D.

    2018-03-01

    In this letter we study magnetic circular dichroism in alkali atoms exhibiting asymmetric behaviour of magnetically induced transitions. A magnetic field B parallel to k induces transitions between ΔF = ±2 hyperfine levels of alkali atoms, and in the range of ~0.1-3 kG the intensities of these transitions experience significant enhancement. We have inferred a general rule applicable to the D2 lines of all alkali atoms: the transition intensity enhancement is around four times larger for σ+ than for σ- excitation for ΔF = +2, whereas it is several hundred thousand times larger for σ- than for σ+ polarization for ΔF = -2. This asymmetric behaviour results in circular dichroism. For experimental verification we employed half-wavelength-thick atomic vapor nanocells using a derivative of the selective reflection technique, which provides a sub-Doppler spectroscopic linewidth (~50 MHz). The presented theoretical curves describe the experimental results well. This effect can find applications particularly in parity violation experiments.

  6. Modeling and characterization of through-the-thickness properties of 3D woven composites

    NASA Technical Reports Server (NTRS)

    Hartranft, Dru; Pravizi-Majidi, Azar; Chou, Tsu-Wei

    1995-01-01

    The through-the-thickness properties of three-dimensionally (3D) woven carbon/epoxy composites have been studied. The investigation aimed at the evaluation and development of test methodologies for property characterization in the thickness direction and at establishing the role of fiber architecture. The fiber architectures studied were layer-to-layer Angle Interlock and through-the-thickness Orthogonal; an Orthogonal woven preform with surface pile was also designed and manufactured for the fabrication of tensile test coupons with integrated grips. All the preforms were infiltrated by the resin transfer molding technique. The microstructures of the composites were characterized along the warp and fill (weft) directions to determine the degree of yarn undulation, yarn cross-sectional shapes, and microstructural dimensions. These parameters were correlated to the fiber architecture. Specimens were designed and tested for the direct measurement of the through-the-thickness tensile, compressive and shear properties of the composites. Design optimization was conducted through the analysis of the stress fields within the specimen coupled with experimental verification. The experimentally derived elastic properties in the thickness direction compared well with analytical predictions obtained from a volume averaging model.

  7. Damage detection in composite panels based on mode-converted Lamb waves sensed using 3D laser scanning vibrometer

    NASA Astrophysics Data System (ADS)

    Pieczonka, Łukasz; Ambroziński, Łukasz; Staszewski, Wiesław J.; Barnoncel, David; Pérès, Patrick

    2017-12-01

    This paper introduces a damage identification approach based on guided ultrasonic waves and 3D laser Doppler vibrometry. The method exploits the fact that the symmetric and antisymmetric Lamb wave modes differ in the amplitude of their in-plane and out-of-plane vibrations. Moreover, the modes also differ in group velocity, so they are normally well separated in time. Within a given time window, both modes can occur simultaneously only close to the wave source or to a defect that leads to mode conversion. By comparing the in-plane and out-of-plane wave vector components, the detection of mode conversion is possible, allowing for superior and reliable damage detection. Experimental verification of the proposed damage identification procedure is performed on fuel tank elements of Reusable Launch Vehicles designed for space exploration. Lamb waves are excited using low-profile, surface-bonded piezoceramic transducers, and a 3D scanning laser Doppler vibrometer is used to characterize the Lamb wave propagation field. The paper presents the theoretical background of the proposed damage identification technique as well as the experimental arrangements and results.

  8. Combined measurement system for double shield tunnel boring machine guidance based on optical and visual methods.

    PubMed

    Lin, Jiarui; Gao, Kai; Gao, Yang; Wang, Zheng

    2017-10-01

    In order to detect the position of the cutting shield at the head of a double shield tunnel boring machine (TBM) during the excavation, this paper develops a combined measurement system which is mainly composed of several optical feature points, a monocular vision sensor, a laser target sensor, and a total station. The different elements of the combined system are mounted on the TBM in suitable sequence, and the position of the cutting shield in the reference total station frame is determined by coordinate transformations. Subsequently, the structure of the feature points and matching technique for them are expounded, the position measurement method based on monocular vision is presented, and the calibration methods for the unknown relationships among different parts of the system are proposed. Finally, a set of experimental platforms to simulate the double shield TBM is established, and accuracy verification experiments are conducted. Experimental results show that the mean deviation of the system is 6.8 mm, which satisfies the requirements of double shield TBM guidance.
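
    Determining the cutting shield's position "by coordinate transformations" amounts to composing the rigid-body transforms measured along the chain (total station to laser target to vision sensor to shield). A numpy sketch of chaining homogeneous transforms (all rotation and translation values are invented placeholders):

```python
import numpy as np

# Minimal sketch of chaining 4x4 homogeneous transforms, as in a
# total-station -> laser-target -> vision-sensor measurement chain.
# The rotation/translation values below are invented placeholders.

def homogeneous(rotation_3x3, translation_3):
    t = np.eye(4)
    t[:3, :3] = rotation_3x3
    t[:3, 3] = translation_3
    return t

# Station-to-target, target-to-sensor, sensor-to-shield transforms.
T_st = homogeneous(np.eye(3), [10.0, 0.0, 1.5])
T_ts = homogeneous(np.eye(3), [0.8, 0.1, 0.0])
T_ss = homogeneous(np.eye(3), [2.5, 0.0, -0.3])

T_total = T_st @ T_ts @ T_ss          # shield pose in the station frame
shield_origin = T_total @ np.array([0.0, 0.0, 0.0, 1.0])
print(shield_origin[:3])              # [13.3  0.1  1.2]
```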

  9. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  10. Sawja: Static Analysis Workshop for Java

    NASA Astrophysics Data System (ADS)

    Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine

    Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.

  11. Review of stochastic hybrid systems with applications in biological systems modeling and analysis.

    PubMed

    Li, Xiangfang; Omotere, Oluwaseyi; Qian, Lijun; Dougherty, Edward R

    2017-12-01

    Stochastic hybrid systems (SHS) have attracted a lot of research interest in recent years. In this paper, we review some of the recent applications of SHS to biological systems modeling and analysis. Due to the nature of molecular interactions, many biological processes can be conveniently described as a mixture of continuous and discrete phenomena employing SHS models. With the advancement of SHS theory, it is expected that insights can be obtained about biological processes such as drug effects on gene regulation. Furthermore, combined with advanced experimental methods, in silico simulations using SHS modeling techniques can be carried out for massive and rapid verification or falsification of biological hypotheses. The hope is to substitute for costly and time-consuming in vitro or in vivo experiments, or to provide guidance for those experiments and generate better hypotheses.

  12. Joint Estimation of Source Range and Depth Using a Bottom-Deployed Vertical Line Array in Deep Water

    PubMed Central

    Li, Hui; Yang, Kunde; Duan, Rui; Lei, Zhixiong

    2017-01-01

    This paper presents a joint estimation method of source range and depth using a bottom-deployed vertical line array (VLA). The method utilizes the information on the arrival angle of direct (D) path in space domain and the interference characteristic of D and surface-reflected (SR) paths in frequency domain. The former is related to a ray tracing technique to backpropagate the rays and produces an ambiguity surface of source range. The latter utilizes Lloyd’s mirror principle to obtain an ambiguity surface of source depth. The acoustic transmission duct is the well-known reliable acoustic path (RAP). The ambiguity surface of the combined estimation is a dimensionless ad hoc function. Numerical efficiency and experimental verification show that the proposed method is a good candidate for initial coarse estimation of source position. PMID:28590442
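
    The depth cue comes from Lloyd's mirror interference between the D and SR paths. In the standard small-angle approximation (our addition, with z_s the source depth, z_r the receiver depth, r the range and c the sound speed):

```latex
% Lloyd's mirror interference of the direct (D) and surface-reflected
% (SR) paths, for r >> z_s, z_r (standard small-angle approximation):
\[
  \delta \;\approx\; \frac{2 z_s z_r}{r}, \qquad
  \Delta f \;=\; \frac{c}{\delta} \;\approx\; \frac{c\, r}{2 z_s z_r},
\]
% so with the range r constrained by the D-path arrival angle, the
% fringe spacing \Delta f measured in frequency yields the depth z_s.
```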

  13. Improved hybrid isolator with maglev actuator integrated in air spring for active-passive isolation of ship machinery vibration

    NASA Astrophysics Data System (ADS)

    Li, Yan; He, Lin; Shuai, Chang-geng; Wang, Chun-yu

    2017-10-01

    A hybrid isolator consisting of a maglev actuator and an air spring is proposed and developed for application in active-passive vibration isolation systems for ship machinery. The dynamic characteristics of this hybrid isolator are analyzed and tested. The stability and adaptability of this hybrid isolator to shock and swing in the marine environment are improved by a compliant gap protection technique and a disengageable suspended structure. The functions of these new engineering designs are proved by analytical verification and experimental validation of the designed stiffness of such a hybrid isolator, and also by shock adaptability testing of the hybrid isolator. Finally, such hybrid isolators are installed in an engineering mounting loaded with a 200-kW ship diesel generator, and the broadband and low-frequency sinusoidal isolation performance is tested.

  14. The MiniCLEAN Dark Matter Experiment

    NASA Astrophysics Data System (ADS)

    Schnee, Richard; Deap/Clean Collaboration

    2011-10-01

    The MiniCLEAN dark matter experiment exploits a single-phase liquid argon (LAr) detector, instrumented with photomultiplier tubes submerged in the cryogen with nearly 4π coverage of a 500 kg target (150 kg fiducial) mass. The high light yield and large difference in singlet/triplet scintillation time profiles in LAr provide an effective defense against radioactive backgrounds through pulse-shape discrimination and event position reconstruction. The detector is also designed for a liquid neon target which, in the event of a positive signal in LAr, will enable an independent verification of backgrounds and provide a unique test of the expected A^2 dependence of the WIMP interaction rate. The conceptually simple design can be scaled to target masses in excess of 10 tons in a relatively straightforward and economic manner. The experimental technique and current status of MiniCLEAN will be summarized.

  15. Dosimetric verification for primary focal hypermetabolism of nasopharyngeal carcinoma patients treated with dynamic intensity-modulated radiation therapy.

    PubMed

    Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen

    2012-01-01

    To assess the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients, dosimetric verification was performed before treatment. Eleven patients with stage III~IVA nasopharyngeal carcinoma treated with functional image-guided IMRT were chosen, and absolute and relative dosimetric verification were performed with a Varian 23EX linear accelerator, an ionization chamber, the 2D ion chamber array (2DICA) of the I'mRT Matrixx, and an IBA detachable phantom. Contour drawing and treatment planning were based on different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean error in the region of interest was 2.39% ± 0.66 using a 0.6 cc ionization chamber. Using the DTA method, the average relative dose agreement within our protocol (3%, 3 mm) was 87.64% at 300 MU/min over all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of the I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.

  16. A new verification film system for routine quality control of radiation fields: Kodak EC-L.

    PubMed

    Hermann, A; Bratengeier, K; Priske, A; Flentje, M

    2000-06-01

    The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verification we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% "moderate" and 14% "poor", whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged "good". The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved compared with standard portal films. The films could be read more accurately, and the detection of set-up deviations was facilitated.

  17. Crack Detection with Lamb Wave Wavenumber Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

    In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including vx, vy and vz components) in the time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) the two-dimensional Fourier transform (2D-FT), which transforms the time-space wavefield into a frequency-wavenumber representation while losing the spatial information; (ii) the short-space 2D-FT, which obtains the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; (iii) local wavenumber analysis, which provides the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis to 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
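
    The first technique, the 2D-FT of a time-space wavefield slice into frequency-wavenumber space, is essentially a single numpy call. A sketch on a synthetic single-mode wave (the sampling and wave parameters are invented):

```python
import numpy as np

# Minimal sketch of frequency-wavenumber analysis: a 2D FFT maps a
# time-space wavefield v(t, x) into (f, k) space, where propagating
# modes appear as ridges. The synthetic wave parameters are invented.

dt, dx = 1e-6, 1e-3                    # 1 us sampling, 1 mm spacing
t = np.arange(512) * dt
x = np.arange(256) * dx
f0, k0 = 200e3, 400.0                  # 200 kHz wave with k = 400 rad/m
field = np.sin(2 * np.pi * f0 * t[:, None] - k0 * x[None, :])

spectrum = np.fft.fftshift(np.abs(np.fft.fft2(field)))
freqs = np.fft.fftshift(np.fft.fftfreq(len(t), dt))       # Hz
wavenums = np.fft.fftshift(np.fft.fftfreq(len(x), dx))    # cycles/m

i, j = np.unravel_index(spectrum.argmax(), spectrum.shape)
print(f"peak at f ~ {abs(freqs[i]) / 1e3:.0f} kHz, "
      f"k ~ {abs(2 * np.pi * wavenums[j]):.0f} rad/m")  # ~200 kHz, ~400 rad/m
```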

  18. Verification of Experimental Techniques for Flow Surface Determination

    NASA Technical Reports Server (NTRS)

    Lissenden, Cliff J.; Lerch, Bradley A.; Ellis, John R.; Robinson, David N.

    1996-01-01

    The concept of a yield surface is central to the mathematical formulation of a classical plasticity theory. However, at elevated temperatures, material response can be highly time-dependent, which is beyond the realm of classical plasticity. Viscoplastic theories have been developed for just such conditions. In viscoplastic theories, the flow law is given in terms of inelastic strain rate rather than the inelastic strain increment used in time-independent plasticity. Thus, surfaces of constant inelastic strain rate or flow surfaces are to viscoplastic theories what yield surfaces are to classical plasticity. The purpose of the work reported herein was to validate experimental procedures for determining flow surfaces at elevated temperatures. Since experimental procedures for determining yield surfaces in axial/torsional stress space are well established, they were employed -- except inelastic strain rates were used rather than total inelastic strains. In yield-surface determinations, the use of small-offset definitions of yield minimizes the change of material state and allows multiple loadings to be applied to a single specimen. The key to the experiments reported here was precise, decoupled measurement of axial and torsional strain. With this requirement in mind, the performance of a high-temperature multi-axial extensometer was evaluated by comparing its results with strain gauge results at room temperature. Both the extensometer and strain gauges gave nearly identical yield surfaces (both initial and subsequent) for type 316 stainless steel (316 SS). The extensometer also successfully determined flow surfaces for 316 SS at 650 C. Furthermore, to judge the applicability of the technique for composite materials, yield surfaces were determined for unidirectional tungsten/Kanthal (Fe-Cr-Al).

  19. Experimental setup for the measurement of induction motor cage currents

    NASA Astrophysics Data System (ADS)

    Bottauscio, Oriano; Chiampi, Mario; Donadio, Lorenzo; Zucca, Mauro

    2005-04-01

    An experimental setup for the measurement of the currents flowing in the rotor bars of induction motors during synchronous no-load tests is described in the paper. Experimental verification of the high-frequency phenomena in the rotor cage is fundamental for deep insight into additional loss estimation by numerical methods. The attention is mainly focused on the analysis and design of the transducers developed for the cage current measurement.

  20. Collapse of Experimental Colloidal Aging using Record Dynamics

    NASA Astrophysics Data System (ADS)

    Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter

    The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts that dynamics in an aging system progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system. MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions that RD makes about the underlying statistics of these rare events has been elusive. Here the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.

  1. Verification bias: an under-recognized source of error in assessing the efficacy of MRI of the menisci.

    PubMed

    Richardson, Michael L; Petscavage, Jonelle M

    2011-11-01

    The sensitivity and specificity of magnetic resonance imaging (MRI) for the diagnosis of meniscal tears have been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating the sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients and, hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is under-recognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy.
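
    When verification depends only on the MRI result, the bias can be corrected in the style of Begg and Greenes, re-weighting the verified disease rates by the full counts of positive and negative MRIs. A sketch of that correction (all counts below are invented):

```python
# Begg-Greenes-style correction for verification bias, assuming
# verification depends only on the MRI result. Counts are invented:
# n_pos/n_neg = all MRIs, v_* = surgically verified, d_* = tears found.

def corrected_sens_spec(n_pos, v_pos, d_pos, n_neg, v_neg, d_neg):
    p_d_pos = d_pos / v_pos            # P(tear | MRI positive)
    p_d_neg = d_neg / v_neg            # P(tear | MRI negative)
    tp = p_d_pos * n_pos               # re-weighted true positives
    fn = p_d_neg * n_neg
    fp = (1 - p_d_pos) * n_pos
    tn = (1 - p_d_neg) * n_neg
    return tp / (tp + fn), tn / (tn + fp)

# 300 positive MRIs (200 verified, 180 tears); 700 negative MRIs (100
# verified, 10 tears). The verified-only sensitivity would be ~0.95;
# the corrected estimate below is noticeably lower.
sens, spec = corrected_sens_spec(300, 200, 180, 700, 100, 10)
print(f"corrected sensitivity {sens:.2f}, specificity {spec:.2f}")
```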

  2. Compressive sensing using optimized sensing matrix for face verification

    NASA Astrophysics Data System (ADS)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics offers a solution to problems that arise with password-based data access; for example, passwords can be forgotten, and recalling many different passwords is difficult. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics is chosen for its low-cost implementation and because it generates quite accurate results for user identification. The face verification system adopted in this research uses the compressive sensing (CS) technique, which aims to reduce the dimension size as well as encrypt the data, in the form of a facial test image represented as sparse signals. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are used for comparison in this face verification research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. The system accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, and 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using the optimized sensing matrix.
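
    For reference, OMP greedily selects the dictionary atom most correlated with the current residual and re-solves a least-squares problem on the selected support. A minimal numpy sketch (the problem sizes and sensing matrix are invented and unrelated to the paper's data):

```python
import numpy as np

# Minimal Orthogonal Matching Pursuit (OMP) sketch: recover a sparse
# signal s from compressive measurements y = phi @ s by greedily
# selecting the atom most correlated with the residual. Sizes invented.

def omp(phi, y, sparsity):
    residual, support = y.copy(), []
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(phi.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(phi[:, support], y, rcond=None)
        residual = y - phi[:, support] @ coeffs
    s_hat = np.zeros(phi.shape[1])
    s_hat[support] = coeffs
    return s_hat

rng = np.random.default_rng(0)
phi = rng.standard_normal((32, 128)) / np.sqrt(32)   # sensing matrix
s = np.zeros(128)
s[[5, 40, 90]] = [1.0, -0.7, 0.4]                    # 3-sparse signal
y = phi @ s
print(np.flatnonzero(omp(phi, y, 3)))                # typically [ 5 40 90]
```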

  3. Real-Time Impact Visualization Inspection of Aerospace Composite Structures with Distributed Sensors.

    PubMed

    Si, Liang; Baier, Horst

    2015-07-08

    For the future design of smart aerospace structures, the development and application of a reliable, real-time and automatic monitoring and diagnostic technique is essential. Thus, with distributed sensor networks, a real-time automatic structural health monitoring (SHM) technique is designed and investigated to monitor and predict the locations and force magnitudes of unforeseen foreign impacts on composite structures and to estimate in real time mode the structural state when impacts occur. The proposed smart impact visualization inspection (IVI) technique mainly consists of five functional modules, which are the signal data preprocessing (SDP), the forward model generator (FMG), the impact positioning calculator (IPC), the inverse model operator (IMO) and structural state estimator (SSE). With regard to the verification of the practicality of the proposed IVI technique, various structure configurations are considered, which are a normal CFRP panel and another CFRP panel with "orange peel" surfaces and a cutout hole. Additionally, since robustness against several background disturbances is also an essential criterion for practical engineering demands, investigations and experimental tests are carried out under random vibration interfering noise (RVIN) conditions. The accuracy of the predictions for unknown impact events on composite structures using the IVI technique is validated under various structure configurations and under changing environmental conditions. The evaluated errors all fall well within a satisfactory limit range. Furthermore, it is concluded that the IVI technique is applicable for impact monitoring, diagnosis and assessment of aerospace composite structures in complex practical engineering environments.

  4. Real-Time Impact Visualization Inspection of Aerospace Composite Structures with Distributed Sensors

    PubMed Central

    Si, Liang; Baier, Horst

    2015-01-01

    PMID:26184196

  5. A novel method for measurement of MR fluid sedimentation and its experimental verification

    NASA Astrophysics Data System (ADS)

    Roupec, J.; Berka, P.; Mazůrek, I.; Strecker, Z.; Kubík, M.; Macháček, O.; Taheri Andani, M.

    2017-10-01

    This article presents a novel sedimentation measurement technique based on quantifying the changes in magnetic flux density when magnetorheological fluid (MRF) passes through the air gap of a magnetic circuit. The sedimented portion of the MRF has an increased iron content; accordingly, it displays a higher magnetic conductivity than the unsedimented region, which contains fewer iron particles. The data analysis and evaluation methodology is elaborated along with an example set of measurements, which are compared against visual observations and available data in the literature. Experiments indicate that, unlike the existing methods, the new technique is able to accurately generate the complete curves of the sedimentation profile in long-term sedimentation. The proposed method is capable of successfully detecting the area with the tightest particle configuration near the bottom (the ‘cake’ layer). It also addresses the issue of an unclear boundary developing between the carrier fluid and the sediment (mudline) during an accelerated sedimentation process, improves the sensitivity of sedimentation detection, and accurately measures the changes in particle concentration with high resolution.

  6. Bidding Agents That Perpetrate Auction Fraud

    NASA Astrophysics Data System (ADS)

    Trevathan, Jarrod; McCabe, Alan; Read, Wayne

    This paper presents a software bidding agent that inserts fake bids on the seller's behalf to inflate an auction's price. This behaviour is referred to as shill bidding. Shill bidding is strictly prohibited by online auctioneers, as it defrauds unsuspecting buyers by forcing them to pay more for the item. The malicious bidding agent was constructed to aid in developing shill detection techniques. We have previously documented a simple shill bidding agent that incrementally increases the auction price until it reaches the desired profit target, or it becomes too risky to continue bidding. This paper presents an adaptive shill bidding agent which when used over a series of auctions with substitutable items, can revise its strategy based on bidding behaviour in past auctions. The adaptive agent applies a novel prediction technique referred to as the Extremum Consistency (EC) algorithm, to determine the optimal price to aspire for. The EC algorithm has successfully been used in handwritten signature verification for determining the maximum and minimum values in an input stream. The agent's ability to inflate the price has been tested in a simulated marketplace and experimental results are presented.

  7. A collection of articles on S/X-band experiment zero delay ranging tests, volume 1

    NASA Technical Reports Server (NTRS)

    Otoshi, T. Y. (Editor)

    1975-01-01

    Articles are presented which are concerned with the development of special test equipment and a dual-frequency zero delay device (ZDD) that were required for range tests and the measurement of ground station delays for the Mariner-Venus-Mercury 1973 S/X-band experiment. Test data obtained at DSS 14 after installation of the ZDD on the 64-m antenna are given. It is shown that large variations of range were observed as a function of antenna elevation angle and were sensitive to antenna location. A ranging calibration configuration that was subsequently developed and a technique for determining the appropriate Z-correction are described. Zero delay test data at DSS 14 during the Mariner 10 Venus-Mercury-Encounter periods (1974 days 12-150) are presented. The theoretical analysis and experimental verifications are included of the effects of multipath and effects of discontinuities on range delay measurements. A movable subreflector technique and the multipath theory were used to isolate principal multipath errors on the 64-m antenna and to enable a more accurate determination of the actual ground station range delay.

  8. Time-domain imaging

    NASA Technical Reports Server (NTRS)

    Tolliver, C. L.

    1989-01-01

    The quest for the highest-resolution microwave imaging has been the primary motivation for recent developments in time-domain techniques and underlies the principle of time-domain imaging. With present technology, fast time-varying signals can now be measured and recorded both in magnitude and in phase, which has enhanced our ability to extract relevant details concerning the scattering object. In the past, the inference of object geometry or shape from scattered signals received substantial attention in radar technology, and various scattering theories were proposed to develop analytical solutions to this problem. Furthermore, random inversion, frequency-swept holography, and synthetic radar imaging were also advanced; these have two things in common: (1) the physical-optics far-field approximation and (2) the utilization of channels as an extra physical dimension. Despite the inherent vectorial nature of electromagnetic waves, these scalar treatments have brought forth some promising results in practice, with notable examples in subsurface and structure sounding. The development of time-domain techniques is studied through theoretical aspects as well as experimental verification. The use of time-domain imaging for space robotic vision applications has been suggested.

  9. Land use/land cover mapping (1:25000) of Taiwan, Republic of China by automated multispectral interpretation of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Sung, Q. C.; Miller, L. D.

    1977-01-01

    Because of the difficulty of retrospectively collecting representative ground control data, three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought. The computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band (image) ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was made based upon comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique can achieve classification results in closer agreement with the airphoto estimates than stepwise discriminant analysis.
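
    To make the preprocessing and classification steps concrete, the following sketch (Python) shows generic band ratioing and a Gaussian maximum-likelihood classifier of the kind referred to above. It is a schematic reconstruction under stated assumptions, not the study's code, and all names are invented.

        import numpy as np

        def band_ratio(band_a, band_b, eps=1e-6):
            """Spectral ratioing: dividing two bands suppresses illumination and
            topographic effects that scale both, emphasizing cover differences."""
            return band_a.astype(float) / (band_b.astype(float) + eps)

        def ml_classify(pixels, class_stats):
            """Gaussian maximum likelihood: assign each pixel (row of `pixels`) to
            the class whose multivariate normal, fit on training sets, scores
            highest. `class_stats` is a list of (mean, covariance) pairs."""
            scores = []
            for mean, cov in class_stats:
                d = pixels - mean
                inv = np.linalg.inv(cov)
                maha = np.einsum('ij,jk,ik->i', d, inv, d)  # Mahalanobis terms
                scores.append(-0.5 * (maha + np.log(np.linalg.det(cov))))
            return np.argmax(np.stack(scores), axis=0)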

  10. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials are presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  11. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. Is It Code Imperfection or 'Garbage In, Garbage Out'? Outline of Experiences from a Comprehensive ADR Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation become nonlinear, so numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and to check for rigorous discretization of the PDEs and implementation of initial/boundary conditions. In the computational PDE context, verification is not a well-defined procedure with a clear path. Verification tests should therefore be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. It is thus well known that code verification remains an art, in which innovative methods and case-based tricks are very common. This study presents full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections, and we convey our experiences in finding several errors that were not detectable with routine verification techniques. The suite includes hundreds of unit tests and system tests, increasing gradually in complexity from simple tests to the most sophisticated. Appropriate verification metrics are defined for the required capabilities of the solver: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. We thereby provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. Testing starts from a simple case of unidirectional advection, proceeds to bidirectional advection and tidal flow, and builds up to nonlinear cases; tests are designed to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests, which compare the observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Péclet and Damköhler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs, and detailed discussions of the capabilities of these code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into an ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
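
    The mesh-convergence metric mentioned above is simple to state in code. The sketch below (Python) computes the observed order of accuracy from errors on successively refined meshes and applies Richardson extrapolation; the error values are invented for illustration.

        import numpy as np

        def observed_order(errors, ratio=2.0):
            """Observed order of accuracy from errors on meshes refined by `ratio`:
            p = log(e_coarse / e_fine) / log(ratio), one value per refinement."""
            e = np.asarray(errors, dtype=float)
            return np.log(e[:-1] / e[1:]) / np.log(ratio)

        def richardson(f_fine, f_coarse, p, ratio=2.0):
            """Richardson extrapolation: improved estimate of the exact solution
            from two mesh levels, given the (formal or observed) order p."""
            return f_fine + (f_fine - f_coarse) / (ratio**p - 1.0)

        # Hypothetical L2 errors on meshes h, h/2, h/4 for a formally 2nd-order scheme
        print(observed_order([4.1e-3, 1.0e-3, 2.6e-4]))  # values near 2 pass the check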

  14. Optical fiber sensors measurement system and special fibers improvement

    NASA Astrophysics Data System (ADS)

    Jelinek, Michal; Hrabina, Jan; Hola, Miroslava; Hucl, Vaclav; Cizek, Martin; Rerucha, Simon; Lazar, Josef; Mikel, Bretislav

    2017-06-01

    We present a method for improving the measurement accuracy of optical frequency spectra measurements based on tunable optical filters. The optical filter was used during the design and realization of a measurement system for the inspection of fiber Bragg gratings. The system incorporates a reference block for the compensation of environmental influences, an interferometric verification subsystem, and PC-based control software implemented in LabView. Preliminary experimental verification of the measurement principle and the measurement system functionality was carried out on a test rig with a specially prepared concrete console at UJV Řež. The presented system is the laboratory version of a special system for measuring nuclear power plant containment shape deformation, which was installed in the Temelin power plant during the last year. On the basis of this research, we began preparing other optical fiber sensors for nuclear power plant measurements. These sensors will be based on microstructured and polarization-maintaining optical fibers. We started developing new methods and techniques for splicing and shaping optical fibers. We are able to make optical tapers ranging from ultra-short, so-called adiabatic tapers around 400 um in length up to long tapers up to 6 millimeters. We developed new techniques for splicing standard single-mode (SM) and multimode (MM) optical fibers, and for splicing optical fibers with different diameters, in the wavelength range from 532 to 1550 nm. Alongside these, we prepared techniques for splicing and shaping special optical fibers, such as polarization-maintaining (PM) fiber or hollow-core photonic crystal fiber (PCF), and their cross-splicing methods, with a focus on minimizing back-reflection and attenuation. Splicing of special optical fibers, especially PCF fibers, with standard telecommunication and other SM fibers can be done with our developed techniques; the splicing process has to be adjusted for any new optical fibers and new fiber combinations, and splicing of the same types of fibers from different manufacturers can be adjusted through several tested changes in the splicing process. We are able to splice PCF with standard telecommunication fiber with attenuation up to 2 dB, and the method is also presented. These new optical fiber splicing techniques and methods were developed with a view to using the fibers in further research and development in the fields of optical fiber sensors, laser frequency stabilization, and fiber-based laser interferometry. Especially for laser frequency stabilization, we developed and present new techniques for closing microstructured fibers with gases inside.

  15. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
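
    In essence, the transform is a single matrix multiply. Below is a minimal sketch (Python), assuming a seeded Gaussian matrix plays the role of a user-specific token; the dimensions are illustrative, not the paper's exact parameters, and the paper's vector translation refinement is omitted.

        import numpy as np

        def rp_template(feature_vec, out_dim, seed):
            """Random projection y = R x / sqrt(out_dim), with R i.i.d. Gaussian
            drawn from a seeded generator. Pairwise distances, and hence match
            scores, are approximately preserved (Johnson-Lindenstrauss), while x
            cannot be uniquely recovered when out_dim < len(x)."""
            rng = np.random.default_rng(seed)
            R = rng.normal(size=(out_dim, feature_vec.size))
            return R @ feature_vec / np.sqrt(out_dim)

        x = np.random.default_rng(0).normal(size=512)  # stand-in face feature vector
        y1 = rp_template(x, out_dim=128, seed=7)       # enrolled template
        y2 = rp_template(x, out_dim=128, seed=8)       # re-issued ('changed') template
        # The paper additionally proposes a vector translation step to further
        # improve changeability; that refinement is not shown here.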

  16. Character Recognition Method by Time-Frequency Analyses Using Writing Pressure

    NASA Astrophysics Data System (ADS)

    Watanabe, Tatsuhito; Katsura, Seiichiro

    With the development of information and communication technology, personal verification is becoming more and more important, and in the coming ubiquitous society, terminals handling personal information will require personal verification technology. The signature is one such verification method; however, the number of characters in a signature is limited, so a false signature is easily produced, and personal identification from handwriting alone is difficult. This paper proposes a “haptic pen” that extracts the writing pressure, and presents a character recognition method based on time-frequency analysis. Although the shapes of characters written by different writers are similar, the differences appear in the time-frequency domain. As a result, the proposed character recognition can be used for more exact personal identification. The experimental results showed the viability of the proposed method.
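
    As a sketch of the time-frequency analysis step, the Python below computes a magnitude short-time Fourier transform of a synthetic writing-pressure trace using plain numpy. The sampling rate and signal are invented; the paper's exact window and feature choices are not given in the abstract.

        import numpy as np

        def stft_mag(signal, win=256, hop=128):
            """Magnitude STFT of a 1-D signal: Hann-windowed frames, FFT per
            frame. Rows are time frames, columns are frequency bins."""
            w = np.hanning(win)
            frames = [signal[i:i + win] * w
                      for i in range(0, len(signal) - win + 1, hop)]
            return np.abs(np.fft.rfft(np.asarray(frames), axis=1))

        fs = 1000                                  # assumed pen sampling rate, Hz
        t = np.arange(0, 2.0, 1 / fs)
        pressure = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 12 * t)
        S = stft_mag(pressure)                     # writer-specific pressure rhythm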

  17. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems was compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use, with the BrainSCAN system requiring higher-resolution data than Helios. This difference was found to affect the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window, where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.

  18. The multi-scattering model for calculations of positron spatial distribution in the multilayer stacks, useful for conventional positron measurements

    NASA Astrophysics Data System (ADS)

    Dryzek, Jerzy; Siemek, Krzysztof

    2013-08-01

    The spatial distribution of positrons emitted from radioactive isotopes into stacks or layered samples is the subject of this report. It was found that Monte Carlo (MC) simulations using the GEANT4 code are not able to correctly describe the experimental positron fractions in stacks. A mathematical model is proposed for calculating the implantation profile and the positron fractions in the separate layers or foils that make up a stack. The model takes into account only two processes: positron absorption and backscattering at interfaces. The mathematical formulas were implemented in a computer program called LYS-1 (layers profile analysis). The theoretical predictions of the model are in good agreement with the results of MC simulations for a semi-infinite sample. Experimental verification of the model was performed on symmetrical and non-symmetrical stacks of different foils, and good agreement between the experimental and calculated positron fractions in the components of a stack was achieved. The experimental implantation profile obtained using the depth scanning of positron implantation technique is also very well described by the theoretical profile obtained within the proposed model. The LYS-1 program additionally allows us to calculate the fraction of positrons that annihilate in the source, which can be useful in positron spectroscopy.

  19. Mirrors design, analysis and manufacturing of the 550mm Korsch telescope experimental model

    NASA Astrophysics Data System (ADS)

    Huang, Po-Hsuan; Huang, Yi-Kai; Ling, Jer

    2017-08-01

    In 2015, NSPO (National Space Organization) began to develop the sub-meter-resolution optical remote sensing instrument of the next-generation optical remote sensing satellite, a follow-on to FORMOSAT-5. Upgraded from the Ritchey-Chrétien Cassegrain telescope optical system of FORMOSAT-5, the experimental optical system of the advanced optical remote sensing instrument is an off-axis Korsch telescope consisting of five mirrors: (1) M1, a primary mirror with a 550 mm diameter aperture; (2) M2, a secondary mirror; (3) M3, an off-axis tertiary mirror; and (4) FM1 and FM2, two folding flat mirrors, included to limit the overall volume, reduce the mass, and provide a long focal length and excellent optical performance. By the end of 2015, we had implemented several important techniques, including optical system design, opto-mechanical design, FEM and multi-physics analysis, and an optimization system, in order to conduct a preliminary study and begin to develop and design these large-size lightweight aspheric mirrors and flat mirrors. The lightweight mirror design and opto-mechanical interface design were completed in August 2016. We then manufactured and polished these experimental model mirrors in Taiwan; all five mirrors were completed as spherical surfaces by the end of 2016. Aspheric figuring, assembly tests, and optical alignment verification of these mirrors will be done with a Korsch telescope experimental structure model in 2018.

  20. Experimental verification of layout physical verification of silicon photonics

    NASA Astrophysics Data System (ADS)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, PV is challenging in the case of PICs, as it otherwise requires exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and waveguide bends in SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the huge time cost of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurement results for the fabricated devices have been compared to the derived models and show very good agreement; the matching can reach 100% by calibrating certain parameters in the model.
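
    For illustration only, here is a sketch in the spirit of such compact models, not the authors' published expressions: textbook coupled-mode theory gives the cross-port power of a directional coupler from a coupling length Lc, which a closed-form fit would express in terms of layout parameters such as the gap. All coefficients below are placeholders.

        import numpy as np

        def coupled_power(length_um, coupling_length_um):
            """Standard coupled-mode estimate of the power transferred to the
            cross port: P_cross = sin^2(pi * L / (2 * Lc))."""
            return np.sin(np.pi * length_um / (2.0 * coupling_length_um)) ** 2

        def coupling_length(gap_um, a=5.0, b=7.5):
            """Placeholder exponential fit Lc = a * exp(b * gap); the paper's
            actual closed-form model and coefficients would go here."""
            return a * np.exp(b * gap_um)

        print(coupled_power(10.0, coupling_length(0.2)))  # cross-port power fraction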

  1. Odysseus's Sailboat Dilemma

    ERIC Educational Resources Information Center

    Wong, Siu-ling; Chun, Ka-wai Cecilia; Mak, Se-yuen

    2007-01-01

    We describe a physics investigation project inspired by one of the adventures of Odysseus in Homer's "Odyssey." The investigation uses the laws of mechanics, vector algebra and a simple way to construct a fan-and-sail-cart for experimental verification.

  2. Resistivity Correction Factor for the Four-Probe Method: Experiment III

    NASA Astrophysics Data System (ADS)

    Yamashita, Masato; Nishii, Toshifumi; Kurihara, Hiroshi; Enjoji, Hideo; Iwata, Atsushi

    1990-04-01

    Experimental verification of the theoretically derived resistivity correction factor F is presented. Factor F is applied to a system consisting of a rectangular parallelepiped sample and a square four-probe array. Resistivity and sheet resistance measurements are made on isotropic graphites and crystalline ITO films. Factor F corrects experimental data and leads to reasonable resistivity and sheet resistance.
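
    To show how such a correction factor enters the arithmetic, the sketch below (Python) uses the textbook result for a square four-probe array on an infinite thin sheet, Rs = (2*pi/ln 2)(V/I), with F rescaling it for a finite rectangular parallelepiped. The F value and readings are invented; the actual factors come from the paper's derivation.

        import numpy as np

        def sheet_resistance(voltage, current, F=1.0):
            """Square four-probe array: ideal infinite-sheet formula times the
            finite-size correction factor F (tabulated in the paper)."""
            return F * (2.0 * np.pi / np.log(2.0)) * voltage / current

        def resistivity(voltage, current, thickness_m, F=1.0):
            """Bulk resistivity of a thin rectangular sample: rho = Rs * t."""
            return sheet_resistance(voltage, current, F) * thickness_m

        print(resistivity(1.2e-3, 1.0e-3, 0.5e-3, F=0.92))  # ohm-m, illustrative F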

  3. Full-Scale Experimental Verification of Soft-Story-Only Retrofits of Wood-Frame Buildings using Hybrid Testing

    Treesearch

    Elaina Jennings; John W. van de Lindt; Ershad Ziaei; Pouria Bahmani; Sangki Park; Xiaoyun Shao; Weichiang Pang; Douglas Rammer; Gary Mochizuki; Mikhail Gershfeld

    2015-01-01

    The FEMA P-807 Guidelines were developed for retrofitting soft-story wood-frame buildings based on existing data, and the method had not been verified through full-scale experimental testing. This article presents two different retrofit designs based directly on the FEMA P-807 Guidelines that were examined at several different seismic intensity levels. The...

  4. Experimental Verification of the Individual Energy Dependencies of the Partial L-Shell Photoionization Cross Sections of Pd and Mo

    NASA Astrophysics Data System (ADS)

    Hönicke, Philipp; Kolbe, Michael; Müller, Matthias; Mantler, Michael; Krämer, Markus; Beckhoff, Burkhard

    2014-10-01

    An experimental method for the verification of the individually different energy dependencies of the L1-, L2-, and L3-subshell photoionization cross sections is described. The results obtained for Pd and Mo are well in line with theory regarding both energy dependency and absolute values, and confirm the theoretically calculated cross sections by Scofield from the early 1970s and, partially, more recent data by Trzhaskovskaya, Nefedov, and Yarzhemsky. The data also demonstrate the questionability of quantitative x-ray spectroscopic results based on the widely used fixed-jump-ratio approximation of cross sections with energy-independent ratios. The experiments were carried out employing the radiometrically calibrated instrumentation of the Physikalisch-Technische Bundesanstalt at the electron storage ring BESSY II in Berlin; the obtained fluorescence intensities are thereby calibrated at an absolute level in reference to the International System of Units. Experimentally determined fixed fluorescence line ratios for each subshell are used for a reliable deconvolution of overlapping fluorescence lines. The relevant fundamental parameters of Mo and Pd are also determined experimentally in order to calculate the subshell photoionization cross sections independently of any database.

  5. Collinear cluster tri-partition: Kinematics constraints and stability of collinearity

    NASA Astrophysics Data System (ADS)

    Holmvall, P.; Köster, U.; Heinz, A.; Nilsson, T.

    2017-01-01

    Background: A new mode of nuclear fission, called collinear cluster tri-partition (CCT), has been proposed by the FOBOS Collaboration, suggesting that three heavy fission fragments can be emitted perfectly collinearly in low-energy fission. This claim is based on indirect observations via missing-energy events using the 2v2E method. The proposed CCT would be an extraordinary new aspect of nuclear fission, and it is surprising that it escaped observation for so long given the relatively high reported yield of roughly 0.5% relative to binary fission. These claims call for independent verification with a different experimental technique. Purpose: Verification experiments based on direct observation of CCT fragments with fission-fragment spectrometers require guidance with respect to the allowed kinetic-energy range, which we present in this paper. Furthermore, we discuss corresponding model calculations which, if CCT is found in such verification experiments, could indicate how the breakups proceed. Since CCT refers to collinear emission, we also study the intrinsic stability of collinearity. Methods: Three different decay models are used that together span the timescales of three-body fission. These models are used to calculate the possible kinetic-energy ranges of CCT fragments by varying fragment mass splits, excitation energies, neutron multiplicities, and scission-point configurations. Calculations are presented for the systems 235U(nth,f) and 252Cf(sf), and for the fission fragments previously reported for CCT, namely isotopes of the elements Ni, Si, Ca, and Sn. In addition, we use semiclassical trajectory calculations with a Monte Carlo method to study the intrinsic stability of collinearity. Results: CCT has a high net Q value but, in a sequential decay, the intermediate steps are energetically and geometrically unfavorable or even forbidden. Moreover, perfect collinearity is extremely unstable and broken by the slightest perturbation. Conclusions: According to our results, the central fragment would be very difficult to detect due to its low kinetic energy, raising the question of why other 2v2E experiments could not detect a missing-mass signature corresponding to CCT. Considering the high kinetic energies of the outer fragments reported in our study, direct-observation experiments should be able to observe CCT. Furthermore, we find that a realization of CCT would require an unphysical fine tuning of the initial conditions. Finally, our stability calculations indicate that, due to the pronounced instability of the collinear configuration, a prolate scission configuration does not necessarily lead to collinear emission, nor does equatorial emission necessarily imply an oblate scission configuration. In conclusion, our results enable independent experimental verification and encourage further critical theoretical studies of CCT.

  6. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancements in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, a fingerprint obtained from a crime scene may be blurred and thus a poor candidate for correlation purposes. The blurred fingerprint therefore needs to be clarified before it is used in the correlation process. There are several different types of blur, such as linear motion blur and defocus blur induced by aberration of the imaging system, and the blur function may or may not be known. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process is incorporated with the power spectrum subtraction technique, the uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
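
    A minimal sketch of the deblurring step, assuming a known linear-motion blur: divide in the frequency domain, clipping |H| away from zero so spectral nulls never blow up. This clipping stands in for the paper's non-singularity handling, whose exact form the abstract does not give; names and values are illustrative.

        import numpy as np

        def inverse_filter(blurred, psf, eps=1e-2):
            """Frequency-domain inverse filter for a known blur: F_hat = G / H,
            with |H| clipped away from zero so the division never explodes at
            spectral nulls of the blur function."""
            H = np.fft.fft2(psf, s=blurred.shape)
            G = np.fft.fft2(blurred)
            H_safe = np.where(np.abs(H) < eps, eps, H)
            return np.real(np.fft.ifft2(G / H_safe))

        # Known horizontal motion blur over 9 pixels, applied to a stand-in image
        img = np.random.rand(64, 64)
        psf = np.zeros_like(img); psf[0, :9] = 1.0 / 9.0
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
        restored = inverse_filter(blurred, psf)   # approximates img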

  7. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    NASA Astrophysics Data System (ADS)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) verify our MLC model in these heterogeneous type geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung and bone equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).

  8. Analytical and Experimental Investigations of Sodium Heat Pipes and Thermal Energy Storage Systems.

    DTIC Science & Technology

    1982-01-01

    Experimental results have been used to verify the melting point and latent heat of fusion of the eutectic salt, a mixture of fluorides of Mg, Li, and K. A melting or solidification curve provides experimental verification of the latent heat value and melting point of a given eutectic salt. (Only fragments of the report's list of figures survive here: Fig. 5.1, cylindrical container for the eutectic salt; Fig. 5.2, TESC sample.)

  9. Experimental verification of Pyragas-Schöll-Fiedler control.

    PubMed

    von Loewenich, Clemens; Benner, Hartmut; Just, Wolfram

    2010-09-01

    We present an experimental realization of time-delayed feedback control proposed by Schöll and Fiedler. The scheme enables us to stabilize torsion-free periodic orbits in autonomous systems, and to overcome the so-called odd number limitation. The experimental control performance is in quantitative agreement with the bifurcation analysis of simple model systems. The results uncover some general features of the control scheme which are deemed to be relevant for a large class of setups.

  10. Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller is changing during the flight. These changes tend to be highly nonlinear, and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components, and undesirable behavior is fed backwards through the system. Components which can be solved explicitly using formal methods techniques for the ranges of safe and unsafe input bounds are treated as white box components. The remaining black box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior.

  11. Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction

    NASA Astrophysics Data System (ADS)

    Aarts, Fides; Jonsson, Bengt; Uijen, Johan

    In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.
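
    As an illustration of the externally supplied abstraction, here is a toy mapper sketch (in Python for consistency with the other sketches; the actual implementation couples LearnLib, in Java, to ns-2). The message types, the single session-id register, and the 'fresh/same/other' abstract alphabet are invented for illustration.

        class SipMapper:
            """Toy abstraction mapper for inference of a SIP-like component:
            concrete messages carry data parameters (here a session id); the
            mapper reduces them to a small abstract alphabet by remembering the
            first id seen and classifying later ids as 'same' or 'other'."""

            def __init__(self):
                self.session_id = None          # the single register we track

            def abstract(self, msg_type, session_id):
                if self.session_id is None:
                    self.session_id = session_id
                    return (msg_type, "fresh")
                tag = "same" if session_id == self.session_id else "other"
                return (msg_type, tag)

        m = SipMapper()
        print(m.abstract("INVITE", 7))          # ('INVITE', 'fresh')
        print(m.abstract("ACK", 7))             # ('ACK', 'same')
        print(m.abstract("ACK", 9))             # ('ACK', 'other')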

  12. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two highly important aspects of scientific practice and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  13. Image Hashes as Templates for Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or the declared configuration of fissile material in storage, may be rejected out of hand as too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and for assessing the security, sensitivity, and robustness of verification using such templates, is needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified-data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive, and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication, information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging, and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the information contained in the hashed image data (available outside the IB) cannot be used to extract sensitive information about the imaged object is of primary concern; thus the techniques are characterized by high unpredictability to guarantee security. We present an assessment of the performance of our techniques with respect to security, sensitivity, and robustness on the basis of a methodical and mathematically precise framework.
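
    The flavor of such a scheme can be seen in a standard perceptual 'average hash', sketched below in Python. This is a textbook baseline, not the authors' technique, and it omits the encryption and security hardening the abstract emphasizes.

        import numpy as np

        def average_hash(img, size=8):
            """Simple perceptual hash: downsample the image to size x size by
            block averaging, then threshold at the mean. Content-preserving
            distortions flip few bits; tampering flips many."""
            h, w = img.shape
            blocks = img[:h - h % size, :w - w % size].reshape(
                size, h // size, size, w // size).mean(axis=(1, 3))
            return (blocks > blocks.mean()).flatten()

        def hamming(a, b):
            """Bit distance between two hashes; compare to a match threshold."""
            return int(np.count_nonzero(a != b))

        ref = np.random.rand(128, 128)                  # stand-in reference image
        noisy = ref + 0.01 * np.random.randn(128, 128)  # content-preserving noise
        print(hamming(average_hash(ref), average_hash(noisy)))  # expected: small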

  14. A novel generation of 3D SAR-based passive micromixer: efficient mixing and low pressure drop at a low Reynolds number

    NASA Astrophysics Data System (ADS)

    Viktorov, Vladimir; Nimafar, Mohammad

    2013-05-01

    This study introduces a novel generation of 3D splitting-and-recombination (SAR) passive micromixer with microstructures placed on the top and bottom floors of microchannels, called a 'chain mixer'. Both experimental verification and numerical analysis of the flow structure of this type of passive micromixer have been performed to evaluate the mixing performance and the pressure drop of the microchannel. We propose two types of chain mixer, chain 1 and chain 2, and compare their mixing performance and pressure drop with those of other micromixers: the T-, O-, and tear-drop micromixers. Experimental tests were carried out in the laminar flow regime at low Reynolds numbers, 0.083 ≤ Re ≤ 4.166, with image-based techniques used to evaluate the mixing efficiency. The computational fluid dynamics code ANSYS FLUENT 13.0 was used to analyze the flow and pressure drop in the microchannel. Experimental results show that the efficiency of the chain and tear-drop mixers is very high because of the SAR process: an efficiency of up to 98% can be achieved at the tested Reynolds numbers. The results also show that chain mixers require a lower pressure drop than a tear-drop micromixer.

  15. Detection of multiple thin surface cracks using vibrothermography with low-power piezoceramic-based ultrasonic actuator—a numerical study with experimental verification

    NASA Astrophysics Data System (ADS)

    Parvasi, Seyed Mohammad; Xu, Changhang; Kong, Qingzhao; Song, Gangbing

    2016-05-01

    Ultrasonic vibrations in cracked structures generate heat at the location of defects, mainly due to frictional rubbing and viscoelastic losses at the defects. Vibrothermography is an effective nondestructive evaluation method which uses infrared (IR) imaging techniques to locate defects such as cracks and delaminations by detecting the heat generated at the defects. In this paper, a coupled thermo-electro-mechanical analysis using the implicit finite element method was performed to simulate a low-power (10 W) piezoceramic-based ultrasonic actuator and the corresponding heat generation in a metallic plate with multiple surface cracks. Numerical results show that the finite element software Abaqus can be used to simultaneously model the electrical properties of the actuator, the ultrasonic waves propagating within the plate, and the thermal properties of the plate. The obtained numerical results demonstrate the ability of these low-power transducers to detect multiple cracks in the simulated aluminum plate. The validity of the numerical simulations was verified through experimental studies on a physical aluminum plate with multiple surface cracks, in which the same low-power piezoceramic stack actuator was used to excite the plate and generate heat at the cracks. An excellent qualitative agreement exists between the experimental results and the numerical simulation results.

  16. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  17. Defining the IEEE-854 floating-point standard in PVS

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine-checked verification of floating-point systems. The formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.

  18. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  19. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  20. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
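
    A minimal sketch of the variables-based alternative, assuming a one-sided upper specification limit and unknown sigma: accept the lot when (USL - xbar)/s meets an acceptability constant k. The k value and data below are illustrative only; in practice k comes from standard tables or calculators keyed to sample size and risk.

        from statistics import mean, stdev

        def accept_lot(measurements, usl, k):
            """One-sided acceptance sampling by variables (sigma unknown):
            accept the lot if (USL - xbar) / s >= k, where k encodes the
            required probability of conformance for the given sample size."""
            xbar, s = mean(measurements), stdev(measurements)
            return (usl - xbar) / s >= k

        # Hypothetical: 10 measured values against an upper spec limit of 12.0
        sample = [10.1, 10.4, 9.8, 10.6, 10.2, 10.0, 10.3, 9.9, 10.5, 10.2]
        print(accept_lot(sample, usl=12.0, k=1.58))  # k value illustrative only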

  1. Active Interrogation for Spent Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swinhoe, Martyn Thomas; Dougan, Arden

    2015-11-05

    The DDA instrument for nuclear safeguards is a fast, non-destructive assay, active neutron interrogation technique using an external 14 MeV DT neutron generator for characterization and verification of spent nuclear fuel assemblies.

  2. AIR QUALITY FORECAST VERIFICATION USING SATELLITE DATA

    EPA Science Inventory

    NOAA 's operational geostationary satellite retrievals of aerosol optical depths (AODs) were used to verify National Weather Service (NWS) experimental (research mode) particulate matter (PM2.5) forecast guidance issued during the summer 2004 International Consortium for Atmosp...

  3. Ac electronic tunneling at optical frequencies

    NASA Technical Reports Server (NTRS)

    Faris, S. M.; Fan, B.; Gustafson, T. K.

    1974-01-01

    Rectification characteristics of non-superconducting metal-barrier-metal junctions deduced from electronic tunneling have been observed experimentally for optical frequency irradiation of the junction. The results provide verification of optical frequency Fermi level modulation and electronic tunneling current modulation.

  4. Verification of a computer-aided replica technique for evaluating prosthesis adaptation using statistical agreement analysis.

    PubMed

    Mai, Hang-Nga; Lee, Kyeong Eun; Lee, Kyu-Bok; Jeong, Seung-Mi; Lee, Seok-Jae; Lee, Cheong-Hee; An, Seo-Young; Lee, Du-Hyeong

    2017-10-01

    The purpose of this study was to evaluate the reliability of the computer-aided replica technique (CART) by calculating its agreement with the replica technique (RT), using statistical agreement analysis. A prepared metal die and a metal crown were fabricated, and the gap between the restoration and abutment was replicated using silicone indicator paste (n = 25). The gaps were measured differently in the control (RT) and experimental (CART) groups. In the RT group, the silicone replica was manually sectioned, and the marginal and occlusal gaps were measured using a microscope. In the CART group, the gap was digitized using optical scanning and image superimposition, and the gaps were measured using a software program. The agreement between the measurement techniques was evaluated using the 95% Bland-Altman limits of agreement and concordance correlation coefficients (CCC); the least acceptable CCC was 0.90. The RT and CART groups showed a linear association, with a strong positive correlation in gap measurements but without significant differences. The 95% limits of agreement between the paired gap measurements were 3.84% and 7.08% of the mean. The lower 95% confidence limits of the CCC were 0.9676 and 0.9188 for the marginal and occlusal gap measurements, respectively, both greater than the least acceptable value. The CART is a reliable digital approach for evaluating the fit accuracy of fixed dental prostheses.
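
    Both agreement statistics are straightforward to compute. The sketch below (Python) implements Lin's concordance correlation coefficient and the 95% Bland-Altman limits of agreement on invented paired gap measurements; it mirrors the analysis described, not the study's actual code.

        import numpy as np

        def concordance_ccc(x, y):
            """Lin's concordance correlation coefficient: agreement of paired
            measurements with the 45-degree line through the origin."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.mean((x - x.mean()) * (y - y.mean()))
            return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        def bland_altman_limits(x, y):
            """95% Bland-Altman limits of agreement: mean difference +/- 1.96 SD."""
            d = np.asarray(x, float) - np.asarray(y, float)
            s = d.std(ddof=1)
            return d.mean() - 1.96 * s, d.mean() + 1.96 * s

        rt = np.array([82.0, 75.5, 90.2, 68.9, 77.3])    # hypothetical gaps, microns
        cart = np.array([80.9, 76.2, 89.5, 70.1, 76.8])
        print(concordance_ccc(rt, cart), bland_altman_limits(rt, cart))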

  5. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A.

    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  6. Circulation of spoof surface plasmon polaritons: Implementation and verification

    NASA Astrophysics Data System (ADS)

    Pan, Junwei; Wang, Jiafu; Qiu, Tianshuo; Pang, Yongqiang; Li, Yongfeng; Zhang, Jieqiu; Qu, Shaobo

    2018-05-01

    In this letter, we present the implementation and experimental verification of a broadband circulator for spoof surface plasmon polaritons (SSPPs). For ease of fabrication, a circulator operating in X band was first designed. Comb-like transmission lines (CL-TLs), a typical SSPP structure, are adopted as the three branches of the Y-junction. To enable broadband coupling of SSPPs, a transition section is added at each end of the CL-TLs. Through such a design, the circulator can operate in the sub-wavelength SSPP mode over a broad band. The simulation results show that the insertion loss is less than 0.5 dB while the isolation and return loss are higher than 20 dB in 9.4-12.0 GHz. A prototype was fabricated and measured. The experimental results are consistent with the simulation results and verify the broadband circulation performance in X band.

  7. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  8. Plasma Model V&V of Collisionless Electrostatic Shock

    NASA Astrophysics Data System (ADS)

    Martin, Robert; Le, Hai; Bilyeu, David; Gildea, Stephen

    2014-10-01

    A simple 1D electrostatic collisionless shock was selected as an initial validation and verification test case for a new plasma modeling framework under development at the Air Force Research Laboratory's In-Space Propulsion branch (AFRL/RQRS). Cross verification between PIC, Vlasov, and Fluid plasma models within the framework along with expected theoretical results will be shown. The non-equilibrium velocity distributions (VDF) captured by PIC and Vlasov will be compared to each other and the assumed VDF of the fluid model at selected points. Validation against experimental data from the University of California, Los Angeles double-plasma device will also be presented along with current work in progress at AFRL/RQRS towards reproducing the experimental results using higher fidelity diagnostics to help elucidate differences between model results and between the models and original experiment. DISTRIBUTION A: Approved for public release; unlimited distribution; PA (Public Affairs) Clearance Number 14332.

  9. Adapted RF pulse design for SAR reduction in parallel excitation with experimental verification at 9.4 T.

    PubMed

    Wu, Xiaoping; Akgün, Can; Vaughan, J Thomas; Andersen, Peter; Strupp, John; Uğurbil, Kâmil; Van de Moortele, Pierre-François

    2010-07-01

    Parallel excitation holds strong promise for mitigating the impact of large transmit B1 (B1+) distortion at very high magnetic field. Accelerated RF pulses, however, inherently tend to require larger RF peak power, which may result in a substantial increase in the Specific Absorption Rate (SAR) in tissues, a constant concern for patient safety at very high field. In this study, we demonstrate an adapted-rate RF pulse design allowing for SAR reduction while preserving excitation target accuracy. Compared with other proposed implementations of adapted-rate RF pulses, our approach is compatible with any k-space trajectory, does not require an analytical expression of the gradient waveform, and can be used for large-flip-angle excitation. We demonstrate our method with numerical simulations based on electromagnetic modeling, and we include an experimental verification of transmit pattern accuracy on an 8-transmit-channel 9.4 T system.

  10. Low cost solar array project silicon materials task. Development of a process for high capacity arc heater production of silicon for solar arrays

    NASA Technical Reports Server (NTRS)

    Fey, M. G.

    1981-01-01

    The experimental verification system for the production of silicon via the arc heater-sodium reduction of SiCl4 was designed, fabricated, installed, and operated. Each of the attendant subsystems was checked out and operated to ensure that performance requirements were met. These subsystems included the arc heaters/reactor, cooling water system, gas system, power system, control and instrumentation system, Na injection system, SiCl4 injection system, effluent disposal system, and gas burnoff system. Prior to introducing the reactants (Na and SiCl4) to the arc heater/reactor, a series of gas-only power tests was conducted to establish the operating parameters of the three arc heaters of the system. Following the successful completion of the gas-only power tests and the readiness tests of the sodium and SiCl4 injection systems, a shakedown test of the complete experimental verification system was conducted.

  11. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    A common approach for applying model checking to unbounded systems is to extract finite-state models from them using conservative abstraction techniques. Applying model checking to complex pieces of code, such as device drivers, likewise depends on the use of abstraction methods: an abstraction method extracts a small finite-state model that conservatively approximates the behavior of the original system.
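
    A toy illustration of one such conservative abstraction, counter abstraction: processes are not tracked individually, only the number in each local state, with counters saturating at "two or more". If the abstract system satisfies mutual exclusion, every instance with any number of processes does too. The three-state protocol below is hypothetical, not taken from the report:

    ```python
    from collections import deque

    CAP = 2  # saturating counters: 0, 1, and 2 meaning "two or more"

    def inc(n):
        return {min(n + 1, CAP)}

    def dec(n):
        # "Two or more" minus one is either exactly one or still two-or-more,
        # so the abstraction branches nondeterministically to stay conservative.
        return {1, CAP} if n == CAP else {n - 1}

    # Abstract state (idle, trying, critical) for a toy parameterized protocol:
    # idle -> trying; trying -> critical only if no process is critical;
    # critical -> idle.
    def successors(s):
        i, t, c = s
        out = set()
        if i > 0:
            out |= {(i2, t2, c) for i2 in dec(i) for t2 in inc(t)}
        if t > 0 and c == 0:
            out |= {(i, t2, c2) for t2 in dec(t) for c2 in inc(c)}
        if c > 0:
            out |= {(i2, t, c2) for c2 in dec(c) for i2 in inc(i)}
        return out

    # BFS from "many idle processes"; safety: never two processes critical.
    init = (CAP, 0, 0)
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        assert s[2] < CAP, f"abstract counterexample: {s}"
        new = successors(s) - seen
        seen |= new
        queue.extend(new)
    print(f"explored {len(seen)} abstract states; mutual exclusion holds for any N")
    ```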

  12. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the locations of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results from seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.
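
    The record does not describe how candidate critical cells are proposed; one plausible heuristic (purely illustrative, not VeriClick's algorithm) is to take the first position from which the remaining lower-right block of the table is predominantly numeric, and derive the corners of the data region from it:

    ```python
    import csv, io

    def is_num(s):
        try:
            float(s.replace(",", ""))
            return True
        except ValueError:
            return False

    def critical_cells(grid):
        """Guess the table's data region: the first (row, col) from which the
        lower-right block is predominantly numeric; the four critical cells are
        then the corners of that block, and everything above/left of it is
        column header or row stub."""
        rows, cols = len(grid), len(grid[0])
        for r in range(rows):
            for c in range(cols):
                block = [grid[i][j] for i in range(r, rows) for j in range(c, cols)]
                if sum(is_num(x) for x in block) / len(block) > 0.8:
                    return (r, c), (r, cols - 1), (rows - 1, c), (rows - 1, cols - 1)
        return None

    sample = "region,pop,area\nNorth,12,15\nSouth,8,9\n"
    grid = list(csv.reader(io.StringIO(sample)))
    print(critical_cells(grid))   # ((1, 1), (1, 2), (2, 1), (2, 2))
    ```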

  13. A silicon strip detector array for energy verification and quality assurance in heavy ion therapy.

    PubMed

    Debrot, Emily; Newall, Matthew; Guatelli, Susanna; Petasecca, Marco; Matsufuji, Naruhiro; Rosenfeld, Anatoly B

    2018-02-01

    The measurement of depth-dose profiles for range and energy verification of heavy ion beams is an important aspect of quality assurance procedures at heavy ion therapy facilities. The steep dose gradients in the Bragg peak region of these profiles require detectors with high spatial resolution. The aim of this work is to characterize a one-dimensional monolithic silicon detector array, called the "serial Dose Magnifying Glass" (sDMG), as an independent ion beam energy and range verification system for quality assurance of ion beams used in heavy ion therapy. The sDMG detector consists of two linear arrays of 128 silicon sensitive volumes, each with an effective size of 2 mm × 50 μm × 100 μm, fabricated on a p-type substrate at a pitch of 200 μm along a single axis of detection. The detector was characterized for beam energy and range verification by measuring its response when irradiated with a 290 MeV/u ¹²C broad beam incident along the single axis of the detector embedded in a PMMA phantom. The energy of the ¹²C beam incident on the detector and the residual energy of the beam incident on the phantom were determined from the measured Bragg peak position in the sDMG. Ad hoc Monte Carlo simulations of the experimental setup were also performed to give further insight into the detector response. The relative response profiles along the single axis measured with the sDMG detector showed good agreement between experiment and simulation, with the position of the Bragg peak determined to fall within 0.2 mm, or 1.1% of the range in the detector, for the two cases. The energy of the beam incident on the detector varied by less than 1% between experiment and simulation. The beam energy incident on the phantom was determined to be (280.9 ± 0.8) MeV/u from the experimental and (280.9 ± 0.2) MeV/u from the simulated profiles; these values coincide with the expected energy of 281 MeV/u. The sDMG detector response was studied experimentally and characterized using a Monte Carlo simulation. The detector was found to accurately determine the ¹²C beam energy and is suited for fast energy and range verification quality assurance. It is proposed that the sDMG is also applicable for verification of treatment planning systems that rely on particle range. © 2017 American Association of Physicists in Medicine.
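
    A sketch of the two numerical steps such a range-to-energy verification involves: sub-pitch localization of the Bragg peak by a parabolic fit around the hottest channel, then inversion of a Bragg-Kleeman range-energy relation R = alpha * E^p. The profile, upstream depth, and the alpha and p constants below are rough stand-ins, not the paper's calibration:

    ```python
    import numpy as np

    # Hypothetical sDMG-like depth-response profile: 128 channels at 0.2 mm pitch.
    pitch_mm = 0.2
    z = np.arange(128) * pitch_mm
    resp = np.exp(-0.5 * ((z - 17.3) / 0.6) ** 2) + 0.02 * z   # toy Bragg-like curve

    def peak_position(z, y):
        """Sub-pitch Bragg peak estimate: parabola through the max and neighbours."""
        i = int(np.argmax(y))
        y0, y1, y2 = y[i - 1], y[i], y[i + 1]
        shift = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
        return z[i] + shift * (z[1] - z[0])

    # Bragg-Kleeman relation R = alpha * E^p, inverted to recover beam energy from
    # water-equivalent range; alpha and p are rough constants for 12C in water,
    # assumed here for illustration only.
    alpha_cm, p = 7.3e-4, 1.77

    def energy_from_range(range_cm):
        return (range_cm / alpha_cm) ** (1.0 / p)

    zp = peak_position(z, resp)
    upstream_cm = 12.0      # hypothetical water-equivalent PMMA depth upstream
    E = energy_from_range(upstream_cm + zp / 10.0)
    print(f"peak at {zp:.2f} mm in detector -> ~{E:.0f} MeV/u on the phantom")
    ```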

  14. Ultrasound functional imaging in an ex vivo beating porcine heart platform

    NASA Astrophysics Data System (ADS)

    Petterson, Niels J.; Fixsen, Louis S.; Rutten, Marcel C. M.; Pijls, Nico H. J.; van de Vosse, Frans N.; Lopata, Richard G. P.

    2017-12-01

    In recent years, novel ultrasound functional imaging (UFI) techniques have been introduced to assess cardiac function by measuring, e.g., cardiac output (CO) and/or myocardial strain. Verification and reproducibility assessment in a realistic setting remain major issues. Simulations and phantoms are often unrealistic, whereas in vivo measurements often lack crucial hemodynamic parameters or ground truth data, or suffer from the large physiological and clinical variation between patients when attempting clinical validation. Controlled validation in certain pathologies is cumbersome and often requires the use of lab animals. In this study, an isolated beating pig heart setup was adapted and used for performance assessment of UFI techniques such as volume assessment and ultrasound strain imaging. The potential for performing verification and reproducibility studies was demonstrated. As a proof of principle, validation of UFI in pathological hearts was examined. Ex vivo porcine hearts (n = 6, slaughterhouse waste) were resuscitated and attached to a mock circulatory system. Radio-frequency ultrasound data of the left ventricle were acquired in five short-axis views and one long-axis view. Based on these slices, CO was measured and verified against flow sensor measurements in the aorta. Strain imaging provided radial, circumferential, and longitudinal strains, which were used to assess reproducibility and inter-subject variability under steady conditions. Finally, strains in healthy hearts were compared to those in a heart with an implanted left ventricular assist device, simulating a failing, supported heart. Good agreement between ultrasound-based and flow-sensor-based CO measurements was found. Strains were highly reproducible (intraclass correlation coefficients > 0.8). Differences were found due to biological variation and the condition of the hearts. Strain magnitudes and patterns in the assisted heart were obtained for different pump settings, revealing large changes compared to the normal condition. The setup provides a valuable benchmarking platform for UFI techniques. Future studies will include work on different pathologies and other means of measurement verification.
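
    A minimal sketch of the volume side of such a verification, assuming the short-axis slices have been segmented into cross-sectional areas: left-ventricular volume by the method of discs, stroke volume from the end-diastolic/end-systolic difference, and CO for comparison against the aortic flow sensor. All numbers are hypothetical:

    ```python
    import numpy as np

    def lv_volume(areas_mm2, gap_mm):
        """Method-of-discs volume: short-axis areas times slice spacing -> mL."""
        return float(np.sum(areas_mm2)) * gap_mm / 1000.0

    # Hypothetical segmented LV areas (mm^2) for 5 short-axis slices, apex to base.
    ed_areas = np.array([400, 900, 1300, 1500, 1400.0])   # end diastole
    es_areas = np.array([250, 500, 800, 950, 900.0])      # end systole
    gap_mm = 12.0                                          # slice spacing

    edv = lv_volume(ed_areas, gap_mm)
    esv = lv_volume(es_areas, gap_mm)
    sv = edv - esv                    # stroke volume, mL
    hr = 80.0                         # beats/min set by the mock circulation
    co = sv * hr / 1000.0             # L/min, to compare with the aortic flow sensor
    print(f"EDV={edv:.0f} mL, ESV={esv:.0f} mL, SV={sv:.0f} mL, CO={co:.1f} L/min")
    ```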

  15. A Validation and Code-to-Code Verification of FAST for a Megawatt-Scale Wind Turbine with Aeroelastically Tailored Blades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan

    2017-08-29

    This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases was identified for which experimental data from a Siemens 2.3 MW machine, collected in accordance with the International Electrotechnical Commission 61400-13 guidelines, were available. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. The paper presents a detailed analysis comparing results from FAST with those from BHawC as well as with the experimental measurements, using statistics including the means and standard deviations along with the power spectral densities of select turbine parameters and loads. Results indicate good agreement among the predictions of FAST, BHawC, and the experimental measurements. These agreements are discussed in detail, along with some comments regarding the differences seen in these comparisons relative to the inherent uncertainties in such a model-based analysis.
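
    A sketch of the comparison statistics named above (means, standard deviations, Welch power spectral densities), applied to stand-in time series rather than real FAST or BHawC output:

    ```python
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(1)
    fs = 50.0                                  # sampling rate, Hz (assumption)
    t = np.arange(0, 600, 1 / fs)              # 10 minutes of signal

    # Stand-ins for a measured and a simulated load channel (e.g. a blade-root
    # bending moment with one dominant spectral component).
    measured = np.sin(2 * np.pi * 0.3 * t) + 0.30 * rng.standard_normal(t.size)
    simulated = 0.97 * np.sin(2 * np.pi * 0.3 * t) + 0.28 * rng.standard_normal(t.size)

    for name, x in (("measured ", measured), ("simulated", simulated)):
        f, pxx = welch(x, fs=fs, nperseg=4096)
        peak = f[np.argmax(pxx)]
        print(f"{name}: mean={x.mean():+.3f}, std={x.std():.3f}, "
              f"PSD peak at {peak:.2f} Hz")
    ```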

  16. Experimental verification of theoretical equations for acoustic radiation force on compressible spherical particles in traveling waves.

    PubMed

    Johnson, Kennita A; Vormohr, Hannah R; Doinikov, Alexander A; Bouakaz, Ayache; Shields, C Wyatt; López, Gabriel P; Dayton, Paul A

    2016-05-01

    Acoustophoresis uses acoustic radiation force to remotely manipulate particles suspended in a host fluid for many scientific, technological, and medical applications, such as acoustic levitation, acoustic coagulation, contrast ultrasound imaging, and ultrasound-assisted drug delivery. To estimate the magnitude of acoustic radiation forces, equations derived for an inviscid host fluid are commonly used. However, theory predicts that, in the case of a traveling wave, viscous effects can dramatically change the magnitude of acoustic radiation forces, rendering the inviscid equations invalid for proper force estimation. To date, experimental verification of these predictions has not been published. Experimental measurements of viscous effects on acoustic radiation forces in a traveling wave were conducted using a confocal optical and acoustic system, and the values were compared with available theories. Our results show that, even in a low-viscosity fluid such as water, the magnitude of acoustic radiation forces is increased manyfold by viscous effects in comparison with what follows from the equations derived for an inviscid fluid.
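
    A back-of-the-envelope sketch of why the viscous enhancement can be manyfold in a traveling wave: viscosity gives the dipole scattering coefficient f2 an imaginary part (in the Settnes-Bruus form), which produces a force term scaling as a^2 (ka) E_ac, whereas the inviscid traveling-wave force relies on Rayleigh scattering and scales as a^2 (ka)^4 E_ac. Order-one prefactors are deliberately omitted, and the bead and fluid numbers are illustrative only:

    ```python
    import numpy as np

    def f2_viscous(rho_t, delta_t):
        """Viscous dipole coefficient (Settnes-Bruus form); delta_t = delta/a."""
        gamma = -1.5 * (1 + 1j * (1 + delta_t)) * delta_t
        return 2 * (1 - gamma) * (rho_t - 1) / (2 * rho_t + 1 - 3 * gamma)

    # Polystyrene-like bead in water at 1 MHz -- illustrative numbers only.
    rho_t, nu, c, f = 1.05, 1e-6, 1500.0, 1e6
    omega = 2 * np.pi * f
    delta = np.sqrt(2 * nu / omega)        # viscous boundary layer, ~0.56 um
    k = omega / c

    for a in (1e-6, 5e-6, 20e-6):
        ka, d_t = k * a, delta / a
        im_f2 = abs(np.imag(f2_viscous(rho_t, d_t)))
        # Traveling-wave force scalings, order-one prefactors omitted:
        #   viscous dipole contribution  ~ a^2 (ka)   E_ac Im(f2)
        #   inviscid Rayleigh scattering ~ a^2 (ka)^4 E_ac
        print(f"a={a * 1e6:5.1f} um: viscous/inviscid ratio ~ {im_f2 / ka ** 3:.1e}")
    ```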

  17. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies.

    PubMed

    Caswell, Joseph M; Singh, Manraj; Persinger, Michael A

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences, although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational, without accompanying experimental verification. An exception was noted for human brain activity in a previous study that employed experimental simulation of sudden geomagnetic impulses to assess correlational results demonstrating a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that the potential magnetic field effects on frequency components of heart rate variability tended to overlap with those reported in previous correlational studies, with low-frequency power and the ratio of low- to high-frequency components appearing affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings. Copyright © 2016 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.
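
    The frequency-domain HRV parameters in question are standard: low-frequency (0.04-0.15 Hz) and high-frequency (0.15-0.4 Hz) power of the resampled RR-interval tachogram, plus their ratio. A sketch on a synthetic tachogram (the modulation depths and rates are made up):

    ```python
    import numpy as np
    from scipy.interpolate import interp1d
    from scipy.signal import welch

    def hrv_bands(rr_s, fs=4.0):
        """LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) power of an RR series (seconds)."""
        t = np.cumsum(rr_s)
        grid = np.arange(t[0], t[-1], 1 / fs)      # evenly resampled tachogram
        rr_i = interp1d(t, rr_s)(grid)
        f, pxx = welch(rr_i - rr_i.mean(), fs=fs, nperseg=256)
        df = f[1] - f[0]
        lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
        hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
        return lf, hf, lf / hf

    # Synthetic ~5-minute tachogram with LF and HF modulation (illustrative only).
    rng = np.random.default_rng(2)
    beat_t = 0.85 * np.arange(350)                 # approximate beat times, s
    rr = (0.85
          + 0.03 * np.sin(2 * np.pi * 0.10 * beat_t)
          + 0.02 * np.sin(2 * np.pi * 0.25 * beat_t)
          + 0.005 * rng.standard_normal(beat_t.size))

    lf, hf, ratio = hrv_bands(rr)
    print(f"LF={lf:.2e} s^2, HF={hf:.2e} s^2, LF/HF={ratio:.2f}")
    ```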

  18. A physical zero-knowledge object-comparison system for nuclear warhead verification

    PubMed Central

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast-neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing similar secure proof systems for other applications. PMID:27649477
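
    The core trick can be caricatured in a few lines: for each detector position the inspector secretly preloads the detector with the complement of the counts expected through the trusted reference item, so a genuine item drives every detector to the same agreed total while a modified one does not, and the totals themselves reveal nothing about the transmission profile. A statistical toy model, with counts, positions, and thresholds all invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    TARGET = 100_000          # agreed-upon total count per detector position

    # Secret expected neutron counts transmitted through the reference item at
    # each of 16 detector positions (invented numbers).
    reference = rng.uniform(20_000, 80_000, 16).round()
    candidate_same = reference.copy()
    candidate_diff = reference.copy()
    candidate_diff[7] *= 0.9                  # a hidden 10% change at one position

    # The inspector preloads each (non-electronic) detector with the complement
    # of the reference counts: a matching item drives every detector to ~TARGET.
    preload = TARGET - reference

    def verify(candidate, sigmas=5.0):
        counts = rng.poisson(preload) + rng.poisson(candidate)
        return bool(np.all(np.abs(counts - TARGET) < sigmas * np.sqrt(TARGET)))

    print("identical object passes:", verify(candidate_same))   # expected: True
    print("modified object passes: ", verify(candidate_diff))   # expected: False
    ```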
