NASA Astrophysics Data System (ADS)
Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei
2018-03-01
A semiconductor laser with a two-external-cavity feedback structure exhibiting the laser self-mixing interference (SMI) phenomenon is investigated and analyzed. An SMI model with two feedback directions, based on the Fabry-Perot cavity, is derived, and numerical simulation and experimental verification were carried out. Experimental results show that the SMI signal of the two-external-cavity feedback structure under weak optical feedback is approximately the sum of the two individual SMI signals.
Direct and full-scale experimental verifications towards ground-satellite quantum key distribution
NASA Astrophysics Data System (ADS)
Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei
2013-05-01
Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method of communication, based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date have relied on stationary sites. Experimental verification is therefore crucial for applications involving a typical low Earth orbit satellite. To achieve direct and full-scale verification of our set-up, we carried out three independent experiments with a decoy-state QKD system. The system was operated on a moving platform (a turntable), on a floating platform (a hot-air balloon), and over a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and high channel loss. The experiments cover wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure so that computer programs for dynamic structural analysis can be used. The second utilizes modal-coupling techniques for experimental verification: only spacecraft components are vibrated, and the modes and frequencies of the complete vehicle are deduced from the results of the component tests.
Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael
2007-08-21
Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted 'MUV', for monitor unit verification) for patient-specific quality assurance (QA). Fifty-two patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, the tongue-and-groove effect, backscatter to the monitor chamber, and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were used directly as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high-dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.), respectively. The dose deviations between MUV and TPS depended slightly on the distance from the isocentre position. For individual intensity-modulated beams (367 in total), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low-dose regions, we consider 5% dose deviation or 10 cGy acceptable.
The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
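The confidence limits quoted above lend themselves to a simple acceptance test. Below is a minimal sketch; the function name and the either/or reading of the "3% or 6 cGy" criterion are my assumptions, not the authors' software:

```python
# Sketch of a routine-QA point-dose acceptance check (hypothetical
# helper, not the MUV implementation): a deviation passes if it is
# within pct_limit percent of the prescribed dose OR within
# abs_limit_cgy in absolute terms.
def within_tolerance(measured_cgy, calculated_cgy, prescribed_cgy,
                     pct_limit=3.0, abs_limit_cgy=6.0):
    deviation = abs(measured_cgy - calculated_cgy)
    return (deviation <= prescribed_cgy * pct_limit / 100.0
            or deviation <= abs_limit_cgy)

print(within_tolerance(200.0, 204.0, 200.0))   # 4 cGy deviation: True
print(within_tolerance(200.0, 210.0, 200.0))   # 10 cGy deviation: False
```

For off-axis or low-dose points, the same check would be called with the looser 5%/10 cGy limits mentioned in the abstract.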
Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario
2016-01-01
Coordinate measuring machines (CMMs) are among the main instruments of measurement in laboratories and in industrial quality control. A compensation error model was formulated in Part I; it integrates error and uncertainty into the feature measurement model. Experimental implementation for the verification of this model is carried out by direct testing on a moving-bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table, and roundness of a precision glass hemisphere are presented under repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as are its practical use and its capability to contribute to improving current standard CMM measuring capabilities. PMID:27754441
Elaina Jennings; John W. van de Lindt; Ershad Ziaei; Pouria Bahmani; Sangki Park; Xiaoyun Shao; Weichiang Pang; Douglas Rammer; Gary Mochizuki; Mikhail Gershfeld
2015-01-01
The FEMA P-807 Guidelines were developed for retrofitting soft-story wood-frame buildings based on existing data, and the method had not been verified through full-scale experimental testing. This article presents two different retrofit designs based directly on the FEMA P-807 Guidelines that were examined at several different seismic intensity levels. The...
NASA Technical Reports Server (NTRS)
Cantrell, J. H., Jr.; Winfree, W. P.
1980-01-01
The solution of the nonlinear differential equation describing an initially sinusoidal finite-amplitude elastic wave propagating in a solid contains a static-displacement term in addition to the harmonic terms. The static-displacement amplitude is theoretically predicted to be proportional to the product of the squares of the driving-wave amplitude and the driving-wave frequency. The first experimental verification of the elastic-wave static displacement in a solid (along the [111] direction of single-crystal germanium) is reported, and agreement is found with the theoretical predictions.
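The predicted scaling can be stated compactly. In the sketch below the symbols are my labels (A for the driving-wave amplitude, ω for the driving angular frequency, C for a material-dependent prefactor), not the paper's notation:

```latex
% Static-displacement amplitude of an initially sinusoidal
% finite-amplitude elastic wave: proportional to the squared
% amplitude times the squared frequency, with C set by the
% solid's elastic constants.
u_{\mathrm{static}} = C\,A^{2}\,\omega^{2}
```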
Indirect current control with separate IZ drop compensation for voltage source converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanetkar, V.R.; Dawande, M.S.; Dubey, G.K.
1995-12-31
Indirect Current Control (ICC) of boost type Voltage Source Converters (VSCs) using separate compensation of the line IZ voltage drop is presented. A separate bi-directional VSC is used to produce the compensation voltage. This simplifies the ICC regulator scheme, as the power flow is controlled through a single modulation index. Experimental verification is provided for bi-directional control of the power flow.
On the assessment of biological life support system operation range
NASA Astrophysics Data System (ADS)
Bartsev, Sergey
Biological life support systems (BLSS) can be used in long-term space missions only if a well-thought-out assessment of the allowable operating range is available. The range has to account for both the permissible working parameters of the BLSS and the critical level of perturbations of its stationary state. A direct approach, outlining the range by statistical treatment of experimental data on BLSS destruction, is not applicable for ethical, economic, and time-saving reasons. A mathematical model is the unique tool for generalizing experimental data and extrapolating the revealed regularities beyond empirical experience. The problem is that the quality of extrapolation depends on how adequately the corresponding model is verified, but good verification requires a wide range of experimental data for fitting, which is not achievable for manned experimental BLSS. A possible way to improve the extrapolation quality of inevitably poorly verified models of manned BLSS is to extrapolate the general tendency obtained from theoretical-experimental investigations of unmanned LSS. Possibilities and limitations of this approach are discussed.
Experimental verification of layout physical verification of silicon photonics
NASA Astrophysics Data System (ADS)
El Shamy, Raghi S.; Swillam, Mohamed A.
2018-02-01
Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, PV is challenging for PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and for waveguide bends in SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. They avoid the large cost of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, since the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurements of the fabricated devices were compared with the derived models and show very good agreement; the match can reach 100% by calibrating certain parameters in the model.
Constitutive modeling of superalloy single crystals with verification testing
NASA Technical Reports Server (NTRS)
Jordan, Eric; Walker, Kevin P.
1985-01-01
The goal is the development of constitutive equations to describe the elevated-temperature stress-strain behavior of single-crystal turbine blade alloys. The program includes both the development of a suitable model and verification of the model through elevated-temperature torsion testing. A constitutive model is derived from postulated constitutive behavior on individual crystallographic slip systems. The behavior of the entire single crystal is then arrived at by summing the slip on all operative crystallographic slip systems. This type of formulation has a number of important advantages, including the prediction of orientation dependence and the ability to represent the constitutive behavior directly in the terms metallurgists use to describe the micromechanisms. Here, the model is briefly described, followed by the experimental set-up and some experimental findings to date.
NASA Astrophysics Data System (ADS)
Lin, Y. Q.; Ren, W. X.; Fang, S. E.
2011-11-01
Although most vibration-based damage detection methods achieve satisfactory verification on analytical or numerical structures, many of them encounter problems when applied to real-world structures under varying environments. Damage detection methods that extract damage features directly from periodically sampled dynamic time-history response measurements are desirable, but relevant research and field verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure, proposed in the first part, are investigated on two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage in real-scale structures experiencing ambient excitations and varying environmental conditions.
Collapse of Experimental Colloidal Aging using Record Dynamics
NASA Astrophysics Data System (ADS)
Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter
The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts that dynamics in an aging system progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system: MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. The predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions RD makes about the underlying statistics of these rare events has been elusive. Here, the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.
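A quick way to see what the log(t/tw) prediction implies is a synthetic collapse check. The sketch below is my own construction (prefactor and sampling grid are arbitrary, not the experimental data): it generates MSD-like curves for several waiting times and confirms they coincide when plotted against t/tw:

```python
import numpy as np

# RD predicts MSD growth ~ log(t/tw); curves for different waiting
# times tw should therefore collapse onto one master curve vs. t/tw.
A = 0.1                                   # arbitrary assumed prefactor
waiting_times = [1e2, 1e3, 1e4]
ratios = np.logspace(0.1, 3.0, 50)        # common grid of t/tw values

curves = []
for tw in waiting_times:
    t = ratios * tw                       # sampling times for this tw
    curves.append(A * np.log(t / tw))     # RD-predicted MSD shape

# Worst-case vertical spread between rescaled curves: ~0 means collapse.
spread = float(np.max(np.ptp(curves, axis=0)))
print(spread)
```

Plotted against absolute time t the three curves would be shifted apart; the rescaling by tw is what brings them together.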
Pastor, D; Amaya, W; García-Olcina, R; Sales, S
2007-07-01
We present a simple theoretical model, and its experimental verification, of the vanishing of the autocorrelation peak due to wavelength detuning in the coding-decoding process of coherent direct-sequence optical code multiple access systems based on a superstructured fiber Bragg grating. Moreover, the detuning-induced vanishing has been exploited to provide an additional degree of multiplexing and/or optical code tuning.
NASA Technical Reports Server (NTRS)
Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)
2000-01-01
Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data is indicative not only of the presence and density of stacking errors, but also that it can yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions and it appears that it is similar to the relation developed by others to explain the formation of the corresponding polytypes.
Arithmetic Circuit Verification Based on Symbolic Computer Algebra
NASA Astrophysics Data System (ADS)
Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo
This paper presents a formal approach to verifying arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit descriptions can be verified effectively by polynomial reduction techniques using Gröbner bases. In this paper, we describe how symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantages of the proposed approach are demonstrated through experimental verification of arithmetic circuits such as a multiply-accumulator and an FIR filter. The results show that the proposed approach is capable of verifying practical arithmetic circuits.
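As a toy illustration of the general technique (my own encoding with SymPy, not the authors' implementation), a half adder can be checked against its word-level specification by reducing the specification polynomial modulo a Gröbner basis of the gate polynomials:

```python
from sympy import symbols, groebner, reduced

a, b, s, c = symbols('a b s c')

# Half-adder gate polynomials (a standard encoding, assumed here):
# XOR: s = a + b - 2ab; AND: c = ab; x**2 - x enforces boolean inputs.
gates = [s - (a + b - 2*a*b), c - a*b, a**2 - a, b**2 - b]
G = groebner(gates, a, b, s, c, order='lex')

# Word-level specification of a half adder: a + b = s + 2c.
spec = (a + b) - (s + 2*c)

# Ideal-membership test: remainder 0 means the gate-level circuit
# implements the arithmetic specification.
_, remainder = reduced(spec, list(G.exprs), a, b, s, c, order='lex')
print(remainder)  # prints 0
```

The same membership test scales, in principle, to the multiply-accumulator and FIR-filter circuits the paper verifies, with larger polynomial systems.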
Direct measurements of the Gibbs free energy of OH using a CW tunable laser
NASA Technical Reports Server (NTRS)
Killinger, D. K.; Wang, C. C.
1979-01-01
The paper describes an absorption measurement for determining the Gibbs free energy of OH generated in a mixture of water and oxygen vapor. These measurements afford a direct verification of the accuracy of thermochemical data for H2O at high temperatures and pressures. The results indicate that values for the heat capacity of H2O obtained through numerical computations are correct to within an experimental uncertainty of 0.15 cal/mol·K.
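The thermodynamic link behind such a measurement can be sketched with the standard relation below; the identification of the specific OH-forming reaction is left schematic, since the abstract does not state it:

```latex
% Measured OH number densities fix the equilibrium constant K_p(T),
% which in turn yields the standard Gibbs free energy change of the
% OH-forming reaction at temperature T:
\Delta G^{\circ}(T) = -\,R\,T \ln K_p(T)
```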
Research of aerohydrodynamic and aeroelastic processes on PNRPU HPC system
NASA Astrophysics Data System (ADS)
Modorskii, V. Ya.; Shevelev, N. A.
2016-10-01
Research on aerohydrodynamic and aeroelastic processes using the PNRPU High Performance Computing Complex is actively conducted within the university's priority development area, "Aviation engine and gas turbine technology". Work is carried out in two directions: development and use of domestic software, and use of well-known foreign licensed application software packages. In addition, a third direction is being developed, associated with the verification of computational experiments: physical modeling on unique proprietary experimental installations.
Experimental preparation and verification of quantum money
NASA Astrophysics Data System (ADS)
Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei
2018-03-01
A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.
Direct measurement of Kramers turnover with a levitated nanoparticle
NASA Astrophysics Data System (ADS)
Rondin, Loïc; Gieseler, Jan; Ricci, Francesco; Quidant, Romain; Dellago, Christoph; Novotny, Lukas
2017-12-01
Understanding the thermally activated escape from a metastable state is at the heart of important phenomena such as the folding dynamics of proteins, the kinetics of chemical reactions or the stability of mechanical systems. In 1940, Kramers calculated escape rates both in the high damping and low damping regimes, and suggested that the rate must have a maximum for intermediate damping. This phenomenon, today known as the Kramers turnover, has triggered important theoretical and numerical studies. However, as yet, there is no direct and quantitative experimental verification of this turnover. Using a nanoparticle trapped in a bistable optical potential, we experimentally measure the nanoparticle's transition rates for variable damping and directly resolve the Kramers turnover. Our measurements are in agreement with an analytical model that is free of adjustable parameters. The levitated nanoparticle presented here is a versatile experimental platform for studying and simulating a wide range of stochastic processes and testing theoretical models and predictions.
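The turnover follows from Kramers' two limiting results, quoted here in their standard forms for orientation (E_b is the barrier height, ω_0 and ω_b the well and barrier angular frequencies, γ the damping rate): the rate grows linearly with γ at low damping and falls as 1/γ at high damping, so it must peak in between.

```latex
% Kramers (1940) limiting escape rates over a barrier E_b at
% temperature T:
r_{\text{low}} \;\propto\; \gamma\,\frac{E_b}{k_B T}\,e^{-E_b/k_B T},
\qquad
r_{\text{high}} \;=\; \frac{\omega_0\,\omega_b}{2\pi\,\gamma}\,e^{-E_b/k_B T}
```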
Unravelling the electrochemical double layer by direct probing of the solid/liquid interface
Favaro, Marco; Jeong, Beomgyun; Ross, Philip N.; Yano, Junko; Hussain, Zahid; Liu, Zhi; Crumlin, Ethan J.
2016-01-01
The electrochemical double layer plays a critical role in electrochemical processes. Whilst there have been many theoretical models predicting structural and electrical organization of the electrochemical double layer, the experimental verification of these models has been challenging due to the limitations of available experimental techniques. The induced potential drop in the electrolyte has never been directly observed and verified experimentally, to the best of our knowledge. In this study, we report the direct probing of the potential drop as well as the potential of zero charge by means of ambient pressure X-ray photoelectron spectroscopy performed under polarization conditions. By analyzing the spectra of the solvent (water) and a spectator neutral molecule with numerical simulations of the electric field, we discern the shape of the electrochemical double layer profile. In addition, we determine how the electrochemical double layer changes as a function of both the electrolyte concentration and applied potential. PMID:27576762
Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
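Classical Guyan Reduction, the method that MGR and HR refine, can be sketched in a few lines. The toy example below (a fixed-free spring-mass chain, my own test case rather than an SLS model) condenses a 3-DOF system onto a single master DOF and shows the reduced model slightly overestimating the lowest eigenvalue, as expected for a constraint-based reduction:

```python
import numpy as np

def guyan_reduce(K, M, master):
    """Classical Guyan (static) reduction: condense stiffness and mass
    onto the retained 'master' DOFs, neglecting slave inertia forces
    (the approximation whose shortcomings MGR/HR address)."""
    n = K.shape[0]
    slave = [i for i in range(n) if i not in master]
    p = list(master) + slave                 # reorder: masters first
    Kp, Mp = K[np.ix_(p, p)], M[np.ix_(p, p)]
    m = len(master)
    # Static condensation: u_slave = -Kss^{-1} Ksm u_master
    T = np.vstack([np.eye(m), -np.linalg.solve(Kp[m:, m:], Kp[m:, :m])])
    return T.T @ Kp @ T, T.T @ Mp @ T

# Fixed-free chain of three unit masses and unit-stiffness springs
K = np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
M = np.eye(3)
Kr, Mr = guyan_reduce(K, M, master=[2])      # keep only the tip DOF

lam_full = np.min(np.linalg.eigvalsh(K))     # M = I, ordinary eigvals
lam_red = (Kr / Mr).item()                   # 1-DOF reduced model
print(lam_red, lam_full)                     # reduced >= full
```

Even on this tiny model the single-master reduction recovers the fundamental eigenvalue to within about 10%, while higher modes are lost; retaining more masters, or augmenting the basis as in RMA, tightens the approximation.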
Experimental verification of the rainbow trapping effect in adiabatic plasmonic gratings
Gan, Qiaoqiang; Gao, Yongkang; Wagner, Kyle; Vezenov, Dmitri; Ding, Yujie J.; Bartoli, Filbert J.
2011-01-01
We report the experimental observation of a trapped rainbow in adiabatically graded metallic gratings, designed to validate theoretical predictions for this unique plasmonic structure. One-dimensional graded nanogratings were fabricated and their surface dispersion properties tailored by varying the grating groove depth, whose dimensions were confirmed by atomic force microscopy. Tunable plasmonic bandgaps were observed experimentally, and direct optical measurements on graded grating structures show that light of different wavelengths in the 500–700-nm region is “trapped” at different positions along the grating, consistent with computer simulations, thus verifying the “rainbow” trapping effect. PMID:21402936
Phase-locked coherent mechanical oscillators: the laser rate equation theory and experimental verification
Khurgin, J. B.; Pruessner, M. W.; Stievater, T. H.; ...
2012-10-23
Mathematical Modeling of Ni/H2 and Li-Ion Batteries
NASA Technical Reports Server (NTRS)
Weidner, John W.; White, Ralph E.; Dougal, Roger A.
2001-01-01
The modelling effort outlined in this viewgraph presentation encompasses the following topics: 1) Electrochemical Deposition of Nickel Hydroxide; 2) Deposition rates of thin films; 3) Impregnation of porous electrodes; 4) Experimental Characterization of Nickel Hydroxide; 5) Diffusion coefficients of protons; 6) Self-discharge rates (i.e., oxygen-evolution kinetics); 7) Hysteresis between charge and discharge; 8) Capacity loss on cycling; 9) Experimental Verification of the Ni/H2 Battery Model; 10) Mathematical Modeling of Li-Ion Batteries; 11) Experimental Verification of the Li-Ion Battery Model; 12) Integrated Power System Models for Satellites; and 13) Experimental Verification of the Integrated-Systems Model.
ERIC Educational Resources Information Center
Logan, Christopher W.; Cole, Nancy; Kamara, Sheku G.
2010-01-01
Purpose/Objectives: The Direct Verification Pilot tested the feasibility, effectiveness, and costs of using Medicaid and State Children's Health Insurance Program (SCHIP) data to verify applications for free and reduced-price (FRP) school meals instead of obtaining documentation from parents and guardians. Methods: The Direct Verification Pilot…
Experimental demonstration of three-dimensional broadband underwater acoustic carpet cloak
NASA Astrophysics Data System (ADS)
Bi, Yafeng; Jia, Han; Sun, Zhaoyong; Yang, Yuzhen; Zhao, Han; Yang, Jun
2018-05-01
We present the design, architecture, and detailed performance of a three-dimensional (3D) underwater acoustic carpet cloak (UACC). The proposed system of the 3D UACC is an octahedral pyramid, which is composed of periodical steel strips. This underwater acoustic device, placed over the target to hide, is able to manipulate the scattered wavefront to mimic a reflecting plane. The effectiveness of the prototype is experimentally demonstrated in an anechoic tank. The measured acoustic pressure distributions show that the 3D UACC can work in all directions in a wide frequency range. This experimental verification of 3D device paves the way for guidelines on future practical applications.
Study on verifying the angle measurement performance of the rotary-laser system
NASA Astrophysics Data System (ADS)
Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui
2018-04-01
An angle verification method was developed to assess the angle measurement performance of the rotary-laser system. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on verifying the angle-measuring uncertainty of the rotary-laser system, it still has some limitations. High-precision reference angles are used in this study, and an integrated verification platform is set up to evaluate the performance of the system. This paper also identifies the error with the biggest influence on the verification system. Some errors of the verification system are avoided through the experimental method, and some are compensated through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirements for coordinate measurement, and that the verification platform can efficiently evaluate the angle-measurement uncertainty of the rotary-laser system.
Interfering with the neutron spin
NASA Astrophysics Data System (ADS)
Wagh, Apoorva G.; Rakhecha, Veer Chand
2004-07-01
Charge neutrality, spin 1/2, and an associated magnetic moment make the neutron an ideal probe of quantal spinor evolutions. Polarized neutron interferometry in magnetic-field Hamiltonians has thus scored several firsts, such as the direct verification of Pauli anticommutation, the experimental separation of geometric and dynamical phases, and the observation of non-cyclic amplitudes and phases. This paper provides a flavour of the physics learnt from such experiments.
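Two of the relations probed in such experiments can be written down directly (standard quantum mechanics, quoted here for context rather than taken from the cited work): Pauli anticommutation, and the sign reversal of a spinor under a 2π rotation, which is what a neutron interferometer can observe as a phase shift between its two paths.

```latex
% Pauli anticommutation, and the 4\pi spinor periodicity: a 2\pi
% rotation about axis \hat{n} flips the sign of the spinor state.
\{\sigma_i,\sigma_j\} = 2\,\delta_{ij}\,\mathbb{1},
\qquad
U(2\pi) = e^{-i\,\pi\,\boldsymbol{\sigma}\cdot\hat{n}} = -\mathbb{1}
```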
Zero-point angular momentum of supersymmetric Penning trap
NASA Astrophysics Data System (ADS)
Zhang, Jian-zu; Xu, Qiang
2000-10-01
The quantum behavior of the supersymmetric Penning trap, especially the superpartner of its angular momentum, is investigated in the formulation of the multi-dimensional semiunitary transformation of supersymmetric quantum mechanics. In the limiting case of vanishing kinetic energy it is found that the lowest angular momentum is 3ℏ/2, which provides a possibility of directly checking the ideas of supersymmetric quantum mechanics and thus suggests a possible experimental verification of this prediction.
2003-03-01
Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on AI for Systems Validation and Verification, 12(4), 2000, pp. ... Hamilton, D., "Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI ... "Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence, 12, 2000, pp. 331-340. [30] Gaschnig
Self-testing through EPR-steering
NASA Astrophysics Data System (ADS)
Šupić, Ivan; Hoban, Matty J.
2016-07-01
The verification of quantum devices is an important aspect of quantum information, especially with the emergence of more advanced experimental implementations of quantum computation and secure communication. Within this, the theory of device-independent robust self-testing via Bell tests has now reached a level of maturity where many quantum states and measurements can be verified without direct access to the quantum systems: interaction with the devices is solely classical. However, the requirements for this robust level of verification are daunting and demand high levels of experimental accuracy. In this paper we discuss the possibility of self-testing where we only have direct access to one part of the quantum device. This motivates the study of self-testing via EPR-steering, an intermediate form of entanglement verification between full state tomography and Bell tests. Quantum non-locality implies EPR-steering, so results in the former setting apply in the latter; but what advantages may be gleaned from the latter over the former, given that one can perform partial state tomography? We show that in the case of self-testing a maximally entangled two-qubit state, or ebit, EPR-steering allows for simpler analysis and better error tolerance than full device-independence. On the other hand, this improvement is only by a constant factor and (up to constants) is the best one can hope for. Finally, we indicate that the main advantage of self-testing based on EPR-steering could lie in self-testing multi-partite quantum states and measurements. For example, it may be easier to establish a tensor product structure for a particular party's Hilbert space even if we do not have access to their part of the global quantum system.
Numerical Simulation of the Fluid-Structure Interaction of a Surface Effect Ship Bow Seal
NASA Astrophysics Data System (ADS)
Bloxom, Andrew L.
Numerical simulations of fluid-structure interaction (FSI) problems were performed in an effort to verify and validate a commercially available FSI tool. This tool uses an iterative partitioned coupling scheme between CD-adapco's STAR-CCM+ finite volume fluid solver and Simulia's Abaqus finite element structural solver to simulate the FSI response of a system. Preliminary verification and validation (V&V) work was carried out to understand the numerical behavior of the codes individually and together as an FSI tool. The completed V&V work included code order verification of the respective fluid and structural solvers with Couette-Poiseuille flow and Euler-Bernoulli beam theory; these results confirmed the 2nd-order accuracy of the spatial discretizations used. Following that, a mixture of solution verifications and model calibrations was performed with the inclusion of the physics models implemented in the solution of the FSI problems. Solution verifications were completed for fluid and structural stand-alone models as well as for the coupled FSI solutions. These results re-confirmed the spatial order of accuracy for more complex flows and physics models, as well as the order of accuracy of the temporal discretizations. In lieu of a good material definition, model calibration was performed to reproduce the experimental results. This work used model calibration for both instances of hyperelastic materials presented in the literature as validation cases, because those materials were defined there only as linear elastic. Calibrated, three-dimensional models of the bow seal on the University of Michigan bow seal test platform showed the ability to reproduce the experimental results qualitatively through averaging of the forces and seal displacements. These simulations represent the only current 3D results for this case.
One significant result of this study is the ability to visualize the flow around the seal and to directly measure the seal resistances at varying cushion pressures, seal immersions, forward speeds, and different seal materials. SES design analysis could greatly benefit from the inclusion of flexible seals in simulations, and this work is a positive step in that direction. In future work, the inclusion of more complex seal geometries and contact will further enhance the capability of this tool.
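The code order verification mentioned above reduces to a one-line estimate: given discretization errors on two grids with refinement ratio r, the observed order of accuracy is p = ln(e_coarse/e_fine)/ln(r). A minimal sketch, with made-up error values standing in for a real grid-refinement study:

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed order of accuracy from errors on two grids
    with refinement ratio r (Richardson-style estimate)."""
    return math.log(e_coarse / e_fine) / math.log(r)

# Hypothetical discretization errors: halving h should quarter
# the error for a 2nd-order scheme (e.g. Couette-Poiseuille flow).
e_h, e_h2 = 4.0e-3, 1.0e-3
p = observed_order(e_h, e_h2, r=2.0)
print(f"observed order = {p:.2f}")  # 2.00 for these errors
```

Matching p against the scheme's formal order is exactly the "confirmed the 2nd-order accuracy" check the study describes.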
Experimental verification of a model of a two-link flexible, lightweight manipulator. M.S. Thesis
NASA Technical Reports Server (NTRS)
Huggins, James David
1988-01-01
Experimental verification is presented for an assumed-modes model of a large, two-link, flexible manipulator designed and constructed in the School of Mechanical Engineering at the Georgia Institute of Technology. The structure was designed to have typical characteristics of a lightweight manipulator.
Thermal noise in space-charge-limited hole current in silicon
NASA Technical Reports Server (NTRS)
Shumka, A.; Golder, J.; Nicolet, M.
1972-01-01
Present theories on noise in single-carrier space-charge-limited currents in solids have not been quantitatively substantiated by experimental evidence. To obtain such experimental verification, the noise in specially fabricated silicon structures is being measured and analyzed. The first results of this verification effort are reported.
Experimental verification of a new Bell-type inequality
NASA Astrophysics Data System (ADS)
Zhao, Jia-Qiang; Cao, Lian-Zhen; Yang, Yang; Li, Ying-De; Lu, Huai-Xin
2018-05-01
Arpan Das et al. proposed a set of new Bell inequalities (Das et al., 2017 [16]) for a three-qubit system and claimed that each inequality within this set is violated by all generalized Greenberger-Horne-Zeilinger (GGHZ) states. We investigate the new inequalities experimentally in the three-photon GGHZ class of states. Since the inequalities are symmetric under exchange of the identical particles, we chose one Bell-type inequality from the set arbitrarily. The experimental data verify the theoretical prediction well. Moreover, the experimental results show that the amount of violation of the new Bell inequality against local realism increases monotonically with the tangle of the GGHZ state. The most profound physical essence revealed by the results is that the nonlocality of the GGHZ state correlates directly with the three-tangle.
Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium
ERIC Educational Resources Information Center
Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.
2017-01-01
This work is focused on the experimental verification of the balance between the atmospheric pressure acting on a sheet of paper, which encloses from below a cylinder completely or partially filled with water, where the hydrostatic pressure of the water column acts against the atmospheric pressure. First of all, this paper solves a theoretical…
Experimental Verification of Boyle's Law and the Ideal Gas Law
ERIC Educational Resources Information Center
Ivanov, Dragia Trifonov
2007-01-01
Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…
1976-01-01
Environmental Systems Laboratory, P.O. Box 631, Vicksburg, Mississippi 39100 ... phases of the study were under the general supervision of Messrs. W. G. Shockley, Chief, Mobility and Environmental Systems Laboratory (MESL), and ... W. E. Grabau, former Chief, Environmental Systems Division (ESD) and now Special Assistant, MESL, and under the direct supervision of Mr. J. K
Multibody modeling and verification
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1989-01-01
A summary of a ten-week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on existing experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.
Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets
NASA Technical Reports Server (NTRS)
Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.
1978-01-01
A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
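The core idea above — adjust key model parameters so that predicted response matches measured response in a least-squares sense — can be sketched on a toy problem. The single decaying-mode model and all numbers below are illustrative, not the paper's hydraulic-mechanical system:

```python
import numpy as np

# Toy model verification: the "model" is x(t) = a * exp(-b t),
# and we estimate the parameters (a, b) that minimize the
# difference between predicted and "measured" response.
t = np.linspace(0.0, 2.0, 50)
a_true, b_true = 2.0, 1.5
x_meas = a_true * np.exp(-b_true * t)      # synthetic "measurement"

# Log-linearize: ln x = ln a - b t, then solve a linear least squares
A = np.column_stack([np.ones_like(t), -t])
coef, *_ = np.linalg.lstsq(A, np.log(x_meas), rcond=None)
a_est, b_est = np.exp(coef[0]), coef[1]
print(a_est, b_est)   # recovers a = 2.0, b = 1.5 for noiseless data
```

The paper's method does this automatically for first-order equations of motion with nonsymmetric matrices; the sketch only shows the minimize-response-differences principle.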
Experimental verification of vapor deposition rate theory in high velocity burner rigs
NASA Technical Reports Server (NTRS)
Gokoglu, Suleyman A.; Santoro, Gilbert J.
1985-01-01
The main objective has been the experimental verification of the corrosive vapor deposition theory in high-temperature, high-velocity environments. Towards this end a Mach 0.3 burner-rig apparatus was built to measure deposition rates from salt-seeded (mostly Na salts) combustion gases on an internally cooled cylindrical collector. Deposition experiments are underway.
Campione, Salvatore; Kim, Iltai; de Ceglia, Domenico; ...
2016-01-01
Here, we investigate optical polariton modes supported by subwavelength-thick degenerately doped semiconductor nanolayers (e.g. indium tin oxide) on glass in the epsilon-near-zero (ENZ) regime. The dispersions of the radiative (R, on the left of the light line) and non-radiative (NR, on the right of the light line) ENZ polariton modes are experimentally measured and theoretically analyzed through the transfer matrix method and the complex-frequency/real-wavenumber analysis, which are in remarkable agreement. We observe directional near-perfect absorption using the Kretschmann geometry for incidence conditions close to the NR-ENZ polariton mode dispersion. Along with field enhancement, this provides us with an unexplored pathway to enhance nonlinear optical processes and to open up directions for ultrafast, tunable thermal emission.
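The transfer matrix method invoked above can be sketched in its simplest form: normal-incidence reflectance of a single thin film between two half-spaces via the layer's characteristic matrix. The film index, thickness, and wavelength below are placeholders, not the paper's ITO data or oblique-incidence Kretschmann setup:

```python
import numpy as np

# Minimal transfer-matrix sketch: one film of (complex) index n_film
# and thickness d between ambient (n_in) and substrate (n_out),
# normal incidence, so TE and TM coincide.
def film_reflectance(n_film, d, wavelength, n_in=1.0, n_out=1.5):
    k0 = 2 * np.pi / wavelength
    delta = k0 * n_film * d                    # complex phase thickness
    M = np.array([[np.cos(delta), 1j * np.sin(delta) / n_film],
                  [1j * n_film * np.sin(delta), np.cos(delta)]])
    num = n_in * M[0, 0] + n_in * n_out * M[0, 1] - M[1, 0] - n_out * M[1, 1]
    den = n_in * M[0, 0] + n_in * n_out * M[0, 1] + M[1, 0] + n_out * M[1, 1]
    return abs(num / den) ** 2                 # reflectance |r|^2

# Sanity check: zero film thickness reproduces the bare air/glass
# Fresnel reflectance (0.5/2.5)^2 = 0.04.
R0 = film_reflectance(n_film=2.0 + 0.1j, d=0.0, wavelength=1.3e-6)
print(R0)  # 0.04
```

Stacking one such matrix per layer (and replacing indices by angle-dependent admittances) gives the full method used for the dispersion analysis.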
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
Seebeck coefficient of one electron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durrani, Zahid A. K., E-mail: z.durrani@imperial.ac.uk
2014-03-07
The Seebeck coefficient of one electron, driven thermally into a semiconductor single-electron box, is investigated theoretically. With a finite temperature difference ΔT between the source and charging island, a single electron can charge the island in equilibrium, directly generating a Seebeck effect. Seebeck coefficients for small and finite ΔT are calculated and a thermally driven Coulomb staircase is predicted. Single-electron Seebeck oscillations occur with increasing ΔT, as one electron at a time charges the box. A method is proposed for experimental verification of these effects.
Multichannel forward scattering meter for oceanography
NASA Technical Reports Server (NTRS)
Mccluney, W. R.
1974-01-01
An instrument was designed and built that measures the light scattered at several angles in the forward direction simultaneously. The instrument relies on an optical multiplexing technique for frequency encoding of the different channels suitable for detection by a single photodetector. A Mie theory computer program was used to calculate the theoretical volume scattering function for a suspension of polystyrene latex spheres. The agreement between the theoretical and experimental volume scattering functions is taken as a verification of the calibration technique used.
NASA Technical Reports Server (NTRS)
Hanley, G. M.
1980-01-01
An evolutionary Satellite Power Systems development plan was prepared. Planning analysis was directed toward the evolution of a scenario that met the stated objectives, was technically possible and economically attractive, and took into account constraining considerations, such as requirements for very large scale end-to-end demonstration in a compressed time frame, the relative cost/technical merits of ground testing versus space testing, and the need for large mass flow capability to low Earth orbit and geosynchronous orbit at reasonable cost per pound.
Shen, Jiajian; Tryggestad, Erik; Younkin, James E; Keole, Sameer R; Furutani, Keith M; Kang, Yixiu; Herman, Michael G; Bues, Martin
2017-10-01
To accurately model the beam delivery time (BDT) for a synchrotron-based proton spot scanning system using experimentally determined beam parameters. A model to simulate the proton spot delivery sequences was constructed, and BDT was calculated by summing times for layer switch, spot switch, and spot delivery. Test plans were designed to isolate and quantify the relevant beam parameters in the operation cycle of the proton beam therapy delivery system. These parameters included the layer switch time, magnet preparation and verification time, average beam scanning speeds in the x- and y-directions, proton spill rate, and maximum charge and maximum extraction time for each spill. The experimentally determined parameters, as well as the nominal values initially provided by the vendor, served as inputs to the model to predict BDTs for 602 clinical proton beam deliveries. The calculated BDTs (T_BDT) were compared with the BDTs recorded in the treatment delivery log files (T_Log): ∆t = T_Log − T_BDT. The experimentally determined average layer switch time for all 97 energies was 1.91 s (ranging from 1.9 to 2.0 s for beam energies from 71.3 to 228.8 MeV), the average magnet preparation and verification time was 1.93 ms, the average scanning speeds were 5.9 m/s in the x-direction and 19.3 m/s in the y-direction, the proton spill rate was 8.7 MU/s, and the maximum proton charge available for one acceleration was 2.0 ± 0.4 nC. Some of the measured parameters differed from the nominal values provided by the vendor. The calculated BDTs using experimentally determined parameters matched the recorded BDTs of the 602 beam deliveries (∆t = −0.49 ± 1.44 s), significantly more accurately than BDTs calculated using nominal timing parameters (∆t = −7.48 ± 6.97 s). An accurate model for BDT prediction was achieved by using the experimentally determined proton beam therapy delivery parameters, which may be useful in modeling the interplay effect and patient throughput.
The model may provide guidance on how to effectively reduce BDT and may be used to identify deteriorating machine performance. © 2017 American Association of Physicists in Medicine.
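The BDT bookkeeping described above — summing layer-switch, spot-switch, and spot-delivery times — can be sketched directly. The timing constants follow the measured values quoted in the abstract; the toy treatment plan and the neglect of finite scanning speed and spill refills are simplifications of ours:

```python
# Timing constants from the abstract's measured values.
LAYER_SWITCH_S = 1.91          # average energy-layer switch time
SPOT_SWITCH_S = 0.00193        # magnet preparation + verification per spot
SPILL_RATE_MU_PER_S = 8.7      # proton spill rate

def beam_delivery_time(layers):
    """layers: list of energy layers, each a list of spot MUs.
    Simplified: ignores scanning travel time and spill-refill limits."""
    t = 0.0
    for i, spots in enumerate(layers):
        if i > 0:
            t += LAYER_SWITCH_S              # switch to the next energy
        for mu in spots:
            t += SPOT_SWITCH_S               # magnet settling for this spot
            t += mu / SPILL_RATE_MU_PER_S    # time to deliver the spot's MU
    return t

plan = [[0.87, 1.74], [0.87]]   # hypothetical 2-layer, 3-spot plan
print(round(beam_delivery_time(plan), 3))    # total BDT in seconds
```

Even this stripped-down version shows why layer switches (≈1.9 s each) dominate BDT relative to millisecond-scale spot switches, which is the lever for reducing delivery time.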
Progress on an external occulter testbed at flight Fresnel numbers
NASA Astrophysics Data System (ADS)
Kim, Yunjong; Sirbu, Dan; Galvin, Michael; Kasdin, N. Jeremy; Vanderbei, Robert J.
2016-01-01
An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The occulter testbed uses a 78 m optical propagation distance to realize the flight Fresnel numbers. We use an etched silicon mask as the occulter, illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present the first-light results of a sample design operating at a flight Fresnel number and the mechanical design of the testbed. We compare the experimental results with simulations that predict the ultimate contrast performance.
Design of an occulter testbed at flight Fresnel numbers
NASA Astrophysics Data System (ADS)
Sirbu, Dan; Kasdin, N. Jeremy; Kim, Yunjong; Vanderbei, Robert J.
2015-01-01
An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we are designing and building a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. Here, we present a sample design operating at a flight Fresnel number that is thus representative of a realistic space mission. We present calculations of experimental limits arising from the finite size and propagation distance available in the testbed, limitations due to manufacturing feature size, and a non-ideal input beam. We demonstrate how the testbed is designed to be feature-size limited, and provide an estimate of the expected performance.
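The scaling that makes a lab shadow "mathematically identical" to a space occulter's is matching the Fresnel number F = a²/(λz) (occulter radius a, wavelength λ, propagation distance z). A sketch with illustrative numbers — not the actual Princeton testbed or mission parameters:

```python
# Fresnel-number matching: a small mask over tens of metres can
# reproduce the diffraction regime of a tens-of-metres occulter
# tens of thousands of kilometres away, provided F is the same.
def fresnel_number(radius_m, wavelength_m, distance_m):
    return radius_m**2 / (wavelength_m * distance_m)

# Illustrative flight-like scale: 25 m occulter, 50,000 km separation
F_flight = fresnel_number(25.0, 600e-9, 5.0e7)
# Illustrative lab scale: 25 mm mask over 50 m at the same wavelength
F_lab = fresnel_number(0.025, 600e-9, 50.0)
print(F_flight, F_lab)  # both ~20.8: same diffraction regime
```

Because F fixes the Fresnel diffraction integral up to scale, equal F means the suppressed shadows are the same function of scaled position.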
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.
1985-01-01
Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.
Modeling and characterization of through-the-thickness properties of 3D woven composites
NASA Technical Reports Server (NTRS)
Hartranft, Dru; Pravizi-Majidi, Azar; Chou, Tsu-Wei
1995-01-01
The through-the-thickness properties of three-dimensionally (3D) woven carbon/epoxy composites have been studied. The investigation aimed at the evaluation and development of test methodologies for property characterization in the thickness direction. Two fiber architectures were studied: layer-to-layer Angle Interlock and through-the-thickness Orthogonal. An Orthogonal woven preform with surface pile was also designed and manufactured for the fabrication of tensile test coupons with integrated grips. All the preforms were infiltrated by the resin transfer molding technique. The microstructures of the composites were characterized along the warp and fill (weft) directions to determine the degree of yarn undulations, yarn cross-sectional shapes, and microstructural dimensions. These parameters were correlated to the fiber architecture. Specimens were designed and tested for the direct measurement of the through-the-thickness tensile, compressive and shear properties of the composites. Design optimization was conducted through the analysis of the stress fields within the specimen coupled with experimental verification. The experimentally derived elastic properties in the thickness direction compared well with analytical predictions obtained from a volume averaging model.
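The simplest instance of the volume-averaging idea behind such analytical predictions is the pair of Voigt (iso-strain) and Reuss (iso-stress) bounds on a two-phase modulus. The moduli and fiber fraction below are invented for illustration, not the paper's carbon/epoxy data:

```python
# Voigt (rule of mixtures, iso-strain) and Reuss (inverse rule of
# mixtures, iso-stress) bounds on the effective modulus of a
# two-phase composite with fiber volume fraction v_f.
def voigt(E_f, E_m, v_f):
    return v_f * E_f + (1.0 - v_f) * E_m

def reuss(E_f, E_m, v_f):
    return 1.0 / (v_f / E_f + (1.0 - v_f) / E_m)

# Illustrative values in GPa: fiber transverse modulus vs. epoxy matrix
E_f, E_m, v_f = 15.0, 3.5, 0.55
print(voigt(E_f, E_m, v_f), reuss(E_f, E_m, v_f))
# Reuss <= actual effective modulus <= Voigt for any microstructure
```

A full volume-averaging model for a woven architecture additionally weights local yarn orientations, but it reduces to averages of this kind over the unit cell.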
Quantum money with classical verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavinsky, Dmitry
We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.
The Multiple Doppler Radar Workshop, November 1979.
NASA Astrophysics Data System (ADS)
Carbone, R. E.; Harris, F. I.; Hildebrand, P. H.; Kropfli, R. A.; Miller, L. J.; Moninger, W.; Strauch, R. G.; Doviak, R. J.; Johnson, K. W.; Nelson, S. P.; Ray, P. S.; Gilet, M.
1980-10-01
The findings of the Multiple Doppler Radar Workshop are summarized by a series of six papers. Part I of this series briefly reviews the history of multiple Doppler experimentation, fundamental concepts of Doppler signal theory, and organization and objectives of the Workshop. Invited presentations by dynamicists and cloud physicists are also summarized.
Experimental design and procedures (Part II) are shown to be of critical importance. Well-defined and limited experimental objectives are necessary in view of technological limitations. Specified radar scanning procedures that balance temporal and spatial resolution considerations are discussed in detail. Improved siting for suppression of ground clutter as well as scanning procedures to minimize errors at echo boundaries are discussed. The need for accelerated research using numerically simulated proxy data sets is emphasized.
In Part III, new technology to eliminate various sampling limitations is cited as an eventual solution to many current problems. Ground clutter contamination may be curtailed by means of full spectral processing, digital filters in real time, and/or variable pulse repetition frequency. Range and velocity ambiguities also may be minimized by various pulsing options as well as random phase transmission. Sidelobe contamination can be reduced through improvements in radomes, illumination patterns, and antenna feed types. Radar volume-scan time can be sharply reduced by means of wideband transmission, phased array antennas, multiple beam antennas, and frequency agility.
Part IV deals with synthesis of data from several radars in the context of scientific requirements in cumulus clouds, widespread precipitation, and severe convective storms. The important temporal and spatial scales are examined together with the accuracy required for vertical air motion in each phenomenon. Factors that introduce errors in the vertical velocity field are identified, and synthesis techniques are discussed separately for the dual Doppler and multiple Doppler cases. Various filters and techniques, including statistical and variational approaches, are mentioned. Emphasis is placed on the importance of experiment design and procedures, technological improvements, incorporation of all information from supporting sensors, and analysis priority for physically simple cases. Integrated reliability is proposed as an objective tool for radar siting.
Verification of multiple Doppler-derived vertical velocity is discussed in Part V. Three categories of verification are defined: direct, deductive, and theoretical/numerical. Direct verification consists of zenith-pointing radar measurements (from either airborne or ground-based systems), air motion sensing aircraft, instrumented towers, and tracking of radar chaff. Deductive sources include mesonetworks, aircraft (thermodynamic and microphysical) measurements, satellite observations, radar reflectivity, multiple Doppler consistency, and atmospheric soundings. Theoretical/numerical sources of verification include proxy data simulation, momentum checking, and numerical cloud models. New technology, principally in the form of wide bandwidth radars, is seen as a development that may reduce the need for extensive verification of multiple Doppler-derived vertical air motions. Airborne Doppler radar is perceived as the single most important source of verification within the bounds of existing technology.
Nine stages of data processing and display are identified in Part VI: field checks, archival, selection, editing, coordinate transformation, synthesis of Cartesian fields, filtering, display, and physical analysis. Display of data is considered to be a problem critical to assimilation of data at all stages. Interactive computing systems and software are concluded to be very important, particularly for the editing stage. Three- and four-dimensional displays are considered essential for data assimilation, particularly at the physical analysis stage. The concept of common data tape formats is approved both for data in radar spherical space as well as for synthesized Cartesian output.
Electromechanical modeling and experimental verification of a directly printed nanocomposite
NASA Astrophysics Data System (ADS)
Nafari, Alireza; Sodano, Henry A.
2018-03-01
Piezoelectric materials are currently among the most promising building blocks of sensing, actuating and energy harvesting systems. However, their applications are limited by the difficulty of machining them and casting them onto curved surfaces. One way to mitigate this issue is additive manufacturing (direct printing) of a piezoelectric nanocomposite in which piezoelectric nanomaterials are embedded in a polymer matrix. Although significant progress has recently been made in this area, modeling the electromechanical response of a directly printed nanocomposite remains a challenge. Thus the objective of this study is to develop robust micromechanical and finite element models that allow the study of the electroelastic properties of a directly printed nanocomposite containing piezoelectric inclusions. Furthermore, the dependence of these properties on geometrical parameters such as the aspect ratio and alignment of the active phase is investigated. The focus of this work is demonstrating that gradually aligning the piezoelectric nanowires in a nanocomposite, from randomly oriented to perfectly aligned, improves the electroelastic properties of a directly printed nanocomposite. Finally, these models are verified through experimental measurement of the electroelastic properties of nanocomposites containing barium titanate nanowires in a polydimethylsiloxane (PDMS) polymer.
Fast ℓ1-regularized space-time adaptive processing using alternating direction method of multipliers
NASA Astrophysics Data System (ADS)
Qin, Lilong; Wu, Manqing; Wang, Xuan; Dong, Zhen
2017-04-01
Motivated by the sparsity of filter coefficients in full-dimension space-time adaptive processing (STAP) algorithms, this paper proposes a fast ℓ1-regularized STAP algorithm based on the alternating direction method of multipliers to accelerate convergence and reduce the computational load. The proposed algorithm uses a splitting variable to obtain an equivalent optimization formulation, which is addressed with an augmented Lagrangian method. Using the alternating recursive algorithm, the method rapidly reaches a low minimum mean-square error without a large number of calculations. Through theoretical analysis and experimental verification, we demonstrate that the proposed algorithm provides better output signal-to-clutter-noise ratio performance than other algorithms.
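The splitting-plus-augmented-Lagrangian recursion described in the abstract is the standard ADMM treatment of an ℓ1-regularized least-squares problem. As a hedged illustration, the sketch below solves a generic lasso problem with ADMM; it is not the paper's STAP-specific formulation (no clutter covariance or steering vectors), and the problem sizes and penalty are illustrative.

```python
import numpy as np

def admm_lasso(A, b, lam=0.5, rho=1.0, n_iter=100):
    """min_x 0.5*||Ax - b||^2 + lam*||x||_1 via ADMM with the split x = z."""
    m, n = A.shape
    Atb = A.T @ b
    # Factor (A'A + rho*I) once; every x-update reuses the Cholesky factor.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    z, u = np.zeros(n), np.zeros(n)
    for _ in range(n_iter):
        # x-update: minimize the smooth part of the augmented Lagrangian
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft-thresholding, the proximal operator of the l1 norm
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update on the running residual
        u = u + x - z
    return z

# Recover a sparse coefficient vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 30))
x_true = np.zeros(30)
x_true[[3, 7, 15]] = [1.5, -2.0, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(60)
x_hat = admm_lasso(A, b)
```

The single up-front Cholesky factorization is where much of the per-iteration saving comes from: each x-update is reduced to two triangular solves.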
47 CFR 73.151 - Field strength measurements to establish performance of directional antennas.
Code of Federal Regulations, 2010 CFR
2010-10-01
... verified either by field strength measurement or by computer modeling and sampling system verification. (a... specifically identified by the Commission. (c) Computer modeling and sample system verification of modeled... performance verified by computer modeling and sample system verification. (1) A matrix of impedance...
7 CFR 1980.353 - Filing and processing applications.
Code of Federal Regulations, 2010 CFR
2010-01-01
... subject to the availability of funds. (15) A copy of a valid verification of income for each adult member... method of verifying information. Verifications must pass directly from the source of information to the Lender and shall not pass through the hands of a third party or applicant. (1) Income verification...
78 FR 18305 - Notice of Request for Extension of a Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Identity Verification (PIV) Request for Credential, the USDA Homeland Security Presidential Directive 12... consists of two phases of implementation: Personal Identity Verification phase I (PIV I) and Personal Identity Verification phase II (PIV II). The information requested must be provided by Federal employees...
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on optimizing chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.
An accurate computational method for the diffusion regime verification
NASA Astrophysics Data System (ADS)
Zhokh, Alexey A.; Strizhak, Peter E.
2018-04-01
The diffusion regime (sub-diffusive, standard, or super-diffusive) is defined by the order of the derivative in the corresponding transport equation. We develop an accurate computational method for the direct estimation of the diffusion regime. The method is based on the derivative order estimation using the asymptotic analytic solutions of the diffusion equation with the integer order and the time-fractional derivatives. The robustness and the computational cheapness of the proposed method are verified using the experimental methane and methyl alcohol transport kinetics through the catalyst pellet.
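The paper estimates the derivative order from asymptotic analytic solutions of the integer- and fractional-order diffusion equations. A far simpler illustration of the same classification idea (assumed here as a generic stand-in, not the authors' procedure) is to fit the power-law exponent of mean-squared-displacement data and classify the regime by whether the exponent falls below, near, or above one:

```python
import numpy as np

def diffusion_regime(t, msd, tol=0.1):
    """Classify diffusion by the exponent alpha in MSD ~ t**alpha.
    alpha < 1: sub-diffusive; alpha ~ 1: standard; alpha > 1: super-diffusive."""
    alpha = np.polyfit(np.log(t), np.log(msd), 1)[0]  # log-log slope
    if alpha < 1.0 - tol:
        regime = "sub-diffusive"
    elif alpha > 1.0 + tol:
        regime = "super-diffusive"
    else:
        regime = "standard"
    return alpha, regime

t = np.linspace(1.0, 100.0, 200)
alpha, regime = diffusion_regime(t, 2.0 * t**0.6)  # synthetic sub-diffusive data
```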
Measurements of VLF polarization and wave normal direction on OGO-F
NASA Technical Reports Server (NTRS)
Helliwell, R. A.
1973-01-01
A major achievement of the F-24 experiment on OGO 6 was a verification of the theory of the polarization of proton whistlers. As predicted, the electron whistler was found to be right-hand polarized and the proton whistler left-hand polarized. The transition from right- to left-hand polarization was found to occur very rapidly. Thus it appears that the experimental technique may allow great accuracy in the measurement of the cross-over frequency, a frequency that provides information on the ionic composition of the ionosphere.
Current Results and Proposed Activities in Microgravity Fluid Dynamics
NASA Technical Reports Server (NTRS)
Polezhaev, V. I.
1996-01-01
The Institute for Problems in Mechanics' Laboratory work in mathematical and physical modelling of fluid mechanics develops models, methods, and software for analysis of fluid flow, instability analysis, direct numerical modelling and semi-empirical models of turbulence, as well as experimental research and verification of these models and their applications in technological fluid dynamics, microgravity fluid mechanics, geophysics, and a number of engineering problems. This paper presents an overview of the results in microgravity fluid dynamics research during the last two years. Nonlinear problems of weakly compressible and compressible fluid flows are discussed.
Offline signature verification using convolutional Siamese network
NASA Astrophysics Data System (ADS)
Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin
2018-04-01
This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning based framework that combines the two stages and can be trained end-to-end. The experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
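A Siamese verifier of this kind maps both signatures through the same network and decides based on the distance between the two embeddings. The sketch below shows the generic decision rule and the contrastive loss commonly used to train such networks; both are standard choices assumed here for illustration, not details taken from the paper.

```python
import numpy as np

def contrastive_loss(d, y, margin=1.0):
    """d: embedding distances; y: 1 for genuine pairs, 0 for forgeries.
    Genuine pairs are pulled together; forgeries are pushed out to the margin."""
    return np.mean(y * d**2 + (1 - y) * np.maximum(margin - d, 0.0)**2)

def verify(emb_a, emb_b, threshold=0.5):
    """Accept the questioned signature if its embedding lies near the reference."""
    return np.linalg.norm(emb_a - emb_b) < threshold
```

At test time only `verify` is needed; the loss shapes the embedding space during training so that a single distance threshold separates genuine pairs from forgeries.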
Artificial tektites: an experimental technique for capturing the shapes of spinning drops
NASA Astrophysics Data System (ADS)
Baldwin, Kyle A.; Butler, Samuel L.; Hill, Richard J. A.
2015-01-01
Determining the shape of a rotating liquid droplet bound by surface tension is an archetypal problem in the study of the equilibrium shapes of spinning and charged droplets, a problem that unites models of the stability of the atomic nucleus with the shapes of astronomical-scale, gravitationally bound masses. The shapes of highly deformed droplets and their stability must be calculated numerically. Although the accuracy of such models has increased with the use of progressively more sophisticated computational techniques and increases in computing power, direct experimental verification is still lacking. Here we present an experimental technique for making wax models of these shapes using diamagnetic levitation. The wax models resemble splash-form tektites, glassy stones formed from molten rock ejected from asteroid impacts. Many tektites have elongated or `dumb-bell' shapes due to their rotation mid-flight before solidification, just as we observe here. Measurements of the dimensions of our wax `artificial tektites' show good agreement with equilibrium shapes calculated by our numerical model, and with previous models. These wax models provide the first direct experimental validation for numerical models of the equilibrium shapes of spinning droplets, of importance to fundamental physics and also to studies of tektite formation.
Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens
Lucon, Enrico; McCowan, Chris N.; Santoyo, Ray L.
2015-01-01
The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of −40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at −40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator’s skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at −40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses. PMID:26958453
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
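The Method of Manufactured Solutions can be illustrated on a far smaller problem than the LAVA test suite: choose an exact solution, derive the forcing term analytically, and confirm that the discretization error decays at the scheme's formal order. The 1D Poisson sketch below is an illustrative stand-in, not part of the LAVA verification suite.

```python
import numpy as np

def mms_error(n):
    """Second-order FD solve of -u'' = f on (0,1), u(0)=u(1)=0, with the
    manufactured solution u = sin(pi*x), hence f = pi^2 * sin(pi*x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Tridiagonal discrete Laplacian with Dirichlet boundary conditions
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
    return np.max(np.abs(u - np.sin(np.pi * x)))  # error vs manufactured solution

e1, e2 = mms_error(31), mms_error(63)   # grid spacings h and h/2
order = np.log2(e1 / e2)                # observed order, should approach 2
```

Seeing the observed order match the scheme's formal order (here, 2) is the verification evidence MMS provides: the code is solving the equations it claims to solve.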
Bat noseleaf model: echolocation function, design considerations, and experimental verification.
Kuc, Roman
2011-05-01
This paper describes a possible bat noseleaf echolocation function that improves target elevation resolution. Bats with a protruding noseleaf can rotate the lancet to act as an acoustic mirror that reflects the nostril emission, modeled as a virtual nostril that produces a delayed emission. The cancellation of the nostril and virtual nostril components at a target produces a sharp spectral notch whose frequency location relates to target elevation. This notch can be observed directly from the swept-frequency emission waveform, suggesting cochlear processing capabilities. Physical acoustic principles indicate the design considerations and trade-offs that a bat can accomplish through noseleaf shape and emission characteristics. An experimental model verifies the analysis and exhibits an elevation versus notch frequency sensitivity of approximately 1°/kHz.
A Comprehensive Validation Methodology for Sparse Experimental Data
NASA Technical Reports Server (NTRS)
Norman, Ryan B.; Blattnig, Steve R.
2010-01-01
A comprehensive program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as models are developed over time. The models are placed under configuration control, and automated validation tests are used so that comparisons can readily be made as models are improved. Though direct comparisons between theoretical results and experimental data are desired for validation purposes, such comparisons are not always possible due to lack of data. In this work, two uncertainty metrics are introduced that are suitable for validating theoretical models against sparse experimental databases. The nuclear physics models, NUCFRG2 and QMSFRG, are compared to an experimental database consisting of over 3600 experimental cross sections to demonstrate the applicability of the metrics. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by analyzing subsets of the model parameter space.
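One plausible reading of the two metrics (the paper's exact definitions may differ and are not reproduced here) is a median relative difference for model-development comparisons, plus a cumulative statistic built over the sorted per-point uncertainties:

```python
import numpy as np

def uncertainty_metrics(model, exp):
    """Relative model-vs-experiment differences, summarized two ways."""
    rel = np.abs(model - exp) / np.abs(exp)          # per-point relative difference
    cumulative = np.cumsum(np.sort(rel)) / rel.size  # running mean over sorted values
    return np.median(rel), cumulative

# Illustrative cross-section values (not data from the NUCFRG2/QMSFRG comparison)
model = np.array([10.0, 9.0, 20.0, 52.0])
exp   = np.array([10.0, 10.0, 25.0, 50.0])
median_u, cum = uncertainty_metrics(model, exp)
```

The median is robust to the handful of badly mispredicted points that dominate a mean, which is why it suits tracking model development over subsets of the parameter space.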
Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister
2017-01-01
Due to relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding-profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications.
These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438
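A kernel-based regression model of the kind used as the prediction engine can be sketched with Gaussian-kernel ridge regression. This is a generic stand-in: the study's actual kernel construction over compound-kinase pairs and its feature representations are not reproduced here, and the data below are synthetic.

```python
import numpy as np

def krr_fit_predict(X_train, y_train, X_test, gamma=0.5, lam=1e-2):
    """Gaussian-kernel ridge regression: alpha = (K + lam*I)^-1 y,
    prediction = K(test, train) @ alpha."""
    def K(A, B):
        d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)  # pairwise sq. distances
        return np.exp(-gamma * d2)
    alpha = np.linalg.solve(K(X_train, X_train) + lam * np.eye(len(X_train)), y_train)
    return K(X_test, X_train) @ alpha

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (80, 2))       # stand-in descriptors
y = np.sin(3.0 * X[:, 0]) + X[:, 1]**2    # stand-in binding affinities
y_hat = krr_fit_predict(X, y, X)
```

The ridge term `lam` controls how aggressively the model interpolates the measured affinities versus smoothing over them, which is the main knob when filling gaps in a sparsely measured interaction map.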
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... non-federal community, including the academic, commercial, and public safety sectors, to implement a..., Verification, Demonstration and Trials: Technical Workshop II on Coordinating Federal Government/Private Sector Spectrum Innovation Testing Needs AGENCY: The National Coordination Office (NCO) for Networking and...
Bayesian truthing as experimental verification of C4ISR sensors
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew
2015-05-01
In this paper, the general methodology for experimental verification/validation of C4ISR and other sensors' performance is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C3ISR, QC, ATR (Automatic Target Recognition), terrorism-related events, and many others. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
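The core of any Bayesian treatment of a binary sensor is that the decision-relevant quantity is the posterior, not the raw detection rate: with a rare target, the probability that an alarm is real can be low even for an excellent sensor. A minimal sketch of that calculation (the rates below are illustrative, not figures from the paper):

```python
def posterior_target_given_alarm(p_detect, p_false_alarm, prior):
    """Bayes' rule for a binary sensor: P(target | alarm)."""
    p_alarm = p_detect * prior + p_false_alarm * (1.0 - prior)
    return p_detect * prior / p_alarm

# A sensor with 99% detection and 5% false-alarm rates, but a rare (1%) target:
ppv = posterior_target_given_alarm(0.99, 0.05, 0.01)  # about 0.17
```

This base-rate effect is exactly why the decision process, rather than the sensing medium alone, determines the practical performance of a verification system.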
One-time pad, complexity of verification of keys, and practical security of quantum cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molotkov, S. N., E-mail: sergei.molotkov@gmail.com
2016-11-15
A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.
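The trace distance invoked here is the standard quantum-information metric D(rho, sigma) = (1/2)||rho - sigma||_1, i.e. half the sum of the singular values of the (Hermitian) difference of the density matrices. A minimal numerical sketch with illustrative states, not the key-distribution states analyzed in the paper:

```python
import numpy as np

def trace_distance(rho, sigma):
    """D(rho, sigma) = 0.5 * ||rho - sigma||_1; for a Hermitian difference
    the singular values are the absolute eigenvalues, so SVD suffices."""
    return 0.5 * np.sum(np.linalg.svd(rho - sigma, compute_uv=False))

rho = np.diag([1.0, 0.0])    # pure state |0><0|
sigma = np.diag([0.5, 0.5])  # maximally mixed qubit state
d = trace_distance(rho, sigma)
```

Operationally, D bounds how well any measurement can distinguish the two states, which is what links it to the adversary's ability to tell the actual key apart from an ideal one.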
Code of Federal Regulations, 2012 CFR
2012-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
Code of Federal Regulations, 2014 CFR
2014-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
Code of Federal Regulations, 2013 CFR
2013-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
NASA Technical Reports Server (NTRS)
1978-01-01
The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a tabletop exercise held at Sandia National Laboratories, Albuquerque, on September 3, 2014, regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but requires independent means to trust the authenticity of the reference warhead, a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
Verification of component mode techniques for flexible multibody systems
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1990-01-01
Investigations were conducted in the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.
Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon
NASA Astrophysics Data System (ADS)
Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen
Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. First, we give the definition of the concept space of a group of candidate hyponymy relations. Second, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.
NASA Astrophysics Data System (ADS)
Miyamoto, Yuki; Mizoguchi, Asao; Kanamori, Hideto
2017-03-01
The bleaching process in the C-F stretching mode (ν3 band) of CH3F-(ortho-H2)n [n = 0 and 1] clusters in solid para-H2 was monitored using pump and probe laser spectroscopy on the C-H stretching mode (ν1 and 2ν5 bands). From an analysis of the depleted spectral profiles, the transition frequency and linewidth of each cluster were directly determined. The results agree with the values previously derived from a deconvolution analysis of the broadened ν1/2ν5 spectrum observed by FTIR spectroscopy. The complementary increase and decrease between the n = 0 and 1 components were also verified through monitoring the ν1 and 2ν5 bands, which suggests a closed system among the CH3F-(ortho-H2)n clusters. These observations provide experimental verification of the CH3F-(ortho-H2)n cluster model. On the other hand, a trial to observe the bleaching process by pumping the C-H stretching mode was not successful. This result may be important for understanding the dynamics of vibrational relaxation processes in CH3F-(ortho-H2)n in solid para-H2.
NASA Astrophysics Data System (ADS)
Yoon, K. J.; Park, K. H.; Lee, S. K.; Goo, N. S.; Park, H. C.
2004-06-01
This paper describes an analytical design model for a layered piezo-composite unimorph actuator and its numerical and experimental verification using a LIPCA (lightweight piezo-composite curved actuator) that is lighter than other conventional piezo-composite type actuators. The LIPCA is composed of top fiber composite layers with high modulus and low CTE (coefficient of thermal expansion), a middle PZT ceramic wafer, and base layers with low modulus and high CTE. The advantages of the LIPCA design are to replace the heavy metal layer of THUNDER by lightweight fiber-reinforced plastic layers without compromising the generation of high force and large displacement and to have design flexibility by selecting the fiber direction and the number of prepreg layers. In addition to the lightweight advantage and design flexibility, the proposed device can be manufactured without adhesive layers when we use a resin prepreg system. A piezo-actuation model for a laminate with piezo-electric material layers and fiber composite layers is proposed to predict the curvature and residual stress of the LIPCA. To predict the actuation displacement of the LIPCA with curvature, a finite element analysis method using the proposed piezo-actuation model is introduced. The predicted deformations are in good agreement with the experimental ones.
Design and experimental verification of a water-like pentamode material
NASA Astrophysics Data System (ADS)
Zhao, Aiguo; Zhao, Zhigao; Zhang, Xiangdong; Cai, Xuan; Wang, Lei; Wu, Tao; Chen, Hong
2017-01-01
Pentamode materials approximate tailorable artificial liquids. Recently, microscopic versions of these intricate structures have been fabricated, and static mechanical experiments reveal that a ratio of bulk modulus to shear modulus as large as 1000 can be obtained. However, no direct acoustic experimental characterizations have been reported yet. In this paper, a water-like two-dimensional pentamode material sample is designed and fabricated with a single metallic material, which is a hollow metallic foam-like structure at centimeter scale. Acoustic simulation and experimental testing results indicate that the designed pentamode material mimics water in acoustic properties over a wide frequency range, i.e., it exhibits transparency when surrounded by water. This work contributes to the development of microstructural design of materials with specific modulus and density distribution, thus paving the way for the physical realization of special acoustic devices such as metamaterial lenses and vibration isolation.
Wright, Kevin B; King, Shawn; Rosenberg, Jenny
2014-01-01
This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.
NASA Technical Reports Server (NTRS)
Miller, Rolf W.; Argrow, Brian M.; Center, Kenneth B.; Brauckmann, Gregory J.; Rhode, Matthew N.
1998-01-01
The NASA Langley Research Center Unitary Plan Wind Tunnel and the 20-Inch Mach 6 Tunnel were used to test two osculating cones waverider models. The Mach-4 and Mach-6 shapes were generated using the interactive design tool WIPAR. WIPAR performance predictions are compared to the experimental results. Vapor screen results for the Mach-4 model at the on-design Mach number provide visual verification that the shock is attached along the entire leading edge, within the limits of observation. WIPAR predictions of pressure distributions and aerodynamic coefficients show general agreement with the corresponding experimental values.
Wang, Guoqiang; Zhang, Honglin; Zhao, Jiyang; Li, Wei; Cao, Jia; Zhu, Chengjian; Li, Shuhua
2016-05-10
Density functional theory (DFT) investigations revealed that 4-cyanopyridine was capable of homolytically cleaving the B-B σ bond of diborane via cooperative coordination to the two boron atoms of the diborane to generate pyridine boryl radicals. Our experimental verification provides supportive evidence for this new B-B activation mode. With this novel activation strategy, we have experimentally realized the catalytic reduction of azo-compounds to hydrazine derivatives, deoxygenation of sulfoxides to sulfides, and reduction of quinones with B2(pin)2 under mild conditions. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Podkościelny, P.; Nieszporek, K.
2007-01-01
Surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting an integral equation. The experimental data of phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) have been taken from the literature. AED functions for phenol adsorption, generated by application of the regularization method, have been verified. The Grand Canonical Monte Carlo (GCMC) simulation technique has been used as the verification tool. The definitive stage of verification was a comparison of the experimental adsorption data with those obtained from GCMC simulations. The necessary information for performing the simulations was provided by the parameters of the AED functions calculated by the regularization method.
Cleanup Verification Package for the 100-F-20, Pacific Northwest Laboratory Parallel Pits
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. J. Appel
2007-01-22
This cleanup verification package documents completion of remedial action for the 100-F-20, Pacific Northwest Laboratory Parallel Pits waste site. This waste site consisted of two earthen trenches thought to have received both radioactive and nonradioactive material related to the 100-F Experimental Animal Farm.
Real-space mapping of electronic orbitals.
Löffler, Stefan; Bugnet, Matthieu; Gauquelin, Nicolas; Lazar, Sorin; Assmann, Elias; Held, Karsten; Botton, Gianluigi A; Schattschneider, Peter
2017-06-01
Electronic states are responsible for most material properties, including chemical bonds, electrical and thermal conductivity, as well as optical and magnetic properties. Experimentally, however, they remain mostly elusive. Here, we report the real-space mapping of selected transitions between p and d states on the Ångström scale in bulk rutile (TiO2) using electron energy-loss spectrometry (EELS), revealing information on individual bonds between atoms. On the one hand, this enables the experimental verification of theoretical predictions about electronic states. On the other hand, it paves the way for directly investigating electronic states under conditions that are at the limit of the current capabilities of numerical simulations, e.g., the electronic states at defects, interfaces, and quantum dots. Copyright © 2017 Elsevier B.V. All rights reserved.
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
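A functional fault model of the kind described above can be sketched as a directed graph whose edges are failure-effect propagation paths; an automated verification step then reduces to checking reachable effect sets against expectations. The component and effect names below are hypothetical:

```python
from collections import deque

# Toy functional fault model: nodes are failure effects, directed edges are
# propagation paths through the physical architecture (names hypothetical).
ffm = {
    "valve_stuck": ["low_flow"],
    "low_flow": ["pump_cavitation", "low_pressure"],
    "pump_cavitation": ["vibration_high"],
    "low_pressure": ["engine_underperformance"],
    "vibration_high": [],
    "engine_underperformance": [],
}

def effects_of(model, failure):
    """Breadth-first search for all downstream effects of a failure mode."""
    seen, queue = set(), deque([failure])
    while queue:
        for nxt in model.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Automated verification step: compare propagation against the expected set.
expected = {"low_flow", "pump_cavitation", "low_pressure",
            "vibration_high", "engine_underperformance"}
assert effects_of(ffm, "valve_stuck") == expected
```

Checks like this are cheap to run for every failure mode in the model, which is what makes automated verification attractive compared with manual inspection of each component model.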
NASA Technical Reports Server (NTRS)
Ponchak, George E.; Chun, Donghoon; Katehi, Linda P. B.; Yook, Jong-Gwan
1999-01-01
Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior 3D-FEM electromagnetic simulations have shown that metal-filled via-hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3D-FEM simulations is demonstrated for commercially fabricated LTCC packages.
OpenMP 4.5 Validation and Verification Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pophale, Swaroop S; Bernholdt, David E; Hernandez, Oscar R
2017-12-15
OpenMP, a directive-based programming API, introduces directives for accelerator devices that programmers are starting to use more frequently in production codes. To make sure OpenMP directives work correctly across architectures, it is critical to have a mechanism that tests an implementation's conformance to the OpenMP standard. This testing process can uncover ambiguities in the OpenMP specification, which helps compiler developers and users make better use of the standard. We fill this gap with our validation and verification test suite, which focuses on the offload directives available in OpenMP 4.5.
Verification technology of remote sensing camera satellite imaging simulation based on ray tracing
NASA Astrophysics Data System (ADS)
Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun
2017-08-01
Remote sensing satellite camera imaging simulation technology is broadly used to evaluate satellite imaging quality and to test data application systems, but the simulation precision is hard to examine. In this paper, we propose an experimental simulation verification method based on comparison under test-parameter variation. Following the ray-tracing-based simulation model, the experiment verifies the model precision by changing the types of devices, which correspond to the parameters of the model. The experimental results show that the similarity between the image produced by the ray-tracing model and the experimental image is 91.4%, indicating that the model can simulate the remote sensing satellite imaging system very well.
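As a hedged illustration of comparing a simulated image against an experimental one, a generic zero-mean normalized cross-correlation can serve as the similarity score; the paper does not specify its metric, and the images here are synthetic:

```python
import numpy as np

def similarity(img_a, img_b):
    """Zero-mean normalized cross-correlation, mapped to [0, 1]."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    ncc = (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())
    return 0.5 * (ncc + 1.0)

rng = np.random.default_rng(1)
simulated = rng.random((64, 64))                            # stand-in simulated image
experimental = simulated + rng.normal(0.0, 0.05, (64, 64))  # noisy "measured" copy
s = similarity(simulated, experimental)
```

Any monotone image-similarity measure (structural similarity, mutual information, etc.) could replace the cross-correlation here; the point is only that model fidelity is scored by a single comparable number.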
Jeong, Min Yong; Chang, Seo Hyoung; Kim, Beom Hyun; ...
2017-10-04
Strong spin-orbit coupling lifts the degeneracy of t2g orbitals in 5d transition-metal systems, leaving a Kramers doublet and quartet with effective angular momentum of Jeff = 1/2 and 3/2, respectively. These spin-orbit entangled states can host exotic quantum phases such as the topological Mott state, unconventional superconductivity, and quantum spin liquid. The lacunar spinel GaTa4Se8 was theoretically predicted to form the molecular Jeff = 3/2 ground state. Experimental verification of its existence is an important first step to exploring the consequences of the Jeff = 3/2 state. Here, we report direct experimental evidence of the Jeff = 3/2 state in GaTa4Se8 by means of excitation spectra of resonant inelastic x-ray scattering at the Ta L3 and L2 edges. In conclusion, we found that the excitations involving the Jeff = 1/2 molecular orbital were absent only at the Ta L2 edge, manifesting the realization of the molecular Jeff = 3/2 ground state in GaTa4Se8.
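The doublet/quartet structure invoked above can be checked numerically: diagonalizing an effective spin-orbit Hamiltonian H = λ L·S on the t2g manifold (effective l = 1, s = 1/2, with λ set to 1 as an illustrative unit and sign convention chosen for simplicity) yields a 2-fold Jeff = 1/2 level and a 4-fold Jeff = 3/2 level:

```python
import numpy as np

def angmom(j):
    """Matrices (Jx, Jy, Jz) for angular momentum quantum number j."""
    m = np.arange(j, -j - 1, -1)           # basis |j, m>, m descending
    jplus = np.zeros((m.size, m.size))
    for i in range(1, m.size):
        jplus[i - 1, i] = np.sqrt(j * (j + 1) - m[i] * (m[i] + 1))
    return 0.5 * (jplus + jplus.T), -0.5j * (jplus - jplus.T), np.diag(m)

Lx, Ly, Lz = angmom(1.0)    # effective l = 1 for the t2g manifold
Sx, Sy, Sz = angmom(0.5)

# H = lambda * L.S on the 6-dimensional t2g x spin space, with lambda = 1.
H = sum(np.kron(L, S) for L, S in ((Lx, Sx), (Ly, Sy), (Lz, Sz)))
levels = np.round(np.linalg.eigvalsh(H), 6)

# Expected: a Kramers doublet (J = 1/2) at -1 and a quartet (J = 3/2)
# at +1/2, in units of the coupling constant.
```

This reproduces the standard result L·S = [J(J+1) - L(L+1) - S(S+1)]/2, i.e. a two-fold and a four-fold level, which is the degeneracy structure the RIXS experiment probes.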
Campbell, J Q; Coombs, D J; Rao, M; Rullkoetter, P J; Petrella, A J
2016-09-06
The purpose of this study was to seek broad verification and validation of human lumbar spine finite element models created using a previously published automated algorithm. The automated algorithm takes segmented CT scans of lumbar vertebrae, automatically identifies important landmarks and contact surfaces, and creates a finite element model. Mesh convergence was evaluated by examining changes in key output variables in response to mesh density. Semi-direct validation was performed by comparing experimental results for a single specimen to the automated finite element model results for that specimen with calibrated material properties from a prior study. Indirect validation was based on a comparison of results from automated finite element models of 18 individual specimens, all using one set of generalized material properties, to a range of data from the literature. A total of 216 simulations were run and compared to 186 experimental data ranges in all six primary bending modes up to 7.8 Nm with follower loads up to 1000 N. Mesh convergence results showed less than a 5% difference in key variables when the original mesh density was doubled. The semi-direct validation results showed that the automated method produced results comparable to manual finite element modeling methods. The indirect validation results showed a wide range of outcomes due to variations in the geometry alone. The studies showed that the automated models can be used to reliably evaluate lumbar spine biomechanics, specifically within our intended context of use: in pure bending modes, under relatively low non-injurious simulated in vivo loads, to predict torque rotation response, disc pressures, and facet forces. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Xiaohui; Song, Yingxiong
2018-02-01
By exploiting the non-Kolmogorov model and Rytov approximation theory, a propagation model for Bessel-Gaussian vortex beams (BGVB) propagating in a subway tunnel is derived. Based on the propagation model, a model of the orbital angular momentum (OAM) mode probability distribution is established to evaluate the propagation performance when the beam propagates along both the longitudinal and transverse directions in the subway tunnel. Through numerical simulations and experimental verification, the influences of the various parameters of the BGVB and the turbulence on the OAM mode probability distribution are evaluated, and the simulation results are consistent with the experimental statistics. The results verify that the middle area of the turbulence is more beneficial for vortex beam propagation than the edge. When the BGVB propagates along the longitudinal direction in the subway tunnel, the effects of turbulence on the OAM mode probability distribution can be decreased by selecting a larger anisotropy parameter, smaller coherence length, larger non-Kolmogorov power spectrum coefficient, smaller topological charge number, deeper subway tunnel, lower train speed, and longer wavelength. When the BGVB propagates along the transverse direction, the influences can also be mitigated by adopting a larger topological charge number, smaller non-Kolmogorov power spectrum coefficient, smaller refractive structure index, shorter wavelength, and shorter propagation distance.
Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi
2012-04-05
Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol for predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and we newly identified three incednine-binding proteins. This study revealed that our proposed protocol for predicting target proteins, combining in silico screening and experimental verification, is useful, and it provides new insight into a strategy for identifying target proteins of small molecules.
NASA Technical Reports Server (NTRS)
Pierzga, M. J.
1981-01-01
The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.
NASA Astrophysics Data System (ADS)
Bańkowski, Wojciech; Król, Jan; Gałązka, Karol; Liphardt, Adam; Horodecka, Renata
2018-05-01
Recycling of bituminous pavements is an issue increasingly being discussed in Poland. The analysis of domestic and foreign experience indicates a need to develop this technology in our country, in particular hot feeding and production technologies. Various steps are being taken in this direction, including research projects. One of them is the InnGA project entitled “Reclaimed asphalt pavement: Innovative technology of bituminous mixtures using material from reclaimed asphalt pavement”. The paper presents the results of research involving the design of bituminous mixtures in accordance with the required properties and in excess of the reclaimed asphalt content permitted by the technical guidelines. It presents selected bituminous mixtures with RAP contents of up to 50% and the results of tests verifying the industrial production of those mixtures. The article discusses the details of the design process for mixtures with a high content of reclaimed asphalt, the production trials carried out, and the results of the industrial production verification tests. Testing included basic tests according to the Polish technical requirements of WT-2 as well as extended functional testing. The conducted tests and analyses helped to determine the usefulness of the developed bituminous mixtures for use in experimental sections and confirmed the possibility of using an increased amount of reclaimed asphalt, up to 50%, in mixtures intended for the construction of national roads.
Verification of the Uncertainty Principle by Using Diffraction of Light Waves
ERIC Educational Resources Information Center
Nikolic, D.; Nesic, Lj
2011-01-01
We described a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty. We assumed that the uncertainty in position is the slit width. For the…
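The estimate described above can be reproduced numerically: taking the position uncertainty as the slit width a and inferring the transverse wave-number spread from the first diffraction minimum (sin θ = λ/a) gives a product of about 2π, comfortably above the lower bound of 1/2. The wavelength and slit width below are illustrative choices:

```python
import math

wavelength = 632.8e-9        # He-Ne laser wavelength, m (illustrative)
a = 50e-6                    # slit width, m (illustrative)

# Position uncertainty: the slit width itself.
dx = a

# Angular half-width of the central maximum: sin(theta) = lambda / a.
sin_theta = wavelength / a

# Transverse wavenumber spread inferred from that angular width.
k = 2.0 * math.pi / wavelength
dk = k * sin_theta           # equals 2*pi / a, independent of wavelength

product = dx * dk            # ~ 2*pi, well above the bound of 1/2
```

Note that the product is independent of the particular wavelength and slit width, which is exactly why a classroom diffraction experiment can serve as a verification of the principle.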
Simulation of Laboratory Tests of Steel Arch Support
NASA Astrophysics Data System (ADS)
Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof
2017-03-01
The total load-bearing capacity of steel arch yielding roadway supports is among their most important characteristics. These values can be obtained in two ways: experimental measurements in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming; a computer model is therefore very valuable, but proper tuning requires verification by experiment. At the cooperating institutions (GIG Katowice, VSB-Technical University of Ostrava, and the Institute of Geonics ASCR), this verification was successful. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used for further calculations of the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the material properties of the steel, torque magnitudes, and friction coefficient values, can then be determined relatively quickly by changing the properties of the investigated steel arch supports.
Experimental measurement-device-independent verification of quantum steering
NASA Astrophysics Data System (ADS)
Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.
2015-01-01
Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.
Experimental measurement-device-independent verification of quantum steering.
Kocsis, Sacha; Hall, Michael J W; Bennet, Adam J; Saunders, Dylan J; Pryde, Geoff J
2015-01-07
Bell non-locality between distant quantum systems--that is, joint correlations which violate a Bell inequality--can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.
Jongen, H A; Thijssen, J M; van den Aarssen, M; Verhoef, W A
1986-02-01
In this paper, a closed-form expression is derived for the absorption of ultrasound by biological tissues. In this expression, the viscothermal and viscoelastic theories of relaxation processes are combined. Three relaxation time distribution functions are introduced, and it is assumed that each of these distributions can be described by an identical and simple hyperbolic function. Several simplifying assumptions had to be made to enable the experimental verification of the derived closed-form expression of the absorption coefficient. The simplified expression leaves two degrees of freedom, and it was fitted to the experimental data obtained from homogenized beef liver. The model produced a considerably better fit to the data than other, more pragmatic models for the absorption coefficient as a function of frequency that could be found in the literature. Scattering in beef liver was estimated indirectly from the difference between attenuation in in vitro liver tissue and absorption in a homogenate. The frequency dependence of the scattering coefficient could be described by a power law with a power of the order of 2. A comparable figure was found in direct backscattering measurements, performed at our laboratory with the same liver samples [Van den Aarssen et al., J. Acoust. Soc. Am. (to be published)]. A model for scattering recently proposed by Sehgal and Greenleaf [Ultrason. Imag. 6, 60-80 (1984)] was fitted to the scattering data as well. This latter model enabled the estimation of a maximum scatterer distance, which appeared to be of the order of 25 microns.
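The power-law frequency dependence reported above can be estimated by linear regression in log-log space. The sketch below fits synthetic scattering coefficients that follow an f² law; the amplitude and frequency range are illustrative, not the paper's data:

```python
import numpy as np

def fit_power_law(freq, coeff):
    """Fit coeff = A * freq**n by linear regression in log-log space."""
    n, logA = np.polyfit(np.log(freq), np.log(coeff), 1)
    return np.exp(logA), n

# Synthetic scattering coefficients following an f^2 law (illustrative).
f = np.linspace(2e6, 8e6, 7)        # frequencies, Hz
mu_s = 3e-15 * f ** 2               # arbitrary amplitude
A, n = fit_power_law(f, mu_s)       # recovers n ~ 2
```

With noisy experimental data the fitted exponent carries an uncertainty, so "a power of the order of 2" is the appropriately hedged statement.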
Seismic velocities in fractured rocks: An experimental verification of Hudson's theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peacock, S.; McCann, C.; Sothcott, J.
1994-01-01
Flow of fluids in many hydrocarbon reservoirs and aquifers is enhanced by the presence of cracks and fractures. These cracks could be detected by their effects on the propagation of compressional and shear waves through the reservoir: several theories, including Hudson's, claim to predict the seismic effects of cracks. Although Hudson's theory has already been used to calculate crack densities from seismic surveys, the predictions of the theory have not yet been tested experimentally on rocks containing a known crack distribution. This paper describes an experimental verification of the theory. The rock used, Carrara marble, was chosen for its uniformity and low porosity, so that the effect of cracks would not be obscured by other influences. Cracks were induced by loading of laboratory specimens. Velocities of compressional and shear waves were measured by ultrasound at 0.85 MHz in dry and water-saturated specimens at high and low effective pressures. The cracks were then counted in polished sections of the specimens. In "dry" specimens with both dry and saturated cracks, Hudson's theory overpredicted observed crack densities by a constant amount that is attributed to the observed value being systematically underestimated. The theory made poor predictions for fully saturated specimens. Shear-wave splitting, caused by anisotropy due to both crystal and crack alignment, was observed. Cracks were seen to follow grain boundaries rather than the direction of maximum compression due to loading. The results demonstrate that Hudson's theory may be used in some cases to determine crack and fracture densities from compressional- and shear-wave velocity data.
NASA Technical Reports Server (NTRS)
Masiulaniec, K. Cyril; Vanfossen, G. James, Jr.; Dewitt, Kenneth J.; Dukhan, Nihad
1995-01-01
A technique was developed to cast frozen ice shapes that had been grown on a metal surface. This technique was applied to a series of ice shapes that were grown in the NASA Lewis Icing Research Tunnel on flat plates. Nine flat plates, 18 inches square, were obtained, from which aluminum castings were made that gave good ice shape characterizations. Test strips taken from these plates were outfitted with heat flux gages such that, when placed in a dry wind tunnel, they can be used to experimentally map out the convective heat transfer coefficient in the direction of flow over the roughened surfaces. The effects on the heat transfer coefficient of both parallel and accelerating flow will be studied. The smooth-plate model verification baseline data, as well as one ice-roughened test case, are presented.
Jiménez-Osés, Gonzalo; Brockway, Anthony J; Shaw, Jared T; Houk, K N
2013-05-01
The mechanism of direct displacement of alkoxy groups in vinylogous and aromatic esters by Grignard reagents, a reaction that is not observed with expectedly better tosyloxy leaving groups, is elucidated computationally. The mechanism of this reaction has been determined to proceed through the inner-sphere attack of nucleophilic alkyl groups from magnesium to the reacting carbons via a metalaoxetane transition state. The formation of a strong magnesium chelate with the reacting alkoxy and carbonyl groups dictates the observed reactivity and selectivity. The influence of ester, ketone, and aldehyde substituents was investigated. In some cases, the calculations predicted the formation of products different than those previously reported; these predictions were then verified experimentally. The importance of studying the actual system, and not simplified models as computational systems, is demonstrated.
Jiménez-Osés, Gonzalo; Brockway, Anthony J.; Shaw, Jared T.; Houk, K. N.
2013-01-01
The mechanism of direct displacement of alkoxy groups in vinylogous and aromatic esters by Grignard reagents, a reaction that is not observed with expectedly better tosyloxy leaving groups, is elucidated computationally. The mechanism of this reaction has been determined to proceed through the inner-sphere attack of nucleophilic alkyl groups from magnesium to the reacting carbons via a metalaoxetane transition state. The formation of a strong magnesium chelate with the reacting alkoxy and carbonyl groups dictates the observed reactivity and selectivity. The influence of ester, ketone and aldehyde substituents was investigated. In some cases, the calculations predicted the formation of products different than those previously reported; these predictions were then verified experimentally. The importance of studying the actual system, and not simplified models as computational systems, is demonstrated. PMID:23601086
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Anzic, G.
1979-01-01
NASA is conducting a series of millimeter-wave satellite communication system and market studies to: (1) determine potential domestic 30/20 GHz satellite concepts and market potential, and (2) establish the requirements for a suitable technology verification payload which, although intended to be modest in capacity, would sufficiently demonstrate key technologies and experimentally address key operational issues. Preliminary results and critical issues of the current contracted effort are described. Also included is a description of a NASA-developed multibeam satellite payload configuration which may be representative of concepts utilized in a technology flight verification program.
NASA Technical Reports Server (NTRS)
Neal, G.
1988-01-01
Flexible walled wind tunnels have for some time been used to reduce wall interference effects at the model. A necessary part of the 3-D wall adjustment strategy being developed for the Transonic Self-Streamlining Wind Tunnel (TSWT) of Southampton University is the use of influence coefficients. The influence of a wall bump on the centerline flow in TSWT has been calculated theoretically using a streamline curvature program. This report details the experimental verification of these influence coefficients and concludes that it is valid to use the theoretically determined values in 3-D model testing.
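Wall adjustment via influence coefficients is, in essence, a linear inverse problem: if each wall jack's effect on the centerline flow is known, the deflections that cancel a measured interference distribution follow from a least-squares solve. A minimal sketch with a hypothetical influence matrix (Gaussian bump responses; all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical influence-coefficient matrix: entry (i, j) is the centerline
# velocity perturbation at station i per unit deflection of wall jack j,
# modelled here as a Gaussian bump response (illustrative only).
n_stations, n_jacks = 12, 6
stations = np.arange(n_stations)[:, None]
jacks = 2.0 * np.arange(n_jacks)[None, :]
A = np.exp(-0.5 * (stations - jacks) ** 2)

# Measured interference distribution to be cancelled at the centerline.
u_interference = rng.normal(0.0, 1.0, n_stations)

# Wall adjustment: least-squares jack deflections that cancel it.
w, *_ = np.linalg.lstsq(A, -u_interference, rcond=None)
residual = np.linalg.norm(A @ w + u_interference)
```

Because the influence coefficients enter only through this linear solve, verifying them experimentally (as the report does) directly validates the whole adjustment strategy.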
Analysis and discussion on the experimental data of electrolyte analyzer
NASA Astrophysics Data System (ADS)
Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei
2018-06-01
In the subsequent verification of electrolyte analyzers, we found that the instruments can achieve good repeatability and stability in repeated measurements within a short period of time, in line with the requirements of the verification regulation for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of instruments from different manufacturers differ greatly. To find and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.
78 FR 53027 - Balloting Materials Postage
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-28
... Postal Service recognizes the potential for inconsistency when voters self-print ballots and use... supplementary notification specifically directed to voters who self-print and return ballots, such as: ``Please... verification to ensure compliance with the marking requirements. As part of the verification procedure, mailers...
9 CFR 416.17 - Agency verification.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...
9 CFR 416.17 - Agency verification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...
9 CFR 416.17 - Agency verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...
9 CFR 416.17 - Agency verification.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...
9 CFR 416.17 - Agency verification.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment. ...
Cross-checking of Large Evaluated and Experimental Nuclear Reaction Databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeydina, O.; Koning, A.J.; Soppera, N.
2014-06-15
Automated methods are presented for the verification of large experimental and evaluated nuclear reaction databases (e.g. EXFOR, JEFF, TENDL). These methods allow an assessment of the overall consistency of the data and detect aberrant values in both evaluated and experimental databases.
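One simple automated screen for aberrant values, of the kind such cross-checking tools rely on, is a robust modified z-score based on the median and MAD. The cross-section values below are synthetic, and the method is a generic illustration rather than the specific algorithm used for EXFOR/JEFF/TENDL:

```python
import numpy as np

def aberrant(values, threshold=3.5):
    """Flag aberrant entries using the modified z-score (median/MAD),
    a robust outlier screen that tolerates a few wild values."""
    v = np.asarray(values, dtype=float)
    med = np.median(v)
    mad = np.median(np.abs(v - med))
    if mad == 0.0:
        return np.zeros(v.size, dtype=bool)
    z = 0.6745 * (v - med) / mad
    return np.abs(z) > threshold

# Cross sections (mb) at nearby energies, one mistyped by a factor of 1000.
xs = [1.02, 0.98, 1.05, 0.99, 1.01, 1030.0, 1.00]
flags = aberrant(xs)
```

Median-based statistics are preferred over the mean and standard deviation here because a single transcription error would otherwise inflate the spread and mask itself.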
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is significantly shifted. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated using a template optimized for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
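The shift-estimation step can be sketched in one dimension: find the offset that maximizes the cross-correlation between a stored template and a new sample, then re-prompt the user when the offset exceeds the permissible level. This is a hedged toy version of the idea, not the authors' optical core-detection implementation.

```python
# Toy 1-D sketch of template-based shift estimation (an assumption-laden
# illustration; the paper works on 2-D fingerprint images optically).

def estimate_shift(template, sample, max_shift=8):
    """Return the integer shift of `sample` that best aligns it with `template`."""
    best_shift, best_score = 0, float("-inf")
    n = len(template)
    for shift in range(-max_shift, max_shift + 1):
        # Cross-correlation score at this trial offset.
        score = sum(
            template[i] * sample[i + shift]
            for i in range(n)
            if 0 <= i + shift < n
        )
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

signal = [0, 0, 1, 3, 1, 0, 0, 0]
shifted = [0, 0, 0, 0, 1, 3, 1, 0]   # same pattern moved 2 samples right
print(estimate_shift(signal, shifted))  # → 2
```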
Energy- and time-resolved detection of prompt gamma-rays for proton range verification.
Verburg, Joost M; Riley, Kent; Bortfeld, Thomas; Seco, Joao
2013-10-21
In this work, we present experimental results of a novel prompt gamma-ray detector for proton beam range verification. The detection system features an actively shielded cerium-doped lanthanum(III) bromide scintillator, coupled to a digital data acquisition system. The acquisition was synchronized to the cyclotron radio frequency to separate the prompt gamma-ray signals from the later-arriving neutron-induced background. We designed the detector to provide a high energy resolution and an effective reduction of background events, enabling discrete proton-induced prompt gamma lines to be resolved. Measuring discrete prompt gamma lines has several benefits for range verification. As the discrete energies correspond to specific nuclear transitions, the magnitudes of the different gamma lines have unique correlations with the proton energy and can be directly related to nuclear reaction cross sections. The quantification of discrete gamma lines also enables elemental analysis of tissue in the beam path, providing a better prediction of prompt gamma-ray yields. We present the results of experiments in which a water phantom was irradiated with proton pencil-beams in a clinical proton therapy gantry. A slit collimator was used to collimate the prompt gamma-rays, and measurements were performed at 27 positions along the path of proton beams with ranges of 9, 16 and 23 g cm^-2 in water. The magnitudes of discrete gamma lines at 4.44, 5.2 and 6.13 MeV were quantified. The prompt gamma lines were found to be clearly resolved in dimensions of energy and time, and had a reproducible correlation with the proton depth-dose curve. We conclude that the measurement of discrete prompt gamma-rays for in vivo range verification of clinical proton beams is feasible, and plan to further study methods and detector designs for clinical use.
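The RF-synchronized background rejection described above amounts to a phase gate: an event is kept as "prompt" only if its arrival time, folded into the RF period, falls inside a narrow window. The sketch below illustrates the gating logic; the period and window edges are arbitrary assumptions, not the values of the clinical system.

```python
# Illustrative time-gating sketch (not the authors' DAQ code): keep events
# whose phase within the cyclotron RF period falls in a prompt window;
# later-arriving neutron-induced events are rejected as background.

RF_PERIOD_NS = 13.8            # assumed RF period, for illustration only
PROMPT_WINDOW_NS = (1.0, 4.0)  # assumed gate relative to the RF phase

def is_prompt(event_time_ns):
    """Keep an event if its phase within the RF period falls in the gate."""
    phase = event_time_ns % RF_PERIOD_NS
    lo, hi = PROMPT_WINDOW_NS
    return lo <= phase <= hi

events = [2.1, 7.5, 15.9, 30.3, 40.0]
print([t for t in events if is_prompt(t)])  # → [2.1, 15.9, 30.3]
```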
Investigation of designated eye position and viewing zone for a two-view autostereoscopic display.
Huang, Kuo-Chung; Chou, Yi-Heng; Lin, Lang-chin; Lin, Hoang Yan; Chen, Fu-Hao; Liao, Ching-Chiu; Chen, Yi-Han; Lee, Kuen; Hsu, Wan-Hsuan
2014-02-24
Designated eye position (DEP) and viewing zone (VZ) are important optical parameters in the design of a two-view autostereoscopic display. Although much research has been done to date, little empirical evidence has been found to establish a direct relationship between design and measurement. More rigorous studies and verifications to investigate DEP and to ascertain the VZ criterion would therefore be valuable. We propose evaluation metrics based on equivalent luminance (EL) and binocular luminance (BL) to determine DEP and VZ for a two-view autostereoscopic display. Simulation and experimental results show that our proposed evaluation metrics can locate the DEP and VZ accurately.
NASA Astrophysics Data System (ADS)
Remy, Samuel; Benedetti, Angela; Jones, Luke; Razinger, Miha; Haiden, Thomas
2014-05-01
The WMO-sponsored Working Group on Numerical Experimentation (WGNE) set up a project aimed at understanding the importance of aerosols for numerical weather prediction (NWP). Three cases are being investigated by several NWP centres with aerosol capabilities: a severe dust case that affected Southern Europe in April 2012, a biomass burning case in South America in September 2012, and an extreme pollution event in Beijing (China) which took place in January 2013. At ECMWF these cases are being studied using the MACC-II system with radiatively interactive aerosols. Some preliminary results related to the dust and fire events will be presented here. A preliminary verification of the impact of the aerosol-radiation direct interaction on surface meteorological parameters, such as 2 m temperature and surface winds, over the region of interest will be presented. Aerosol optical depth (AOD) verification using AERONET data will also be discussed. For the biomass burning case, the impact of using injection heights estimated by a Plume Rise Model (PRM) for the biomass burning emissions will be presented.
Proximity enhanced quantum spin Hall state in graphene
Kou, Liangzhi; Hu, Feiming; Yan, Binghai; ...
2015-02-23
Graphene is the first model system of a two-dimensional topological insulator (TI), also known as a quantum spin Hall (QSH) insulator. The QSH effect in graphene, however, has eluded direct experimental detection because of its extremely small energy gap due to the weak spin-orbit coupling. Here we predict by ab initio calculations a giant (three orders of magnitude) proximity-induced enhancement of the TI energy gap in a graphene layer sandwiched between thin slabs of Sb2Te3 (or MoTe2). This gap (1.5 meV) is accessible by existing experimental techniques, and it can be further enhanced by tuning the interlayer distance via compression. We reveal by a tight-binding study that the QSH state in graphene is driven by the Kane-Mele interaction in competition with Kekulé deformation and symmetry breaking. As a result, the present work identifies a new family of graphene-based TIs with an observable and controllable bulk energy gap in the graphene layer, thus opening a new avenue for direct verification and exploration of the long-sought QSH effect in graphene.
Verification of a VRF Heat Pump Computer Model in EnergyPlus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nigusse, Bereket; Raustad, Richard
2013-06-15
This paper provides verification results for the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides a quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual-range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual-range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from the manufacturer's published performance data. The verification compared the simulation output directly to manufacturer's performance data, and found that the dual-range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to those of equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
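The bi-quadratic curve form at the heart of this model is simple to evaluate: a modifier is a second-order polynomial in two temperatures. The sketch below shows the functional form; the coefficients are made up for illustration and do not come from any manufacturer's data.

```python
# Minimal sketch of a bi-quadratic performance-curve evaluation of the
# kind EnergyPlus uses for capacity and EIR modifiers. Coefficients are
# hypothetical, purely for illustration.

def biquadratic(c, t_in, t_out):
    """Evaluate c0 + c1*Ti + c2*Ti^2 + c3*To + c4*To^2 + c5*Ti*To."""
    return (c[0] + c[1] * t_in + c[2] * t_in ** 2
            + c[3] * t_out + c[4] * t_out ** 2 + c[5] * t_in * t_out)

coeffs = [0.5, 0.02, 0.0, 0.01, 0.0, 0.0]  # hypothetical coefficients
modifier = biquadratic(coeffs, 19.4, 35.0)  # indoor 19.4, outdoor 35.0
print(round(modifier, 3))  # → 1.238
```

In the verification exercise, curves of this shape fitted to catalog data are what the simulation output is compared against over the full temperature and part-load range.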
Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. D. Habel
2008-05-20
This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.
9 CFR 417.8 - Agency verification.
Code of Federal Regulations, 2014 CFR
2014-01-01
....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...
Generic Protocol for the Verification of Ballast Water Treatment Technology. Version 5.1
2010-09-01
Only fragments of the protocol's table of contents and glossary survive in this record, including a definition of bias ("persistent distortion of a measurement process that causes errors in one direction") and of challenge water ("water supplied to a treatment system under ...").
NASA Technical Reports Server (NTRS)
Kashangaki, Thomas A. L.
1992-01-01
This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high-quality modal test data for use in verification of proposed methods for on-orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on-orbit modal testing are also described.
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
Determination of the V-I characteristic of NbTi wires in a wide resistivity range
NASA Astrophysics Data System (ADS)
Musenich, R.; Fabbricatore, P.; Farinon, S.; Greco, M.
2004-01-01
The voltage-current curve of superconducting wires and cables is generally measured directly within the resistivity range 10^-15 to 10^-12 Ω m, being limited by sensitivity and Joule dissipation. Indirect measurements, based on the current decay in a superconducting loop, allow determination of the curve at lower resistivities. Using a loop made with a Cu-NbTi wire, we performed indirect V-I measurements in the range 10^-19 to 10^-16 Ω m. The comparison of the curves obtained by the direct and indirect methods allows the experimental verification of the power law describing the transition of the superconducting wire to the normal state over a wide resistivity range. The law is discussed and justified on the basis of the superconductor's behaviour in the flux-creep dynamic regime.
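The power law in question is conventionally written E = E_c (I/I_c)^n, which makes the enormous resistivity range plausible: the effective resistivity E/J falls by many decades for a modest change in current. The numbers below are illustrative assumptions, not the measured NbTi parameters.

```python
# Hedged numerical sketch of the superconducting transition power law
# E = E_c * (I / I_c)**n. All parameter values are assumed for
# illustration; they are not the values measured in the paper.

E_C = 1e-4      # V/m, conventional electric-field criterion (assumed)
I_C = 100.0     # A, assumed critical current
N_VALUE = 30    # assumed n-value of the transition

def electric_field(current):
    """Power-law E(I) model of the superconducting transition."""
    return E_C * (current / I_C) ** N_VALUE

# Halving the current drops E (hence the effective resistivity) by 2**30:
ratio = electric_field(100.0) / electric_field(50.0)
print(round(ratio))  # → 1073741824
```

A factor of roughly 10^9 in E for a factor of 2 in current is exactly why direct measurements run out of sensitivity and an indirect current-decay method is needed at low resistivity.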
NASA Technical Reports Server (NTRS)
Martinez, Pedro A.; Dunn, Kevin W.
1987-01-01
This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.
Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics
NASA Astrophysics Data System (ADS)
Camati, Patrice A.; Serra, Roberto M.
2018-04-01
Discrete quantum feedback control consists of dynamics managed according to information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools for studying the thermodynamics of systems far from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback-controlled quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example illustrating the applicability of the method is discussed. The comprehensive technique introduced here can be implemented experimentally at the microscale with current technology in a variety of experimental platforms.
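For orientation, the best-known fluctuation relation generalized to feedback control is the Sagawa-Ueda integral equality; it is quoted here as background and is not necessarily one of the two detailed relations developed in the paper.

```latex
% Sagawa-Ueda generalized Jarzynski equality under feedback control:
% W is the work, \Delta F the free-energy difference, \beta the inverse
% temperature, and I the mutual information gained by the measurement.
\left\langle e^{-\beta (W - \Delta F) - I} \right\rangle = 1
```

Setting I = 0 (no measurement information) recovers the ordinary Jarzynski equality, which shows how feedback modifies the second-law bound.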
A critique of the hypothesis, and a defense of the question, as a framework for experimentation.
Glass, David J
2010-07-01
Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goals of experimental science, which are verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework, the query/model approach, which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.
Collinear cluster tri-partition: Kinematics constraints and stability of collinearity
NASA Astrophysics Data System (ADS)
Holmvall, P.; Köster, U.; Heinz, A.; Nilsson, T.
2017-01-01
Background: A new mode of nuclear fission, called collinear cluster tri-partition (CCT), has been proposed by the FOBOS Collaboration, suggesting that three heavy fission fragments can be emitted perfectly collinearly in low-energy fission. This claim is based on indirect observations via missing-energy events using the 2v2E method. This proposed CCT seems to be an extraordinary new aspect of nuclear fission. It is surprising that CCT escaped observation for so long given the relatively high reported yield of roughly 0.5% relative to binary fission. These claims call for an independent verification with a different experimental technique. Purpose: Verification experiments based on direct observation of CCT fragments with fission-fragment spectrometers require guidance with respect to the allowed kinetic-energy range, which we present in this paper. Furthermore, we discuss corresponding model calculations which, if CCT is found in such verification experiments, could indicate how the breakups proceed. Since CCT refers to collinear emission, we also study the intrinsic stability of collinearity. Methods: Three different decay models are used that together span the timescales of three-body fission. These models are used to calculate the possible kinetic-energy ranges of CCT fragments by varying fragment mass splits, excitation energies, neutron multiplicities, and scission-point configurations. Calculations are presented for the systems 235U(n_th,f) and 252Cf(sf), and the fission fragments previously reported for CCT, namely isotopes of the elements Ni, Si, Ca, and Sn. In addition, we use semiclassical trajectory calculations with a Monte Carlo method to study the intrinsic stability of collinearity. Results: CCT has a high net Q value but, in a sequential decay, the intermediate steps are energetically and geometrically unfavorable or even forbidden. Moreover, perfect collinearity is extremely unstable, and is broken by the slightest perturbation.
Conclusions: According to our results, the central fragment would be very difficult to detect due to its low kinetic energy, raising the question of why other 2v2E experiments could not detect a missing-mass signature corresponding to CCT. Considering the high kinetic energies of the outer fragments reported in our study, direct-observation experiments should be able to observe CCT. Furthermore, we find that a realization of CCT would require an unphysical fine-tuning of the initial conditions. Finally, our stability calculations indicate that, due to the pronounced instability of the collinear configuration, a prolate scission configuration does not necessarily lead to collinear emission, nor does equatorial emission necessarily imply an oblate scission configuration. In conclusion, our results enable independent experimental verification and encourage further critical theoretical studies of CCT.
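The kinetic-energy ranges discussed above rest on elementary fission kinematics; the toy two-body building block is momentum conservation, which shares the released energy Q in inverse proportion to the fragment masses. This is a hedged sketch, not the authors' sequential-decay or trajectory code, and the Q value and masses below are purely illustrative.

```python
# Toy non-relativistic two-body fission kinematics: for a binary split
# with released kinetic energy q, equal and opposite momenta give each
# fragment an energy inversely proportional to its mass.
# Masses in u and q in MeV are illustrative assumptions.

def kinetic_energies(m1, m2, q):
    """Share q between two back-to-back fragments of masses m1 and m2."""
    e1 = q * m2 / (m1 + m2)  # lighter partner of the pair carries more energy
    e2 = q * m1 / (m1 + m2)
    return e1, e2

# Illustrative Sn + Ni split with a hypothetical q = 200 MeV:
e_sn, e_ni = kinetic_energies(132.0, 70.0, 200.0)
print(round(e_sn, 1), round(e_ni, 1))  # → 69.3 130.7
```

Chaining such steps sequentially is what makes the low kinetic energy of a central CCT fragment apparent: whatever stays near the center of mass picks up very little of the available energy.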
Bistatic radar sea state monitoring system design
NASA Technical Reports Server (NTRS)
Ruck, G. T.; Krichbaum, C. K.; Everly, J. O.
1975-01-01
Remote measurement of the two-dimensional surface wave height spectrum of the ocean by the use of bistatic radar techniques was examined. Potential feasibility and experimental verification by field experiment are suggested. The required experimental hardware is defined along with the designing, assembling, and testing of several required experimental hardware components.
Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results
NASA Astrophysics Data System (ADS)
Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.
2017-05-01
The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes have been compared against experimental data such as the MEXICO experiment, but comparisons against other codes have often been made only on a very broad scale. This study therefore attempts both a code-to-code verification, comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and a validation against experimental results from the MEXICO and NEW MEXICO experiments.
Flow Friction or Spontaneous Ignition?
NASA Technical Reports Server (NTRS)
Stoltzfus, Joel M.; Gallus, Timothy D.; Sparks, Kyle
2012-01-01
"Flow friction," a proposed ignition mechanism in oxygen systems, has proved elusive in attempts at experimental verification. In this paper, the literature regarding flow friction is reviewed and the experimental verification attempts are briefly discussed. Another ignition mechanism, a form of spontaneous combustion, is proposed as an explanation for at least some of the fire events that have been attributed to flow friction in the literature. In addition, the results of a failure analysis performed at NASA Johnson Space Center White Sands Test Facility are presented, and the observations indicate that spontaneous combustion was the most likely cause of the fire in this 2000 psig (14 MPa) oxygen-enriched system.
2016-02-02
... understanding is the experimental verification of a new model of light-induced loss spectra, employing continuum-dressed basis states, which agrees in shape and magnitude with all of our ...
Cleanup Verification Package for the 118-F-6 Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
H. M. Sulloway
2008-10-02
This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.
Tichit, Paul-Henri; Burokur, Shah Nawaz; Qiu, Cheng-Wei; de Lustrac, André
2013-09-27
It has long been conjectured that isotropic radiation by a simple coherent source is impossible due to changes in polarization. Though hypothetical, the isotropic source is usually taken as the reference for determining a radiator's gain and directivity. Here, we demonstrate both theoretically and experimentally that an isotropic radiator can be made of a simple and finite source surrounded by electric-field-driven LC resonator metamaterials designed by space manipulation. As a proof-of-concept demonstration, we show the first isotropic source with omnidirectional radiation from a dipole source (applicable to all distributed sources), which can open up several possibilities in axion electrodynamics, optical illusion, novel transformation-optic devices, wireless communication, and antenna engineering. Owing to the electric-field-driven LC resonator realization scheme, this principle can be readily applied to higher frequency regimes where magnetism is usually not present.
Experimental Results from the Thermal Energy Storage-1 (TES-1) Flight Experiment
NASA Technical Reports Server (NTRS)
Wald, Lawrence W.; Tolbert, Carol; Jacqmin, David
1995-01-01
The Thermal Energy Storage-1 (TES-1) is a flight experiment that flew on the Space Shuttle Columbia (STS-62), in March 1994, as part of the OAST-2 mission. TES-1 is the first experiment in a four-experiment suite designed to provide data for understanding the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data have never been obtained before and have direct application for the development of space-based solar dynamic (SD) power systems. These power systems will store solar energy in a thermal energy salt such as lithium fluoride or calcium fluoride. The stored energy is extracted during the shade portion of the orbit. This enables the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed for predicting performance of a space-based solar dynamic power system. Experimental verification of the analytical predictions is needed prior to using the analytical results for future space power design applications. The four TES flight experiments will be used to obtain the needed experimental data. This paper will focus on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code. The TES-1 conceptual development, hardware design, final development, and system verification testing were accomplished at the NASA Lewis Research Center (LeRC). TES-1 was developed under the In-Space Technology Experiment Program (IN-STEP), which sponsors NASA, industry, and university flight experiments designed to enable and enhance space flight technology. The IN-STEP Program is sponsored by the Office of Space Access and Technology (OSAT).
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical issue targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
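The solution-verification step via Richardson extrapolation can be sketched concretely: from solutions on three systematically refined grids, the observed order of accuracy follows from the ratio of successive differences. The code below is a generic textbook sketch, not part of GBS; the sample values are a made-up second-order sequence.

```python
# Richardson-extrapolation sketch: with solutions on grids refined by a
# constant factor r, the observed order of accuracy is
#   p = log((f_coarse - f_med) / (f_med - f_fine)) / log(r).
# Illustrative values only; not data from the GBS code.

import math

def observed_order(f_coarse, f_med, f_fine, r=2.0):
    """Observed convergence order from three systematically refined solutions."""
    return math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)

# Exact value 1.0 approached with errors 0.04, 0.01, 0.0025 (factor 4 per
# refinement, i.e. second order for r = 2):
p = observed_order(1.04, 1.01, 1.0025, r=2.0)
print(round(p, 2))  # → 2.0
```

Comparing the observed p against the formal order of the discretization is precisely how solution verification quantifies the numerical error affecting a simulation.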
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
Rudolph A. Marcus and His Theory of Electron Transfer Reactions
... early 1950s and soon discovered ... a strong experimental program at Brookhaven on electron transfer ... experimental work provided the first verification of several of the predictions of his theory. This, in turn, ... Marcus theory, namely, experimental evidence for the so-called "inverted region" where rates ...
NASA Astrophysics Data System (ADS)
Yang, Wenming; Wang, Pengkai; Hao, Ruican; Ma, Buchuan
2017-03-01
Analytical and numerical methods for calculating the radial magnetic levitation force on cylindrical magnets in cylindrical vessels filled with ferrofluid are reviewed. An experimental apparatus to measure this force was designed and built, capable of measuring forces in the range 0-2.0 N with an accuracy of 0.001 N. After calibration, this apparatus was used to study the radial magnetic levitation force experimentally. The results showed that the numerical method overestimates this force, while the analytical ones underestimate it. The maximum deviation between the numerical results and the experimental ones was 18.5%, while that between the experimental results and the analytical ones reached 68.5%. The latter deviation narrowed with the lengthening of the magnets. With the aid of the experimental verification of the radial magnetic levitation force, the effect of the eccentric distance of magnets on the viscous energy dissipation in ferrofluid dampers could be assessed. It was shown that ignoring the eccentricity of magnets during the estimation could overestimate the viscous dissipation in ferrofluid dampers.
Paulides, Margarethus M; Bakker, Jurriaan F; van Rhoon, Gerard C
2007-06-01
To experimentally verify the feasibility of focused heating in the neck region by an array of two rings of six electromagnetic antennas. We also measured the dynamic specific absorption rate (SAR) steering possibilities of this setup and compared these SAR patterns to simulations. Using a specially constructed laboratory prototype head-and-neck applicator, including a neck-mimicking cylindrical muscle phantom, we performed SAR measurements by electric field, Schottky-diode sheet measurements and, using the power-pulse technique, by fiberoptic thermometry and infrared thermography. Using phase steering, we also steered the SAR distribution in radial and axial directions. All measured distributions were compared with the predictions by a finite-difference time-domain-based electromagnetic simulator. A central 50% iso-SAR focus of 35 +/- 3 mm in diameter and about 100 +/- 15 mm in length was obtained for all investigated settings. Furthermore, this SAR focus could be steered toward the desired location in the radial and axial directions with an accuracy of approximately 5 mm. The SAR distributions as measured by all three experimental methods were well predicted by the simulations. The results of our study have shown that focused heating in the neck is feasible and that this focus can be effectively steered in the radial and axial directions. For quality assurance measurements, we believe that the Schottky-diode sheet provides the best compromise among effort, speed, and accuracy, although a more specific and improved design is warranted.
Comparison of Caplan's irreversible thermodynamic theory of muscle contraction with chemical data.
Bornhorst, W J; Minardi, J E
1969-05-01
Recently Caplan (1) applied the concepts of irreversible thermodynamics and cybernetics to contracting muscle and derived Hill's force-velocity relation. Wilkie and Woledge (2) then compared Caplan's theory to chemical rates inferred from heat data and concluded that the theory was not consistent with the data. Caplan defended his theory in later papers (3, 4) but without any direct experimental verifications. As Wilkie and Woledge (2) point out, the rate of phosphorylcreatine (PC) breakdown during steady states of shortening has not been observed because of technical difficulties. In this paper it is shown that the rate equations may be directly integrated with time to obtain relations among actual quantities instead of rates. The validity of this integration is based on experimental evidence which indicates that certain combinations of the transport coefficients are constant with muscle length. These equations are then directly compared to experimental data of Cain, Infante, and Davies (5) with the following conclusions: (a) The measured variations of ΔPC for isotonic contractions are almost exactly as predicted by Caplan's theory. (b) The value of the chemical rate ratio, ν_m/ν_o, obtained from these data was 3.53, which is close to the value of 3 suggested by Caplan (3). (c) The maximum value of the chemical affinity for PC splitting was found to be 10.6 kcal/mole, which is as expected from in vitro measurements (2). Because of the excellent agreement between theory and experiment, we conclude that Caplan's theory definitely warrants further investigation.
Behavioral biometrics for verification and recognition of malicious software agents
NASA Astrophysics Data System (ADS)
Yampolskiy, Roman V.; Govindaraju, Venu
2008-04-01
Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game-playing programs are used daily by millions of people. We propose applying the statistical behavior modeling techniques we developed for the recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate the feasibility of such methods both for artificial agent verification and even for recognition purposes.
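The abstract does not reproduce the statistical behavior models themselves; as a minimal sketch of the general idea, the following compares an n-gram profile of an agent's action sequence against an enrolled profile using cosine similarity. The action names, the bigram choice, and the 0.8 acceptance threshold are illustrative assumptions, not the authors' method.

```python
from collections import Counter
from math import sqrt

def ngram_profile(actions, n=2):
    """Frequency profile of length-n action subsequences (a simple
    stand-in for a statistical behavioral model)."""
    return Counter(tuple(actions[i:i + n]) for i in range(len(actions) - n + 1))

def cosine_similarity(p, q):
    """Cosine similarity between two sparse frequency profiles."""
    dot = sum(p[k] * q[k] for k in set(p) | set(q))
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

def verify(enrolled, probe, threshold=0.8, n=2):
    """Accept the probe session if its profile matches the enrolled agent."""
    return cosine_similarity(ngram_profile(enrolled, n), ngram_profile(probe, n)) >= threshold
```

An agent replaying its own strategy scores near 1.0, while an agent with a different action pattern scores much lower, which is the verification/recognition distinction the abstract draws.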
Verification of the databases EXFOR and ENDF
NASA Astrophysics Data System (ADS)
Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel
2017-09-01
The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions, and data in the resonance region for both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
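The resonance integrals being compared are conventionally defined as RI = ∫ σ(E) dE/E over the epithermal range. A minimal numerical sketch, assuming tabulated pointwise cross sections and the conventional 0.5 eV lower bound with an assumed 100 keV upper bound (neither value is taken from the paper):

```python
import numpy as np

def resonance_integral(energy_eV, sigma_barn):
    """Epithermal resonance integral RI = integral of sigma(E) dE / E,
    evaluated with the trapezoidal rule on the tabulated points."""
    f = sigma_barn / energy_eV
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(energy_eV)))

# Sanity check against the analytic result for a constant cross section:
# RI = sigma0 * ln(E_max / E_min).
E = np.logspace(np.log10(0.5), 5.0, 2000)   # energy grid, 0.5 eV .. 100 keV
sigma0 = 10.0                               # constant cross section, barns
ri = resonance_integral(E, np.full_like(E, sigma0))
```

Running the same integral on EXFOR point data and on an evaluated library's reconstructed cross section would give the two numbers whose comparison the paper describes.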
NASA Astrophysics Data System (ADS)
Miedzinska, Danuta; Boczkowska, Anna; Zubko, Konrad
2010-07-01
In this article, a method of numerical verification of experimental results for magnetorheological elastomer (MRE) samples is presented. The samples were shaped into cylinders with a diameter of 8 mm and a height of 20 mm, with various carbonyl iron volume shares (1.5%, 11.5%, and 33%). The diameter of the soft ferromagnetic particles ranged from 6 to 9 μm. During the experiment, initially bent samples were exposed to magnetic fields of 0.1 T, 0.3 T, 0.5 T, 0.7 T, and 1 T. The reaction of the sample to the field was measured as a displacement of the specimen. Numerical calculation was carried out with the MSC Patran/Marc computer code. For the purpose of the numerical analysis, an orthotropic material model was applied, with the material properties of the magnetorheological elastomer along the iron chains and those of the pure elastomer along the other directions. The material properties were obtained from the experimental tests. During the numerical analysis, the initial mechanical load resulting from cylinder deflection was set. Then the equivalent external force, determined from analytical calculations of the intermolecular reaction within the iron chains in the specific magnetic field, was applied to the bent sample. The correspondence of this numerical model with the results of the experiment was verified. The similar results of the experiments and of both the theoretical and FEM analyses indicate that macroscopic modeling of the magnetorheological elastomer's mechanical properties as an orthotropic material delivers an accurate enough description of the material's behavior.
Star motion around rotating black hole in the Galactic Center in real time
NASA Astrophysics Data System (ADS)
Dokuchaev, Vyacheslav; Nazarova, Natalia
2017-12-01
The Event Horizon Telescope team intends, by 2020, to resolve the shadow of the supermassive black hole SgrA* in the Galactic Center. This would be the first direct identification of this enigmatic black hole; in other words, it would be the first experimental verification of General Relativity in the strong-field limit. There is a chance of finding a star moving on a relativistic orbit close to this black hole. We present an animated numerical model of the gravitational lensing of a star (or any other luminous probe) moving around the rotating Kerr black hole in the Galactic Center, as viewed by a distant observer.
DESIGN OF CIRCUITS FOR THE PATTERN ARTICULATION UNIT. Report No. 127
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, K.C.
1962-08-31
The Pattern Articulation Unit embodies a central core of 1024 identical processing modules called "stalactites" arranged in a two-dimensional array with only local connectivity. Two possible complete circuit realizations of the stalactite are described. Stalactites of either design contain about 50 transistors, 250 diodes, 250 resistors, and 50 capacitors. Stalactite organization, signal flow, the bubbling register connection, the requirements of a working register, design of stacking logic, mode of operation, circuit design, direct and conditional input, design of bubbling logic, complement circuits, output and circuit, up and down drivers, and cable drivers and terminators are described. Experimental verification of various components is discussed. (M.C.G.)
The absence of gravitational waves and the foundations of Relativistic Cosmology
NASA Astrophysics Data System (ADS)
Djidjian, Robert
2015-07-01
Modern relativistic cosmology is based on Albert Einstein's theory of general relativity. Impressive observational and experimental verifications of general relativity have created among astrophysicists the conviction that general relativity and relativistic cosmology are absolutely true theories. Unfortunately, the most important conclusion of general relativity, the necessary existence of gravitational waves, has been rejected by all experiments up to the present time. There is also a kind of direct objection to the conception of an expanding Universe: with the expansion of space, the measuring stick expands identically, which leaves the distances between the galaxies unchanged. So it should be quite reasonable to open discussions regarding the status of both general relativity and relativistic cosmology.
Dynamic tire pressure sensor for measuring ground vibration.
Wang, Qi; McDaniel, James Gregory; Wang, Ming L
2012-11-07
This work presents a convenient and non-contact acoustic sensing approach for measuring ground vibration. This approach, which uses an instantaneous dynamic tire pressure sensor (DTPS), has the capability to replace the accelerometer or directional microphone currently being used for inspecting pavement conditions. By measuring dynamic pressure changes inside the tire, ground vibration can be amplified and isolated from environmental noise. In this work, verification of the DTPS concept of sensing inside the tire has been carried out. In addition, comparisons between the DTPS, a ground-mounted accelerometer, and a directional microphone are made. A data analysis algorithm has been developed and optimized to reconstruct ground acceleration from DTPS data. Numerical and experimental studies of the DTPS reveal a strong potential for measuring ground vibration caused by a moving vehicle. A calibration of the transfer function between dynamic tire pressure change and ground acceleration may be needed for different tire systems or for more accurate applications.
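The transfer-function calibration mentioned above is commonly done with a cross-spectral estimate, H(f) = S_xy(f)/S_xx(f). A minimal sketch with synthetic signals and an invented two-tap tire-cavity response (the filter taps and signal lengths are assumptions for illustration; circular convolution keeps the frequency-domain relation exact here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
accel = rng.standard_normal(n)            # stand-in for ground acceleration

# Hypothetical tire-cavity response (two-tap FIR). Applying it by circular
# convolution makes pressure = H_true * accel exact in the frequency domain.
H_true = np.fft.rfft(np.array([0.5, 0.2]), n)
pressure = np.fft.irfft(np.fft.rfft(accel) * H_true, n)

# Cross-spectral transfer-function estimate: H = S_xy / S_xx.
X, Y = np.fft.rfft(accel), np.fft.rfft(pressure)
H_est = (np.conj(X) * Y) / (np.conj(X) * X)
```

On real, noisy measurements one would average the spectra over many segments (Welch's method, e.g. via scipy.signal.csd) rather than divide single FFTs.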
Elastic suspension of a wind tunnel test section
NASA Technical Reports Server (NTRS)
Hacker, R.; Rock, S.; Debra, D. B.
1982-01-01
Experimental verification of the theory describing arbitrary motions of an airfoil is reported, and the experimental apparatus is described. A mechanism was designed to provide two separate degrees of freedom without friction or backlash, which would otherwise mask the small but important aerodynamic effects of interest.
A Modeling Approach for Plastic-Metal Laser Direct Joining
NASA Astrophysics Data System (ADS)
Lutey, Adrian H. A.; Fortunato, Alessandro; Ascari, Alessandro; Romoli, Luca
2017-09-01
Laser processing has been identified as a feasible approach to the direct joining of metal and plastic components without the need for adhesives or mechanical fasteners. The present work develops a modeling approach for conduction and transmission laser direct joining of these materials based on multi-layer optical propagation theory and numerical heat flow simulation. The aim of this methodology is to predict process outcomes from the calculated joint interface and upper surface temperatures. Three representative cases are considered for model verification: conduction joining of PBT and aluminum alloy, transmission joining of optically transparent PET and stainless steel, and transmission joining of semi-transparent PA 66 and stainless steel. Conduction direct laser joining experiments are performed on black PBT and 6082 anticorodal aluminum alloy, achieving shear loads of over 2000 N with specimens of 2 mm thickness and 25 mm width. Comparison with simulation results shows that consistently high strength is achieved where the peak interface temperature is above the plastic degradation temperature. Comparison of transmission joining simulations and published experimental results confirms these findings and highlights the influence of plastic layer optical absorption on process feasibility.
Accelerating functional verification of an integrated circuit
Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.
2015-10-27
Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
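The substitution this record describes can be illustrated with a toy register model: a serial scan shifts the value in one bit per simulated cycle, while direct register access writes the whole word in one step, producing the same architectural state, so the simulator can safely block the serial path. This Python model is only a sketch of that equivalence, not the patented mechanism:

```python
def serial_load(value, width):
    """Bit-shifting serial load: one simulated cycle per bit (slow path)."""
    reg = 0
    for i in reversed(range(width)):              # shift bits in MSB first
        reg = ((reg << 1) | ((value >> i) & 1)) & ((1 << width) - 1)
    return reg

def direct_load(value, width):
    """Direct register access: write the whole word in one step (fast path)."""
    return value & ((1 << width) - 1)
```

Because both paths leave the register in the same state, replacing the width-cycle serial operation with the single direct write accelerates the simulation without changing its observable behavior.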
Experimental verification of nanofluid shear-wave reconversion in ultrasonic fields.
Forrester, Derek Michael; Huang, Jinrui; Pinfield, Valerie J; Luppé, Francine
2016-03-14
Here we present the verification of shear-mediated contributions to multiple scattering of ultrasound in suspensions. Acoustic spectroscopy was carried out with suspensions of silica of differing particle sizes and concentrations in water to find the attenuation at a broad range of frequencies. As the particle sizes approach the nanoscale, commonly used multiple scattering models fail to match experimental results. We develop a new model, taking into account shear mediated contributions, and find excellent agreement with the attenuation spectra obtained using two types of spectrometer. The results determine that shear-wave phenomena must be considered in ultrasound characterisation of nanofluids at even relatively low concentrations of scatterers that are smaller than one micrometre in diameter.
NASA Astrophysics Data System (ADS)
Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young
2018-05-01
This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.
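The Maxwell stress tensor step can be sketched numerically: the torque follows from integrating the product of radial and tangential flux density around a circle in the air gap, T = (L r²/μ₀) ∮ B_r B_θ dθ. The field harmonics, load angle, and dimensions below are invented for illustration; the paper's actual B_r and B_θ come from its magnetic vector-potential solution.

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability, H/m

def mst_torque(Br, Bt, radius_m, stack_length_m):
    """Torque on a circle of radius r in the air gap via the Maxwell stress
    tensor: T = (L r^2 / mu0) * integral of B_r * B_theta over one revolution
    (trapezoidal rule)."""
    theta = np.linspace(0.0, 2.0 * np.pi, len(Br))
    f = Br * Bt
    integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(theta))
    return stack_length_m * radius_m**2 / MU0 * integral

# Illustrative single-harmonic field for a 4-pole-pair coupling.
p, delta = 4, 0.3                        # pole pairs, load angle (assumed)
theta = np.linspace(0.0, 2.0 * np.pi, 20001)
Br = 0.8 * np.cos(p * theta - delta)     # radial flux density, T
Bt = 0.1 * np.cos(p * theta)             # tangential flux density, T
T = mst_torque(Br, Bt, radius_m=0.05, stack_length_m=0.04)
```

For this single-harmonic field the integral has the closed form π·B_r1·B_θ1·cos δ, which makes the numerical result easy to cross-check.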
Explaining Verification Conditions
NASA Technical Reports Server (NTRS)
Deney, Ewen; Fischer, Bernd
2006-01-01
The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
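The idea of carrying labels through the calculus can be sketched with a toy weakest-precondition computation in which each generated VC is paired with an explanatory label. Predicates are modeled as functions on a state dictionary; this is a didactic sketch, not the paper's labelled Hoare-rule extension:

```python
# Toy labelled VC generation: wp(var := expr, post) = post[var := expr],
# and each VC carries a natural-language label explaining its origin.

def wp_assign(var, expr, post):
    """Weakest precondition of an assignment under a postcondition."""
    return lambda s: post({**s, var: expr(s)})

def vc_for(pre, wp, label):
    """A VC is the implication pre => wp, tagged with an explanation."""
    return label, lambda s: (not pre(s)) or wp(s)

# Program: x := x+1  with precondition x >= 0 and postcondition x > 0.
wp1 = wp_assign("x", lambda s: s["x"] + 1, lambda s: s["x"] > 0)
label, vc = vc_for(lambda s: s["x"] >= 0,
                   wp1,
                   "postcondition of x := x+1 established from precondition")

# Exhaustive sanity check over a small domain (a test, not a proof).
assert all(vc({"x": v}) for v in range(-5, 6))
```

Rendering the label instead of the raw logical formula is the kind of explanation the paper generates for each VC.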
Comments for A Conference on Verification in the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, James E.
2012-06-12
The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification, and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as rich for exploration of potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: "how does one verify limitations on nuclear warheads in national stockpiles?" (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provide benefits for addressing future verification challenges.
Verification of ANSYS Fluent and OpenFOAM CFD platforms for prediction of impact flow
NASA Astrophysics Data System (ADS)
Tisovská, Petra; Peukert, Pavel; Kolář, Jan
The main goal of this article is the verification of the heat transfer coefficient numerically predicted by two CFD platforms, ANSYS Fluent and OpenFOAM, on the problem of impact flows oncoming from a 2D nozzle. Various mesh parameters and solver settings were tested under several boundary conditions and compared to known experimental results. The best solver setting, suitable for further optimization of more complex geometry, is evaluated.
Experimental Verification of the Theory of Oscillating Airfoils
NASA Technical Reports Server (NTRS)
Silverstein, Abe; Joyner, Upshur T
1939-01-01
Measurements have been made of the lift on an airfoil in pitching oscillation with a continuous-recording, instantaneous-force balance. The experimental values for the phase difference between the angle of attack and the lift are shown to be in close agreement with theory.
Whisenant, Thomas C.; Ho, David T.; Benz, Ryan W.; Rogers, Jeffrey S.; Kaake, Robyn M.; Gordon, Elizabeth A.; Huang, Lan; Baldi, Pierre; Bardwell, Lee
2010-01-01
In order to fully understand protein kinase networks, new methods are needed to identify regulators and substrates of kinases, especially for weakly expressed proteins. Here we have developed a hybrid computational search algorithm that combines machine learning and expert knowledge to identify kinase docking sites, and used this algorithm to search the human genome for novel MAP kinase substrates and regulators focused on the JNK family of MAP kinases. Predictions were tested by peptide array followed by rigorous biochemical verification with in vitro binding and kinase assays on wild-type and mutant proteins. Using this procedure, we found new ‘D-site’ class docking sites in previously known JNK substrates (hnRNP-K, PPM1J/PP2Czeta), as well as new JNK-interacting proteins (MLL4, NEIL1). Finally, we identified new D-site-dependent MAPK substrates, including the hedgehog-regulated transcription factors Gli1 and Gli3, suggesting that a direct connection between MAP kinase and hedgehog signaling may occur at the level of these key regulators. These results demonstrate that a genome-wide search for MAP kinase docking sites can be used to find new docking sites and substrates. PMID:20865152
Verification of target motion effects on SAR imagery using the Gotcha GMTI challenge dataset
NASA Astrophysics Data System (ADS)
Hack, Dan E.; Saville, Michael A.
2010-04-01
This paper investigates the relationship between a ground moving target's kinematic state and its SAR image. While effects such as cross-range offset, defocus, and smearing appear well understood, their derivations in the literature typically employ simplifications of the radar/target geometry and assume point scattering targets. This study adopts a geometrical model for understanding target motion effects in SAR imagery, termed the target migration path, and focuses on experimental verification of predicted motion effects using both simulated and empirical datasets based on the Gotcha GMTI challenge dataset. Specifically, moving target imagery is generated from three data sources: first, simulated phase history for a moving point target; second, simulated phase history for a moving vehicle derived from a simulated Mazda MPV X-band signature; and third, empirical phase history from the Gotcha GMTI challenge dataset. Both simulated target trajectories match the truth GPS target position history from the Gotcha GMTI challenge dataset, allowing direct comparison between all three imagery sets and the predicted target migration path. This paper concludes with a discussion of the parallels between the target migration path and the measurement model within a Kalman filtering framework, followed by conclusions.
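The cross-range offset effect referred to above is, to first order, Δx ≈ R·v_r/v_p: a target with line-of-sight velocity v_r imaged from a platform moving at v_p appears displaced in cross-range by that amount. A sketch of this textbook approximation (the numbers are illustrative, not drawn from the Gotcha dataset, and the paper's target-migration-path model refines this simplified geometry):

```python
def cross_range_offset(slant_range_m, radial_velocity_mps, platform_velocity_mps):
    """First-order cross-range displacement of a moving target in a SAR
    image: delta_x ~ R * v_r / v_p (textbook point-scatterer approximation)."""
    return slant_range_m * radial_velocity_mps / platform_velocity_mps

# A target 10 km away moving at 5 m/s along the line of sight, imaged from
# a 100 m/s platform, appears displaced about 500 m in cross-range.
offset = cross_range_offset(10_000.0, 5.0, 100.0)
```

This is why moving vehicles famously appear "off the road" in SAR imagery; defocus and smearing arise from the higher-order terms the simplified formula drops.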
Black carbon is a term that is commonly used to describe strongly light absorbing carbon (LAC), which is thought to play a significant role in global climate change through direct absorption of light, interaction with clouds, and by reducing the reflectivity of snow and ice. BC ...
Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar A.
2009-01-01
This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove that these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.
Structural verification for GAS experiments
NASA Technical Reports Server (NTRS)
Peden, Mark Daniel
1992-01-01
The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.
Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder
NASA Technical Reports Server (NTRS)
Lindsey, A. E.; Pecheur, Charles
2004-01-01
AI software is often used as a means for providing greater autonomy to automated systems, making them capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem of a prototype space vehicle.
Ontology Matching with Semantic Verification.
Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R
2009-09-01
ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
Implementing fluid dynamics obtained from GeoPET in reactive transport models
NASA Astrophysics Data System (ADS)
Lippmann-Pipke, Johanna; Eichelbaum, Sebastian; Kulenkampff, Johannes
2016-04-01
Flow and transport simulations in geomaterials are commonly conducted on high-resolution tomograms (μCT) of the pore structure or on stochastic models that are calibrated with measured integral quantities, like breakthrough curves (BTC). Yet virtually no method existed for experimental verification of the simulated velocity distribution results. Positron emission tomography (PET) has unrivaled sensitivity and robustness for the non-destructive, quantitative, spatio-temporal measurement of tracer concentrations in body tissue. In the past decade, we adapted PET for applicability in opaque/geological media - GeoPET (Kulenkampff et al.; Kulenkampff et al., 2008; Zakhnini et al., 2013) - and have developed detailed correction schemes to bring the images into sharp focus. It is thereby the appropriate method for experimental verification and calibration of computer simulations of pore-scale transport by means of the observed propagation of a tracer pulse, c_PET(x,y,z,t). In parallel, we aimed at deriving velocity and porosity distributions directly from our concentration time series of fluid flow processes in geomaterials. This would allow us to benefit directly from lab-scale observations and to parameterize the respective numerical transport models. For this purpose we have developed a robust spatiotemporal (3D+t) parameter extraction algorithm. Here, we will present its functionality and demonstrate the use of the obtained velocity distributions in finite element simulations of reactive transport processes on the drill core scale. Kulenkampff, J., Gruendig, M., Zakhnini, A., Gerasch, R., and Lippmann-Pipke, J.: Process tomography of diffusion with PET for evaluating anisotropy and heterogeneity, Clay Minerals, in press. Kulenkampff, J., Gründig, M., Richter, M., and Enzmann, F.: Evaluation of positron emission tomography for visualisation of migration processes in geomaterials, Physics and Chemistry of the Earth, 33, 937-942, 2008.
Zakhnini, A., Kulenkampff, J., Sauerzapf, S., Pietrzyk, U., and Lippmann-Pipke, J.: Monte Carlo simulations of GeoPET experiments: 3D images of tracer distributions (18-F, 124-I and 58-Co) in Opalinus Clay, anhydrite and quartz, Computers and Geosciences, 57 183-196, 2013.
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. These goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
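UVM itself is a SystemVerilog class library, but the coverage-driven flow it supports (constrained-random stimulus, a scoreboard against a reference model, coverage bins) can be sketched language-neutrally. The 8-bit adder DUT and the bin boundaries below are invented purely for illustration:

```python
import random

random.seed(0)

def dut_add(a, b):
    """Stand-in 'device under test': an 8-bit adder that wraps on overflow."""
    return (a + b) & 0xFF

# Coverage-driven loop: random legal stimuli, a scoreboard check against a
# reference model, and coverage bins that record which cases were exercised.
coverage = {"low": 0, "high": 0, "overflow": 0}
for _ in range(1000):
    a, b = random.randrange(256), random.randrange(256)
    assert dut_add(a, b) == (a + b) % 256      # scoreboard vs. reference model
    s = a + b
    if s < 128:
        coverage["low"] += 1
    elif s <= 255:
        coverage["high"] += 1
    else:
        coverage["overflow"] += 1
```

A bin left at zero after the run flags non-exercised functionality, which is exactly the signal the coverage monitors in a UVM environment provide.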
Manipulation strategies for massive space payloads
NASA Technical Reports Server (NTRS)
Book, Wayne J.
1989-01-01
Control for the bracing strategy is being examined. It was concluded earlier that trajectory planning must be improved to best achieve the bracing motion. Very interesting results were achieved which enable the inverse dynamics of flexible arms to be calculated for linearized motion in a more efficient manner than previously published. The desired motion of the end point, beginning at t=0 and ending at t=t_f, is used to calculate the required torque at the joint. The solution is separated into a causal function that is zero for t < 0 and an acausal function that is zero for t > t_f. A number of alternative end point trajectories were explored in terms of the peak torque required, the amount of anticipatory action, and other issues. The single-link case is the immediate subject, and an experimental verification of that case is being performed. Modeling with experimental verification of closed-chain dynamics continues. The modeling effort has pointed out inaccuracies that result from the choice of numerical techniques used to incorporate the closed-chain constraints when modeling our experimental prototype RALF (Robotic Arm Large and Flexible). Results were compared to TREETOPS, a multibody code. The experimental verification work is suggesting new ways to make comparisons with systems having structural linearity and joint and geometric nonlinearity. The generation of inertial forces was studied with a small arm that will damp the large arm's vibration.
Experimental investigation of practical unforgeable quantum money
NASA Astrophysics Data System (ADS)
Bozzio, Mathieu; Orieux, Adeline; Trigo Vidarte, Luis; Zaquine, Isabelle; Kerenidis, Iordanis; Diamanti, Eleni
2018-01-01
Wiesner's unforgeable quantum money scheme is widely celebrated as the first quantum information application. Based on the no-cloning property of quantum mechanics, this scheme allows for the creation of credit cards used in authenticated transactions offering security guarantees impossible to achieve by classical means. However, despite its central role in quantum cryptography, its experimental implementation has remained elusive because of the lack of quantum memories and of practical verification techniques. Here, we experimentally implement a quantum money protocol relying on classical verification that rigorously satisfies the security condition for unforgeability. Our system exploits polarization encoding of weak coherent states of light and operates under conditions that ensure compatibility with state-of-the-art quantum memories. We derive working regimes for our system using a security analysis taking into account all practical imperfections. Our results constitute a major step towards a real-world realization of this milestone protocol.
The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?
Schaun, Gustavo Z
2017-12-08
Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and are often achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature, and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g., intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the Glass and Façade Technology Research Group at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
Qiao, Ruimin; Wray, L. Andrew; Kim, Jung -Hyun; ...
2015-11-11
The LiNi0.5Mn1.5O4 spinel is an appealing cathode material for next-generation rechargeable Li-ion batteries due to its high operating voltage of ~4.7 V (vs Li/Li+). Although it is widely believed that the full range of electrochemical cycling involves the redox of Ni(II)/(IV), it has not been experimentally clarified whether Ni(III) exists as the intermediate state or a double-electron transfer takes place. Here, combined with theoretical calculations, we show unambiguous spectroscopic evidence of the Ni(III) state when the LiNi0.5Mn1.5O4 electrode is half charged. This provides a direct verification of single-electron-transfer reactions in LiNi0.5Mn1.5O4 upon cycling, namely, from Ni(II) to Ni(III), then to Ni(IV). Additionally, by virtue of its surface sensitivity, soft X-ray absorption spectroscopy also reveals the electrochemically inactive Ni2+ and Mn2+ phases on the electrode surface. Our work provides the long-awaited clarification of the single-electron transfer mechanism in LiNi0.5Mn1.5O4 electrodes. Furthermore, the experimental results serve as a benchmark for further spectroscopic characterizations of Ni-based battery electrodes.
Face verification with balanced thresholds.
Yan, Shuicheng; Xu, Dong; Tang, Xiaoou
2007-01-01
The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved as follows. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction; hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification, which are directly transplanted from the face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms state-of-the-art subspace techniques for face verification.
Signature Verification Based on Handwritten Text Recognition
NASA Astrophysics Data System (ADS)
Viriri, Serestina; Tapamo, Jules-R.
Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm that verifies signatures even when they are composed of special unconstrained cursive characters which are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.
Optical detection of random features for high security applications
NASA Astrophysics Data System (ADS)
Haist, T.; Tiziani, H. J.
1998-02-01
Optical detection of random features, in combination with digital signatures based on public-key codes, will be discussed as a means of recognizing counterfeit objects. Objects are protected against counterfeiting without applying expensive production techniques. Verification is done off-line by optical means, without a central authority. The method is applied to protecting banknotes, and experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.
NASA Technical Reports Server (NTRS)
Luck, Rogelio; Ray, Asok
1990-01-01
The implementation and verification of the delay-compensation algorithm are addressed. The delay compensator has been experimentally verified at an IEEE 802.4 network testbed for velocity control of a DC servomotor. The performance of the delay-compensation algorithm was also examined by combined discrete-event and continuous-time simulation of the flight control system of an advanced aircraft that uses the SAE (Society of Automotive Engineers) linear token passing bus for data communications.
Implementation of a direct-imaging and FX correlator for the BEST-2 array
NASA Astrophysics Data System (ADS)
Foster, G.; Hickish, J.; Magro, A.; Price, D.; Zarb Adami, K.
2014-04-01
A new digital backend has been developed for the Basic Element for SKA Training II (BEST-2) array at Radiotelescopi di Medicina, INAF-IRA, Italy, which allows concurrent operation of an FX correlator, and a direct-imaging correlator and beamformer. This backend serves as a platform for testing some of the spatial Fourier transform concepts which have been proposed for use in computing correlations on regularly gridded arrays. While spatial Fourier transform-based beamformers have been implemented previously, this is, to our knowledge, the first time a direct-imaging correlator has been deployed on a radio astronomy array. Concurrent observations with the FX and direct-imaging correlator allow for direct comparison between the two architectures. Additionally, we show the potential of the direct-imaging correlator for time-domain astronomy, by passing a subset of beams though a pulsar and transient detection pipeline. These results provide a timely verification for spatial Fourier transform-based instruments that are currently in commissioning. These instruments aim to detect highly redshifted hydrogen from the epoch of reionization and/or to perform wide-field surveys for time-domain studies of the radio sky. We experimentally show the direct-imaging correlator architecture to be a viable solution for correlation and beamforming.
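The equivalence being verified can be sketched numerically with a toy model (the array size, source position, and noise level below are invented, not the BEST-2 configuration): for a regularly gridded array, beamforming each time sample with a 2-D spatial FFT and accumulating per-pixel power produces the same image as Fourier-transforming the time-averaged FX correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
ny, nx, nt = 4, 4, 256          # hypothetical 4x4 regular grid, 256 time samples

# Simulated single-channel complex antenna voltages: one off-axis point
# source (a geometric phase gradient across the grid) plus receiver noise.
ky, kx = 2, 1                    # source position, in grid-beam units
py, px = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
steer = np.exp(2j * np.pi * (ky * py / ny + kx * px / nx))
v = (steer[None, :, :] * rng.normal(size=(nt, 1, 1)) +
     0.1 * (rng.normal(size=(nt, ny, nx)) + 1j * rng.normal(size=(nt, ny, nx))))

# Direct-imaging path: spatial FFT per time sample, accumulate beam powers.
img_direct = (np.abs(np.fft.fft2(v, axes=(1, 2))) ** 2).mean(axis=0)

# FX path: time-average the full correlation (visibility) matrix first,
# then apply the same 2-D DFT from both sides and keep the diagonal.
x = v.reshape(nt, -1)
R = (x[:, :, None] * x[:, None, :].conj()).mean(axis=0)       # 16x16 visibilities
F = np.kron(np.fft.fft(np.eye(ny)), np.fft.fft(np.eye(nx)))   # flattened 2-D DFT
img_fx = np.real(np.diag(F @ R @ F.conj().T)).reshape(ny, nx)

print(np.allclose(img_direct, img_fx))        # the two architectures agree
print(divmod(int(img_direct.argmax()), nx))   # brightest pixel at (ky, kx)
```

The agreement is exact in exact arithmetic, since the mean of |F x|² over time equals the diagonal of F R F^H; the practical trade-off between the two architectures is in compute and data-rate scaling, not in the image formed.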
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire
Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the "power of the people," harnessed via online games, communities of interest, and other platforms, to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.
NASA Technical Reports Server (NTRS)
Ponchak, George E.; Chun, Donghoon; Yook, Jong-Gwan; Katehi, Linda P. B.
2001-01-01
Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior three-dimensional finite element method (3-D FEM) electromagnetic simulations have shown that metal-filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3-D FEM simulations is demonstrated for commercially fabricated low temperature cofired ceramic (LTCC) packages. In addition, measured attenuation of microstrip lines surrounded by the shielding structures is presented and shows that the shielding structures do not change the attenuation characteristics of the line.
DOT National Transportation Integrated Search
1987-08-01
One of the primary reasons that highway departments are hesitant to use heat-straightening techniques to repair damaged steel girders is the lack of experimental verification of the process. A comprehensive experimental program on the subject has bee...
Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.
ERIC Educational Resources Information Center
Kaya, Azmi
1982-01-01
Discusses the analytical design and experimental verification of a PID control valve for a temperature-controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
Comparison of Caplan's Irreversible Thermodynamic Theory of Muscle Contraction with Chemical Data
Bornhorst, W. J.; Minardi, J. E.
1969-01-01
Recently Caplan (1) applied the concepts of irreversible thermodynamics and cybernetics to contracting muscle and derived Hill's force-velocity relation. Wilkie and Woledge (2) then compared Caplan's theory to chemical rates inferred from heat data and concluded that the theory was not consistent with the data. Caplan defended his theory in later papers (3, 4) but without any direct experimental verification. As Wilkie and Woledge (2) point out, the rate of phosphorylcreatine (PC) breakdown during steady states of shortening has not been observed because of technical difficulties. In this paper it is shown that the rate equations may be directly integrated with time to obtain relations among actual quantities instead of rates. The validity of this integration is based on experimental evidence which indicates that certain combinations of the transport coefficients are constant with muscle length. These equations are then directly compared to experimental data of Cain, Infante, and Davies (5) with the following conclusions: (a) The measured variations of ΔPC for isotonic contractions are almost exactly as predicted by Caplan's theory. (b) The value of the chemical rate ratio, νm/νo, obtained from these data was 3.53, which is close to the value of 3 suggested by Caplan (3). (c) The maximum value of the chemical affinity for PC splitting was found to be 10.6 kcal/mole, which is as expected from in vitro measurements (2). Because of the excellent agreement between theory and experiment, we conclude that Caplan's theory definitely warrants further investigation. PMID:5786314
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. 
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
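The MMS element of such a test suite can be illustrated with a self-contained toy (generic code, not PFLOTRAN's): manufacture u(x) = sin(πx) for the 1-D steady heat equation -u'' = f, derive the source term f analytically, solve on two grids with a second-order discretization, and check that the observed order of accuracy matches the scheme's formal order.

```python
import numpy as np

def solve_heat(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with 2nd-order central differences;
    return the max-norm error against the manufactured solution u = sin(pi x)."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi**2 * np.sin(np.pi * x)          # manufactured source: f = -u''
    # Tridiagonal system for the interior nodes
    A = (np.diag(2.0 * np.ones(n - 1)) -
         np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f[1:-1])
    return np.abs(u - np.sin(np.pi * x)).max()

e1, e2 = solve_heat(32), solve_heat(64)
order = np.log2(e1 / e2)                      # observed order of accuracy
print(round(order, 2))                        # close to 2 for this 2nd-order scheme
```

A code-verification suite automates exactly this bookkeeping over many physics modules: if the observed order falls below the formal order, the discretization or its implementation is suspect.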
Dynamics of mechanical feedback-type hydraulic servomotors under inertia loads
NASA Technical Reports Server (NTRS)
Gold, Harold; Otto, Edward W; Ransom, Victor L
1953-01-01
An analysis of the dynamics of mechanical feedback-type hydraulic servomotors under inertia loads is developed and experimental verification is presented. The analysis, which is developed in terms of two physical parameters, yields direct expressions for the following dynamic responses: (1) the transient response to a step input and the maximum cylinder pressure during the transient and (2) the variation of amplitude attenuation and phase shift with the frequency of a sinusoidally varying input. The validity of the analysis is demonstrated by means of recorded transient and frequency responses obtained on two servomotors. The calculated responses are in close agreement with the measured responses. The relations presented are readily applicable to the design as well as to the analysis of hydraulic servomotors.
Interference and partial which-way information: A quantitative test of duality in two-atom resonance
NASA Astrophysics Data System (ADS)
Abranyos, Y.; Jakob, M.; Bergou, J.
2000-01-01
We propose an experimental verification of an inequality concerning wave-particle duality derived by Englert [Phys. Rev. Lett. 77, 2154 (1996)], which sets an upper limit on distinguishability and visibility in a two-way interferometer. The inequality quantifies the concept of wave-particle duality. The two-way interferometer considered is a Young's double-slit experiment involving two four-level atoms, a slightly modified version of the recent experiment by Eichmann et al. [Phys. Rev. Lett. 70, 2359 (1993)]. The fringe visibility depends on the detected polarization direction of the scattered light, and a readout of the internal state of one of the two atoms provides partial which-way information.
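The inequality in question (Englert, Phys. Rev. Lett. 77, 2154 (1996)) bounds the which-way distinguishability D and the fringe visibility V jointly:

```latex
D^{2} + V^{2} \leq 1
```

with equality for pure which-way marker states; D = 0 corresponds to full-visibility interference and V = 0 to complete which-way knowledge, while partial which-way information occupies the regime in between.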
Electron-neutrino charged-current quasi-elastic scattering in MINERvA
NASA Astrophysics Data System (ADS)
Wolcott, Jeremy
2014-03-01
The electron-neutrino charged-current quasi-elastic (CCQE) cross-section on nuclei is an important input parameter to appearance-type neutrino oscillation experiments. Current experiments typically work from the muon neutrino CCQE cross-section and apply corrections from theoretical arguments to obtain a prediction for the electron neutrino CCQE cross-section, but to date there has been no precise experimental verification of these estimates at an energy scale appropriate to such experiments. We present the current status of a direct measurement of the electron neutrino CCQE differential cross-section as a function of the squared four-momentum transfer to the nucleus, Q2, in MINERvA. This talk will discuss event selection, background constraints, and the flux prediction used in the calculation.
Investigation of new semiinsulating behavior of III-V compounds
NASA Technical Reports Server (NTRS)
Lagowski, Jacek
1990-01-01
The investigation of defect interactions and properties related to the semiinsulating behavior of III-V semiconductors resulted in about twenty original publications, six doctoral theses, one master's thesis, and numerous conference presentations. The studies of new compensation mechanisms involving transition metal impurities have defined direct effects associated with deep donor/acceptor levels acting as compensating centers. Electrical and optical properties of vanadium and titanium levels were determined in GaAs, InP, and also in the ternary compound InGaAs. The experimental data provided a basis for the verification of chemical trends and the VRBE method. They also defined the compositional range for III-V mixed crystals whereby semiinsulating behavior can be achieved using transition-element deep levels and suitable codoping with shallow donor/acceptor impurities.
Pinealitis accompanying equine recurrent uveitis.
Kalsow, C M; Dwyer, A E; Smith, A W; Nifong, T P
1993-01-01
There is no direct verification of pineal gland involvement in human uveitis. Specimens of pineal tissue are not available during active uveitis in human patients. Naturally occurring uveitis in horses gives us an opportunity to examine tissues during active ocular inflammation. We examined the pineal gland of a horse that was killed because it had become blind during an episode of uveitis. The clinical history and histopathology of the eyes were consistent with post-leptospiral equine recurrent uveitis. The pineal gland of this horse had significant inflammatory infiltration consisting mainly of lymphocytes with some eosinophils. This observation of pinealitis accompanying equine uveitis supports the animal models of experimental autoimmune uveoretinitis with associated pinealitis and suggests that the pineal gland may be involved in some human uveitides. PMID:8435400
NASA Astrophysics Data System (ADS)
Zhou, Changjiang; Hu, Bo; Chen, Siyu; He, Liping
2017-12-01
An enhanced flexible dynamic model for a valve train with clearance and multi-directional deformations is proposed based on the finite element method (FEM) and verified by experiment. Using the measured cam profile, the internal excitations in the numerical solution of the model are obtained with piecewise cubic Hermite interpolating polynomials. The comparative analysis demonstrates that the bending deformation of the rocker arm is much larger than the radial deformation, signifying the necessity of including multi-directional deformations in dynamic analysis of the valve train. The effects of valve clearance and cam rotation speed on contact force, acceleration and dynamic transmission error (DTE) are investigated. Both theoretical predictions and experimental measurements show that the amplitudes and fluctuations of contact force, acceleration and DTE become larger when the valve clearance or cam speed increases. It is found that including the elasticity and the damping weakens the impact between the rocker arm and the valve on the components not adjacent to the valve, in either the unseating or the seating scenario. Additionally, as valve clearance or cam rotation speed becomes larger, the valve lift and the working phase decrease, which eventually leads to inlet air reduction. Furthermore, our study shows that the combustion rate, input torque, and component durability can be improved by tuning the valve clearance or adjusting the cam profile.
Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features
NASA Technical Reports Server (NTRS)
Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed
2012-01-01
Phase I: The use of adhesive locking features or liquid locking compounds (LLCs) (e.g., Loctite) as a means of providing a secondary locking feature has been used on NASA programs since the Apollo program. In many cases Loctite was used as a last resort when (a) self-locking fasteners were no longer functioning per their respective drawing specification, (b) access was limited for removal and replacement, or (c) replacement could not be accomplished without severe impact to schedule. Long-term use of Loctite became inevitable in cases where removal and replacement of worn hardware was not cost effective and Loctite was assumed to be fully cured and working. The NASA Engineering & Safety Center (NESC) and United Space Alliance (USA) recognized the need for more extensive testing of Loctite grades to better understand their capabilities and limitations as a secondary locking feature. These tests, identified as Phase I, were designed to identify processing sensitivities and to determine the proper cure time, the correct primer to use on aerospace nutplate, insert and bolt materials such as A286 and MP35N, and the minimum amount of Loctite required to achieve optimum breakaway torque values. The .1900-32 fastener size was tested due to its wide usage in the aerospace industry. Three different grades of Loctite were tested. Results indicate that, with proper controls, adhesive locking features can be successfully used in the repair of locking features and should be considered for design.
Phase II: Threaded fastening systems used in aerospace programs typically have a requirement for a redundant locking feature. The primary locking method is the fastener preload, and the traditional redundant locking feature is a self-locking mechanical device that may include deformed threads, non-metallic inserts, split beam features, or other methods that impede movement between threaded members. The self-locking resistance of traditional locking features can be directly verified during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine whether the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.
A framework of multitemplate ensemble for fingerprint verification
NASA Astrophysics Data System (ADS)
Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li
2012-12-01
How to improve the performance of an automatic fingerprint verification system (AFVS) is always a big challenge in the biometric verification field. Recently, it has become popular to improve the performance of an AFVS using ensemble learning to fuse related information from fingerprints. In this article, we propose a novel framework of fingerprint verification based on a multitemplate ensemble method. The framework consists of three stages. In the first, the enrollment stage, we adopt an effective template selection method to select the fingerprints that best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is given. In the second, the verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. Experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework for fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER over these same four databases decreases from 14.58 to 2.51.
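The three stages can be sketched with a toy matcher (everything below — the feature vectors, the `match` function, and the threshold — is hypothetical; the article itself uses minutiae- and ridge-based matchers on FVC2004):

```python
import numpy as np

def match(a, b):
    """Toy similarity score in (0, 1]; stands in for a real fingerprint matcher."""
    return float(np.exp(-np.linalg.norm(a - b)))

def enroll(templates):
    """Stage 1: mutual match scores among the templates span the 'polyhedron';
    its virtual centroid is taken here as the mean score vector."""
    scores = np.array([[match(a, b) for b in templates] for a in templates])
    return scores.mean(axis=0)

def verify(query, templates, centroid, threshold):
    """Stages 2-3: distance between the query's score vector and the centroid,
    fused into a single accept/reject decision by thresholding."""
    q = np.array([match(query, t) for t in templates])
    return bool(np.linalg.norm(q - centroid) <= threshold)

rng = np.random.default_rng(1)
base = rng.normal(size=8)                       # the finger's "true" features
finger = base + 0.05 * rng.normal(size=(3, 8))  # three enrolled templates
centroid = enroll(finger)

genuine = base + 0.05 * rng.normal(size=8)
impostor = rng.normal(size=8)
print(verify(genuine, finger, centroid, 0.6),
      verify(impostor, finger, centroid, 0.6))
```

A genuine query matches all templates about as well as the templates match one another, so its score vector lands near the centroid; an impostor's scores sit far from it, which is the geometric intuition behind the multitemplate ensemble.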
NASA Astrophysics Data System (ADS)
Chen, Tian-Yu; Chen, Yang; Yang, Hu-Jiang; Xiao, Jing-Hua; Hu, Gang
2018-03-01
Nowadays, massive amounts of data have been accumulated in wide-ranging fields, and it has become one of the central issues in interdisciplinary research to analyze existing data and extract as much useful information from them as possible. Often the output data of a system are measurable while the dynamic structures producing those data are hidden, and thus studies that reveal system structures by analyzing available data, i.e., reconstructions of systems, have become one of the most important tasks of information extraction. In the past, most work in this respect was based on theoretical analyses and numerical verifications; direct analyses of experimental data are very rare. In physical science, most analyses of experimental setups have been based on first principles of physical laws, i.e., so-called top-down analyses. In this paper, we conducted an experiment on the “Boer resonant instrument for forced vibration” (BRIFV) and inferred the dynamic structure of the experimental setup purely from analysis of the measurable experimental data, i.e., by applying a bottom-up strategy. The dynamics of the experimental setup are strongly nonlinear and chaotic, and subject to inevitable noise. We propose using high-order correlation computations to treat the nonlinear dynamics, and two-time correlations to treat noise effects. By applying these approaches, we successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed from the measured data reproduces the experimental results well over a wide range of parameters.
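The two-time-correlation idea can be shown on a linear toy system (a sketch with invented dynamics; the BRIFV setup is nonlinear and additionally needs the high-order correlations): for dx/dt = Ax + ξ(t), stationarity gives C(τ) = e^(Aτ) C(0), so the hidden matrix A can be recovered from lagged correlations of the measured time series alone.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hidden linear stochastic dynamics (invented for illustration):
# dx/dt = A x + xi(t), a noisy damped oscillator.
A = np.array([[0.0, 1.0],
              [-4.0, -0.5]])
dt, n = 0.01, 100_000

x = np.zeros(2)
xs = np.empty((n, 2))
for i in range(n):                       # Euler-Maruyama simulation
    x = x + dt * (A @ x) + np.sqrt(dt) * rng.normal(size=2)
    xs[i] = x

# Two-time correlations: C(tau) = <x(t+tau) x(t)^T> = exp(A*tau) C(0),
# hence A = log(C(dt) C(0)^{-1}) / dt, estimated purely from the data.
C0 = (xs[:-1].T @ xs[:-1]) / (n - 1)
C1 = (xs[1:].T @ xs[:-1]) / (n - 1)
M = C1 @ np.linalg.inv(C0)

evals, V = np.linalg.eig(M)              # matrix logarithm via eigendecomposition
A_hat = np.real(V @ np.diag(np.log(evals)) @ np.linalg.inv(V)) / dt

print(np.round(A_hat, 2))                # close to A, up to sampling error
```

Because the lagged correlation C(dt) is insensitive to delta-correlated noise at equal times, this bottom-up estimate is robust in a way that a naive equal-time fit is not; the nonlinear case replaces the linear score with high-order correlation functions.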
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, D.; Yang, L. J., E-mail: yanglj@mail.xjtu.edu.cn; Ma, J. B.
This paper proposes a new triggering method for a long spark gap based on capillary plasma ejection and presents its experimental verification under an extremely low working coefficient, i.e., where the ratio of the spark gap charging voltage to the breakdown voltage is particularly low. The quasi-neutral plasma is ejected from the capillary and develops along the axial direction of the spark gap; the electric field in the spark gap is thus changed and breakdown is initiated. The experiments prove that capillary plasma ejection is effective in triggering the long spark gap under an extremely low working coefficient in air. The study also indicates that the breakdown probability, the breakdown delay, and the delay dispersion are all mainly determined by the characteristics of the ejected plasma, including the length of the plasma flow, the speed of the plasma ejection, and the ionization degree of the plasma. Moreover, the breakdown delay and the delay dispersion increase with the length of the spark gap, and a polarity effect exists in the triggering process. Lastly, compared with the working pattern in which the triggering device is installed in a single electrode, the pattern with devices installed in both electrodes, though with the same breakdown process, achieves ignition over a longer gap distance. Specifically, at a gap length of 14 cm and a working coefficient of less than 2%, the spark gap is still ignited reliably.
NASA Astrophysics Data System (ADS)
Zhao, Yibo; Yu, Guorui; Tan, Jun; Mao, Xiaochen; Li, Jiaqi; Zha, Rui; Li, Ning; Dang, Haizheng
2018-03-01
This paper presents CFD modeling and experimental verification of the oscillating flow and heat transfer processes in a micro coaxial Stirling-type pulse tube cryocooler (MCSPTC) operating at 90-170 Hz. It uses neither double-inlet nor multi-bypass; an inertance tube with a gas reservoir is the only phase shifter. The effects of frequency on the flow and heat transfer processes in the pulse tube are investigated. The results indicate that too low a frequency leads to strong mixing between warm and cold fluids, significantly deteriorating the cooling performance, whereas too high a frequency produces downward-sloping streams that flow from the warm end toward the axis and almost puncture the gas displacer from the warm end, creating larger radial temperature gradients and thus undermining the cooling performance. The influence of the pulse tube length on the temperature and velocity at frequencies much higher than the optimum is also discussed. An MCSPTC with an overall mass of 1.1 kg was built and tested. With an input electric power of 59 W and operating at 144 Hz, it achieves a no-load temperature of 61.4 K and a cooling capacity of 1.0 W at 77 K. The tested trends are in good agreement with the simulations. These studies help to thoroughly understand the underlying mechanism of the inertance MCSPTC operating at very high frequencies.
VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.
2015-12-01
A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance allows a systematic methodology for assessing the quality of model approximations. The software implements common and accepted code verification schemes: the Method of Manufactured Solutions (MMS), the Method of Exact Solutions (MES), and cross-code verification, together with Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed-order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation, and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
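The RE and GCI machinery the package automates can be sketched in a few lines. The function names and the manufactured convergence example below are illustrative, not VAVUQ's actual API; the GCI uses Roache's conventional safety factor of 1.25.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Richardson estimate of the observed order of accuracy p,
    from solutions on three grids with constant refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """Grid Convergence Index on the fine grid (relative error band)."""
    return fs * abs((f_medium - f_fine) / f_fine) / (r**p - 1.0)

def f(h):
    """Manufactured example: a quantity converging as f(h) = 1 + 0.5 * h**2."""
    return 1.0 + 0.5 * h**2

r = 2.0
f_fine, f_medium, f_coarse = f(0.1), f(0.2), f(0.4)
p = observed_order(f_coarse, f_medium, f_fine, r)   # recovers p = 2
gci = gci_fine(f_medium, f_fine, r, p)              # numerical-error band
```

The lower and upper error bounds mentioned in the abstract come from exactly this kind of band around the fine-grid solution.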
Model Checking Satellite Operational Procedures
NASA Astrophysics Data System (ADS)
Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri
2011-08-01
We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model of a system as complex as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs that observe the simulator telemetries and send telecommands to the simulator. To assess the feasibility of our approach, we present experimental results on a simple but meaningful scenario. Our results show that we can save up to 90% of verification time.
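The core idea of exhaustive exploration driven by a model checker can be sketched with a toy example: breadth-first search over every state reachable by issuing telecommands to a simulator, checking an invariant at each state. The simulator stub, command names, and battery invariant below are hypothetical stand-ins for SIMSAT telemetries and telecommands, not the CMurphi integration.

```python
from collections import deque

def step(state, cmd):
    """Toy 'simulator': state is (battery_level, heater_on)."""
    battery, heater = state
    if cmd == "HEATER_ON":
        return (battery, True)
    if cmd == "HEATER_OFF":
        return (battery, False)
    if cmd == "TICK":  # one simulated time step; heater drains faster
        return (max(battery - (2 if heater else 1), 0), heater)
    return state

COMMANDS = ["HEATER_ON", "HEATER_OFF", "TICK"]

def explore(initial, invariant):
    """BFS over all reachable simulator states; return a violating state, if any."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s
        for cmd in COMMANDS:
            nxt = step(s, cmd)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None  # invariant holds in every reachable scenario

# Safety property for the OP: battery level must never reach zero.
violation = explore((5, False), lambda s: s[0] > 0)
```

Manual testing would sample only some command sequences; the exhaustive search is what finds the draining scenario guaranteed.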
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish minor irregularities from serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination; this helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory: it verified 1,500 defective samples and detected all significant defects with only a 0.1 percent false-alarm rate.
Alonso-González, P; Albella, P; Neubrech, F; Huck, C; Chen, J; Golmar, F; Casanova, F; Hueso, L E; Pucci, A; Aizpurua, J; Hillenbrand, R
2013-05-17
Theory predicts a distinct spectral shift between the near- and far-field optical responses of plasmonic antennas. Here we combine near-field optical microscopy and far-field spectroscopy of individual infrared-resonant nanoantennas to experimentally verify this spectral shift. Numerical calculations corroborate our experimental results. We furthermore discuss the implications of this effect for surface-enhanced infrared spectroscopy.
NASA Astrophysics Data System (ADS)
Maiti, Santanu K.
2014-07-01
The cosine-squared dependence of electronic conductance on twist angle in a biphenyl molecule, obtained experimentally by Venkataraman et al. [1], is verified theoretically within a tight-binding framework. Using the Green's function formalism, we numerically calculate the two-terminal conductance as a function of the relative twist angle between the molecular rings and find that the results are in good agreement with the experimental observation.
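The cosine-squared relation itself is simple enough to state directly. A minimal sketch, with an illustrative maximal conductance g0 (the physical value depends on the junction):

```python
import math

def conductance(theta_deg, g0=1.0):
    """Two-terminal conductance of a biphenyl-like junction: g = g0 * cos^2(theta),
    where theta is the twist angle between the two phenyl rings."""
    return g0 * math.cos(math.radians(theta_deg)) ** 2

flat = conductance(0.0)      # coplanar rings: full pi-conjugation, maximal g
half = conductance(45.0)     # intermediate twist: half the maximal conductance
twisted = conductance(90.0)  # perpendicular rings: conjugation broken, g -> 0
```

The physical picture is that inter-ring hopping scales with the pi-orbital overlap, which goes as cos(theta), and conductance goes as the hopping squared.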
NASA Astrophysics Data System (ADS)
Rieben, James C., Jr.
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity-of-solutions experiment and an upper-level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based versus a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning: a traditional (verification) lab design, a traditional lab design using real-world samples, a new lab design employing real-world samples/situations using unknown samples, and the new lab design using real-world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study, a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted into an efficient, low-waste experiment. In the first variant, students tested their products and verified a list of physical properties; in the second, students filled in a blank physical property chart with their own experimental results.
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matloch, L.; Vaccaro, S.; Couland, M.
The back end of the nuclear fuel cycle continues to develop, and the European Commission, particularly the Nuclear Safeguards Directorate of the Directorate-General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel available to Euratom inspectors require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, Euratom Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort Euratom plays various roles, ranging from definition of inspection needs to direct participation in the development of measurement systems, including support of research in the framework of international agreements and via the EC Support Programme to the IAEA. This paper presents recent progress in selected NDA methods, conceived to satisfy different spent fuel verification needs ranging from attribute testing to pin-level partial defect verification. (authors)
Numerical and experimental studies of hydrodynamics of flapping foils
NASA Astrophysics Data System (ADS)
Zhou, Kai; Liu, Jun-kao; Chen, Wei-shan
2018-04-01
The flapping foil, based on bionics, is a simplified model that imitates the motion of the wings or fins of birds or fish. In this paper, a universal kinematic model with three degrees of freedom is adopted, and the motion parallel to the flow direction is considered. The force coefficients, the torque coefficient, and the flow field characteristics are extracted and analyzed, and the propulsive efficiency is calculated. The influence of the motion parameters on the hydrodynamic performance of the bionic foil is studied; the results show that the motion parameters play important roles in the hydrodynamic performance of the flapping foil. To validate the reliability of the numerical method used in this paper, an experimental platform was designed and verification experiments were carried out. The numerical results compare well with the experimental results, showing that the adopted numerical method is reliable. The results of this paper provide a theoretical reference for the design of underwater vehicles based on flapping propulsion.
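One standard similarity parameter for flapping propulsion, the Strouhal number, is not quoted in the abstract but is commonly used when relating motion parameters to propulsive efficiency. The sketch below uses hypothetical foil parameters; the 0.2-0.4 band is the range where biological swimmers and flyers are typically observed to operate efficiently.

```python
def strouhal(freq_hz, peak_to_peak_amp_m, flow_speed_m_s):
    """St = f * A / U, the key similarity parameter for flapping propulsion:
    f = flapping frequency, A = peak-to-peak heave amplitude, U = inflow speed."""
    return freq_hz * peak_to_peak_amp_m / flow_speed_m_s

# Hypothetical foil: 2 Hz heave, 0.1 m peak-to-peak amplitude, 0.8 m/s inflow.
st = strouhal(2.0, 0.1, 0.8)
efficient = 0.2 <= st <= 0.4  # flag whether the kinematics sit in the efficient band
```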
Experimental Demonstration of Counterfactual Quantum Communication
NASA Astrophysics Data System (ADS)
Liu, Yang; Ju, Lei; Liang, Xiao-Lei; Tang, Shi-Biao; Tu, Guo-Liang Shen; Zhou, Lei; Peng, Cheng-Zhi; Chen, Kai; Chen, Teng-Yun; Chen, Zeng-Bing; Pan, Jian-Wei
2012-07-01
Quantum effects, besides offering substantial superiority in many tasks over classical methods, are also expected to provide interesting ways to establish secret keys between remote parties. A striking scheme called “counterfactual quantum cryptography” proposed by Noh [Phys. Rev. Lett. 103, 230501 (2009).PRLTAO0031-900710.1103/PhysRevLett.103.230501] allows one to maintain secure key distributions, in which particles carrying secret information are seemingly not being transmitted through quantum channels. We have experimentally demonstrated, for the first time, a faithful implementation for such a scheme with an on-table realization operating at telecom wavelengths. To verify its feasibility for extension over a long distance, we have furthermore reported an illustration on a 1 km fiber. In both cases, high visibilities of more than 98% are achieved through active stabilization of interferometers. Our demonstration is crucial as a direct verification of such a remarkable application, and this procedure can become a key communication module for revealing fundamental physics through counterfactuals.
Design Considerations of a Transverse Flux Machine for Direct-Drive Wind Turbine Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Husain, Tausif; Hasan, Iftekhar; Sozer, Yilmaz
This paper presents the design considerations of a double-sided transverse flux machine (TFM) for direct-drive wind turbine applications. The TFM has a modular structure with quasi-U stator cores and ring windings. The rotor is constructed with ferrite magnets in a flux-concentrating arrangement to achieve high air gap flux density. The design considerations for this TFM with respect to initial sizing, pole number selection, key design ratios, and pole shaping are presented in this paper. Pole number selection is critical in the design process of a TFM because it affects both the torque density and power factor under fixed magnetic and changing electrical loading. Several key design ratios are introduced to facilitate the design procedure. The effect of pole shaping on back-emf and inductance is also analyzed. These investigations provide guidance toward the required design of a TFM for direct-drive applications. The analyses are carried out using analytical and three-dimensional finite element analysis. A prototype is under construction for experimental verification.
Backward spoof surface wave in plasmonic metamaterial of ultrathin metallic structure.
Liu, Xiaoyong; Feng, Yijun; Zhu, Bo; Zhao, Junming; Jiang, Tian
2016-02-04
Backward waves, with anti-parallel phase and group velocities, are one of the basic properties associated with negative refraction and sub-diffraction imaging, which have attracted considerable interest in the context of photonic metamaterials. It has been predicted theoretically that some plasmonic structures can also support backward-wave propagation of surface plasmon polaritons (SPPs); however, to the best of our knowledge, a direct experimental demonstration has not been reported. In this paper, a specially designed plasmonic metamaterial of corrugated metallic strips is proposed that can support backward spoof SPP wave propagation. Dispersion analysis, full electromagnetic field simulation, and transmission measurements of the plasmonic metamaterial waveguide clearly validate backward wave propagation, with a dispersion relation possessing negative slope and opposite directions of group and phase velocities. As a further verification and application, a contra-directional coupler is designed and tested that can route a microwave signal to opposite terminals at different operating frequencies, indicating new application opportunities for plasmonic metamaterials in integrated functional devices and circuits for microwave and terahertz radiation.
Li, Yongfeng; Ma, Hua; Wang, Jiafu; Pang, Yongqiang; Zheng, Qiqi; Chen, Hongya; Han, Yajuan; Zhang, Jieqiu; Qu, Shaobo
2017-01-01
A high-efficiency tri-band quasi-continuous phase gradient metamaterial is designed and demonstrated based on spoof surface plasmon polaritons (SSPPs). High-efficiency polarization conversion transmission is first achieved by tailoring the phase difference between the transmissive SSPP and the space wave in orthogonal directions. As an example, a tri-band circular-to-circular (CTC) polarization conversion metamaterial (PCM) was designed using a nonlinearly dispersive phase difference. Using this PCM unit cell, a tri-band quasi-continuous phase gradient metamaterial (PGM) was then realized by virtue of the Pancharatnam-Berry phase. The distribution of the cross-polarization transmission phase along the x-direction is continuous except for two infinitesimally small intervals near the phases 0° and 360°, so the phase gradient is defined at any point along the x-direction. The simulated normalized polarization conversion transmission spectra, together with the electric field distributions for circularly and linearly polarized waves, demonstrate the high-efficiency anomalous refraction of the quasi-continuous PGM. Experimental verification for linearly polarized incidence is also provided. PMID:28079185
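The anomalous refraction of a phase gradient metamaterial follows the generalized Snell's law. The sketch below applies it with hypothetical microwave numbers; the frequency, phase span, and incidence angle are illustrative, not values from the paper.

```python
import math

def anomalous_refraction_deg(theta_i_deg, wavelength_m, dphi_dx_rad_per_m):
    """Generalized Snell's law:
    sin(theta_t) = sin(theta_i) + (lambda / (2*pi)) * dphi/dx."""
    s = (math.sin(math.radians(theta_i_deg))
         + wavelength_m / (2.0 * math.pi) * dphi_dx_rad_per_m)
    if abs(s) > 1.0:
        return None  # beyond the critical condition: no propagating refracted beam
    return math.degrees(math.asin(s))

# Hypothetical PGM at 10 GHz (lambda = 3 cm) with a full 2*pi phase span over 6 cm.
wavelength = 0.03
gradient = 2.0 * math.pi / 0.06
theta_t = anomalous_refraction_deg(0.0, wavelength, gradient)  # normal incidence
```

For these numbers a normally incident beam is bent to 30 degrees; a quasi-continuous gradient avoids the spurious diffraction orders of coarsely discretized phase profiles.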
An Overview and Empirical Comparison of Distance Metric Learning Methods.
Moutafis, Panagiotis; Leng, Mengjun; Kakadiaris, Ioannis A
2016-02-16
In this paper, we first offer an overview of advances in the field of distance metric learning. Then, we empirically compare selected methods using a common experimental protocol. The number of distance metric learning algorithms proposed keeps growing due to their effectiveness and wide applicability. However, existing surveys are either outdated or focus on only a few methods. As a result, there is an increasing need to summarize the obtained knowledge in a concise yet informative manner. Moreover, existing surveys do not conduct comprehensive experimental comparisons, while individual distance metric learning papers compare the performance of the proposed approach with only a few related methods, and under different settings. This highlights the need for an experimental evaluation using a common and challenging protocol. To this end, we conduct face verification experiments, as this task poses significant challenges due to varying conditions during data acquisition. In addition, face verification is a natural application for distance metric learning, because the challenge is to define a distance function that: 1) accurately expresses the notion of similarity for verification; 2) is robust to noisy data; 3) generalizes well to unseen subjects; and 4) scales well with the dimensionality and number of training samples. In particular, we utilize well-tested features to assess the performance of selected methods following the experimental protocol of the state-of-the-art Labeled Faces in the Wild database. A summary of the results is presented, along with a discussion of the insights obtained and lessons learned by employing the corresponding algorithms.
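Verification with a learned metric reduces to thresholding a distance between descriptor pairs. A minimal sketch follows, assuming a diagonal Mahalanobis form with hand-set weights and toy 4-D descriptors; real methods learn the full metric from labeled pairs.

```python
def mahalanobis_sq(x, y, w):
    """Squared diagonal Mahalanobis distance: d^2 = sum_i w_i * (x_i - y_i)^2.
    The weights w play the role of a learned metric, down-weighting noisy dims."""
    return sum(wi * (xi - yi) ** 2 for xi, yi, wi in zip(x, y, w))

def same_subject(x, y, w, threshold):
    """Verification decision: accept the pair when the learned distance is small."""
    return mahalanobis_sq(x, y, w) <= threshold

# Hypothetical 4-D face descriptors; the last dimension is noisy, so w_4 = 0.1.
w = [1.0, 1.0, 1.0, 0.1]
genuine = same_subject([0.9, 0.2, 0.5, 3.0], [1.0, 0.3, 0.4, 0.0], w, threshold=1.0)
impostor = same_subject([0.9, 0.2, 0.5, 0.0], [2.0, 1.5, 0.4, 0.0], w, threshold=1.0)
```

With Euclidean weights (all 1.0) the large but uninformative last-dimension gap would reject the genuine pair; the learned weighting is what makes the decision robust, which is criterion 2) above.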
RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree-configured) distribution feeders were undertaken. That work demonstrated the model's feasibility and validity based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification to a greater variety of situations and network sizes, (2) extending the model capabilities to reverse-direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures and fixes for ameliorating propagation trouble spots. Results are summarized.
NASA Astrophysics Data System (ADS)
Brown, Alexander; Eviston, Connor
2017-02-01
Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations, including varying probe dimensions and orientations along with complex probe geometries. This will also enable the creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models correctly captured the probe-conductor interactions and accurately calculated the change in impedance in several experimental scenarios with acceptable error. The promising results of these models enabled the start of an eddy current probe model library, giving experimenters easy access to powerful parameter-based eddy current models for other project applications.
NASA Technical Reports Server (NTRS)
Mckenzie, Robert L.
1988-01-01
An analytical study and its experimental verification are described which show the performance capabilities and the hardware requirements of a method for measuring atmospheric density along the Space Shuttle flightpath during entry. Using onboard instrumentation, the technique relies on Rayleigh scattering of light from a pulsed ArF excimer laser operating at a wavelength of 193 nm. The method is shown to be capable of providing density measurements with an uncertainty of less than 1 percent and with a spatial resolution along the flightpath of 1 km, over an altitude range from 50 to 90 km. Experimental verification of the signal linearity and the expected signal-to-noise ratios is demonstrated in a simulation facility at conditions that duplicate the signal levels of the flight environment.
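A simplified photon-statistics bound (not the paper's full error budget) shows where a sub-1-percent density requirement pushes the detected photon count: a Rayleigh-scattering return with shot-noise-limited counting has relative error 1/sqrt(N). The photon numbers below are illustrative.

```python
import math

def relative_uncertainty(n_photons):
    """Shot-noise-limited relative error of a photon-counting measurement;
    density is proportional to the Rayleigh return, so its error scales the same way."""
    return 1.0 / math.sqrt(n_photons)

# For a sub-1% density uncertainty, at least 10^4 detected photons are needed
# per 1 km range bin (photon statistics only, ignoring background and calibration).
needed = 1.0e4
ok = relative_uncertainty(needed) <= 0.01
```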
Experimental verification of cleavage characteristic stress vs grain size
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lei, W.; Li, D.; Yao, M.
Instead of the accepted cleavage fracture stress σ_f proposed by Knott et al., a new parameter S_co, termed the "cleavage characteristic stress," has recently been recommended to characterize the microscopic resistance to cleavage fracture. By definition, S_co is the fracture stress at the brittle/ductile transition temperature of steels in plain tension, below which the yield strength approximately equals the true fracture stress, combined with an abrupt curtailment of ductility. By considering a single-grain microcrack arrested at a boundary, Huang and Yao set up an expression for S_co as a function of grain size. The present work provides an experimental verification of S_co vs grain size.
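The abstract does not reproduce Huang and Yao's expression. As an assumption, a classical Griffith-type relation for a crack spanning one grain gives the expected d^(-1/2) grain-size scaling; the formula and all material constants below are illustrative, not the paper's model.

```python
import math

def cleavage_stress_pa(E_pa, gamma_p_j_m2, d_m, nu=0.3):
    """Griffith-type cleavage stress for a grain-sized crack (assumed form):
    sigma = sqrt(4 * E * gamma_p / (pi * (1 - nu^2) * d)),
    with E = Young's modulus, gamma_p = effective surface energy, d = grain size."""
    return math.sqrt(4.0 * E_pa * gamma_p_j_m2 / (math.pi * (1.0 - nu**2) * d_m))

# Hypothetical mild-steel numbers: E = 210 GPa, gamma_p = 14 J/m^2.
coarse = cleavage_stress_pa(210e9, 14.0, 100e-6)  # 100 micron grains
fine = cleavage_stress_pa(210e9, 14.0, 10e-6)     # 10 micron grains
ratio = fine / coarse  # refining grains 10x raises the stress by sqrt(10)
```

This d^(-1/2) trend is the kind of dependence the experimental S_co vs grain-size data are compared against.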
Experimental verification of multipartite entanglement in quantum networks
McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.
2016-01-01
Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361
Gaia challenging performances verification: combination of spacecraft models and test results
NASA Astrophysics Data System (ADS)
Ecale, Eric; Faye, Frédéric; Chassat, François
2016-08-01
To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements were given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g., end-of-mission parallax, maximum detectable magnitude, maximum sky density, or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on the ground (i.e., before launch). The challenging Gaia requirements are not verifiable by such a strategy, however, principally because no test facility exists that can reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module thermal vacuum test to performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, at both the conceptual and the implementation level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NuSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
Test load verification through strain data analysis
NASA Technical Reports Server (NTRS)
Verderaime, V.; Harrington, F.
1995-01-01
A traditional binding acceptance criterion for polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design maximum expected operational limit. In this extreme strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. Testing may then result in erroneously accepting a submarginal design or rejecting a reliable one. A technique was developed to identify, monitor, and assess the load transmission error using two back-to-back surface-measured strain data sets. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.
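The back-to-back strain idea can be sketched directly: the mean of the two opposing surface strains is the membrane (axial) component, and half their difference is the bending component, whose growth flags a distorted load path. The strain values and tolerance below are illustrative, not from the paper.

```python
def membrane_bending(strain_front, strain_back):
    """Split two back-to-back surface strain readings into membrane and bending
    components (small-strain beam/plate assumption)."""
    membrane = 0.5 * (strain_front + strain_back)
    bending = 0.5 * (strain_front - strain_back)
    return membrane, bending

# Pure axial load transfer gives equal strains on both faces...
m_good, b_good = membrane_bending(1500e-6, 1500e-6)
# ...while a load-transmission error shows up as a bending component.
m_bad, b_bad = membrane_bending(1800e-6, 1200e-6)
bending_fraction = abs(b_bad) / abs(m_bad)  # flag when this exceeds a set tolerance
```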
Optimized Temporal Monitors for SystemC
NASA Technical Reports Server (NTRS)
Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.
2012-01-01
SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
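A runtime monitor of the kind generated from a temporal property can be sketched by hand for a single assertion. The property G(req -> F ack) ("every request is eventually acknowledged"), the event names, and the finite-trace semantics below are illustrative, not the paper's monitor encoding; minimizing the per-event work in `observe` is exactly the runtime-overhead concern the paper targets.

```python
class ReqAckMonitor:
    """Two-state monitor for the LTL-style property G(req -> F ack),
    fed one observed event per simulation callback."""

    def __init__(self):
        self.pending = False   # True while a req awaits its ack
        self.violated = False  # set at end of trace if a req was never acked

    def observe(self, event):
        # Constant-time state update: this is the hot path at runtime.
        if event == "req":
            self.pending = True
        elif event == "ack":
            self.pending = False

    def end_of_trace(self):
        # On a finite trace, an outstanding req means the eventuality failed.
        self.violated = self.pending
        return not self.violated

mon_ok = ReqAckMonitor()
for e in ["req", "idle", "ack", "req", "ack"]:
    mon_ok.observe(e)
ok = mon_ok.end_of_trace()

mon_bad = ReqAckMonitor()
for e in ["req", "idle", "idle"]:
    mon_bad.observe(e)
bad = mon_bad.end_of_trace()
```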
DOE Office of Scientific and Technical Information (OSTI.GOV)
J Zwan, B; Central Coast Cancer Centre, Gosford, NSW; Colvill, E
2016-06-15
Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real time: (1) field size, (2) field location, and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) in which real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm^2 (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate, respectively. Field location errors (i.e., tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors or individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection, and the system has been shown to be sensitive to a range of delivery errors.
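Two of the comparison metrics, field size and field location, can be sketched from the extracted leaf-bank positions. The leaf geometry, the three-leaf aperture, and the 3 mm shift below are illustrative, not the in-house software's implementation.

```python
def field_metrics(bank_a, bank_b, leaf_width_cm=0.5):
    """Return (area_cm2, centroid_x_cm) of an MLC-defined aperture, given the
    leaf-tip positions (cm) of the two opposing banks, one value per leaf pair."""
    gaps = [b - a for a, b in zip(bank_a, bank_b)]
    open_gaps = [g for g in gaps if g > 0]
    area = leaf_width_cm * sum(open_gaps)
    centers = [0.5 * (a + b) for a, b in zip(bank_a, bank_b)]
    centroid = (sum(c * g for c, g in zip(centers, gaps) if g > 0)
                / sum(open_gaps))
    return area, centroid

planned_a, planned_b = [-2.0, -2.0, -2.0], [2.0, 2.0, 2.0]
measured_a, measured_b = [-1.7, -1.7, -1.7], [2.3, 2.3, 2.3]  # 3 mm shift in X

area_p, cen_p = field_metrics(planned_a, planned_b)
area_m, cen_m = field_metrics(measured_a, measured_b)
location_error_cm = cen_m - cen_p  # a tracking-direction error of the kind detected
```

A pure shift leaves the field-size metric unchanged but moves the location metric, which is why both (plus shape) are checked frame by frame.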
Quality assurance of a gimbaled head swing verification using feature point tracking.
Miura, Hideharu; Ozawa, Shuichi; Enosaki, Tsubasa; Kawakubo, Atsushi; Hosono, Fumika; Yamada, Kiyoshi; Nagata, Yasushi
2017-01-01
To perform dynamic tumor tracking (DTT) safely and accurately in clinical applications, verification of the gimbaled head swing is important. We propose a quantitative gimbaled head swing verification method for daily quality assurance (QA) that uses feature point tracking and a web camera. The web camera was placed on the couch at the same position for every verification, and the gimbaled head was driven with a predetermined input function (sinusoidal pattern; amplitude: ±20 mm; cycle: 3 s) in the pan and tilt directions at the isocenter plane. Consecutive image pairs were then analyzed for each feature point using the pyramidal Lucas-Kanade (LK) method, an optical flow estimation algorithm. We used a tapped hole as the feature point on the gimbaled head. The period and amplitude were analyzed to obtain a quantitative gimbaled head swing value for daily QA. The mean ± SD of the period was 3.00 ± 0.03 (range: 3.00-3.07) s and 3.00 ± 0.02 (range: 3.00-3.07) s in the pan and tilt directions, respectively. The mean ± SD of the relative displacement was 19.7 ± 0.08 (range: 19.6-19.8) mm and 18.9 ± 0.2 (range: 18.4-19.5) mm in the pan and tilt directions, respectively. The gimbaled head swing was reliable for DTT. Our method can quantitatively assess the gimbaled head swing for daily QA against baseline values measured at the time of acceptance and commissioning. © 2016 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
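The core of a Lucas-Kanade step can be sketched in a few lines of NumPy (a single-point, single-iteration sketch under small-displacement assumptions; the pyramidal algorithm used in the paper iterates this over image pyramids):

```python
import numpy as np

def lk_translation(i0, i1):
    """One Lucas-Kanade iteration: least-squares estimate of the (dx, dy)
    translation between two grayscale patches i0 -> i1."""
    iy, ix = np.gradient(i0)             # spatial gradients of the template
    it = i1 - i0                         # temporal difference
    a = np.array([[np.sum(ix * ix), np.sum(ix * iy)],
                  [np.sum(ix * iy), np.sum(iy * iy)]])
    b = -np.array([np.sum(ix * it), np.sum(iy * it)])
    return np.linalg.solve(a, b)         # (dx, dy)

# Synthetic check: a Gaussian blob shifted by 0.3 px in x.
y, x = np.mgrid[0:64, 0:64]
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * 3.0 ** 2))
dx, dy = lk_translation(blob(32.0, 32.0), blob(32.3, 32.0))
# dx should come out close to 0.3 and dy close to 0
```

Accumulating such sub-pixel displacements frame by frame is what lets a plain web camera resolve the period and amplitude of the head swing.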
2017-01-01
Several reactions, known from other amine systems for CO2 capture, have been proposed for Lewatit VP OC 1065. The aim of this molecular modeling study is to elucidate the CO2 capture process: the physisorption step that precedes CO2 capture, and the capture reactions themselves. Molecular modeling shows that the resin has a structure with benzylamine groups at alternating positions in close vicinity of each other. Based on this structure, the preferred adsorption modes of CO2 and H2O were established. Next, using standard density functional theory, two catalytic reactions responsible for the actual CO2 capture were identified: direct amine-catalyzed and amine-H2O-catalyzed formation of carbamic acid. The latter is a new type of catalysis. Other reactions are unlikely. Quantitative comparison of the molecular modeling results with known experimental CO2 adsorption isotherms, applying a dual-site Langmuir adsorption isotherm model, further supports all results of this study. PMID:29142339
DOE Office of Scientific and Technical Information (OSTI.GOV)
Husain, Tausif; Hasan, Iftekhar; Sozer, Yilmaz
This paper presents the design considerations of a double-sided transverse flux machine (TFM) for direct-drive wind turbine applications. The TFM has a modular structure with quasi-U stator cores and ring windings. The rotor is constructed with ferrite magnets in a flux-concentrating arrangement to achieve high air gap flux density. The design considerations for this TFM with respect to initial sizing, pole number selection, key design ratios, and pole shaping are presented in this paper. Pole number selection is critical in the design process of a TFM because it affects both the torque density and power factor under fixed magnetic and changing electrical loading. Several key design ratios are introduced to facilitate the design procedure. The effect of pole shaping on back-EMF and inductance is also analyzed. These investigations provide guidance toward the required design of a TFM for direct-drive applications. The analyses are carried out using analytical and three-dimensional finite element analysis. A prototype is under construction for experimental verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.
We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
NASA Astrophysics Data System (ADS)
Dang, Haizheng; Zhao, Yibo
2016-09-01
This paper presents the CFD modeling and experimental verification of a single-stage inertance tube coaxial Stirling-type pulse tube cryocooler operating at 30-35 K, using mixed stainless steel mesh regenerator matrices without either double-inlet or multi-bypass. A two-dimensional axisymmetric CFD model with the thermal non-equilibrium mode is developed to simulate the internal process, and the underlying mechanism by which mixed matrices significantly reduce regenerator losses is discussed in detail based on six cases. The modeling also indicates that the combination of different mesh segments can be optimized to achieve the highest cooling efficiency or the largest exergy ratio. Verification experiments were then conducted, in which satisfactory agreement between simulated and measured results was observed. The experiments achieved a no-load temperature of 27.2 K and a cooling power of 0.78 W at 35 K, or 0.29 W at 30 K, with an input electric power of 220 W and a reject temperature of 300 K.
Video-Based Fingerprint Verification
Qin, Wei; Yin, Yilong; Liu, Lili
2013-01-01
Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and alignment, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both the dynamic and static information contained in fingerprint videos. Match scores between two fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method achieves a relative reduction of 60 percent in the equal error rate (EER) compared to the conventional single-impression-based method. We also analyze the time complexity of our method under different combinations of strategies; our method still outperforms the conventional method even when both have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method achieves better accuracy than the multiple-impression fusion method, with a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
National Centers for Environmental Prediction
NASA Astrophysics Data System (ADS)
Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.
2015-11-01
High-range measuring equipment such as laser trackers needs large-dimension calibrated reference artifacts in calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points, and of the reference lengths derived from them, relies on the concept of the indexed metrology platform and on high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. The measuring instrument, together with the indexed metrology platform, remains static while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied here to a laser tracker. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for defining reference distances in these procedures.
NASA Technical Reports Server (NTRS)
Baumann, William T.; Saunders, William R.; Vandsburger, Uri; Saus, Joseph (Technical Monitor)
2003-01-01
The VACCG team comprises engineers at Virginia Tech who specialize in the subject areas of combustion physics, chemical kinetics, dynamics and controls, and signal processing. Currently, the team's work on this NRA research grant is designed to determine key factors that influence combustion control performance through a blend of theoretical and experimental investigations targeting design and demonstration of active control for three different combustors. To validate the accuracy of conclusions about control effectiveness, a sequence of experimental verifications on increasingly complex lean, direct-injection combustors is underway. During the work period January 1, 2002 through October 15, 2002, work focused on two different laboratory-scale combustors that allow access for a wide variety of measurements. As the grant work proceeds, one key goal will be to obtain certain knowledge about a particular combustor process using a minimum of sophisticated measurements, owing to the practical limitations of measurements on full-scale combustors. In the second year, results obtained in the first year will be validated on test combustors to be identified in the first quarter of that year. In the third year, it is proposed to validate the results at more realistic pressure and power levels by utilizing the facilities at the Glenn Research Center.
NASA Astrophysics Data System (ADS)
Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano
2016-04-01
The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: (a) use of a scenario-based neo-deterministic approach (NDSHA) for the calculation of the seismic input, and (b) control of the numerical model of an existing building, using free-vibration measurements of the real structure. The key point of this approach is the close collaboration of the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the building response in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then, realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced from the national codes (e.g. NTC 2008 for Italy). The verifying engineer's task is to ensure that the verification outcome is both conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g. schools) of the Trieste Province. The adoption of the scenario input has in most cases increased the number of critical elements that must be taken into account in the design of reinforcements. However, the higher cost associated with the increased number of elements to reinforce is reasonable, especially considering the important reduction of the risk level.
Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje
2012-05-01
The purpose of this retrospective study was to assess whether there was a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. This investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients a verification jig was made (group J), while for the remaining 14 patients a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit; when a verification jig was not used (group NJ, n = 14), only two frameworks did. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete dentures. © 2011 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Mailhot, J.; Milbrandt, J. A.; Giguère, A.; McTaggart-Cowan, R.; Erfani, A.; Denis, B.; Glazer, A.; Vallée, M.
2014-01-01
Environment Canada ran an experimental numerical weather prediction (NWP) system during the Vancouver 2010 Winter Olympic and Paralympic Games, consisting of nested high-resolution (down to 1-km horizontal grid-spacing) configurations of the GEM-LAM model, with improved geophysical fields, cloud microphysics and radiative transfer schemes, and several new diagnostic products such as density of falling snow, visibility, and peak wind gust strength. The performance of this experimental NWP system has been evaluated in these winter conditions over complex terrain using the enhanced mesoscale observing network in place during the Olympics. As compared to the forecasts from the operational regional 15-km GEM model, objective verification generally indicated significant added value of the higher-resolution models for near-surface meteorological variables (wind speed, air temperature, and dewpoint temperature) with the 1-km model providing the best forecast accuracy. Appreciable errors were noted in all models for the forecasts of wind direction and humidity near the surface. Subjective assessment of several cases also indicated that the experimental Olympic system was skillful at forecasting meteorological phenomena at high-resolution, both spatially and temporally, and provided enhanced guidance to the Olympic forecasters in terms of better timing of precipitation phase change, squall line passage, wind flow channeling, and visibility reduction due to fog and snow.
Chambers, Andrew G; Percy, Andrew J; Simon, Romain; Borchers, Christoph H
2014-04-01
Accurate cancer biomarkers are needed for early detection, disease classification, prediction of therapeutic response, and monitoring of treatment. While there appears to be no shortage of candidate biomarker proteins, a major bottleneck in the biomarker pipeline continues to be their verification by enzyme-linked immunosorbent assays. Multiple reaction monitoring (MRM), also known as selected reaction monitoring, is a targeted mass spectrometry approach to protein quantitation and is emerging as a bridge between biomarker discovery and clinical validation. Highly multiplexed MRM assays are readily configured and enable simultaneous verification of large numbers of candidates, facilitating the development of biomarker panels, which can increase specificity. This review focuses on recent applications of MRM to the analysis of plasma and serum from cancer patients for biomarker verification. The current status of this approach is discussed along with future directions for targeted mass spectrometry in clinical biomarker validation.
Positron emission imaging device and method of using the same
Bingham, Philip R.; Mullens, James Allen
2013-01-15
An imaging system and method of imaging are disclosed. The imaging system can include an external radiation source producing pairs of substantially simultaneous radiation emissions, a picturization emission and a verification emission, at an emission angle. The imaging system can also include a plurality of picturization sensors and at least one verification sensor for detecting the picturization and verification emissions, respectively. The imaging system also includes an object stage arranged such that a picturization emission can pass through an object supported on the object stage before being detected by one of the picturization sensors. A coincidence system and a reconstruction system can also be included. The coincidence system can receive information from the picturization and verification sensors and determine whether a detected picturization emission is direct radiation or scattered radiation. The reconstruction system can produce a multi-dimensional representation of an object imaged with the imaging system.
Exploring system interconnection architectures with VIPACES: from direct connections to NOCs
NASA Astrophysics Data System (ADS)
Sánchez-Peña, Armando; Carballo, Pedro P.; Núñez, Antonio
2007-05-01
This paper presents a simple environment for the verification of AMBA 3 AXI systems through Verification IP (VIP) production, called VIPACES (Verification Interface Primitives for the development of AXI Compliant Elements and Systems). These primitives are provided as an uncompiled library written in SystemC in which interfaces are the core of the library. The definition of interfaces instead of generic modules lets the user construct custom modules, reducing the resources spent during the verification phase and easily adapting the user's modules to the AMBA 3 AXI protocol. The paper focuses on comparing and contrasting the main interconnection schemes for AMBA 3 AXI as modeled by VIPACES. To assess these results, we propose a validation scenario with a particular architecture from the domain of MPEG-4 video decoding, composed of an AXI bus connecting an IDCT and other processing resources.
1981-01-01
…per-rev, ring weighting factor, etc.) and with compression system design. A detailed description of the SAE methodology is provided in Ref. 1. … offers insights into the practical application of experimental aeromechanical procedures and establishes the process of valid design assessment, avoiding … considerations given to the total engine system. Design Verification in the Experimental Laboratory: Certain key parameters are influencing the design of modern …
Resistivity Correction Factor for the Four-Probe Method: Experiment I
NASA Astrophysics Data System (ADS)
Yamashita, Masato; Yamaguchi, Shoji; Enjoji, Hideo
1988-05-01
Experimental verification of the theoretically derived resistivity correction factor (RCF) is presented. Resistivity and sheet resistance measurements by the four-probe method are made on three samples: isotropic graphite, ITO film, and Au film. The results indicate that the RCF can correct the apparent variations in the experimental data to yield reasonable resistivities and sheet resistances.
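As a reminder of the underlying relation (standard four-probe theory, not code from the paper): for a collinear four-probe measurement on a large thin sample, the sheet resistance is R_s = (π/ln 2)(V/I), and a correction factor rescales this ideal value for finite geometry. A small sketch, with the placement of the correction factor as an illustrative assumption:

```python
import math

def sheet_resistance(v, i, rcf=1.0):
    """Four-probe sheet resistance in ohms/square.

    rcf is the resistivity correction factor for finite sample
    geometry (1.0 recovers the ideal infinite-sheet formula).
    """
    return (math.pi / math.log(2)) * (v / i) * rcf

def resistivity(v, i, thickness_m, rcf=1.0):
    """Bulk resistivity (ohm*m) of a film of known thickness."""
    return sheet_resistance(v, i, rcf) * thickness_m

rs = sheet_resistance(v=1.0e-3, i=1.0e-3)   # 1 mV across 1 mA
# rs ~= 4.532 ohms/square for the ideal infinite sheet
```

The geometric prefactor π/ln 2 ≈ 4.532 is why a raw V/I reading must always be scaled before being reported as a sheet resistance.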
1980-06-05
N-231 High Reynolds Number Channel Facility (an example of a versatile wind tunnel): Tunnel 1 is a blowdown facility that utilizes interchangeable test sections and nozzles. The facility provides experimental support for fluid mechanics research, including experimental verification of aerodynamic computer codes and boundary-layer and airfoil studies that require high-Reynolds-number simulation. (Tunnel 1)
Laboratory directed research and development program, FY 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-02-01
The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) Laboratory Directed Research and Development Program FY 1996 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of the Laboratory Directed Research and Development (LDRD) program planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The Berkeley Lab LDRD program is a critical tool for directing the Laboratory's forefront scientific research capabilities toward vital, excellent, and emerging scientific challenges. The program provides the resources for Berkeley Lab scientists to make rapid and significant contributions to critical national science and technology problems. The LDRD program also advances the Laboratory's core competencies, foundations, and scientific capability, and permits exploration of exciting new opportunities. Areas eligible for support include: (1) work in forefront areas of science and technology that enrich Laboratory research and development capability; (2) advanced study of new hypotheses, new experiments, and innovative approaches to develop new concepts or knowledge; (3) experiments directed toward proof of principle for initial hypothesis testing or verification; and (4) conception and preliminary technical analysis to explore possible instrumentation, experimental facilities, or new devices.
Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C C M; Alderliesten, Tanja
2015-12-01
The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up cone-beam CT. For each patient we calculated pairwise distances between markers over time to evaluate geometric tumor volume variation. We then quantified marker displacements relative to bony anatomy and estimated the variation of systematic (Σ) and random errors (σ). During bony anatomy-based setup verification, we visually inspected whether the markers were inside the planning target volume (PTV) and attempted marker-based registration. Minor time trends with substantial fluctuations in pairwise distances implied tissue deformation. Overall, Σ(σ) in the left-right/cranial-caudal/anterior-posterior direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm; for the proximal stomach, it was 5.4(4.3)/4.9(3.2)/1.9(2.4) mm. After bony anatomy-based setup correction, all markers were inside the PTV. However, due to large tissue deformation, marker-based registration was not feasible. Generally, the interfractional position variation of esophageal tumors is more pronounced in the cranial-caudal direction and in the proximal stomach. Currently, marker-based setup verification is not feasible for clinical routine use, but markers can facilitate the setup verification by inspecting whether the PTV covers the tumor volume adequately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
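The systematic and random error components can be estimated in the standard way (a sketch under the assumption that displacements are grouped per patient over fractions): Σ is the spread of the per-patient mean displacements and σ is the root mean square of the per-patient SDs.

```python
import numpy as np

def systematic_random_errors(displacements_per_patient):
    """Estimate Sigma (systematic) and sigma (random) error components.

    displacements_per_patient: list of 1-D arrays, one per patient,
    holding that patient's marker displacements (mm) over fractions.
    """
    means = [np.mean(d) for d in displacements_per_patient]
    sds = [np.std(d, ddof=1) for d in displacements_per_patient]
    big_sigma = np.std(means, ddof=1)               # spread of systematic errors
    small_sigma = np.sqrt(np.mean(np.square(sds)))  # RMS of random spreads
    return big_sigma, small_sigma

# Toy data: two patients with constant offsets -> no random component.
big, small = systematic_random_errors([np.array([2.0, 2.0, 2.0]),
                                       np.array([-2.0, -2.0, -2.0])])
# big ~= 2.83 mm, small == 0.0 mm
```

Computing the pair per anatomical direction (left-right, cranial-caudal, anterior-posterior) yields the Σ(σ) triplets reported in the abstract.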
A superfluid helium system for an LST IR experiment
NASA Technical Reports Server (NTRS)
Breckenridge, R. W., Jr.; Moore, R. W., Jr.
1975-01-01
The results are presented of a study program directed toward evaluating the problems associated with cooling an LST instrument to 2 K for a year by using superfluid helium as the cooling means. The results include the parametric analysis of systems using helium only, and systems using helium plus a shield cryogen. A baseline system, using helium only, is described. The baseline system is sized for an instrument heat leak of 50 mW. It contains 71 kg of superfluid helium and has a total filled weight of 217 kg. A brief assessment of the technical problems associated with a long-life, spaceborne superfluid helium storage system is also made. It is concluded that a one-year-life superfluid helium cooling system is feasible, pending experimental verification of a suitable low-g vent system.
An Overview-NASA LeRC Structures Program
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
1997-01-01
The Structures and Acoustics Division of the NASA Lewis Research Center has its genesis dating back to 1943. It has been an independent Division at Lewis since 1979. Its two primary capabilities are performance and life analysis of static and dynamic systems such as those found in aircraft and spacecraft propulsion systems and experimental verification of these analyses. Research is conducted in-house, through university grants and contracts, and through cooperative programs with industry. Our work directly supports NASA's Advanced Subsonic Technology (AST), Smart Green Engine, Fast Quiet Engine, High-Temperature Materials and Processing (HiTEMP), Hybrid Hyperspeed Propulsion, Rotorcraft, High-Speed Research (HSR), and Aviation Safety Program (AvSP). A general overview is given discussing these programs and other technologies that are being developed at NASA LeRC.
2000-05-01
"Flexible Aircraft Control", held in Ottawa, Canada, 18-20 October 1999, and published in RTO MP-36. 9-2 INTRODUCTION 2. Principles of the calculation method … consists of an instrumentation set of 17 strain-gauge bridges, 20 accelerometers, and 5 …; the pressures on the control surface and the hinge moment are overestimated … 130 mm). Both the calculation and the tests permit calculation-test correlations of the strain-gauge responses under 12 static load cases.
NASA Technical Reports Server (NTRS)
Nicks, Oran W.; Korkan, Kenneth D.
1991-01-01
Two reports on student activities to determine the properties of a new laminar airfoil, delivered at a conference on soaring technology, are presented. The papers discuss a wind tunnel investigation and analysis of the SM701 airfoil and verification of the SM701 airfoil's aerodynamic characteristics utilizing theoretical techniques. The papers are based on a combination of analytical design, hands-on model fabrication, wind tunnel calibration and testing, data acquisition and analysis, and comparison of test results and theory.
Bounded Parametric Model Checking for Elementary Net Systems
NASA Astrophysics Data System (ADS)
Knapik, Michał; Szreter, Maciej; Penczek, Wojciech
Bounded Model Checking (BMC) is an efficient verification method for reactive systems. So far, BMC has been applied to the verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL, a parametric extension of the existential version of CTL. To this aim we define a bounded semantics and a translation from PRTECTL to SAT. An implementation of the algorithm for Elementary Net Systems is presented, together with experimental results.
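The essence of BMC, existentially unrolling the transition relation up to a bound k and asking a solver for a witness, can be mimicked with a brute-force bounded search (a toy sketch with an invented three-state system; real BMC encodes the k-step unrolling as a SAT instance, and the net-system translation in the paper is far richer):

```python
def bmc_reach(init, transitions, target, k):
    """Search for a path of at most k steps from init to a target state.

    transitions: dict mapping a state to its successor states.
    Returns the witness path, or None if no path exists within bound k.
    Stands in for a SAT query over a k-step unrolling.
    """
    frontier = [[init]]
    for _ in range(k + 1):
        next_frontier = []
        for path in frontier:
            if path[-1] == target:
                return path            # witness found within the bound
            for succ in transitions.get(path[-1], []):
                next_frontier.append(path + [succ])
        frontier = next_frontier
    return None                        # no witness within the bound

# Toy system: s0 -> s1 -> s2, with a self-loop on s0.
trans = {"s0": ["s0", "s1"], "s1": ["s2"]}
assert bmc_reach("s0", trans, "s2", k=1) is None   # bound too small
path = bmc_reach("s0", trans, "s2", k=2)
# path == ["s0", "s1", "s2"]
```

Increasing k until a witness appears (or a completeness threshold is reached) is exactly the loop a BMC tool runs, with the SAT solver replacing the explicit enumeration.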
Verification of an IGBT Fusing Switch for Over-current Protection of the SNS HVCM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benwell, Andrew; Kemp, Mark; Burkhart, Craig
2010-06-11
An IGBT-based over-current protection system has been developed to detect faults and limit the damage they cause in high voltage converter modulators. During normal operation, an IGBT enables energy to be transferred from storage capacitors to an H-bridge. When a fault occurs, the over-current protection system detects it, limits the fault current, and opens the IGBT to isolate the remaining stored energy from the fault. This paper presents an experimental verification of the over-current protection system under applicable conditions.
Low level vapor verification of monomethyl hydrazine
NASA Technical Reports Server (NTRS)
Mehta, Narinder
1990-01-01
The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.
NASA Astrophysics Data System (ADS)
Viscardi, Massimo; Arena, Maurizio; Ciminello, Monica; Guida, Michele; Meola, Carosena; Cerreta, Pietro
2018-03-01
The development of advanced monitoring systems for strain measurements on aeronautical components remains an important goal, both for optimizing the lead time and cost of part validation, allowing earlier entry into service, and for implementing health monitoring systems dedicated to in-service parameter verification and early-stage detection of structural problems. The paper deals with the experimental testing of a set of composite samples of the main landing gear bay for a CS-25 category aircraft, realized through an innovative design and production process. The tests represented a good opportunity for direct comparison of different strain measurement techniques: strain gauges (SG) and fiber Bragg gratings (FBG) were used, as well as non-contact techniques, specifically digital image correlation (DIC) and infrared (IR) thermography, applied where possible in order to highlight possible hot spots during the tests. Identification of the critical points on the specimens was supported by advanced finite element simulations, aimed at assessing the structural strength and deformation as well as ensuring the best performance and the global safety of the whole experimental campaign.
Experimental setup for the measurement of induction motor cage currents
NASA Astrophysics Data System (ADS)
Bottauscio, Oriano; Chiampi, Mario; Donadio, Lorenzo; Zucca, Mauro
2005-04-01
An experimental setup for measuring the currents flowing in the rotor bars of induction motors during synchronous no-load tests is described in the paper. Experimental verification of the high-frequency phenomena in the rotor cage is fundamental for a deeper insight into additional-loss estimation by numerical methods. Attention is mainly focused on the analysis and design of the transducers developed for the cage current measurement.
Improvement of INVS Measurement Uncertainty for Pu and U-Pu Nitrate Solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swinhoe, Martyn Thomas; Menlove, Howard Olsen; Marlow, Johnna Boulds
2017-04-27
At the Tokai Reprocessing Plant (TRP) and the Plutonium Conversion Development Facility (PCDF), a large amount of plutonium nitrate solution recovered from light water reactors (LWRs) and the advanced thermal reactor (ATR) FUGEN is stored. Since the solution is designated as a direct-use material, periodic inventory verification and flow verification are conducted by the Japan Safeguard Government Office (JSGO) and the International Atomic Energy Agency (IAEA).
Experimental quantum verification in the presence of temporally correlated noise
NASA Astrophysics Data System (ADS)
Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.
2018-02-01
Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution, for rapidly and slowly varying noise, respectively. Similarly we find a strong gate-set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or depolarising noise processes, highlighting the critical interplay of the selected gate set and the gauge optimisation process in determining the meaning of the reported diamond norm in correlated noise environments.
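As a hedged illustration (this is a toy model, not the authors' simulation), the contrast between the two noise regimes can be sketched numerically: quasi-DC errors add coherently over a sequence, giving a broad, highly skewed infidelity distribution, while rapidly varying errors add incoherently, giving a nearly Gaussian one. All parameter values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
J, shots, sigma = 100, 20000, 0.002  # gates per sequence, sequences, error scale

# Quasi-DC noise: one over-rotation error per sequence, accumulated coherently.
eps_dc = rng.normal(0.0, sigma, shots)
infid_dc = (J * eps_dc) ** 2 / 4          # ~ chi^2(1)-shaped: broad, skewed

# Rapidly varying ("white") noise: an independent error per gate, contributions
# adding incoherently, as after twirling by random gates.
eps_w = rng.normal(0.0, sigma, (shots, J))
infid_w = (eps_w ** 2).sum(axis=1) / 4    # ~ gamma(J/2): nearly Gaussian

def skewness(x):
    x = x - x.mean()
    return (x ** 3).mean() / (x ** 2).mean() ** 1.5

print(skewness(infid_dc), skewness(infid_w))  # skewed vs. near-symmetric
```

The quasi-DC sample has skewness near the chi-squared value of about 2.8, while the incoherent sample is close to symmetric, qualitatively matching the gamma-distribution behaviour reported in the abstract.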
2011-01-01
Purpose: To verify the dose distribution and number of monitor units (MU) for dynamic treatment techniques like volumetric modulated single arc radiation therapy (Rapid Arc), each patient treatment plan has to be verified prior to the first treatment. The purpose of this study was to develop a patient-related treatment plan verification protocol using a two-dimensional ionization chamber array (MatriXX, IBA, Schwarzenbruck, Germany). Method: Measurements were done to determine the dependence between the response of the 2D ionization chamber array, beam direction, and field size. The reproducibility of the measurements was also checked. For the patient-related verifications, the original patient Rapid Arc treatment plan was projected onto a CT dataset of the MatriXX and the dose distribution was calculated. After irradiation of the Rapid Arc verification plans, measured and calculated 2D dose distributions were compared using the gamma evaluation method implemented in the measuring software OmniPro (version 1.5, IBA, Schwarzenbruck, Germany). Results: The dependence between the response of the 2D ionization chamber array, field size, and beam direction showed a passing rate of 99% for field sizes between 7 cm × 7 cm and 24 cm × 24 cm for measurements of a single arc. For field sizes smaller than 7 cm × 7 cm or larger than 24 cm × 24 cm, the passing rate was less than 99%. The reproducibility was within a passing rate of 99% to 100%. The accuracy of the whole process, including the uncertainty of the measuring system, treatment planning system, linear accelerator, and isocentric laser system in the treatment room, was acceptable for treatment plan verification using gamma criteria of 3% and 3 mm (2D global gamma index). Conclusion: It was possible to verify the 2D dose distribution and MU of Rapid Arc treatment plans using the MatriXX, and its use for Rapid Arc treatment plan verification in clinical routine is reasonable. 
Provided the passing rate threshold is set at 99%, the verification protocol is able to detect clinically significant errors. PMID:21342509
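The global gamma evaluation used above can be sketched in one dimension. This is a minimal illustration, not the OmniPro implementation: for each measured point, the gamma value is the minimum over the calculated profile of the combined dose-difference and distance-to-agreement metric, and a point passes when gamma ≤ 1. The toy profiles and grid are hypothetical.

```python
import numpy as np

def gamma_index_1d(pos, dose_meas, dose_calc, dd=0.03, dta=3.0):
    """Global 1D gamma index: dd is the dose criterion as a fraction of the
    global maximum calculated dose, dta the distance-to-agreement in mm."""
    d_ref = dd * dose_calc.max()
    gam = np.empty_like(dose_meas)
    for i, (xi, dm) in enumerate(zip(pos, dose_meas)):
        dist2 = ((pos - xi) / dta) ** 2          # squared distance term
        dose2 = ((dose_calc - dm) / d_ref) ** 2  # squared dose-difference term
        gam[i] = np.sqrt((dist2 + dose2).min())
    return gam

# Toy profiles on a 1 mm grid: calculation vs. a measurement shifted by 1 mm.
x = np.arange(0.0, 100.0, 1.0)
calc = np.exp(-((x - 50.0) / 15.0) ** 2)
meas = np.exp(-((x - 51.0) / 15.0) ** 2)
gamma = gamma_index_1d(x, meas, calc)
passing = (gamma <= 1.0).mean() * 100   # percentage of points with gamma <= 1
```

A 1 mm spatial shift passes a 3%/3 mm criterion everywhere, which is why the protocol's 99% threshold is sensitive only to clinically significant deviations.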
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
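The core transform described above is easy to sketch. The following is a minimal illustration (dimensions and scaling are assumptions, not the paper's exact settings): project a feature vector with a matrix of i.i.d. Gaussian entries; with the usual 1/√k scaling, pairwise distances are approximately preserved, which underlies the similarity-preserving property.

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 1024, 128   # original and projected dimensionality (illustrative values)

# Random projection matrix with i.i.d. N(0, 1) entries; the 1/sqrt(k) scaling
# makes the transform approximately distance-preserving in expectation.
R = rng.normal(0.0, 1.0, (k, n)) / np.sqrt(k)

x1 = rng.normal(0.0, 1.0, n)   # stand-ins for two biometric feature vectors
x2 = rng.normal(0.0, 1.0, n)

d_orig = np.linalg.norm(x1 - x2)
d_proj = np.linalg.norm(R @ x1 - R @ x2)
ratio = d_proj / d_orig        # concentrates near 1 for sufficiently large k
```

Regenerating R from a fresh seed yields a new, revocable template from the same biometric data, which is the changeability property the paper analyzes.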
Character Recognition Method by Time-Frequency Analyses Using Writing Pressure
NASA Astrophysics Data System (ADS)
Watanabe, Tatsuhito; Katsura, Seiichiro
With the development of information and communication technology, personal verification is becoming more and more important. In the future ubiquitous society, terminals handling personal information will require personal verification technology. The signature is one such method; however, a signature contains only a limited number of characters and can therefore be forged easily, making personal identification from handwriting alone difficult. This paper proposes a "haptic pen" that extracts the writing pressure, and presents a character recognition method based on time-frequency analysis. Although the shapes of characters written by different writers are similar, differences appear in the time-frequency domain. As a result, the proposed character recognition can be used for more exact personal identification. The experimental results showed the viability of the proposed method.
ERIC Educational Resources Information Center
Wong, Siu-ling; Chun, Ka-wai Cecilia; Mak, Se-yuen
2007-01-01
We describe a physics investigation project inspired by one of the adventures of Odysseus in Homer's "Odyssey." The investigation uses the laws of mechanics, vector algebra and a simple way to construct a fan-and-sail-cart for experimental verification.
Resistivity Correction Factor for the Four-Probe Method: Experiment III
NASA Astrophysics Data System (ADS)
Yamashita, Masato; Nishii, Toshifumi; Kurihara, Hiroshi; Enjoji, Hideo; Iwata, Atsushi
1990-04-01
Experimental verification of the theoretically derived resistivity correction factor F is presented. Factor F is applied to a system consisting of a rectangular parallelepiped sample and a square four-probe array. Resistivity and sheet resistance measurements are made on isotropic graphites and crystalline ITO films. Factor F corrects experimental data and leads to reasonable resistivity and sheet resistance.
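The generic pattern of applying a geometric correction factor can be sketched as follows. Note the assumptions: the π/ln 2 prefactor is the textbook limit for a collinear four-probe array on a thin, laterally infinite sample, not the paper's factor F for a square array on a rectangular parallelepiped; the function name and values are illustrative only.

```python
import math

def sheet_resistance(v, i, f=1.0):
    """Sheet resistance (ohms/square) from a four-probe V/I reading.
    pi/ln(2) is the ideal thin-sample, collinear-array geometric factor;
    f is an additional correction factor (analogous to the paper's F)
    accounting for finite sample geometry and probe arrangement."""
    return (math.pi / math.log(2)) * (v / i) * f

# Illustrative reading: 4.532 mV at 1 mA with f = 1 (ideal geometry).
rs = sheet_resistance(v=4.532e-3, i=1.0e-3)
```

For the finite samples studied in the paper, f would be taken from the derived factor F for the given sample dimensions and probe spacing, which is exactly what makes the corrected data "reasonable".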
NASA Astrophysics Data System (ADS)
Hönicke, Philipp; Kolbe, Michael; Müller, Matthias; Mantler, Michael; Krämer, Markus; Beckhoff, Burkhard
2014-10-01
An experimental method for verifying the individually different energy dependencies of the L1-, L2-, and L3-subshell photoionization cross sections is described. The results obtained for Pd and Mo are well in line with theory regarding both energy dependence and absolute values; they confirm the theoretically calculated cross sections by Scofield from the early 1970s and, partially, more recent data by Trzhaskovskaya, Nefedov, and Yarzhemsky. The data also call into question quantitative x-ray spectroscopic results based on the widely used fixed-jump-ratio approximation of the cross sections with energy-independent ratios. The experiments were carried out using the radiometrically calibrated instrumentation of the Physikalisch-Technische Bundesanstalt at the electron storage ring BESSY II in Berlin; the measured fluorescence intensities are thereby calibrated at an absolute level, in reference to the International System of Units. Experimentally determined fixed fluorescence line ratios for each subshell are used for a reliable deconvolution of overlapping fluorescence lines. The relevant fundamental parameters of Mo and Pd are also determined experimentally in order to calculate the subshell photoionization cross sections independently of any database.
NASA Astrophysics Data System (ADS)
Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.
2008-02-01
IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) verify our MLC model in these heterogeneous type geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung and bone equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).
2015-03-05
launched on its rocket - estimated completion date of May 2015. The Air Force will require verification that SpaceX can meet payload integration...design and accelerate integration capability at Space Exploration Technologies Corporation (SpaceX) launch sites. The Air Force does not intend to...accelerate integration capabilities at SpaceX launch sites because of the studies it directed, but will require verification that SpaceX can meet
Reynders, Truus; Tournel, Koen; De Coninck, Peter; Heymann, Steve; Vinh-Hung, Vincent; Van Parijs, Hilde; Duchateau, Michaël; Linthout, Nadine; Gevaert, Thierry; Verellen, Dirk; Storme, Guy
2009-10-01
Investigation of the use of TomoTherapy and TomoDirect versus conventional radiotherapy for the treatment of post-operative breast carcinoma. This study concentrates on the evaluation of the planning protocol for the TomoTherapy and TomoDirect TPS, dose verification, and the implementation of in vivo dosimetry. Eight patients with different breast cancer indications (left/right tumor, axillary nodes involvement (N+)/no nodes (N0), tumorectomy/mastectomy) were enrolled. TomoTherapy, TomoDirect, and conventional plans were generated for prone and supine positions, leading to six or seven plans per patient. Dose prescription was 42 Gy in 15 fractions over 3 weeks. Dose verification of a TomoTherapy plan is performed using TLDs and EDR2 film inside a home-made wax breast phantom fixed on a Rando-Alderson phantom. In vivo dosimetry was performed with TLDs. It is possible to create clinically acceptable plans with TomoTherapy and TomoDirect. The TLD calibration protocol with a water-equivalent phantom is accurate. TLD verification with the phantom shows measured-over-calculated ratios within 2.2% (PTV). An overresponse of the TLDs was observed in the low-dose regions (<0.1 Gy). The film measurements show good agreement for high- and low-dose regions inside the phantom. A sharp gradient can be created toward the thoracic wall. In vivo dosimetry with TLDs was clinically feasible. The TomoTherapy and TomoDirect modalities can deliver dose distributions which the radiotherapist judges to be equal to or better than conventional treatment of breast carcinoma according to the organ to be protected.
Analytical and Experimental Investigations of Sodium Heat Pipes and Thermal Energy Storage Systems.
1982-01-01
[List-of-figures residue: 5.1 Cylindrical container for eutectic salt (LiF-MgF2-KF); 5.2 TESC sample] Experimental results have been used to verify the melting point and latent heat of fusion of the eutectic salt, a mixture of the fluorides of Mg, Li, and K. A melting or solidification curve provides experimental verification of the latent heat value and melting point of a given eutectic salt.
Experimental verification of Pyragas-Schöll-Fiedler control.
von Loewenich, Clemens; Benner, Hartmut; Just, Wolfram
2010-09-01
We present an experimental realization of time-delayed feedback control proposed by Schöll and Fiedler. The scheme enables us to stabilize torsion-free periodic orbits in autonomous systems, and to overcome the so-called odd number limitation. The experimental control performance is in quantitative agreement with the bifurcation analysis of simple model systems. The results uncover some general features of the control scheme which are deemed to be relevant for a large class of setups.
Computer Modeling and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pronskikh, V. S.
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes.
Andreski, Michael; Myers, Megan; Gainer, Kate; Pudlo, Anthony
Determine the effects of an 18-month pilot project using tech-check-tech in 7 community pharmacies on 1) rate of dispensing errors not identified during refill prescription final product verification; 2) pharmacist workday task composition; and 3) amount of patient care services provided and the reimbursement status of those services. Pretest-posttest quasi-experimental study where baseline and study periods were compared. Pharmacists and pharmacy technicians in 7 community pharmacies in Iowa. The outcome measures were 1) percentage of technician verified refill prescriptions where dispensing errors were not identified on final product verification; 2) percentage of time spent by pharmacists in dispensing, management, patient care, practice development, and other activities; 3) the number of pharmacist patient care services provided per pharmacist hours worked; and 4) percentage of time that technician product verification was used. There was no significant difference in overall errors (0.2729% vs. 0.5124%, P = 0.513), patient safety errors (0.0525% vs. 0.0651%, P = 0.837), or administrative errors (0.2204% vs. 0.4784%, P = 0.411). Pharmacist's time in dispensing significantly decreased (67.3% vs. 49.06%, P = 0.005), and time in direct patient care (19.96% vs. 34.72%, P = 0.003), increased significantly. Time in other activities did not significantly change. Reimbursable services per pharmacist hour (0.11 vs. 0.30, P = 0.129), did not significantly change. Non-reimbursable services increased significantly (2.77 vs. 4.80, P = 0.042). Total services significantly increased (2.88 vs. 5.16, P = 0.044). Pharmacy technician product verification of refill prescriptions preserved dispensing safety while significantly increasing the time spent in delivery of pharmacist provided patient care services. The total number of pharmacist services provided per hour also increased significantly, driven primarily by a significant increase in the number of non-reimbursed services. 
This was most likely due to the increased time available to provide patient care. Reimbursed services per hour did not increase significantly, most likely due to a lack of payers. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Self-Directed Adult Learning: A Critical Paradigm Revisited.
ERIC Educational Resources Information Center
Caffarella, Rosemary S.; O'Donnell, Judith M.
1987-01-01
Seeks to analyze and categorize both data-based and conceptual articles on self-directed learning. Covers (1) verification studies, (2) nature of the method, (3) nature of the learner, (4) nature of the philosophical position, and (5) policy. Suggests future research topics. (Author/CH)
AIR QUALITY FORECAST VERIFICATION USING SATELLITE DATA
NOAA's operational geostationary satellite retrievals of aerosol optical depths (AODs) were used to verify National Weather Service (NWS) experimental (research-mode) particulate matter (PM2.5) forecast guidance issued during the summer 2004 International Consortium for Atmosp...
Ac electronic tunneling at optical frequencies
NASA Technical Reports Server (NTRS)
Faris, S. M.; Fan, B.; Gustafson, T. K.
1974-01-01
Rectification characteristics of non-superconducting metal-barrier-metal junctions deduced from electronic tunneling have been observed experimentally for optical frequency irradiation of the junction. The results provide verification of optical frequency Fermi level modulation and electronic tunneling current modulation.
4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study
NASA Astrophysics Data System (ADS)
Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia
2015-08-01
At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+ activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges on scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition, and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was similar to that in static reference measurements, demonstrating the potential of 4D PET-based treatment verification for future clinical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pugh, C.E.; Bass, B.R.; Keeney, J.A.
This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing, held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.
Circulation of spoof surface plasmon polaritons: Implementation and verification
NASA Astrophysics Data System (ADS)
Pan, Junwei; Wang, Jiafu; Qiu, Tianshuo; Pang, Yongqiang; Li, Yongfeng; Zhang, Jieqiu; Qu, Shaobo
2018-05-01
In this letter, we present the implementation and experimental verification of a broadband circulator for spoof surface plasmon polaritons (SSPPs). For ease of fabrication, a circulator operating in X band was designed first. Comb-like transmission lines (CL-TLs), a typical SSPP structure, are adopted as the three branches of the Y-junction. To enable broadband coupling of SSPPs, a transition section is added at each end of the CL-TLs. Through such a design, the circulator can operate in the sub-wavelength SSPP mode over a broad band. The simulation results show that the insertion loss is less than 0.5 dB while the isolation and return loss are higher than 20 dB over 9.4-12.0 GHz. A prototype was fabricated and measured. The experimental results are consistent with the simulations and verify the broadband circulation performance in X band.
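The figures of merit quoted above follow directly from the linear S-parameter magnitudes. As a minimal sketch (the linear magnitudes below are hypothetical, chosen only to reproduce numbers of the quoted order):

```python
import math

def db_loss(s):
    """Magnitude of an S-parameter expressed as a positive loss in dB."""
    return -20.0 * math.log10(abs(s))

# Hypothetical linear magnitudes for one port pair of a Y-junction circulator:
s21 = 0.944   # transmitted path -> insertion loss of roughly 0.5 dB
s31 = 0.10    # isolated path    -> isolation of 20 dB
s11 = 0.10    # reflected wave   -> return loss of 20 dB

il  = db_loss(s21)   # insertion loss
iso = db_loss(s31)   # isolation
rl  = db_loss(s11)   # return loss
```

An insertion loss below 0.5 dB thus means more than about 89% of the incident power is transmitted, while 20 dB isolation/return loss means no more than 1% leaks to the isolated port or is reflected.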
Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.
Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M
2000-02-01
Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.
Plasma Model V&V of Collisionless Electrostatic Shock
NASA Astrophysics Data System (ADS)
Martin, Robert; Le, Hai; Bilyeu, David; Gildea, Stephen
2014-10-01
A simple 1D electrostatic collisionless shock was selected as an initial validation and verification test case for a new plasma modeling framework under development at the Air Force Research Laboratory's In-Space Propulsion branch (AFRL/RQRS). Cross verification between PIC, Vlasov, and Fluid plasma models within the framework along with expected theoretical results will be shown. The non-equilibrium velocity distributions (VDF) captured by PIC and Vlasov will be compared to each other and the assumed VDF of the fluid model at selected points. Validation against experimental data from the University of California, Los Angeles double-plasma device will also be presented along with current work in progress at AFRL/RQRS towards reproducing the experimental results using higher fidelity diagnostics to help elucidate differences between model results and between the models and original experiment. DISTRIBUTION A: Approved for public release; unlimited distribution; PA (Public Affairs) Clearance Number 14332.
Wu, Xiaoping; Akgün, Can; Vaughan, J Thomas; Andersen, Peter; Strupp, John; Uğurbil, Kâmil; Van de Moortele, Pierre-François
2010-07-01
Parallel excitation holds strong promise to mitigate the impact of large transmit B1 (B1+) distortion at very high magnetic field. Accelerated RF pulses, however, inherently tend to require larger RF peak power, which may result in a substantial increase in the Specific Absorption Rate (SAR) in tissues, a constant concern for patient safety at very high field. In this study, we demonstrate adapted-rate RF pulse design allowing for SAR reduction while preserving excitation target accuracy. Compared with other proposed implementations of adapted-rate RF pulses, our approach is compatible with any k-space trajectory, does not require an analytical expression of the gradient waveform, and can be used for large flip angle excitation. We demonstrate our method with numerical simulations based on electromagnetic modeling, and we include an experimental verification of transmit pattern accuracy on an 8-transmit-channel 9.4 T system.
NASA Technical Reports Server (NTRS)
Fey, M. G.
1981-01-01
The experimental verification system for the production of silicon via arc-heater sodium reduction of SiCl4 was designed, fabricated, installed, and operated. Each of the attendant subsystems was checked out and operated to ensure performance requirements. These subsystems included: the arc heaters/reactor, cooling water system, gas system, power system, control and instrumentation system, Na injection system, SiCl4 injection system, effluent disposal system, and gas burn-off system. Prior to introducing the reactants (Na and SiCl4) to the arc heater/reactor, a series of gas-only power tests was conducted to establish the operating parameters of the three arc heaters of the system. Following the successful completion of the gas-only power tests and the readiness tests of the sodium and SiCl4 injection systems, a shakedown test of the complete experimental verification system was conducted.
High temperature furnace modeling and performance verifications
NASA Technical Reports Server (NTRS)
Smith, James E., Jr.
1992-01-01
Analytical, numerical, and experimental studies were performed on two classes of high-temperature materials processing sources for their potential use as directional solidification furnaces. The research concentrated on a commercially available high-temperature furnace using a zirconia ceramic tube as the heating element, and on an arc furnace based on a tube welder. The first objective was to assemble the zirconia furnace and construct the parts needed to perform experiments. The second objective was to evaluate the zirconia furnace's performance as a directional solidification furnace element. The third objective was to establish a database on materials used in the furnace construction, with particular emphasis on emissivities, transmissivities, and absorptivities as functions of wavelength and temperature. One- and two-dimensional spectral radiation heat transfer models were developed for comparison with standard modeling techniques and were used to predict wall and crucible temperatures. The fourth objective addressed the development of a SINDA model for the arc furnace, which was used to design sample holders and to estimate cooling-media temperatures for steady-state operation of the furnace. The fifth objective addressed the initial performance evaluation of the arc furnace and associated equipment for directional solidification. Results for these objectives are presented.
Development of automated optical verification technologies for control systems
NASA Astrophysics Data System (ADS)
Volegov, Peter L.; Podgornov, Vladimir A.
1999-08-01
The report considers optical techniques for the automated verification of an object's identity, designed for control systems of nuclear objects. Results of experimental research and of the development of pattern recognition techniques, carried out under ISTC project number 772, are presented; the aim is to identify unique features of the surface structure of a controlled object and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of US-Russian lab-to-lab cooperation, including the development of up-to-date systems for nuclear material control and accounting, are examined.
VeriClick: an efficient tool for table format verification
NASA Astrophysics Data System (ADS)
Nagy, George; Tamhankar, Mangesh
2012-01-01
The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.
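The critical-cell idea can be sketched on a toy CSV table. This is a simplification under stated assumptions: a single top-left data-region cell stands in for VeriClick's four critical cells, and the table, function names, and indices are all hypothetical.

```python
import csv
import io

RAW = """region,,Q1,Q2
North,A,10,11
North,B,12,13
South,A,14,15
"""

def segment(grid, data_row, data_col):
    """Partition a table grid given the (row, col) of the top-left cell of the
    data region: rows above it hold column headers, columns left of it hold
    the stub (row headers), and the rest is numeric data."""
    header = [row[data_col:] for row in grid[:data_row]]
    stub   = [row[:data_col] for row in grid[data_row:]]
    data   = [row[data_col:] for row in grid[data_row:]]
    return header, stub, data

grid = list(csv.reader(io.StringIO(RAW)))
header, stub, data = segment(grid, data_row=1, data_col=2)
```

Ground-truthing in such a tool then amounts to a human confirming or correcting those few critical coordinates, which is far faster than editing cells individually.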
A silicon strip detector array for energy verification and quality assurance in heavy ion therapy.
Debrot, Emily; Newall, Matthew; Guatelli, Susanna; Petasecca, Marco; Matsufuji, Naruhiro; Rosenfeld, Anatoly B
2018-02-01
The measurement of depth dose profiles for range and energy verification of heavy ion beams is an important aspect of quality assurance procedures for heavy ion therapy facilities. The steep dose gradients in the Bragg peak region of these profiles require the use of detectors with high spatial resolution. The aim of this work is to characterize a one dimensional monolithic silicon detector array called the "serial Dose Magnifying Glass" (sDMG) as an independent ion beam energy and range verification system used for quality assurance conducted for ion beams used in heavy ion therapy. The sDMG detector consists of two linear arrays of 128 silicon sensitive volumes each with an effective size of 2mm × 50μm × 100μm fabricated on a p-type substrate at a pitch of 200 μm along a single axis of detection. The detector was characterized for beam energy and range verification by measuring the response of the detector when irradiated with a 290 MeV/u 12 C ion broad beam incident along the single axis of the detector embedded in a PMMA phantom. The energy of the 12 C ion beam incident on the detector and the residual energy of an ion beam incident on the phantom was determined from the measured Bragg peak position in the sDMG. Ad hoc Monte Carlo simulations of the experimental setup were also performed to give further insight into the detector response. The relative response profiles along the single axis measured with the sDMG detector were found to have good agreement between experiment and simulation with the position of the Bragg peak determined to fall within 0.2 mm or 1.1% of the range in the detector for the two cases. The energy of the beam incident on the detector was found to vary less than 1% between experiment and simulation. The beam energy incident on the phantom was determined to be (280.9 ± 0.8) MeV/u from the experimental and (280.9 ± 0.2) MeV/u from the simulated profiles. These values coincide with the expected energy of 281 MeV/u. 
The sDMG detector response was studied experimentally and characterized using a Monte Carlo simulation. The sDMG detector was found to accurately determine the ¹²C beam energy and is suited for fast energy and range verification quality assurance. It is proposed that the sDMG is also applicable for verification of treatment planning systems that rely on particle range. © 2017 American Association of Physicists in Medicine.
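The sub-pitch Bragg peak localization reported above (0.2 mm against a 200 μm detector pitch) can be sketched with a standard three-point parabolic refinement of the sampled depth-dose maximum. This is a generic illustration, not the authors' analysis code; the function name, the Gaussian-like test profile, and the 0.2 mm pitch default are illustrative assumptions.

```python
def bragg_peak_position_mm(dose, pitch_mm=0.2):
    """Locate the Bragg peak in a sampled depth-dose profile.

    Refines the index of the maximum sample with a three-point
    parabolic fit, giving sub-pitch resolution.
    """
    i = max(range(len(dose)), key=dose.__getitem__)
    delta = 0.0
    if 0 < i < len(dose) - 1:
        y0, y1, y2 = dose[i - 1], dose[i], dose[i + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom != 0.0:
            delta = 0.5 * (y0 - y2) / denom  # sub-sample offset in [-0.5, 0.5]
    return (i + delta) * pitch_mm
```

On a smooth peak, the parabolic offset recovers the maximum to a small fraction of the sampling pitch, consistent in spirit with the sub-pitch agreement quoted in the abstract.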
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan
2017-08-29
This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases was identified for which experimental data from a Siemens 2.3 MW machine were available and in accordance with the International Electrotechnical Commission 61400-13 guidelines. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. A detailed analysis compares results from FAST with those from BHawC as well as experimental measurements, using statistics including the means, the standard deviations, and the power spectral densities of select turbine parameters and loads. Results indicate good agreement among the predictions using FAST, BHawC, and experimental measurements. These agreements are discussed in detail, along with some comments regarding the differences seen in the comparisons relative to the inherent uncertainties in such a model-based analysis.
Johnson, Kennita A; Vormohr, Hannah R; Doinikov, Alexander A; Bouakaz, Ayache; Shields, C Wyatt; López, Gabriel P; Dayton, Paul A
2016-05-01
Acoustophoresis uses acoustic radiation force to remotely manipulate particles suspended in a host fluid for many scientific, technological, and medical applications, such as acoustic levitation, acoustic coagulation, contrast ultrasound imaging, ultrasound-assisted drug delivery, etc. To estimate the magnitude of acoustic radiation forces, equations derived for an inviscid host fluid are commonly used. However, there are theoretical predictions that, in the case of a traveling wave, viscous effects can dramatically change the magnitude of acoustic radiation forces, which make the equations obtained for an inviscid host fluid invalid for proper estimation of acoustic radiation forces. To date, experimental verification of these predictions has not been published. Experimental measurements of viscous effects on acoustic radiation forces in a traveling wave were conducted using a confocal optical and acoustic system and values were compared with available theories. Our results show that, even in a low-viscosity fluid such as water, the magnitude of acoustic radiation forces is increased manyfold by viscous effects in comparison with what follows from the equations derived for an inviscid fluid.
Application of additive laser technologies in the gas turbine blades design process
NASA Astrophysics Data System (ADS)
Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.
2017-11-01
The emergence of modern innovative technologies requires delivering new, and modernizing existing, design and production processes. This is especially relevant for designing the high-temperature turbines of gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. This article presents a design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems by testing a blade prototype fabricated using selective laser melting. The technique was proven during development of the first-stage blade cooling system for the high-pressure turbine. An experimental procedure was developed for verifying a thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat. The technique makes it possible to obtain an experimentally tested blade version and to exclude its experimental adjustment after the start of mass production.
Caswell, Joseph M; Singh, Manraj; Persinger, Michael A
2016-08-01
Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings. Copyright © 2016 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.
Yu, Ron X.; Liu, Jie; True, Nick; Wang, Wei
2008-01-01
A major challenge in the post-genome era is to reconstruct regulatory networks from the biological knowledge accumulated to date. The development of tools for identifying direct target genes of transcription factors (TFs) is critical to this endeavor. Given a set of microarray experiments, a probabilistic model called TRANSMODIS has been developed which can infer the direct targets of a TF by integrating sequence motif, gene expression and ChIP-chip data. The performance of TRANSMODIS was first validated on a set of transcription factor perturbation experiments (TFPEs) involving Pho4p, a well-studied TF in Saccharomyces cerevisiae. TRANSMODIS removed elements of arbitrariness from the manual target-gene selection process and produced results that concur with intuition. TRANSMODIS was further validated on a genome-wide scale by comparing it with two other methods in Saccharomyces cerevisiae. The usefulness of TRANSMODIS was then demonstrated by applying it to the identification of direct targets of DAF-16, a critical TF regulating ageing in Caenorhabditis elegans. We found that 189 genes were tightly regulated by DAF-16. In addition, DAF-16 has differential preference for motifs when acting as an activator or repressor, which awaits experimental verification. TRANSMODIS is computationally efficient and robust, making it a useful probabilistic framework for finding immediate targets. PMID:18350157
Experimental verification of multidimensional quantum steering
NASA Astrophysics Data System (ADS)
Li, Che-Ming; Lo, Hsin-Pin; Chen, Liang-Yu; Yabushita, Atsushi
2018-03-01
Quantum steering enables one party to communicate with another remote party even if the sender is untrusted. Such characteristics of quantum systems not only provide direct applications to quantum information science, but are also conceptually important for distinguishing between quantum and classical resources. While concrete illustrations of steering have been shown in several experiments, quantum steering has not been certified for higher dimensional systems. Here, we introduce a simple method to experimentally certify two different kinds of quantum steering: Einstein-Podolsky-Rosen (EPR) steering and single-system (SS) steering (i.e., temporal steering), for dimensionality (d) up to d = 16. The former reveals the steerability among bipartite systems, whereas the latter manifests itself in single quantum objects. We use multidimensional steering witnesses to verify EPR steering of polarization-entangled pairs and SS steering of single photons. The ratios between the measured witnesses and the maximum values achieved by classical mimicries are observed to increase with d for both EPR and SS steering. The designed scenario offers a new method for further study of genuine multipartite steering of large dimensionality, with potential uses in quantum information processing.
NASA Astrophysics Data System (ADS)
Khait, A.; Shemer, L.
2018-05-01
The evolution of unidirectional wave trains containing a wave that gradually becomes steep is evaluated experimentally and numerically using the Boundary Element Method (BEM). The boundary conditions for the nonlinear numerical simulations corresponded to the actual movements of the wavemaker paddle as recorded in the physical experiments, allowing direct comparison between the experimentally measured characteristics of the wave train and the numerical predictions. The high level of qualitative and quantitative agreement between the measurements and simulations validated the kinematic criterion for the inception of breaking and the location of the spilling breaker on the basis of the BEM computations and associated experiments. The breaking inception is associated with the fluid particle at the crest of the steep wave being accelerated to match and surpass the crest velocity. The previously observed significant slow-down of the crest while approaching breaking is verified numerically; both narrow- and broad-banded wave trains are considered. Finally, the relative importance of linear and nonlinear contributions is analyzed.
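The kinematic inception criterion validated above reduces to a ratio test between the horizontal velocity of the fluid particle at the crest and the crest propagation velocity. A minimal sketch of that idealized textbook form (the function name and the exact threshold of 1.0 are illustrative, not the authors' implementation):

```python
def breaking_inception(u_crest_particle, c_crest, threshold=1.0):
    """Kinematic criterion: breaking begins once the crest-particle
    speed matches or exceeds the crest propagation speed."""
    return u_crest_particle / c_crest >= threshold
```

In practice the crest velocity itself slows down approaching breaking, as the abstract notes, so both quantities must be tracked along the evolving crest.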
Combustor Operability and Performance Verification for HIFiRE Flight 2
NASA Technical Reports Server (NTRS)
Storch, Andrea M.; Bynum, Michael; Liu, Jiwen; Gruber, Mark
2011-01-01
As part of the Hypersonic International Flight Research Experimentation (HIFiRE) Direct-Connect Rig (HDCR) test and analysis activity, three-dimensional computational fluid dynamics (CFD) simulations were performed using two Reynolds-Averaged Navier Stokes solvers. Measurements obtained from ground testing in the NASA Langley Arc-Heated Scramjet Test Facility (AHSTF) were used to specify inflow conditions for the simulations and combustor data from four representative tests were used as benchmarks. Test cases at simulated flight enthalpies of Mach 5.84, 6.5, 7.5, and 8.0 were analyzed. Modeling parameters (e.g., turbulent Schmidt number and compressibility treatment) were tuned such that the CFD results closely matched the experimental results. The tuned modeling parameters were used to establish a standard practice in HIFiRE combustor analysis. Combustor performance and operating mode were examined and were found to meet or exceed the objectives of the HIFiRE Flight 2 experiment. In addition, the calibrated CFD tools were then applied to make predictions of combustor operation and performance for the flight configuration and to aid in understanding the impacts of ground and flight uncertainties on combustor operation.
2009-10-01
will guarantee a solid base for the future. The content of this publication has been reproduced directly from material supplied by RTO or the...intensity threat involving a local population wanting to break into the camp to steal material and food supplies; and • A higher intensity threat...combatant evacuation operations, distribute emergency supplies, and evacuate/relocate refugees and displaced persons. Specified NLW-relevant tasks are
Developing a NASA strategy for the verification of large space telescope observatories
NASA Astrophysics Data System (ADS)
Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie
2006-06-01
In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.
Drumm, Daniel W; Greentree, Andrew D
2017-11-07
Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer, Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the contexts of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case bisectioning with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisectioning with verification is shown to be the most efficient heuristic of the family in a majority of cases.
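The noise-free baseline against which the half-lie heuristics above are measured is ordinary bisection: isolating one fluorescent region among N candidates with truthful "is the target in this half?" queries costs ⌈log₂ N⌉ queries. A minimal query-counting sketch of that baseline only (the verification penalty and the trisection heuristics from the paper are not reproduced here):

```python
def bisection_queries(n_cells, target):
    """Count truthful 'is the target in the left half?' queries
    needed to isolate one cell among n_cells (half-open interval)."""
    lo, hi = 0, n_cells
    queries = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        queries += 1
        if target < mid:   # oracle says: target lies in [lo, mid)
            hi = mid
        else:
            lo = mid
    return queries
```

Against this ⌈log₂ N⌉ baseline, the paper's result is that insisting on verification by positive result costs a further 50% in average queries even with zero half-lies.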
AGARD Index of Publications, 1977 - 1979.
1980-08-01
therefore, experimental verification of calculation Walter Schutz In AGARD Fracture Mach Design Methodology methods, hypotheses etc. is very time...2-, 3-, or 4-D navigation initial stages of preliminary design analysis The state of the art systems and allows experimental or theoretical...Technol. on Weapons Systems Design Dot. 1978 1ES WITH WINGS NON-LEVEL 23 01 AERONAUTICS (GENERAL) J Stanley Ausman In AGARD The Impact of Integrated
Strength of bolted wood joints with various ratios of member thicknesses
Thomas Lee Wilkinson
1978-01-01
Procedures have been recommended, such as in the National Design Specification, for design of bolted joints in wood members where the side members are thicker or thinner than half the main member thickness. However, these recommendations have had no experimental verification up to now. The same is true for joints with other than three members. This study experimentally...
1987-06-01
non-propagating cracks should be considered and maximum principal strain amplitude is the controlling parameter. FATIGUE DAMAGE MAPS The preceding...fatigue is strain-controlled and not stress-controlled. The small effect of R-ratio suggested by Figure 2 may simply reflect the high experimental ...present a model (and its experimental verification) describing non-damaging notches in fatigue. EFFECT OF GRAIN SIZE AND TEMPERATURE In this part we shall
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Lo, R. Y.
1987-01-01
Modeling of SEU has been done in a CMOS static RAM containing 1-micron-channel-length transistors fabricated from a p-well epilayer process using both circuit-simulation and numerical-simulation techniques. The modeling results have been experimentally verified with the aid of heavy-ion beams obtained from a three-stage tandem van de Graaff accelerator. Experimental evidence for a novel SEU mode in an ON n-channel device is presented.
A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware
NASA Astrophysics Data System (ADS)
Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun
During the course of designing a time management algorithm for DVEs, researchers are often made inefficient by the distraction of implementing the trivial but fundamental details of simulation and verification. A platform that already realizes these details is therefore desirable; however, to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO, and TSO. The experimental results show that the implementation of the platform incurs only a small overhead, and that its performance is high enough to let researchers focus solely on the design of their algorithms.
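The timestamp-ordered (TSO) delivery discipline compared above can be illustrated with a priority queue keyed on timestamps: events received out of order are still delivered in nondecreasing timestamp order. This is a toy sketch of the delivery discipline only, with illustrative class and method names, not the platform's HLA-conformant interfaces:

```python
import heapq

class TsoQueue:
    """Toy timestamp-order (TSO) delivery queue: events are delivered
    in nondecreasing timestamp order, whatever their arrival order."""
    def __init__(self):
        self._heap = []
        self._seq = 0  # arrival counter breaks ties between equal timestamps

    def receive(self, timestamp, event):
        heapq.heappush(self._heap, (timestamp, self._seq, event))
        self._seq += 1

    def deliver(self):
        timestamp, _, event = heapq.heappop(self._heap)
        return timestamp, event
```

A receive-order (RO) service, by contrast, would simply deliver in `_seq` order; causal order (CO) additionally requires the middleware to track causal dependencies, which is where the causality consistency layer comes in.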
Static test induced loads verification beyond elastic limit
NASA Technical Reports Server (NTRS)
Verderaime, V.; Harrington, F.
1996-01-01
Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.
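The back-to-back strain-gauge technique described above rests on the standard thin-section decomposition of surface strains into a membrane (axial) component and a bending component, whose ratio tracks moment-to-axial load through the elastic-to-inelastic transition. A sketch of those textbook relations (function names are illustrative; the authors' monitoring procedure is more involved):

```python
def membrane_and_bending(strain_front, strain_back):
    """Decompose back-to-back surface strains into membrane (axial)
    and bending components, using the standard thin-section relations:
        membrane = (front + back) / 2
        bending  = (front - back) / 2
    """
    membrane = 0.5 * (strain_front + strain_back)
    bending = 0.5 * (strain_front - strain_back)
    return membrane, bending

def moment_to_axial_ratio(strain_front, strain_back):
    """Ratio whose consecutive changes flag deviations of the induced
    moment from the specified test loads."""
    membrane, bending = membrane_and_bending(strain_front, strain_back)
    return bending / membrane
```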
Study of the penetration of a plate made of titanium alloy VT6 with a steel ball
NASA Astrophysics Data System (ADS)
Buzyurkin, A. E.
2018-03-01
The purpose of this work is the development and verification of mathematical relationships, adapted to the LS-DYNA finite element analysis package, describing the deformation and destruction of a titanium plate in a high-speed collision. Using data from experiments on the interaction of a steel ball with a titanium plate made of VT6 alloy, the available constants necessary for describing the behavior of the material with the Johnson-Cook relationships were verified, as were the parameters of the fracture model used in the numerical modeling of the collision process. An analysis of experimental data on the interaction of a spherical impactor with a plate showed that the constants adopted for VT6 alloy as a first approximation of strain hardening in the Johnson-Cook model overestimate the residual velocities of the impactor when it pierces the plate.
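The Johnson-Cook relation referenced above has the standard flow-stress form σ = (A + Bεⁿ)(1 + C ln ε̇*)(1 − T*ᵐ). A sketch with the constants left as parameters; the default values below are commonly cited Ti-6Al-4V-like placeholders, not the verified VT6 constants from the study:

```python
import math

def johnson_cook_stress(eps_p, eps_rate_star, t_star,
                        A=1098e6, B=1092e6, n=0.93, C=0.014, m=1.1):
    """Johnson-Cook flow stress [Pa].

    eps_p         -- equivalent plastic strain
    eps_rate_star -- strain rate normalized by the reference rate
    t_star        -- homologous temperature (T - Troom) / (Tmelt - Troom)

    Default constants are illustrative Ti-6Al-4V-like values only.
    """
    hardening = A + B * eps_p ** n            # strain hardening term
    rate = 1.0 + C * math.log(eps_rate_star)  # strain-rate term
    thermal = 1.0 - t_star ** m               # thermal softening term
    return hardening * rate * thermal
```

Overestimated residual velocities of the kind reported suggest the hardening term (B, n) needs adjustment against the plate-penetration data.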
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification framework for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
Text Mining in Cancer Gene and Pathway Prioritization
Luo, Yuan; Riedlinger, Gregory; Szolovits, Peter
2014-01-01
Prioritization of cancer implicated genes has received growing attention as an effective way to reduce wet lab cost by computational analysis that ranks candidate genes according to the likelihood that experimental verifications will succeed. A multitude of gene prioritization tools have been developed, each integrating different data sources covering gene sequences, differential expressions, function annotations, gene regulations, protein domains, protein interactions, and pathways. This review places existing gene prioritization tools against the backdrop of an integrative Omic hierarchy view toward cancer and focuses on the analysis of their text mining components. We explain the relatively slow progress of text mining in gene prioritization, identify several challenges to current text mining methods, and highlight a few directions where more effective text mining algorithms may improve the overall prioritization task and where prioritizing the pathways may be more desirable than prioritizing only genes. PMID:25392685
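Tools that integrate several data sources must ultimately merge per-source gene rankings into one priority list. A generic average-rank (Borda-style) aggregation sketch, not the method of any specific tool reviewed in the paper; gene names are placeholders:

```python
def aggregate_ranks(rankings):
    """Combine several rankings of the same candidate genes by average
    rank (lower is better). `rankings` is a list of lists, each an
    ordering of the same gene identifiers."""
    genes = rankings[0]
    avg = {g: sum(r.index(g) for r in rankings) / len(rankings)
           for g in genes}
    # Sort by average rank, tie-broken alphabetically for determinism.
    return sorted(genes, key=lambda g: (avg[g], g))
```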
Joint Estimation of Source Range and Depth Using a Bottom-Deployed Vertical Line Array in Deep Water
Li, Hui; Yang, Kunde; Duan, Rui; Lei, Zhixiong
2017-01-01
This paper presents a joint estimation method of source range and depth using a bottom-deployed vertical line array (VLA). The method utilizes the information on the arrival angle of direct (D) path in space domain and the interference characteristic of D and surface-reflected (SR) paths in frequency domain. The former is related to a ray tracing technique to backpropagate the rays and produces an ambiguity surface of source range. The latter utilizes Lloyd’s mirror principle to obtain an ambiguity surface of source depth. The acoustic transmission duct is the well-known reliable acoustic path (RAP). The ambiguity surface of the combined estimation is a dimensionless ad hoc function. Numerical efficiency and experimental verification show that the proposed method is a good candidate for initial coarse estimation of source position. PMID:28590442
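The D/SR interference exploited above follows the textbook Lloyd's mirror geometry: mirroring the source across the pressure-release surface gives an image source, and the path difference between the surface-reflected and direct arrivals sets the spacing of the interference nulls in frequency. A geometric sketch (the null frequencies assume the idealized phase-reversing surface; `c` is an assumed sound speed, and this is not the paper's full ambiguity-surface method):

```python
import math

def lloyds_mirror_nulls(range_m, src_depth, rcv_depth, c=1500.0, n_nulls=3):
    """Direct/surface-reflected path lengths and the first few
    interference-null frequencies for the idealized Lloyd's mirror."""
    direct = math.hypot(range_m, rcv_depth - src_depth)
    reflected = math.hypot(range_m, rcv_depth + src_depth)  # image source
    delta = reflected - direct  # path difference
    # With the pi phase shift at the sea surface, nulls fall where the
    # path difference equals an integer number of wavelengths.
    nulls_hz = [n * c / delta for n in range(1, n_nulls + 1)]
    return direct, reflected, nulls_hz
```

Because the null spacing depends on source depth through the path difference, matching the observed interference pattern against this relation yields the depth ambiguity surface described in the abstract.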
Turbokon scientific and production implementation company—25 years of activity
NASA Astrophysics Data System (ADS)
Favorskii, O. N.; Leont'ev, A. I.; Milman, O. O.
2016-05-01
The main results of studies performed at ZAO Turbokon NPVP in cooperation with leading Russian scientific organizations during 25 years of its activity in the field of development of unique, ecologically clean electric power and heat production technologies are described. They include the development and experimental verification, using prototypes and full-scale models, of highly efficient air-cooled condensers for steam turbines; a high-temperature gas-steam turbine for stationary and transport power engineering; and a nonfuel technology of electric power production using steam turbine installations with a unit power of 4-20 MW at gas-main pipelines, industrial boiler houses, and heat stations. The results of efforts in the field of reducing the vibroactivity of power equipment for transport installations are given. Basic directions of further research for increasing the efficiency and ecological safety of domestic power engineering are discussed.
NASA Astrophysics Data System (ADS)
Kondo, Hirotaka; Fujimoto, Kazuhiro J.; Tanaka, Shigenori; Deki, Hiroyuki; Nakamura, Takashi
2015-03-01
L-2-Haloacid dehalogenase (L-DEX YL) is a member of a family of enzymes that decontaminate a variety of environmental pollutants such as L-2-chloropropionate (L-2-CPA). This enzyme specifically catalyzes the hydrolytic dehalogenation of L-2-haloacid to produce D-2-hydroxy acid, and does not catalyze that of D-2-haloacid. Here, using quantum-mechanical/molecular-mechanical and fragment molecular orbital calculations, the enzymatic reaction of L-DEX YL with D-2-CPA was compared with that with L-2-CPA. As a result, Tyr12, Leu45 and Phe60 were predicted to affect the enantioselectivity. We then performed site-directed mutagenesis experiments and activity measurements on these mutants, finding that the F60Y mutant showed enzymatic activity toward D-2-CPA.
Smith, Ryan L; Haworth, Annette; Panettieri, Vanessa; Millar, Jeremy L; Franich, Rick D
2016-05-01
Verification of high dose rate (HDR) brachytherapy treatment delivery is an important step, but is generally difficult to achieve. A technique is required to monitor the treatment as it is delivered, allowing comparison with the treatment plan and error detection. In this work, we demonstrate a method for monitoring the treatment as it is delivered and directly comparing the delivered treatment with the treatment plan in the clinical workspace. This treatment verification system is based on a flat panel detector (FPD) used for both pretreatment imaging and source tracking. A phantom study was conducted to establish the resolution and precision of the system. A pretreatment radiograph of a phantom containing brachytherapy catheters is acquired and registration between the measurement and treatment planning system (TPS) is performed using implanted fiducial markers. The measured catheter paths immediately prior to treatment were then compared with the plan. During treatment delivery, the position of the ¹⁹²Ir source is determined at each dwell position by measuring the exit radiation with the FPD and directly compared to the planned source dwell positions. The registration between the two corresponding sets of fiducial markers in the TPS and radiograph yielded a registration error (residual) of 1.0 mm. The measured catheter paths agreed with the planned catheter paths on average to within 0.5 mm. The source positions measured with the FPD matched the planned source positions for all dwells on average within 0.6 mm (s.d. 0.3, min. 0.1, max. 1.4 mm). We have demonstrated a method for directly comparing the treatment plan with the delivered treatment that can be easily implemented in the clinical workspace. Pretreatment imaging was performed, enabling visualization of the implant before treatment delivery and identification of possible catheter displacement. Treatment delivery verification was performed by measuring the source position as each dwell was delivered.
This approach using a FPD for imaging and source tracking provides a noninvasive method of acquiring extensive information for verification in HDR prostate brachytherapy.
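The fiducial-based registration step described above can be sketched numerically. This is a minimal illustration, not the authors' implementation: it assumes a standard least-squares rigid registration (Kabsch algorithm) between matched marker sets, with hypothetical marker coordinates perturbed by sub-millimetre noise, and reports an RMS residual analogous to the 1.0 mm figure quoted.

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch algorithm: least-squares rigid transform mapping src -> dst.
    src, dst: (N, 3) arrays of matched fiducial coordinates (mm)."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

def registration_residual(src, dst):
    """RMS distance between transformed src markers and dst markers (mm)."""
    R, t = rigid_register(src, dst)
    return float(np.sqrt(np.mean(np.sum((src @ R.T + t - dst) ** 2, axis=1))))

# Hypothetical marker sets: planned coordinates, and the same markers
# as "measured" on the radiograph with ~0.5 mm noise added.
plan = np.array([[0., 0., 0.], [30., 0., 0.], [0., 30., 0.], [0., 0., 30.]])
meas = plan + np.array([[0.5, -0.3, 0.2], [-0.4, 0.6, -0.1],
                        [0.2, 0.1, -0.5], [-0.3, -0.4, 0.4]])
print(registration_residual(meas, plan))
```

The residual is sub-millimetre here because the optimal rigid transform absorbs any systematic offset, leaving only the non-rigid part of the noise.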
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Ryan L., E-mail: ryan.smith@wbrc.org.au; Millar, Jeremy L.; Franich, Rick D.
Nuclear Energy Experiments to the Center for Global Security and Cooperation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osborn, Douglas M.
2015-06-01
This is to serve as verification that the Center 6200 experimental pieces supplied to the Technology Training and Demonstration Area within the Center for Global Security and Cooperation are indeed unclassified, unlimited-release items approved for viewing.
Jeong, Min Yong; Chang, Seo Hyoung; Kim, Beom Hyun; Sim, Jae-Hoon; Said, Ayman; Casa, Diego; Gog, Thomas; Janod, Etienne; Cario, Laurent; Yunoki, Seiji; Han, Myung Joon; Kim, Jungho
2017-10-04
Strong spin-orbit coupling lifts the degeneracy of the t2g orbitals in 5d transition-metal systems, leaving a Kramers doublet and quartet with effective angular momentum Jeff = 1/2 and 3/2, respectively. These spin-orbit-entangled states can host exotic quantum phases such as the topological Mott state, unconventional superconductivity, and quantum spin liquids. The lacunar spinel GaTa4Se8 was theoretically predicted to form the molecular Jeff = 3/2 ground state. Experimental verification of its existence is an important first step towards exploring the consequences of the Jeff = 3/2 state. Here, we report direct experimental evidence of the Jeff = 3/2 state in GaTa4Se8 by means of excitation spectra of resonant inelastic X-ray scattering at the Ta L3 and L2 edges. We find that the excitations involving the Jeff = 1/2 molecular orbital are absent only at the Ta L2 edge, manifesting the realization of the molecular Jeff = 3/2 ground state in GaTa4Se8. The strong interaction between electron spin and orbital degrees of freedom in 5d oxides can lead to exotic electronic ground states. Here the authors use resonant inelastic X-ray scattering to demonstrate that the theoretically proposed Jeff = 3/2 state is realised in GaTa4Se8.
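The doublet/quartet splitting quoted above follows from the effective spin-orbit Hamiltonian for the t2g manifold. As a sketch of the standard textbook result (not taken from the paper itself): the t2g states carry an effective orbital angular momentum l_eff = 1, and the coupling H_SO = λ_eff l_eff·S splits the sixfold manifold as

```latex
% t2g manifold: effective orbital angular momentum l_eff = 1, spin S = 1/2.
\begin{aligned}
\mathbf{l}_{\mathrm{eff}}\cdot\mathbf{S}
  &= \tfrac{1}{2}\bigl[J_{\mathrm{eff}}(J_{\mathrm{eff}}+1) - l(l+1) - S(S+1)\bigr],\\
\langle \mathbf{l}_{\mathrm{eff}}\cdot\mathbf{S}\rangle_{J_{\mathrm{eff}}=3/2}
  &= +\tfrac{1}{2}, \qquad
\langle \mathbf{l}_{\mathrm{eff}}\cdot\mathbf{S}\rangle_{J_{\mathrm{eff}}=1/2} = -1,\\
\bigl|E_{1/2} - E_{3/2}\bigr| &= \tfrac{3}{2}\,|\lambda_{\mathrm{eff}}|.
\end{aligned}
```

Which level lies lowest depends on the sign of the effective coupling; for the t2g shell the effective coupling places the Jeff = 3/2 quartet below the Jeff = 1/2 doublet, consistent with the molecular Jeff = 3/2 ground state reported here.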
Pohorecki, Wladyslaw; Obryk, Barbara
2017-09-29
The results of nuclear heating measurements made by means of thermoluminescent dosemeters (TLD-LiF) in a Cu block irradiated by 14 MeV neutrons are presented. The integral Cu experiment, relevant for the verification of copper nuclear data at neutron energies characteristic of fusion facilities, was performed in the ENEA FNG Laboratory at Frascati. Five types of TLDs were used: highly photon-sensitive LiF:Mg,Cu,P (MCP-N) and 7LiF:Mg,Cu,P (MCP-7), and standard, lower-sensitivity LiF:Mg,Ti (MTS-N), 7LiF:Mg,Ti (MTS-7) and 6LiF:Mg,Ti (MTS-6). Calibration of the detectors was performed with gamma rays in terms of air kerma (10 mGy of 137Cs air kerma). Nuclear heating in the Cu block was calculated with the MCNP transport code, as was nuclear heating in Cu and in air at the TLD positions. The nuclear heating contribution from all particles simulated by the MCNP6 code, including protons, deuterons, alphas, tritons and heavier ions produced by the neutron interactions, was calculated. A direct comparison between the experimental results and the simulation results was attempted.
Numerical and Experimental Studies of Transient Natural Convection with Density Inversion
NASA Astrophysics Data System (ADS)
Mizutani, Satoru; Ishiguro, Tatsuji; Kuwahara, Kunio
1996-11-01
In the beer manufacturing process, beer in a storage tank is cooled down from 8 to -1 °C. Understanding this cooling process is very important for designing a fermentation tank. In this paper, the flow and temperature distribution in a rectangular enclosure were studied. The unsteady incompressible Navier-Stokes equations were integrated using the multi-directional third-order upwind finite difference method (MUFDM). A parabolic density-temperature relationship was assumed for water, which has its maximum density at 3.98 °C. Cooling of water in a 10 cm cubical enclosure (Ra = 10^7) from 8 to 0 °C was simulated numerically by keeping one vertical side wall at 0 °C. A vortex was caused by the density inversion of water cooled below 4 °C; it rose near the cold wall and reached the water surface 33 min after the start of cooling. Thereafter, cooling proceeded from the upper surface. To verify the accuracy of the numerical result, the temperature distribution under the same conditions was visualized experimentally using temperature-sensitive liquid crystal. The results will be presented using a video movie. Comparison between the computation and the experiment showed that the present direct simulation based on the MUFDM is a powerful tool for understanding natural convection with density inversion and for applying the cooling phenomenon to the design of beer storage tanks.
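The density inversion driving the convection above can be illustrated with the parabolic density-temperature model the study assumes. The curvature coefficient below is illustrative (a commonly quoted fit is of order 8e-6 per °C²), not a value from the paper:

```python
# Parabolic density-temperature model for water near its density maximum.
# T_MAX = 3.98 degC is the temperature of maximum density; BETA is an
# illustrative curvature coefficient, not taken from the paper.
RHO_MAX = 999.972   # kg/m^3, density at T_MAX
T_MAX = 3.98        # degC
BETA = 8.0e-6       # 1/degC^2, illustrative

def water_density(t_c):
    """Parabolic approximation: rho(T) = rho_max * (1 - beta*(T - T_max)^2)."""
    return RHO_MAX * (1.0 - BETA * (t_c - T_MAX) ** 2)

# Density inversion: water cooled below ~4 degC becomes *lighter* again,
# so parcels near the cold wall rise instead of sinking.
assert water_density(0.0) < water_density(T_MAX)
assert water_density(8.0) < water_density(T_MAX)
```

Because density decreases on both sides of 3.98 °C, buoyancy reverses sign as the fluid cools through the maximum, which is what produces the rising vortex near the cold wall.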
ERIC Educational Resources Information Center
Long, Huey B.; Walsh, Stephen M.
1993-01-01
Offers an analysis of 11 dissertations focusing on self-directed learning (SDL) in community colleges, highlighting the importance of promoting SDL, the relationship between the level of SDL and other variables, verification and measurement of time spent on SDL projects, and effects of SDL. (DMM)
NGA West 2 | Pacific Earthquake Engineering Research Center
A multi-year research program to improve Next Generation Attenuation (NGA) models for active tectonic regions in earthquake engineering, including modeling of directivity and directionality, verification of NGA-West models' epistemic uncertainty, and evaluation of soil amplification factors in NGA models versus NEHRP site factors.
While aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, the verification of the spatial and temporal variability of the magnitude and directionality of aerosol radiative forcing has remained challengi...
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems: Summary of Research
NASA Technical Reports Server (NTRS)
Manna, Zohar
1998-01-01
This research is directed towards the implementation of a comprehensive deductive-algorithmic environment (toolkit) for the development and verification of high-assurance reactive systems, especially concurrent, real-time, and hybrid systems. For this, we have designed and implemented the STeP (Stanford Temporal Prover) verification system. Reactive systems have an ongoing interaction with their environment, and their computations are infinite sequences of states. A large number of systems can be seen as reactive systems, including hardware, concurrent programs, network protocols, and embedded systems. Temporal logic provides a convenient language for expressing properties of reactive systems. A temporal verification methodology provides procedures for proving that a given system satisfies a given temporal property. The research covered the necessary theoretical foundations as well as implementation and application issues.
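As a toy illustration of the kind of temporal property involved (not the prover itself, which performs deductive proofs over infinite computations), one can check a safety property such as mutual exclusion over a finite prefix of a computation:

```python
def always(pred, trace):
    """Check the safety property 'always pred' over a finite trace of states."""
    return all(pred(s) for s in trace)

def eventually(pred, trace):
    """Check 'eventually pred' over a finite trace (a finite prefix only)."""
    return any(pred(s) for s in trace)

# States of a hypothetical two-process program: flags record which
# processes are currently in their critical section.
trace = [
    {"p1_crit": False, "p2_crit": False},
    {"p1_crit": True,  "p2_crit": False},
    {"p1_crit": False, "p2_crit": False},
    {"p1_crit": False, "p2_crit": True},
]

mutex = lambda s: not (s["p1_crit"] and s["p2_crit"])
assert always(mutex, trace)                        # safety: mutual exclusion
assert eventually(lambda s: s["p1_crit"], trace)   # progress, on this prefix
```

A finite trace can only refute a safety property or witness progress on a prefix; establishing properties of all infinite computations is what requires the deductive methodology described above.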
The Sedov Blast Wave as a Radial Piston Verification Test
Pederson, Clark; Brown, Bart; Morgan, Nathaniel
2016-06-22
The Sedov blast wave is of great utility as a verification problem for hydrodynamic methods. The typical implementation uses an energized cell of finite dimensions to represent the energy point source. We avoid this approximation by directly finding the effects of the energy source as a boundary condition (BC). Furthermore, the proposed method transforms the Sedov problem into an outward-moving radial piston problem with a time-varying velocity. A portion of the mesh adjacent to the origin is removed and the boundaries of this hole are forced with the velocities from the Sedov solution. This verification test is implemented on two types of meshes, and convergence is shown. Our results from the typical initial condition (IC) method and the new BC method are compared.
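The time-varying piston velocity comes from the Sedov-Taylor similarity solution. A minimal sketch, assuming the standard scaling R(t) = ξ₀ (E t² / ρ₀)^(1/5) with an illustrative O(1) similarity constant ξ₀ (its exact value depends on the adiabatic index and dimensionality, and is not taken from this paper):

```python
# Sedov-Taylor similarity scaling for the blast-wave radius; the shock
# (piston) speed follows from dR/dt = (2/5) R / t.
def sedov_radius(t, E, rho0, xi0=1.15):
    """Blast radius at time t for energy E in ambient density rho0."""
    return xi0 * (E * t**2 / rho0) ** 0.2

def sedov_speed(t, E, rho0, xi0=1.15):
    """Piston/shock speed: dR/dt = 0.4 * R / t."""
    return 0.4 * sedov_radius(t, E, rho0, xi0) / t

E, rho0 = 1.0, 1.0
# The radius grows as t^(2/5): doubling t multiplies R by 2**0.4.
assert abs(sedov_radius(1.0, E, rho0) / sedov_radius(0.5, E, rho0)
           - 2.0 ** 0.4) < 1e-12
```

Forcing the hole boundary with `sedov_speed(t, ...)` is the essence of the BC formulation: the decaying t^(-3/5) velocity replaces the finite energized cell.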
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, X; Witztum, A; Kenton, O
2014-06-01
Purpose: Due to the unpredictability of bowel gas movement, the PA beam direction is always favored for robust proton therapy in post-operative pancreatic cancer treatment. We investigate the feasibility of replacing the PA beam with a modified AP beam that takes the bowel gas uncertainty into account. Methods: Nine post-operative pancreatic cancer patients treated with proton therapy (5040 cGy, 28 fractions) in our institution were randomly selected. The original plan uses PA- and lateral-direction passive-scattering proton beams with a beam weighting of about 1:1. All patients received weekly verification CTs to assess the daily variations (17 verification CTs in total). The PA-direction beam was replaced by two other groups of AP-direction beams. Group AP takes a 3.5% range uncertainty into account. Group APmod compensates for the bowel gas uncertainty by expanding the proximal margin by an additional 2 cm. The 2 cm margin was derived from the average bowel diameter in 100 adult abdominal CT scans near the pancreatic region (+/- 5 cm superiorly and inferiorly). Dose-volume histograms (DVHs) of the verification CTs were acquired for the robustness study. Results: Without the lateral beam, Group APmod is as robust as Group PA. In Group AP, more than 10% of iCTV D98/D95 values were reduced by 4-8%. Left kidney and liver dose robustness are not affected by the AP/PA beam direction. There is a 10% chance that the right kidney and cord will be hit by the AP proton beam due to bowel gas. Compared to Group PA, the APmod plan significantly reduced the dose to the kidneys and the cord maximum, while there was no statistically significant increase in bowel mean dose. Conclusion: An APmod proton beam could be as robust for target coverage as the PA direction without sacrificing too much bowel dose. When the AP-direction beam has to be selected, a 2 cm proximal margin should be considered.
A proposed standard method for polarimetric calibration and calibration verification
NASA Astrophysics Data System (ADS)
Persons, Christopher M.; Jones, Michael W.; Farlow, Craig A.; Morell, L. Denise; Gulley, Michael G.; Spradley, Kevin D.
2007-09-01
Accurate calibration of polarimetric sensors is critical to reducing and analyzing phenomenology data, producing uniform polarimetric imagery for deployable sensors, and ensuring predictable performance of polarimetric algorithms. It is desirable to develop a standard calibration method, including verification reporting, in order to increase credibility with customers and foster communication and understanding within the polarimetric community. This paper seeks to facilitate discussions within the community on arriving at such standards. Both the calibration and verification methods presented here are performed easily with common polarimetric equipment, and are applicable to visible and infrared systems with either partial Stokes or full Stokes sensitivity. The calibration procedure has been used on infrared and visible polarimetric imagers over a six year period, and resulting imagery has been presented previously at conferences and workshops. The proposed calibration method involves the familiar calculation of the polarimetric data reduction matrix by measuring the polarimeter's response to a set of input Stokes vectors. With this method, however, linear combinations of Stokes vectors are used to generate highly accurate input states. This allows the direct measurement of all system effects, in contrast with fitting modeled calibration parameters to measured data. This direct measurement of the data reduction matrix allows higher order effects that are difficult to model to be discovered and corrected for in calibration. This paper begins with a detailed tutorial on the proposed calibration and verification reporting methods. Example results are then presented for a LWIR rotating half-wave retarder polarimeter.
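The data-reduction-matrix calculation described above can be sketched numerically. This is a hypothetical 4×4 example, not the authors' pipeline: a toy instrument matrix maps input Stokes vectors to detector outputs, the instrument matrix is recovered from measured responses to known calibration states, and its pseudoinverse serves as the data reduction matrix.

```python
import numpy as np

# Toy model: detector outputs d = A @ s for input Stokes vector s.
rng = np.random.default_rng(0)
A_true = rng.normal(size=(4, 4))   # unknown instrument matrix (hypothetical)

# Known calibration inputs: columns are Stokes vectors. A real calibration
# generates these with polarizers/retarders; here we just use a
# well-conditioned set.
S = np.array([[1,  1, 1, 1],
              [1, -1, 0, 0],
              [0,  0, 1, 0],
              [0,  0, 0, 1]], dtype=float)
D = A_true @ S                      # measured detector outputs per state

A_cal = D @ np.linalg.pinv(S)       # recovered instrument matrix
W = np.linalg.pinv(A_cal)           # data reduction matrix

# Any measured output is now reduced back to its Stokes vector.
s_in = np.array([1.0, 0.3, -0.2, 0.1])
s_est = W @ (A_true @ s_in)
assert np.allclose(s_est, s_in)
```

Because the data reduction matrix is measured directly rather than fitted from a parametric model, systematic effects folded into `A_true` are corrected automatically, which is the point the abstract makes about higher-order effects.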
75 FR 55799 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-14
... against polyomaviruses. Development Status: Pre-clinical. Inventors: Christopher B. Buck and Diana V... can serve as positive controls in chemokine receptor studies designed to identify novel... chemokine studies. Experimental verification of response to CXC family chemokines: The scientists have...
National Centers for Environmental Prediction
Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H
2018-01-01
The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (VO2max) was recently questioned, and the utilization of a verification phase was proposed as a gold standard. This study compared the oxygen uptake (VO2) during a RI test to that obtained during a verification phase aimed at confirming attainment of VO2max. Sixty-one healthy males [31 older (O), 65 ± 5 yrs; 30 younger (Y), 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant-load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest VO2 after the RI test (39.8 ± 11.5 mL·kg-1·min-1) and the verification phase (40.1 ± 11.2 mL·kg-1·min-1) were not different (p = 0.33) and were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or by the intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg-1·min-1, not different from 0) and a precision of ±1.56 mL·kg-1·min-1 between measures. This study indicated that a verification phase does not reveal an under-estimation of VO2max derived from a RI test in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal VO2 elicited during the RI test and the verification phase. Thus, a verification phase does not add any validation to the determination of VO2max. Therefore, the recommendation that a verification phase should become a gold-standard procedure, although initially appealing, is not supported by the experimental data.
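The Bland-Altman analysis used above can be sketched in a few lines. The paired VO2 values below are hypothetical, not the study's data; the function returns the bias (mean difference) and the 95% limits of agreement:

```python
import math

def bland_altman(x, y):
    """Bias and 95% limits of agreement between two paired measurement
    methods, as in a Bland-Altman analysis."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired values (mL/kg/min): ramp test vs verification phase.
ramp  = [39.1, 42.7, 35.4, 48.0, 40.2, 37.8]
verif = [39.5, 42.3, 35.9, 48.2, 40.0, 38.4]
bias, (lo, hi) = bland_altman(ramp, verif)
```

A bias near zero with narrow limits of agreement, as reported in the study (-0.25 ± roughly 1.56 mL·kg-1·min-1), indicates the two measures are interchangeable at the individual level.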
Surface acoustic wave diffraction driven mechanisms in microfluidic systems.
Fakhfouri, Armaghan; Devendran, Citsabehsan; Albrecht, Thomas; Collins, David J; Winkler, Andreas; Schmidt, Hagen; Neild, Adrian
2018-06-26
Acoustic forces arising from high-frequency surface acoustic waves (SAWs) underpin an exciting range of promising techniques for the non-contact manipulation of fluids and objects at the micron scale. Despite the increasing significance of SAW-driven technologies in microfluidics, understanding of the broad range of phenomena occurring within an individual SAW system is limited. Acoustic effects, including streaming and radiation force fields, are often assumed to result from wave propagation in a simple planar fashion. The propagation patterns of a single SAW emanating from a finite-width source, however, cause a far richer range of physical effects. In this work, we seek a better understanding of the various effects arising from a finite-width SAW beam propagating into a quiescent fluid. Through numerical and experimental verification, we present five distinct mechanisms within an individual system. These cause fluid swirling in two orthogonal planes, particle trapping in two directions, and migration of particles in the direction of wave propagation. For a range of IDT apertures and channel dimensions, the relative importance of these mechanisms is evaluated.
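A basic quantity in any SAW-microfluidic geometry is the Rayleigh angle at which the surface wave leaks energy into the fluid. A minimal sketch, using typical textbook sound speeds for water and lithium niobate rather than values from this paper:

```python
import math

# The SAW radiates into the liquid at the Rayleigh angle,
# theta_R = arcsin(c_fluid / c_SAW). Values below are typical figures
# for water on 128-degree Y-cut LiNbO3 and are illustrative only.
C_FLUID = 1497.0   # m/s, speed of sound in water at ~25 degC
C_SAW = 3990.0     # m/s, SAW phase speed on LiNbO3

theta_r = math.degrees(math.asin(C_FLUID / C_SAW))
print(round(theta_r, 1))   # about 22 degrees
```

Because the leaked wavefronts travel at this oblique angle, even a single finite-width beam drives flow components both along and across the channel, which is part of why the planar-propagation assumption misses the richer behaviour described above.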
A method for calculating strut and splitter plate noise in exit ducts: Theory and verification
NASA Technical Reports Server (NTRS)
Fink, M. R.
1978-01-01
Portions of a four-year analytical and experimental investigation relative to noise radiation from engine internal components in turbulent flow are summarized. Spectra measured for such airfoils over a range of chord, thickness ratio, flow velocity, and turbulence level were compared with predictions made by an available rigorous thin-airfoil analytical method. This analysis included the effects of flow compressibility and source noncompactness. Generally good agreement was obtained. This noise calculation method for isolated airfoils in turbulent flow was combined with a method for calculating transmission of sound through a subsonic exit duct and with an empirical far-field directivity shape. These three elements were checked separately and were individually shown to give close agreement with data. This combination provides a method for predicting engine internally generated aft-radiated noise from radial struts and stators, and annular splitter rings. Calculated sound power spectra, directivity, and acoustic pressure spectra were compared with the best available data. These data were for noise caused by a fan exit duct annular splitter ring, larger-chord stator blades, and turbine exit struts.
Chen, C Julian; Schwarz, Alex; Wiesendanger, Roland; Horn, Oliver; Müller, Jörg
2010-05-01
We present a novel quartz cantilever for frequency-modulation atomic force microscopy (FM-AFM) which has three electrodes: an actuating electrode, a sensing electrode, and a ground electrode. By applying an ac signal on the actuating electrode, the cantilever is set to vibrate. If the frequency of the actuation voltage closely matches one of the characteristic frequencies of the cantilever, a sharp resonance should be observed. The vibration of the cantilever in turn generates a current on the sensing electrode. The arrangement of the electrodes is such that the cross-talk capacitance between the actuating electrode and the sensing electrode is less than 10^-16 F, thus the direct coupling is negligible. To verify the principle, a number of samples were made. Direct measurements with a Nanosurf easyPLL controller and detector showed that for each cantilever, one or more vibrational modes can be excited and detected. Using the classical theory of elasticity, it is shown that such novel cantilevers with proper dimensions can provide optimized performance and sensitivity in FM-AFM with very simple electronics.
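The characteristic frequencies referred to above follow from classical beam theory. A minimal sketch of the standard Euler-Bernoulli estimate for the first flexural resonance of a rectangular cantilever; the quartz constants and the dimensions are illustrative, not taken from the paper:

```python
import math

# First flexural resonance of a rectangular cantilever (Euler-Bernoulli):
# f1 = (1.875^2 / 2*pi) * (t / L^2) * sqrt(E / (12*rho)),
# where t is thickness, L is length.
E = 78.3e9     # Pa, Young's modulus of quartz (approximate)
RHO = 2650.0   # kg/m^3, density of quartz (approximate)

def first_resonance(length, thickness):
    """First flexural eigenfrequency in Hz."""
    return (1.875 ** 2 / (2 * math.pi)) * (thickness / length ** 2) \
           * math.sqrt(E / (12 * RHO))

f1 = first_resonance(2.0e-3, 100e-6)   # hypothetical: 2 mm long, 100 um thick
print(f1)
```

For these illustrative dimensions the first mode lands in the tens-of-kHz range, which is the regime where a sharp resonance would be swept out by the actuation voltage.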
Empirical Model for Predicting Rockfall Trajectory Direction
NASA Astrophysics Data System (ADS)
Asteriou, Pavlos; Tsiambaos, George
2016-03-01
A methodology for the experimental investigation of rockfall in three-dimensional space is presented in this paper, aiming to assist on-going research of the complexity of a block's response to impact during a rockfall. An extended laboratory investigation was conducted, consisting of 590 tests with cubical and spherical blocks made of an artificial material. The effects of shape, slope angle and the deviation of the post-impact trajectory are examined as a function of the pre-impact trajectory direction. Additionally, an empirical model is proposed that estimates the deviation of the post-impact trajectory as a function of the pre-impact trajectory with respect to the slope surface and the slope angle. This empirical model is validated by 192 small-scale field tests, which are also presented in this paper. Some important aspects of the three-dimensional nature of rockfall phenomena are highlighted that have been hitherto neglected. The 3D space data provided in this study are suitable for the calibration and verification of rockfall analysis software that has become increasingly popular in design practice.
ERIC Educational Resources Information Center
Precker, Jurgen W.
2007-01-01
The wavelength of the light emitted by a light-emitting diode (LED) is intimately related to the band-gap energy of the semiconductor from which the LED is made. We experimentally estimate the band-gap energies of several types of LEDs, and compare them with the energies of the emitted light, which ranges from infrared to white. In spite of…
NASA Astrophysics Data System (ADS)
Sciazko, Anna; Komatsu, Yosuke; Brus, Grzegorz; Kimijima, Shinji; Szmyd, Janusz S.
2014-09-01
For a mathematical model based on the results of physical measurements, it becomes possible to determine their influence on the final solution and its accuracy. In classical approaches, however, the influence of different model simplifications on the reliability of the obtained results is usually not comprehensively discussed. This paper presents a novel approach to the study of methane/steam reforming kinetics based on an advanced methodology, the Generalized Least Squares method. The previously published kinetics of the reforming process are mutually divergent. To obtain the most probable values of the kinetic parameters and enable direct and objective model verification, an appropriate calculation procedure needs to be proposed. The applied Generalized Least Squares (GLS) method includes all the experimental results in the mathematical model, which becomes overdetermined (internally contradictory), as the number of equations is greater than the number of unknown variables. The GLS method is adopted to select the most probable values of the results and simultaneously determine the uncertainty coupled with all the variables in the system. In this paper, the reaction rate was evaluated after its pre-determination by a preliminary calculation based on experimental results obtained over a nickel/yttria-stabilized zirconia catalyst.
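The overdetermined-system idea can be sketched with a minimal generalized least squares example on synthetic data (this is an illustration of GLS in general, not the paper's reforming-kinetics model): more equations than unknowns, each weighted by its inverse measurement variance, with parameter uncertainties coming from the same matrix.

```python
import numpy as np

# Synthetic overdetermined linear system X @ beta = y with
# heteroscedastic measurement errors described by covariance Omega.
rng = np.random.default_rng(1)
beta_true = np.array([2.0, -1.0])
X = np.column_stack([np.ones(20), np.linspace(0, 1, 20)])
sigma = 0.05 + 0.1 * np.linspace(0, 1, 20)     # per-point error, grows with x
y = X @ beta_true + rng.normal(0, sigma)
Omega_inv = np.diag(1.0 / sigma ** 2)          # inverse covariance weights

# GLS estimate and its covariance:
#   beta_hat = (X^T W X)^-1 X^T W y,  cov = (X^T W X)^-1,  W = Omega^-1.
cov_beta = np.linalg.inv(X.T @ Omega_inv @ X)
beta_hat = cov_beta @ X.T @ Omega_inv @ y
print(beta_hat, np.sqrt(np.diag(cov_beta)))
```

The diagonal of `cov_beta` is what gives the uncertainty "coupled with all the variables" simultaneously with the most probable parameter values, which is the feature of the method the abstract emphasizes.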
NASA Astrophysics Data System (ADS)
Carman, Gregory P.
2015-09-01
Electromagnetic devices rely on electrical currents to generate magnetic fields. While extremely useful, this approach has limitations at small scales. To overcome the scaling problem, researchers have tried to use electric fields to manipulate a magnetic material's intrinsic magnetization (i.e., multiferroics). The strain-mediated class of multiferroics offers up to 70% energy transduction using available piezoelectric and magnetoelastic materials. While strain-mediated multiferroics are promising, few studies exist on the modeling and testing of nanoscale magnetic structures. This talk presents motivation, analytical models, and experimental data on the electrical control of nanoscale single-magnetic-domain structures. This research is conducted in an NSF Engineering Research Center entitled Translational Applications for Nanoscale Multiferroics (TANMS). The models combine micromagnetics (Landau-Lifshitz-Gilbert) with elastodynamics using the electrostatic approximation, producing eight fully coupled nonlinear partial differential equations. Qualitative and quantitative verification is achieved by direct comparison to experimental data. The modeling effort guides fabrication and testing of three elements: nanoscale rings (onion states), ellipses (single-domain reorientation), and superparamagnetic elements. Experimental results demonstrate electrical and deterministic control of the magnetic states in 5-500 nm structures, as measured with photoemission electron microscopy (PEEM), magnetic force microscopy (MFM), or Lorentz transmission electron microscopy (TEM). These data strongly suggest that efficient control of nanoscale magnetic spin states is possible with voltage.
Kim, Dong-Kwan; Hwang, Yoon Jo; Yoon, Cheolho; Yoon, Hye-On; Chang, Ki Soo; Lee, Gaehang; Lee, Seungwoo; Yi, Gi-Ra
2015-08-28
The theoretical extinction coefficients of gold nanoparticles (AuNPs) have mainly been verified by analytically solving the Maxwell equations for an ideal sphere, as first done by Mie (generally referred to as Mie theory). In principle, however, direct experimental verification has not been feasible, especially for relatively large AuNPs (i.e., >40 nm), as conventional synthetic methods inevitably result in polygonal, non-ideal Au nanospheres. Here, mono-crystalline, ultra-smooth, and highly spherical AuNPs of 40-100 nm were prepared by the procedure reported in our recent work (ACS Nano, 2013, 7, 11064). The extinction coefficients of these ideally spherical AuNPs were extracted empirically using the Beer-Lambert law and then compared with the theoretical limits obtained by analytical and numerical methods. The extinction coefficients obtained for the ideally spherical AuNPs agree much more closely with the theoretical limits than those of faceted or polygonal AuNPs. In addition, to further elucidate the importance of being spherical, we systematically compared our ideally spherical AuNPs with their polygonal counterparts, effectively addressing the role of surface morphology in the spectral responses both theoretically and experimentally.
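The empirical extraction step is just the Beer-Lambert law rearranged. A minimal sketch with hypothetical absorbance and concentration values (the order of magnitude is typical for large AuNPs, but none of these numbers come from the paper):

```python
# Beer-Lambert law: A = epsilon * c * l, so epsilon = A / (c * l).
def extinction_coefficient(absorbance, conc_molar, path_cm):
    """Extinction coefficient in M^-1 cm^-1."""
    return absorbance / (conc_molar * path_cm)

# Hypothetical measurement: A = 0.5 in a 1 cm cuvette at a particle
# concentration of 5e-11 M.
eps = extinction_coefficient(0.5, 5e-11, 1.0)
print(f"{eps:.1e}")   # 1.0e+10 M^-1 cm^-1
```

The experimental difficulty the abstract points to is not this arithmetic but knowing the particle concentration and shape accurately enough for the extracted epsilon to be comparable to the Mie-theory limit.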
Woodward Effect Experimental Verifications
NASA Astrophysics Data System (ADS)
March, Paul
2004-02-01
The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman radiation reaction forces, hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near- to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.
NASA Astrophysics Data System (ADS)
Aksenov, A. A.; Danilishin, A. M.; Dubenko, A. M.; Kozhukov, Y. V.
2017-08-01
Design modernization of a centrifugal compressor stage test bench with a three-dimensional (3D) impeller was carried out to enable a series of experimental studies of different 3D impeller models. The studies relate to the problem of the joint operation of the impeller and the stationary channels of the housing during modernization work aimed at improving the volumetric capacity or pressure parameters in the presence of design constraints. The object of study is an experimental single-end centrifugal compressor stage with a 3D impeller. The compressor stage consists of the 3D impeller, a vaneless diffuser (VLD), an outlet collector (a folded side scroll), and a downstream pipe. The drive is a 75 kW DC motor; a step-up gear (multiplier) with gear ratio i = 9.8 was installed between the compressor and the motor. To obtain the characteristics of the compressor and the flow area, the following values were measured: total pressure, static pressure, and the direction (angles) of the stream in different cross sections. Additional pneumometric probes were installed on the front wall of the VLD of the test bench, along with total pressure probes and holes for measuring total and static pressure according to the new drainage scheme. This allowed full experimental studies of two elements of the centrifugal compressor stage. After the experimental tests, comprehensive information about the performance of the model stage was obtained. The geometric parameters were measured and a virtual model of the experimental bench flow path was constructed with Creo Parametric 3.0 and ANSYS v. 16.2. CFD calculations were conducted and verified against the experimental data, and steps for further experimental and virtual work were identified.
Satellite Power System (SPS) concept definition study (exhibit C)
NASA Technical Reports Server (NTRS)
Haley, G. M.
1979-01-01
The major outputs of the study are the constructability studies which resulted in the definition of the concepts for satellite, rectenna, and satellite construction base construction. Transportation analyses resulted in definition of heavy-lift launch vehicle, electric orbit transfer vehicle, personnel orbit transfer vehicle, and intra-orbit transfer vehicle as well as overall operations related to transportation systems. The experiment/verification program definition resulted in the definition of elements for the Ground-Based Experimental Research and Key Technology plans. These studies also resulted in conceptual approaches for early space technology verification. The cost analysis defined the overall program and cost data for all program elements and phases.
Palmprint verification using Lagrangian decomposition and invariant interest points
NASA Astrophysics Data System (ADS)
Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.
2011-06-01
This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. SIFT is employed for feature extraction from palmprint images, where the region of interest (ROI), extracted from the wide palm texture at the preprocessing stage, is used for invariant point extraction. Finally, identity is established by finding a permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features; the permutation matrix minimizes the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.
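The permutation-matrix step can be sketched in miniature. The snippet below is a hypothetical toy, not the authors' implementation: it brute-forces the permutation that minimizes the distance between two small sets of scalar "descriptors", standing in for the graph matching performed on real 128-D SIFT features.

```python
from itertools import permutations

def match_cost(ref, probe, perm):
    # Sum of squared descriptor distances under a candidate assignment.
    return sum((r - probe[j]) ** 2 for r, j in zip(ref, perm))

def best_permutation(ref, probe):
    # Brute-force search over permutation matrices (feasible for small graphs;
    # the paper uses a Lagrangian relaxation instead for larger ones).
    best = min(permutations(range(len(probe))),
               key=lambda p: match_cost(ref, probe, p))
    return best, match_cost(ref, probe, best)

# Toy 1-D "descriptors"; a real system would use 128-D SIFT vectors.
ref = [0.1, 0.5, 0.9]
probe = [0.52, 0.88, 0.11]  # same points, shuffled and slightly perturbed
perm, cost = best_permutation(ref, probe)
```

The recovered permutation maps each reference point to its perturbed counterpart, and the residual cost serves as the match score.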
Verification of passive cooling techniques in the Super-FRS beam collimators
NASA Astrophysics Data System (ADS)
Douma, C. A.; Gellanki, J.; Najafi, M. A.; Moeini, H.; Kalantar-Nayestanaki, N.; Rigollet, C.; Kuiken, O. J.; Lindemulder, M. F.; Smit, H. A. J.; Timersma, H. J.
2016-08-01
The Super FRagment Separator (Super-FRS) at the FAIR facility will be the largest in-flight separator of heavy ions in the world. One of the essential steps in the separation procedure is to stop the unwanted ions with beam collimators. In one of the most common situations, the heavy ions are produced by a fission reaction of a primary 238U-beam (1.5 GeV/u) hitting a 12C target (2.5 g/cm2). In this situation, some of the produced ions are highly charged states of 238U. These ions can reach the collimators with energies of up to 1.3 GeV/u and a power of up to 500 W. Under these conditions, a cooling system is required to prevent damage to the collimators and to the corresponding electronics. Due to the highly radioactive environment, both the collimators and the cooling system must be suitable for robot handling. Therefore, an active cooling system is undesirable because of the increased possibility of malfunctioning and other complications. By using thermal simulations (performed with NX9 of Siemens PLM), the possibility of passive cooling is explored. The validity of these simulations is tested by independent comparison with other simulation programs and by experimental verification. The experimental verification is still under analysis, but preliminary results indicate that the explored passive cooling option provides sufficient temperature reduction.
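The feasibility of passive cooling can be sanity-checked with a one-line steady-state conduction estimate. Only the 500 W heat load comes from the text; the conduction path geometry and material below (10 cm of copper with a 100 cm2 cross section) are illustrative assumptions, not the Super-FRS design.

```python
def conduction_dT(power_w, length_m, k_w_mk, area_m2):
    # Steady-state 1-D Fourier conduction: dT = P * L / (k * A)
    return power_w * length_m / (k_w_mk * area_m2)

# 500 W deposited (from the abstract); copper path to a heat sink (assumed).
dT = conduction_dT(500.0, 0.1, 400.0, 0.01)
```

A temperature rise of this order across the conduction path is the kind of quantity the NX9 thermal simulations resolve in full 3D geometry.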
Verification of Accurate Technical Insight: A Prerequisite for Self-Directed Surgical Training
ERIC Educational Resources Information Center
Hu, Yinin; Kim, Helen; Mahmutovic, Adela; Choi, Joanna; Le, Ivy; Rasmussen, Sara
2015-01-01
Simulation-based surgical skills training during preclinical education is a persistent challenge due to time constraints of trainees and instructors alike. Self-directed practice is resource-efficient and flexible; however, insight into technical proficiency among trainees is often lacking. The purpose of this study is to prospectively assess the…
While aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, there has been little effort devoted to verification of the spatial and temporal variability of the magnitude and directionality of aerosol radi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere from almost directly over surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves, and strong winds can cause extensive damage to society in terms of human lives and financial losses. As climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly, and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
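As a loose illustration of linking a threshold to a geometric property, the sketch below picks the lowest candidate threshold whose exceedance area (the simplest geometric index) falls below a target. The field, candidate thresholds, and 5% target are all invented for the example and are not the authors' method.

```python
def exceedance_fraction(field, thr):
    # Fraction of grid points above the threshold: the exceedance "area".
    return sum(v > thr for v in field) / len(field)

def select_high_threshold(field, candidates, target_area=0.05):
    # Lowest candidate threshold whose exceedance area is at most the target,
    # i.e. a threshold tied to a desired geometric property of the field.
    for thr in sorted(candidates):
        if exceedance_fraction(field, thr) <= target_area:
            return thr
    return max(candidates)

field = [i / 100 for i in range(100)]  # synthetic "precipitation" values
thr = select_high_threshold(field, [0.5, 0.8, 0.9, 0.95])
```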
Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program
NASA Technical Reports Server (NTRS)
Manobianco, John; Nutter, Paul
1997-01-01
The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically consistent thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model is capable of forecasting a majority of cold frontal passages through east central Florida to within ±1 h of observed frontal passage.
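The objective point-verification statistics mentioned above (bias and RMS error) are straightforward to compute; the sketch below uses made-up forecast/observation pairs, not AMU data.

```python
def bias_and_rmse(forecasts, observations):
    # Bias = mean(F - O); RMSE = sqrt(mean((F - O)^2)): the two standard
    # objective point-verification statistics.
    errors = [f - o for f, o in zip(forecasts, observations)]
    bias = sum(errors) / len(errors)
    rmse = (sum(e * e for e in errors) / len(errors)) ** 0.5
    return bias, rmse

# Hypothetical temperature forecasts vs. observations (deg C).
bias, rmse = bias_and_rmse([20.0, 22.0, 19.0], [21.0, 21.0, 20.0])
```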
Experimental verification of an interpolation algorithm for improved estimates of animal position
NASA Astrophysics Data System (ADS)
Schell, Chad; Jaffe, Jules S.
2004-07-01
This article presents experimental verification of an interpolation algorithm that was previously proposed in Jaffe [J. Acoust. Soc. Am. 105, 3168-3175 (1999)]. The goal of the algorithm is to improve estimates of both target position and target strength by minimizing a least-squares residual between noise-corrupted target measurement data and the output of a model of the sonar's amplitude response to a target at a set of known locations. Although this positional estimator was shown to be a maximum likelihood estimator, in principle, experimental verification was desired because of interest in understanding its true performance. Here, the accuracy of the algorithm is investigated by analyzing the correspondence between a target's true position and the algorithm's estimate. True target position was measured by precise translation of a small test target (bead) or from the analysis of images of fish from a coregistered optical imaging system. Results with the stationary spherical test bead in a high signal-to-noise environment indicate that a large increase in resolution is possible, while results with commercial aquarium fish indicate a smaller increase is obtainable. However, in both experiments the algorithm provides improved estimates of target position over those obtained by simply accepting the angular positions of the sonar beam with maximum output as target position. In addition, increased accuracy in target strength estimation is possible by considering the effects of the sonar beam patterns relative to the interpolated position. A benefit of the algorithm is that it can be applied ``ex post facto'' to existing data sets from commercial multibeam sonar systems when only the beam intensities have been stored after suitable calibration.
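A minimal sketch of the positional estimator follows, under assumed beam geometry (five Gaussian beams spaced 1 degree apart, not the actual sonar's beam pattern): at each candidate angle the target amplitude is solved in closed form and the least-squares residual against the measured beam outputs is minimized by grid search, improving on simply taking the angle of the maximum-output beam.

```python
import math

BEAM_ANGLES = [-2.0, -1.0, 0.0, 1.0, 2.0]  # degrees; assumed geometry
SIGMA = 0.6                                 # assumed beam width (degrees)

def beam_response(theta):
    # Modelled amplitude of each beam for a target at angle theta.
    return [math.exp(-(theta - a) ** 2 / (2 * SIGMA ** 2)) for a in BEAM_ANGLES]

def estimate_position(measured, grid_step=0.01):
    # Minimize the least-squares residual between measured beam amplitudes
    # and the modelled response; target strength solved in closed form.
    best_theta, best_res = None, float("inf")
    theta = -2.0
    while theta <= 2.0:
        b = beam_response(theta)
        amp = sum(m * bi for m, bi in zip(measured, b)) / sum(bi * bi for bi in b)
        res = sum((m - amp * bi) ** 2 for m, bi in zip(measured, b))
        if res < best_res:
            best_theta, best_res = theta, res
        theta += grid_step
    return best_theta

true_theta = 0.37                     # between beams, so max-beam would say 0.0
measured = [2.5 * r for r in beam_response(true_theta)]  # noise-free for clarity
est = estimate_position(measured)
```

With noise-free data the grid search recovers the off-beam-axis position to within the grid resolution, whereas accepting the maximum-output beam would quantize the answer to the nearest beam angle.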
Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael
2014-05-01
Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.
Investigation of air cleaning system response to accident conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrae, R.W.; Bolstad, J.W.; Foster, R.D.
1980-01-01
The response of air cleaning systems to the stress of accident conditions is being investigated. A program overview and highlights of recent results of our investigation are presented. The program includes both analytical and experimental investigations. Computer codes for predicting the effects of tornadoes, explosions, fires, and material transport are described. The test facilities used to obtain supportive experimental data to define the structural integrity and confinement effectiveness of ventilation system components are described. Examples of experimental results for code verification, blower response to tornado transients, and filter response to tornado and explosion transients are reported.
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impacts on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents a prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is a part of nuclear reactor models.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through the direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities, and correlations obtained using DRAM, DREAM, and the direct evaluation of Bayes' formula. We also perform similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models.
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
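The verification idea for sampling-based Metropolis algorithms can be illustrated with a plain random-walk Metropolis sampler (a simplification, not DRAM or DREAM) on a 1-D Gaussian posterior, where the chain's moments can be checked against the known analytic values, mirroring the comparison against direct evaluation of Bayes' formula:

```python
import math, random

def log_post(x, mu=1.0, sigma=0.5):
    # Analytic Gaussian "posterior" used as ground truth for verification.
    return -0.5 * ((x - mu) / sigma) ** 2

def metropolis(n, step=1.0, seed=1):
    # Plain random-walk Metropolis with a symmetric Gaussian proposal.
    rng = random.Random(seed)
    x, chain = 0.0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(prop) - log_post(x):
            x = prop  # accept
        chain.append(x)
    return chain

chain = metropolis(20000)[5000:]  # discard burn-in
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
# Verification: mean should approach 1.0 and variance 0.25 (sigma^2).
```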
NASA Astrophysics Data System (ADS)
Jozwiak, Zbigniew Boguslaw
1995-01-01
Ethylene is an important auto-catalytic plant growth hormone. Removal of ethylene from the atmosphere surrounding ethylene-sensitive horticultural products may be very beneficial, allowing an extended period of storage and preventing or delaying the induction of disorders. Various ethylene removal techniques have been studied and put into practice. One technique is based on using low pressure mercury ultraviolet lamps as a source of photochemical energy to initiate chemical reactions that destroy ethylene. Although previous research showed that ethylene disappeared in experiments with mercury ultraviolet lamps, the reactions were not described and the actual cause of ethylene disappearance remained unknown. Proposed causes for this disappearance were the direct action of ultraviolet rays on ethylene, reaction of ethylene with ozone (which is formed when air or gas containing molecular oxygen is exposed to radiation emitted by this type of lamp), or reactions with atomic oxygen leading to formation of ozone. The objective of the present study was to determine the set of physical and chemical actions leading to the disappearance of ethylene from an artificial storage atmosphere under conditions of ultraviolet irradiation. The goal was achieved by developing a static chemical model based on the physical properties of a commercially available ultraviolet lamp, the photochemistry of gases, and the kinetics of chemical reactions. The model was used to perform computer simulations predicting time-dependent concentrations of the chemical species included in the model. Development of the model was accompanied by the design of a reaction chamber used for experimental verification. The model provided a good prediction of the general behavior of the species involved in the chemistry under consideration; however, the model predicted a lower rate of ethylene disappearance than was measured.
Some reasons for the model-experiment disagreement are radiation intensity averaging, the experimental technique, mass transfer in the chamber, and incompleteness of the set of chemical reactions included in the model. The work concludes with guidelines for the development of a more complex mathematical model that includes elements of mass transfer inside the reaction chamber and uses a three-dimensional approach to distribute radiation from the low pressure mercury ultraviolet tube.
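A static kinetics model of this kind reduces to integrating rate equations over time. The sketch below is a deliberately minimal stand-in, two reactions with placeholder rate constants rather than the dissertation's reaction set: the lamp produces ozone at a constant rate, and ozone destroys ethylene.

```python
def simulate(t_end=100.0, dt=0.01, k_prod=1e-3, k_rxn=0.5):
    # Minimal kinetics sketch: UV lamp generates ozone at a constant rate;
    # ozone consumes ethylene (C2H4 + O3 -> products). Rate constants are
    # illustrative placeholders, not measured values. Forward-Euler stepping.
    c2h4, o3 = 1.0, 0.0  # arbitrary concentration units
    t = 0.0
    while t < t_end:
        r = k_rxn * c2h4 * o3        # bimolecular destruction rate
        c2h4 += -r * dt
        o3 += (k_prod - r) * dt
        t += dt
    return c2h4, o3

c2h4_final, o3_final = simulate()
```

Even this toy reproduces the qualitative behavior described in the abstract: ethylene decays monotonically while ozone approaches a quasi-steady level set by the production/consumption balance.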
Delayed Gamma-Ray Spectroscopy for Non-Destructive Assay of Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludewigt, Bernhard; Mozin, Vladimir; Campbell, Luke
2015-06-01
High-energy, beta-delayed gamma-ray spectroscopy is a potential non-destructive assay technique for the independent verification of declared quantities of special nuclear materials at key stages of the fuel cycle and for directly assaying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Other potential applications include determination of MOX fuel composition, characterization of nuclear waste packages, and challenges in homeland security and arms control verification. Experimental measurements were performed to evaluate fission fragment yields, to test methods for determining isotopic fractions, and to benchmark the modeling code package. Experimental measurement campaigns were carried out at the IAC using a photo-neutron source and at OSU using a thermal neutron beam from the TRIGA reactor to characterize the emission of high-energy delayed gamma rays from 235U, 239Pu, and 241Pu targets following neutron-induced fission. Data were collected for pure and combined targets for several irradiation/spectroscopy cycle times ranging from 10/10 seconds to 15/30 minutes. The delayed gamma-ray signature of 241Pu, a significant fissile constituent in spent fuel, was measured and compared to 239Pu. The 241Pu/239Pu ratios varied between 0.5 and 1.2 for ten prominent lines in the 2700-3600 keV energy range. Such significant differences in relative peak intensities make it possible to determine the relative fractions of these isotopes in a mixed sample. A method for determining fission product yields by fitting the energy and time dependence of the delayed gamma-ray emission was developed and demonstrated on a limited 235U data set. De-convolution methods for determining fissile fractions were developed and tested on the experimental data. The use of high count-rate LaBr3 detectors was investigated as a potential alternative to HPGe detectors.
Modeling capabilities were added to an existing framework and codes were adapted as needed for analyzing experiments and assessing application-specific assay concepts. A de-convolution analysis of the delayed gamma-ray response spectra modeled for spent fuel assemblies was performed using the same method that was applied to the experimental spectra.
Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K
2013-03-04
The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted: the PTV was divided into head and neck, chest, and pelvic regions, each with a separate plan composed of 2-3 arcs/fields, and multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and the absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
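The 5%/3 mm gamma evaluation used for both the film and EPID comparisons can be sketched in 1-D. This is a simplified global gamma index for illustration, not the commercial software's implementation:

```python
def gamma_pass_rate(ref, evl, spacing_mm=1.0, dd=0.05, dta_mm=3.0):
    # 1-D global gamma index: for each reference point take the minimum over
    # evaluated points of sqrt((dose diff / dd criterion)^2 +
    # (distance / dta criterion)^2); a point passes when that minimum <= 1.
    passed = 0
    for i, r in enumerate(ref):
        gmin = min(
            ((e - r) / (dd * max(ref))) ** 2
            + ((j - i) * spacing_mm / dta_mm) ** 2
            for j, e in enumerate(evl)
        ) ** 0.5
        passed += gmin <= 1.0
    return passed / len(ref)

# Identical toy profiles pass everywhere.
ref = [1.0, 2.0, 3.0, 2.0, 1.0]
rate = gamma_pass_rate(ref, [1.0, 2.0, 3.0, 2.0, 1.0])
```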
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk
2015-12-01
The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide; data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model on the direct path from recovery resilience to productive aging was found to fit.
Is identity per se irrelevant? A contrarian view of self-verification effects.
Gregg, Aiden P
2009-01-01
Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, and thereby hinder therapeutic recovery. Despite the widespread acceptance of SVT, I contend that the evidence for it is weak and circumstantial. In particular, I contend that most or all major findings cited in support of SVT can be more economically explained in terms of raison oblige theory (ROT). ROT posits that people with negative self-views solicit critical feedback, not because they want it, but because their self-view inclines them to regard it as probative, a necessary condition for considering it worth obtaining. Relevant findings are reviewed and reinterpreted with an emphasis on depression, and some new empirical data are reported. (c) 2008 Wiley-Liss, Inc.
Verification in Referral-Based Crowdsourcing
Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.
2012-01-01
Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
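The winning strategy in the Red Balloon Challenge, which the authors note coincides with their optimal compensation scheme, split rewards geometrically along the referral chain. The constants below follow the widely reported MIT scheme (finder gets R, their recruiter R/2, the next R/4, and so on), which keeps the total payout per balloon bounded by twice the finder's reward:

```python
def referral_rewards(chain_length, finder_reward=2000.0):
    # Geometric reward split along the referral chain: the finder receives R,
    # each earlier recruiter half of the next person's reward. The geometric
    # series guarantees the total payout stays below 2 * R per balloon.
    return [finder_reward / (2 ** k) for k in range(chain_length)]

rewards = referral_rewards(5)   # a chain of five participants
total = sum(rewards)
```

The bounded total is what makes the scheme budget-safe regardless of how long referral chains grow.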
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Simulation and Experimental Study on Cavitating Water Jet Nozzle
NASA Astrophysics Data System (ADS)
Zhou, Wei; He, Kai; Cai, Jiannan; Hu, Shaojie; Li, Jiuhua; Du, Ruxu
2017-01-01
Cavitating water jet technology is a new kind of water jet technology with many advantages, such as energy savings, high efficiency and environmental friendliness. Based on numerical simulation and experimental verification, this paper investigates cavitating nozzles, comparing the cleaning ability of the cavitating jet with that of an ordinary jet, and comparing the cavitation effects of cavitating nozzles with different structures.
Three-dimensional surface contouring of macroscopic objects by means of phase-difference images.
Velásquez Prieto, Daniel; Garcia-Sucerquia, Jorge
2006-09-01
We report a technique to determine the 3D contour of objects with dimensions at least four orders of magnitude larger than the illumination optical wavelength. Our proposal is based on the numerical reconstruction of the optical wave field of digitally recorded holograms. The modulo-2π phase map required in any contouring process is obtained by direct subtraction of two phase-contrast images recorded under different illumination angles, creating a phase-difference image of a still object. Obtaining the phase-difference images is only possible by using the capability of numerical reconstruction of the complex optical field provided by digital holography. This unique characteristic leads to a robust, reliable, and fast procedure that requires only two images. A theoretical analysis of the contouring system is shown, with verification by means of numerical and experimental results.
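The modulo-2π phase-difference map described in this abstract can be sketched numerically: given two complex wave fields reconstructed under different illumination angles, the wrapped phase difference is the argument of one field times the conjugate of the other. The fields below are synthetic stand-ins for illustration, not reconstructions from real holograms.

```python
import numpy as np

def phase_difference_map(field_a, field_b):
    """Wrapped (modulo-2pi) phase difference of two reconstructed
    complex optical fields; the result lies in [-pi, pi]."""
    return np.angle(field_a * np.conj(field_b))

# Synthetic example: the same object phase seen under two tilts
x = np.linspace(0.0, 1.0, 256)
obj = np.exp(1j * 6 * np.pi * x**2)        # object-dependent phase
tilt = np.exp(1j * 2 * np.pi * 3 * x)      # extra carrier from the tilt change
dphi = phase_difference_map(obj * tilt, obj)
```

In an actual contouring setup both fields would come from the numerical reconstruction of digitally recorded holograms, and `dphi` would then be unwrapped to recover the surface height.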
Independent control of differently-polarized waves using anisotropic gradient-index metamaterials
Ma, Hui Feng; Wang, Gui Zhen; Jiang, Wei Xiang; Cui, Tie Jun
2014-01-01
We propose a kind of anisotropic gradient-index (GRIN) metamaterial, which can be used to control differently-polarized waves independently. We show that two three-dimensional (3D) planar lenses made of such anisotropic GRIN metamaterials are able to produce arbitrary beam deflections for the vertical (or horizontal) polarization but have no response to the horizontal (or vertical) polarization. The vertically- and horizontally-polarized waves are thus separated and controlled independently, deflecting to arbitrarily different directions by design of the anisotropic GRIN planar lenses. We experimentally verify the lenses using such a special metamaterial, which has both electric and magnetic responses simultaneously to reach approximately equal permittivity and permeability; hence excellent impedance matching is obtained between the GRIN planar lenses and the air. The measurement results demonstrate good performance in the independent control of differently-polarized waves, as observed in the numerical simulations. PMID:25231412
NASA Astrophysics Data System (ADS)
Banerjee, Ipsita
2009-03-01
Knowledge of the pathways governing cellular differentiation to a specific phenotype would enable the generation of desired cell fates by careful alteration of the governing network through manipulation of the cellular environment. With this aim, we have developed a novel method to reconstruct the underlying regulatory architecture of a differentiating cell population from discrete temporal gene expression data. We exploit an inherent feature of biological networks, sparsity, in formulating the network reconstruction problem as a bi-level mixed-integer programming problem. The formulation optimizes the network topology at the upper level and the network connectivity strength at the lower level. The method is first validated on in-silico data before being applied to the complex system of embryonic stem (ES) cell differentiation. This formulation enables efficient identification of the underlying network topology, which could accurately predict steps necessary for directing differentiation to subsequent stages. Concurrent experimental verification demonstrated excellent agreement with model predictions.
NASA Astrophysics Data System (ADS)
Pascuet, M. I.; Castin, N.; Becquart, C. S.; Malerba, L.
2011-05-01
An atomistic kinetic Monte Carlo (AKMC) method has been applied to study the stability and mobility of copper-vacancy clusters in Fe. This information, which cannot be obtained directly from experimental measurements, is needed to parameterise models describing the nanostructure evolution under irradiation of Fe alloys (e.g. model alloys for reactor pressure vessel steels). The physical reliability of the AKMC method has been improved by employing artificial intelligence techniques for the regression of the activation energies required by the model as input. These energies are calculated allowing for the effects of local chemistry and relaxation, using an interatomic potential fitted to reproduce them as accurately as possible and the nudged-elastic-band method. The model validation was based on comparison with available ab initio calculations for verification of the used cohesive model, as well as with other models and theories.
Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan
2016-05-01
In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of the detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken by placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration quality is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of the natural radioactivity levels present (40K, 238U and 232Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
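The full transport simulation in this work was done with MCNP; the toy sketch below only illustrates the bookkeeping behind such an efficiency calibration. The full-energy-peak efficiency is the fraction of emitted photons that both reach the crystal and deposit all their energy; here the two probabilities are hypothetical placeholders, not MCNP outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def peak_efficiency(n_photons, p_hit, p_full_deposit):
    """Toy Monte Carlo: fraction of emitted photons that reach the
    crystal AND deposit their full energy (the photopeak).
    p_hit and p_full_deposit are hypothetical geometry- and
    energy-dependent probabilities, not MCNP results."""
    hits = rng.random(n_photons) < p_hit
    full = rng.random(n_photons) < p_full_deposit
    return np.count_nonzero(hits & full) / n_photons

eff = peak_efficiency(1_000_000, 0.30, 0.25)

# With the efficiency known, a measured net peak area converts to
# activity: A = counts / (eff * gamma_yield * live_time); the 1460.8 keV
# line of 40K has a gamma yield of about 10.66% per decay.
activity = 15000 / (eff * 0.1066 * 3600)
```

A real calibration would tabulate `eff` as a function of photon energy and source geometry, which is exactly what the MCNP model of the detector provides.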
Full-field 3D shape measurement of specular object having discontinuous surfaces
NASA Astrophysics Data System (ADS)
Zhang, Zonghua; Huang, Shujun; Gao, Nan; Gao, Feng; Jiang, Xiangqian
2017-06-01
This paper presents a novel Phase Measuring Deflectometry (PMD) method to measure specular objects having discontinuous surfaces. A mathematical model is established to directly relate the absolute phase to depth, instead of phase to gradient. Based on the model, a hardware measuring system has been set up, consisting of a precise translating stage, a projector, a diffuser and a camera. The stage translates the projector and the diffuser together to a known position during measurement. Using model-based and machine vision methods, system calibration is accomplished to provide the required parameters and conditions. Verification tests are presented to evaluate the effectiveness of the developed system. 3D (three-dimensional) shapes of a concave mirror and a monolithic multi-mirror array having multiple specular surfaces have been measured. Experimental results show that the proposed method can effectively obtain the 3D shape of specular objects having discontinuous surfaces.
The BEFWM system for detection and phase conjugation of a weak laser beam
NASA Astrophysics Data System (ADS)
Khizhnyak, Anatoliy; Markov, Vladimir
2007-09-01
Real environmental conditions, such as atmospheric turbulence and aero-optics effects, make practical implementation of the target-in-the-loop (TIL) algorithm a very difficult task, especially when the system must operate with a signal from the diffuse surface of an image-resolved object. The problem becomes even more complex because, for a remote object, the intensity of the returned signal is extremely low. This presentation discusses the results of an analysis and experimental verification of a thresholdless coherent signal receiving system capable not only of high-sensitivity detection of ultra-weak object-scattered light, but also of its high-gain amplification and phase conjugation. Coherent detection using Brillouin Enhanced Four Wave Mixing (BEFWM) enables retrieval of complete information on the received signal, including accurate measurement of its wavefront. This information can be used for direct real-time control of the adaptive mirror.
Wang, Yanran; Xiao, Gang; Dai, Zhouyun
2017-11-13
Automatic Dependent Surveillance-Broadcast (ADS-B) is the direction of airspace surveillance development, yet research analyzing the benefits of fusing Traffic Collision Avoidance System (TCAS) and ADS-B data is almost absent. This paper proposes an ADS-B minimum system comprising ADS-B In and ADS-B Out. For ADS-B In, a fusion model with a variable sampling Variational Bayesian-Interacting Multiple Model (VSVB-IMM) algorithm is proposed for integrated display, and an airspace traffic situation display is developed using ADS-B information. ADS-B Out includes ADS-B Out transmission based on a simulator platform and an Unmanned Aerial Vehicle (UAV) platform. The paper describes the overall implementation of the ADS-B minimum system, including theoretical model design, simulation-based verification, engineering implementation, and results analysis. Simulation and implementation results show that the fused system performs better than each independent subsystem and can work well in engineering applications.
Fiber-optic evanescent-field sensor for attitude measurement
NASA Astrophysics Data System (ADS)
Liu, Yun; Chen, Shimeng; Liu, Zigeng; Guang, Jianye; Peng, Wei
2017-11-01
We propose a new approach to attitude measurement based on an evanescent-field optical fiber sensing device and demonstrate a liquid pendulum. The device consists of three fiber-optic evanescent-field sensors fabricated from tapered single-mode fibers and immersed in liquid. Three fiber Bragg gratings are used to measure the changes in the evanescent field, and their reflection peaks are monitored in real time as measurement signals. Because every set of reflection responses corresponds to a unique attitude, the attitude of the device can be measured by the three fiber-optic evanescent-field sensors. Through theoretical analysis, computer simulation and experimental verification, regular responses were obtained using this device for attitude measurement. The measurement ranges of the dihedral angle and direction angle are 0°-50° and 0°-360°, respectively. The device is based on a cost-effective power-referenced scheme and can be used in electromagnetic or nuclear radiation environments.
Is there a relationship between curvature and inductance in the Josephson junction?
NASA Astrophysics Data System (ADS)
Dobrowolski, T.; Jarmoliński, A.
2018-03-01
A Josephson junction is a device made of two superconducting electrodes separated by a very thin layer of insulator or normal metal. This relatively simple device has found a variety of technical applications in the form of Superconducting Quantum Interference Devices (SQUIDs) and Single Electron Transistors (SETs). One can expect that in the near future the Josephson junction will find applications in RSFQ (Rapid Single Flux Quantum) digital electronics and, in the more distant future, in the construction of quantum computers. Here we concentrate on the relation between the curvature of the Josephson junction and its inductance. We apply a simple Capacitively Shunted Junction (CSJ) model in order to find the condition that guarantees consistency of this model with predictions based on the Maxwell and London equations with the Ginzburg-Landau current of Cooper pairs. This condition admits direct experimental verification.
NASA Astrophysics Data System (ADS)
Matsuura, Masahiro; Mano, Takaaki; Noda, Takeshi; Shibata, Naokazu; Hotta, Masahiro; Yusa, Go
2018-02-01
Quantum energy teleportation (QET) is a proposed protocol related to the quantum vacuum. The edge channels in a quantum Hall system are well suited for the experimental verification of QET. For this purpose, we examine a charge-density wave packet excited and detected by capacitively coupled front-gate electrodes. We observe the waveform of the charge packet, which is proportional to the time derivative of the applied square voltage wave. Further, we study the transmission and reflection behavior of the charge-density wave packet by applying a voltage to another front-gate electrode to control the path of the edge state. We show that the threshold voltages at which the dominant direction is switched to either transmission or reflection for dense and sparse wave packets differ from the threshold voltage at which the current stops flowing in an equilibrium state.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... Antenna Performance Verification AGENCY: Federal Communications Commission. ACTION: Final rule; correction... as follows: Subpart BB--Disturbance of AM Broadcast Station Antenna Patterns * * * * * Federal...
ERIC Educational Resources Information Center
Deeson, Eric
1971-01-01
Reports a verification that hot water begins to freeze sooner than cooler water. Includes the investigations that led to the conclusions that convection is a major influence, that water content may have some effect, and that the melting of the ice under the container makes no difference to the experimental results. (DS)
Bohata, J; Zvanovec, S; Pesek, P; Korinek, T; Mansour Abadi, M; Ghassemlooy, Z
2016-03-10
This paper describes the experimental verification of long-term evolution radio over fiber (RoF) and radio over free space optics (RoFSO) systems using dual-polarization signals for cloud radio access network applications, determining the specific utilization limits. A number of free space optics configurations are proposed and investigated under different atmospheric turbulence regimes in order to recommend the best setup configuration. We show that the performance of the proposed link, based on the combination of RoF and RoFSO for 64 QAM at 2.6 GHz, is more affected by turbulence, with a measured error vector magnitude difference of 5.5%. It is further demonstrated that the proposed systems can offer higher noise immunity under particular scenarios, with a signal-to-noise ratio reliability limit of 5 dB in the radio frequency domain for RoF and 19.3 dB in the optical domain for a combination of RoF and RoFSO links.
NASA Astrophysics Data System (ADS)
Sakamoto, Yasuaki; Kashiwagi, Takayuki; Hasegawa, Hitoshi; Sasakawa, Takashi; Fujii, Nobuo
This paper describes the design considerations and experimental verification of an LIM rail brake armature. In order to generate power and maximize the braking force density despite the limited area between the armature and the rail and the limited space available for installation, we studied a design method suitable for an LIM rail brake armature and considered the adoption of a ring winding structure. To examine the validity of the proposed design method, we developed a prototype ring winding armature for the rail brakes and examined its electromagnetic characteristics in a dynamic test system with roller rigs. Through repeated tests, we confirmed that unnecessary magnetic field components, which were expected to be present under high-speed running conditions or when a ring winding armature was used, were not present. Further, the necessary magnetic field component and braking force attained the desired values. These studies have helped us to develop a basic design method suitable for LIM rail brake armatures.
NASA Astrophysics Data System (ADS)
Konishi, Takeshi; Hase, Shin-Ichi; Nakamichi, Yoshinobu; Nara, Hidetaka; Uemura, Tadashi
Interest has been shown in the concept of an energy storage system aimed at leveling load and improving energy efficiency by charging during vehicle regeneration and discharging during running. Such a system represents an efficient countermeasure against pantograph-point voltage drop, power load fluctuation and regenerative power loss. We selected an EDLC as the energy storage medium and a step-up/step-down chopper as the power converter to exchange power between the storage medium and the overhead lines. Basic verification was conducted using a mini-model for DC 400 V, demonstrating characteristics suitable for use as an energy storage system. Based on these results, an energy storage system was built for DC 600 V and a verification test conducted in conjunction with the Enoshima Electric Railway Co. Ltd. This paper presents an experimental analysis of voltage-drop compensation in a DC electrified railway and some discussion based on the test.
JPL control/structure interaction test bed real-time control computer architecture
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1989-01-01
The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts - such as active structures - and new tools - such as a combined structure and control optimization algorithm - and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computational capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.
Chiang, Chia-Wen; Wang, Yong; Sun, Peng; Lin, Tsen-Hsuan; Trinkaus, Kathryn; Cross, Anne H.; Song, Sheng-Kwei
2014-01-01
The effect of extra-fiber structural and pathological components confounding diffusion tensor imaging (DTI) computation was quantitatively investigated using data generated by both Monte-Carlo simulations and tissue phantoms. Increased extent of vasogenic edema, by addition of various amount of gel to fixed normal mouse trigeminal nerves or by increasing non-restricted isotropic diffusion tensor components in Monte-Carlo simulations, significantly decreased fractional anisotropy (FA), increased radial diffusivity, while less significantly increased axial diffusivity derived by DTI. Increased cellularity, mimicked by graded increase of the restricted isotropic diffusion tensor component in Monte-Carlo simulations, significantly decreased FA and axial diffusivity with limited impact on radial diffusivity derived by DTI. The MC simulation and tissue phantom data were also analyzed by the recently developed diffusion basis spectrum imaging (DBSI) to simultaneously distinguish and quantify the axon/myelin integrity and extra-fiber diffusion components. Results showed that increased cellularity or vasogenic edema did not affect the DBSI-derived fiber FA, axial or radial diffusivity. Importantly, the extent of extra-fiber cellularity and edema estimated by DBSI correlated with experimentally added gel and Monte-Carlo simulations. We also examined the feasibility of applying 25-direction diffusion encoding scheme for DBSI analysis on coherent white matter tracts. Results from both phantom experiments and simulations suggested that the 25-direction diffusion scheme provided comparable DBSI estimation of both fiber diffusion parameters and extra-fiber cellularity/edema extent as those by 99-direction scheme. 
An in vivo 25-direction DBSI analysis was performed on experimental autoimmune encephalomyelitis (EAE, an animal model of human multiple sclerosis) optic nerve as an example to examine the validity of derived DBSI parameters with post-imaging immunohistochemistry verification. Results support that in vivo DBSI using 25-direction diffusion scheme correctly reflect the underlying axonal injury, demyelination, and inflammation of optic nerves in EAE mice. PMID:25017446
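The DTI metrics discussed above (FA, axial and radial diffusivity) follow standard formulas from the diffusion tensor's three eigenvalues; the sketch below uses those textbook definitions with illustrative eigenvalues, not values from this study.

```python
import numpy as np

def dti_metrics(eigvals):
    """Fractional anisotropy (FA), axial and radial diffusivity from
    the three eigenvalues of a diffusion tensor (sorted descending)."""
    l = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    md = l.mean()                                         # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((l - md) ** 2) / np.sum(l ** 2))
    axial = l[0]                                          # lambda_1
    radial = 0.5 * (l[1] + l[2])                          # (lambda_2 + lambda_3) / 2
    return fa, axial, radial

# Illustrative eigenvalues for a coherent fiber (um^2/ms, hypothetical)
fa, ad, rd = dti_metrics([1.7, 0.3, 0.3])
```

These definitions make the confounds in the abstract concrete: adding a non-restricted isotropic component pushes the eigenvalues toward each other, lowering FA and raising radial diffusivity, while DBSI separates that component out before computing the fiber's own tensor metrics.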
NDEC: A NEA platform for nuclear data testing, verification and benchmarking
NASA Astrophysics Data System (ADS)
Díez, C. J.; Michel-Sendis, F.; Cabellos, O.; Bossant, M.; Soppera, N.
2017-09-01
The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. This diagnosis is based on the results of different computational codes and routines which carry out the aforementioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, including them in its working scheme. This paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.
Status on the Verification of Combustion Stability for the J-2X Engine Thrust Chamber Assembly
NASA Technical Reports Server (NTRS)
Casiano, Matthew; Hinerman, Tim; Kenny, R. Jeremy; Hulka, Jim; Barnett, Greg; Dodd, Fred; Martin, Tom
2013-01-01
Development is underway of the J-2X engine, a liquid oxygen/liquid hydrogen rocket engine for use on the Space Launch System. Engine E10001 began hot-fire testing in June 2011 and testing will continue with subsequent engines. The J-2X engine main combustion chamber contains both acoustic cavities and baffles. These stability aids are intended to dampen the acoustics in the main combustion chamber. Verification of the engine thrust chamber stability is determined primarily by examining experimental data using a dynamic stability rating technique; however, additional requirements were included to guard against any spontaneous instability or rough combustion. Startup and shutdown chug oscillations are also characterized for this engine. This paper details the stability requirements and verification, including low- and high-frequency dynamics, a discussion of sensor selection and sensor port dynamics, and the process developed to assess combustion stability. A status on the stability results is also provided and discussed.
Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP
NASA Astrophysics Data System (ADS)
Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio
1988-09-01
This report presents the verification results for the combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP, which has been utilized for the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature new frontier technologies. The verification of the code was done through comparison between analytical results and experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks.
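Flow distribution among parallel coolant channels, one of the quantities such a code computes, can be illustrated with a lumped model: channels in parallel share a single pressure drop ΔP = k_i·Q_i², so each channel's flow scales as 1/√k_i. The loss coefficients below are hypothetical; FLOWNET/TRUMP itself solves a full flow network, not this one-line split.

```python
import numpy as np

def channel_flows(total_flow, loss_coeffs):
    """Split a total coolant flow among parallel channels that share one
    pressure drop, dP = k_i * Q_i**2, so Q_i is proportional to
    1/sqrt(k_i). loss_coeffs are hypothetical lumped loss coefficients."""
    k = np.asarray(loss_coeffs, dtype=float)
    weights = 1.0 / np.sqrt(k)
    return total_flow * weights / weights.sum()

# Three channels; the middle one is more resistive (e.g. partially blocked)
q = channel_flows(1.0, [1.0, 4.0, 1.0])
```

With these coefficients the middle channel receives half the flow of its neighbors, and the products k_i·Q_i² are equal, confirming a common pressure drop: this is the mechanism behind the blockage-accident fuel temperature estimates mentioned above.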
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1992-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. One therefore needs to be able to differentiate low from high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered; and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
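As an illustration of the general classification idea (not the authors' multivariate stochastic models), the sketch below assigns components to low/high fault density classes from two hypothetical metrics using a plain logistic model fitted by gradient descent; all data values are invented.

```python
import numpy as np

# Hypothetical metrics per component: (lines of code, cyclomatic complexity)
X = np.array([[120, 4], [300, 9], [80, 3], [450, 15],
              [200, 6], [520, 18], [90, 2], [380, 12]], dtype=float)
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])   # 1 = high fault density class

# Standardize the metrics, then fit a logistic model by gradient descent
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.hstack([np.ones((len(Xs), 1)), Xs])   # intercept column
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))         # predicted P(high)
    w -= 0.5 * Xb.T @ (p - y) / len(y)        # negative log-likelihood gradient

pred = (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)
accuracy = float((pred == y).mean())
```

Once a component is predicted to fall in the high class, testing and verification effort would be concentrated on it, which is the strategy the abstract argues for.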
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarchalski, M.; Pytel, K.; Wroblewska, M.
2015-07-01
Precise computational determination of nuclear heating, which consists predominantly of gamma heating (more than 80%), is one of the challenges in material testing reactor exploitation. Due to the sophisticated construction and conditions of the experimental programs planned in JHR, it became essential to use the most accurate and precise gamma heating model available. Before the JHR starts to operate, gamma heating evaluation methods need to be developed and qualified in other experimental reactor facilities. This is done inter alia using the OSIRIS, MINERVE and EOLE research reactors in France. Furthermore, MARIA, the Polish material testing reactor, has been chosen to contribute to the qualification of gamma heating calculation schemes and tools. This reactor has some characteristics close to those of JHR (beryllium usage, fuel element geometry). To evaluate gamma heating in the JHR and MARIA reactors, both simulation tools and an experimental program have been developed. For gamma heating simulation, a new calculation scheme and gamma heating model of MARIA have been developed using the TRIPOLI4 and APOLLO2 codes. Calculation outcomes have been verified by comparison with experimental measurements in the MARIA reactor. To obtain more precise calculation results, the model of MARIA in TRIPOLI4 was built using the whole geometry of the core. This was done for the first time in the history of the MARIA reactor and was complex due to the cut-cone shape of all its elements. The material composition of burnt fuel elements was taken from APOLLO2 calculations. An experiment for nuclear heating measurements and calculation verification was performed in September 2014. It involved neutron, photon and nuclear heating measurements at selected locations in the MARIA reactor using, in particular, a Rh SPND, an Ag SPND, an ionization chamber (all three from CEA), the KAROLINA calorimeter (NCBJ) and a gamma thermometer (CEA/SCK CEN). Measurements were done at forty points using four channels.
The maximal nuclear heating evaluated from measurements is of the order of 2.5 W/g at half of the possible MARIA power, 15 MW. The approach and the detailed program for experimental verification of calculations will be presented. The following points will be discussed: development of a gamma heating model of the MARIA reactor with TRIPOLI4 (coupled neutron-photon mode) and an APOLLO2 model taking into account key parameters such as configuration of the core, experimental loading, control rod location, reactor power and fuel depletion; design of specific measurement tools for the MARIA experiments, including a new single-cell calorimeter called the KAROLINA calorimeter; a description of the MARIA experimental program and a preliminary analysis of results; and a comparison of calculations for the JHR and MARIA cores with experimental verification analysis, calculation behavior and n-γ 'environments'. (authors)
Heat transfer direction dependence of heat transfer coefficients in annuli
NASA Astrophysics Data System (ADS)
Prinsloo, Francois P. A.; Dirker, Jaco; Meyer, Josua P.
2018-04-01
In this experimental study the heat transfer phenomena in concentric annuli of tube-in-tube heat exchangers were considered at different annular Reynolds numbers, annular diameter ratios, and inlet fluid temperatures, using water. Turbulent flow with Reynolds numbers ranging from 15,000 to 45,000, based on the average bulk fluid temperature, was tested at annular diameter ratios of 0.327, 0.386, 0.409 and 0.483, with hydraulic diameters of 17.00, 22.98, 20.20 and 26.18 mm respectively. Both heated and cooled annuli were investigated by conducting tests at inlet temperatures between 10 °C and 30 °C for heating cases, and between 30 °C and 50 °C for cooling cases. Of special interest was the direct measurement of local wall temperatures on the heat transfer surface, which is often difficult to obtain and elusive in data sets. Continuous verification and re-evaluation of temperature measurements were performed via in-situ calibration. It is shown that the inlet fluid temperature and the heat transfer direction play significant roles in the magnitude of the heat transfer coefficient. A new adjusted Colburn j-factor definition is presented to describe the heating and cooling cases and is used to correlate the 894 test cases considered in this study.
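The quantities this abstract correlates rest on standard definitions: the hydraulic diameter of a concentric annulus, the Reynolds number based on it, and the classical Colburn j-factor, j = Nu/(Re·Pr^(1/3)). The sketch below uses those textbook forms with approximate water properties; the paper's adjusted j-factor definition is not reproduced, and the numeric values are illustrative.

```python
def annulus_hydraulic_diameter(d_outer, d_inner):
    """Hydraulic diameter of a concentric annulus, Dh = 4A/P, which
    reduces to outer-wall diameter minus inner-wall diameter."""
    return d_outer - d_inner

def reynolds(rho, velocity, d_h, mu):
    """Annular Reynolds number based on the hydraulic diameter."""
    return rho * velocity * d_h / mu

def colburn_j(nu, re, pr):
    """Classical Colburn j-factor, j = Nu / (Re * Pr**(1/3))."""
    return nu / (re * pr ** (1.0 / 3.0))

# Illustrative case: water near 30 degC (approximate properties) in the
# 26.18 mm hydraulic-diameter annulus quoted in the abstract
rho, mu, pr = 995.0, 0.0008, 5.4     # kg/m^3, Pa.s, dimensionless
d_h = 0.02618                        # m
re = reynolds(rho, 1.0, d_h, mu)     # falls inside the 15,000-45,000 range
```

Because water's viscosity and Prandtl number change strongly between 10 °C and 50 °C, evaluating these groups at the bulk temperature is what makes the heating and cooling cases differ, motivating the adjusted j-factor the study proposes.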
Experimental verification of rank 1 chaos in switch-controlled Chua circuit.
Oksasoglu, Ali; Ozoguz, Serdar; Demirkol, Ahmet S; Akgul, Tayfun; Wang, Qiudong
2009-03-01
In this paper, we provide the first experimental proof for the existence of rank 1 chaos in the switch-controlled Chua circuit by following a step-by-step procedure given by the theory of rank 1 maps. At the center of this procedure is a periodically kicked limit cycle obtained from the unforced system. Then, this limit cycle is subjected to periodic kicks by adding externally controlled switches to the original circuit. Both the smooth nonlinearity and the piecewise linear cases are considered in this experimental investigation. Experimental results are found to be in concordance with the conclusions of the theory.
Escobedo, Patricia; Cruz, Tess Boley; Tsai, Kai-Ya; Allem, Jon-Patrick; Soto, Daniel W; Kirkpatrick, Matthew G; Pattarroyo, Monica; Unger, Jennifer B
2017-09-11
Limited information exists about the strategies and methods used on brand marketing websites to transmit pro-tobacco messages to tobacco users and potential users. This study compared age verification methods, themes, interactive activities and links to social media across tobacco brand websites. The study examined 12 tobacco brand websites representing four tobacco product categories: cigarettes, cigar/cigarillos, smokeless tobacco, and e-cigarettes. Website content was analyzed by tobacco product category, and data from all website visits (n = 699) were analyzed. Adult smokers (n = 32) coded websites during a one-year period, indicating whether or not they observed any of 53 marketing themes, seven interactive activities, or five external links to social media sites. Most (58%) websites required online registration before entering; however, e-cigarette websites used click-through age verification. Compared to cigarette sites, cigar/cigarillo sites were more likely to feature themes related to a "party" lifestyle, and e-cigarette websites were much more likely to feature themes related to harm reduction. Cigarette sites featured greater levels of interactive content than sites for other tobacco products. Compared to cigarette sites, cigar/cigarillo sites were more likely to feature activities related to events and music, and both cigar and e-cigarette sites were more likely to direct visitors to external social media sites. These marketing methods and strategies normalize tobacco use by presenting website visitors with positive themes combined with interactive content, and they are an area for future research. Moreover, all tobacco products under federal regulatory authority should be required to use more stringent age verification gates.
Findings indicate the Food and Drug Administration (FDA) should require that brand websites of all tobacco products under its regulatory authority use more stringent age verification gates, requiring all visitors to be at least 18 years of age and to register online prior to entry. This is important given that marketing strategies may encourage experimentation with tobacco or deter quit attempts among website visitors. Future research should examine the use of interactive activities and social media on a wide variety of tobacco brand websites, as interactive content is associated with more active information processing. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOT National Transportation Integrated Search
2000-06-01
In 1997, a load rating of an historic reinforced concrete bridge in Oregon, Horsetail Creek Bridge, indicated substandard shear and moment capacities of the beams. As a result, the Bridge was strengthened with fiber reinforced polymer composites as...
NASA Astrophysics Data System (ADS)
Müller, A.; Urich, D.; Kreck, G.; Metzmacher, M.; Lindner, R.
2018-04-01
The presentation will cover results from an ESA supported investigation to collect lessons learned for mechanism assembly with the focus on quality and contamination requirements verification in exploration projects such as ExoMars.
Rapid toxicity technologies can detect certain toxins, and testing can determine their susceptibility to interfering chemicals in a controlled experimental matrix. Rapid toxicity technologies do not identify or determine the concentrations of specific contaminants, but s...
Verification and Validation of a Three-Dimensional Generalized Composite Material Model
NASA Technical Reports Server (NTRS)
Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther
2014-01-01
A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800- F3900 fiber/resin composite material.
Large - scale Rectangular Ruler Automated Verification Device
NASA Astrophysics Data System (ADS)
Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie
2018-03-01
This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive unit, the fixture, and the wheels. The control system design covers hardware and software: the hardware is based on a single-chip microcontroller, and the software implements the photoelectric autocollimation measurement and the automatic data acquisition process. The device can acquire perpendicularity measurement data automatically. Its reliability is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.
A fingerprint key binding algorithm based on vector quantization and error correction
NASA Astrophysics Data System (ADS)
Li, Liang; Wang, Qian; Lv, Ke; He, Ning
2012-04-01
In recent years, many researchers have studied the seamless combination of cryptosystems with biometric technologies, e.g., fingerprint recognition. In this paper, we propose an algorithm for binding a cryptographic key to a fingerprint template, so that the key is protected and accessed through fingerprint verification. To accommodate the intrinsic fuzziness of variant fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template before binding it with the key, after fingerprint registration and extraction of the global ridge pattern. The key itself is secure because only its hash value is stored, and the key is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our approach.
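A minimal sketch of the key-binding idea described in the abstract, assuming a fuzzy-commitment-style construction: the template is vector-quantized, the key is hidden by XOR with the quantized template, and only a hash of the key is stored. The feature values and quantization step below are hypothetical, and the coarse quantization bins stand in for the error-correction coding of the actual algorithm.

```python
import hashlib
import secrets

def quantize(features, step=8):
    """Vector quantization stand-in: map each raw feature to its bin index.
    The step size is a hypothetical choice; coarse bins absorb small sensor
    noise, standing in for the error-correction step of the real algorithm."""
    return bytes((f // step) % 256 for f in features)

def bind(features, key):
    """Bind the key to the template: store an XOR mask and a hash of the key.
    Neither the key nor the raw template is stored in the clear."""
    template = quantize(features)
    mask = bytes(t ^ k for t, k in zip(template, key))
    return mask, hashlib.sha256(key).hexdigest()

def release(features, mask, key_hash):
    """Re-derive the key from a fresh fingerprint sample; release it only if
    the stored hash confirms that verification succeeded."""
    template = quantize(features)
    candidate = bytes(t ^ m for t, m in zip(template, mask))
    return candidate if hashlib.sha256(candidate).hexdigest() == key_hash else None

# enrollment: hypothetical 16-dimensional feature vector and a random 128-bit key
enrolled = [37, 120, 66, 200, 14, 90, 180, 44, 73, 129, 250, 5, 88, 160, 33, 210]
key = secrets.token_bytes(16)
mask, key_hash = bind(enrolled, key)

# a slightly noisy re-presentation of the same finger still falls in the same bins
noisy = [f + 1 for f in enrolled]
```

A genuine sample releases the original key; an impostor sample yields a candidate whose hash does not match the stored value, so nothing is released.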
NASA Technical Reports Server (NTRS)
Kascak, Peter E.; Kenny, Barbara H.; Dever, Timothy P.; Santiago, Walter; Jansen, Ralph H.
2001-01-01
An experimental flywheel energy storage system is described. This system is being used to develop a flywheel based replacement for the batteries on the International Space Station (ISS). Motor control algorithms which allow the flywheel to interface with a simplified model of the ISS power bus, and function similarly to the existing ISS battery system, are described. Results of controller experimental verification on a 300 W-hr flywheel are presented.
NASA Astrophysics Data System (ADS)
McIntyre, Gregory; Neureuther, Andrew; Slonaker, Steve; Vellanki, Venu; Reynolds, Patrick
2006-03-01
The initial experimental verification of a polarization monitoring technique is presented. A series of phase shifting mask patterns produce polarization dependent signals in photoresist and are capable of monitoring the Stokes parameters of any arbitrary illumination scheme. Experiments on two test reticles have been conducted. The first reticle consisted of a series of radial phase gratings (RPG) and employed special apertures to select particular illumination angles. Measurement sensitivities of about 0.3 percent of the clear field per percent change in polarization state were observed. The second test reticle employed the more sensitive proximity effect polarization analyzers (PEPA), a more robust experimental setup, and a backside pinhole layer for illumination angle selection and to enable characterization of the full illuminator. Despite an initial complication with the backside pinhole alignment, the results correlate with theory. Theory suggests that, once the pinhole alignment is corrected in the near future, the second reticle should achieve a measurement sensitivity of about 1 percent of the clear field per percent change in polarization state. This corresponds to a measurement of the Stokes parameters after test mask calibration, to within about 0.02 to 0.03. Various potential improvements to the design, fabrication of the mask, and experimental setup are discussed. Additionally, to decrease measurement time, a design modification and double exposure technique is proposed to enable electrical detection of the measurement signal.
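The Stokes parameters that the monitor targets follow the textbook six-intensity definition; the sketch below uses those standard definitions only, not the RPG/PEPA calibration procedure described in the abstract.

```python
def stokes_from_intensities(i_h, i_v, i_45, i_135, i_r, i_l):
    """Textbook six-measurement Stokes parameters: intensities through
    horizontal, vertical, +45 deg and 135 deg linear analyzers, and
    right/left circular analyzers."""
    s0 = i_h + i_v
    s1 = i_h - i_v
    s2 = i_45 - i_135
    s3 = i_r - i_l
    return s0, s1, s2, s3

def degree_of_polarization(s0, s1, s2, s3):
    """Fraction of the beam that is polarized: 0 (unpolarized) to 1 (fully)."""
    return (s1 ** 2 + s2 ** 2 + s3 ** 2) ** 0.5 / s0

# fully horizontally polarized light: S = (1, 1, 0, 0)
s = stokes_from_intensities(1.0, 0.0, 0.5, 0.5, 0.5, 0.5)
```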
Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda
2003-01-01
Efforts to prohibit the sale of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers, and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.
NASA Astrophysics Data System (ADS)
Duparré, Jacques; Wippermann, Frank; Dannberg, Peter; Schreiber, Peter; Bräuer, Andreas; Völkel, Reinhard; Scharf, Toralf
2005-09-01
Two novel objective types based on artificial compound eyes are examined. Both imaging systems are well suited to fabrication with microoptics technology because of the small lens sags required. The apposition optics combines a microlens array (MLA) with a photodetector array of different pitch in its focal plane; image reconstruction is based on moiré magnification. Several generations of demonstrators of this objective type were manufactured by photolithographic processes, including a system with opaque walls between adjacent channels and an objective applied directly onto a CMOS detector array. The cluster eye approach, which draws on both superposition compound eyes and the vision system of jumping spiders, produces a regular image. Here, three microlens arrays of different pitch, including a field lens, form arrays of Keplerian microtelescopes with tilted optical axes. The microlens arrays of this demonstrator are also fabricated using microoptics technology, and aperture arrays are applied. The lens arrays are then stacked into the overall microoptical system on wafer scale. Both fabricated types of artificial compound eye imaging systems are experimentally characterized with respect to resolution, sensitivity, and crosstalk between adjacent channels. Captured images are presented.
MHD Simulations of Plasma Dynamics with Non-Axisymmetric Boundaries
NASA Astrophysics Data System (ADS)
Hansen, Chris; Levesque, Jeffrey; Morgan, Kyle; Jarboe, Thomas
2015-11-01
The arbitrary geometry, 3D extended MHD code PSI-TET is applied to linear and non-linear simulations of MCF plasmas with non-axisymmetric boundaries. Progress and results from simulations on two experiments will be presented: 1) Detailed validation studies of the HIT-SI experiment with self-consistent modeling of plasma dynamics in the helicity injectors. Results will be compared to experimental data and NIMROD simulations that model the effect of the helicity injectors through boundary conditions on an axisymmetric domain. 2) Linear studies of HBT-EP with different wall configurations focusing on toroidal asymmetries in the adjustable conducting wall. HBT-EP studies the effect of active/passive stabilization with an adjustable ferritic wall. Results from linear verification and benchmark studies of ideal mode growth with and without toroidal asymmetries will be presented and compared to DCON predictions. Simulations of detailed experimental geometries are enabled by use of the PSI-TET code, which employs a high order finite element method on unstructured tetrahedral grids that are generated directly from CAD models. Further development of PSI-TET will also be presented including work to support resistive wall regions within extended MHD simulations. Work supported by DoE.
Development of eddy current probe for fiber orientation assessment in carbon fiber composites
NASA Astrophysics Data System (ADS)
Wincheski, Russell A.; Zhao, Selina
2018-04-01
Measurement of the fiber orientation in a carbon fiber composite material is crucial in understanding the load carrying capability of the structure. As manufacturing conditions including resin flow and molding pressures can alter fiber orientation, verification of the as-designed fiber layup is necessary to ensure optimal performance of the structure. In this work, the development of an eddy current probe and data processing technique for analysis of fiber orientation in carbon fiber composites is presented. A proposed directional eddy current probe is modeled and its response to an anisotropic multi-layer conductor simulated. The modeling results are then used to finalize specifications of the eddy current probe. Experimental testing of the fabricated probe is presented for several samples including a truncated pyramid part with complex fiber orientation draped to the geometry for resin transfer molding. The inductively coupled single sided measurement enables fiber orientation characterization through the thickness of the part. The fast and cost-effective technique can be applied as a spot check or as a surface map of the fiber orientations across the structure. This paper will detail the results of the probe design, computer simulations, and experimental results.
An Investigation into Solution Verification for CFD-DEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fullmer, William D.; Musser, Jordan
This report presents a study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically the National Energy Technology Laboratory's (NETL) open source MFiX code (MFiX-DEM) with a diffusion-based particle-to-continuum filtering scheme. In particular, this study focused on determining whether the numerical method has a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate "grid-free" solution in the limit of infinite resolution. The results show that the diffusion-based scheme does yield a converging solution. However, the convergence is more complicated than that encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general, as the difference between it and the reference solution largely determines the numerical uncertainty.
By testing different randomized particle configurations of the same general problem (for the fictitious case) or different instances of freezing a transient simulation, the numerical uncertainties appeared to be on the same order of magnitude as ensemble or time-averaging uncertainties. By testing different drag laws, almost all cases studied showed that model form uncertainty in this one very important closure relation was larger than the numerical uncertainty, at least with a reasonable CFD grid of roughly five particle diameters. In this study, the diffusion width (filtering length scale) was mostly set at a constant of six particle diameters. A few exploratory tests showed that similar convergence behavior was observed for diffusion widths greater than approximately two particle diameters. However, this subject was not investigated in great detail because determining an appropriate filter size is really a validation question, which must be settled by comparison to experimental or highly accurate numerical data. Future studies are being considered targeting solution verification of transient simulations as well as validation of the filter size with direct numerical simulation data.
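The regression-based extrapolation to a grid-free solution can be sketched as follows, assuming a power-law error model p(h) ≈ p_inf + C·h^q fitted by least squares over four or more grid spacings. The trial-order grid search is an illustrative choice, not NETL's exact procedure.

```python
def fit_linear(x, y):
    """Two-parameter least squares y ≈ a + b*x via the 2x2 normal equations."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    det = n * sxx - sx * sx
    a = (sxx * sy - sx * sxy) / det
    b = (n * sxy - sx * sy) / det
    return a, b

def extrapolate_grid_free(h, p):
    """Fit p(h) ≈ p_inf + C*h**q over a grid of trial convergence orders q and
    keep the triple with the smallest residual; p_inf approximates the
    grid-free solution in the limit h -> 0."""
    best = None
    for i in range(31):
        q = round(0.5 + 0.1 * i, 1)          # trial orders 0.5, 0.6, ..., 3.5
        x = [v ** q for v in h]
        p_inf, c = fit_linear(x, p)
        res = sum((p_inf + c * xi - pi) ** 2 for xi, pi in zip(x, p))
        if best is None or res < best[0]:
            best = (res, p_inf, c, q)
    return best[1], best[2], best[3]

# four synthetic grid spacings with exact p_inf = 100 and a second-order error term
h = [4.0, 2.0, 1.0, 0.5]
p = [100.0 + 3.0 * v ** 2 for v in h]
p_inf, c, q = extrapolate_grid_free(h, p)
```

With four resolutions the two-parameter fit at each trial order is over-determined, which is why the abstract recommends at least four grids before extrapolating.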
The 2014 Sandia Verification and Validation Challenge: Problem statement
Hu, Kenneth; Orient, George
2016-01-18
This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.
Estimation of Nitrogen Vertical Distribution by Bi-Directional Canopy Reflectance in Winter Wheat
Huang, Wenjiang; Yang, Qinying; Pu, Ruiliang; Yang, Shaoyuan
2014-01-01
Timely measurement of vertical foliage nitrogen distribution is critical for increasing crop yield and reducing environmental impact. In this study, a novel method with partial least square regression (PLSR) and vegetation indices was developed to determine optimal models for extracting vertical foliage nitrogen distribution of winter wheat by using bi-directional reflectance distribution function (BRDF) data. The BRDF data were collected from ground-based hyperspectral reflectance measurements recorded at the Xiaotangshan Precision Agriculture Experimental Base in 2003, 2004 and 2007. The view zenith angles (1) at nadir, 40° and 50°; (2) at nadir, 30° and 40°; and (3) at nadir, 20° and 30° were selected as optical view angles to estimate foliage nitrogen density (FND) at an upper, middle and bottom layer, respectively. For each layer, three optimal PLSR analysis models with FND as a dependent variable and two vegetation indices (nitrogen reflectance index (NRI), normalized pigment chlorophyll index (NPCI) or a combination of NRI and NPCI) at corresponding angles as explanatory variables were established. The experimental results from an independent model verification demonstrated that the PLSR analysis models with the combination of NRI and NPCI as the explanatory variables were the most accurate in estimating FND for each layer. The coefficients of determination (R2) of this model between upper layer-, middle layer- and bottom layer-derived and laboratory-measured foliage nitrogen density were 0.7335, 0.7336, 0.6746, respectively. PMID:25353983
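With only two explanatory indices per layer, the regression step can be sketched with an ordinary least-squares fit (PLSR that retains all components reduces to OLS for two predictors). The NRI/NPCI values below are synthetic, not the Xiaotangshan data.

```python
def ols_two_predictors(x1, x2, y):
    """Ordinary least squares for y ≈ b0 + b1*x1 + b2*x2 via the 3x3 normal
    equations, solved by Gaussian elimination with partial pivoting."""
    n = len(y)
    cols = [[1.0] * n, list(x1), list(x2)]
    a = [[sum(u * v for u, v in zip(cols[i], cols[j])) for j in range(3)]
         for i in range(3)]
    b = [sum(u * v for u, v in zip(cols[i], y)) for i in range(3)]
    for k in range(3):
        piv = max(range(k, 3), key=lambda r: abs(a[r][k]))
        a[k], a[piv] = a[piv], a[k]
        b[k], b[piv] = b[piv], b[k]
        for r in range(k + 1, 3):
            f = a[r][k] / a[k][k]
            for c in range(k, 3):
                a[r][c] -= f * a[k][c]
            b[r] -= f * b[k]
    beta = [0.0, 0.0, 0.0]
    for k in range(2, -1, -1):
        beta[k] = (b[k] - sum(a[k][c] * beta[c] for c in range(k + 1, 3))) / a[k][k]
    return beta

# synthetic index values obeying a known relation FND = 0.2 + 1.5*NRI - 0.8*NPCI
nri = [0.10, 0.25, 0.40, 0.55, 0.70, 0.85]
npci = [0.30, 0.20, 0.45, 0.15, 0.60, 0.35]
fnd = [0.2 + 1.5 * u - 0.8 * v for u, v in zip(nri, npci)]
beta = ols_two_predictors(nri, npci, fnd)
```

In the study itself, one such model is fitted per canopy layer, with the indices computed from reflectance at the layer's optimal view zenith angles.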
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Shiue, J.; Oneill, P.; Wang, J.; Fuchs, J.; Owe, M.
1984-01-01
The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three beam push broom L band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras and an onboard navigational instrument. Ten flights were made of agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly overhead of the surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chemerisov, Sergey; Gromov, Roman; Makarashvili, Vakho
Argonne is assisting SHINE Medical Technologies in developing SHINE, a system for producing fission-product 99Mo using a D/T accelerator to induce fission in a non-critical target solution of aqueous uranyl sulfate. We have developed an experimental setup for studying thermal-hydraulics and bubble formation in the uranyl sulfate solution to simulate conditions expected in the SHINE target solution during irradiation. A direct electron beam from the linac accelerator will be used to irradiate a 20 L solution (a sector of the solution vessel). Because the solution will undergo radiolytic decomposition, we will be able to study bubble formation and dynamics and the effects of convection and temperature on bubble behavior. These experiments will serve as a verification/validation tool for the thermal-hydraulic model. Utilization of the direct electron beam for irradiation allows homogeneous heating of a large solution volume and simplifies observation of the bubble dynamics simultaneously with thermal-hydraulic data collection, which will complement data collected during operation of the miniSHINE experiment. Irradiation will be conducted using a 30-40 MeV electron beam from the high-power linac accelerator. The total electron-beam power will be 20 kW, which will yield a power density on the order of 1 kW/L. The solution volume will be cooled on the front and back surfaces and the central tube to mimic the geometry of the proposed SHINE solution vessel. Also, multiple thermocouples will be inserted into the solution vessel to map thermal profiles. The experimental design is now complete, and installation and testing are in progress.
An Educational Laboratory for Digital Control and Rapid Prototyping of Power Electronic Circuits
ERIC Educational Resources Information Center
Choi, Sanghun; Saeedifard, M.
2012-01-01
This paper describes a new educational power electronics laboratory that was developed primarily to reinforce experimentally the fundamental concepts presented in a power electronics course. The developed laboratory combines theoretical design, simulation studies, digital control, fabrication, and verification of power-electronic circuits based on…
2015-04-01
DOT National Transportation Integrated Search
1978-03-01
This report deals with the selection of a test site, the design of a test installation, equipment selection, the installation and start-up of a pneumatic pipeline system for the transportation of tunnel muck. A review of prior pneumatic applications ...
Self-Justification as a Determinant of Performance-Style Effectiveness
ERIC Educational Resources Information Center
McKenna, Ralph J.
1971-01-01
This study examined experimentally the effect of justification on role playing, attempting a more complete verification of the performance style type. Also of concern was whether scores on the Performance Style Test were generalizable to overt behavior on the part of females. Results supported both concerns. (Author/CG)
Yang, Jiashi
2007-04-01
This letter discusses the difference between piezoelectric constitutive relations for the case of one-dimensional stress and the case of one-dimensional strain, and its implications in the modeling of Rosen piezoelectric transformers.
National Centers for Environmental Prediction
Verification and Validation of Monte Carlo N-Particle 6 for Computing Gamma Protection Factors
2015-03-26
methods for evaluating RPFs, which it used for the subsequent 30 years. These approaches included computational modeling, radioisotopes, and a high...
A performance verification demonstration of technologies capable of detecting dioxin and dioxin-like compounds in soil and sediment samples was conducted in April 2004 under the U.S. Environmental Protection Agency's Superfund Innovative Technology Evaluation (SITE) Monitoring an...
Issues of planning trajectory of parallel robots taking into account zones of singularity
NASA Astrophysics Data System (ADS)
Rybak, L. A.; Khalapyan, S. Y.; Gaponenko, E. V.
2018-03-01
A method for determining the design characteristics of a parallel robot necessary to provide specified parameters of its working space that satisfy the controllability requirement is developed. The experimental verification of the proposed method was carried out using an approximate planar 3-RPR mechanism.
DOT National Transportation Integrated Search
2006-08-01
Post-tensioned cast-in-place box girder bridges are commonly used in California. Losses in tension in : the steel prestressing tendons used in these bridges occur over time due to creep and shrinkage of : concrete and relaxation of the tendons. The u...
Apparatus for Teaching Physics: Linearizing a Nonlinear Spring.
ERIC Educational Resources Information Center
Wagner, Glenn
1995-01-01
Describes a method to eliminate the nonlinearity from a spring that is used in experimental verification of Hooke's Law where students are asked to determine the force constant and the linear equation that describes the extension of the spring as a function of the mass placed on it. (JRH)
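The analysis the students are asked to perform amounts to a linear least-squares fit of F = kx to mass-extension data; a well linearized spring should give a near-zero intercept. The mass values and spring constant below are illustrative, not measured data.

```python
def hooke_fit(masses_kg, extensions_m, g=9.81):
    """Least-squares fit of F = k*x + F0 from mass-extension data: the slope k
    is the force constant, and a well linearized spring gives F0 close to 0."""
    forces = [m * g for m in masses_kg]
    n = len(forces)
    sx, sf = sum(extensions_m), sum(forces)
    sxx = sum(x * x for x in extensions_m)
    sxf = sum(x * f for x, f in zip(extensions_m, forces))
    det = n * sxx - sx * sx
    k = (n * sxf - sx * sf) / det
    f0 = (sxx * sf - sx * sxf) / det
    return k, f0

# illustrative data for an ideal spring with k = 25 N/m (not measured values)
masses = [0.05, 0.10, 0.15, 0.20, 0.25]
extensions = [m * 9.81 / 25.0 for m in masses]
k, f0 = hooke_fit(masses, extensions)
```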
Modeling of Pressure Drop During Refrigerant Condensation in Pipe Minichannels
NASA Astrophysics Data System (ADS)
Sikora, Małgorzata; Bohdal, Tadeusz
2017-12-01
Investigations of refrigerant condensation in pipe minichannels are a very challenging and complicated issue. Because of the multitude of influencing factors, mathematical and computer modeling is very important: it allows calculations to be performed for many different refrigerants under different flow conditions. The large number of experimental results published in the literature allows experimental verification of the correctness of such models. This work presents a mathematical model for calculating flow resistance during condensation of refrigerants in pipe minichannels. The model was developed on the basis of conservation equations. The calculation results were verified against the authors' own experimental results.
Square wave voltammetry at the dropping mercury electrode: Experimental
Turner, J.A.; Christie, J.H.; Vukovic, M.; Osteryoung, R.A.
1977-01-01
Experimental verification of earlier theoretical work for square wave voltammetry at the dropping mercury electrode is given. Experiments using ferric oxalate and cadmium(II) in HCl confirm excellent agreement with theory. Experimental peak heights and peak widths are found to be within 2% of calculated results. An example of trace analysis using square wave voltammetry at the DME is presented. The technique is shown to have the same order of sensitivity as differential pulse polarography but is much faster to perform. A detection limit for cadmium in 0.1 M HCl for the system used here was 7 × 10⁻⁸ M.
Sound absorption by a Helmholtz resonator
NASA Astrophysics Data System (ADS)
Komkin, A. I.; Mironov, M. A.; Bykov, A. I.
2017-07-01
Absorption characteristics of a Helmholtz resonator positioned at the end wall of a circular duct are considered. The absorption coefficient of the resonator is experimentally investigated as a function of the diameter and length of the resonator neck and the depth of the resonator cavity. Based on experimental data, the linear analytic model of a Helmholtz resonator is verified, and the results of verification are used to determine the dissipative attached length of the resonator neck so as to provide the agreement between experimental and calculated data. Dependences of sound absorption by a Helmholtz resonator on its geometric parameters are obtained.
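For context, the lumped-parameter estimate of the resonance frequency from the neck and cavity geometry can be sketched as below. The flanged end-correction factor is the textbook value and the geometry is illustrative; the paper instead determines the dissipative attached length experimentally.

```python
import math

def helmholtz_frequency(neck_diameter, neck_length, cavity_volume,
                        c=343.0, end_correction=0.85):
    """Lumped-parameter resonance frequency f0 = (c / 2*pi) * sqrt(S / (V * L_eff)),
    with an assumed flanged-neck end correction L_eff = L + 2 * 0.85 * r."""
    r = neck_diameter / 2.0
    s = math.pi * r * r                       # neck cross-sectional area
    l_eff = neck_length + 2.0 * end_correction * r
    return (c / (2.0 * math.pi)) * math.sqrt(s / (cavity_volume * l_eff))

# a 20 mm diameter, 30 mm long neck on a 1-litre cavity (illustrative geometry)
f0 = helmholtz_frequency(0.020, 0.030, 1.0e-3)
```

The fitted attached length shifts l_eff, and hence the predicted resonance, which is how the verified linear model is reconciled with the measured absorption data.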
Experimental Verification of the Theory of Wind-Tunnel Boundary Interference
NASA Technical Reports Server (NTRS)
Theodorsen, Theodore; Silverstein, Abe
1935-01-01
The results of an experimental investigation on the boundary-correction factor are presented in this report. The values of the boundary-correction factor from the theory, which at the present time is virtually completed, are given in the report for all conventional types of tunnels. With the isolation of certain disturbing effects, the experimental boundary-correction factor was found to be in satisfactory agreement with the theoretically predicted values, thus verifying the soundness and sufficiency of the theoretical analysis. The establishment of a considerable velocity distortion, in the nature of a unique blocking effect, constitutes a principal result of the investigation.
Direct Write Printing on Thin and Flexible Substrates for Space Applications
NASA Technical Reports Server (NTRS)
Paquette, Beth
2016-01-01
This presentation describes the work done on direct-write printing conductive traces for a flexible detector application. A Repeatability Plan was established to define detector requirements, material and printer selections, printing facilities, and tests to verify requirements are met. Designs were created for the detector, and printed using an aerosol jet printer. Testing for requirement verification is ongoing.
49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?
Code of Federal Regulations, 2011 CFR
2011-10-01
... should have directed that a test occur. For example, if an employee tells you that the employer misidentified her as the subject of a random test, or directed her to take a reasonable suspicion or post... consider any evidence from tests of urine samples or other body fluids or tissues (e.g., blood or hair...
49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?
Code of Federal Regulations, 2013 CFR
2013-10-01
... should have directed that a test occur. For example, if an employee tells you that the employer misidentified her as the subject of a random test, or directed her to take a reasonable suspicion or post... consider any evidence from tests of urine samples or other body fluids or tissues (e.g., blood or hair...
49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?
Code of Federal Regulations, 2010 CFR
2010-10-01
... should have directed that a test occur. For example, if an employee tells you that the employer misidentified her as the subject of a random test, or directed her to take a reasonable suspicion or post... consider any evidence from tests of urine samples or other body fluids or tissues (e.g., blood or hair...
49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?
Code of Federal Regulations, 2012 CFR
2012-10-01
... should have directed that a test occur. For example, if an employee tells you that the employer misidentified her as the subject of a random test, or directed her to take a reasonable suspicion or post... consider any evidence from tests of urine samples or other body fluids or tissues (e.g., blood or hair...
49 CFR 40.151 - What are MROs prohibited from doing as part of the verification process?
Code of Federal Regulations, 2014 CFR
2014-10-01
... should have directed that a test occur. For example, if an employee tells you that the employer misidentified her as the subject of a random test, or directed her to take a reasonable suspicion or post... consider any evidence from tests of urine samples or other body fluids or tissues (e.g., blood or hair...
Benchmarking on Tsunami Currents with ComMIT
NASA Astrophysics Data System (ADS)
Sharghi vand, N.; Kanoglu, U.
2015-12-01
Before the 2004 Indian Ocean tsunami there were no standards for the validation and verification of tsunami numerical models, even though a number of numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which are used in the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental and field benchmark problems aimed at estimating maximum runup, which are widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held February 9-10, 2015 in Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on the validation and verification of tsunami numerical models with respect to tsunami currents. Three of the benchmark problems were: current measurements of the 2011 Japan tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface, developed by NCTR, to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316). The modelling results are compared with the required benchmark data, show good agreement, and are discussed.
Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe)
Using Virtual Testing for Characterization of Composite Materials
NASA Astrophysics Data System (ADS)
Harrington, Joseph
Composite materials are finally providing uses hitherto reserved for metals in structural systems applications -- airframes and engine containment systems, wraps for repair and rehabilitation, and ballistic/blast mitigation systems. They have high strength-to-weight ratios, are durable and resistant to environmental effects, have high impact strength, and can be manufactured in a variety of shapes. Generalized constitutive models are being developed to accurately model composite systems so they can be used in implicit and explicit finite element analysis. These models require extensive characterization of the composite material as input. The particular constitutive model of interest for this research is a three-dimensional orthotropic elasto-plastic composite material model that requires a total of 12 experimental stress-strain curves, yield stresses, and Young's Modulus and Poisson's ratio in the material directions as input. Sometimes it is not possible to carry out reliable experimental tests needed to characterize the composite material. One solution is using virtual testing to fill the gaps in available experimental data. A Virtual Testing Software System (VTSS) has been developed to address the need for a less restrictive method to characterize a three-dimensional orthotropic composite material. The system takes in the material properties of the constituents and completes all 12 of the necessary characterization tests using finite element (FE) models. Verification and validation test cases demonstrate the capabilities of the VTSS.
NASA Astrophysics Data System (ADS)
Wang, M.; Cole, M. O. T.; Keogh, P. S.
2017-11-01
A new approach for the recovery of contact-free levitation of a rotor supported by active magnetic bearings (AMB) is assessed through control strategy design, system modelling and experimental verification. The rotor is considered to make contact with a touchdown bearing (TDB), which may lead to entrapment in a bi-stable nonlinear response. A linear matrix inequality (LMI) based gain-scheduling H∞ control technique is introduced to recover the rotor to a contact-free state. The controller formulation involves a time-varying effective stiffness parameter, which can be evaluated in terms of forces transmitted through the TDB. Rather than measuring these forces directly, an observer is introduced with a model of the base structure to transform base acceleration signals using polytopic coordinates for controller adjustment. Force transmission to the supporting base structure will occur either through an AMB alone without contact, or through the AMB and TDB with contact and this must be accounted for in the observer design. The controller is verified experimentally in terms of (a) non-contact robust stability and vibration suppression performance; (b) control action for contact-free recovery at typical running speeds with various unbalance and TDB misalignment conditions; and (c) coast-down experimental tests. The results demonstrate the effectiveness of the AMB control action whenever it operates within its dynamic load capacity.
NASA Astrophysics Data System (ADS)
Demberg, Kerstin; Laun, Frederik Bernd; Windschuh, Johannes; Umathum, Reiner; Bachert, Peter; Kuder, Tristan Anselm
2017-02-01
Diffusion pore imaging is an extension of diffusion-weighted nuclear magnetic resonance imaging enabling the direct measurement of the shape of arbitrarily formed, closed pores by probing diffusion restrictions using the motion of spin-bearing particles. Examples of such pores comprise cells in biological tissue or oil containing cavities in porous rocks. All pores contained in the measurement volume contribute to one reconstructed image, which reduces the problem of vanishing signal at increasing resolution present in conventional magnetic resonance imaging. It has been previously experimentally demonstrated that pore imaging using a combination of a long and a narrow magnetic field gradient pulse is feasible. In this work, an experimental verification is presented showing that pores can be imaged using short gradient pulses only. Experiments were carried out using hyperpolarized xenon gas in well-defined pores. The phase required for pore image reconstruction was retrieved from double diffusion encoded (DDE) measurements, while the magnitude could either be obtained from DDE signals or classical diffusion measurements with single encoding. The occurring image artifacts caused by restrictions of the gradient system, insufficient diffusion time, and by the phase reconstruction approach were investigated. Employing short gradient pulses only is advantageous compared to the initial long-narrow approach due to a more flexible sequence design when omitting the long gradient and due to faster convergence to the diffusion long-time limit, which may enable application to larger pores.
Verification of adolescent self-reported smoking.
Kentala, Jukka; Utriainen, Pekka; Pahkala, Kimmo; Mattila, Kari
2004-02-01
The validity of information obtained on smoking is often questioned, in view of the widespread belief that adolescents tend to under- or over-report the habit. The aim here was to verify smoking habits as reported in a questionnaire given in conjunction with dental examinations, by asking participants directly whether they smoked or not and by performing biochemical measurements of thiocyanate in the saliva and carbon monoxide in the expired air. The series consisted of 150 pupils in the ninth grade (age 15 years). The reports in the questionnaires seemed to provide a reliable estimate of adolescent smoking, the sensitivity of the method being 81-96% and the specificity 77-95%. Biochemical verification or control of smoking proved needless in normal dental practice. Accepting information offered by the patient provides a good starting point for health education and for work that motivates and supports self-directed breaking of the habit.
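The sensitivity and specificity figures reported above are standard confusion-matrix ratios. A minimal sketch of how they are computed (the counts below are hypothetical, not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Compute sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true positive rate: smokers correctly identified
    specificity = tn / (tn + fp)  # true negative rate: non-smokers correctly identified
    return sensitivity, specificity

# Hypothetical counts for a cohort of 150 pupils (illustrative only)
sens, spec = sensitivity_specificity(tp=40, fn=5, tn=95, fp=10)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```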
A Single-Boundary Accumulator Model of Response Times in an Addition Verification Task
Faulkenberry, Thomas J.
2017-01-01
Current theories of mathematical cognition offer competing accounts of the interplay between encoding and calculation in mental arithmetic. Additive models propose that manipulations of problem format do not interact with the cognitive processes used in calculation. Alternatively, interactive models suppose that format manipulations have a direct effect on calculation processes. In the present study, we tested these competing models by fitting participants' RT distributions in an arithmetic verification task with a single-boundary accumulator model (the shifted Wald distribution). We found that in addition to providing a more complete description of RT distributions, the accumulator model afforded a potentially more sensitive test of format effects. Specifically, we found that format affected drift rate, which implies that problem format has a direct impact on calculation processes. These data give further support for an interactive model of mental arithmetic. PMID:28769853
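The shifted Wald distribution used above to fit RT data has a standard closed-form density with drift rate γ, decision boundary α, and shift (non-decision time) θ. A minimal sketch of that density (the parameter values are illustrative, not fitted values from the study):

```python
import math

def shifted_wald_pdf(t, gamma, alpha, theta):
    """Density of the shifted Wald (inverse Gaussian) distribution.

    gamma: drift rate, alpha: decision boundary, theta: shift (non-decision time).
    Returns 0 for t <= theta, where the density is undefined.
    """
    if t <= theta:
        return 0.0
    x = t - theta
    return alpha / math.sqrt(2 * math.pi * x**3) * math.exp(-(alpha - gamma * x)**2 / (2 * x))

# Illustrative parameter values for a response time of 0.8 s
print(shifted_wald_pdf(0.8, gamma=3.0, alpha=1.5, theta=0.3))
```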
Flow visualization methods for field test verification of CFD analysis of an open gloveport
Strons, Philip; Bailey, James L.
2017-01-01
Anemometer readings alone cannot provide a complete picture of air flow patterns at an open gloveport. Having a means to visualize air flow for field tests in general provides greater insight by indicating direction in addition to the magnitude of the air flow velocities in the region of interest. Furthermore, flow visualization is essential for Computational Fluid Dynamics (CFD) verification, where important modeling assumptions play a significant role in analyzing the chaotic nature of low-velocity air flow. A good example is shown in Figure 1, where an unexpected vortex pattern occurred during a field test that could not have been measured relying only on anemometer readings. Here, observing and measuring the patterns of the smoke flowing into the gloveport allowed the CFD model to be appropriately updated to match the actual flow velocities in both magnitude and direction.
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
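Plain reachability analysis of an (untimed) Petri net can be sketched as a breadth-first search over markings. The CS-class technique described above augments each state with clock-stamp information so that end-to-end delays can be read off the tree; this simplified sketch omits all timing and shows only the underlying marking exploration:

```python
from collections import deque

def reachable_markings(initial, transitions):
    """BFS over the reachability graph of an untimed Petri net.

    initial: tuple of token counts, one entry per place.
    transitions: list of (consume, produce) dicts mapping place index -> tokens.
    Timing information (the clock stamps of CS-classes) is deliberately omitted.
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            if all(m[p] >= n for p, n in consume.items()):  # transition enabled?
                m2 = list(m)
                for p, n in consume.items():
                    m2[p] -= n
                for p, n in produce.items():
                    m2[p] += n
                m2 = tuple(m2)
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

# Two-place net: a single transition moves a token from place 0 to place 1
markings = reachable_markings((1, 0), [({0: 1}, {1: 1})])
print(sorted(markings))
```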
Generalization of information-based concepts in forecast verification
NASA Astrophysics Data System (ADS)
Tödter, J.; Ahrens, B.
2012-04-01
This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are shortly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
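For binary forecasts, the Brier Score and the Ignorance Score discussed above can be computed directly from forecast probabilities and observed outcomes. A small sketch (the example probabilities are illustrative):

```python
import math

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and binary outcome."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def ignorance_score(probs, outcomes):
    """Mean negative log2 of the probability assigned to the observed outcome."""
    return -sum(math.log2(p if o else 1 - p) for p, o in zip(probs, outcomes)) / len(probs)

probs = [0.9, 0.2, 0.7]   # forecast probabilities of the event
outcomes = [1, 0, 1]      # observed occurrences
print(brier_score(probs, outcomes), ignorance_score(probs, outcomes))
```

Note that IGN penalizes overconfident misses much more severely than the BS, which is the behavior the second-order-approximation relationship above makes precise.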
Modeling human response errors in synthetic flight simulator domain
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.
1992-01-01
This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.
Pyroelectric effect in triglycine sulphate single crystals - Differential measurement method
NASA Astrophysics Data System (ADS)
Trybus, M.
2018-06-01
A simple mathematical model of the pyroelectric phenomenon was used to explain the electric response of the TGS (triglycine sulphate) samples in the linear heating process in ferroelectric and paraelectric phases. Experimental verification of mathematical model was realized. TGS single crystals were grown and four electrode samples were fabricated. Differential measurements of the pyroelectric response of two different regions of the samples were performed and the results were compared with data obtained from the model. Experimental results are in good agreement with model calculations.
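The electric response described above is commonly modelled by the standard pyroelectric relation i = p · A · (dT/dt). A minimal sketch of this textbook formula (the parameter values are illustrative and roughly typical of TGS, not taken from the paper):

```python
def pyroelectric_current(p, area, dT_dt):
    """Pyroelectric current i = p * A * dT/dt for a uniformly heated sample.

    p: pyroelectric coefficient [C m^-2 K^-1]
    area: electrode area [m^2]
    dT_dt: heating rate [K/s]
    """
    return p * area * dT_dt

# Illustrative values: p ~ 3e-4 C m^-2 K^-1, 10 mm^2 electrode, 0.1 K/s heating
i = pyroelectric_current(p=3e-4, area=1e-5, dT_dt=0.1)
print(i)  # current in amperes
```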
1987-05-29
Fig. 1: Experimental setup. P.S.D.: phase-sensitive detector. V.C.X.O.: voltage-controlled crystal oscillator. A: post-detector amplifier. ...the sampling period used in the experimental verification. ...samples were obtained using a pair of frequency counters controlled by a desk-top...reduce the effect of group delay changes. The first method can be implemented by actively or passively controlling the environment around
Hawking radiation in an electromagnetic waveguide?
Schützhold, Ralf; Unruh, William G
2005-07-15
It is demonstrated that the propagation of electromagnetic waves in an appropriately designed waveguide is (for large wavelengths) analogous to that within a curved space-time--such as around a black hole. As electromagnetic radiation (e.g., microwaves) can be controlled, amplified, and detected (with present-day technology) much easier than sound, for example, we propose a setup for the experimental verification of the Hawking effect. Apart from experimentally testing this striking prediction, this would facilitate the investigation of the trans-Planckian problem.
Continuing Issues (FY 1979) Regarding DoD Use of the Space Transportation System.
1979-12-01
estimates) to the cost of launching experimental payloads in the sortie mode is the analytical verification of compatibility ("integration") of the experiment...with the Shuttle; the integration cost may be reduced by the Air Force by their proposed "class cargo" generalized integration analysis that, once...compactness and light weight (for a given experimental weight) rather than on intrinsic cost, to minimize launch costs as computed by the NASA volume and weight
In vivo dose verification method in catheter based high dose rate brachytherapy.
Jaselskė, Evelina; Adlienė, Diana; Rudžianskas, Viktoras; Urbonavičius, Benas Gabrielis; Inčiūra, Arturas
2017-12-01
In vivo dosimetry is a powerful tool for dose verification in radiotherapy. Its application in high dose rate (HDR) brachytherapy is usually limited to the estimation of gross errors, due to the inability of the dosimetry system/method to record non-uniform dose distributions in steep dose gradient fields close to the radioactive source. In vivo dose verification in interstitial catheter based HDR brachytherapy is crucial, since the treatment is performed by inserting the radioactive source at certain positions within catheters that are pre-implanted into the tumour. We propose an in vivo dose verification method for this type of brachytherapy treatment, based on the comparison between experimentally measured and theoretical dose values calculated at well-defined locations corresponding to dosemeter positions in the catheter. Dose measurements were performed using TLD 100-H rods (6 mm long, 1 mm diameter) inserted in certain sequences into an additionally pre-implanted dosimetry catheter. The adjustment of dosemeter positioning in the catheter was performed using reconstructed CT scans of the patient with pre-implanted catheters. Doses to three Head&Neck and one Breast cancer patient were measured during several randomly selected treatment fractions. It was found that the average experimental dose error varied from 4.02% to 12.93% during independent in vivo dosimetry control measurements for the selected Head&Neck cancer patients, and from 7.17% to 8.63% for the Breast cancer patient. The average experimental dose error was below the AAPM recommended margin of 20% and did not exceed the measurement uncertainty of 17.87% estimated for this type of dosemeter. A tendency of slightly increasing average dose error was observed in each following treatment fraction of the same patient.
This was linked to changes in the theoretically estimated dosemeter positions due to possible organ movement between treatment fractions, since catheter reconstruction was performed for the first treatment fraction only. These findings indicate the potential for further average dose error reduction in catheter based brachytherapy, by at least 2-3%, if catheter locations are adjusted before each following treatment fraction; however, this requires more detailed investigation. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
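The comparison underlying the method above reduces, per dosemeter position, to a relative difference between measured and calculated dose. A minimal sketch (the function name and example dose values are illustrative, not the study's data):

```python
def dose_error_percent(measured, calculated):
    """Relative difference between a measured and a calculated dose, in percent."""
    return abs(measured - calculated) / calculated * 100.0

# Hypothetical single-dosemeter reading vs. treatment-planning value, in Gy
err = dose_error_percent(measured=4.35, calculated=4.0)
print(f"dose error: {err:.2f}%")
```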
NASA Astrophysics Data System (ADS)
Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko
2017-02-01
A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.
Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko
2017-02-21
A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.
Magnetically controlled ferromagnetic swimmers
Hamilton, Joshua K.; Petrov, Peter G.; Winlove, C. Peter; Gilbert, Andrew D.; Bryan, Matthew T.; Ogrin, Feodor Y.
2017-01-01
Microscopic swimming devices hold promise for radically new applications in lab-on-a-chip and microfluidic technology, diagnostics and drug delivery etc. In this paper, we demonstrate the experimental verification of a new class of autonomous ferromagnetic swimming devices, actuated and controlled solely by an oscillating magnetic field. These devices are based on a pair of interacting ferromagnetic particles of different size and different anisotropic properties joined by an elastic link and actuated by an external time-dependent magnetic field. The net motion is generated through a combination of dipolar interparticle gradient forces, time-dependent torque and hydrodynamic coupling. We investigate the dynamic performance of a prototype (3.6 mm) of the ferromagnetic swimmer in fluids of different viscosity as a function of the external field parameters (frequency and amplitude) and demonstrate stable propulsion over a wide range of Reynolds numbers. We show that the direction of swimming has a dependence on both the frequency and amplitude of the applied external magnetic field, resulting in robust control over the speed and direction of propulsion. This paves the way to fabricating microscale devices for a variety of technological applications requiring reliable actuation and high degree of control. PMID:28276490
A biologist's view of the relevance of thermodynamics and physical chemistry to cryobiology✰
Mazur, Peter
2013-01-01
Thermodynamics and physical chemistry have played powerful roles over the past forty-five years in interpreting cryobiological problems and in predicting cryobiological outcomes. The author has been guided by a few core principles in using these concepts and tools, and this paper discusses those core principles. They are (1) the importance of chemical potentials, and of the difference between the chemical potentials of water and solutes inside the cell and outside, in determining the direction and rate of fluxes of water and solutes. (2) The influence of the curvature of an ice crystal on its chemical potential and on the ability of ice to pass through pores in cell membranes, on the nucleation temperature of supercooled water, and on the recrystallization of ice. (3) The use of Le Chatelier's Principle in qualitatively predicting the direction of a reaction in response to variables like pressure. (4) The fact that the energy differences between State A and State B are independent of the path taken to go from A to B. (5) The importance of being aware of the assumptions underlying thermodynamic models of cryobiological events. And (6), the difficulties in obtaining experimental verification of thermodynamic and physical-chemical models. PMID:19962974
Magnetically controlled ferromagnetic swimmers
NASA Astrophysics Data System (ADS)
Hamilton, Joshua K.; Petrov, Peter G.; Winlove, C. Peter; Gilbert, Andrew D.; Bryan, Matthew T.; Ogrin, Feodor Y.
2017-03-01
Microscopic swimming devices hold promise for radically new applications in lab-on-a-chip and microfluidic technology, diagnostics and drug delivery etc. In this paper, we demonstrate the experimental verification of a new class of autonomous ferromagnetic swimming devices, actuated and controlled solely by an oscillating magnetic field. These devices are based on a pair of interacting ferromagnetic particles of different size and different anisotropic properties joined by an elastic link and actuated by an external time-dependent magnetic field. The net motion is generated through a combination of dipolar interparticle gradient forces, time-dependent torque and hydrodynamic coupling. We investigate the dynamic performance of a prototype (3.6 mm) of the ferromagnetic swimmer in fluids of different viscosity as a function of the external field parameters (frequency and amplitude) and demonstrate stable propulsion over a wide range of Reynolds numbers. We show that the direction of swimming has a dependence on both the frequency and amplitude of the applied external magnetic field, resulting in robust control over the speed and direction of propulsion. This paves the way to fabricating microscale devices for a variety of technological applications requiring reliable actuation and high degree of control.
Ground-state properties of light kaonic nuclei signaling symmetry energy at high densities
NASA Astrophysics Data System (ADS)
Yang, Rongyao; Wei, Sina; Jiang, Weizhou
2018-01-01
A sensitive correlation between the ground-state properties of light kaonic nuclei and the symmetry energy at high densities is constructed under the framework of relativistic mean-field theory. Taking oxygen isotopes as an example, we see that a high-density core is produced in kaonic oxygen nuclei, due to the strongly attractive antikaon-nucleon interaction. It is found that the 1S1/2 state energy in the high-density core of kaonic nuclei can directly probe the variation of the symmetry energy at supranormal nuclear density, and a sensitive correlation between the neutron skin thickness and the symmetry energy at supranormal density is established directly. Meanwhile, the sensitivity of the neutron skin thickness to the low-density slope of the symmetry energy is greatly increased in the corresponding kaonic nuclei. These sensitive relationships are established upon the fact that the isovector potential in the central region of kaonic nuclei becomes very sensitive to the variation of the symmetry energy. These findings might provide another perspective to constrain high-density symmetry energy, and await experimental verification in the future. Supported by National Natural Science Foundation of China (11775049, 11275048) and the China Jiangsu Provincial Natural Science Foundation (BK20131286)
Direct observation of how the heavy-fermion state develops in CeCoIn5
NASA Astrophysics Data System (ADS)
Chen, Q. Y.; Xu, D. F.; Niu, X. H.; Jiang, J.; Peng, R.; Xu, H. C.; Wen, C. H. P.; Ding, Z. F.; Huang, K.; Shu, L.; Zhang, Y. J.; Lee, H.; Strocov, V. N.; Shi, M.; Bisti, F.; Schmitt, T.; Huang, Y. B.; Dudin, P.; Lai, X. C.; Kirchner, S.; Yuan, H. Q.; Feng, D. L.
2017-07-01
Heavy-fermion systems share some of the strange-metal phenomenology seen in other unconventional superconductors, providing a unique opportunity to set strange metals in a broader context. Central to understanding heavy-fermion systems is the interplay of localization and itinerancy. These materials acquire high electronic masses and a concomitant Fermi-volume increase as the f electrons delocalize at low temperatures. However, despite the widespread acceptance of this view, a direct microscopic verification has been lacking. Here we report high-resolution angle-resolved photoemission measurements on CeCoIn5, a prototypical heavy-fermion compound, which spectroscopically resolve the development of band hybridization and the Fermi-surface expansion over a wide temperature region. The localized-to-itinerant transition occurs at unexpectedly high temperatures, yet the f electrons remain largely localized even at the lowest temperature. These findings point to an unanticipated role played by crystal-field excitations in the strange-metal behavior of CeCoIn5. Our results offer a comprehensive experimental picture of heavy-fermion formation, setting the stage for understanding the emergent properties, including unconventional superconductivity, in this and related materials.
Kostanyan, Artak E; Erastov, Andrey A
2016-09-02
The non-ideal recycling equilibrium-cell model, including the effects of extra-column dispersion, is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, the process is analyzed for a more realistic and practical scheme with the detector located immediately after the column. We present the peak equation for individual cycles; equations describing the transport of single peaks and complex chromatograms inside the recycling closed loop; and equations for the resolution between single solute peaks of neighboring cycles, for the resolution of peaks in the recycling chromatogram, and for the resolution between the chromatograms of neighboring cycles. It is shown that, unlike in conventional chromatography, increasing the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed loop demonstrated good agreement between theory and experiment. Copyright © 2016 Elsevier B.V. All rights reserved.
Huang, Dao-sheng; Shi, Wei; Han, Lei; Sun, Ke; Chen, Guang-bo; Wu, Jian-xiong; Xu, Gui-hong; Bi, Yu-an; Wang, Zhen-zhong; Xiao, Wei
2015-06-01
To optimize the belt-drying process conditions for the Gardeniae Fructus extract used in Reduning injection by Box-Behnken design-response surface methodology, a three-factor, three-level Box-Behnken experimental design was employed on the basis of single-factor experiments. With drying temperature, drying time and feeding speed as independent variables and the content of geniposide as the dependent variable, the experimental data were fitted to a second-order polynomial equation, establishing the mathematical relationship between the content of geniposide and the respective variables. With the experimental data analyzed by Design-Expert 8.0.6, the optimal drying parameters were as follows: drying temperature 98.5 degrees C, drying time 89 min, and feeding speed 99.8 r x min(-1). Three verification experiments were performed under these conditions, and the measured average content of geniposide was 564.108 mg x g(-1), close to the model prediction of 563.307 mg x g(-1). According to the verification tests, the Gardeniae Fructus belt-drying process is stable and feasible. Thus, single-factor experiments combined with response surface methodology (RSM) can be used to optimize the drying technology of the Gardenia extract for Reduning injection.
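The second-order polynomial fit at the heart of this kind of response-surface optimization can be sketched as follows. For simplicity a full 3^3 factorial grid in coded units stands in for the 15-run Box-Behnken subset, and the coefficients and responses are hypothetical illustrations, not the paper's data.

```python
import itertools
import numpy as np

# Quadratic response-surface model for 3 coded factors:
# y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
def design_matrix(X):
    """Build the full quadratic design matrix for 3 coded factors."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([
        np.ones(len(X)),          # intercept
        x1, x2, x3,               # linear terms
        x1**2, x2**2, x3**2,      # pure quadratic terms
        x1*x2, x1*x3, x2*x3,      # two-factor interactions
    ])

# Full three-level factorial grid in coded units (-1, 0, +1)
X = np.array(list(itertools.product([-1.0, 0.0, 1.0], repeat=3)))

# Hypothetical "true" coefficients (geniposide content, mg/g scale)
true_b = np.array([560.0, 5.0, -3.0, 2.0, -8.0, -6.0, -4.0, 1.0, 0.5, -1.5])
y = design_matrix(X) @ true_b     # noise-free synthetic responses

# Least-squares fit recovers the coefficients of the second-order model
coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print(np.round(coef, 2))
```

Software such as Design-Expert performs essentially this fit, then locates the stationary point of the fitted surface to report the optimal factor settings.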
McNair, Helen A; Hansen, Vibeke N; Parker, Christopher C; Evans, Phil M; Norman, Andrew; Miles, Elizabeth; Harris, Emma J; Del-Acroix, Louise; Smith, Elizabeth; Keane, Richard; Khoo, Vincent S; Thompson, Alan C; Dearnaley, David P
2008-05-01
To evaluate the utility of intraprostatic markers in the treatment verification of prostate cancer radiotherapy. Specific aims were: to compare the effectiveness of offline correction protocols using either gold markers or bony anatomy; to estimate the potential benefit of online correction protocols using gold markers; and to determine the presence and effect of intrafraction motion. Thirty patients with three gold markers inserted had pretreatment and posttreatment images acquired and were treated using an offline correction protocol and gold markers. Retrospectively, an offline protocol was applied using bony anatomy and an online protocol using gold markers. The systematic errors were reduced from 1.3, 1.9, and 2.5 mm to 1.1, 1.1, and 1.5 mm in the right-left (RL), superoinferior (SI), and anteroposterior (AP) directions, respectively, using the offline correction protocol with gold markers instead of bony anatomy. The corresponding decrease in margins was 1.7, 3.3, and 4 mm in the RL, SI, and AP directions, respectively. An offline correction protocol combined with an online correction protocol in the first four fractions reduced random errors further, to 0.9, 1.1, and 1.0 mm in the RL, SI, and AP directions, respectively. A daily online protocol reduced all errors to <1 mm. Intrafraction motion had a greater impact on the effectiveness of the online protocol than on the offline protocols. An offline protocol using gold markers is effective in reducing the systematic error. The value of online protocols is reduced by intrafraction motion.
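Population systematic and random setup errors of the kind quoted above are typically derived from daily displacement logs. A minimal sketch, using synthetic displacement data and the widely used van Herk margin recipe M = 2.5Σ + 0.7σ (not necessarily the margin formula used in this particular study):

```python
import numpy as np

def setup_errors(displacements):
    """displacements: dict patient_id -> array of daily shifts (mm, one axis).
    Returns (Sigma, sigma): population systematic and random setup errors."""
    patient_means = np.array([d.mean() for d in displacements.values()])
    patient_sds = np.array([d.std(ddof=1) for d in displacements.values()])
    Sigma = patient_means.std(ddof=1)         # SD of per-patient means
    sigma = np.sqrt((patient_sds**2).mean())  # RMS of per-patient SDs
    return Sigma, sigma

def van_herk_margin(Sigma, sigma):
    """CTV-to-PTV margin via the van Herk recipe (assumed, for illustration)."""
    return 2.5 * Sigma + 0.7 * sigma

# Synthetic logs: 30 patients, 20 fractions each, one axis, shifts in mm
rng = np.random.default_rng(1)
shifts = {p: rng.normal(rng.normal(0.0, 1.5), 1.0, size=20) for p in range(30)}

Sigma, sigma = setup_errors(shifts)
print(f"Sigma={Sigma:.2f} mm, sigma={sigma:.2f} mm, "
      f"margin={van_herk_margin(Sigma, sigma):.2f} mm")
```

An offline correction protocol shrinks Σ (the per-patient mean error), which dominates the margin recipe, matching the study's finding that offline marker-based correction mainly reduces the systematic component.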
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.
2016-01-01
Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. 
Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, J; Hu, W; Xing, Y
Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions delivered during patient treatment are not known. This study develops an online 2D dose verification tool to check daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. To do online dose verification, we wrote a program to reconstruct the delivered 2D dose distributions from the daily treatment log files and depth dose distributions. From the log files we obtain the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured with a PTW 729XDR ion chamber matrix for 13 real patient plans. We then analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The lower passing rate for the proton beam arises because its focus size deviations were larger than for the carbon beam: the average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements for checking daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
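A 2D global gamma analysis of the kind used to score reconstructed against TPS dose planes can be sketched as follows. The dose planes are synthetic Gaussians and the brute-force search is for illustration only, not the clinical implementation.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dd=0.03, dta_mm=3.0, cutoff=0.10):
    """Global gamma passing rate (%) for 2D dose planes ref (TPS) vs ev."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    norm = ref.max()                            # global dose normalization
    search = int(np.ceil(dta_mm / spacing_mm)) + 1
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < cutoff * norm:       # skip low-dose region
                continue
            total += 1
            i0, i1 = max(0, i - search), min(ny, i + search + 1)
            j0, j1 = max(0, j - search), min(nx, j + search + 1)
            # distance-to-agreement and dose-difference terms, normalized
            r2 = ((yy[i0:i1, j0:j1] - i)**2 + (xx[i0:i1, j0:j1] - j)**2) \
                 * spacing_mm**2 / dta_mm**2
            d2 = (ev[i0:i1, j0:j1] - ref[i, j])**2 / (dd * norm)**2
            if np.sqrt(r2 + d2).min() <= 1.0:   # gamma <= 1 means pass
                passed += 1
    return 100.0 * passed / total

# Synthetic Gaussian "spot" dose; evaluated plane shifted by one pixel (2 mm)
y, x = np.mgrid[0:60, 0:60]
ref = np.exp(-((x - 30)**2 + (y - 30)**2) / 200.0)
ev = np.exp(-((x - 31)**2 + (y - 30)**2) / 200.0)
print(f"gamma pass rate: {gamma_pass_rate(ref, ev, spacing_mm=2.0):.1f}%")
```

A 2 mm positional shift passes the 3%/3mm criterion everywhere, which is why gamma analysis tolerates the small spot-position deviations recorded in the log files while still flagging larger focus-size discrepancies.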
Carol-Visser, Jeroen; van der Schans, Marcel; Fidder, Alex; Hulst, Albert G; van Baar, Ben L M; Irth, Hubertus; Noort, Daan
2008-07-01
Rapid monitoring and retrospective verification are key issues in protection against and non-proliferation of chemical warfare agents (CWA). Such monitoring and verification are adequately accomplished by the analysis of persistent protein adducts of these agents. Liquid chromatography-mass spectrometry (LC-MS) is the tool of choice in the analysis of such protein adducts, but the overall experimental procedure is quite elaborate. Therefore, an automated on-line pepsin digestion-LC-MS configuration has been developed for the rapid determination of CWA protein adducts. The utility of this configuration is demonstrated by the analysis of specific adducts of sarin and sulfur mustard to human butyryl cholinesterase and human serum albumin, respectively.
Concept Verification Test - Evaluation of Spacelab/Payload operation concepts
NASA Technical Reports Server (NTRS)
Mcbrayer, R. O.; Watters, H. H.
1977-01-01
The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed, two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.
Verification of the CFD simulation system SAUNA for complex aircraft configurations
NASA Astrophysics Data System (ADS)
Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.
1994-04-01
This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.
Case-Study of the High School Student's Family Values Formation
ERIC Educational Resources Information Center
Valeeva, Roza A.; Korolyeva, Natalya E.; Sakhapova, Farida Kh.
2016-01-01
The aim of the research is the theoretical justification and experimental verification of content, complex forms and methods to ensure effective development of the high school students' family values formation. 93 lyceum students from Kazan took part in the experiment. To study students' family values we have applied method of studying personality…
Bullying in School: Case Study of Prevention and Psycho-Pedagogical Correction
ERIC Educational Resources Information Center
Ribakova, Laysan A.; Valeeva, Roza A.; Merker, Natalia
2016-01-01
The purpose of the study was the theoretical justification and experimental verification of content, complex forms and methods to ensure effective prevention and psycho-pedagogical correction of bullying in school. 53 teenage students from Kazan took part in the experiment. A complex of diagnostic techniques for the detection of violence and…
DOT National Transportation Integrated Search
1978-12-01
This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...
Conservation of Mechanical and Electric Energy: Simple Experimental Verification
ERIC Educational Resources Information Center
Ponikvar, D.; Planinsic, G.
2009-01-01
Two similar experiments on conservation of energy and transformation of mechanical into electrical energy are presented. Both can be used in classes, as they offer numerous possibilities for discussion with students and are simple to perform. Results are presented and are precise within 20% for the version of the experiment where measured values…
Shuttle structural dynamics characteristics: The analysis and verification
NASA Technical Reports Server (NTRS)
Modlin, C. T., Jr.; Zupp, G. A., Jr.
1985-01-01
The space shuttle introduced a new dimension in the complexity of the structural dynamics of a space vehicle. The four-body configuration exhibited structural frequencies as low as 2 hertz with a modal density on the order of 10 modes per hertz. In the verification process, certain mode shapes and frequencies were identified by the users as more important than others and, as such, the test objectives were oriented toward experimentally extracting those modes and frequencies for analysis and test correlation purposes. To provide the necessary experimental data, a series of ground vibration tests (GVT's) was conducted using test articles ranging from the 1/4-scale structural replica of the space shuttle to the full-scale vehicle. The vibration test and analysis program revealed that the mode shape and frequency correlations below 10 hertz were good. The quality of correlation of modes between 10 and 20 hertz ranged from good to fair and that of modes above 20 hertz ranged from poor to good. Since the most important modes, based on user preference, were below 10 hertz, it was judged that the shuttle structural dynamic models were adequate for flight certification.
Mechanical verification of soft-tissue attachment on bioactive glasses and titanium implants.
Zhao, Desheng; Moritz, Niko; Vedel, Erik; Hupa, Leena; Aro, Hannu T
2008-07-01
Soft-tissue attachment is a desired feature of many clinical biomaterials. The aim of the current study was to design a suitable experimental method for tensile testing of implant incorporation with soft-tissues. Conical implants were made of three compositions of bioactive glass (SiO(2)-P(2)O(5)-B(2)O(3)-Na(2)O-K(2)O-CaO-MgO) or titanium fiber mesh (porosity 84.7%). The implants were surgically inserted into the dorsal subcutaneous soft-tissue or back muscles in the rat. Soft-tissue attachment was evaluated by pull-out testing using a custom-made jig 8 weeks after implantation. Titanium fiber mesh implants had developed a relatively high pull-out force in subcutaneous tissue (12.33+/-5.29 N, mean+/-SD) and also measurable attachment with muscle tissue (2.46+/-1.33 N). The bioactive glass implants failed to show mechanically relevant soft-tissue bonding. The experimental set-up of mechanical testing seems to be feasible for verification studies of soft-tissue attachment. The inexpensive small animal model is beneficial for large-scale in vivo screening of new biomaterials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
Liu, Wenlong; Zhang, Xili; He, Fuyuan; Zhang, Ping; Wang, Haiqin; Wu, Dezhi; Chen, Zuohong
2011-11-01
To establish and experimentally verify a mathematical model of the balance groups underlying the steady state of traditional Chinese medicine extraction. Using entropy and genetic principles from statistics, and taking as a pivot the coefficient of variation of the GC fingerprint of the naphtha of Houttuynia cordata between strains from the same GAP site, a mathematical model of the balance groups for the steady state of extraction was established and verified. A mathematical model suitable for the balance groups of the steady state of traditional Chinese medicine and its preparations in extraction was obtained, and a balance group of 29,683 strains (approximately 118.7 kg) was obtained with H. cordata of the same origin as the model drug. Under the GAP quality-control model, by further using Hardy-Weinberg balance groups of H. cordata between strains to control the stability of quality, a new theoretical and experimental foundation is established for the steady state of traditional Chinese medicine extraction and its quality control.
Learning Experience on Transformer Using HOT Lab for Pre-service Physics Teacher’s
NASA Astrophysics Data System (ADS)
Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.
2017-09-01
This study investigated pre-service teachers' improvement in critical thinking skills through a Higher Order Thinking (HOT) Lab on transformer learning. The research used a mixed method with an embedded experimental model. The subjects were 60 Physics Education students at UIN Sunan Gunung Djati Bandung. Analysis of the practical reports and observation sheets showed that students in the experimental group carried out the practicum better and could solve real problems, whereas the control group showed the opposite. The critical thinking skills of students using the HOT Lab were higher than those of students using the verification lab. Critical thinking skills increased because HOT Lab problem solving develops higher-order thinking skills through laboratory activities. It was therefore concluded that the HOT Lab is more effective than a verification lab at improving students' thinking skills in transformer topic learning. The HOT Lab can be implemented in other subjects and could be used to improve other higher-order thinking skills.
NASA Astrophysics Data System (ADS)
Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.
2011-05-01
Underactuated systems have fewer control inputs than degrees of freedom, m < n. Determining an input control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and an explicit solution exists only when the problem is differentially flat. The flatness-based solution means that all 2n states and m control inputs can be expressed algebraically in terms of the m specified outputs and their time derivatives up to a certain order, which in practice is attainable only for simple systems. In this contribution the problem is posed in a more practical way, as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc, n = 2 and m = 1. Experimental verification of the inverse simulation control methodology is reported.
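For the two-disc benchmark the flatness-based inverse dynamics can be written out explicitly: with disc 1's motion prescribed, disc 2's angle and the input torque follow algebraically. The inertias, spring stiffness and reference trajectory below are assumed values for illustration, not those of the experimental rig.

```python
import numpy as np

# Two-disc system (n = 2, m = 1), torque u acts on disc 2 through spring k:
#   J1*phi1'' = k*(phi2 - phi1)
#   J2*phi2'' = -k*(phi2 - phi1) + u
# phi1 is a flat output: phi2 and u follow from phi1 and its derivatives.

J1, J2, k = 0.02, 0.05, 4.0        # kg m^2, kg m^2, N m/rad (assumed values)
w = 2 * np.pi                      # prescribed 1 Hz oscillation

t = np.linspace(0.0, 2.0, 1000)
phi1 = 0.3 * np.sin(w * t)         # specified output trajectory (rad)
phi1_dd = -0.3 * w**2 * np.sin(w * t)   # second derivative
phi1_4 = 0.3 * w**4 * np.sin(w * t)     # fourth derivative

phi2 = phi1 + J1 * phi1_dd / k     # from the disc-1 equation
phi2_dd = phi1_dd + J1 * phi1_4 / k
u = J2 * phi2_dd + k * (phi2 - phi1)    # required input torque

print(f"peak torque: {np.abs(u).max():.3f} N m")
```

Note that u depends on the fourth derivative of the specified output, which is why the paper's numerical index-three DAE formulation is more practical for systems where such high derivatives are not available in closed form.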
Virtual Manufacturing (la Fabrication virtuelle)
1998-05-01
with moving parts and subassemblies,
• verification of product subcomponents and systems operations through kinematics studies, and
• realism …
dimensions, parts moved in mechanism-based directions, and realism of interaction increased through the use of sound, touch and other parameters. For the … direct converters from CAD systems. A simple kinematic package is also high on the requirements, to be able to simulate motions, as well as an interface to
Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.
Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David
2013-12-01
Recently, a new biometric identifier, the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user's flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor-filtering-based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score-level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without greatly increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves FKP verification accuracy.
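The adaptive binary fusion of pre- and post-reconstruction matching distances might be sketched as below. The switching threshold and weight are hypothetical stand-ins chosen for illustration, not the published rule.

```python
# Score-level adaptive binary fusion (simplified stand-in): fuse the
# matching distance before reconstruction (d_before) with the distance
# after reconstruction (d_after). A binary switch decides when to trust
# the post-reconstruction score, so genuine queries hurt by pose variation
# benefit, while already-good matches are not perturbed.
# Threshold t and weight w are hypothetical, not the paper's values.

def adaptive_binary_fusion(d_before, d_after, t=0.5, w=0.7):
    if d_before <= t:
        # Query already matches well: keep the original distance.
        return d_before
    # Ambiguous query (possibly pose-distorted genuine): mix in the
    # post-reconstruction distance, but keep part of the original one
    # so impostors are not over-trusted.
    return w * d_after + (1 - w) * d_before

# Genuine query hurt by pose variation: large d_before, small d_after
print(adaptive_binary_fusion(0.8, 0.3))  # fused score drops toward d_after
# Impostor: both distances stay large, fused score stays large
print(adaptive_binary_fusion(0.9, 0.7))
```

The design intent is that the fused score separates the two populations better than either distance alone, lowering false rejections at a fixed false-acceptance rate.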
Yassin, Ali A
2014-01-01
The security of digital images is considered increasingly essential, and the fingerprint plays a main role in the world of images. Fingerprint recognition is a biometric verification scheme that applies pattern-recognition techniques to an individual's fingerprint image. In the cloud environment, an adversary has the ability to intercept information, so data must be secured from eavesdroppers. Unfortunately, encryption and decryption functions are slow and often difficult, fingerprint techniques require extra hardware and software, and fingerprints can be masqueraded by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint and the discrete wavelet transform to obtain a new fingerprint verification scheme. Our proposed scheme overcomes these problems: it does not require extra cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme performs well for user fingerprint verification.
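The idea of partially encrypting only part of a wavelet decomposition can be sketched with a one-level Haar DWT: only the low-frequency (LL) subband is encrypted, cutting the work for large image volumes. The additive keystream below is a stand-in for a real cipher, and none of this reproduces the paper's exact scheme.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar DWT (unnormalized averages/differences)."""
    a = (img[0::2] + img[1::2]) / 2          # rows: average
    d = (img[0::2] - img[1::2]) / 2          # rows: detail
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2, (a[:, 0::2] - a[:, 1::2]) / 2
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2, (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Exact inverse of haar_dwt2 (perfect reconstruction)."""
    a = np.repeat(ll, 2, axis=1); a[:, 0::2] += lh; a[:, 1::2] -= lh
    d = np.repeat(hl, 2, axis=1); d[:, 0::2] += hh; d[:, 1::2] -= hh
    out = np.repeat(a, 2, axis=0)
    out[0::2] += d; out[1::2] -= d
    return out

# Synthetic "fingerprint" image and a keyed keystream for the LL subband
rng = np.random.default_rng(0)
img = rng.random((8, 8))
ll, lh, hl, hh = haar_dwt2(img)
key = np.random.default_rng(42).random(ll.shape)   # keystream stand-in

cipher = haar_idwt2(ll + key, lh, hl, hh)          # encrypt LL only
ll2, lh2, hl2, hh2 = haar_dwt2(cipher)
restored = haar_idwt2(ll2 - key, lh2, hl2, hh2)    # decrypt: remove keystream
print(np.allclose(restored, img))
```

Because most of the image energy sits in the LL subband, scrambling just that quarter of the coefficients degrades the image substantially while encrypting only a fraction of the data.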
Quantum money with nearly optimal error tolerance
NASA Astrophysics Data System (ADS)
Amiri, Ryan; Arrazola, Juan Miguel
2017-06-01
We present a family of quantum money schemes with classical verification which display a number of benefits over previous proposals. Our schemes are based on hidden matching quantum retrieval games and tolerate noise up to 23%, which we conjecture reaches 25% asymptotically as the dimension of the underlying hidden matching states is increased. Furthermore, we prove that 25% is the maximum tolerable noise for a wide class of quantum money schemes with classical verification, meaning our schemes are almost optimally noise tolerant. We use methods in semidefinite programming to prove security in a substantially different manner from previous proposals, leading to two main advantages: first, coin verification involves only a constant number of states (with respect to coin size), thereby allowing for smaller coins; second, the reusability of coins within our scheme grows linearly with the size of the coin, which is known to be optimal. Last, we suggest methods by which the coins in our protocol could be implemented using weak coherent states and verified using existing experimental techniques, even in the presence of detector inefficiencies.
Authentication Based on Pole-zero Models of Signature Velocity
Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad
2013-01-01
With the increase of communication and financial transactions through the internet, online signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise method is proposed for modeling, and features are then extracted from the strokes. Using linear, Parzen-window and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and has demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) signature benchmark databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62% and 3.91% on skilled forgeries, respectively. PMID:24696797
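DCT-based compression of a per-stroke velocity signal, the representation underlying the pole-zero features described above, might look as follows. The synthetic velocity profile and the number of retained coefficients are hypothetical choices for illustration.

```python
import numpy as np

def dct2_type2(x):
    """Type-II DCT of a 1-D signal (naive O(N^2) form, no normalization)."""
    n = np.arange(len(x))
    return np.array([np.sum(x * np.cos(np.pi * (n + 0.5) * k / len(x)))
                     for k in range(len(x))])

def idct2_type2(X):
    """Inverse of dct2_type2 (Type-III DCT with matching scaling)."""
    N = len(X)
    k = np.arange(1, N)
    return np.array([(X[0] / 2
                      + np.sum(X[1:] * np.cos(np.pi * k * (n + 0.5) / N)))
                     * 2 / N for n in range(N)])

# Synthetic pen-velocity stroke: two smooth bursts, 128 samples
t = np.linspace(0.0, 1.0, 128)
velocity = np.exp(-((t - 0.4) / 0.1)**2) + 0.6 * np.exp(-((t - 0.7) / 0.05)**2)

K = 16                                   # retain 16 low-order coefficients
coeffs = dct2_type2(velocity)
features = coeffs[:K]                    # compact per-stroke feature vector
approx = idct2_type2(np.r_[features, np.zeros(len(t) - K)])
err = np.linalg.norm(velocity - approx) / np.linalg.norm(velocity)
print(f"relative reconstruction error with {K} of {len(t)} coefficients: {err:.3f}")
```

Because smooth velocity profiles concentrate their energy in the low-order DCT coefficients, a short feature vector captures the stroke shape, which is what makes such features stable per writer and usable by the classifiers mentioned above.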
Cho, Sangwoo; Ku, Jeonghun; Park, Jinsick; Han, Kiwan; Lee, Hyeongrae; Choi, You Kyong; Jung, Young-Chul; Namkoong, Kee; Kim, Jae-Jin; Kim, In Young; Kim, Sun I; Shen, Dong Fan
2008-06-01
Alcoholism is a disease that affects parts of the brain that control emotion, decisions, and behavior. Therapy for people with alcoholism must address coping skills for facing high-risk situations, so it is important to develop tools that mimic such conditions. Cue exposure therapy (CET) presents high-risk situations during treatment, which raises the individual's ability to recognize that alcohol craving is being induced. With CET, however, it is hard to simulate situations that induce alcohol craving. By contrast, virtual reality (VR) approaches can present realistic situations that cannot be experienced directly in CET. We therefore hypothesized that it is possible to model social pressure situations using VR. We developed a VR system for inducing alcohol craving under social pressure and measured both the induced alcohol craving and the head gaze of participants. A 2 x 2 experimental model (alcohol-related locality vs. social pressure) was designed. In situations without an avatar (no social pressure), more alcohol craving was induced if alcohol was present than if it was not, and more alcohol craving was induced in situations with an avatar (social pressure) than in situations without one. The angle between the direction of head gaze and the direction of the alcohol or avatar was smaller in situations with an avatar alone than in situations with alcohol alone. In situations with both alcohol and an avatar, the angle between the head gaze and the avatar was smaller than that between the head gaze and the alcohol. These results indicate that this VR system induces alcohol craving using an avatar that can express various social pressure situations.
Modeling interfacial fracture in Sierra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang
2013-09-01
This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using results from asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.
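The cohesive surface elements described above implement a traction-separation law. As a minimal sketch, a generic bilinear law is shown below; the function name, parameters, and values are illustrative assumptions, not SIERRA/SM's actual model:

```python
def bilinear_traction(delta, delta0, delta_f, t_max):
    """Bilinear cohesive traction-separation law: traction rises
    linearly to t_max at opening delta0, then softens linearly to
    zero at the failure opening delta_f (fully debonded)."""
    if delta <= 0:
        return 0.0
    if delta < delta0:
        return t_max * delta / delta0                           # elastic loading
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)   # softening
    return 0.0                                                   # surface fully separated
```

The area under the curve, 0.5 * t_max * delta_f, is the fracture energy, which is the quantity typically calibrated against ADCB/ENF-style delamination data.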
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in the discovery and verification stages prior to the clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
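The sample-size calculation above can be illustrated with the standard normal-approximation formula for a two-sided two-sample z-test. This is a textbook sketch, not the workshop's actual framework; the function name and defaults are assumptions.

```python
from math import ceil
from statistics import NormalDist

def samples_per_group(effect_size, alpha=0.05, power=0.8):
    """Per-group biospecimen count for a two-sided two-sample z-test,
    using the normal-approximation formula
    n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 / d^2,
    where d is the standardized effect size."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for the test
    z_b = NormalDist().inv_cdf(power)           # quantile for the target power
    return ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)
```

For a medium effect (d = 0.5) at the conventional alpha = 0.05 and 80% power, this yields 63 biospecimens per group, illustrating why underpowered studies with a handful of samples per arm cannot support biomarker verification.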
Vavrek, Jayson R; Henderson, Brian S; Danagoulian, Areg
2018-04-24
Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618-8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy "genuine" and "hoax" objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
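The statistical comparison of candidate and template spectra can be sketched with a simple Poisson counting test on a region of interest. This is an illustrative simplification of the protocol's actual hypothesis test; the function and its inputs are assumptions.

```python
from math import sqrt

def roi_significance(candidate_counts, template_counts):
    """Compare total counts in a region of interest (e.g. near the
    2.2 MeV NRF lines) between candidate and template spectra.
    Under Poisson statistics each total has variance equal to its
    mean, so the difference is scored in standard deviations."""
    diff = candidate_counts - template_counts
    sigma = sqrt(candidate_counts + template_counts)
    return diff / sigma
```

A hoax lacking the isotopes of a genuine warhead alters the NRF line intensities reaching the detectors, so a large |z| flags the mismatch, while a genuine candidate yields a z-score consistent with statistical fluctuation.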
Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code
NASA Astrophysics Data System (ADS)
Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.
2015-12-01
WEC-Sim is an open source code to model the performance of wave energy converters in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion in 6 degrees of freedom using the Cummins time-domain impulse-response formulation. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing focused on device characterization and was completed in Fall 2015. Phase 2 focuses on WEC performance and is scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields and motions in 6 DOF, as well as multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system that can be used to generate or absorb wave energy.
Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
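The Cummins formulation referenced above can be sketched in one degree of freedom. This is a minimal illustration, not WEC-Sim's SimMechanics implementation: the coefficients, radiation kernel, rectangle-rule convolution, and semi-implicit Euler integrator are all simplifying assumptions.

```python
import numpy as np

def cummins_1dof(F_exc, K_r, M, A_inf, C, dt):
    """March the 1-DOF Cummins equation forward in time:
        (M + A_inf) a(t) + int_0^t K_r(t - tau) v(tau) dtau + C x(t) = F_exc(t)
    where A_inf is the infinite-frequency added mass, K_r the radiation
    impulse-response kernel, and C the hydrostatic stiffness. The
    radiation-memory convolution is evaluated with a rectangle rule."""
    n = len(F_exc)
    x = np.zeros(n)  # displacement
    v = np.zeros(n)  # velocity
    for i in range(n - 1):
        # convolve the radiation kernel with the velocity history
        mem = dt * np.sum(K_r[:i + 1][::-1] * v[:i + 1])
        a = (F_exc[i] - mem - C * x[i]) / (M + A_inf)
        v[i + 1] = v[i] + a * dt           # semi-implicit Euler step
        x[i + 1] = x[i] + v[i + 1] * dt
    return x
```

Under a constant exciting force the memory term damps the transient and the body settles at the hydrostatic offset F/C, which is a convenient sanity check for any time-domain implementation.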
Report #12-P-0747, August 30, 2012. Recent studies corroborate EPA’s claims that its SmartWay Transport Partnership program helps remove marketplace barriers in order to deploy fuel efficient technologies faster.