Sample records for initial experimental verification

  1. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on existing experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research on modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low-frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  2. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software, denoted 'MUV' (monitor unit verification), for patient-specific quality assurance (QA). In total, 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 +/- 1.2% and 0.5 +/- 1.1% (1 S.D.), respectively. The dose deviations between MUV and TPS depended slightly on the distance from the isocentre position. For individual intensity-modulated beams (367 in total), an average deviation of 1.1 +/- 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable.
The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
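The acceptance criteria quoted above lend themselves to a simple pass/fail check. The sketch below encodes them under assumptions not stated in the abstract: the percentage and absolute limits are treated as alternatives (a point passes if either holds), and the function and parameter names are ours.

```python
def passes_verification(measured_cGy, calculated_cGy, prescribed_cGy,
                        off_axis_cm=0.0, low_dose=False):
    """Pass/fail check for one verification point, using the confidence
    limits quoted in the abstract: 3% of prescribed dose or 6 cGy,
    relaxed to 5% or 10 cGy off-axis (>5 cm) and in low-dose regions."""
    relaxed = off_axis_cm > 5.0 or low_dose
    pct_limit, abs_limit_cGy = (5.0, 10.0) if relaxed else (3.0, 6.0)
    deviation_cGy = abs(measured_cGy - calculated_cGy)
    # the point passes if either the relative or the absolute limit holds
    return (deviation_cGy <= prescribed_cGy * pct_limit / 100.0
            or deviation_cGy <= abs_limit_cGy)
```

For example, a 4 cGy deviation at a 200 cGy prescription passes the routine 3%/6 cGy limit, while a 15 cGy deviation does not.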

  3. National Centers for Environmental Prediction

    Science.gov Websites

  4. National Centers for Environmental Prediction

    Science.gov Websites

  5. AGARD Index of Publications, 1977 - 1979.

    DTIC Science & Technology

    1980-08-01

  6. Verification of elastic-wave static displacement in solids. [using ultrasonic techniques on Ge single crystals]

    NASA Technical Reports Server (NTRS)

    Cantrell, J. H., Jr.; Winfree, W. P.

    1980-01-01

    The solution of the nonlinear differential equation which describes an initially sinusoidal finite-amplitude elastic wave propagating in a solid contains a static-displacement term in addition to the harmonic terms. The static-displacement amplitude is theoretically predicted to be proportional to the product of the squares of the driving-wave amplitude and the driving-wave frequency. The first experimental verification of the elastic-wave static displacement in a solid (the 111 direction of single-crystal germanium) is reported, and agreement is found with the theoretical predictions.
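The predicted scaling can be stated compactly: the static-displacement amplitude grows as the square of both the driving amplitude and the driving frequency. A minimal sketch, in which the proportionality constant k is material-dependent and purely illustrative:

```python
def static_displacement(amplitude, frequency, k=1.0):
    """Static-displacement amplitude predicted by the nonlinear theory:
    proportional to (driving amplitude)^2 * (driving frequency)^2.
    k stands in for the material-dependent constant (illustrative)."""
    return k * amplitude**2 * frequency**2

# doubling the drive amplitude quadruples the static displacement;
# tripling the frequency multiplies it by nine
```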

  7. National Centers for Environmental Prediction

    Science.gov Websites

  8. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. 
Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References. (Corresponding author address: Jonathan Case, ENSCO, Inc., 320 Sparkman Dr., Room 3008, Huntsville, AL 35805. Email: Jonathan.Case-1@nasa.gov)

  9. Experimental verification of PSM polarimetry: monitoring polarization at 193nm high-NA with phase shift masks

    NASA Astrophysics Data System (ADS)

    McIntyre, Gregory; Neureuther, Andrew; Slonaker, Steve; Vellanki, Venu; Reynolds, Patrick

    2006-03-01

    The initial experimental verification of a polarization monitoring technique is presented. A series of phase shifting mask patterns produce polarization dependent signals in photoresist and are capable of monitoring the Stokes parameters of any arbitrary illumination scheme. Experiments on two test reticles have been conducted. The first reticle consisted of a series of radial phase gratings (RPG) and employed special apertures to select particular illumination angles. Measurement sensitivities of about 0.3 percent of the clear field per percent change in polarization state were observed. The second test reticle employed the more sensitive proximity effect polarization analyzers (PEPA), a more robust experimental setup, and a backside pinhole layer for illumination angle selection and to enable characterization of the full illuminator. Despite an initial complication with the backside pinhole alignment, the results correlate with theory. Theory suggests that, once the pinhole alignment is corrected in the near future, the second reticle should achieve a measurement sensitivity of about 1 percent of the clear field per percent change in polarization state. This corresponds to a measurement of the Stokes parameters after test mask calibration, to within about 0.02 to 0.03. Various potential improvements to the design, fabrication of the mask, and experimental setup are discussed. Additionally, to decrease measurement time, a design modification and double exposure technique is proposed to enable electrical detection of the measurement signal.

  10. Plasma Model V&V of Collisionless Electrostatic Shock

    NASA Astrophysics Data System (ADS)

    Martin, Robert; Le, Hai; Bilyeu, David; Gildea, Stephen

    2014-10-01

    A simple 1D electrostatic collisionless shock was selected as an initial validation and verification test case for a new plasma modeling framework under development at the Air Force Research Laboratory's In-Space Propulsion branch (AFRL/RQRS). Cross verification between PIC, Vlasov, and Fluid plasma models within the framework along with expected theoretical results will be shown. The non-equilibrium velocity distributions (VDF) captured by PIC and Vlasov will be compared to each other and the assumed VDF of the fluid model at selected points. Validation against experimental data from the University of California, Los Angeles double-plasma device will also be presented along with current work in progress at AFRL/RQRS towards reproducing the experimental results using higher fidelity diagnostics to help elucidate differences between model results and between the models and original experiment. DISTRIBUTION A: Approved for public release; unlimited distribution; PA (Public Affairs) Clearance Number 14332.

  11. TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childerson, M.T.; Fujita, R.K.

    1985-01-01

    A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small-break loss-of-coolant accident (LOCA) data from a scaled experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small-break LOCA data set for TRAC verification. The major phases of a small-break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater-initiated boiler-condenser mode heat transfer.

  12. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2014-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  13. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2015-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  14. Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes

    NASA Astrophysics Data System (ADS)

    Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.

    2018-04-01

    To provide a better understanding of pultrusion processes, with or without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time-integration scheme and the nodal control volume method has been developed. In the present study, its experimental validation is carried out using the developed cure sensors, which measure the electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used for simulating the pultrusion of a rod profile has been corrected and finalized.

  15. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. 
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
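The abstract's core idea, comparing a numerical solution against a closed-form analytical one and reporting an accuracy assessment, can be illustrated on a toy problem. This is not PFLOTRAN or its QA suite; it is a self-contained 1-D heat-equation analogue with assumed grid sizes and tolerances:

```python
import math

def max_verification_error(nx=51, alpha=1.0, t_end=0.05):
    """Integrate the 1-D heat equation T_t = alpha * T_xx on [0, 1] with
    an explicit finite-difference scheme, then compare against the exact
    solution T(x, t) = sin(pi x) exp(-pi^2 alpha t) for that initial and
    boundary data. Returns the maximum pointwise error."""
    dx = 1.0 / (nx - 1)
    nsteps = math.ceil(t_end / (0.4 * dx * dx / alpha))  # keep r <= 0.4
    dt = t_end / nsteps
    r = alpha * dt / dx**2
    T = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(nsteps):
        T = ([0.0]
             + [T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
                for i in range(1, nx - 1)]
             + [0.0])
    exact = [math.sin(math.pi * i * dx) * math.exp(-math.pi**2 * alpha * t_end)
             for i in range(nx)]
    return max(abs(num - ref) for num, ref in zip(T, exact))
```

A QA harness in the spirit described above would assert that this error stays below a documented tolerance and shrinks as the grid is refined.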

  16. Experimental preparation and verification of quantum money

    NASA Astrophysics Data System (ADS)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.
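The abstract's argument, that measuring many states in a verification round pins down the forging probability, has the flavor of a concentration bound. The sketch below is purely illustrative, a one-sided Hoeffding-style bound, and is not the security analysis actually used in the paper:

```python
import math

def resolvable_gap(n_states, failure_probability):
    """Illustrative only: the gap D between an observed error rate and a
    tolerated threshold such that exp(-2 * n * D^2), a Hoeffding-style
    tail bound, falls below failure_probability. With n = 3.6e6 states
    and a 1e-7 target, gaps of roughly 1.5e-3 become statistically
    resolvable; the paper's own analysis differs in detail."""
    return math.sqrt(math.log(1.0 / failure_probability) / (2.0 * n_states))
```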

  17. Operation SUN BEAM. Shot Small Boy, Project Officers’ Report. Project 2.1. Initial Radiation Measurements

    DTIC Science & Technology

    1981-05-01

  18. Determination of initial conditions for heat exchanger placed in furnace by burning pellets

    NASA Astrophysics Data System (ADS)

    Durčanský, Peter; Jandačka, Jozef; Kapjor, Andrej

    2014-08-01

    Objective of the experimental facility and subsequent measurements is generally determine whether the expected physical properties of the verification, identification of the real behavior of the proposed system, or part thereof. For the design of heat exchanger for combined energy machine is required to identify and verify a large number of parameters. One of these are the boundary conditions of heat exchanger and pellets burner.

  19. Optically Pumped Coherent Mechanical Oscillators: The Laser Rate Equation Theory and Experimental Verification

    DTIC Science & Technology

    2012-10-23

  20. Mathematical Modeling of Ni/H2 and Li-Ion Batteries

    NASA Technical Reports Server (NTRS)

    Weidner, John W.; White, Ralph E.; Dougal, Roger A.

    2001-01-01

    The modelling effort outlined in this viewgraph presentation encompasses the following topics: 1) Electrochemical Deposition of Nickel Hydroxide; 2) Deposition rates of thin films; 3) Impregnation of porous electrodes; 4) Experimental Characterization of Nickel Hydroxide; 5) Diffusion coefficients of protons; 6) Self-discharge rates (i.e., oxygen-evolution kinetics); 7) Hysteresis between charge and discharge; 8) Capacity loss on cycling; 9) Experimental Verification of the Ni/H2 Battery Model; 10) Mathematical Modeling of Li-Ion Batteries; 11) Experimental Verification of the Li-Ion Battery Model; 12) Integrated Power System Models for Satellites; and 13) Experimental Verification of the Integrated-Systems Model.

  1. National Centers for Environmental Prediction

    Science.gov Websites

  2. 40 CFR 1065.303 - Summary of required calibration and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... initial installation and after major maintenance. THC FID optimization, and THC FID verification Optimize and determine CH4 response for THC FID analyzers: upon initial installation and after major maintenance. Verify CH4 response for THC FID analyzers: upon initial installation, within 185 days before...
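The schedule in this snippet is mechanical enough to check programmatically. A sketch, with assumed function names and a calendar-day reading of the 185-day window:

```python
from datetime import date, timedelta

def ch4_verification_current(last_verified: date, test_date: date,
                             window_days: int = 185) -> bool:
    """True if the CH4 response verification of a THC FID analyzer was
    performed within window_days (185 per the snippet) before testing.
    Names and the strict calendar-day interpretation are assumptions."""
    age = test_date - last_verified
    return timedelta(0) <= age <= timedelta(days=window_days)
```

A verification performed 152 days before a test would still be current; one from the previous year would not.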

  3. 40 CFR 1065.303 - Summary of required calibration and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... initial installation and after major maintenance. THC FID optimization, and THC FID verification Optimize and determine CH4 response for THC FID analyzers: upon initial installation and after major maintenance. Verify CH4 response for THC FID analyzers: upon initial installation, within 185 days before...

  4. Reducing software security risk through an integrated approach research initiative: model-based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  5. Study on verifying the angle measurement performance of the rotary-laser system

    NASA Astrophysics Data System (ADS)

    Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui

    2018-04-01

    An angle verification method to verify the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on verifying the angle-measuring uncertainty of the rotary-laser system, it still has limitations. High-precision reference angles are used in the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also examines the error that has the biggest influence on the verification system. Some errors of the verification system are avoided via the experimental method, and some are compensated through the computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirement for coordinate measurement. The verification platform can efficiently evaluate the uncertainty of angle measurement for the rotary-laser system.

  6. Investigation of the trajectories and length of combustible gas jet flames in a sweeping air stream

    NASA Astrophysics Data System (ADS)

    Polezhaev, Yu. V.; Mostinskii, I. L.; Lamden, D. I.; Stonik, O. G.

    2011-05-01

    The trajectories of round gas jets and jet flames introduced into a sweeping air stream are studied. The influence of various initial conditions and of the physical properties of gases on the trajectory is considered. Experimental verification of the available approximation relations for the trajectories of flames in a wide range of the values of the blowing ratio has been carried out. It is shown that the newly obtained experimental approximation of the trajectory shape differs from the existing ones by about 20%. At small values of the blowing ratio (smaller than ~4.5) the flame trajectories cease to depend on it.

  7. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

  8. Verification of the proteus two-dimensional Navier-Stokes code for flat plate and pipe flows

    NASA Technical Reports Server (NTRS)

    Conley, Julianne M.; Zeman, Patrick L.

    1991-01-01

    The Proteus Navier-Stokes Code is evaluated for 2-D/axisymmetric, viscous, incompressible, internal, and external flows. The particular cases to be discussed are laminar and turbulent flows over a flat plate, laminar and turbulent developing pipe flows, and turbulent pipe flow with swirl. Results are compared with exact solutions, empirical correlations, and experimental data. A detailed description of the code set-up, including boundary conditions, initial conditions, grid size, and grid packing is given for each case.
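One of the empirical correlations such a comparison typically uses for turbulent pipe flow is the Blasius friction-factor law. A sketch of an acceptance check of this kind; the 5% tolerance and function names are our assumptions, not values from the paper:

```python
def blasius_friction_factor(reynolds):
    """Blasius correlation for the Darcy friction factor in smooth
    turbulent pipe flow, valid roughly for 4e3 < Re < 1e5."""
    return 0.316 * reynolds ** -0.25

def matches_correlation(computed_f, reynolds, rel_tol=0.05):
    """Accept a computed friction factor if it lies within rel_tol of
    the Blasius value (tolerance chosen for illustration)."""
    reference = blasius_friction_factor(reynolds)
    return abs(computed_f - reference) / reference <= rel_tol
```

At Re = 10⁴ the correlation gives f = 0.0316, so a computed value of 0.032 would pass a 5% check.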

  9. Experimental verification of a model of a two-link flexible, lightweight manipulator. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Huggins, James David

    1988-01-01

    Experimental verification is presented for an assumed-modes model of a large, two-link, flexible manipulator designed and constructed in the School of Mechanical Engineering at the Georgia Institute of Technology. The structure was designed to have typical characteristics of a lightweight manipulator.

  10. Thermal noise in space-charge-limited hole current in silicon

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Golder, J.; Nicolet, M.

    1972-01-01

    Present theories on noise in single-carrier space-charge-limited currents in solids have not been quantitatively substantiated by experimental evidence. To obtain such experimental verification, the noise in specially fabricated silicon structures is being measured and analyzed. The first results of this verification effort are reported.

  11. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  12. 40 CFR 1065.303 - Summary of required calibration and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: FID calibration, THC FID optimization, and THC FID verification. Calibrate all FID analyzers: upon initial installation and after major maintenance. Optimize and determine CH4 response for THC FID analyzers: upon initial installation and after major maintenance. Verify CH4 response for THC FID analyzers: upon...

  13. 40 CFR 1065.303 - Summary of required calibration and verifications

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... THC FID optimization, and THC FID verification Optimize and determine CH4 response for THC FID analyzers: Upon initial installation and after major maintenance. Verify CH4 response for THC FID analyzers.... For THC FID analyzers: Upon initial installation, after major maintenance, and after FID optimization...

  14. 40 CFR 1065.303 - Summary of required calibration and verifications

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... THC FID optimization, and THC FID verification Optimize and determine CH4 response for THC FID analyzers: Upon initial installation and after major maintenance. Verify CH4 response for THC FID analyzers.... For THC FID analyzers: Upon initial installation, after major maintenance, and after FID optimization...

  15. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. 
The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally low in both experiments using traditional verification methodologies. Output from object-based verification within NCAR's Model Evaluation Tools reveals that the WRF runs initialized with LIS+MODIS data consistently generated precipitation objects that better matched observed precipitation objects, especially at higher precipitation intensities. The LIS+MODIS runs produced on average a 4% increase in matched precipitation areas and a simultaneous 4% decrease in unmatched areas during three months of daily simulations.
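The matched/unmatched area statistic reported above can be illustrated with a toy calculation. This is a deliberate simplification of object-based verification, not the actual MET algorithm; the grid masks and the function name are hypothetical:

```python
# Toy sketch: forecast and observed precipitation "objects" as binary masks
# on a grid; score the fraction of forecast cells that overlap observed cells.
# Illustrative only -- real object-based verification also matches whole
# objects by shape, intensity, and displacement.

def matched_area_fraction(forecast, observed):
    """Fraction of forecast grid cells that coincide with observed cells."""
    fc_cells = sum(cell for row in forecast for cell in row)
    if fc_cells == 0:
        return 0.0
    matched = sum(f and o
                  for frow, orow in zip(forecast, observed)
                  for f, o in zip(frow, orow))
    return matched / fc_cells

# Hypothetical 4x4 rain masks (1 = precipitation above a threshold)
fc  = [[1, 1, 0, 0],
       [1, 1, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 1, 1]]
obs = [[1, 1, 0, 0],
       [1, 0, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 1]]

print(matched_area_fraction(fc, obs))  # 4 of the 6 forecast cells matched
```

A higher fraction after swapping in the LIS+MODIS initialization would correspond to the 4% improvement quoted in the abstract.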

  16. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    ERIC Educational Resources Information Center

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on a sheet of paper, which closes from below a cylinder completely or partially filled with water, and the hydrostatic pressure of the water column acting against the atmospheric pressure. First of all, this paper solves a theoretical…

  17. Experimental Verification of Boyle's Law and the Ideal Gas Law

    ERIC Educational Resources Information Center

    Ivanov, Dragia Trifonov

    2007-01-01

    Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…
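The saturated-vapour correction mentioned in the abstract can be sketched numerically: the measured total pressure includes the water vapour pressure, which must be subtracted before the trapped air alone obeys Boyle's law (p·V = const at fixed temperature). The function name, pressures, and volumes below are hypothetical illustration values, not the paper's data:

```python
# Minimal sketch of the vapour-pressure correction for a Boyle's-law check.
# All numbers are hypothetical illustration values.

P_SAT = 2.3   # kPa, approx. saturated water vapour pressure near 20 degC

def air_pV(p_total_kpa, volume_ml):
    """p*V product of the dry-air component, vapour pressure removed."""
    return (p_total_kpa - P_SAT) * volume_ml

# Two hypothetical states of the same trapped air sample
state1 = air_pV(102.3, 50.0)    # (102.3 - 2.3) * 50  ~ 5000
state2 = air_pV(52.3, 100.0)    # (52.3 - 2.3) * 100  ~ 5000
print(state1, state2)           # nearly equal: Boyle's law holds for the air
```

Without subtracting P_SAT, the two products would differ, which is the systematic error the experiments are designed to expose.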

  18. Numerical verification of three point bending experiment of magnetorheological elastomer (MRE) in magnetic field

    NASA Astrophysics Data System (ADS)

    Miedzinska, Danuta; Boczkowska, Anna; Zubko, Konrad

    2010-07-01

    In this article, a method of numerical verification of experimental results for magnetorheological elastomer (MRE) samples is presented. The samples were shaped into cylinders with a diameter of 8 mm and a height of 20 mm, with various carbonyl iron volume shares (1.5%, 11.5%, and 33%). The diameter of the soft ferromagnetic particles ranged from 6 to 9 μm. During the experiment, initially bent samples were exposed to magnetic field levels of 0.1 T, 0.3 T, 0.5 T, 0.7 T, and 1 T. The reaction of the sample to the field was measured as a displacement of the specimen. Numerical calculation was carried out with the MSC Patran/Marc computer code. For the purpose of the numerical analysis, an orthotropic material model was applied, with the material properties of the magnetorheological elastomer along the iron chains and of the pure elastomer along the other directions. The material properties were obtained from experimental tests. During the numerical analysis, the initial mechanical load resulting from cylinder deflection was set. Then, an equivalent external force, determined from analytical calculations of the intermolecular reaction within the iron chains in the specific magnetic field, was applied to the bent sample. The correspondence of this numerical model with the results of the experiment was verified. The agreement among the experiments and both the theoretical and FEM analyses indicates that macroscopic modeling of the magnetorheological elastomer's mechanical properties as an orthotropic material delivers a sufficiently accurate description of the material's behavior.

  19. Modeling interfacial fracture in Sierra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang

    2013-09-01

    This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.

  20. Experimental verification of vapor deposition rate theory in high velocity burner rigs

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.; Santoro, Gilbert J.

    1985-01-01

    The main objective has been the experimental verification of the corrosive vapor deposition theory in high-temperature, high-velocity environments. Toward this end, a Mach 0.3 burner-rig apparatus was built to measure deposition rates from salt-seeded (mostly Na salts) combustion gases on an internally cooled cylindrical collector. Deposition experiments are underway.

  1. Experimental verification of dynamic simulation

    NASA Technical Reports Server (NTRS)

    Yae, K. Harold; Hwang, Howyoung; Chern, Su-Tai

    1989-01-01

    The dynamics model here is a backhoe, which is a four-degree-of-freedom manipulator from the dynamics standpoint. Two types of experiment are chosen that can also be simulated by a multibody dynamics simulation program. In the experiment, the configuration and force histories were recorded in the time domain; that is, velocity and position, along with the force output and differential pressure change from the hydraulic cylinder. When the experimental force history is used as the driving force in the simulation model, the forward dynamics simulation produces a corresponding configuration history. Then, the experimental configuration history is used in the inverse dynamics analysis to generate a corresponding force history. Therefore, two sets of configuration and force histories--one set from experiment, and the other from the simulation driven forward and backward with the experimental data--are compared in the time domain. Further comparisons are made regarding the effects of initial conditions, friction, and viscous damping.
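The forward/inverse round-trip described above can be sketched for a single degree of freedom (the actual work used a 4-DOF hydraulic backhoe and a multibody code; the mass, time step, and force history below are hypothetical):

```python
# 1-DOF sketch: drive a point mass forward in time with a recorded force
# history, then run inverse dynamics (F = M * dv/dt) on the simulated motion
# and recover that same force history for comparison.

M, DT = 10.0, 0.01   # hypothetical mass [kg] and time step [s]
force_history = [5.0 if k < 50 else -5.0 for k in range(100)]  # push, pull

# Forward dynamics: integrate a = F/M with semi-implicit Euler
x, v = 0.0, 0.0
positions, velocities = [x], [v]
for F in force_history:
    v += (F / M) * DT
    x += v * DT
    velocities.append(v)
    positions.append(x)

# Inverse dynamics: finite-difference the velocity history back into forces
recovered = [M * (velocities[k + 1] - velocities[k]) / DT
             for k in range(len(force_history))]

max_err = max(abs(a - b) for a, b in zip(force_history, recovered))
print(max_err)  # ~0: the two force histories agree in the time domain
```

In the real experiment the two histories differ by unmodeled friction and damping, which is exactly what the time-domain comparison is meant to reveal.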

  2. 19 CFR 181.75 - Issuance of origin determination.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... origin verification initiated under § 181.72(a) of this part in regard to a good imported into the United... the origin verification, Customs shall provide the exporter or producer whose good is the subject of the verification with a written determination of whether the good qualifies as an originating good...

  3. Toward Improved Land Surface Initialization in Support of Regional WRF Forecasts at the Kenya Meteorological Department

    NASA Technical Reports Server (NTRS)

    Case, Jonathan; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.

    2014-01-01

    Flooding and drought are two key forecasting challenges for the Kenya Meteorological Department (KMD). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the boundary layer of the atmosphere providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-end events over east Africa. KMD currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Nonhydrostatic Mesoscale Model (NMM) dynamical core. They make use of the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over eastern Africa. Two organizations at the National Aeronautics and Space Administration Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMD for enhancing its regional modeling capabilities. To accomplish this goal, SPoRT and SERVIR will provide experimental land surface initialization datasets and model verification capabilities to KMD. 
To produce a land-surface initialization more consistent with the resolution of the KMD-WRF runs, the NASA Land Information System (LIS) will be run at a comparable resolution to provide real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. Additionally, real-time green vegetation fraction data from the Visible Infrared Imaging Radiometer Suite will be incorporated into the KMD-WRF runs, once it becomes publicly available from the National Environmental Satellite Data and Information Service. Finally, model verification capabilities will be transitioned to KMD using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. The transition of these MET tools will enable KMD to monitor model forecast accuracy in near real time. This presentation will highlight preliminary verification results of WRF runs over east Africa using the LIS land surface initialization.

  4. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser employing a two-external-cavity feedback structure for the laser self-mixing interference (SMI) phenomenon is investigated and analyzed. An SMI model with two feedback directions, based on the Fabry-Perot (F-P) cavity, is derived, and numerical simulation and experimental verification are conducted. Experimental results show that the SMI with the two-external-cavity feedback structure under weak optical feedback is similar to the sum of two SMIs.

  5. Measurement of a True V̇O2max during a Ramp Incremental Test Is Not Confirmed by a Verification Phase.

    PubMed

    Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H

    2018-01-01

    The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (V̇O2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (V̇O2) during a RI test to that obtained during a verification phase aimed to confirm attainment of V̇O2max. Sixty-one healthy males [31 older (O), 65 ± 5 yrs; 30 younger (Y), 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant-load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest V̇O2 after the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and they were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not highlight an under-estimation of V̇O2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal V̇O2 elicited during the RI and the verification phase. Thus a verification phase does not add any validation of the determination of a V̇O2max. Therefore, the recommendation that a verification phase should become a gold standard procedure, although initially appealing, is not supported by the experimental data.
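The Bland-Altman comparison used above reduces to two numbers: the bias is the mean of the pairwise differences between the two measures, and the precision is their spread. A minimal sketch, using hypothetical paired values rather than the study's measurements:

```python
# Bland-Altman bias and precision for paired measurements of the same
# quantity by two methods (ramp test vs. verification phase).
# The data below are hypothetical, not from the study.
from statistics import mean, stdev

ramp  = [39.5, 42.1, 35.0, 48.3, 41.0]   # highest VO2 in ramp test
verif = [39.8, 41.9, 35.2, 48.1, 41.3]   # highest VO2 in verification phase

diffs = [r - v for r, v in zip(ramp, verif)]
bias = mean(diffs)          # systematic offset between the two measures
precision = stdev(diffs)    # spread of the individual differences
print(f"bias={bias:+.2f}, precision=+/-{precision:.2f} mL/kg/min")
```

A bias near zero with a small precision, as in the study (-0.25 ± 1.56 mL·kg⁻¹·min⁻¹), means the verification phase adds no information beyond the ramp test itself.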

  6. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  7. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.

  8. Automated solar panel assembly line

    NASA Technical Reports Server (NTRS)

    Somberg, H.

    1981-01-01

    The initial stage of the automated solar panel assembly line program was devoted to concept development and proof of approach through simple experimental verification. In this phase, laboratory bench models were built to demonstrate and verify concepts. Following this phase was machine design and integration of the various machine elements. The third phase was machine assembly and debugging. In this phase, the various elements were operated as a unit and modifications were made as required. The final stage of development was the demonstration of the equipment in a pilot production operation.

  9. WRF Simulation over the Eastern Africa by use of Land Surface Initialization

    NASA Astrophysics Data System (ADS)

    Sakwa, V. N.; Case, J.; Limaye, A. S.; Zavodsky, B.; Kabuchanga, E. S.; Mungai, J.

    2014-12-01

    The East Africa region experiences severe weather events and associated hazards of varying magnitude. Heavy precipitation leads to widespread flooding, while a lack of sufficient rainfall in some parts results in drought. Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). The land surface provides a source of heat and moisture and interacts with the boundary layer of the atmosphere, so its state influences whether excessive precipitation or prolonged drought develops. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface within weakly-sheared environments, such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in numerical weather prediction models. Improved modeling capabilities within the region have the potential to enhance forecast guidance in support of daily operations and high-impact weather over East Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) model in real time to support its daily forecasting operations, invoking the Non-hydrostatic Mesoscale Model (NMM) dynamical core. They make use of the National Oceanic and Atmospheric Administration / National Weather Service Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the WRF-NMM model runs on a 7-km regional grid over Eastern Africa. SPoRT and SERVIR provide land surface initialization datasets and model verification tools. The NASA Land Information System (LIS) provides real-time, daily soil initialization data in place of interpolated Global Forecast System soil moisture and temperature data. 
Model verification is done using the Model Evaluation Tools (MET) package, in order to quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. These MET tools enable KMS to monitor model forecast accuracy in near real time. This study highlights verification results of WRF runs over East Africa using the LIS land surface initialization.

  10. [Carl Friedrich von Weizsäcker and the Bethe-Weizsäcker cycle].

    PubMed

    Wiescher, Michael

    2014-01-01

    The carbon cycle, or Bethe-Weizsäcker cycle, plays an important role in astrophysics as one of the most important energy sources for quiescent and explosive hydrogen burning in stars. This paper presents the historical background and the contributions by Carl Friedrich von Weizsäcker and Hans Bethe, who provided the first predictions of the cycle. Furthermore, it discusses the experimental verification of the predicted process in the following decades. Also discussed is the extension of the initial carbon cycle to the CNO multi-cycles and the hot CNO cycles, which followed from detailed experimental studies of the associated nuclear reactions. Finally discussed is the impact of the experimental and theoretical results on our present understanding of hydrogen burning in different stellar environments and of the chemical evolution of our universe.

  11. NASA Low-Speed Centrifugal Compressor for Fundamental Research

    NASA Technical Reports Server (NTRS)

    Wood, J. R.; Adam, P. W.; Buggele, A. E.

    1983-01-01

    A centrifugal compressor facility being built by the NASA Lewis Research Center is described; its purpose is to obtain benchmark experimental data for internal flow code verification and modeling. The facility will be heavily instrumented with standard pressure and temperature probes and have provisions for flow visualization and laser Doppler velocimetry. The facility will accommodate rotational speeds to 2400 rpm and will be rated at pressures to 1.25 atm. The initial compressor stage for testing is geometrically and dynamically representative of modern high-performance stages with the exception of Mach number levels. Design exit tip speed for the initial stage is 500 ft/sec with a pressure ratio of 1.17. The rotor exit backsweep is 55 deg from radial.

  12. Single-Event Upset (SEU) model verification and threshold determination using heavy ions in a bipolar static RAM

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.

    1985-01-01

    Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.

  13. Study on method to simulate light propagation on tissue with characteristics of radial-beam LED based on Monte-Carlo method.

    PubMed

    Song, Sangha; Elgezua, Inko; Kobayashi, Yo; Fujie, Masakatsu G

    2013-01-01

    In biomedical optics, Monte Carlo (MC) simulation is commonly used to simulate light diffusion in tissue. However, most previous studies did not consider a radial-beam LED as the light source. We therefore considered the characteristics of a radial-beam LED and applied them to the MC simulation as the light source. In this paper, we consider three characteristics of a radial-beam LED. The first is the initial launch area of photons. The second is the incident angle of a photon at the initial launch area. The third is the refraction effect, which depends on the contact area between the LED and the turbid medium. For verification of the MC simulation, we compared simulation and experimental results. The average correlation coefficient between the simulation and experimental results is 0.9954. Through this study, we show an effective method to simulate light diffusion in tissue with the characteristics of a radial-beam LED based on MC simulation.
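The three LED characteristics listed above can be sketched as they might enter an MC photon launch: (1) a launch position sampled over the LED's emitting disc rather than a point, (2) an incident angle sampled from an angular profile, and (3) refraction of that angle at the LED/tissue boundary via Snell's law. The radius, refractive indices, and Lambertian profile below are common modeling assumptions, not the paper's values:

```python
# Toy photon-launch step for a radial-beam LED source in an MC simulation.
# Total internal reflection is ignored (angle simply clamped) in this sketch.
import math, random

LED_RADIUS = 1.0              # mm, hypothetical emitting-disc radius
N_LED, N_TISSUE = 1.5, 1.37   # assumed refractive indices (lens / tissue)

def launch_photon(rng):
    # (1) uniform position over the circular launch area
    r = LED_RADIUS * math.sqrt(rng.random())
    phi = 2 * math.pi * rng.random()
    x, y = r * math.cos(phi), r * math.sin(phi)
    # (2) incident angle from a Lambertian-like angular profile
    theta_i = math.asin(math.sqrt(rng.random()))
    # (3) Snell's law refraction entering the tissue
    theta_t = math.asin(min(1.0, (N_LED / N_TISSUE) * math.sin(theta_i)))
    return (x, y, theta_t)

rng = random.Random(0)
photons = [launch_photon(rng) for _ in range(10000)]
print(len(photons), "photons launched")
```

From here, a standard MC tissue code would propagate each photon through scattering and absorption events; only the launch step differs from a point-source simulation.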

  14. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired from the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply propose a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking to search for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.

  15. Sound production on a "coaxial saxophone".

    PubMed

    Doc, J-B; Vergez, C; Guillemain, P; Kergomard, J

    2016-11-01

    Sound production on a "coaxial saxophone" is investigated experimentally. The coaxial saxophone is a variant of the cylindrical saxophone made up of two tubes mounted in parallel, which can be seen as a low-frequency analogy of a truncated conical resonator with a mouthpiece. Initially developed for the purposes of theoretical analysis, an experimental verification of the analogy between conical and cylindrical saxophones has never been reported. The present paper explains why the volume of the cylindrical saxophone mouthpiece limits the achievement of a good playability. To limit the mouthpiece volume, a coaxial alignment of pipes is proposed and a prototype of coaxial saxophone is built. An impedance model of coaxial resonator is proposed and validated by comparison with experimental data. Sound production is also studied through experiments with a blowing machine. The playability of the prototype is then assessed and proven for several values of the blowing pressure, of the embouchure parameter, and of the instrument's geometrical parameters.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn Allen; Nelson, Stacy Michelle; Briggs, Timothy

    Presented is a model verification and validation effort using low-velocity impact (LVI) experiments on carbon fiber reinforced polymer laminates. A flat cylindrical indenter impacts the laminate with enough energy to produce delamination, matrix cracks, and fiber breaks. Included in the experimental efforts are ultrasonic scans of the damage for qualitative validation of the models. However, the primary quantitative metrics of validation are the force time history measured through the instrumented indenter and the initial and final velocities. The simulations, which are run on Sandia's Sierra finite element codes, consist of all physics and material parameters of importance as determined by a sensitivity analysis conducted on the LVI simulation. A novel orthotropic damage and failure constitutive model that is capable of predicting progressive composite damage and failure is described in detail, and material properties are measured, estimated from micromechanics, or optimized through calibration. A thorough verification and calibration to the accompanying experiments are presented. Special emphasis is given to the four-point bend experiment. For all simulations of interest, the mesh and material behavior are verified through extensive convergence studies. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output. The result is a quantifiable confidence in material characterization and model physics when simulating this phenomenon in structures of interest.

  17. Verification of kinetic schemes of hydrogen ignition and combustion in air

    NASA Astrophysics Data System (ADS)

    Fedorov, A. V.; Fedorova, N. N.; Vankova, O. S.; Tropin, D. A.

    2018-03-01

    Three chemical kinetic models for hydrogen combustion in oxygen and three gas-dynamic models for reactive mixture flow behind the front of the initiating shock wave (SW) were analyzed. The calculated results were compared with experimental data on the dependences of the ignition delay on the temperature and the dilution of the mixture with argon or nitrogen. Based on detailed kinetic mechanisms of nonequilibrium chemical transformations, a mathematical technique for describing the ignition and combustion of hydrogen in air was developed using the ANSYS Fluent code. The problem of ignition of a hydrogen jet fed coaxially into supersonic flow was solved numerically. The calculations were carried out using the Favre-averaged Navier-Stokes equations for a multi-species gas taking into account chemical reactions combined with the k-ω SST turbulence model. The problem was solved in several steps. In the first step, verification of the calculated and experimental data for the three kinetic schemes was performed without considering the conicity of the flow. In the second step, parametric calculations were performed to determine the influence of the conicity of the flow on the mixing and ignition of hydrogen in air using a kinetic scheme consisting of 38 reactions. Three conical supersonic nozzles for a Mach number M = 2 with different expansion angles β = 4°, 4.5°, and 5° were considered.
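The temperature dependence of ignition delay that the verification step compares against is commonly summarized by an Arrhenius-type correlation, tau = A·exp(Ea/(R·T)). A minimal sketch, with synthetic data generated from an assumed pre-factor and activation energy (not the paper's kinetics) and recovered by a line fit of ln(tau) versus 1/T:

```python
# Fit an Arrhenius-type ignition-delay correlation tau = A * exp(Ea/(R*T))
# to synthetic delay data via least squares on (1/T, ln tau).
import math

R = 8.314                        # J/(mol K)
A_TRUE, EA_TRUE = 1e-9, 1.3e5    # hypothetical pre-factor [s], activation energy

temps = [900.0, 1000.0, 1100.0, 1200.0, 1300.0]              # K
delays = [A_TRUE * math.exp(EA_TRUE / (R * T)) for T in temps]

# Ordinary least squares: slope of ln(tau) vs 1/T equals Ea/R
xs = [1.0 / T for T in temps]
ys = [math.log(t) for t in delays]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
ea_fit = slope * R               # recovered activation energy
print(ea_fit)                    # close to EA_TRUE
```

Comparing such fitted parameters between a detailed mechanism and shock-tube measurements is one concrete way the "calculated results were compared with experimental data" step can be quantified.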

  18. Space Shuttle Tail Service Mast Concept Verification

    NASA Technical Reports Server (NTRS)

    Uda, R. T.

    1976-01-01

    Design studies and analyses were performed to describe the loads and dynamics of the space shuttle tail service masts (TSMs). Of particular interest are the motion and interaction of the umbilical carrier plate, lanyard system, vacuum jacketed hoses, latches, links, and masthead. A development test rig was designed and fabricated to obtain experimental data. The test program is designed to (1) verify the theoretical dynamics calculations, (2) prove the soundness of design concepts, and (3) elucidate problem areas (if any) in the design of mechanisms and structural components. Design, fabrication, and initiation of TSM development testing at Kennedy Space Center are described.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  20. Improved fluid dynamics similarity, analysis and verification. Part 5: Analytical and experimental studies of thermal stratification phenomena

    NASA Technical Reports Server (NTRS)

    Winter, E. R. F.; Schoenhals, R. J.; Haug, R. I.; Libby, T. L.; Nelson, R. N.; Stevenson, W. H.

    1968-01-01

    The stratification behavior of a contained fluid subjected to transient free convection heat transfer was studied. A rectangular vessel was employed with heat transfer from two opposite walls of the vessel to the fluid. The wall temperature was increased suddenly to initiate the process and was then maintained constant throughout the transient stratification period. Thermocouples were positioned on a post at the center of the vessel. They were adjusted so that temperatures could be measured at the fluid surface and at specific depths beneath the surface. The predicted values of the surface temperature and the stratified layer thickness were found to agree reasonably well with the experimental measurements. The experiments also provided information on the transient centerline temperature distribution and the transient flow distribution.

  1. NASA low-speed centrifugal compressor for fundamental research

    NASA Technical Reports Server (NTRS)

    Wood, J. R.; Adam, P. W.; Buggele, A. E.

    1983-01-01

    A new centrifugal compressor facility being built by the NASA Lewis Research Center is described; its purpose is to obtain 'benchmark' experimental data for internal flow code verification and modeling. The facility will be heavily instrumented with standard pressure and temperature probes and have provisions for flow visualization and laser Doppler velocimetry. The facility will accommodate rotational speeds to 2400 rpm and will be rated at pressures to 1.25 atm. The initial compressor stage for testing is geometrically and dynamically representative of modern high-performance stages with the exception of Mach number levels. Design exit tip speed for the initial stage is 500 ft/sec with a pressure ratio of 1.17. The rotor exit backsweep is 55 deg from radial. The facility is expected to be operational in the first half of 1985.

  2. Glauber-based evaluations of the odd moments of the initial eccentricity relative to the even order participant planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacey, R.; Wei, R.; Ajitanand, N.

    2011-09-01

    Monte Carlo simulations are used to compute the centrality dependence of the odd moments of the initial eccentricity ε_{n+1}, relative to the even-order (n) participant planes Ψ_n in Au + Au collisions. The results obtained for two models of the eccentricity - the Glauber and the factorized Kharzeev-Levin-Nardi (fKLN) models - indicate magnitudes which are essentially zero. They suggest that a possible correlation between the orientations of the odd and even participant planes (Ψ_{n+1} and Ψ_n, respectively) does not have a significant influence on the calculated eccentricities. An experimental verification test for correlations between the orientations of the odd and even participant planes is also proposed.
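
    As a sketch of the moment definition underlying such calculations, the participant-plane angle Ψ_n and eccentricity ε_n for harmonic n can be computed from r^n-weighted moments of the participant positions about their centre of mass; the toy elliptic Gaussian "source" below stands in for a real Glauber sampling:

```python
import math
import random

def participant_plane_and_ecc(points, n):
    """Psi_n and eps_n for harmonic n from r^n-weighted moments
    about the centre of mass of the sampled participant positions."""
    cx = sum(x for x, y in points) / len(points)
    cy = sum(y for x, y in points) / len(points)
    sc = ss = rn = 0.0
    for x, y in points:
        dx, dy = x - cx, y - cy
        r = math.hypot(dx, dy)
        phi = math.atan2(dy, dx)
        sc += r**n * math.cos(n * phi)
        ss += r**n * math.sin(n * phi)
        rn += r**n
    psi = (math.atan2(ss, sc) + math.pi) / n   # participant plane
    eps = math.hypot(sc, ss) / rn              # eccentricity magnitude
    return psi, eps

random.seed(0)
# Toy source elongated along x (sigma_x = 2, sigma_y = 1),
# not an actual Glauber nucleon sampling:
pts = [(random.gauss(0, 2.0), random.gauss(0, 1.0)) for _ in range(500)]
psi2, eps2 = participant_plane_and_ecc(pts, 2)
print(psi2, eps2)
```

    For such an elongated Gaussian, ε_2 comes out near (σx² − σy²)/(σx² + σy²), with Ψ_2 along the minor axis.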

  3. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel-processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  4. Offline signature verification using convolutional Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

    This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike the existing methods which consider feature extraction and metric learning as two independent stages, we adopt a deep-learning based framework which combines the two stages and can be trained end-to-end. The experimental results on two offline public databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
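
    A minimal sketch of the metric-learning objective such a Siamese network is typically trained with is the contrastive loss; the embeddings and margin below are illustrative toy values, not the paper's trained model:

```python
import math

def contrastive_loss(d, same, margin=1.0):
    """Contrastive loss on the embedding distance d of a signature pair:
    genuine pairs (same=True) are pulled together, forged pairs are
    pushed beyond the margin.  A sketch of the objective only."""
    if same:
        return 0.5 * d * d
    return 0.5 * max(0.0, margin - d) ** 2

def euclidean(u, v):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Toy embeddings a shared ("Siamese") network might produce:
genuine_a, genuine_b = [0.9, 0.1], [0.8, 0.2]
forged = [0.1, 0.9]
loss_genuine = contrastive_loss(euclidean(genuine_a, genuine_b), same=True)
loss_forged = contrastive_loss(euclidean(genuine_a, forged), same=False)
print(loss_genuine, loss_forged)
```

    At verification time the same distance, thresholded, decides genuine versus forged; training both branches with shared weights is what makes the network "Siamese".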

  5. On initial steps of chemical prebiotic evolution: Triggering autocatalytic reaction of oligomerization

    NASA Astrophysics Data System (ADS)

    Bartsev, S. I.; Mezhevikin, V. V.

    2008-12-01

    Searching for extraterrestrial life attracts more and more attention. However, this search can hardly be effective without a sufficiently universal concept of the origin of life, which incidentally tackles the problem of the origin of life on the Earth. A concept of the initial stages of the origin of life is stated in this paper. The concept eliminates key difficulties in the problem of the origin of life and allows its experimental verification. According to the concept, the predecessor of living beings has to be sufficiently simple to have a non-zero probability of self-assembling during a short (on a geological or cosmic scale) time. In addition, the predecessor has to be capable of autocatalysis and further complication (evolution). A possible scenario of the initial stage of the origin of life, which can be realized both on other planets and inside an experimental facility, is considered. Within the scope of this scenario, a theoretical model of a multivariate oligomeric autocatalyst is presented, along with results of computer simulation of two versions of oligomeric autocatalytic reactions. It is shown that the contribution of the monomer activation reaction is essential, and in some cases autocatalysis in the polymerizing reaction can be achieved without catalyzing the monomer binding reaction itself.

  6. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.

  7. On theoretical and experimental modeling of metabolism forming in prebiotic systems

    NASA Astrophysics Data System (ADS)

    Bartsev, S. I.; Mezhevikin, V. V.

    Searching for extraterrestrial life has recently attracted more and more attention. However, the search can hardly be effective without a sufficiently universal concept of the origin of life, which incidentally tackles the problem of the origin of life on the Earth. A concept of the initial stages of the origin of life, including the origin of prebiotic metabolism, is stated in this paper. The suggested concept eliminates key difficulties in the problem of the origin of life and allows its experimental verification. According to the concept, the predecessor of living beings has to be sufficiently simple to have a non-zero probability of self-assembling during a short (on a geological or cosmic scale) time. In addition, the predecessor has to be capable of autocatalysis and further complication (evolution). A possible scenario of the initial stage of the origin of life, which can be realized both on other planets and inside an experimental facility, is considered. Within the scope of this scenario, a theoretical model of a multivariate oligomeric autocatalyst coupled with a phase-separated particle is presented, along with results of computer simulation of a possible initial stage of chemical evolution. The estimations conducted show that the origin of an autocatalytic oligomeric phase-separated system is possible at reasonable values of the kinetic parameters of the chemical reactions involved in a small-scale flow reactor. The accepted statements, which eliminate the key problems of the origin of life, imply an important consequence: organisms that emerged outside the Earth or inside a reactor would have to be based on a biochemistry different from the terrestrial one.

  8. Idaho out-of-service verification field operational test

    DOT National Transportation Integrated Search

    2000-02-01

    The Out-of-Service Verification Field Operational Test Project was initiated in 1994. The purpose of the project was to test the feasibility of using sensors and a computerized tracking system to augment the ability of inspectors to monitor and contr...

  9. 76 FR 81991 - National Spectrum Sharing Research Experimentation, Validation, Verification, Demonstration and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... non-federal community, including the academic, commercial, and public safety sectors, to implement a..., Verification, Demonstration and Trials: Technical Workshop II on Coordinating Federal Government/Private Sector Spectrum Innovation Testing Needs AGENCY: The National Coordination Office (NCO) for Networking and...

  10. Biometrics based authentication scheme for session initiation protocol.

    PubMed

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to stolen-smart-card attacks and password-guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, a password and a smart card, and utilize the pi-calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared with other related protocols.

  11. VERIFICATION OF THE PERFORMANCE OF DECONTAMINATION TECHNOLOGIES IN EPA'S SAFE BUILDINGS PROGRAM

    EPA Science Inventory

    The paper describes initial progress in identifying and testing technologies applicable for decontaminating workplaces and other buildings that may be subject to chemical or biological attack. The EPA is using the process established in its Environmental Technology Verification (...

  12. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF FOUR IMMUNOASSAY TEST KITS

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  13. PERFORMANCE VERIFICATION OF ADVANCED MONITORING SYSTEMS FOR AIR, WATER, AND SOIL

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to assess environmental quality. The ETV p...

  14. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF FOUR DIOXIN EMISSION MONITORING SYSTEMS

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  15. Bayesian truthing as experimental verification of C4ISR sensors

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew

    2015-05-01

    In this paper, a general methodology for the experimental verification/validation of the performance of C4ISR and other sensors is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C3ISR, QC, ATR (Automatic Target Recognition), terrorism-related events, and many other areas. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
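
    The core Bayesian inference step for a binary sensor can be sketched as an application of Bayes' rule to the detection probabilities; the prior, sensitivity, and specificity values below are illustrative only, not figures from the paper:

```python
def positive_predictive_value(prior, sensitivity, specificity):
    """Bayes' rule for a binary sensor: the probability that a
    positive detection is a true event, given the event prior and
    the sensor's sensitivity and specificity."""
    true_pos = prior * sensitivity
    false_pos = (1.0 - prior) * (1.0 - specificity)
    return true_pos / (true_pos + false_pos)

# Rare-event example: even a 95%/95% sensor yields a low PPV
# when the event prior is only 1%.
ppv = positive_predictive_value(prior=0.01, sensitivity=0.95, specificity=0.95)
print(round(ppv, 3))
```

    This base-rate effect is exactly why the decision process, not the sensing medium, dominates the performance metrics.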

  16. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  17. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we give the definition of the concept space for a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  18. A novel pathway for the biosynthesis of heme in Archaea: genome-based bioinformatic predictions and experimental evidence.

    PubMed

    Storbeck, Sonja; Rolfes, Sarah; Raux-Deery, Evelyne; Warren, Martin J; Jahn, Dieter; Layer, Gunhild

    2010-12-13

    Heme is an essential prosthetic group for many proteins involved in fundamental biological processes in all three domains of life. In Eukaryota and Bacteria heme is formed via a conserved and well-studied biosynthetic pathway. Surprisingly, in Archaea heme biosynthesis proceeds via an alternative route which is poorly understood. In order to formulate a working hypothesis for this novel pathway, we searched 59 completely sequenced archaeal genomes for the presence of gene clusters consisting of established heme biosynthetic genes and colocalized conserved candidate genes. Within the majority of archaeal genomes it was possible to identify such heme biosynthesis gene clusters. From this analysis we have been able to identify several novel heme biosynthesis genes that are restricted to archaea. Intriguingly, several of the encoded proteins display similarity to enzymes involved in heme d(1) biosynthesis. To initiate an experimental verification of our proposals two Methanosarcina barkeri proteins predicted to catalyze the initial steps of archaeal heme biosynthesis were recombinantly produced, purified, and their predicted enzymatic functions verified.

  19. A Novel Pathway for the Biosynthesis of Heme in Archaea: Genome-Based Bioinformatic Predictions and Experimental Evidence

    PubMed Central

    Storbeck, Sonja; Rolfes, Sarah; Raux-Deery, Evelyne; Warren, Martin J.; Jahn, Dieter; Layer, Gunhild

    2010-01-01

    Heme is an essential prosthetic group for many proteins involved in fundamental biological processes in all three domains of life. In Eukaryota and Bacteria heme is formed via a conserved and well-studied biosynthetic pathway. Surprisingly, in Archaea heme biosynthesis proceeds via an alternative route which is poorly understood. In order to formulate a working hypothesis for this novel pathway, we searched 59 completely sequenced archaeal genomes for the presence of gene clusters consisting of established heme biosynthetic genes and colocalized conserved candidate genes. Within the majority of archaeal genomes it was possible to identify such heme biosynthesis gene clusters. From this analysis we have been able to identify several novel heme biosynthesis genes that are restricted to archaea. Intriguingly, several of the encoded proteins display similarity to enzymes involved in heme d 1 biosynthesis. To initiate an experimental verification of our proposals two Methanosarcina barkeri proteins predicted to catalyze the initial steps of archaeal heme biosynthesis were recombinantly produced, purified, and their predicted enzymatic functions verified. PMID:21197080

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: CONSTELLATION TECHNOLOGY CORPORATION - CT-1128 PORTABLE GAS CHROMATOGRAPH-MASS SPECTROMETER

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  1. Experimental Verification Of The Osculating Cones Method For Two Waverider Forebodies At Mach 4 and 6

    NASA Technical Reports Server (NTRS)

    Miller, Rolf W.; Argrow, Brian M.; Center, Kenneth B.; Brauckmann, Gregory J.; Rhode, Matthew N.

    1998-01-01

    The NASA Langley Research Center Unitary Plan Wind Tunnel and the 20-Inch Mach 6 Tunnel were used to test two osculating cones waverider models. The Mach-4 and Mach-6 shapes were generated using the interactive design tool WIPAR. WIPAR performance predictions are compared to the experimental results. Vapor screen results for the Mach-4 model at the on-design Mach number provide visual verification that the shock is attached along the entire leading edge, within the limits of observation. WIPAR predictions of pressure distributions and aerodynamic coefficients show general agreement with the corresponding experimental values.

  2. Homolytic Cleavage of a B-B Bond by the Cooperative Catalysis of Two Lewis Bases: Computational Design and Experimental Verification.

    PubMed

    Wang, Guoqiang; Zhang, Honglin; Zhao, Jiyang; Li, Wei; Cao, Jia; Zhu, Chengjian; Li, Shuhua

    2016-05-10

    Density functional theory (DFT) investigations revealed that 4-cyanopyridine is capable of homolytically cleaving the B-B σ bond of diborane via cooperative coordination to the two boron atoms of the diborane to generate pyridine boryl radicals. Our experimental verification provides supportive evidence for this new B-B activation mode. With this novel activation strategy, we have experimentally realized the catalytic reduction of azo compounds to hydrazine derivatives, deoxygenation of sulfoxides to sulfides, and reduction of quinones with B2(pin)2 under mild conditions. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Towards Run-time Assurance of Advanced Propulsion Algorithms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Schierman, John D.; Schlapkohl, Thomas; Chicatelli, Amy

    2014-01-01

    This paper covers the motivation and rationale for investigating the application of run-time assurance methods as a potential means of providing safety assurance for advanced propulsion control systems. Certification is becoming increasingly infeasible for such systems using current verification practices. Run-time assurance systems hold the promise of certifying these advanced systems by continuously monitoring the state of the feedback system during operation and reverting to a simpler, certified system if anomalous behavior is detected. The discussion will also cover initial efforts underway to apply a run-time assurance framework to NASA's model-based engine control approach. Preliminary experimental results are presented and discussed.
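
    A run-time assurance scheme of the kind described can be sketched as a simple switching monitor; the controllers and the safety envelope below are hypothetical stand-ins, not NASA's model-based engine control:

```python
def run_time_assurance(state, advanced_cmd, baseline_cmd, safe):
    """Simplex-style run-time assurance: pass through the advanced
    controller's command while the monitored state stays inside the
    safe envelope, otherwise revert to the certified baseline."""
    return advanced_cmd if safe(state) else baseline_cmd

# Hypothetical envelope: engine speed must stay below a limit.
within_limit = lambda rpm: rpm < 10000

print(run_time_assurance(9500, "advanced", "baseline", within_limit))
print(run_time_assurance(10500, "advanced", "baseline", within_limit))
```

    The certification argument then rests only on the monitor and the baseline controller, not on the advanced algorithm itself.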

  4. Heterogeneity of activated carbons in adsorption of phenols from aqueous solutions—Comparison of experimental isotherm data and simulation predictions

    NASA Astrophysics Data System (ADS)

    Podkościelny, P.; Nieszporek, K.

    2007-01-01

    The surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting the integral equation. The experimental data for phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) were taken from the literature. The AED functions for phenol adsorption, generated by application of the regularization method, were then verified using the Grand Canonical Monte Carlo (GCMC) simulation technique as the verification tool. The definitive stage of verification was a comparison of the experimental adsorption data with those obtained from the GCMC simulations, for which the necessary input was provided by the parameters of the AED functions calculated by the regularization method.

  5. Cleanup Verification Package for the 100-F-20, Pacific Northwest Laboratory Parallel Pits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. J. Appel

    2007-01-22

    This cleanup verification package documents completion of remedial action for the 100-F-20, Pacific Northwest Laboratory Parallel Pits waste site. This waste site consisted of two earthen trenches thought to have received both radioactive and nonradioactive material related to the 100-F Experimental Animal Farm.

  6. Verification of RELAP5-3D code in natural circulation loop as function of the initial water inventory

    NASA Astrophysics Data System (ADS)

    Bertani, C.; Falcone, N.; Bersano, A.; Caramello, M.; Matsushita, T.; De Salve, M.; Panella, B.

    2017-11-01

    The high safety and reliability of advanced nuclear reactors, Generation IV and Small Modular Reactors (SMRs), have a crucial role in the acceptance of these new plant designs. Among all possible safety systems, particular effort is dedicated to the study of passive systems, because they rely on simple physical principles like natural circulation without the need for an external energy source to operate. Taking inspiration from the second Decay Heat Removal system (DHR2) of ALFRED, the European Generation IV demonstrator of the lead-cooled fast reactor, an experimental facility (PROPHET) has been built at the Energy Department of Politecnico di Torino to study single- and two-phase natural circulation. The facility behavior is simulated using the thermal-hydraulic system code RELAP5-3D, which is widely used in nuclear applications. In this paper, the effect of the initial water inventory on natural circulation is analyzed, and the experimental time behaviors of the temperatures and pressures are examined. The experimental matrix ranges between 69% and 93% of water inventory; the influence of the two opposing effects, the increase of the volume available for expansion and the pressure rise due to phase change, is discussed. Simulations of the experimental tests are carried out using a 1D model at constant heat power and fixed liquid and air mass, and the code predictions are compared with the experimental results. Two typical responses are observed: subcooled or two-phase saturated circulation. The steady-state pressure is a strong function of the liquid and air mass inventory. The numerical results show that, at low initial liquid mass inventory, the natural circulation is not stable but pulsated.

  7. Design of a front-end integrated circuit for 3D acoustic imaging using 2D CMUT arrays.

    PubMed

    Ciçek, Ihsan; Bozkurt, Ayhan; Karaman, Mustafa

    2005-12-01

    Integration of front-end electronics with 2D capacitive micromachined ultrasonic transducer (CMUT) arrays has been a challenging issue due to the small element size and large channel count. We present the design and verification of a front-end drive-readout integrated circuit for 3D ultrasonic imaging using 2D CMUT arrays. The circuit cell dedicated to a single CMUT array element consists of a high-voltage pulser and a low-noise readout amplifier. To analyze the circuit cell together with the CMUT element, we developed an electrical CMUT model with parameters derived through finite element analysis, and performed both pre- and post-layout verification. An experimental chip consisting of a 4 × 4 array of the designed circuit cells, each cell occupying a 200 × 200 μm² area, was formed for the initial test studies and scheduled for fabrication in 0.8 μm, 50 V CMOS technology. The designed circuit is suitable for integration with CMUT arrays through flip-chip bonding and the CMUT-on-CMOS process.

  8. The species translation challenge—A systems biology perspective on human and rat bronchial epithelial cells

    PubMed Central

    Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2014-01-01

    The biological response to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, and cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to ‘translate’ the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated that was comprised of phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of the multi-layer omics dataset accessible in public repositories for further intra- and inter-species translation studies. PMID:25977767

  9. The species translation challenge-a systems biology perspective on human and rat bronchial epithelial cells.

    PubMed

    Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2014-01-01

    The biological response to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, and cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to 'translate' the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated that was comprised of phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of the multi-layer omics dataset accessible in public repositories for further intra- and inter-species translation studies.

  10. Experimental verification of the Neuber relation at room and elevated temperatures. M.S. Thesis; [to predict stress-strain behavior in notched specimens of Hastelloy X]

    NASA Technical Reports Server (NTRS)

    Lucas, L. J.

    1982-01-01

    The accuracy of the Neuber equation at room temperature and 1,200 F was experimentally determined under cyclic load conditions with hold times. All strains were measured with an interferometric technique at both the local and remote regions of notched specimens. At room temperature, strains were obtained for the initial response at one load level and for cyclically stable conditions at four load levels. Stresses in the notched members were simulated by subjecting smooth specimens to the same strains as were recorded on the notched specimens. The local stress-strain response was then predicted with excellent accuracy by subjecting a smooth specimen to limits established by the Neuber equation. Data at 1,200 F were obtained with the same experimental techniques but only for the cyclically stable conditions. The Neuber prediction at this temperature gave relatively accurate results in terms of predicting the stress and strain points.
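
    For reference, the Neuber relation invoked here couples the local notch stress σ and strain ε to the nominal stress S through the elastic stress concentration factor K_t (E is Young's modulus):

```latex
K_\sigma \, K_\varepsilon = K_t^2
\qquad\Longleftrightarrow\qquad
\sigma \, \varepsilon = \frac{(K_t\,S)^2}{E}
```

    with K_σ = σ/S and K_ε = εE/S. Holding the product σε on this hyperbola is what allows a smooth specimen, driven to the corresponding strain limits, to reproduce the notch-root response.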

  11. Experimental Verification of a Progressive Damage Model for IM7/5260 Laminates Subjected to Tension-Tension Fatigue

    NASA Technical Reports Server (NTRS)

    Coats, Timothy W.; Harris, Charles E.

    1995-01-01

    The durability and damage tolerance of laminated composites are critical design considerations for airframe composite structures. Therefore, the ability to model damage initiation and growth and predict the life of laminated composites is necessary to achieve structurally efficient and economical designs. The purpose of this research is to experimentally verify the application of a continuum damage model to predict progressive damage development in a toughened material system. Damage due to monotonic and tension-tension fatigue was documented for IM7/5260 graphite/bismaleimide laminates. Crack density and delamination surface area were used to calculate matrix cracking and delamination internal state variables to predict stiffness loss in unnotched laminates. A damage dependent finite element code predicted the stiffness loss for notched laminates with good agreement to experimental data. It was concluded that the continuum damage model can adequately predict matrix damage progression in notched and unnotched laminates as a function of loading history and laminate stacking sequence.

  12. Requirement Specifications for a Design and Verification Unit.

    ERIC Educational Resources Information Center

    Pelton, Warren G.; And Others

    A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…

  13. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF SEVEN ANALYZERS THAT MEASURE AMBIENT AMMONIA EMISSIONS AT AN ANIMAL FEEDING OPERATION

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to monitor environmental quality. The ETV ...

  14. Ignition-and-Growth Modeling of NASA Standard Detonator and a Linear Shaped Charge

    NASA Technical Reports Server (NTRS)

    Oguz, Sirri

    2010-01-01

    The main objective of this study is to quantitatively investigate the ignition and shock sensitivity of the NASA Standard Detonator (NSD) and the shock wave propagation of a linear shaped charge (LSC) after being shocked by the NSD flyer plate. This combined explosive train was modeled as a coupled Arbitrary Lagrangian-Eulerian (ALE) model with the LS-DYNA hydrocode. An ignition-and-growth (I&G) reactive model based on unreacted and reacted Jones-Wilkins-Lee (JWL) equations of state was used to simulate the shock initiation. Various NSD-to-LSC stand-off distances were analyzed to calculate the shock initiation (or failure to initiate) and detonation wave propagation along the shaped charge. Simulation results were verified by experimental data, which included VISAR tests for NSD flyer plate velocity measurement and an aluminum target severance test for LSC performance verification. Parameters used for the analysis were obtained from published data or by using the CHEETAH thermochemical code.
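
    The JWL equation of state underlying such I&G models has the standard form p(V, E) = A*(1 - w/(R1*V))*exp(-R1*V) + B*(1 - w/(R2*V))*exp(-R2*V) + w*E/V, with V the relative volume and E the internal energy per unit initial volume. A minimal evaluator is sketched below; the coefficients used in the demo are illustrative placeholders, not the calibrated NSD or LSC parameters from the paper.

```python
# Sketch: evaluating the standard JWL equation of state.
# Coefficients below are placeholders (roughly SI, Pa), not calibrated values.

import math

def jwl_pressure(V, E, A, B, R1, R2, w):
    """JWL pressure as a function of relative volume V and energy density E."""
    return (A * (1.0 - w / (R1 * V)) * math.exp(-R1 * V)
            + B * (1.0 - w / (R2 * V)) * math.exp(-R2 * V)
            + w * E / V)

A, B, R1, R2, w = 6.0e11, 1.0e10, 4.5, 1.2, 0.3  # illustrative only
E0 = 1.0e10                                      # energy per unit initial volume
p1 = jwl_pressure(1.0, E0, A, B, R1, R2, w)      # near full density
p5 = jwl_pressure(5.0, E0, A, B, R1, R2, w)      # expanded products
```

    At large expansion the exponential terms vanish and the pressure approaches the ideal-gas-like tail w*E/V, which is a quick sanity check on any implementation.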

  15. Investigation of the operating conditions to morphology evolution of β-L-glutamic acid during seeded cooling crystallization

    NASA Astrophysics Data System (ADS)

    Zhang, Fangkun; Liu, Tao; Huo, Yan; Guan, Runduo; Wang, Xue Z.

    2017-07-01

    In this paper the effects of operating conditions, including cooling rate, initial supersaturation, and seeding temperature, on the morphology evolution of β-L-glutamic acid (β-LGA) during seeded cooling crystallization were investigated. Based on in-situ image acquisition of the crystal morphology evolution during the crystallization process, it was found that the crystal products tend to be plate-like or short rod-like under a slow cooling rate, low initial supersaturation, and low seeding temperature. Conversely, a faster cooling rate, higher initial supersaturation, and higher seeding temperature tend to produce long rod-like or needle-like crystals; in addition, the length and width of the crystal products increase, together with a wider crystal size distribution (CSD). The aspect ratio of the crystals, defined as the crystal length over width measured from in-situ or sample images, was taken as a shape index to analyze the crystal morphologies. Based on comparative analysis of the experimental results, guidelines on these operating conditions were given for obtaining the desired crystal shapes, along with strategies for obtaining a narrower CSD for better product quality. Experimental verifications were performed to illustrate the proposed guidelines for seeded cooling crystallization of LGA solutions.
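
    The shape index defined above (crystal length over width) lends itself to a simple habit classification. A minimal sketch follows; the threshold values are assumptions chosen for illustration, not from the paper.

```python
# Sketch: aspect ratio (length/width) as a crystal shape index, with
# illustrative habit thresholds. Crystal dimensions are invented.

def aspect_ratio(length_um, width_um):
    return length_um / width_um

def habit(ar, plate_max=2.0, rod_max=5.0):
    """Classify a crystal habit from its aspect ratio (thresholds assumed)."""
    if ar <= plate_max:
        return "plate-like"
    if ar <= rod_max:
        return "rod-like"
    return "needle-like"

# (length, width) in micrometres, invented examples
crystals = [(20.0, 15.0), (60.0, 18.0), (120.0, 12.0)]
habits = [habit(aspect_ratio(l, w)) for l, w in crystals]
```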

  16. Experimental Verification of the Use of Metal Filled Via Hole Fences for Crosstalk Control of Microstrip Lines in LTCC Packages

    NASA Technical Reports Server (NTRS)

    Ponchak, George E.; Chun, Donghoon; Katehi, Linda P. B.; Yook, Jong-Gwan

    1999-01-01

    Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior 3D-FEM electromagnetic simulations have shown that metal filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3D-FEM simulations is demonstrated for commercially fabricated LTCC packages.

  17. Statistical analysis of NWP rainfall data from Poland

    NASA Astrophysics Data System (ADS)

    Starosta, Katarzyna; Linkowska, Joanna

    2010-05-01

    A goal of this work is to summarize the latest results of precipitation verification in Poland. At IMGW, COSMO_PL version 4.0 has been running. The model configuration is: 14 km horizontal grid spacing, initial times at 00 UTC and 12 UTC, and a forecast range of 72 h. The model fields were verified against Polish SYNOP stations, using a new verification tool. For accumulated precipitation, the indices FBI, POD, FAR, and ETS are calculated from the contingency table. In this paper, a comparison of monthly and seasonal verification of 6 h, 12 h, and 24 h accumulated precipitation in 2009 is presented. From February 2010, the model with 7 km grid spacing will be running at IMGW. The results of precipitation verification for the two different model resolutions will be shown.
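
    The categorical scores named above (FBI, POD, FAR, ETS) follow from the standard 2x2 contingency table of forecast versus observed precipitation events. A minimal sketch with invented counts:

```python
# Sketch: categorical verification scores from a 2x2 contingency table:
# a = hits, b = false alarms, c = misses, d = correct negatives.
# Standard definitions; the counts below are made up for illustration.

def scores(a, b, c, d):
    n = a + b + c + d
    fbi = (a + b) / (a + c)                    # frequency bias index
    pod = a / (a + c)                          # probability of detection
    far = b / (a + b)                          # false alarm ratio
    a_rand = (a + b) * (a + c) / n             # hits expected by chance
    ets = (a - a_rand) / (a + b + c - a_rand)  # equitable threat score
    return fbi, pod, far, ets

fbi, pod, far, ets = scores(a=50, b=20, c=30, d=900)
```

    A perfect forecast gives FBI = 1, POD = 1, FAR = 0, and ETS = 1; ETS discounts hits that would occur by chance.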

  18. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

    Remote sensing satellite camera imaging simulation technology is broadly used to evaluate satellite imaging quality and to test data application systems, but the simulation precision is hard to examine. In this paper, we propose an experimental simulation verification method based on comparing test parameter variations. For a simulation model based on ray tracing, the experiment verifies the model precision by changing the types of devices that correspond to the parameters of the model. The experimental results show that the similarity between the ray-tracing imaging model and the experimental image is 91.4%, so the model can simulate the remote sensing satellite imaging system very well.

  19. Design and Performance Evaluation of an Electro-Hydraulic Camless Engine Valve Actuator for Future Vehicle Applications

    PubMed Central

    Nam, Kanghyun; Cho, Kwanghyun; Park, Sang-Shin; Choi, Seibum B.

    2017-01-01

    This paper details the new design and dynamic simulation of an electro-hydraulic camless engine valve actuator (EH-CEVA) and its experimental verification with lift position sensors. In general, camless engine technologies have been known for improving fuel efficiency, enhancing power output, and reducing emissions of internal combustion engines. Electro-hydraulic valve actuators are used to eliminate the camshaft of existing internal combustion engines and to control valve timing and valve duration independently. This paper presents a novel electro-hydraulic actuator design, with dynamic simulations and analysis based on the design specifications required to satisfy the operating performance. The EH-CEVA was initially designed and modeled by means of AMESim, a hydraulic simulation package well suited to the dynamic simulation and analysis of hydraulic systems. Fundamental functions and performances of the EH-CEVA have been validated through comparisons with experimental results obtained in a prototype test bench. PMID:29258270

  20. Design and Performance Evaluation of an Electro-Hydraulic Camless Engine Valve Actuator for Future Vehicle Applications.

    PubMed

    Nam, Kanghyun; Cho, Kwanghyun; Park, Sang-Shin; Choi, Seibum B

    2017-12-18

    This paper details the new design and dynamic simulation of an electro-hydraulic camless engine valve actuator (EH-CEVA) and its experimental verification with lift position sensors. In general, camless engine technologies have been known for improving fuel efficiency, enhancing power output, and reducing emissions of internal combustion engines. Electro-hydraulic valve actuators are used to eliminate the camshaft of existing internal combustion engines and to control valve timing and valve duration independently. This paper presents a novel electro-hydraulic actuator design, with dynamic simulations and analysis based on the design specifications required to satisfy the operating performance. The EH-CEVA was initially designed and modeled by means of AMESim, a hydraulic simulation package well suited to the dynamic simulation and analysis of hydraulic systems. Fundamental functions and performances of the EH-CEVA have been validated through comparisons with experimental results obtained in a prototype test bench.

  1. Theory, simulation and experiments for precise deflection control of radiotherapy electron beams.

    PubMed

    Figueroa, R; Leiva, J; Moncada, R; Rojas, L; Santibáñez, M; Valente, M; Velásquez, J; Young, H; Zelada, G; Yáñez, R; Guillen, Y

    2018-03-08

    Conventional radiotherapy is mainly delivered by linear accelerators. Although linear accelerators provide dual (electron/photon) radiation beam modalities, both are intrinsically produced by a megavoltage electron current. Modern radiotherapy treatment techniques are based on suitable devices inserted into or attached to conventional linear accelerators. Thus, precise control of the delivered beam becomes a key issue. This work presents an integral description of electron beam deflection control as required for a novel radiotherapy technique based on convergent photon beam production. Theoretical and Monte Carlo approaches were initially used for designing and optimizing the device's components. Then, dedicated instrumentation was developed for experimental verification of the electron beam deflection due to the designed magnets. Both Monte Carlo simulations and experimental results support the reliability of the electrodynamic models used to predict megavoltage electron beam control. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF TWO CONTINUOUS EMISSION MONITORS (CEMS) FOR MEASURING AMMONIA EMISSIONS: SIEMENS AG LDS 3000, AND OPSIS AB LD500

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV ...

  3. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF TWO HYDROGEN SULFIDE ANALYZERS: HORIBA INSTRUMENTS, INC., APSA-360 AND TELEDYNE-API MODEL 101E

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  4. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification testing on flight hardware.

  5. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    PubMed

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression was confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and we identified three new incednine-binding proteins. This study revealed that our proposed protocol of predicting target proteins by combining in silico screening and experimental verification is useful, and it provides new insight into strategies for identifying the target proteins of small molecules.

  6. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part II—Experimental Implementation

    PubMed Central

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

    Coordinate measuring machines (CMM) are main instruments of measurement in laboratories and in industrial quality control. A compensation error model has been formulated (Part I). It integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on the direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features are accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table and roundness of a precision glass hemisphere are presented under a setup of repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as well as the practical use and the model capability to contribute in the improvement of current standard CMM measuring capabilities. PMID:27754441

  7. The experimental verification of a streamline curvature numerical analysis method applied to the flow through an axial flow fan

    NASA Technical Reports Server (NTRS)

    Pierzga, M. J.

    1981-01-01

    The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.

  8. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We assume that the uncertainty in position is the slit width. For the…
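
    The estimate behind this verification can be reproduced in a few lines: take the slit width as the position uncertainty and the transverse momentum spread out to the first diffraction minimum as the momentum uncertainty, giving a product of order h. The wavelength and slit width below are illustrative choices, not the authors' values.

```python
# Sketch: order-of-magnitude check of the uncertainty principle from
# single-slit diffraction. Numbers (633 nm He-Ne laser, 0.1 mm slit)
# are illustrative assumptions.

import math

h = 6.62607015e-34           # Planck constant, J*s
hbar = h / (2 * math.pi)

lam = 633e-9                 # wavelength, m (assumed)
a = 0.1e-3                   # slit width, m (assumed)

dx = a                       # position uncertainty ~ slit width
p = h / lam                  # photon momentum
sin_theta = lam / a          # first diffraction minimum
dp = p * sin_theta           # transverse momentum spread ~ h/a

product = dx * dp            # comes out to h, well above hbar/2
```

    The product equals h regardless of the chosen wavelength and slit width, which is the point of the classroom demonstration: the single-slit geometry saturates an h-scale bound, comfortably above the hbar/2 limit.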

  9. LETTER TO THE EDITOR: Similarity laws for collisionless interaction of superstrong electromagnetic fields with a plasma

    NASA Astrophysics Data System (ADS)

    Ryutov, D. D.; Remington, B. A.

    2006-03-01

    Several similarity laws for the collisionless interaction of ultra-intense electromagnetic fields with a plasma of an arbitrary initial shape are presented. Both ultra-relativistic and non-relativistic cases are covered. The ion motion is included. A relation to the S-similarity described in Pukhov et al (2004 Plasma Phys. Control. Fusion 46 B179) and Gordienko and Pukhov (2005 Phys. Plasmas 12 043109) is established. A brief discussion of possible ways of experimental verification of scaling laws is presented. The results can be of interest for experiments and numerical simulations in the areas of ion acceleration, harmonic generation, magnetic field generation and Coulomb explosion of clusters.

  10. Failure detection of liquid cooled electronics in sealed packages. [in airborne information management system

    NASA Technical Reports Server (NTRS)

    Hoadley, A. W.; Porter, A. J.

    1991-01-01

    The theory and experimental verification of a method of detecting fluid-mass loss, expansion-chamber pressure loss, or excessive vapor build-up in NASA's Airborne Information Management System (AIMS) are presented. The primary purpose of this leak-detection method is to detect the fluid-mass loss before the volume of vapor on the liquid side causes a temperature-critical part to be out of the liquid. The method detects the initial leak after the first 2.5 pct of the liquid mass has been lost, and it can be used for detecting subsequent situations including the leaking of air into the liquid chamber and the subsequent vapor build-up.

  11. Simulation of Laboratory Tests of Steel Arch Support

    NASA Astrophysics Data System (ADS)

    Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof

    2017-03-01

    The total load-bearing capacity of yielding steel arch roadway supports is among their most important characteristics. These values can be obtained in two ways: experimental measurements in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming; however, a properly tuned computer model is very valuable, provided it is verified by experiment. In the cooperating workplaces of GIG Katowice, VSB-Technical University of Ostrava and the Institute of Geonics ASCR, this verification was successful. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used for further calculations to obtain the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the steel material properties, torque values, and friction coefficients, can then be determined relatively quickly by changing the properties of the modelled steel arch supports.

  12. Experimental measurement-device-independent verification of quantum steering

    NASA Astrophysics Data System (ADS)

    Kocsis, Sacha; Hall, Michael J. W.; Bennet, Adam J.; Saunders, Dylan J.; Pryde, Geoff J.

    2015-01-01

    Bell non-locality between distant quantum systems—that is, joint correlations which violate a Bell inequality—can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  13. Experimental measurement-device-independent verification of quantum steering.

    PubMed

    Kocsis, Sacha; Hall, Michael J W; Bennet, Adam J; Saunders, Dylan J; Pryde, Geoff J

    2015-01-07

    Bell non-locality between distant quantum systems--that is, joint correlations which violate a Bell inequality--can be verified without trusting the measurement devices used, nor those performing the measurements. This leads to unconditionally secure protocols for quantum information tasks such as cryptographic key distribution. However, complete verification of Bell non-locality requires high detection efficiencies, and is not robust to typical transmission losses over long distances. In contrast, quantum or Einstein-Podolsky-Rosen steering, a weaker form of quantum correlation, can be verified for arbitrarily low detection efficiencies and high losses. The cost is that current steering-verification protocols require complete trust in one of the measurement devices and its operator, allowing only one-sided secure key distribution. Here we present measurement-device-independent steering protocols that remove this need for trust, even when Bell non-locality is not present. We experimentally demonstrate this principle for singlet states and states that do not violate a Bell inequality.

  14. Ethylene Decomposition Initiated by Ultraviolet Radiation from Low Pressure Mercury Lamps: Kinetics Model Prediction and Experimental Verification.

    NASA Astrophysics Data System (ADS)

    Jozwiak, Zbigniew Boguslaw

    1995-01-01

    Ethylene is an important auto-catalytic plant growth hormone. Removal of ethylene from the atmosphere surrounding ethylene-sensitive horticultural products may be very beneficial, allowing an extended period of storage and preventing or delaying the induction of disorders. Various ethylene removal techniques have been studied and put into practice. One technique is based on using low pressure mercury ultraviolet lamps as a source of photochemical energy to initiate chemical reactions that destroy ethylene. Although previous research showed that ethylene disappeared in experiments with mercury ultraviolet lamps, the reactions were not described and the actual cause of ethylene disappearance remained unknown. Proposed causes for this disappearance were the direct action of ultraviolet rays on ethylene, reaction of ethylene with ozone (which is formed when air or gas containing molecular oxygen is exposed to radiation emitted by this type of lamp), or reactions with atomic oxygen leading to the formation of ozone. The objective of the present study was to determine the set of physical and chemical actions leading to the disappearance of ethylene from an artificial storage atmosphere under conditions of ultraviolet irradiation. The goal was achieved by developing a static chemical model based on the physical properties of a commercially available ultraviolet lamp, the photochemistry of gases, and the kinetics of chemical reactions. The model was used to perform computer simulations predicting time-dependent concentrations of the chemical species included in the model. Development of the model was accompanied by the design of a reaction chamber used for experimental verification. The model provided a good prediction of the general behavior of the species involved in the chemistry under consideration; however, it predicted a lower rate of ethylene disappearance than was measured. Some reasons for the model-experiment disagreement are radiation intensity averaging, the experimental technique, mass transfer in the chamber, and the incompleteness of the set of chemical reactions included in the model. The work concludes with guidelines for the development of a more complex mathematical model that includes mass transfer inside the reaction chamber and uses a three-dimensional approach to distribute radiation from the low pressure mercury ultraviolet tube.

  15. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (SNOW17) and rainfall-runoff model (SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system, from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as for a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions, to observe their influence on the forecast. The study basin is the North Fork American River Basin (NFARB), located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes a comparison of the performance of the different optimized parameter sets and the DA framework, as well as an assessment of the impact of the initial conditions used for streamflow forecasts in the NFARB.
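
    Two of the deterministic verification statistics named above, Nash-Sutcliffe efficiency and root mean square error, can be sketched directly. The flow values below are invented for illustration, not hindcast data from the study.

```python
# Sketch: deterministic hindcast verification with RMSE and
# Nash-Sutcliffe efficiency (NSE). NSE = 1 means a perfect simulation;
# NSE = 0 means no better than the observed mean.

import math

def rmse(sim, obs):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

def nse(sim, obs):
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

# Invented daily flows (e.g. m^3/s): observed vs simulated
obs = [10.0, 12.0, 18.0, 25.0, 16.0]
sim = [11.0, 11.5, 17.0, 24.0, 18.0]
```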

  16. Determining potential 30/20 GHZ domestic satellite system concepts and establishment of a suitable experimental configuration

    NASA Technical Reports Server (NTRS)

    Stevens, G. H.; Anzic, G.

    1979-01-01

    NASA is conducting a series of millimeter wave satellite communication systems and market studies to: (1) determine potential domestic 30/20 GHz satellite concepts and market potential, and (2) establish the requirements for a suitable technology verification payload which, although intended to be modest in capacity, would sufficiently demonstrate key technologies and experimentally address key operational issues. Preliminary results and critical issues of the current contracted effort are described. Also included is a description of a NASA-developed multibeam satellite payload configuration which may be representative of concepts utilized in a technology flight verification program.

  17. The experimental verification of wall movement influence coefficients for an adaptive walled test section

    NASA Technical Reports Server (NTRS)

    Neal, G.

    1988-01-01

    Flexible walled wind tunnels have for some time been used to reduce wall interference effects at the model. A necessary part of the 3-D wall adjustment strategy being developed for the Transonic Self-Streamlining Wind Tunnel (TSWT) of Southampton University is the use of influence coefficients. The influence of a wall bump on the centerline flow in TSWT has been calculated theoretically using a streamline curvature program. This report details the experimental verification of these influence coefficients and concludes that it is valid to use the theoretically determined values in 3-D model testing.

  18. Analysis and discussion on the experimental data of electrolyte analyzer

    NASA Astrophysics Data System (ADS)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    In the subsequent verification of electrolyte analyzers, we found that the instruments achieve good repeatability and stability in repeated measurements over a short period of time, in line with the requirements of the verification regulation for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results from different manufacturers differ greatly. To find and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.

  19. In vivo thermoluminescence dosimetry dose verification of transperineal 192Ir high-dose-rate brachytherapy using CT-based planning for the treatment of prostate cancer.

    PubMed

    Anagnostopoulos, G; Baltas, D; Geretschlaeger, A; Martin, T; Papagiannis, P; Tselis, N; Zamboglou, N

    2003-11-15

    To evaluate the potential of in vivo thermoluminescence dosimetry to estimate the accuracy of dose delivery in conformal high-dose-rate brachytherapy of prostate cancer. A total of 50 LiF TLD-100 cylindrical rods were calibrated in the dose range of interest and used as a batch for all fractions. Fourteen dosimeters for every treatment fraction were loaded in a plastic 4F catheter that was fixed either in one of the 6F needles implanted for treatment purposes or in an extra needle implanted after consulting with the patient. The 6F needles were placed either close to the urethra or in the vicinity of the median posterior wall of the prostate. Initial results are presented for 18 treatment fractions in 5 patients and compared to corresponding data calculated with the commercial treatment planning system used to plan the treatments, based on CT images acquired postimplantation. The maximum observed mean difference between planned and delivered dose within a single treatment fraction was 8.57% +/- 2.61% (root mean square [RMS] errors from 4.03% to 9.73%). Corresponding values obtained after averaging results over all fractions of a patient were 6.88% +/- 4.93% (RMS errors from 4.82% to 7.32%). Experimental results at the same patient point were found to agree between fractions within experimental uncertainties. These results indicate that the proposed method is feasible for dose verification purposes and suggest that dose delivery in transperineal high-dose-rate brachytherapy after CT-based planning can achieve acceptable accuracy.
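
    The comparison reported above, per-fraction percent differences between planned and delivered dose summarized by a mean and an RMS error, can be sketched as follows. The dose readings are invented, not the study's data.

```python
# Sketch: summarizing TLD-measured vs planned dose per fraction with
# a mean percent difference and an RMS error. Readings are invented.

import math

def pct_diff(measured, planned):
    """Percent difference of each measured dose relative to the plan."""
    return [100.0 * (m - p) / p for m, p in zip(measured, planned)]

def mean(xs):
    return sum(xs) / len(xs)

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

planned = [4.0, 4.2, 3.8, 4.1]    # Gy, hypothetical planned doses
measured = [4.2, 4.3, 3.7, 4.3]   # Gy, hypothetical TLD readings

diffs = pct_diff(measured, planned)
```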

  20. Cross-checking of Large Evaluated and Experimental Nuclear Reaction Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeydina, O.; Koning, A.J.; Soppera, N.

    2014-06-15

    Automated methods are presented for the verification of large experimental and evaluated nuclear reaction databases (e.g. EXFOR, JEFF, TENDL). These methods allow an assessment of the overall consistency of the data and detect aberrant values in both evaluated and experimental databases.
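    As a toy illustration of aberrant-value detection (far simpler than the actual EXFOR/JEFF/TENDL consistency checks, whose details are not given here), a z-score screen over a set of measurements might look like:

    ```python
    import statistics

    def flag_aberrant(values, threshold=2.0):
        """Return indices of entries whose z-score exceeds `threshold`.

        A deliberately simple stand-in for the automated checks described
        above; real database cross-checking also compares data across
        evaluations, reactions, and units.
        """
        mean = statistics.fmean(values)
        stdev = statistics.stdev(values)
        if stdev == 0:
            return []
        return [i for i, v in enumerate(values)
                if abs(v - mean) / stdev > threshold]

    # Cross sections (barns, hypothetical) with one mistyped entry.
    data = [1.02, 0.98, 1.05, 1.01, 0.97, 10.3, 1.00]
    print(flag_aberrant(data))  # -> [5]
    ```

    With small samples, a single outlier inflates the standard deviation, so a low threshold (here 2.0) is needed; robust statistics (median absolute deviation) would be a natural refinement.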

  1. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of correctly verifying an authorized individual decreases when the fingerprint is significantly shifted. In this paper, we review the proposed system and propose a preprocessing step to improve the false rejection rate. In the proposed method, the positional difference between two fingerprint images is estimated using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment, whose results show that the false rejection rate is improved.
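    The shift-estimation step can be sketched with a generic FFT-based cross-correlation. This is a stand-in for the paper's optimized core-detection template (which is not detailed here), but it illustrates the idea of estimating the translation between two captures:

    ```python
    import numpy as np

    def estimate_shift(image, reference):
        """Estimate the (row, col) translation of `image` relative to
        `reference` via FFT-based cross-correlation."""
        corr = np.fft.ifft2(
            np.fft.fft2(image) * np.conj(np.fft.fft2(reference))
        ).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Map peak positions past the midpoint to negative shifts (wraparound).
        return tuple(int(p) if p <= s // 2 else int(p - s)
                     for p, s in zip(peak, corr.shape))

    rng = np.random.default_rng(0)
    img = rng.random((64, 64))                    # stand-in "fingerprint" image
    moved = np.roll(img, shift=(5, -3), axis=(0, 1))
    print(estimate_shift(moved, img))             # -> (5, -3)
    ```

    If the estimated shift exceeds a permissible level, the system would prompt the user to re-enter the fingerprint, as described above.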

  2. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (IIT-A-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...

  3. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  4. Summary of Part 75 Administrative Processes: Table 1

    EPA Pesticide Factsheets

    Learn how to submit your initial certification, recertification, monitoring plans, routine quality assurance tests, diagnostic tests and DAHS verifications, low mass emissions units and other notification requirements. Table 1, initial certification.

  5. Approaching behavior of a pair of spherical bubbles in quiescent liquids

    NASA Astrophysics Data System (ADS)

    Sanada, Toshiyuki; Kusuno, Hiroaki

    2015-11-01

    Some unique motions related to bubble-bubble interaction, such as an equilibrium separation distance and a wake-induced lift force, have been proposed by theoretical analysis or numerical simulation. These motions differ from those of solid spheres, as described by the DKT (Drafting, Kissing and Tumbling) model. However, experimental verification has been lacking. In this study, we experimentally investigated the motion of a pair of bubbles initially positioned in an in-line configuration in ultrapure water or an aqueous surfactant solution. The bubble motions were observed with two high-speed video cameras. The bubble Reynolds number ranged from 50 to 300, and the bubbles held a spherical shape over this range. In ultrapure water, the trailing bubble initially deviated from the vertical line through the leading bubble owing to the leading bubble's wake. A slight difference in bubble radius then changed the relative motion: when the trailing bubble was slightly larger than the leading bubble, it approached the leading bubble due to its greater buoyancy. The bubbles attracted each other and collided only when rising in an approximately side-by-side configuration. In addition, we will also discuss the motion of bubbles rising in an aqueous surfactant solution.

  6. A tracking and verification system implemented in a clinical environment for partial HIPAA compliance

    NASA Astrophysics Data System (ADS)

    Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.

    2006-03-01

    The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system using wireless and facial biometric technology to monitor and automatically identify patients and staff was developed in order to streamline patient workflow, protect against erroneous examinations, and create a security zone to prevent and audit unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology based on initial clinical workflow studies. An outpatient center was chosen as a first step for the development and implementation of this system.

  7. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    NASA Astrophysics Data System (ADS)

    Dartevelle, S.

    2006-12-01

    Large-scale volcanic eruptions are inherently hazardous events that cannot be described by detailed and accurate in situ measurements; as a result, explosive volcanic phenomenology is inadequately constrained in terms of initial and inflow conditions, and little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. However, code verification and validation remains a necessary step, particularly as volcanologists increasingly use numerical data for the mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is relatively simple to achieve formally, whereas, in the context of 'real world' explosive volcanism, Validation is nearly impossible. Hence, instead of validating the code against the whole large-scale, unconstrained volcanic phenomenology, we suggest focusing on the key physics that control these volcanic clouds, namely momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key phenomena separately.
Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase CFD FORTRAN codes recently redeveloped to meet the strict quality assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Department of Energy. GMFIX solves the Navier-Stokes and energy partial differential equations for each phase, with appropriate turbulence models and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the first Mach disk, and the higher Phi, the further upstream. We show that GMFIX captures these two essential features. In addition, GMFIX reproduces all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes the code well suited for further investigation of equivalent volcanological phenomena. Another of the most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We therefore also validated GMFIX by comparing velocity profiles and turbulence quantities against well-constrained analog experiments: the computed velocity profiles agree with the analog ones, as do the profiles of turbulence production quantities. Overall, the verification and validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase, supersonic flows and jets.
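    The Verification step described above is typically quantified with a discrete error norm against the analytical solution and an observed order of accuracy from grid refinement. A minimal sketch (the error values are hypothetical, not GMFIX output):

    ```python
    import math

    def l2_error(numerical, exact):
        """Discrete L2 norm of the error between a numerical solution and
        an exact analytical solution -- the basic metric of a code
        Verification step."""
        n = len(numerical)
        return math.sqrt(sum((u - e) ** 2 for u, e in zip(numerical, exact)) / n)

    def observed_order(err_coarse, err_fine, refinement=2.0):
        """Observed order of accuracy from errors on two grids related by
        the given refinement ratio."""
        return math.log(err_coarse / err_fine) / math.log(refinement)

    # Hypothetical errors from a grid-refinement study of a 2nd-order scheme:
    # halving the grid spacing should cut the error by roughly a factor of 4.
    e_h, e_h2 = 4.0e-3, 1.0e-3
    print(f"observed order: {observed_order(e_h, e_h2):.2f}")  # -> 2.00
    ```

    Agreement between the observed and formal order of accuracy is the standard evidence that the implementation matches the conceptual model.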

  8. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  9. Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. D. Habel

    2008-05-20

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.

  10. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol service couple of a given layer. Planning synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the service protocol couple: operational type and service supplier viewpoint; knowledge property oriented type and service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of the epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system for implementing the method proposed. It is structured in three levels of representation of the knowledge relating to the domain, the reasoning characterizing synthesis and verification activities and the planning of the steps of a specification trajectory.

  11. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  12. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  13. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  14. Principles of Sterilization of Mars Descent Vehicle Elements

    NASA Astrophysics Data System (ADS)

    Trofimov, Vladislav; Deshevaya, Elena; Khamidullina, N.; Kalashnikov, Viktor

    Because of the severe COSPAR requirements on permissible microbiological contamination of Mars descent spacecraft (S/C) elements, as well as the complexity of their chemical composition and structure, antimicrobial treatment (sterilization) of such elements during integration requires a wide set of methods: chemical, ultraviolet, and radiation. The report analyzes all aspects of the applicable treatment methods for cleaning element surfaces and interiors of microbiota. The analysis showed that the most predictable and controllable method is radiation processing (for elements whose properties are not altered by effective treatment). Experience with ionizing radiation sterilization of medical and other products shows that, depending on the initial microbial contamination of lander elements, the required absorbed dose lies within the range 12-35 kGy. The effect of non-uniform radiation absorption in structurally complex elements on the choice of irradiation methodology was analyzed, and an algorithm was suggested for selecting effective radiation treatment conditions and controlling sterilization efficiency. An important phase in establishing effective treatment conditions for each structural element is the experimental verification of the actual microbiological contamination during S/C integration, the maximum reduction of contamination using other cleaning procedures (mechanical, chemical, ultraviolet), and the determination of the radiation resistance of spore-forming microorganisms typical of spacecraft manufacturing and assembly shops.
Proceeding from three parameters (the non-uniformity of radiation absorption in a given element, its initial microbial contamination, and the radiation resistance of the microorganisms), sterilization conditions for the packaged object are chosen that prevent secondary contamination and ensure a given treatment reliability by simple control of the absorbed dose at critical points, without final experimental microbiological verification. All process phases (from the choice of treatment conditions to ensuring procedure safety) are strictly regulated by Russian legislation in accordance with international standards.

  15. Californium interrogation prompt neutron (CIPN) instrument for non-destructive assay of spent nuclear fuel – design concept and experimental demonstration

    DOE PAGES

    Henzlova, Daniela; Menlove, Howard Olsen; Rael, Carlos D.; ...

    2015-10-09

    Our paper presents results of the first experimental demonstration of the Californium Interrogation Prompt Neutron (CIPN) instrument developed within a multi-year effort launched by the Next Generation Safeguards Initiative Spent Fuel Project of the United States Department of Energy. The goals of this project focused on developing viable non-destructive assay techniques with capabilities to improve an independent verification of spent fuel assembly characteristics. For this purpose, the CIPN instrument combines active and passive neutron interrogation, along with passive gamma-ray measurements, to provide three independent observables. We describe the initial feasibility demonstration of the CIPN instrument, which involved measurements of four pressurized-water-reactor spent fuel assemblies with different levels of burnup and two initial enrichments. The measurements were performed at the Post-Irradiation Examination Facility at the Korea Atomic Energy Institute in the Republic of Korea. The key aim of the demonstration was to evaluate CIPN instrument performance under realistic deployment conditions, with the focus on a detailed assessment of systematic uncertainties that are best evaluated experimentally. The measurements revealed good positioning reproducibility, as well as a high degree of insensitivity of the CIPN instrument's response to irregularities in a radial burnup profile. Systematic uncertainty of individual CIPN instrument signals due to assembly rotation was found to be <4.5%, even for assemblies with fairly extreme gradients in the radial burnup profile. Lastly, these features suggest that the CIPN instrument is capable of providing a good representation of assembly average characteristics, independent of assembly orientation in the instrument.

  16. Californium interrogation prompt neutron (CIPN) instrument for non-destructive assay of spent nuclear fuel – design concept and experimental demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henzlova, Daniela; Menlove, Howard Olsen; Rael, Carlos D.

    Our paper presents results of the first experimental demonstration of the Californium Interrogation Prompt Neutron (CIPN) instrument developed within a multi-year effort launched by the Next Generation Safeguards Initiative Spent Fuel Project of the United States Department of Energy. The goals of this project focused on developing viable non-destructive assay techniques with capabilities to improve an independent verification of spent fuel assembly characteristics. For this purpose, the CIPN instrument combines active and passive neutron interrogation, along with passive gamma-ray measurements, to provide three independent observables. We describe the initial feasibility demonstration of the CIPN instrument, which involved measurements of four pressurized-water-reactor spent fuel assemblies with different levels of burnup and two initial enrichments. The measurements were performed at the Post-Irradiation Examination Facility at the Korea Atomic Energy Institute in the Republic of Korea. The key aim of the demonstration was to evaluate CIPN instrument performance under realistic deployment conditions, with the focus on a detailed assessment of systematic uncertainties that are best evaluated experimentally. The measurements revealed good positioning reproducibility, as well as a high degree of insensitivity of the CIPN instrument's response to irregularities in a radial burnup profile. Systematic uncertainty of individual CIPN instrument signals due to assembly rotation was found to be <4.5%, even for assemblies with fairly extreme gradients in the radial burnup profile. Lastly, these features suggest that the CIPN instrument is capable of providing a good representation of assembly average characteristics, independent of assembly orientation in the instrument.

  17. 19 CFR Annex V to Part 351 - Comparison of Prior and New Regulations

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... weighted-average dumping margin Subpart B—Antidumping Duty Procedures 353.11 351.201 Self-initiation 353.12... Verification 353.37 351.308 Determination on the basis of the facts available 353.38 (a)-(e) 351.309 Written... 355.35 Removed Ex parte meeting 355.36 351.307 Verification 355.37 351.308 Determinations on the basis...

  18. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  19. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF TWO NUTRIENT ANALYZERS AT AN INDUSTRIAL PLANT: SCHIMADZU SCIENTIFIC INSTRUMENTS, INC. TNPC-4110(C) AND A ZAPS TECHNOLOGIES, INC. MULTI-PARAMETER (MP-1)

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  20. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    NASA Astrophysics Data System (ADS)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of dynamics managed according to information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools for studying the thermodynamics of systems far from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback-controlled quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example illustrating the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at the microscale with current technology on a variety of experimental platforms.

  1. A critique of the hypothesis, and a defense of the question, as a framework for experimentation.

    PubMed

    Glass, David J

    2010-07-01

    Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being: inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goals of experimental science, which is verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework-the query/model approach-which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.

  2. Bistatic radar sea state monitoring system design

    NASA Technical Reports Server (NTRS)

    Ruck, G. T.; Krichbaum, C. K.; Everly, J. O.

    1975-01-01

    Remote measurement of the two-dimensional surface wave height spectrum of the ocean by the use of bistatic radar techniques was examined. Potential feasibility and experimental verification by field experiment are suggested. The required experimental hardware is defined along with the designing, assembling, and testing of several required experimental hardware components.

  3. The History and Impact of the CNO Cycles in Nuclear Astrophysics

    NASA Astrophysics Data System (ADS)

    Wiescher, Michael

    2018-03-01

    The carbon cycle, or Bethe-Weizsäcker cycle, plays an important role in astrophysics as one of the most important energy sources for quiescent and explosive hydrogen burning in stars. This paper presents the intellectual and historical background of the idea of the correlation between stellar energy production and the synthesis of the chemical elements in stars on the example of this cycle. In particular, it addresses the contributions of Carl Friedrich von Weizsäcker and Hans Bethe, who provided the first predictions of the carbon cycle. Further, the experimental verification of the predicted process as it developed over the following decades is discussed, as well as the extension of the initial carbon cycle to the carbon-nitrogen-oxygen (CNO) multi-cycles and the hot CNO cycles. This development emerged from the detailed experimental studies of the associated nuclear reactions over more than seven decades. Finally, the impact of the experimental and theoretical results on our present understanding of hydrogen burning in different stellar environments is presented, as well as the impact on our understanding of the chemical evolution of our universe.

  4. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    NASA Astrophysics Data System (ADS)

    Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.

    2017-05-01

    The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind turbine rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes have been validated against experimental data such as the MEXICO experiment, but verification against other codes has often been made only on a very broad scale. This study therefore attempts both a code-to-code verification, comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and a validation against experimental results from the MEXICO and NEW MEXICO experiments.

  5. Flow Friction or Spontaneous Ignition?

    NASA Technical Reports Server (NTRS)

    Stoltzfus, Joel M.; Gallus, Timothy D.; Sparks, Kyle

    2012-01-01

    "Flow friction," a proposed ignition mechanism in oxygen systems, has proved elusive in attempts at experimental verification. In this paper, the literature regarding flow friction is reviewed and the experimental verification attempts are briefly discussed. Another ignition mechanism, a form of spontaneous combustion, is proposed as an explanation for at least some of the fire events that have been attributed to flow friction in the literature. In addition, the results of a failure analysis performed at NASA Johnson Space Center White Sands Test Facility are presented, and the observations indicate that spontaneous combustion was the most likely cause of the fire in this 2000 psig (14 MPa) oxygen-enriched system.

  6. Non-Equilbrium Fermi Gases

    DTIC Science & Technology

    2016-02-02

    ...understanding is the experimental verification of a new model of light-induced loss spectra, employing continuum-dressed basis states, which agrees in shape and magnitude with all of our ...

  7. Cleanup Verification Package for the 118-F-6 Burial Ground

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    H. M. Sulloway

    2008-10-02

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.

  8. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to the understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examining the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  9. Experimental Verification of a Jarzynski-Related Information-Theoretic Equality by a Single Trapped Ion.

    PubMed

    Xiong, T P; Yan, L L; Zhou, F; Rehan, K; Liang, D F; Chen, L; Yang, W L; Ma, Z H; Feng, M; Vedral, V

    2018-01-05

    Most nonequilibrium processes in thermodynamics are quantified only by inequalities; however, the Jarzynski relation presents a remarkably simple and general equality relating nonequilibrium quantities with the equilibrium free energy, and this equality holds in both the classical and quantum regimes. We report a single-spin test and confirmation of the Jarzynski relation in the quantum regime using a single ultracold ^{40}Ca^{+} ion trapped in a harmonic potential, based on a general information-theoretic equality for a temporal evolution of the system sandwiched between two projective measurements. By considering both initially pure and mixed states, respectively, we verify, in an exact and fundamental fashion, the nonequilibrium quantum thermodynamics relevant to the mutual information and Jarzynski equality.
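    The Jarzynski equality relates an average over nonequilibrium work values W to the equilibrium free-energy change: ⟨e^{-βW}⟩ = e^{-βΔF}. A minimal numerical sketch (purely illustrative, unrelated to the trapped-ion apparatus): for a Gaussian work distribution, ΔF = ⟨W⟩ − βσ²/2, and a Monte Carlo average of e^{-βW} recovers e^{-βΔF}:

```python
import numpy as np

# Monte Carlo check of the Jarzynski equality <exp(-beta W)> = exp(-beta dF)
# for a Gaussian work distribution (illustrative parameters, k_B T = 1).
rng = np.random.default_rng(0)
beta = 1.0                       # inverse temperature
mu, sigma = 2.0, 1.0             # mean and spread of the work values
dF = mu - beta * sigma**2 / 2    # free-energy change implied by Gaussian W

W = rng.normal(mu, sigma, 1_000_000)   # sampled nonequilibrium work values
lhs = np.exp(-beta * W).mean()         # <exp(-beta W)>
rhs = np.exp(-beta * dF)               # exp(-beta dF)
print(lhs, rhs)  # the two agree to within sampling error
```

Note that the exponential average is dominated by rare low-work trajectories, which is why experimental tests such as the single-ion measurement above are demanding.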

  10. Coherent optimal control of photosynthetic molecules

    NASA Astrophysics Data System (ADS)

    Caruso, F.; Montangero, S.; Calarco, T.; Huelga, S. F.; Plenio, M. B.

    2012-04-01

    We demonstrate theoretically that open-loop quantum optimal control techniques can provide efficient tools for the verification of various quantum coherent transport mechanisms in natural and artificial light-harvesting complexes under realistic experimental conditions. To assess the feasibility of possible biocontrol experiments, we introduce the main settings and derive optimally shaped and robust laser pulses that allow for the faithful preparation of specified initial states (such as localized excitation or coherent superposition, i.e., propagating and nonpropagating states) of the photosystem and probe efficiently the subsequent dynamics. With these tools, different transport pathways can be discriminated, which should facilitate the elucidation of genuine quantum dynamical features of photosystems and therefore enhance our understanding of the role that coherent processes may play in actual biological complexes.

  11. Generic short-time propagation of sharp-boundaries wave packets

    NASA Astrophysics Data System (ADS)

    Granot, E.; Marchewka, A.

    2005-11-01

    A general solution to the "shutter" problem is presented. The propagation of an arbitrary initially bounded wave function is investigated, and the general solution for any such function is formulated. It is shown that the exact solution can be written as an expression that depends only on the values of the function (and its derivatives) at the boundaries. In particular, it is shown that at short times (t ≪ 2mx²/ℏ, where x is the distance to the boundaries) the wave function propagation depends only on the wave function's values (or its derivatives) at the boundaries of the region. Finally, we generalize these findings to a non-singular wave function (i.e., for wave packets with finite-width boundaries) and suggest an experimental verification.

  12. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talley, Darren G.

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
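    As a hedged illustration of the kind of model such a code solves (this is not the Razorback code itself; all parameters below are invented, not ACRR values), a one-delayed-group point reactor kinetics system can be integrated with a simple explicit scheme:

```python
# One-delayed-group point reactor kinetics: an illustrative sketch of the
# neutronics piece that codes like Razorback couple to thermal hydraulics.
beta_d = 0.007    # delayed neutron fraction (illustrative)
Lam = 1e-4        # neutron generation time, s
lam = 0.08        # delayed precursor decay constant, 1/s
rho = 0.003       # step reactivity insertion (< beta_d, so delayed critical)

n = 1.0                        # normalized power
C = beta_d * n / (lam * Lam)   # equilibrium precursor concentration
dt = 1e-5
for _ in range(int(1.0 / dt)):  # explicit Euler over 1 s
    dn = ((rho - beta_d) / Lam) * n + lam * C
    dC = (beta_d / Lam) * n - lam * C
    n, C = n + dt * dn, C + dt * dC
print(n)  # power shows a prompt jump, then growth on the stable period
```

A production code would replace the explicit Euler step with a stiff ODE solver and couple the power to fuel temperature feedback.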

  13. Verification and accreditation schemes for climate change activities: A review of requirements for verification of greenhouse gas reductions and accreditation of verifiers—Implications for long-term carbon sequestration

    NASA Astrophysics Data System (ADS)

    Roed-Larsen, Trygve; Flach, Todd

    The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely with these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of all efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.

  14. Direct and full-scale experimental verifications towards ground-satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei

    2013-05-01

    Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method for communication, based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verification is therefore crucial for application to a typical low Earth orbit satellite. To achieve direct and full-scale verification of our set-up, we have carried out three independent experiments with a decoy-state QKD system covering all the relevant conditions. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.

  15. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed by two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed by the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, that quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve Vlasov equation in the investigation of a number of plasma physics phenomena.
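    The solution verification step based on Richardson extrapolation can be sketched as follows (a generic illustration with manufactured data, not GBS output): given a scalar result computed on three grids with constant refinement ratio r, one estimates the observed order of accuracy p and an extrapolated "grid-free" value that quantifies the numerical error:

```python
import numpy as np

# Richardson extrapolation from three systematically refined grids.
def richardson(f_coarse, f_medium, f_fine, r):
    # Observed order of accuracy from the ratio of successive differences.
    p = np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)
    # Extrapolated (zero-grid-spacing) estimate of the exact value.
    f_extrap = f_fine + (f_fine - f_medium) / (r**p - 1)
    return p, f_extrap

# Manufactured result f(h) = 1.0 + 2.0*h**2 sampled at h, h/2, h/4:
h = 0.1
fs = [1.0 + 2.0 * hi**2 for hi in (h, h / 2, h / 4)]
p, f_ex = richardson(*fs, r=2)
print(p, f_ex)  # observed order ~2.0, extrapolated value ~1.0
```

The difference between the fine-grid result and the extrapolated value then serves as an estimate of the discretization error affecting the simulation.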

  16. Rudolph A. Marcus and His Theory of Electron Transfer Reactions

    Science.gov Websites

    early 1950s and soon discovered ... a strong experimental program at Brookhaven on electron transfer ... experimental work provided the first verification of several of the predictions of his theory. This, in turn, ... Marcus theory, namely, experimental evidence for the so-called "inverted region" where rates

  17. Experimental verification of radial magnetic levitation force on the cylindrical magnets in ferrofluid dampers

    NASA Astrophysics Data System (ADS)

    Yang, Wenming; Wang, Pengkai; Hao, Ruican; Ma, Buchuan

    2017-03-01

    Analytical and numerical methods for calculating the radial magnetic levitation force on cylindrical magnets in cylindrical vessels filled with ferrofluid were reviewed. An experimental apparatus to measure this force was designed and built, capable of measuring forces in the range 0-2.0 N with an accuracy of 0.001 N. After calibration, this apparatus was used to study the radial magnetic levitation force experimentally. The results showed that the numerical method overestimates this force, while the analytical ones underestimate it. The maximum deviation between the numerical and experimental results was 18.5%, while that between the experimental and analytical results reached 68.5%. The latter deviation narrowed as the magnets lengthened. With the aid of the experimental verification of the radial magnetic levitation force, the effect of the eccentric distance of the magnets on viscous energy dissipation in ferrofluid dampers could be assessed. It was shown that ignoring the eccentricity of the magnets during the estimation could overestimate the viscous dissipation in ferrofluid dampers.

  18. Behavioral biometrics for verification and recognition of malicious software agents

    NASA Astrophysics Data System (ADS)

    Yampolskiy, Roman V.; Govindaraju, Venu

    2008-04-01

    Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game-playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques, developed by us for the recognition of humans, to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate the feasibility of such methods for artificial agent verification and even for recognition purposes.

  19. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions, and data in the resonance region of both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
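    As a hedged sketch of the resonance-integral comparison mentioned above (using a synthetic 1/v cross section, not EXFOR data), the dilute resonance integral I = ∫ σ(E) dE/E over the conventional 0.5 eV to 100 keV range can be evaluated numerically from tabulated (E, σ) points:

```python
import numpy as np

# Resonance integral I = ∫ sigma(E) dE/E from tabulated cross-section data.
# The cross section here is a synthetic 1/v shape normalized so that
# sigma(0.0253 eV) = 10 b; it stands in for an EXFOR/evaluated data set.
E = np.logspace(np.log10(0.5), 5, 4000)   # energy grid, eV (0.5 eV - 100 keV)
sigma = 10.0 * np.sqrt(0.0253 / E)        # barns, 1/v behaviour
y = sigma / E                              # integrand of the resonance integral
I = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(E))   # trapezoid rule, barns
print(I)  # close to the analytic value 2 * 10 * sqrt(0.0253 / 0.5) ≈ 4.49 b
```

In a verification exercise, the same integral computed from EXFOR-compiled values and from each evaluated library would be compared entry by entry.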

  20. Towards a Quantitative Endogenous Network Theory of Cancer Genesis and Progression: beyond ``cancer as diseases of genome''

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2011-03-01

    There has been tremendous progress in cancer research. However, it appears that the currently dominant framework of regarding cancer as a disease of the genome leads to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework could be made quantitative, with experimentally testable predictions. In this talk, I will present a positive answer to this call. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, "energy" landscapes, metastability, etc. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationale behind this theory, its predictions, and its initial experimental verifications will be presented. Supported by the USA NIH and China NSF.

  1. Elastic suspension of a wind tunnel test section

    NASA Technical Reports Server (NTRS)

    Hacker, R.; Rock, S.; Debra, D. B.

    1982-01-01

    Experimental verification of the theory describing arbitrary motions of an airfoil is reported. The experimental apparatus is described. A mechanism was designed to provide two separate degrees of freedom without friction or backlash to mask the small but important aerodynamic effects of interest.

  2. Experimental verification of nanofluid shear-wave reconversion in ultrasonic fields.

    PubMed

    Forrester, Derek Michael; Huang, Jinrui; Pinfield, Valerie J; Luppé, Francine

    2016-03-14

    Here we present the verification of shear-mediated contributions to multiple scattering of ultrasound in suspensions. Acoustic spectroscopy was carried out with suspensions of silica of differing particle sizes and concentrations in water to find the attenuation at a broad range of frequencies. As the particle sizes approach the nanoscale, commonly used multiple scattering models fail to match experimental results. We develop a new model, taking into account shear mediated contributions, and find excellent agreement with the attenuation spectra obtained using two types of spectrometer. The results determine that shear-wave phenomena must be considered in ultrasound characterisation of nanofluids at even relatively low concentrations of scatterers that are smaller than one micrometre in diameter.

  3. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    NASA Astrophysics Data System (ADS)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.

  4. Development of a three-dimensional transient code for reactivity-initiated events of BWRs (boiling water reactors) - Models and code verifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uematsu, Hitoshi; Yamamoto, Toru; Izutsu, Sadayuki

    1990-06-01

    A reactivity-initiated event is a design-basis accident for the safety analysis of boiling water reactors. It is defined as a rapid transient of reactor power caused by a reactivity insertion of over $1.0 due to a postulated drop or abnormal withdrawal of the control rod from the core. Strong space-dependent feedback effects are associated with the local power increase due to control rod movement. A realistic treatment of the core status in a transient by a code with a detailed core model is recommended in evaluating this event. A three-dimensional transient code, ARIES, has been developed to meet this need. The code simulates the event with three-dimensional neutronics, coupled with multichannel thermal hydraulics, based on a nonequilibrium separated flow model. The experimental data obtained in reactivity accident tests performed with the SPERT III-E core are used to verify the entire code, including thermal-hydraulic models.

  5. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THREE RAPID PCR TECHNOLOGIES FOR IDAHO TECHNOLOGY R.A.I.D.® SYSTEM, APPLIED BIOSYSTEMS TAQMAN® E. COLI 0157:H7 DETECTION SYSTEM, AND INVITROGEN CORPORATION PATHALERTTM DETECTION KITS

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  6. Verification of ANSYS Fluent and OpenFOAM CFD platforms for prediction of impact flow

    NASA Astrophysics Data System (ADS)

    Tisovská, Petra; Peukert, Pavel; Kolář, Jan

    The main goal of the article is a verification of the heat transfer coefficient numerically predicted by two CFD platforms, ANSYS Fluent and OpenFOAM, for the problem of impinging flow from a 2D nozzle. Various mesh parameters and solver settings were tested under several boundary conditions and compared to known experimental results. The best solver setting, suitable for further optimization of more complex geometry, is evaluated.

  7. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques, with experimental verification performed by vibrating only spacecraft components and deducing the modes and frequencies of the complete vehicle from the results obtained in the component tests.

  8. Experimental Verification of the Theory of Oscillating Airfoils

    NASA Technical Reports Server (NTRS)

    Silverstein, Abe; Joyner, Upshur T

    1939-01-01

    Measurements have been made of the lift on an airfoil in pitching oscillation with a continuous-recording, instantaneous-force balance. The experimental values for the phase difference between the angle of attack and the lift are shown to be in close agreement with theory.

  9. On the assessment of biological life support system operation range

    NASA Astrophysics Data System (ADS)

    Bartsev, Sergey

    Biological life support systems (BLSS) can be used in long-term space missions only if a well-thought-out assessment of the allowable operating range is obtained. The range has to account for both the permissible working parameters of the BLSS and the critical level of perturbations of its stationary state. A direct approach to outlining the range by statistical treatment of experimental data on BLSS destruction seems not to be applicable for ethical, economic, and time-saving reasons. A mathematical model is the unique tool for generalizing experimental data and extrapolating the revealed regularities beyond empirical experience. The problem is that the quality of extrapolation depends on the adequacy of the corresponding model verification, but good verification requires a wide range of experimental data for fitting, which is not achievable for manned experimental BLSS. A possible way to improve the extrapolation quality of inevitably poorly verified models of manned BLSS is to extrapolate the general tendency obtained from unmanned LSS theoretical-experimental investigations. The possibilities and limitations of this approach are discussed.

  10. National Centers for Environmental Prediction

    Science.gov Websites

    / VISION | About EMC EMC > NOAH > IMPLEMENTATION SCHEDULE Home Operational Products Experimental Data Verification Model Configuration Implementation Schedule Collaborators Documentation FAQ Code

  11. National Centers for Environmental Prediction

    Science.gov Websites

    / VISION | About EMC EMC > GEFS > IMPLEMENTATION SCHEDULE Home Operational Products Experimental Data Verification Model Configuration Implementation Schedule Collaborators Documentation FAQ Code

  12. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove that these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  13. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  14. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
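    The state space exploration idea behind tools like LPF can be sketched as a breadth-first search over the joint states produced by a step function of the controller-plus-environment testbed, flagging any state that violates a safety predicate. The toy transition system below is hypothetical, not the Livingstone model:

```python
from collections import deque

# Breadth-first state space exploration of a simulated testbed (sketch).
def explore(initial, step, is_unsafe):
    seen, frontier, violations = {initial}, deque([initial]), []
    while frontier:
        s = frontier.popleft()
        if is_unsafe(s):           # record any safety-property violation
            violations.append(s)
        for nxt in step(s):        # enumerate successor states
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return violations

# Toy system: a counter that may increment (mod 6) or reset; state 4 is
# declared unsafe purely for illustration.
viol = explore(0, lambda s: {(s + 1) % 6, 0}, lambda s: s == 4)
print(viol)  # -> [4]
```

Real tools add heuristics and backtracking over the simulator rather than exhaustively enumerating an intractably large state space.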

  15. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  16. Manipulation strategies for massive space payloads

    NASA Technical Reports Server (NTRS)

    Book, Wayne J.

    1989-01-01

    Control for the bracing strategy is being examined. It was concluded earlier that trajectory planning must be improved to best achieve the bracing motion. Very interesting results were achieved which enable the inverse dynamics of flexible arms to be calculated for linearized motion more efficiently than previously published. The desired motion of the end point, beginning at t = 0 and ending at t = t sub f, is used to calculate the required torque at the joint. The solution is separated into a causal function that is zero for t less than 0 and an acausal function that is zero for t greater than t sub f. A number of alternative end point trajectories were explored in terms of the peak torque required, the amount of anticipatory action, and other issues. The single-link case is the immediate subject, and an experimental verification of that case is being performed. Modeling with experimental verification of closed-chain dynamics continues. The modeling effort has pointed out inaccuracies that result from the choice of numerical techniques used to incorporate the closed-chain constraints when modeling our experimental prototype RALF (Robotic Arm Large and Flexible). Results were compared to TREETOPS, a multibody code. The experimental verification work is suggesting new ways to make comparisons with systems having structural linearity but joint and geometric nonlinearity. The generation of inertial forces was studied with a small arm that will damp the large arm's vibration.

  17. Experimental investigation of practical unforgeable quantum money

    NASA Astrophysics Data System (ADS)

    Bozzio, Mathieu; Orieux, Adeline; Trigo Vidarte, Luis; Zaquine, Isabelle; Kerenidis, Iordanis; Diamanti, Eleni

    2018-01-01

    Wiesner's unforgeable quantum money scheme is widely celebrated as the first quantum information application. Based on the no-cloning property of quantum mechanics, this scheme allows for the creation of credit cards used in authenticated transactions offering security guarantees impossible to achieve by classical means. However, despite its central role in quantum cryptography, its experimental implementation has remained elusive because of the lack of quantum memories and of practical verification techniques. Here, we experimentally implement a quantum money protocol relying on classical verification that rigorously satisfies the security condition for unforgeability. Our system exploits polarization encoding of weak coherent states of light and operates under conditions that ensure compatibility with state-of-the-art quantum memories. We derive working regimes for our system using a security analysis taking into account all practical imperfections. Our results constitute a major step towards a real-world realization of this milestone protocol.

  18. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the designing and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of the temporal logic and model checking technique. After that it is decomposed into separate sequential automata that are working concurrently. Each of them is re-verified and if the validation is successful, the system can be finally implemented.

  19. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and model checking techniques. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified and, if the validation is successful, the system can finally be implemented.

  20. Can legality verification enhance local rights to forest resources? Piloting the policy learning protocol in the Peruvian forest context

    Treesearch

    B. Cashore; I. Visseren-Hamakers; P. Caro Torres; W. de Jong; A. Denvir; D. Humphreys; Kathleen McGinley; G. Auld; S. Lupberger; C. McDermott; S. Sax; D. Yin

    2016-01-01

    This report, “Can Legality Verification Enhance Local Rights to Forest Resources? Piloting the policy learning protocol in the Peruvian forest context,” reports on the testing of the application of the 11-step Policy Learning Protocol in Peru in 2015-16. The Protocol (Cashore et al. 2014) enables actors to draw from international policy initiatives in order to improve...

  1. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF FOUR TEST KITS FOR THE ANALYSIS OF ATRAZINE IN WATER: ABRAXIS LLC ATRAZINE ELISA KIT, BEACON ANALYTICAL SYSTEMS, INC. ATRAZINE TUBE KIT, SILVER LAKE RESEARCH CORP. WATERSAFE PESTICIDE TEST AND STRATEGIC DIAGNOSTICS, INC. RAPID ASSAY KIT

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV ...

  2. Field Verification Program (Aquatic Disposal). Bioenergetic Effects of Black Rock Harbor Dredged Material on the Polychaete Nephtys incisa: A Field Verification.

    DTIC Science & Technology

    1988-03-01

    observed in the laboratory, and to determine the degree of correlation between the bioaccumulation of contaminants and bioenergetic responses...toxicity of liquid, suspended particulate, and solid phases; (c) estimating the potential contaminant bioaccumulation; and (d) describing the initial... bioaccumulation of dredged material contaminants with biological responses from laboratory and field exposure to dredged material. However, this study

  3. Toward Improved Land Surface Initialization in Support of Regional WRF Forecasts at the Kenya Meteorological Service (KMS)

    NASA Technical Reports Server (NTRS)

    Case, Johnathan L.; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.

    2014-01-01

    Flooding and drought are two key forecasting challenges for the Kenya Meteorological Service (KMS). Atmospheric processes leading to excessive precipitation and/or prolonged drought can be quite sensitive to the state of the land surface, which interacts with the planetary boundary layer (PBL) of the atmosphere providing a source of heat and moisture. The development and evolution of precipitation systems are affected by heat and moisture fluxes from the land surface, particularly within weakly-sheared environments such as in the tropics and sub-tropics. These heat and moisture fluxes during the day can be strongly influenced by land cover, vegetation, and soil moisture content. Therefore, it is important to represent the land surface state as accurately as possible in land surface and numerical weather prediction (NWP) models. Enhanced regional modeling capabilities have the potential to improve forecast guidance in support of daily operations and high-impact weather over eastern Africa. KMS currently runs a configuration of the Weather Research and Forecasting (WRF) NWP model in real time to support its daily forecasting operations, making use of the NOAA/National Weather Service (NWS) Science and Training Resource Center's Environmental Modeling System (EMS) to manage and produce the KMS-WRF runs on a regional grid over eastern Africa. Two organizations at the NASA Marshall Space Flight Center in Huntsville, AL, SERVIR and the Short-term Prediction Research and Transition (SPoRT) Center, have established a working partnership with KMS for enhancing its regional modeling capabilities through new datasets and tools. To accomplish this goal, SPoRT and SERVIR are providing enhanced, experimental land surface initialization datasets and model verification capabilities to KMS as part of this collaboration.
    To produce a land-surface initialization more consistent with the resolution of the KMS-WRF runs, the NASA Land Information System (LIS) is run at a comparable resolution to provide real-time, daily soil initialization data in place of data interpolated from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model soil moisture and temperature fields. Additionally, real-time green vegetation fraction (GVF) data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (Suomi-NPP) satellite will be incorporated into the KMS-WRF runs, once it becomes publicly available from the National Environmental Satellite Data and Information Service (NESDIS). Finally, model verification capabilities will be transitioned to KMS using the Model Evaluation Tools (MET; Brown et al. 2009) package in conjunction with a dynamic scripting package developed by SPoRT (Zavodsky et al. 2014), to help quantify possible improvements in simulated temperature, moisture and precipitation resulting from the experimental land surface initialization. Furthermore, the transition of these MET tools will enable KMS to monitor model forecast accuracy in near real time. This paper presents preliminary efforts to improve land surface model initialization over eastern Africa in support of operations at KMS. The remainder of this extended abstract is organized as follows: The collaborating organizations involved in the project are described in Section 2; background information on LIS and the configuration for eastern Africa is presented in Section 3; the WRF configuration used in this modeling experiment is described in Section 4; sample experimental WRF output with and without LIS initialization data is given in Section 5; a summary is given in Section 6, followed by acknowledgements and references.

  4. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase is updated, and suggestions on how it can be performed (e.g. intensity, duration, recovery) are provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.

  5. Physical property measurements on analog granites related to the joint verification experiment

    NASA Astrophysics Data System (ADS)

    Martin, Randolph J., III; Coyner, Karl B.; Haupt, Robert W.

    1990-08-01

    A key element in the JVE (Joint Verification Experiment), conducted jointly by the United States and the USSR, is the analysis of the geology and physical properties of the rocks at the respective test sites. A study was initiated to examine unclassified crystalline rock specimens obtained from areas near the Soviet site, Semipalatinsk, and appropriate analog samples selected from Mt. Katahdin, Maine. These rocks were also compared to Sierra White and Westerly Granite, which have been studied in great detail. Measurements performed to characterize these rocks were: (1) uniaxial strain with simultaneous compressional and shear wave velocities; (2) hydrostatic compression to 150 MPa with simultaneous compressional and shear wave velocities; (3) attenuation measurements as a function of frequency and strain amplitude for both dry and water-saturated conditions. Elastic moduli determined from the hydrostatic compression and uniaxial strain tests show that the rock matrix/mineral properties were comparable in magnitude, varying within 25 percent from sample to sample. These properties appear to be approximately isotropic, especially at high pressures. However, anisotropy evident in certain samples at pressures below 35 MPa is attributed to dominant pre-existing microcrack populations and their alignments. The dependence of extensional attenuation and Young's modulus on strain amplitude was experimentally determined for intact Sierra White granite using the hysteresis loop technique.

  6. CSTI Earth-to-orbit propulsion research and technology program overview

    NASA Technical Reports Server (NTRS)

    Gentz, Steven J.

    1993-01-01

    NASA supports a vigorous Earth-to-orbit (ETO) research and technology program as part of its Civil Space Technology Initiative. The purpose of this program is to provide an up-to-date technology base to support future space transportation needs for a new generation of lower cost, operationally efficient, long-lived and highly reliable ETO propulsion systems by enhancing the knowledge, understanding and design methodology applicable to advanced oxygen/hydrogen and oxygen/hydrocarbon ETO propulsion systems. Program areas of interest include analytical models, advanced component technology, instrumentation, and validation/verification testing. Organizationally, the program is divided into two areas: (1) technology acquisition and (2) technology verification.

  7. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  8. Numerical Studies of an Array of Fluidic Diverter Actuators for Flow Control

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.; Kuczmarski, Maria A.; Culley, Dennis E.; Raghu, Surya

    2011-01-01

    In this paper, we study the effect of boundary conditions on the behavior of an array of uniformly-spaced fluidic diverters with an ultimate goal to passively control their output phase. This understanding will aid in the development of advanced designs of actuators for flow control applications in turbomachinery. Computations show that a potential design is capable of generating synchronous outputs for various inlet boundary conditions if the flow inside the array is initiated from quiescence. However, when the array operation is originally asynchronous, several approaches investigated numerically demonstrate that re-synchronization of the actuators in the array is not practical since it is very sensitive to asymmetric perturbations and imperfections. Experimental verification of the insights obtained from the present study is currently being pursued.

  9. Further Analysis on the Mystery of the Surveyor III Dust Deposits

    NASA Technical Reports Server (NTRS)

    Metzger, Philip; Hintze, Paul; Trigwell, Steven; Lane, John

    2012-01-01

    The Apollo 12 lunar module (LM) landing near the Surveyor III spacecraft at the end of 1969 has remained the primary experimental verification of the predicted physics of plume ejecta effects from a rocket engine interacting with the surface of the moon. This was made possible by the return of the Surveyor III camera housing by the Apollo 12 astronauts, allowing detailed analysis of the composition of dust deposited by the LM plume. It was soon realized after the initial analysis of the camera housing that the LM plume tended to remove more dust than it had deposited. In the present study, coupons from the camera housing have been reexamined. In addition, plume effects recorded in landing videos from each Apollo mission have been studied for possible clues.

  10. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can acquire satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. The damage detection methods that directly extract damage features from the periodically sampled dynamic time history response measurements are desirable but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure proposed in the first part have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to the structural deterioration or state alteration. This makes it possible to detect the structural damage for the real-scale structures experiencing ambient excitations and varying environmental conditions.

  11. Optical detection of random features for high security applications

    NASA Astrophysics Data System (ADS)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features in combination with digital signatures based on public key codes in order to recognize counterfeit objects will be discussed. Without applying expensive production techniques objects are protected against counterfeiting. Verification is done off-line by optical means without a central authority. The method is applied for protecting banknotes. Experimental results for this application are presented. The method is also applicable for identity verification of a credit- or chip-card holder.

  12. Delay compensation in integrated communication and control systems. II - Implementation and verification

    NASA Technical Reports Server (NTRS)

    Luck, Rogelio; Ray, Asok

    1990-01-01

    The implementation and verification of the delay-compensation algorithm are addressed. The delay compensator has been experimentally verified at an IEEE 802.4 network testbed for velocity control of a DC servomotor. The performance of the delay-compensation algorithm was also examined by combined discrete-event and continuous-time simulation of the flight control system of an advanced aircraft that uses the SAE (Society of Automotive Engineers) linear token passing bus for data communications.

  13. Technical Note: Using experimentally determined proton spot scanning timing parameters to accurately model beam delivery time.

    PubMed

    Shen, Jiajian; Tryggestad, Erik; Younkin, James E; Keole, Sameer R; Furutani, Keith M; Kang, Yixiu; Herman, Michael G; Bues, Martin

    2017-10-01

    To accurately model the beam delivery time (BDT) for a synchrotron-based proton spot scanning system using experimentally determined beam parameters. A model to simulate the proton spot delivery sequences was constructed, and BDT was calculated by summing the times for layer switches, spot switches, and spot delivery. Test plans were designed to isolate and quantify the relevant beam parameters in the operation cycle of the proton beam therapy delivery system. These parameters included the layer switch time, magnet preparation and verification time, average beam scanning speeds in the x- and y-directions, proton spill rate, and maximum charge and maximum extraction time for each spill. The experimentally determined parameters, as well as the nominal values initially provided by the vendor, served as inputs to the model to predict BDTs for 602 clinical proton beam deliveries. The calculated BDTs (T_BDT) were compared with the BDTs recorded in the treatment delivery log files (T_Log): ∆t = T_Log − T_BDT. The experimentally determined average layer switch time for all 97 energies was 1.91 s (ranging from 1.9 to 2.0 s for beam energies from 71.3 to 228.8 MeV), the average magnet preparation and verification time was 1.93 ms, the average scanning speeds were 5.9 m/s in the x-direction and 19.3 m/s in the y-direction, the proton spill rate was 8.7 MU/s, and the maximum proton charge available for one acceleration was 2.0 ± 0.4 nC. Some of the measured parameters differed from the nominal values provided by the vendor. The calculated BDTs using experimentally determined parameters matched the recorded BDTs of the 602 beam deliveries (∆t = −0.49 ± 1.44 s), significantly more accurately than BDTs calculated using nominal timing parameters (∆t = −7.48 ± 6.97 s). An accurate model for BDT prediction was achieved by using the experimentally determined proton beam therapy delivery parameters, which may be useful in modeling the interplay effect and patient throughput. The model may provide guidance on how to effectively reduce BDT and may be used to identify deteriorating machine performance. © 2017 American Association of Physicists in Medicine.
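The BDT model described is, at its core, an accounting of three time components summed over a spot-delivery sequence. A minimal sketch using the abstract's reported average parameters; the function name, spot-plan data structure, and the treatment of spot-switch time as magnet preparation plus slower-axis magnet travel are illustrative assumptions, not the authors' actual implementation:

```python
# Experimentally determined averages reported in the abstract.
LAYER_SWITCH_S = 1.91       # average energy-layer switch time (s)
MAGNET_PREP_S = 1.93e-3     # magnet preparation/verification time per spot switch (s)
SPEED_X, SPEED_Y = 5.9, 19.3  # average scanning speeds (m/s)
SPILL_RATE_MU_S = 8.7       # proton spill rate (MU/s)

def beam_delivery_time(layers):
    """Estimate BDT for a spot-scanning plan.
    layers: list of energy layers; each layer is a list of (x_m, y_m, mu) spots.
    BDT = sum of layer-switch, spot-switch, and spot-delivery times."""
    total = 0.0
    for layer in layers:
        total += LAYER_SWITCH_S
        prev = None
        for x, y, mu in layer:
            if prev is not None:
                dx, dy = abs(x - prev[0]), abs(y - prev[1])
                # Spot switch: magnet prep plus travel time of the slower axis
                # (an assumed model of the switch, not stated in the abstract).
                total += MAGNET_PREP_S + max(dx / SPEED_X, dy / SPEED_Y)
            total += mu / SPILL_RATE_MU_S  # spot delivery: MU at the spill rate
            prev = (x, y)
    return total

# One layer, two 1 MU spots 1 cm apart in x.
bdt = beam_delivery_time([[(0.0, 0.0, 1.0), (0.01, 0.0, 1.0)]])
```

A full model would also cap each spill at the maximum extractable charge (2.0 ± 0.4 nC) and add re-acceleration time when a layer exhausts a spill, which the sketch omits.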

  14. FY2017 Final Report: Power of the People: A technical, ethical, and experimental examination of the use of crowdsourcing to support international nuclear safeguards verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire

    Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the “power of the people,” harnessed via online games, communities of interest, and other platforms, to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.

  15. Experimental Verification of the Use of Metal Filled Via Hole Fences for Crosstalk Control of Microstrip Lines in LTCC Packages

    NASA Technical Reports Server (NTRS)

    Ponchak, George E.; Chun, Donghoon; Yook, Jong-Gwan; Katehi, Linda P. B.

    2001-01-01

    Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior three-dimensional finite element method (3-D-FEM) electromagnetic simulations have shown that metal filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3-D-FEM simulations is demonstrated for commercially fabricated low temperature cofired ceramic (LTCC) packages. In addition, measured attenuation of microstrip lines surrounded by the shielding structures is presented, showing that the shielding structures do not change the attenuation characteristics of the line.

  16. Heat-straightening effects on the behavior of plates and rolled shapes : volume 2 : second interim report of phase 1.

    DOT National Transportation Integrated Search

    1987-08-01

    One of the primary reasons that highway departments are hesitant to use heat-straightening techniques to repair damaged steel girders is the lack of experimental verification of the process. A comprehensive experimental program on the subject has bee...

  17. Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.

    ERIC Educational Resources Information Center

    Kaya, Azmi

    1982-01-01

    Discusses analytical design and experimental verification of a PID control value for a temperature controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…

  18. Effects of rotation on coolant passage heat transfer. Volume 2: Coolant passages with trips normal and skewed to the flow

    NASA Technical Reports Server (NTRS)

    Johnson, B. V.; Wagner, J. H.; Steuber, G. D.

    1993-01-01

    An experimental program was conducted to investigate the heat transfer and pressure loss characteristics of rotating multipass passages, for configurations and dimensions typical of modern turbine blades. This experimental program is one part of the NASA Hot Section Technology (HOST) Initiative, which has as its overall objective the development and verification of improved analysis methods that will form the basis for a design system producing turbine components with improved durability. The objective of this program was the generation of a database of heat transfer and pressure loss data required to develop heat transfer correlations and to assess computational fluid dynamic techniques for rotating coolant passages. The experimental work was broken down into two phases. Phase 1 consists of experiments conducted in a smooth-wall large-scale heat transfer model. A detailed discussion of these results was presented in Volume 1 of a NASA report. In Phase 2, the large-scale model was modified to investigate the effects of skewed and normal passage turbulators. The results of Phase 2, along with comparisons to Phase 1, are the subject of this Volume 2 NASA report.

  19. TU-FG-BRB-05: A 3 Dimensional Prompt Gamma Imaging System for Range Verification in Proton Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, E; Chen, H; Polf, J

    2016-06-15

    Purpose: To report on the initial development of a clinical 3-dimensional (3D) prompt gamma (PG) imaging system for proton radiotherapy range verification. Methods: The new imaging system under development consists of a prototype Compton camera (CC) to measure PG emission during proton beam irradiation and software to reconstruct, display, and analyze 3D images of the PG emission. For initial tests of the system, PGs were measured with the prototype CC during a 200 cGy dose delivery with clinical proton pencil beams (ranging from 100 MeV to 200 MeV) to a water phantom. Measurements were also carried out with the CC placed 15 cm from the phantom for a full-range 150 MeV pencil beam and with its range shifted by 2 mm. Reconstructed images of the PG emission were displayed by the clinical PG imaging software and compared to the dose distributions of the proton beams calculated by a commercial treatment planning system. Results: Measurements made with the new PG imaging system showed that a 3D image could be reconstructed from PGs measured during the delivery of 200 cGy of dose, and that shifts in the Bragg peak range of as little as 2 mm could be detected. Conclusion: Initial tests of the new PG imaging system show its potential to provide 3D imaging and range verification for proton radiotherapy. Based on these results, we have begun work to improve the system with the goal that images can be produced from delivery of as little as 20 cGy, so that the system could be used for in-vivo proton beam range verification on a daily basis.

  20. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  1. Testing Dialog-Verification of SIP Phones with Single-Message Denial-of-Service Attacks

    NASA Astrophysics Data System (ADS)

    Seedorf, Jan; Beckers, Kristian; Huici, Felipe

    The Session Initiation Protocol (SIP) is widely used for signaling in multimedia communications. However, many SIP implementations are still in their infancy and vulnerable to malicious messages. We investigate flaws in the SIP implementations of eight phones, showing that the deficient verification of SIP dialogs further aggravates the problem by making it easier for attacks to succeed. Our results show that the majority of the phones we tested are susceptible to these attacks.

  2. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF EIGHT RAPID TOXICITY TESTING SYSTEMS: STRATEGIC DIAGNOSTICS INC.'S DELTATOX (R) AND MICROTOX (R), SEVERN TRENT SERVICES ECLOX, HACH COMPANY TOXTRAK, INTERLAB SUPPLY, LTD. POLYTOX (TM), CHECKLIGHT, LTD. TOXSCREEN, AQUA SURVEY, INC. IQ TOXICITY TEST (TM), HIDEX OY BIOTOX (TM)

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  3. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001; Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee (EXCO) in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5, conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  4. Accuracy of self-reported smoking abstinence in clinical trials of hospital-initiated smoking interventions.

    PubMed

    Scheuermann, Taneisha S; Richter, Kimber P; Rigotti, Nancy A; Cummins, Sharon E; Harrington, Kathleen F; Sherman, Scott E; Zhu, Shu-Hong; Tindle, Hilary A; Preacher, Kristopher J

    2017-12-01

    To estimate the prevalence and predictors of failed biochemical verification of self-reported abstinence among participants enrolled in trials of hospital-initiated smoking cessation interventions. Comparison of characteristics between participants who verified and those who failed to verify self-reported abstinence. Multi-site randomized clinical trials conducted between 2010 and 2014 in hospitals throughout the United States. Recently hospitalized smokers who reported tobacco abstinence 6 months post-randomization and provided a saliva sample for verification purposes (n = 822). Outcomes were salivary cotinine-verified smoking abstinence at 10 and 15 ng/ml cut-points. Predictors and correlates included participant demographics and tobacco use; hospital diagnoses and treatment; and study characteristics collected via surveys and electronic medical records. Usable samples were returned by 69.8% of the 1178 eligible trial participants who reported 7-day point prevalence abstinence. The proportion of participants verified as quit was 57.8% [95% confidence interval (CI) = 54.4, 61.2; 10 ng/ml cut-off] or 60.6% (95% CI = 57.2, 63.9; 15 ng/ml). Factors associated independently with verification at 10 ng/ml were education beyond high school education [odds ratio (OR) = 1.51; 95% CI = 1.07, 2.11], continuous abstinence since hospitalization (OR = 2.82; 95% CI = 2.02, 3.94), mailed versus in-person sample (OR = 3.20; 95% CI = 1.96, 5.21) and race. African American participants were less likely to verify abstinence than white participants (OR = 0.64; 95% CI = 0.44, 0.93). Findings were similar for verification at 15 ng/ml. Verification rates did not differ by treatment group. In the United States, high rates (40%) of recently hospitalized smokers enrolled in smoking cessation trials fail biochemical verification of their self-reported abstinence. © 2017 Society for the Study of Addiction.
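The reported proportions and 95% confidence intervals are consistent with the standard normal (Wald) approximation for a binomial proportion. A quick sketch of that arithmetic; the count of 475 verified participants is inferred from the reported 57.8% of 822 and is not stated in the abstract:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% CI for a binomial proportion.
    Returns (p, lower, upper)."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, p - half_width, p + half_width

# ~475 of 822 samples verified at the 10 ng/ml cut-point (inferred count).
p, lo, hi = proportion_ci(475, 822)
print(f"{100 * p:.1f}% [{100 * lo:.1f}, {100 * hi:.1f}]")
```

Running this reproduces an interval very close to the reported 57.8% [54.4, 61.2]; a published analysis might instead use a Wilson or exact interval, which differs slightly at these sample sizes.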

  5. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is a persistent challenge in the biometric verification field. Recently, it has become popular to improve AFVS performance by using ensemble learning to fuse related fingerprint information. In this article, we propose a novel fingerprint verification framework based on the multitemplate ensemble method. The framework consists of three stages. In the first (enrollment) stage, we adopt an effective template selection method to select the fingerprints that best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second (verification) stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. Experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework for fingerprint verification. With a minutiae-based matching method, the average EER over the four FVC2004 databases drops from 10.85 to 0.88, and with a ridge-based matching method, it decreases from 14.58 to 2.51.
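A minimal sketch of the multitemplate idea described above: embed each enrolled template as its vector of match scores against the template set, take the centroid as the virtual point, and fuse per-template distances with a selectable rule. The matcher, data, and the specific embedding are hypothetical stand-ins, not the authors' algorithm:

```python
import numpy as np

def enroll(match, templates):
    """Embed each enrolled template as its vector of match scores against
    all enrolled templates; return the centroid of those points.
    `match(a, b)` is any symmetric matcher score (an assumption here)."""
    pts = np.array([[match(t, u) for u in templates] for t in templates])
    return pts.mean(axis=0)

def verify(match, templates, centroid, query, rule=min):
    """Distance between the query's score vector and the centroid,
    with a selectable fusion rule (min/mean/max) picking one distance
    from the per-template distance set."""
    q = np.array([match(query, t) for t in templates])
    d = np.abs(q - centroid)          # per-template distances
    return rule(d)

# Toy matcher on 1-D 'fingerprints' (a stand-in for minutiae matching)
match = lambda a, b: -abs(a - b)
tpl = [0.9, 1.0, 1.1]
c = enroll(match, tpl)
print(verify(match, tpl, c, 1.05))
```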

  6. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    NASA Technical Reports Server (NTRS)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. 1.2 Test procedure sequence. The sequence in which the several phases of this test procedure take place is shown in Figure 1; however, the phases may be performed in any order.

  7. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
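The Richardson-extrapolation and GCI machinery such a package automates can be illustrated on three systematically refined grids. These are the standard Roache-style formulas; the toy data and variable names are ours:

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy from solutions on three grids with
    constant refinement ratio r (f1 finest ... f3 coarsest)."""
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

def richardson(f1, f2, r, p):
    """Richardson-extrapolated (approximately grid-independent) value."""
    return f1 + (f1 - f2) / (r**p - 1)

def gci_fine(f1, f2, r, p, Fs=1.25):
    """Grid Convergence Index on the fine grid (safety factor Fs)."""
    e = (f2 - f1) / f1                 # relative difference
    return Fs * abs(e) / (r**p - 1)

# Toy data with a genuinely 2nd-order error: f(h) = 1 + h^2
f1, f2, f3, r = 1.0 + 0.01**2, 1.0 + 0.02**2, 1.0 + 0.04**2, 2.0
p = observed_order(f1, f2, f3, r)
print(p, richardson(f1, f2, r, p), gci_fine(f1, f2, r, p))
```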

  8. Model Checking Satellite Operational Procedures

    NASA Astrophysics Data System (ADS)

    Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri

    2011-08-01

    We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a complex system such as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. To assess the feasibility of our approach we present experimental results on a simple but meaningful scenario. Our results show that we can save up to 90% of verification time.
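In miniature, driving a simulator from a model checker amounts to explicit-state exhaustive exploration of every command sequence. A sketch under obvious simplifications: a pure function `step` stands in for the simulator, and the toy battery model is ours, not a SIMSAT scenario:

```python
from collections import deque

def explore(initial, step, commands, check):
    """Breadth-first exhaustive exploration of all reachable simulator
    states. `step(state, cmd)` stands in for sending a telecommand and
    reading back telemetry; `check` is the safety property.
    Returns a violating state (counterexample) or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not check(s):
            return s                      # counterexample found
        for cmd in commands:
            t = step(s, cmd)
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None

# Toy model: a battery level, commands drain or charge, property: level >= 0
step = lambda level, cmd: max(min(level + cmd, 3), -1)
bad = explore(2, step, [-2, +1], lambda lv: lv >= 0)
print(bad)
```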

  9. Experimental verification of the spectral shift between near- and far-field peak intensities of plasmonic infrared nanoantennas.

    PubMed

    Alonso-González, P; Albella, P; Neubrech, F; Huck, C; Chen, J; Golmar, F; Casanova, F; Hueso, L E; Pucci, A; Aizpurua, J; Hillenbrand, R

    2013-05-17

    Theory predicts a distinct spectral shift between the near- and far-field optical response of plasmonic antennas. Here we combine near-field optical microscopy and far-field spectroscopy of individual infrared-resonant nanoantennas to verify experimentally this spectral shift. Numerical calculations corroborate our experimental results. We furthermore discuss the implications of this effect in surface-enhanced infrared spectroscopy.

  10. Theoretical verification of experimentally obtained conformation-dependent electronic conductance in a biphenyl molecule

    NASA Astrophysics Data System (ADS)

    Maiti, Santanu K.

    2014-07-01

    The experimentally obtained (Venkataraman et al. [1]) cosine squared relation of electronic conductance in a biphenyl molecule is verified theoretically within a tight-binding framework. Using Green's function formalism we numerically calculate two-terminal conductance as a function of relative twist angle among the molecular rings and find that the results are in good agreement with the experimental observation.
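The cosine-squared relation G(theta) = G0 cos^2(theta) can be checked against conductance data with a one-parameter least-squares fit. A sketch on synthetic data; G0 = 1.5 is an arbitrary illustrative value, not a number from the paper:

```python
import math

def fit_cos2(angles, conductances):
    """Least-squares fit of G(theta) = G0 * cos^2(theta);
    returns the estimated G0 (closed form for a one-parameter model)."""
    num = sum(g * math.cos(t)**2 for t, g in zip(angles, conductances))
    den = sum(math.cos(t)**4 for t in angles)
    return num / den

# Synthetic conductance data generated from G0 = 1.5 (illustrative)
thetas = [0.0, 0.3, 0.6, 0.9, 1.2]
gs = [1.5 * math.cos(t)**2 for t in thetas]
print(fit_cos2(thetas, gs))
```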

  11. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  12. Improving Numerical Weather Predictions of Summertime Precipitation Over the Southeastern U.S. Through a High-Resolution Initialization of the Surface State

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Kumar, Sujay V.; Krikishen, Jayanthi; Jedlovec, Gary J.

    2011-01-01

    It is hypothesized that high-resolution, accurate representations of surface properties such as soil moisture and sea surface temperature are necessary to improve simulations of summertime pulse-type convective precipitation in high resolution models. This paper presents model verification results of a case study period from June-August 2008 over the Southeastern U.S. using the Weather Research and Forecasting numerical weather prediction model. Experimental simulations initialized with high-resolution land surface fields from the NASA Land Information System (LIS) and sea surface temperature (SST) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) are compared to a set of control simulations initialized with interpolated fields from the National Centers for Environmental Prediction 12-km North American Mesoscale model. The LIS land surface and MODIS SSTs provide a more detailed surface initialization at a resolution comparable to the 4-km model grid spacing. Soil moisture from the LIS spin-up run is shown to respond better to the extreme rainfall of Tropical Storm Fay in August 2008 over the Florida peninsula. The LIS has slightly lower errors and higher anomaly correlations in the top soil layer, but exhibits a stronger dry bias in the root zone. The model sensitivity to the alternative surface initial conditions is examined for a sample case, showing that the LIS/MODIS data substantially impact surface and boundary layer properties.
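Two of the verification measures mentioned above (mean error, i.e. bias, and anomaly correlation) reduce to short formulas. A sketch on hypothetical soil-moisture values, not actual LIS output:

```python
import numpy as np

def bias(forecast, obs):
    """Mean error (bias) of a forecast field."""
    return float(np.mean(forecast - obs))

def anomaly_correlation(forecast, obs, climatology):
    """Pearson correlation of forecast and observed departures
    from a common climatology."""
    fa, oa = forecast - climatology, obs - climatology
    return float(np.sum(fa * oa) / np.sqrt(np.sum(fa**2) * np.sum(oa**2)))

# Toy volumetric soil-moisture series (illustrative numbers)
clim = np.array([0.20, 0.22, 0.25, 0.23])
obs  = np.array([0.25, 0.20, 0.30, 0.21])
fc   = np.array([0.24, 0.21, 0.29, 0.22])
print(bias(fc, obs), anomaly_correlation(fc, obs, clim))
```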

  13. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
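Classical Guyan Reduction, the baseline that MGR and HR improve upon, is static condensation onto a set of retained DOFs. A minimal numpy sketch on toy spring-chain matrices, not SLS models:

```python
import numpy as np

def guyan_reduce(K, M, kept):
    """Classical Guyan (static) condensation of stiffness K and mass M
    onto the retained DOFs in `kept`; omitted DOFs are condensed out
    via the static transformation T = [I; -Koo^-1 Koa]."""
    n = K.shape[0]
    omitted = [i for i in range(n) if i not in kept]
    Kaa = K[np.ix_(kept, kept)];    Kao = K[np.ix_(kept, omitted)]
    Koa = K[np.ix_(omitted, kept)]; Koo = K[np.ix_(omitted, omitted)]
    G = -np.linalg.solve(Koo, Koa)           # static constraint modes
    T = np.vstack([np.eye(len(kept)), G])    # rows ordered kept-then-omitted
    Kfull = np.block([[Kaa, Kao], [Koa, Koo]])
    Mfull = M[np.ix_(kept + omitted, kept + omitted)]
    return T.T @ Kfull @ T, T.T @ Mfull @ T

# Two-spring chain; condense out the middle DOF (toy example)
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
M = np.eye(3)
Kr, Mr = guyan_reduce(K, M, [0, 2])
print(Kr)
```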

  14. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different providers, a composite service describing the complete business process of an enterprise can be built. Many standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extended Markup Language (XML) specification language for defining and implementing business process workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services: formal verification methods are needed to ensure the correctness of composed services. Few research works in the literature address verification of web services, and those mostly for deterministic systems. Moreover, the existing models do not address verification properties such as dead transition, deadlock, reachability, and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on dead transition, deadlock, safety, liveness, and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into an ESAM (a combination of a Muller Automaton (MA) and a Push Down Automaton (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results show better performance in finding dead transitions and deadlocks than the existing models.
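Reachability and deadlock, two of the properties checked, can be illustrated on an explicit transition system. The workflow states below are hypothetical, not BPEL4WS output:

```python
from collections import deque

def analyze(initial, transitions):
    """Explicit-state check of two verification properties:
    reachability (which states can be reached at all) and deadlock
    (reachable states with no outgoing transition).
    `transitions` maps a state to its successor states."""
    reachable, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        for t in transitions.get(s, ()):
            if t not in reachable:
                reachable.add(t)
                queue.append(t)
    deadlocks = {s for s in reachable if not transitions.get(s)}
    return reachable, deadlocks

# Toy composed-service workflow (hypothetical states)
ts = {"start": ["pay", "cancel"], "pay": ["ship"], "ship": ["done"],
      "done": ["start"], "cancel": []}
reach, dead = analyze("start", ts)
print(sorted(reach), sorted(dead))
```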

  15. Intelligent wear mode identification system for marine diesel engines based on multi-level belief rule base methodology

    NASA Astrophysics Data System (ADS)

    Yan, Xinping; Xu, Xiaojian; Sheng, Chenxing; Yuan, Chengqing; Li, Zhixiong

    2018-01-01

    Wear faults are among the chief causes of main-engine damage, significantly influencing the secure and economical operation of ships. It is difficult for engineers to utilize multi-source information to identify wear modes, so an intelligent wear mode identification model needs to be developed to assist engineers in diagnosing wear faults in diesel engines. For this purpose, a multi-level belief rule base (BBRB) system is proposed in this paper. The BBRB system consists of two-level belief rule bases, and the 2D and 3D characteristics of wear particles are used as antecedent attributes on each level. Quantitative and qualitative wear information with uncertainties can be processed simultaneously by the BBRB system. In order to enhance the efficiency of the BBRB, the silhouette value is adopted to determine referential points and the fuzzy c-means clustering algorithm is used to transform input wear information into belief degrees. In addition, the initial parameters of the BBRB system are constructed on the basis of expert-domain knowledge and then optimized by the genetic algorithm to ensure the robustness of the system. To verify the validity of the BBRB system, experimental data acquired from real-world diesel engines are analyzed. Five-fold cross-validation is conducted on the experimental data and the BBRB is compared with the other four models in the cross-validation. In addition, a verification dataset containing different wear particles is used to highlight the effectiveness of the BBRB system in wear mode identification. The verification results demonstrate that the proposed BBRB is effective and efficient for wear mode identification with better performance and stability than competing systems.
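The transformation of a crisp wear measurement into belief degrees over referential points (the step the clustering and silhouette analysis feed) follows the standard linear rule-matching scheme used in belief-rule-base systems. The referential values below are illustrative, not the paper's calibrated points:

```python
def belief_degrees(x, refs):
    """Transform a crisp input into belief degrees over ordered
    referential points: an input between two adjacent referential
    values splits its belief linearly between them."""
    if x <= refs[0]:
        return [1.0] + [0.0] * (len(refs) - 1)
    if x >= refs[-1]:
        return [0.0] * (len(refs) - 1) + [1.0]
    beliefs = [0.0] * len(refs)
    for i in range(len(refs) - 1):
        lo, hi = refs[i], refs[i + 1]
        if lo <= x <= hi:
            beliefs[i] = (hi - x) / (hi - lo)
            beliefs[i + 1] = 1.0 - beliefs[i]
            break
    return beliefs

# Wear-particle size (um) against illustrative referential points
print(belief_degrees(7.0, [2.0, 6.0, 10.0]))
```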

  16. National Centers for Environmental Prediction

    Science.gov Websites


  17. An Overview and Empirical Comparison of Distance Metric Learning Methods.

    PubMed

    Moutafis, Panagiotis; Leng, Mengjun; Kakadiaris, Ioannis A

    2016-02-16

    In this paper, we first offer an overview of advances in the field of distance metric learning. Then, we empirically compare selected methods using a common experimental protocol. The number of distance metric learning algorithms proposed keeps growing due to their effectiveness and wide application. However, existing surveys are either outdated or they focus only on a few methods. As a result, there is an increasing need to summarize the obtained knowledge in a concise, yet informative manner. Moreover, existing surveys do not conduct comprehensive experimental comparisons. On the other hand, individual distance metric learning papers compare the performance of the proposed approach with only a few related methods and under different settings. This highlights the need for an experimental evaluation using a common and challenging protocol. To this end, we conduct face verification experiments, as this task poses significant challenges due to varying conditions during data acquisition. In addition, face verification is a natural application for distance metric learning because the encountered challenge is to define a distance function that: 1) accurately expresses the notion of similarity for verification; 2) is robust to noisy data; 3) generalizes well to unseen subjects; and 4) scales well with the dimensionality and number of training samples. In particular, we utilize well-tested features to assess the performance of selected methods following the experimental protocol of the state-of-the-art Labeled Faces in the Wild (LFW) database. A summary of the results is presented along with a discussion of the insights obtained and lessons learned by employing the corresponding algorithms.
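Many of the surveyed methods learn a Mahalanobis-type metric, and verification then reduces to thresholding the learned distance on a pair of feature vectors. A sketch with a hand-picked matrix M; the features, M, and threshold are all illustrative, not a learned model:

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared learned distance d_M(x, y)^2 = (x - y)^T M (x - y);
    M is the positive semi-definite matrix a metric-learning
    method would produce."""
    d = x - y
    return float(d @ M @ d)

def same_person(x, y, M, threshold):
    """Verification decision: accept the pair if the learned
    distance falls below a threshold tuned on held-out pairs."""
    return mahalanobis_sq(x, y, M) < threshold

# Toy 2-D 'face features'; M downweights the second (noisy) dimension
M = np.diag([1.0, 0.1])
a, b = np.array([1.0, 5.0]), np.array([1.2, 2.0])
print(mahalanobis_sq(a, b, M), same_person(a, b, M, threshold=1.5))
```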

  18. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation was performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance of several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for alternate project applications.

  19. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  20. Constitutive modeling of superalloy single crystals with verification testing

    NASA Technical Reports Server (NTRS)

    Jordan, Eric; Walker, Kevin P.

    1985-01-01

    The goal is the development of constitutive equations to describe the elevated temperature stress-strain behavior of single crystal turbine blade alloys. The program includes both the development of a suitable model and verification of the model through elevated temperature-torsion testing. A constitutive model is derived from postulated constitutive behavior on individual crystallographic slip systems. The behavior of the entire single crystal is then arrived at by summing up the slip on all the operative crystallographic slip systems. This type of formulation has a number of important advantages, including the prediction of orientation dependence and the ability to directly represent the constitutive behavior in terms which metallurgists use in describing the micromechanisms. Here, the model is briefly described, followed by the experimental set-up and some experimental findings to date.

  1. A method of atmospheric density measurements during space shuttle entry using ultraviolet-laser Rayleigh scattering

    NASA Technical Reports Server (NTRS)

    Mckenzie, Robert L.

    1988-01-01

    An analytical study and its experimental verification are described which show the performance capabilities and the hardware requirements of a method for measuring atmospheric density along the Space Shuttle flightpath during entry. Using onboard instrumentation, the technique relies on Rayleigh scattering of light from a pulsed ArF excimer laser operating at a wavelength of 193 nm. The method is shown to be capable of providing density measurements with an uncertainty of less than 1 percent and with a spatial resolution along the flightpath of 1 km, over an altitude range from 50 to 90 km. Experimental verification of the signal linearity and the expected signal-to-noise ratios is demonstrated in a simulation facility at conditions that duplicate the signal levels of the flight environment.

  2. Experimental verification of cleavage characteristic stress vs grain size

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, W.; Li, D.; Yao, M.

    Instead of the accepted cleavage fracture stress σ_f proposed by Knott et al., a new parameter S_co, named the "cleavage characteristic stress," has recently been recommended to characterize the microscopic resistance to cleavage fracture. By definition, S_co is the fracture stress at the brittle/ductile transition temperature of steels in plain tension, below which the yield strength approximately equals the true fracture stress, combined with an abrupt curtailment of ductility. By considering a single-grain microcrack arrested at a boundary, Huang and Yao set up an expression for S_co as a function of grain size. The present work was arranged to provide an experimental verification of S_co vs. grain size.

  3. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  4. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    NASA Astrophysics Data System (ADS)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). 
Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC-based SBRT treatment planning in the routine clinical setting.
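Agreement criteria such as 2%/1 mm are typically scored with a gamma index (Low et al. style). A 1-D sketch on toy dose profiles; this is a generic illustration, not the study's film-analysis pipeline:

```python
import math

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dta=1.0, dd=0.02):
    """1-D gamma index: for each reference point, the minimum combined
    distance-to-agreement / dose-difference metric over the evaluated
    distribution. gamma <= 1 means the point passes the criteria."""
    gammas = []
    for r, dr in zip(ref_pos, ref_dose):
        g = min(math.sqrt(((r - e) / dta) ** 2 + ((de - dr) / dd) ** 2)
                for e, de in zip(eval_pos, eval_dose))
        gammas.append(g)
    return gammas

# Toy profiles (positions in mm, relative dose); 2%/1 mm criteria
xs = [0.0, 1.0, 2.0]
measured = [1.00, 0.95, 0.50]
computed = [1.01, 0.94, 0.53]
g = gamma_index(xs, measured, xs, computed)
pass_rate = sum(v <= 1.0 for v in g) / len(g)
print(g, pass_rate)   # third point deliberately fails
```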

  5. Kelvin–Helmholtz instability in an ultrathin air film causes drop splashing on smooth surfaces

    PubMed Central

    Liu, Yuan; Tan, Peng; Xu, Lei

    2015-01-01

    When a fast-moving drop impacts onto a smooth substrate, splashing will be produced at the edge of the expanding liquid sheet. This ubiquitous phenomenon lacks a fundamental understanding. Combining experiment with model, we illustrate that the ultrathin air film trapped under the expanding liquid front triggers splashing. Because this film is thinner than the mean free path of air molecules, the interior airflow transfers momentum with an unusually high velocity comparable to the speed of sound and generates a stress 10 times stronger than the airflow in common situations. Such a large stress initiates Kelvin–Helmholtz instabilities at small length scales and effectively produces splashing. Our model agrees quantitatively with experimental verifications and brings a fundamental understanding to the ubiquitous phenomenon of drop splashing on smooth surfaces. PMID:25713350

  6. Joint Estimation of Source Range and Depth Using a Bottom-Deployed Vertical Line Array in Deep Water

    PubMed Central

    Li, Hui; Yang, Kunde; Duan, Rui; Lei, Zhixiong

    2017-01-01

    This paper presents a joint estimation method of source range and depth using a bottom-deployed vertical line array (VLA). The method utilizes the information on the arrival angle of direct (D) path in space domain and the interference characteristic of D and surface-reflected (SR) paths in frequency domain. The former is related to a ray tracing technique to backpropagate the rays and produces an ambiguity surface of source range. The latter utilizes Lloyd’s mirror principle to obtain an ambiguity surface of source depth. The acoustic transmission duct is the well-known reliable acoustic path (RAP). The ambiguity surface of the combined estimation is a dimensionless ad hoc function. Numerical efficiency and experimental verification show that the proposed method is a good candidate for initial coarse estimation of source position. PMID:28590442

  7. End-point detection in potentiometric titration by continuous wavelet transform.

    PubMed

    Jakubowska, Małgorzata; Baś, Bogusław; Kubiak, Władysław W

    2009-10-15

    The aim of this work was the construction of a new wavelet function and verification that a continuous wavelet transform with a specially defined, dedicated mother wavelet is a useful tool for precise detection of the end-point in a potentiometric titration. The proposed algorithm does not require any initial information about the nature or type of analyte and/or the shape of the titration curve. Signal imperfections, as well as random noise or spikes, have no influence on the operation of the procedure. The optimization of the new algorithm was done using simulated curves, and experimental data were considered next. In the case of well-shaped and noise-free titration data, the proposed method gives the same accuracy and precision as commonly used algorithms. However, in the case of noisy or badly shaped curves, the presented approach works well (relative error mainly below 2% and coefficients of variability below 5%) while traditional procedures fail. Therefore, the proposed algorithm may be useful in the interpretation of experimental data and also in the automation of typical titration analysis, especially when random noise interferes with the analytical signal.
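The approach can be sketched with a derivative-of-Gaussian mother wavelet: the single-scale CWT response peaks at the inflection (end) point of a sigmoidal titration curve. The wavelet choice and synthetic curve here are ours, not the authors' dedicated wavelet:

```python
import numpy as np

def dog_wavelet(scale, length):
    """First derivative of a Gaussian, usable as a mother wavelet whose
    response peaks at inflection points of the signal."""
    t = np.arange(length) - (length - 1) / 2.0
    return -t / scale**2 * np.exp(-t**2 / (2 * scale**2))

def endpoint_index(potentials, scale=4.0):
    """Locate the titration end-point as the maximum-magnitude response
    of a one-scale slice of a continuous wavelet transform. Edge padding
    keeps flat plateaus from producing spurious boundary responses."""
    L = int(8 * scale) | 1            # odd kernel length covering the support
    w = dog_wavelet(scale, L)
    padded = np.pad(potentials, L // 2, mode="edge")
    response = np.convolve(padded, w, mode="valid")
    return int(np.argmax(np.abs(response)))

# Synthetic sigmoidal titration curve with the equivalence point at 10 mL
v = np.linspace(0, 20, 201)
emf = 300 + 200 * np.tanh((v - 10.0) / 0.5)
i = endpoint_index(emf)
print(v[i])
```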

  8. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification. 2; Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data is indicative not only of the presence and density of stacking errors, but also that it can yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions and it appears that it is similar to the relation developed by others to explain the formation of the corresponding polytypes.

  9. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  10. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

    A traditional binding acceptance criterion for polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design maximum expected operational limit. In this extreme strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. A test may then erroneously accept a submarginal design or reject a reliable one. A technique was developed to identify, monitor, and assess the load-transmission error from two back-to-back surface-measured strains. The technique is programmed for expediency and convenience. Though developed to support affordable aerostructures, the method is also applicable to most high-performance air and surface transportation structural systems.
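    The back-to-back strain idea can be sketched with the standard thin-wall decomposition; this is a textbook relation used here for illustration, not the authors' exact monitoring algorithm:

    ```python
    def decompose_strains(eps_front, eps_back):
        """Split back-to-back surface strains into membrane and bending parts.

        Standard thin-wall relations (illustrative of the monitoring idea):
          membrane = (front + back) / 2   -- in-plane load transfer
          bending  = (front - back) / 2   -- out-of-plane rotation/displacement
        """
        membrane = 0.5 * (eps_front + eps_back)
        bending = 0.5 * (eps_front - eps_back)
        return membrane, bending

    # A growing bending fraction during a static test hints at load-path distortion.
    m, b = decompose_strains(1200e-6, 800e-6)
    print(f"membrane={m:.6f} bending={b:.6f}")
    ```

    Tracking the bending-to-membrane ratio over the load ramp gives a direct, instrumented flag for distorted load transmission into the test article.
    
    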

  11. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
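    The flavor of a runtime monitor generated from a temporal property can be sketched by hand. The property G(req -> F ack) and the two-state automaton below are illustrative, not the paper's SystemC/PSL tooling:

    ```python
    class RespondsMonitor:
        """Runtime monitor for G(req -> F ack): every request must
        eventually be acknowledged (hand-written, illustrative)."""

        def __init__(self):
            self.pending = False   # True while a request awaits its ack

        def step(self, req, ack):
            # Process one observation of the signals per clock cycle.
            if ack:
                self.pending = False
            if req and not ack:
                self.pending = True

        def end_of_trace_ok(self):
            # A finite trace satisfies the property if nothing is pending.
            return not self.pending

    mon = RespondsMonitor()
    for req, ack in [(1, 0), (0, 0), (0, 1), (1, 0)]:
        mon.step(req, ack)
    print(mon.end_of_trace_ok())  # False: the last request was never acked
    ```

    The encoding questions the paper studies (state minimization, alphabet representation, monitor encoding) concern how such automata are generated and packed for minimal runtime overhead.
    
    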

  12. The NASA Space Launch System Program Systems Engineering Approach for Affordability

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    The National Aeronautics and Space Administration is currently developing the Space Launch System to provide the United States with a capability to launch large Payloads into Low Earth orbit and deep space. One of the development tenets of the SLS Program is affordability. One initiative to enhance affordability is the SLS approach to requirements definition, verification and system certification. The key aspects of this initiative include: 1) Minimizing the number of requirements, 2) Elimination of explicit verification requirements, 3) Use of certified models of subsystem capability in lieu of requirements when appropriate and 4) Certification of capability beyond minimum required capability. Implementation of each aspect is described and compared to a "typical" systems engineering implementation, including a discussion of relative risk. Examples of each implementation within the SLS Program are provided.

  13. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
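    The reported second-order convergence is conventionally quantified by the observed order computed from errors at successive resolutions; a standard-practice sketch with illustrative numbers, not ALEGRA data:

    ```python
    import math

    def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
        """Observed convergence order from errors at two grid resolutions:
        p = ln(e_coarse / e_fine) / ln(r)."""
        return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

    # Errors from a manufactured solution, halving h each time (illustrative):
    errors = [4.0e-3, 1.0e-3, 2.5e-4]
    orders = [observed_order(errors[i], errors[i + 1]) for i in range(2)]
    for p in orders:
        print(f"{p:.2f}")  # each ~2.0 for a second-order method
    ```

    With a manufactured exact solution, the discretization error is known exactly at every resolution, which is why that route succeeded where the perturbation-theory reference did not.
    
    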

  14. CFD modeling and experimental verification of a single-stage coaxial Stirling-type pulse tube cryocooler without either double-inlet or multi-bypass operating at 30-35 K using mixed stainless steel mesh regenerator matrices

    NASA Astrophysics Data System (ADS)

    Dang, Haizheng; Zhao, Yibo

    2016-09-01

    This paper presents the CFD modeling and experimental verifications of a single-stage inertance tube coaxial Stirling-type pulse tube cryocooler operating at 30-35 K using mixed stainless steel mesh regenerator matrices without either double-inlet or multi-bypass. A two-dimensional axis-symmetric CFD model with the thermal non-equilibrium mode is developed to simulate the internal process, and the underlying mechanism of significantly reducing the regenerator losses with mixed matrices is discussed in detail based on the given six cases. The modeling also indicates that the combination of the given different mesh segments can be optimized to achieve the highest cooling efficiency or the largest exergy ratio, and then the verification experiments are conducted in which the satisfactory agreements between simulated and tested results are observed. The experiments achieve a no-load temperature of 27.2 K and the cooling power of 0.78 W at 35 K, or 0.29 W at 30 K, with an input electric power of 220 W and a reject temperature of 300 K.

  15. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  16. Quantitative measurements of vaporization, burst ionization, and emission characteristics of shaped charge barium releases

    NASA Technical Reports Server (NTRS)

    Hoch, Edward L.; Hallinan, Thomas J.; Stenbaek-Nielsen, Hans C.

    1994-01-01

    Intensity-calibrated color video recordings of three barium-shaped charge injections in the ionosphere were used to determine the initial ionization, the column density corresponding to unity optical depth, and the yield of vaporized barium in the fast jet. It was found that the initial ionization at the burst was less than 1% and that 0% burst ionization was consistent with the observations. Owing to the Doppler shift, the column density for optical thickness in the neutral barium varies somewhat according to the velocity distribution. For the cases examined here, the column density was 2-5 x 10(exp 10) atoms/sq cm. This value, which occurred 12 to 15 s after release, should be approximately valid for most shaped charge experiments. The yield was near 30% (15% in the fast jet) for two of the releases and was somewhat lower in the third, which also had a lower peak velocity. This study also demonstrated the applicability of the computer simulation code developed for chemical releases by Stenbaek-Nielsen and provided experimental verification of the Doppler-corrected emission rates calculated by Stenbaek-Nielsen (1989).

  17. National Centers for Environmental Prediction

    Science.gov Websites


  18. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    NASA Astrophysics Data System (ADS)

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

    High-range measuring equipment like laser trackers needs large-dimension calibrated reference artifacts in calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points, and of the reference lengths derived from them, is linked to the indexed metrology platform and to high-accuracy knowledge of the relative position and orientation of its upper and lower platforms. The measuring instrument, together with the indexed metrology platform, remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker. The experimental verification procedure of the laser tracker with virtual distances is simulated and compared with the conventional verification procedure using the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for the definition of reference distances in these procedures.
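    The virtual-distance generation can be sketched in simplified form: one fixed point, imaged through the platform's indexed rotations, yields many reference lengths. The planar rotations below are a stand-in for the platform's calibrated 6-DOF poses:

    ```python
    import math

    def rot_z(angle_deg):
        """Rotation matrix about z (planar stand-in for a full platform pose)."""
        a = math.radians(angle_deg)
        return [[math.cos(a), -math.sin(a), 0.0],
                [math.sin(a),  math.cos(a), 0.0],
                [0.0, 0.0, 1.0]]

    def apply(R, p):
        return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

    def virtual_distances(point, angles_deg):
        """Pairwise distances between images of one fixed point under the
        platform's indexed rotations: virtual reference lengths with no
        physical gauge materialized."""
        pts = [apply(rot_z(a), point) for a in angles_deg]
        return [math.dist(pts[i], pts[j])
                for i in range(len(pts)) for j in range(i + 1, len(pts))]

    # One reflector pose seen from 6 platform positions (60 deg indexing):
    d = virtual_distances([1.0, 0.0, 0.0], [0, 60, 120, 180, 240, 300])
    print(len(d))  # 15 virtual reference lengths from a single physical point
    ```

    The count grows combinatorially with the number of indexed positions, which is what makes the "unlimited number of reference distances" practical.
    
    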

  19. Turbine Engine Testing.

    DTIC Science & Technology

    1981-01-01

    per-rev, ring weighting factor, etc.) and with compression system design. A detailed description of the SAE methodology is provided in Ref. 1...offers insights into the practical application of experimental aeromechanical procedures and establishes the process of valid design assessment, avoiding...considerations given to the total engine system. Design Verification in the Experimental Laboratory: Certain key parameters are influencing the design of modern

  20. Resistivity Correction Factor for the Four-Probe Method: Experiment I

    NASA Astrophysics Data System (ADS)

    Yamashita, Masato; Yamaguchi, Shoji; Enjoji, Hideo

    1988-05-01

    Experimental verification of the theoretically derived resistivity correction factor (RCF) is presented. Resistivity and sheet resistance measurements by the four-probe method are made on three samples: isotropic graphite, ITO film and Au film. It is indicated that the RCF can correct the apparent variations of experimental data to yield reasonable resistivities and sheet resistances.
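    The underlying measurement relations can be sketched as follows. The ideal-geometry factor pi/ln(2) is the standard result for an in-line four-probe on an infinite thin sheet; the RCF value itself is geometry-dependent and is left as a parameter here rather than taken from the paper:

    ```python
    import math

    def sheet_resistance(voltage, current, rcf=1.0):
        """Four-probe sheet resistance with a geometry-dependent correction.

        pi/ln(2) (~4.532) assumes an infinite thin sheet with equal probe
        spacing; `rcf` corrects for finite sample size (its value comes from
        the theory the abstract refers to).
        """
        return (math.pi / math.log(2)) * (voltage / current) * rcf

    def resistivity(voltage, current, thickness_m, rcf=1.0):
        """Bulk resistivity (ohm*m) of a film of known thickness."""
        return sheet_resistance(voltage, current, rcf) * thickness_m

    rs = sheet_resistance(voltage=1.0e-3, current=1.0e-3)
    print(round(rs, 3))  # 4.532 ohm/sq for V = I, ideal geometry, rcf = 1
    ```

    Applying the RCF multiplicatively is what lets one correction table serve measurements on samples of differing shapes, as in the graphite, ITO, and Au film comparison.
    
    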

  1. ARC-1980-AC80-0512-2

    NASA Image and Video Library

    1980-06-05

    N-231 High Reynolds Number Channel Facility (an example of a versatile wind tunnel). Tunnel 1 is a blowdown facility that utilizes interchangeable test sections and nozzles. The facility provides experimental support for fluid mechanics research, including experimental verification of aerodynamic computer codes and boundary-layer and airfoil studies that require high Reynolds number simulation.

  2. National Centers for Environmental Prediction

    Science.gov Websites


  3. The Sedov Blast Wave as a Radial Piston Verification Test

    DOE PAGES

    Pederson, Clark; Brown, Bart; Morgan, Nathaniel

    2016-06-22

    The Sedov blast wave is of great utility as a verification problem for hydrodynamic methods. The typical implementation uses an energized cell of finite dimensions to represent the energy point source. We avoid this approximation by directly finding the effects of the energy source as a boundary condition (BC). Furthermore, the proposed method transforms the Sedov problem into an outward moving radial piston problem with a time-varying velocity. A portion of the mesh adjacent to the origin is removed and the boundaries of this hole are forced with the velocities from the Sedov solution. This verification test is implemented on two types of meshes, and convergence is shown. Our results from the typical initial condition (IC) method and the new BC method are compared.

  4. Alternative sample sizes for verification dose experiments and dose audits

    NASA Astrophysics Data System (ADS)

    Taylor, W. A.; Hansen, J. M.

    1999-01-01

    ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the cost associated with the different plans are provided. This paper includes additional guidance for selecting between the original and alternative sampling plans not included in the technical report.
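    Equivalence of sampling plans is judged by their operating characteristic (OC) curves. A hedged sketch of the binomial acceptance probability follows, with illustrative n and c, not the ISO 11137 plan parameters:

    ```python
    from math import comb

    def accept_probability(n, c, p_positive):
        """P(at most c positive tests among n) for per-unit positive rate p:
        the operating characteristic of an (n, c) acceptance sampling plan."""
        return sum(comb(n, k) * p_positive**k * (1 - p_positive)**(n - k)
                   for k in range(c + 1))

    # Two plans give comparable protection if their OC curves nearly coincide
    # (illustrative parameters only):
    for p in (0.01, 0.05, 0.10):
        print(p, round(accept_probability(100, 2, p), 3))
    ```

    Comparing OC curves at the quality levels of interest, together with per-test cost, is the basis for trading the original plans against cheaper alternatives.
    
    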

  5. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  6. Methode de Calcul du Flutter en Presence de jeu Mecanique et Verification Experimentale (Flutter Analysis Method in Presence of Mechanical Play and Experimental Verification)

    DTIC Science & Technology

    2000-05-01

    "Flexible Aircraft Control", held in Ottawa, Canada, 18-20 October 1999, and published in RTO MP-36. [Recoverable fragments, translated from garbled French OCR:] the model was instrumented with 17 strain-gauge bridges, 20 accelerometers, and 5 ...; the computed pressures on the control surface and the hinge moment are overestimated ...; calculation-test correlations of strain-gauge responses under 12 static loadings.

  7. Experimental verification of a new laminar airfoil: A project for the graduate program in aeronautics

    NASA Technical Reports Server (NTRS)

    Nicks, Oran W.; Korkan, Kenneth D.

    1991-01-01

    Two reports on student activities to determine the properties of a new laminar airfoil, delivered at a conference on soaring technology, are presented. The papers discuss a wind tunnel investigation and analysis of the SM701 airfoil and verification of the SM701 airfoil aerodynamic characteristics utilizing theoretical techniques. The papers are based on a combination of analytical design, hands-on model fabrication, wind tunnel calibration and testing, data acquisition and analysis, and comparison of test results and theory.

  8. Bounded Parametric Model Checking for Elementary Net Systems

    NASA Astrophysics Data System (ADS)

    Knapik, Michał; Szreter, Maciej; Penczek, Wojciech

    Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has been applied so far to verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time that BMC can be extended to PRTECTL - a parametric extension of the existential version of CTL. To this aim we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
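    The bounded-semantics idea can be illustrated with a brute-force bounded reachability check; a real BMC tool would encode the same unrolling as a SAT instance. The property, transition relation, and toy system below are hypothetical:

    ```python
    from itertools import product

    def bounded_ef(init, trans, prop, n_vars, k):
        """Bounded check of EF prop: does some execution of length <= k reach
        a state satisfying prop?  Brute-force over boolean states here; BMC
        instead translates this unrolling into a propositional formula."""
        frontier = {s for s in product((0, 1), repeat=n_vars) if init(s)}
        seen = set(frontier)
        for _ in range(k + 1):
            if any(prop(s) for s in frontier):
                return True
            frontier = {t for s in frontier
                        for t in product((0, 1), repeat=n_vars)
                        if trans(s, t)} - seen
            seen |= frontier
        return False

    # Toy 2-bit counter: value(b1, b0) steps to value + 1 mod 4, started at 00.
    init = lambda s: s == (0, 0)
    trans = lambda s, t: (t[1] + 2 * t[0]) == ((s[1] + 2 * s[0]) + 1) % 4
    print(bounded_ef(init, trans, lambda s: s == (1, 1), n_vars=2, k=3))  # True
    print(bounded_ef(init, trans, lambda s: s == (1, 1), n_vars=2, k=2))  # False
    ```

    The parametric extension the paper studies asks, in effect, for which values of such bounds and parameters the property holds, rather than for a fixed k.
    
    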

  9. Verification of an IGBT Fusing Switch for Over-current Protection of the SNS HVCM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benwell, Andrew; Kemp, Mark; Burkhart, Craig

    2010-06-11

    An IGBT-based over-current protection system has been developed to detect faults and limit the damage caused by faults in high voltage converter modulators. During normal operation, an IGBT enables energy to be transferred from storage capacitors to an H-bridge. When a fault occurs, the over-current protection system detects the fault, limits the fault current, and opens the IGBT to isolate the remaining stored energy from the fault. This paper presents an experimental verification of the over-current protection system under applicable conditions.

  10. Low level vapor verification of monomethyl hydrazine

    NASA Technical Reports Server (NTRS)

    Mehta, Narinder

    1990-01-01

    The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.

  11. Estimation of Ecosystem Parameters of the Community Land Model with DREAM: Evaluation of the Potential for Upscaling Net Ecosystem Exchange

    NASA Astrophysics Data System (ADS)

    Hendricks Franssen, H. J.; Post, H.; Vrugt, J. A.; Fox, A. M.; Baatz, R.; Kumbhar, P.; Vereecken, H.

    2015-12-01

    Estimation of net ecosystem exchange (NEE) by land surface models is strongly affected by uncertain ecosystem parameters and initial conditions. A possible approach is the estimation of plant functional type (PFT) specific parameters for sites with measurement data like NEE, and application of the parameters at other sites with the same PFT and no measurements. This upscaling strategy was evaluated in this work for sites in Germany and France. Ecosystem parameters and initial conditions were estimated with NEE time series of one year in length, or of only one season. The DREAM(zs) algorithm was used for the estimation of parameters and initial conditions. DREAM(zs) is not limited to Gaussian distributions and can condition on large time series of measurement data simultaneously. DREAM(zs) was used in combination with the Community Land Model (CLM) v4.5. Parameter estimates were evaluated by model predictions at the same site for an independent verification period. In addition, the parameter estimates were evaluated at other, independent sites situated >500 km away with the same PFT. The main conclusions are: i) simulations with estimated parameters reproduced the NEE measurement data better in the verification periods, including the annual NEE sum (23% improvement), annual NEE cycle, and average diurnal NEE course (error reduction by a factor of 1.6); ii) estimated parameters based on seasonal NEE data outperformed estimated parameters based on yearly data; iii) those seasonal parameters were also often significantly different from their yearly equivalents; iv) estimated parameters were significantly different if initial conditions were estimated together with the parameters. We conclude that estimated PFT-specific parameters improve land surface model predictions significantly at independent verification sites and for independent verification periods, so that their potential for upscaling is demonstrated.
However, simulation results also indicate that possibly the estimated parameters mask other model errors. This would imply that their application at climatic time scales would not improve model predictions. A central question is whether the integration of many different data streams (e.g., biomass, remotely sensed LAI) could solve the problems indicated here.

  12. Experimental setup for the measurement of induction motor cage currents

    NASA Astrophysics Data System (ADS)

    Bottauscio, Oriano; Chiampi, Mario; Donadio, Lorenzo; Zucca, Mauro

    2005-04-01

    An experimental setup for measurement of the currents flowing in the rotor bars of induction motors during synchronous no-load tests is described in the paper. The experimental verification of the high-frequency phenomena in the rotor cage is fundamental for deeper insight into additional-loss estimation by numerical methods. The attention is mainly focused on the analysis and design of the transducers developed for the cage current measurement.

  13. Collapse of Experimental Colloidal Aging using Record Dynamics

    NASA Astrophysics Data System (ADS)

    Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter

    The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts that dynamics in an aging system progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system. MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions that RD makes about the underlying statistics of these rare events has been elusive. Here the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.
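    The log(t/tw) collapse can be illustrated on synthetic data; the functional form below is the pure t/tw scaling RD predicts, not the experimental MSD:

    ```python
    import math

    def msd(t, t_w, a=0.1):
        """Synthetic aging mean-square displacement with pure t/t_w scaling,
        the form record dynamics predicts (illustrative, not measured data)."""
        return a * math.log(1.0 + t / t_w)

    # Curves taken at three waiting times collapse when sampled at fixed t/t_w:
    ratios = [0.5, 1.0, 5.0, 20.0]
    curves = {t_w: [msd(r * t_w, t_w) for r in ratios]
              for t_w in (10.0, 100.0, 1000.0)}
    spread = max(abs(curves[10.0][i] - curves[1000.0][i])
                 for i in range(len(ratios)))
    print(spread < 1e-12)  # True: identical curves in the scaling variable
    ```

    Plotting measured MSD against t/tw and checking that curves for different tw superpose is exactly the collapse test the abstract describes.
    
    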

  14. Experimental quantum verification in the presence of temporally correlated noise

    NASA Astrophysics Data System (ADS)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion-qubit and inject engineered noise (∝ σz) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution for rapidly and slowly varying noise, respectively. Similarly we find a strong gate set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σz errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σx or σy errors or depolarising noise processes, highlighting the impact of the critical interplay of selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.

  15. Collinear cluster tri-partition: Kinematics constraints and stability of collinearity

    NASA Astrophysics Data System (ADS)

    Holmvall, P.; Köster, U.; Heinz, A.; Nilsson, T.

    2017-01-01

    Background: A new mode of nuclear fission has been proposed by the FOBOS Collaboration, called collinear cluster tri-partition (CCT), and suggests that three heavy fission fragments can be emitted perfectly collinearly in low-energy fission. This claim is based on indirect observations via missing-energy events using the 2v2E method. This proposed CCT seems to be an extraordinary new aspect of nuclear fission. It is surprising that CCT escaped observation for so long given the relatively high reported yield of roughly 0.5% relative to binary fission. These claims call for an independent verification with a different experimental technique. Purpose: Verification experiments based on direct observation of CCT fragments with fission-fragment spectrometers require guidance with respect to the allowed kinetic-energy range, which we present in this paper. Furthermore, we discuss corresponding model calculations which, if CCT is found in such verification experiments, could indicate how the breakups proceed. Since CCT refers to collinear emission, we also study the intrinsic stability of collinearity. Methods: Three different decay models are used that together span the timescales of three-body fission. These models are used to calculate the possible kinetic-energy ranges of CCT fragments by varying fragment mass splits, excitation energies, neutron multiplicities, and scission-point configurations. Calculations are presented for the systems 235U(nth,f) and 252Cf(sf), and the fission fragments previously reported for CCT; namely, isotopes of the elements Ni, Si, Ca, and Sn. In addition, we use semiclassical trajectory calculations with a Monte Carlo method to study the intrinsic stability of collinearity. Results: CCT has a high net Q value but, in a sequential decay, the intermediate steps are energetically and geometrically unfavorable or even forbidden. Moreover, perfect collinearity is extremely unstable, and broken by the slightest perturbation.
Conclusions: According to our results, the central fragment would be very difficult to detect due to its low kinetic energy, raising the question of why other 2v2E experiments could not detect a missing-mass signature corresponding to CCT. Considering the high kinetic energies of the outer fragments reported in our study, direct-observation experiments should be able to observe CCT. Furthermore, we find that a realization of CCT would require an unphysical fine tuning of the initial conditions. Finally, our stability calculations indicate that, due to the pronounced instability of the collinear configuration, a prolate scission configuration does not necessarily lead to collinear emission, nor does equatorial emission necessarily imply an oblate scission configuration. In conclusion, our results enable independent experimental verification and encourage further critical theoretical studies of CCT.

  16. System description: IVY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCune, W.; Shumsky, O.

    2000-02-04

    IVY is a verified theorem prover for first-order logic with equality. It is coded in ACL2, and it makes calls to the theorem prover Otter to search for proofs and to the program MACE to search for countermodels. Verifications of Otter and MACE are not practical because they are coded in C. Instead, Otter and MACE give detailed proofs and models that are checked by verified ACL2 programs. In addition, the initial conversion to clause form is done by verified ACL2 code. The verification is done with respect to finite interpretations.

  17. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
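
    The first-hash/second-hash comparison described above can be sketched as follows (an illustrative sketch only; the serialization, hash choice, and record layout are assumptions, not the patented design):

```python
import hashlib

def portion_hash(rows):
    """Digest a snapshot of a portion of a dynamic database.
    'rows' is any iterable of records; repr() serialization is illustrative."""
    h = hashlib.sha256()
    for row in rows:
        h.update(repr(row).encode("utf-8"))
    return h.hexdigest()

# Digital data stored within the portion at an initial moment in time
first_hash = portion_hash([("id1", "alice"), ("id2", "bob")])

# The same portion at a subsequent moment in time (unchanged here)
second_hash = portion_hash([("id1", "alice"), ("id2", "bob")])

# Comparing the two hashes flags any modification of the stored data
print(first_hash == second_hash)  # True: the portion is unchanged
```

    Any change to the stored records between the two moments in time produces a different digest, so equality of the two hashes verifies the data.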

  18. Hubble Space Telescope high speed photometer orbital verification

    NASA Technical Reports Server (NTRS)

    Richards, Evan E.

    1991-01-01

    The purpose of this report is to provide a summary of the results of the HSP (High Speed Photometer) Orbital Verification (OV) tests and to report conclusions and lessons learned from the initial operations of the HSP. The HSP OV plan covered the activities through fine (phase 3) alignment. This report covers all activities (OV, SV, and SAO) from launch to the completion of phase 3 alignment. Those activities in this period that are not OV tests are described to the extent that they relate to OV activities.

  19. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broome, Scott Thomas; Flint, Gregory Mark; Dewers, Thomas

    2015-11-01

    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including impact prediction.

  20. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
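
    A minimal sketch of the RP transform described above (the dimensions, seed, and 1/sqrt(k) scaling are illustrative assumptions, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 512, 64                      # original and projected dimensions (illustrative)

x = rng.normal(size=d)              # stand-in biometric feature vectors
y = rng.normal(size=d)

# Random matrix with each entry an i.i.d. Gaussian random variable,
# scaled by 1/sqrt(k) so pairwise distances are approximately preserved
R = rng.normal(size=(k, d)) / np.sqrt(k)
xp, yp = R @ x, R @ y               # transformed (protected) templates

# Similarity preservation: distances approximately survive the projection
print(np.linalg.norm(x - y), np.linalg.norm(xp - yp))

# Changeability: re-enrolling simply means drawing a fresh random matrix,
# which yields a new template unlinkable to the old one
R_new = rng.normal(size=(k, d)) / np.sqrt(k)
x_reissued = R_new @ x
```

    The distance preservation follows the Johnson-Lindenstrauss behavior of Gaussian random matrices; privacy stems from the fact that x cannot be uniquely recovered from the lower-dimensional xp without R.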

  1. Character Recognition Method by Time-Frequency Analyses Using Writing Pressure

    NASA Astrophysics Data System (ADS)

    Watanabe, Tatsuhito; Katsura, Seiichiro

    With the development of information and communication technology, personal verification is becoming more and more important. In the future ubiquitous society, terminals handling personal information will require personal verification technology. The signature is one personal verification method; however, a signature contains only a limited number of characters, so it is easily forged, and personal identification from the handwriting alone is difficult. This paper proposes a "haptic pen" that extracts the writing pressure, and presents a character recognition method based on time-frequency analyses. Although the shapes of characters written by different writers are similar, differences appear in the time-frequency domain. As a result, the proposed character recognition can be used for more exact personal identification. The experimental results showed the viability of the proposed method.

  2. EM&V for Energy Efficiency Policies and Initiatives

    EPA Pesticide Factsheets

    Learn how representatives of jurisdictions, companies, and other entities can use evaluation, measurement, and verification (EM&V) in demand-side energy efficiency (EE) investments to achieve intended environmental, energy, and economic goals.

  3. Scenarios for exercising technical approaches to verified nuclear reductions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the New START treaty that was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to establish confidence among domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established.
This paper is intended to provide useful background information for establishing a conceptual approach to a five-year technical program plan for research and development of nuclear arms reduction verification and transparency technologies and procedures.

  4. A mesoscopic reaction rate model for shock initiation of multi-component PBX explosives.

    PubMed

    Liu, Y R; Duan, Z P; Zhang, Z Y; Ou, Z C; Huang, F L

    2016-11-05

    The primary goal of this research is to develop a three-term mesoscopic reaction rate model, consisting of a hot-spot ignition term, a low-pressure slow-burning term, and a high-pressure fast-reaction term, for shock initiation of multi-component Plastic Bonded Explosives (PBX). Based on the DZK hot-spot model for a single-component PBX explosive, the hot-spot ignition term and its reaction rate are obtained through a "mixing rule" over the explosive components; new expressions for both the low-pressure slow-burning term and the high-pressure fast-reaction term are obtained by relating the reaction rate of the multi-component PBX explosive to those of its components, based on the corresponding terms of a mesoscopic reaction rate model. For verification, the new reaction rate model is incorporated into the DYNA2D code to numerically simulate the shock initiation process of the PBXC03 and PBXC10 multi-component PBX explosives, and the computed pressure histories at different Lagrange locations in the explosive are found to be in good agreement with previous experimental data.

  5. Experimental verification of layout physical verification of silicon photonics

    NASA Astrophysics Data System (ADS)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive-index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs by checking both the manufacturability and the reliability of the circuit. However, PV is challenging for PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and waveguide bends in SOI technology. The models show very good agreement with both finite element method (FEM) and finite-difference time-domain (FDTD) solvers. These models avoid time-consuming 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the model parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron-beam lithography and measured. The measurements of the fabricated devices were compared with the derived models and show very good agreement; the match can reach 100% by calibrating certain parameters in the model.

  6. Odysseus's Sailboat Dilemma

    ERIC Educational Resources Information Center

    Wong, Siu-ling; Chun, Ka-wai Cecilia; Mak, Se-yuen

    2007-01-01

    We describe a physics investigation project inspired by one of the adventures of Odysseus in Homer's "Odyssey." The investigation uses the laws of mechanics, vector algebra and a simple way to construct a fan-and-sail-cart for experimental verification.

  7. Resistivity Correction Factor for the Four-Probe Method: Experiment III

    NASA Astrophysics Data System (ADS)

    Yamashita, Masato; Nishii, Toshifumi; Kurihara, Hiroshi; Enjoji, Hideo; Iwata, Atsushi

    1990-04-01

    Experimental verification of the theoretically derived resistivity correction factor F is presented. Factor F is applied to a system consisting of a rectangular parallelepiped sample and a square four-probe array. Resistivity and sheet resistance measurements are made on isotropic graphites and crystalline ITO films. Factor F corrects experimental data and leads to reasonable resistivity and sheet resistance.

  8. Full-Scale Experimental Verification of Soft-Story-Only Retrofits of Wood-Frame Buildings using Hybrid Testing

    Treesearch

    Elaina Jennings; John W. van de Lindt; Ershad Ziaei; Pouria Bahmani; Sangki Park; Xiaoyun Shao; Weichiang Pang; Douglas Rammer; Gary Mochizuki; Mikhail Gershfeld

    2015-01-01

    The FEMA P-807 Guidelines were developed for retrofitting soft-story wood-frame buildings based on existing data, and the method had not been verified through full-scale experimental testing. This article presents two different retrofit designs based directly on the FEMA P-807 Guidelines that were examined at several different seismic intensity levels. The...

  9. Experimental Verification of the Individual Energy Dependencies of the Partial L-Shell Photoionization Cross Sections of Pd and Mo

    NASA Astrophysics Data System (ADS)

    Hönicke, Philipp; Kolbe, Michael; Müller, Matthias; Mantler, Michael; Krämer, Markus; Beckhoff, Burkhard

    2014-10-01

    An experimental method for verifying the individually different energy dependencies of the L1-, L2-, and L3-subshell photoionization cross sections is described. The results obtained for Pd and Mo are well in line with theory regarding both energy dependency and absolute values; they confirm the cross sections calculated by Scofield in the early 1970s and, partially, more recent data by Trzhaskovskaya, Nefedov, and Yarzhemsky. The data also call into question quantitative x-ray spectroscopic results based on the widely used fixed-jump-ratio approximation of the cross sections, which assumes energy-independent ratios. The experiments were carried out using the radiometrically calibrated instrumentation of the Physikalisch-Technische Bundesanstalt at the electron storage ring BESSY II in Berlin; the measured fluorescence intensities are thereby calibrated on an absolute level, in reference to the International System of Units. Experimentally determined fixed fluorescence-line ratios for each subshell are used for a reliable deconvolution of overlapping fluorescence lines. The relevant fundamental parameters of Mo and Pd are also determined experimentally in order to calculate the subshell photoionization cross sections independently of any database.

  10. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    NASA Astrophysics Data System (ADS)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) verify our MLC model in these heterogeneous type geometries that mimic an actual patient geometry for IMRT treatment. The measurements have been done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung and bone equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).

  11. Analytical and Experimental Investigations of Sodium Heat Pipes and Thermal Energy Storage Systems.

    DTIC Science & Technology

    1982-01-01

    Figures include the cylindrical container for the eutectic salt (LiF-MgF2-KF) and a TESC sample. Experimental results have been used to verify the melting point and latent heat of fusion of the eutectic salt, a mixture of fluorides of Mg, Li, and K. A melting or solidification curve will provide experimental verification of the latent heat value and melting point of a given eutectic salt.

  12. Experimental verification of Pyragas-Schöll-Fiedler control.

    PubMed

    von Loewenich, Clemens; Benner, Hartmut; Just, Wolfram

    2010-09-01

    We present an experimental realization of time-delayed feedback control proposed by Schöll and Fiedler. The scheme enables us to stabilize torsion-free periodic orbits in autonomous systems, and to overcome the so-called odd number limitation. The experimental control performance is in quantitative agreement with the bifurcation analysis of simple model systems. The results uncover some general features of the control scheme which are deemed to be relevant for a large class of setups.

  13. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  14. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing move toward digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  15. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
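
    The core of sequential verification, comparing calculations between consecutive code versions, can be sketched as follows (hypothetical variable names and tolerance; not the RELAP5-3D implementation):

```python
def sequential_verify(previous, current, rel_tol=1e-12):
    """Compare a new code version's calculations against the prior
    version's, value by value; any relative difference beyond the
    tolerance signals an unintended change."""
    mismatches = []
    for key in previous:
        p, c = previous[key], current[key]
        denom = max(abs(p), abs(c), 1e-300)  # guard against division by zero
        if abs(p - c) / denom > rel_tol:
            mismatches.append(key)
    return mismatches

# Illustrative results from two consecutive code versions
baseline = {"pressure": 15.513e6, "void_fraction": 0.123456789012345}
new_run  = {"pressure": 15.513e6, "void_fraction": 0.123456789012345}
print(sequential_verify(baseline, new_run))  # []: no unintended changes
```

    A tight tolerance is what makes the method "extremely accurate": intended changes are re-baselined deliberately, while anything else surfaces as a mismatch.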

  16. Development of a process for high capacity-arc heater production of silicon

    NASA Technical Reports Server (NTRS)

    Reed, W. H.; Meyer, T. N.; Fey, M. G.; Harvey, F. J.; Arcella, F. G.

    1978-01-01

    The realization of low-cost electric power from large-area silicon photovoltaic arrays will depend on the development of new methods for large-capacity production of solar-grade (SG) silicon at a cost of less than $10 per kilogram by 1986 (the established Department of Energy goal). The objective of the program is to develop a method to produce SG silicon in large quantities based on the high-temperature sodium reduction of silicon tetrachloride (SiCl4) to yield molten silicon and the coproduct salt vapor (NaCl). Commercial ac electric arc heaters will be utilized to provide a hyper-heated mixture of argon and hydrogen which will furnish the required process energy. The reactor is designed for a nominal silicon flow rate of 45 kg/hr. Analyses and designs have been conducted to evaluate the process and complete the initial design of the experimental verification unit.

  17. Fluorescence correlation spectroscopy: the case of subdiffusion.

    PubMed

    Lubelski, Ariel; Klafter, Joseph

    2009-03-18

    The theory of fluorescence correlation spectroscopy is revisited here for the case of subdiffusing molecules. Subdiffusion is assumed to stem from a continuous-time random walk process with a fat-tailed distribution of waiting times and can therefore be formulated in terms of a fractional diffusion equation (FDE). The FDE plays the central role in developing the fluorescence correlation spectroscopy expressions, analogous to the role played by the simple diffusion equation for regular systems. Due to the nonstationary nature of the continuous-time random walk/FDE, some interesting properties emerge that are amenable to experimental verification and may help in discriminating among subdiffusion mechanisms. In particular, the current approach predicts 1), a strong dependence of correlation functions on the initial time (aging); 2), sensitivity of correlation functions to the averaging procedure, ensemble versus time averaging (ergodicity breaking); and 3), that the basic mean-squared displacement observable depends on how the mean is taken.
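
    A minimal continuous-time random walk of the kind described, with a fat-tailed (Pareto) waiting-time distribution (the exponent, seed, and step counts are illustrative assumptions):

```python
import random

def ctrw_trajectory(n_steps, alpha=0.7, rng=random.Random(1)):
    """Continuous-time random walk with waiting-time density
    psi(t) ~ t^(-1-alpha), 0 < alpha < 1, and unit jumps; the ensemble
    mean-squared displacement then grows subdiffusively, ~ t^alpha."""
    t, x = 0.0, 0
    times, positions = [0.0], [0]
    for _ in range(n_steps):
        # inverse-transform sample of a Pareto-tailed waiting time (>= 1)
        t += (1.0 - rng.random()) ** (-1.0 / alpha)
        x += rng.choice((-1, 1))              # unbiased unit jump
        times.append(t)
        positions.append(x)
    return times, positions

times, positions = ctrw_trajectory(1000)
```

    Because the mean waiting time diverges for alpha < 1, ensemble averages over many such trajectories depend on the starting (aging) time, and time averages along a single trajectory differ from ensemble averages (ergodicity breaking), as the abstract predicts.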

  18. Material Model Evaluation of a Composite Honeycomb Energy Absorber

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Annett, Martin S.; Fasanella, Edwin L.; Polanco, Michael A.

    2012-01-01

    A study was conducted to evaluate four different material models in predicting the dynamic crushing response of solid-element-based models of a composite honeycomb energy absorber, designated the Deployable Energy Absorber (DEA). Dynamic crush tests of three DEA components were simulated using the nonlinear, explicit transient dynamic code LS-DYNA. In addition, a full-scale crash test of an MD-500 helicopter, retrofitted with DEA blocks, was simulated. The four material models used to represent the DEA included: *MAT_CRUSHABLE_FOAM (Mat 63), *MAT_HONEYCOMB (Mat 26), *MAT_SIMPLIFIED_RUBBER/FOAM (Mat 181), and *MAT_TRANSVERSELY_ANISOTROPIC_CRUSHABLE_FOAM (Mat 142). Test-analysis calibration metrics included simple percentage error comparisons of initial peak acceleration, sustained crush stress, and peak compaction acceleration of the DEA components. In addition, the Roadside Safety Verification and Validation Program (RSVVP) was used to assess similarities and differences between the experimental and analytical curves for the full-scale crash test.

  19. Effluent sampling of Scout D and Delta launch vehicle exhausts

    NASA Technical Reports Server (NTRS)

    Hulten, W. C.; Storey, R. W.; Gregory, G. L.; Woods, D. C.; Harris, F. S., Jr.

    1974-01-01

    Characterization of engine-exhaust effluents (hydrogen chloride, aluminum oxide, carbon dioxide, and carbon monoxide) has been attempted by conducting field experiments monitoring the exhaust cloud from a Scout-Algol III vehicle launch and a Delta-Thor vehicle launch. The exhaust cloud particulate size number distribution (total number of particles as a function of particle diameter), mass loading, morphology, and elemental composition have been determined within limitations. The gaseous species in the exhaust cloud have been identified. In addition to the ground-based measurements, instrumented aircraft flights through the low-altitude, stabilized-exhaust cloud provided measurements which identified CO and HCl gases and Al2O3 particles. Measurements of the initial exhaust cloud during formation and downwind at several distances have established sampling techniques which will be used for experimental verification of model predictions of effluent dispersion and fallout from exhaust clouds.

  20. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasianalytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
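
    Verification against an analytical solution, in the sense used above, can be illustrated in the simplest 1D setting (a generic finite-difference diffusion solver; this is not the NUFT code and all parameters are illustrative):

```python
import math

# Explicit finite differences for u_t = D*u_xx on [0, 1] with u(0)=u(1)=0
# and u(x, 0) = sin(pi*x); exact solution: u(x, t) = exp(-D*pi^2*t)*sin(pi*x)
D, nx = 1.0, 51
dx = 1.0 / (nx - 1)
dt = 0.4 * dx * dx / D                  # stable explicit time step (r = 0.4)
u = [math.sin(math.pi * i * dx) for i in range(nx)]

t, t_end = 0.0, 0.05
while t < t_end - 1e-12:
    u = [0.0] + [u[i] + D * dt / dx**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
                 for i in range(1, nx - 1)] + [0.0]
    t += dt

exact = [math.exp(-D * math.pi**2 * t) * math.sin(math.pi * i * dx)
         for i in range(nx)]
max_err = max(abs(a - b) for a, b in zip(u, exact))  # verification metric
```

    Verification passes when max_err is within the truncation error expected for the chosen dx and dt; benchmarking, by contrast, compares one code's results against another's.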

  1. AIR QUALITY FORECAST VERIFICATION USING SATELLITE DATA

    EPA Science Inventory

    NOAA's operational geostationary satellite retrievals of aerosol optical depths (AODs) were used to verify National Weather Service (NWS) experimental (research mode) particulate matter (PM2.5) forecast guidance issued during the summer 2004 International Consortium for Atmosp...

  2. Ac electronic tunneling at optical frequencies

    NASA Technical Reports Server (NTRS)

    Faris, S. M.; Fan, B.; Gustafson, T. K.

    1974-01-01

    Rectification characteristics of non-superconducting metal-barrier-metal junctions deduced from electronic tunneling have been observed experimentally for optical frequency irradiation of the junction. The results provide verification of optical frequency Fermi level modulation and electronic tunneling current modulation.

  3. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+ activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar as in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  4. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A.

    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  5. Circulation of spoof surface plasmon polaritons: Implementation and verification

    NASA Astrophysics Data System (ADS)

    Pan, Junwei; Wang, Jiafu; Qiu, Tianshuo; Pang, Yongqiang; Li, Yongfeng; Zhang, Jieqiu; Qu, Shaobo

    2018-05-01

    In this letter, we present the implementation and experimental verification of a broadband circulator for spoof surface plasmon polaritons (SSPPs). For ease of fabrication, a circulator operating in X band was first designed. Comb-like transmission lines (CL-TLs), a typical SSPP structure, are adopted as the three branches of the Y-junction. To enable broadband coupling of SSPPs, a transition section is added at each end of the CL-TLs. Through such a design, the circulator can operate in the sub-wavelength SSPP mode over a broad band. The simulation results show that the insertion loss is less than 0.5 dB while the isolation and return loss are higher than 20 dB over 9.4-12.0 GHz. A prototype was fabricated and measured. The experimental results are consistent with the simulation results and verify the broadband circulation performance in X band.
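    The dB figures of merit quoted in this abstract (insertion loss, isolation, return loss) follow directly from the junction's S-parameters. The conversion can be sketched as below; the numeric S-parameter values are made up for illustration and are not measurements from the fabricated prototype.

```python
import numpy as np

def circulator_metrics(s21, s31, s11):
    """Convert linear-magnitude S-parameters of a Y-junction circulator
    (port 1 driven) into the usual dB figures of merit."""
    insertion_loss = -20.0 * np.log10(np.abs(s21))  # forward path 1 -> 2
    isolation      = -20.0 * np.log10(np.abs(s31))  # blocked path 1 -> 3
    return_loss    = -20.0 * np.log10(np.abs(s11))  # reflection at port 1
    return insertion_loss, isolation, return_loss

# Illustrative values that would meet the paper's X-band spec
# (IL < 0.5 dB, isolation and RL > 20 dB).
il, iso, rl = circulator_metrics(s21=0.97, s31=0.05, s11=0.08)
```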

  6. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  7. Adapted RF pulse design for SAR reduction in parallel excitation with experimental verification at 9.4 T.

    PubMed

    Wu, Xiaoping; Akgün, Can; Vaughan, J Thomas; Andersen, Peter; Strupp, John; Uğurbil, Kâmil; Van de Moortele, Pierre-François

    2010-07-01

    Parallel excitation holds strong promise to mitigate the impact of large transmit B1 (B1+) distortion at very high magnetic field. Accelerated RF pulses, however, inherently tend to require larger RF peak power, which may result in a substantial increase in Specific Absorption Rate (SAR) in tissues, a constant concern for patient safety at very high field. In this study, we demonstrate an adapted-rate RF pulse design allowing for SAR reduction while preserving excitation target accuracy. Compared with other proposed implementations of adapted-rate RF pulses, our approach is compatible with any k-space trajectory, does not require an analytical expression of the gradient waveform, and can be used for large flip angle excitation. We demonstrate our method with numerical simulations based on electromagnetic modeling, and we include an experimental verification of transmit pattern accuracy on an 8-transmit-channel 9.4 T system.

  8. Low cost solar array project silicon materials task. Development of a process for high capacity arc heater production of silicon for solar arrays

    NASA Technical Reports Server (NTRS)

    Fey, M. G.

    1981-01-01

    The experimental verification system for the production of silicon via the arc heater-sodium reduction of SiCl4 was designed, fabricated, installed, and operated. Each of the attendant subsystems was checked out and operated to ensure performance requirements. These subsystems included: the arc heaters/reactor, cooling water system, gas system, power system, control and instrumentation system, Na injection system, SiCl4 injection system, effluent disposal system, and gas burnoff system. Prior to introducing the reactants (Na and SiCl4) into the arc heater/reactor, a series of gas-only power tests was conducted to establish the operating parameters of the system's three arc heaters. Following the successful completion of the gas-only power tests and the readiness tests of the sodium and SiCl4 injection systems, a shakedown test of the complete experimental verification system was conducted.

  9. Development of automated optical verification technologies for control systems

    NASA Astrophysics Data System (ADS)

    Volegov, Peter L.; Podgornov, Vladimir A.

    1999-08-01

    The report considers optical techniques for automated verification of an object's identity, designed for control systems at nuclear facilities. Results are presented from experimental research and from the development of pattern recognition techniques, carried out under ISTC project number 772, aimed at identifying unique features of a controlled object's surface structure and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of U.S.-Russia lab-to-lab cooperation, including the development of up-to-date systems for nuclear material control and accounting, are examined.

  10. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.
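    As a sketch of how verified critical cells might be used downstream, the snippet below slices a CSV grid into header and data regions from two of the four critical cells (taken here to be the first and last data cells). The exact cell semantics of VeriClick's table model may differ; the grid and coordinates are invented for illustration.

```python
def slice_table(grid, first_data, last_data):
    """Split a CSV grid (list of rows) into column-header, row-header
    and data regions, given the (row, col) coordinates of the first
    and last data cells."""
    (r0, c0), (r1, c1) = first_data, last_data
    col_headers = [row[c0:c1 + 1] for row in grid[:r0]]
    row_headers = [row[:c0] for row in grid[r0:r1 + 1]]
    data        = [row[c0:c1 + 1] for row in grid[r0:r1 + 1]]
    return col_headers, row_headers, data

grid = [
    ["",      "2010", "2011"],
    ["north",   "12",   "15"],
    ["south",    "8",   "11"],
]
ch, rh, data = slice_table(grid, first_data=(1, 1), last_data=(2, 2))
```

Confirming or correcting just these coordinates, rather than editing cell ranges by hand, is what makes the reported speedup plausible.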

  11. A silicon strip detector array for energy verification and quality assurance in heavy ion therapy.

    PubMed

    Debrot, Emily; Newall, Matthew; Guatelli, Susanna; Petasecca, Marco; Matsufuji, Naruhiro; Rosenfeld, Anatoly B

    2018-02-01

    The measurement of depth dose profiles for range and energy verification of heavy ion beams is an important aspect of quality assurance procedures for heavy ion therapy facilities. The steep dose gradients in the Bragg peak region of these profiles require the use of detectors with high spatial resolution. The aim of this work is to characterize a one-dimensional monolithic silicon detector array called the "serial Dose Magnifying Glass" (sDMG) as an independent ion beam energy and range verification system used for quality assurance conducted for ion beams used in heavy ion therapy. The sDMG detector consists of two linear arrays of 128 silicon sensitive volumes, each with an effective size of 2 mm × 50 μm × 100 μm, fabricated on a p-type substrate at a pitch of 200 μm along a single axis of detection. The detector was characterized for beam energy and range verification by measuring its response when irradiated with a 290 MeV/u ¹²C ion broad beam incident along the single axis of the detector embedded in a PMMA phantom. The energy of the ¹²C ion beam incident on the detector and the residual energy of an ion beam incident on the phantom were determined from the measured Bragg peak position in the sDMG. Ad hoc Monte Carlo simulations of the experimental setup were also performed to give further insight into the detector response. The relative response profiles along the single axis measured with the sDMG detector showed good agreement between experiment and simulation, with the position of the Bragg peak determined to fall within 0.2 mm, or 1.1% of the range in the detector, for the two cases. The energy of the beam incident on the detector was found to vary less than 1% between experiment and simulation. The beam energy incident on the phantom was determined to be (280.9 ± 0.8) MeV/u from the experimental and (280.9 ± 0.2) MeV/u from the simulated profiles. These values coincide with the expected energy of 281 MeV/u. 
The sDMG detector response was studied experimentally and characterized using a Monte Carlo simulation. The sDMG detector was found to accurately determine the ¹²C beam energy and is suited for fast energy and range verification quality assurance. It is proposed that the sDMG is also applicable for verification of treatment planning systems that rely on particle range. © 2017 American Association of Physicists in Medicine.
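    A common way to turn a measured Bragg-peak depth into a beam energy is to invert a range-energy power law such as the Bragg-Kleeman rule R = αE^p. The sketch below shows that inversion with illustrative proton-in-water constants; the paper's actual ¹²C calibration (PMMA phantom, measured stopping data) would use different values.

```python
def energy_from_range(range_cm, alpha=0.0022, p=1.77):
    """Invert the Bragg-Kleeman rule R = alpha * E**p to estimate beam
    energy (MeV/u) from a measured Bragg-peak depth in cm.
    alpha and p here are illustrative proton-in-water values, not the
    calibration used for the 12C beam in this record."""
    return (range_cm / alpha) ** (1.0 / p)

# A 16 cm measured range maps back to an energy estimate; the sub-mm
# peak localization quoted above is what bounds the energy uncertainty.
E = energy_from_range(16.0)
```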

  12. A Validation and Code-to-Code Verification of FAST for a Megawatt-Scale Wind Turbine with Aeroelastically Tailored Blades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan

    This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases, for which experimental data from a Siemens 2.3 MW machine have been made available and were in accordance with the International Electrotechnical Commission 61400-13 guidelines, were identified. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. This paper presents a detailed analysis comparing results from FAST with those from BHawC as well as experimental measurements, using statistics including the means and the standard deviations along with the power spectral densities of select turbine parameters and loads. Results indicate a good agreement among the predictions using FAST, BHawC, and experimental measurements. These agreements are discussed in detail in this paper, along with some comments regarding the differences seen in these comparisons relative to the inherent uncertainties in such a model-based analysis.

  13. A Validation and Code-to-Code Verification of FAST for a Megawatt-Scale Wind Turbine with Aeroelastically Tailored Blades

    DOE PAGES

    Guntur, Srinivas; Jonkman, Jason; Sievers, Ryan; ...

    2017-08-29

    This paper presents validation and code-to-code verification of the latest version of the U.S. Department of Energy, National Renewable Energy Laboratory wind turbine aeroelastic engineering simulation tool, FAST v8. A set of 1,141 test cases, for which experimental data from a Siemens 2.3 MW machine have been made available and were in accordance with the International Electrotechnical Commission 61400-13 guidelines, were identified. These conditions were simulated using FAST as well as the Siemens in-house aeroelastic code, BHawC. This paper presents a detailed analysis comparing results from FAST with those from BHawC as well as experimental measurements, using statistics including the means and the standard deviations along with the power spectral densities of select turbine parameters and loads. Results indicate a good agreement among the predictions using FAST, BHawC, and experimental measurements. These agreements are discussed in detail in this paper, along with some comments regarding the differences seen in these comparisons relative to the inherent uncertainties in such a model-based analysis.

  14. Experimental verification of theoretical equations for acoustic radiation force on compressible spherical particles in traveling waves.

    PubMed

    Johnson, Kennita A; Vormohr, Hannah R; Doinikov, Alexander A; Bouakaz, Ayache; Shields, C Wyatt; López, Gabriel P; Dayton, Paul A

    2016-05-01

    Acoustophoresis uses acoustic radiation force to remotely manipulate particles suspended in a host fluid for many scientific, technological, and medical applications, such as acoustic levitation, acoustic coagulation, contrast ultrasound imaging, ultrasound-assisted drug delivery, etc. To estimate the magnitude of acoustic radiation forces, equations derived for an inviscid host fluid are commonly used. However, there are theoretical predictions that, in the case of a traveling wave, viscous effects can dramatically change the magnitude of acoustic radiation forces, which make the equations obtained for an inviscid host fluid invalid for proper estimation of acoustic radiation forces. To date, experimental verification of these predictions has not been published. Experimental measurements of viscous effects on acoustic radiation forces in a traveling wave were conducted using a confocal optical and acoustic system and values were compared with available theories. Our results show that, even in a low-viscosity fluid such as water, the magnitude of acoustic radiation forces is increased manyfold by viscous effects in comparison with what follows from the equations derived for an inviscid fluid.

  15. Experimental verification of theoretical equations for acoustic radiation force on compressible spherical particles in traveling waves

    NASA Astrophysics Data System (ADS)

    Johnson, Kennita A.; Vormohr, Hannah R.; Doinikov, Alexander A.; Bouakaz, Ayache; Shields, C. Wyatt; López, Gabriel P.; Dayton, Paul A.

    2016-05-01

    Acoustophoresis uses acoustic radiation force to remotely manipulate particles suspended in a host fluid for many scientific, technological, and medical applications, such as acoustic levitation, acoustic coagulation, contrast ultrasound imaging, ultrasound-assisted drug delivery, etc. To estimate the magnitude of acoustic radiation forces, equations derived for an inviscid host fluid are commonly used. However, there are theoretical predictions that, in the case of a traveling wave, viscous effects can dramatically change the magnitude of acoustic radiation forces, which make the equations obtained for an inviscid host fluid invalid for proper estimation of acoustic radiation forces. To date, experimental verification of these predictions has not been published. Experimental measurements of viscous effects on acoustic radiation forces in a traveling wave were conducted using a confocal optical and acoustic system and values were compared with available theories. Our results show that, even in a low-viscosity fluid such as water, the magnitude of acoustic radiation forces is increased manyfold by viscous effects in comparison with what follows from the equations derived for an inviscid fluid.

  16. Application of additive laser technologies in the gas turbine blades design process

    NASA Astrophysics Data System (ADS)

    Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.

    2017-11-01

    The emergence of modern innovative technologies requires the development of new design and production processes and the modernization of existing ones. This is especially relevant for designing the high-temperature turbines of gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. This article presents a design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems, by testing a blade prototype fabricated using selective laser melting. The technique was proven during development of the first-stage blade cooling system for the high-pressure turbine. An experimental procedure for verifying the thermal model of blades with convective cooling systems was developed, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat. The technique makes it possible to obtain an experimentally tested blade version and to avoid experimental adjustment after the start of mass production.

  17. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies.

    PubMed

    Caswell, Joseph M; Singh, Manraj; Persinger, Michael A

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings. Copyright © 2016 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  18. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each region composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and the absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
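    The 2D film analysis in this record uses a gamma criterion of 5% dose difference and 3 mm distance to agreement. A minimal 1D global gamma calculation in that spirit is sketched below (brute-force search over reference points); clinical tools use optimized 2D/3D searches with interpolation, and the test profiles here are synthetic.

```python
import numpy as np

def gamma_pass_rate(ref, meas, x_mm, dose_tol=0.05, dta_mm=3.0):
    """Global 1D gamma analysis: for each measured point, search all
    reference points for the minimum of
    sqrt((dose diff / tolerance)^2 + (distance / DTA)^2).
    dose_tol is a fraction of the reference maximum (global criterion)."""
    norm = dose_tol * ref.max()
    passed = 0
    for i, d in enumerate(meas):
        dd = (ref - d) / norm
        dx = (x_mm - x_mm[i]) / dta_mm
        if np.sqrt(dd**2 + dx**2).min() <= 1.0:
            passed += 1
    return passed / len(meas)

x = np.linspace(0.0, 100.0, 201)                       # 0.5 mm grid
ref = np.exp(-0.5 * ((x - 50.0) / 10.0) ** 2)
meas = 1.02 * np.exp(-0.5 * ((x - 51.0) / 10.0) ** 2)  # 2% high, 1 mm shift
rate = gamma_pass_rate(ref, meas, x)
```

A 2%/1 mm discrepancy sits comfortably inside the 5%/3 mm criterion, so nearly every point passes.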

  19. Fire safety experiments on MIR Orbital Station

    NASA Technical Reports Server (NTRS)

    Egorov, S. D.; Belayev, A. YU.; Klimin, L. P.; Voiteshonok, V. S.; Ivanov, A. V.; Semenov, A. V.; Zaitsev, E. N.; Balashov, E. V.; Andreeva, T. V.

    1995-01-01

    The process of heterogeneous combustion of most materials under zero-g without forced motion of air is practically impossible. However, ventilation is required to support astronauts' life and cool equipment. The presence of ventilation flows in station compartments can, upon accidental ignition, cause a fire. An additional but exceedingly important parameter of the fire risk of solid materials under zero-g is the minimum air flow velocity at which the extinction of materials occurs. Therefore, the concept of fire safety can be based on temporarily lowering the intensity of ventilation and even turning it off. Information on the limiting conditions of combustion under natural conditions is needed from both scientific and practical points of view. It will enable us to judge the reliability of results of ground-based investigations and to develop a concept of fire safety for inhabited sealed compartments of space stations, to be provided by means of nontraditional and highly effective methods without either employing large quantities of fire-extinguishing compounds or imposing hard restrictions on the use of polymers. In this connection, an experimental installation was created to study the process of heterogeneous combustion of solid non-metals and to determine the conditions of its extinction under microgravity. This installation was delivered to the orbital station 'Mir', and the cosmonauts Viktorenko and Kondakova performed initial experiments on it in late 1994. The experimental installation consists of a combustion chamber with an electrical system for ignition of samples, a device for cleaning air of combustion products, an air suction unit, air pipes, and a control panel. The whole experiment is controlled by telemetry and recorded with two video cameras located at two different places. Besides the picture, parameters are recorded to determine the velocity of the air flow incoming to the samples, the time points of switching the devices on/off, etc. 
The combustion chamber temperature is also controlled. The main objectives of experiments of this series were as follows: (1) verification of the reliability of the installation in orbital flight; (2) verification of the experimental procedure; and (3) investigation of combustion of two types of materials under microgravity at various velocities of the incoming air flow.

  20. Strength of bolted wood joints with various ratios of member thicknesses

    Treesearch

    Thomas Lee Wilkinson

    1978-01-01

    Procedures have been recommended, such as in the National Design Specification, for the design of bolted joints in wood members where the side members are thicker or thinner than half the main member thickness. However, these recommendations have had no experimental verification up to now. The same is true for joints with other than three members. This study experimentally...

  1. Fatigue '87. Volume 2,

    DTIC Science & Technology

    1987-06-01

    non-propagating cracks should be considered and maximum principal strain amplitude is the controlling parameter. FATIGUE DAMAGE MAPS The preceding...fatigue is strain-controlled and not stress-controlled. The small effect of R-ratio suggested by Figure 2 may simply reflect the high experimental ...present a model (and its experimental verification) describing non-damaging notches in fatigue. EFFECT OF GRAIN SIZE AND TEMPERATURE In this part we shall

  2. Experimental evidence for a new single-event upset (SEU) mode in a CMOS SRAM obtained from model verification

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Lo, R. Y.

    1987-01-01

    Modeling of SEU has been done in a CMOS static RAM containing 1-micron-channel-length transistors fabricated from a p-well epilayer process using both circuit-simulation and numerical-simulation techniques. The modeling results have been experimentally verified with the aid of heavy-ion beams obtained from a three-stage tandem van de Graaff accelerator. Experimental evidence for a novel SEU mode in an ON n-channel device is presented.

  3. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    NASA Astrophysics Data System (ADS)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    When designing a time management algorithm for DVEs, researchers are often made inefficient by the distraction of implementing the trivial but fundamental details of simulation and verification. A platform that already realizes these details is therefore desirable, but to our knowledge none has been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: causal order (CO), receive order (RO), and time stamp order (TSO). The experimental results show that the implementation of the platform costs only a small overhead, and that its efficient performance allows researchers to focus solely on improving their algorithm designs.
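    Of the three services compared on the platform, TSO delivery is the one that reorders messages: events are buffered and released in timestamp order as logical time advances. A minimal sketch of that behavior (not the platform's HLA-conformant interfaces) is:

```python
import heapq

class TSOQueue:
    """Sketch of time-stamp-ordered (TSO) delivery: events are held
    until logical time advances past their timestamp, then released
    smallest-timestamp first (ties broken by arrival order)."""
    def __init__(self):
        self._heap, self._arrival = [], 0

    def receive(self, timestamp, event):
        heapq.heappush(self._heap, (timestamp, self._arrival, event))
        self._arrival += 1

    def deliver_up_to(self, logical_time):
        out = []
        while self._heap and self._heap[0][0] <= logical_time:
            out.append(heapq.heappop(self._heap)[2])
        return out

q = TSOQueue()
q.receive(12.0, "b")              # arrives first but is stamped later
q.receive(7.0, "a")
first = q.deliver_up_to(10.0)     # only "a" is releasable at time 10
```

Receive order (RO) would simply hand events out in arrival order; causal order (CO) sits between the two, which is why a middleware supporting all three makes the trade-offs measurable.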

  4. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.
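    The back-to-back gauge technique described here rests on a simple elastic decomposition: the mean of the two surface strains is the membrane (axial) component and half their difference is the bending component, from which axial force and moment follow for a symmetric section. Below is a sketch with invented section properties; the paper's contribution is extending this tracking through the inelastic range, which this elastic sketch does not cover.

```python
def membrane_and_bending(eps_front, eps_back, E, A, I, c):
    """Decompose back-to-back surface strains into membrane (axial)
    and bending components, then recover axial force N and moment M
    for a symmetric cross section (linear-elastic range only).
    E: modulus (Pa), A: area (m^2), I: second moment (m^4),
    c: half-depth of the section (m)."""
    eps_m = 0.5 * (eps_front + eps_back)   # membrane strain
    eps_b = 0.5 * (eps_front - eps_back)   # bending strain at the surface
    N = E * A * eps_m                      # axial force (N)
    M = E * I * eps_b / c                  # bending moment (N*m)
    return N, M

# Illustrative aluminum bar: 1000 microstrain front, 200 microstrain back.
N, M = membrane_and_bending(1000e-6, 200e-6, E=70e9, A=4e-4, I=1.3e-8, c=0.01)
```

Monitoring the ratio M/N as load increases is how deviations from the specified test loading are flagged.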

  5. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large, high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.

  6. Study of the penetration of a plate made of titanium alloy VT6 with a steel ball

    NASA Astrophysics Data System (ADS)

    Buzyurkin, A. E.

    2018-03-01

    The purpose of this work is the development and verification of mathematical relationships, adapted to the LS-DYNA finite element analysis package, describing the deformation and destruction of a titanium plate in a high-speed collision. Using data from experiments on the interaction of a steel ball with a titanium plate made of VT6 alloy, the available constants needed to describe the behavior of the material with the Johnson-Cook relationships were verified, as were the parameters of the fracture model used in the numerical modeling of the collision process. An analysis of experimental data on the interaction of a spherical impactor with a plate showed that the deformation-hardening data accepted for VT6 alloy in the first approximation of the Johnson-Cook model overpredict the residual velocities of the impactor when it pierces the plate.

  7. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.

  8. Bond graph modeling and experimental verification of a novel scheme for fault diagnosis of rolling element bearings in special operating conditions

    NASA Astrophysics Data System (ADS)

    Mishra, C.; Samantaray, A. K.; Chakraborty, G.

    2016-09-01

    Vibration analysis for diagnosis of faults in rolling element bearings is complicated when the rotor speed is variable or slow. In the former case, the time interval between the fault-induced impact responses in the vibration signal are non-uniform and the signal strength is variable. In the latter case, the fault-induced impact response strength is weak and generally gets buried in the noise, i.e. noise dominates the signal. This article proposes a diagnosis scheme based on a combination of a few signal processing techniques. The proposed scheme initially represents the vibration signal in terms of uniformly resampled angular position of the rotor shaft by using the interpolated instantaneous angular position measurements. Thereafter, intrinsic mode functions (IMFs) are generated through empirical mode decomposition (EMD) of resampled vibration signal which is followed by thresholding of IMFs and signal reconstruction to de-noise the signal and envelope order tracking to diagnose the faults. Data for validating the proposed diagnosis scheme are initially generated from a multi-body simulation model of rolling element bearing which is developed using bond graph approach. This bond graph model includes the ball and cage dynamics, localized fault geometry, contact mechanics, rotor unbalance, and friction and slip effects. The diagnosis scheme is finally validated with experiments performed with the help of a machine fault simulator (MFS) system. Some fault scenarios which could not be experimentally recreated are then generated through simulations and analyzed through the developed diagnosis scheme.
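    The first step of the proposed scheme, re-sampling the vibration signal onto a uniform shaft-angle grid using interpolated angular-position measurements, can be sketched as below. The once-per-rev marks, the quadratic speed ramp, and the order-5 fault signature are synthetic stand-ins for the MFS measurements, and the later EMD/thresholding/envelope stages are omitted.

```python
import numpy as np

def resample_to_angle(t, signal, t_marks, n_per_rev=256):
    """Re-sample a time-domain signal onto a uniform shaft-angle grid.
    t_marks holds the (interpolated) times of successive whole shaft
    revolutions, recovered from the angular-position measurements."""
    revs = np.arange(len(t_marks), dtype=float)
    theta = np.arange(0.0, revs[-1], 1.0 / n_per_rev)  # uniform grid (rev)
    t_theta = np.interp(theta, revs, t_marks)          # time at each angle
    return theta, np.interp(t_theta, t, signal)

# Synthetic run-up: shaft angle 10*t + 5*t**2 revolutions, with a fault
# signature locked to shaft angle at 5 events per revolution.
t = np.linspace(0.0, 1.0, 20001)
sig_t = np.cos(2 * np.pi * 5 * (10 * t + 5 * t**2))
k = np.arange(16.0)
t_marks = (-10 + np.sqrt(100 + 20 * k)) / 10           # solve 5t^2 + 10t = k
theta, sig_a = resample_to_angle(t, sig_t, t_marks)
# After resampling, the spectrum peaks at shaft order 5 despite the
# speed ramp smearing the time-domain spectrum.
```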

  9. Systematic approach to verification and validation: High explosive burn models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph; Scovel, Christina A.

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online, web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code, run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equation-of-state models and material strength models.
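    The scripted workflow described above, reading experiment meta-data from a header block and emitting a simulation input deck, can be sketched in outline. The header format and deck keywords below are invented placeholders, not HED's actual format, and the real scripts drive a hydro code rather than emit three lines of text.

```python
# Hypothetical sketch: parse '# key = value' meta-data lines from the top of a
# gauge-data file and generate a (made-up) hydro-code input deck from them.

def parse_header(lines):
    """Collect '# key = value' pairs; stop at the first data line."""
    meta = {}
    for line in lines:
        if not line.startswith("#"):
            break
        key, sep, value = line[1:].partition("=")
        if sep:
            meta[key.strip()] = value.strip()
    return meta

def make_input_deck(meta):
    """Emit a toy input deck keyed off the header meta-data."""
    return "\n".join([
        f"explosive {meta['HE']}",
        f"impact_velocity {meta['u_impact_km_s']} km/s",
        f"run_time {meta['t_final_us']} us",
    ])

header = [
    "# HE = PBX-9502",
    "# u_impact_km_s = 0.93",
    "# t_final_us = 6.0",
    "0.0 0.0",                      # first line of gauge data
]
deck = make_input_deck(parse_header(header))
```

The design point this illustrates is the one the abstract argues for: once experimental parameters live in a machine-readable header, setting up a simulation of each experiment in a series becomes a loop rather than hand work.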

  10. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams

    NASA Astrophysics Data System (ADS)

    Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.

    2013-08-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. 
The resulting cross-section data sets make it possible to model the absolute level of measured β+ activity induced in the investigated targets to within a few per cent. Moreover, the simulated distal activity fall-off positions, representing the central quantity for treatment monitoring in terms of beam range verification, are found to agree within 0.6 mm with the measurements at different initial beam energies in both homogeneous and heterogeneous targets. Based on work presented at the Third European Workshop on Monte Carlo Treatment Planning (Seville, 15-18 May 2012).

  11. Lightweight Small Arms Technologies

    DTIC Science & Technology

    2006-11-01

    conducted using several methods. Initial measurements were obtained using a strand burner, followed by closed bomb measurements using both pressed... pellets and entire cases. Specialized fixtures were developed to measure primer and booster combustion properties. The final verification of interior

  12. ICE CONTROL - Towards optimizing wind energy production during icing events

    NASA Astrophysics Data System (ADS)

    Dorninger, Manfred; Strauss, Lukas; Serafin, Stefano; Beck, Alexander; Wittmann, Christoph; Weidle, Florian; Meier, Florian; Bourgeois, Saskia; Cattin, René; Burchhart, Thomas; Fink, Martin

    2017-04-01

    Forecasts of wind power production loss caused by icing weather conditions are produced by a chain of physical models. The model chain consists of a numerical weather prediction model, an icing model and a production loss model. Each element of the model chain is affected by significant uncertainty, which can be quantified using targeted observations and a probabilistic forecasting approach. In this contribution, we present preliminary results from the recently launched project ICE CONTROL, an Austrian research initiative on measurements, probabilistic forecasting, and verification of icing on wind turbine blades. ICE CONTROL includes an experimental field phase, consisting of measurement campaigns in a wind park in Rhineland-Palatinate, Germany, in the winters 2016/17 and 2017/18. Instruments deployed during the campaigns consist of a conventional icing detector on the turbine hub and newly devised ice sensors (eologix Sensor System) on the turbine blades, as well as meteorological sensors for wind, temperature, humidity, visibility, and precipitation type and spectra. Liquid water content and spectral characteristics of super-cooled water droplets are measured using a Fog Monitor FM-120. Three cameras document the icing conditions on the instruments and on the blades. Different modelling approaches are used to quantify the components of the model-chain uncertainties. The uncertainty related to the initial conditions of the weather prediction is evaluated using the existing global ensemble prediction system (EPS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). Furthermore, observation system experiments are conducted with the AROME model and its 3D-Var data assimilation to investigate the impact of additional observations (such as Mode-S aircraft data, SCADA data and MSG cloud mask initialization) on the numerical icing forecast. 
The uncertainty related to model formulation is estimated from multi-physics ensembles based on the Weather Research and Forecasting model (WRF) by perturbing parameters in the physical parameterization schemes. In addition, uncertainties of the icing model and of its adaptations to the rotating turbine blade are addressed. The model forecasts combined with the suite of instruments and their measurements make it possible to conduct a step-wise verification of all the components of the model chain - a novel aspect compared to similar ongoing and completed forecasting projects.

  13. Nuclear Energy Experiments to the Center for Global Security and Cooperation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Douglas M.

    2015-06-01

    This is to serve as verification that the Center 6200 experimental pieces supplied to the Technology Training and Demonstration Area within the Center for Global Security and Cooperation are indeed unclassified, with unlimited release, for viewing.

  14. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  15. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  16. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report: describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  17. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report: describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  18. Traveler Phase 1A Joint Review

    NASA Technical Reports Server (NTRS)

    St. John, Clint; Scofield, Jan; Skoog, Mark; Flock, Alex; Williams, Ethan; Guirguis, Luke; Loudon, Kevin; Sutherland, Jeffrey; Lehmann, Richard; Garland, Michael

    2017-01-01

    The briefing contains the preliminary findings and suggestions for improvement of methods used in the development and evaluation of a multi-monitor runtime assurance architecture for autonomous flight vehicles. Initial system design, implementation, verification, and flight testing have been conducted. As yet, detailed data review is incomplete, and flight testing has been limited to initial monitor force fights. Detailed monitor flight evaluations have yet to be performed.

  19. The Inhibition of the Rayleigh-Taylor Instability by Rotation.

    PubMed

    Baldwin, Kyle A; Scase, Matthew M; Hill, Richard J A

    2015-07-01

    It is well-established that the Coriolis force that acts on fluid in a rotating system can act to stabilise otherwise unstable flows. Chandrasekhar considered theoretically the effect of the Coriolis force on the Rayleigh-Taylor instability, which occurs at the interface between a dense fluid lying on top of a lighter fluid under gravity, concluding that rotation alone could not stabilise this system indefinitely. Recent numerical work suggests that rotation may, nevertheless, slow the growth of the instability. Experimental verification of these results using standard techniques is problematic, owing to the practical difficulty in establishing the initial conditions. Here, we present a new experimental technique for studying the Rayleigh-Taylor instability under rotation that side-steps the problems encountered with standard techniques by using a strong magnetic field to destabilize an otherwise stable system. We find that rotation about an axis normal to the interface acts to retard the growth rate of the instability and stabilise long wavelength modes; the scale of the observed structures decreases with increasing rotation rate, asymptoting to a minimum wavelength controlled by viscosity. We present a critical rotation rate, dependent on Atwood number and the aspect ratio of the system, for stabilising the most unstable mode.

  20. The Inhibition of the Rayleigh-Taylor Instability by Rotation

    PubMed Central

    Baldwin, Kyle A.; Scase, Matthew M.; Hill, Richard J. A.

    2015-01-01

    It is well-established that the Coriolis force that acts on fluid in a rotating system can act to stabilise otherwise unstable flows. Chandrasekhar considered theoretically the effect of the Coriolis force on the Rayleigh-Taylor instability, which occurs at the interface between a dense fluid lying on top of a lighter fluid under gravity, concluding that rotation alone could not stabilise this system indefinitely. Recent numerical work suggests that rotation may, nevertheless, slow the growth of the instability. Experimental verification of these results using standard techniques is problematic, owing to the practical difficulty in establishing the initial conditions. Here, we present a new experimental technique for studying the Rayleigh-Taylor instability under rotation that side-steps the problems encountered with standard techniques by using a strong magnetic field to destabilize an otherwise stable system. We find that rotation about an axis normal to the interface acts to retard the growth rate of the instability and stabilise long wavelength modes; the scale of the observed structures decreases with increasing rotation rate, asymptoting to a minimum wavelength controlled by viscosity. We present a critical rotation rate, dependent on Atwood number and the aspect ratio of the system, for stabilising the most unstable mode. PMID:26130005

  1. Experimental comparison and validation of hot-ball method with guarded hot plate method on polyurethane foams

    NASA Astrophysics Data System (ADS)

    Hudec, Ján; Glorieux, Christ; Dieška, Peter; Kubičár, Ľudovít

    2016-07-01

    The hot-ball method is an innovative transient method for measuring thermophysical properties. Its principle is to heat a small ball, incorporated in the measured medium, with constant heating power while simultaneously measuring the ball's temperature response from the moment heating is initiated. The shape of the temperature response depends on the thermophysical properties of the medium in which the sensor is placed. The method is patented by the Institute of Physics, SAS, where the method and sensors based on it are being developed. At the beginning of sensor development we focused on monitoring applications, where relative precision is much more important than accuracy. Meanwhile, sensor quality has improved enough for a new application: absolute measurement of the thermophysical parameters of materials with low thermal conductivity. This paper describes experimental verification and validation of measurement by the hot-ball method. Thanks to cooperation with the Laboratory of Soft Matter and Biophysics of the Catholic University of Leuven in Belgium, the established guarded hot plate method was used as a reference. Details of the measuring setups, a description of the experiments, and the results of the comparison are presented.
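    A minimal sketch of the physics behind such a sensor, assuming the idealized model of a small sphere delivering constant power q into an infinite homogeneous medium: the long-time temperature rise plateaus at ΔT = q / (4π·λ·r_b), so the thermal conductivity λ follows from the measured plateau. The numbers below are illustrative, not taken from the paper.

```python
import math

def conductivity_from_plateau(q_watts, r_b_m, dT_kelvin):
    """Idealized hot-ball relation: lambda = q / (4*pi*r_b*dT)."""
    return q_watts / (4.0 * math.pi * r_b_m * dT_kelvin)

# Illustrative values: 10 mW into a ball of 1 mm radius, 2 K plateau.
lam = conductivity_from_plateau(0.010, 1.0e-3, 2.0)   # ~0.4 W/(m.K)
```

In practice the full transient response is fitted rather than just the plateau, and sensor calibration corrects for the non-ideal geometry; the expression above is only the leading-order relation that makes the method work.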

  2. Mechanisms and dynamics of protonation and lithiation of ferrocene.

    PubMed

    Sharma, Nishant; Ajay, Jayanth K; Venkatasubbaiah, Krishnan; Lourderaj, Upakarasamy

    2015-09-14

    Although protonation is the simplest electrophilic substitution reaction of ferrocene, experimental verification of its mechanism has proved difficult. In the work reported here, ab initio chemical dynamics simulations were performed at the B3LYP/DZVP level of theory to understand the atomic-level mechanisms of protonation and lithiation of ferrocene. Protonation of ferrocene resulted in the agostic and metal-protonated forms. Trajectory calculations revealed that protonation of ferrocene occurs by exo and endo mechanisms, with exo being the major path. H(+) was found to be mobile and hopped from the Cp ring to the metal center and vice versa after the initial attack on ferrocene, with the metal complex having a shorter lifetime. These results remove the ambiguity surrounding the mechanism proposed in earlier experimental and computational studies. Lithiation of ferrocene resulted in the formation of cation-π and metal-lithiated complexes. As with protonation, trajectory results revealed that both exo and endo paths were followed, with the exo path being the major one. In addition, lithiated ferrocene exhibited planetary motion. The major (exo) path followed in the protonation and lithiation of ferrocene is consistent with observations in earlier experimental studies of other hard electrophiles.

  3. Global scale predictability of floods

    NASA Astrophysics Data System (ADS)

    Weerts, Albrecht; Gijsbers, Peter; Sperna Weiland, Frederiek

    2016-04-01

    Flood (and storm surge) forecasting at the continental and global scale has only become possible in recent years (Emmerton et al., 2016; Verlaan et al., 2015) due to the availability of meteorological forecasts, global-scale precipitation products, and global-scale hydrologic and hydrodynamic models. Deltares has set up GLOFFIS, a research-oriented multi-model operational flood forecasting system based on Delft-FEWS, in an open experimental ICT facility called Id-Lab. In GLOFFIS both the W3RA and PCRGLOB-WB models are run in ensemble mode using GEFS and ECMWF-EPS (latency 2 days). GLOFFIS will be used for experiments on the predictability of floods (and droughts) and its dependency on initial state estimation, meteorological forcing, and the hydrologic model used. Here we present initial results of verification of the ensemble flood forecasts derived with the GLOFFIS system. Emmerton, R., Stephens, L., Pappenberger, F., Pagano, T., Weerts, A., Wood, A., Salamon, P., Brown, J., Hjerdt, N., Donnelly, C., Cloke, H.: Continental and Global Scale Flood Forecasting Systems, WIREs Water (accepted), 2016. Verlaan, M., De Kleermaeker, S., Buckman, L.: GLOSSIS: Global storm surge forecasting and information system, Australasian Coasts & Ports Conference, 15-18 September 2015, Auckland, New Zealand.
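    A standard ingredient of such ensemble verification is the Brier score for threshold-exceedance events. The sketch below uses invented toy data (a 51-member discharge ensemble), not GLOFFIS output, and shows only one of the many verification measures a study like this would apply.

```python
import numpy as np

def brier_score(ens, obs, threshold):
    """Brier score for the event 'value exceeds threshold'.
    ens: (n_cases, n_members) ensemble; obs: (n_cases,) observations."""
    p = (ens > threshold).mean(axis=1)      # forecast probability per case
    o = (obs > threshold).astype(float)     # binary outcome per case
    return float(np.mean((p - o) ** 2))

rng = np.random.default_rng(1)
ens = rng.normal(100.0, 20.0, size=(500, 51))   # toy discharge ensemble
obs = rng.normal(100.0, 20.0, size=500)         # toy observations
bs = brier_score(ens, obs, threshold=130.0)     # 0 is perfect; lower is better
```

The same exceedance probabilities feed reliability diagrams and skill scores against a climatological reference, which is how the dependence on initial state, forcing, and model choice mentioned above would be compared.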

  4. Improvement in thrust force estimation of solenoid valve considering minor hysteresis loop

    NASA Astrophysics Data System (ADS)

    Yoon, Myung-Hwan; Choi, Yun-Yong; Hong, Jung-Pyo

    2017-05-01

    The solenoid valve is a key hydraulic actuator for shift quality in automatic transmissions. For ideal control, the clutch pressure should follow the same form as the input current. However, a gap between pressure and current can occur, which introduces a delay in the transmission and a decrease in shift quality. This problem is caused by hysteresis: as an ascending or descending magnetic field is applied to the solenoid, different thrust forces are generated. This paper suggests a method of calculating the thrust force that takes the hysteresis phenomenon into account, so that an accurate force can be obtained. Such hysteresis occurs in ferromagnetic materials; moreover, the hysteresis phenomenon includes minor hysteresis loops that begin on the initial magnetization curve and are generated by a DC-biased field density. As the core of the solenoid is ferromagnetic, a more accurate thrust force is obtained by applying the minor hysteresis loop than by considering only the initial magnetization curve. An analytical background and a detailed explanation of measuring the minor hysteresis loop are presented. Furthermore, experimental results and finite element analysis results are compared for verification.

  5. Effects of magnetization on fusion product trapping and secondary neutron spectra

    NASA Astrophysics Data System (ADS)

    Knapp, P. F.; Schmit, P. F.; Hansen, S. B.; Gomez, M. R.; Hahn, K. D.; Sinars, D. B.; Peterson, K. J.; Slutz, S. A.; Sefkow, A. B.; Awe, T. J.; Harding, E.; Jennings, C. A.; Desjarlais, M. P.; Chandler, G. A.; Cooper, G. W.; Cuneo, M. E.; Geissel, M.; Harvey-Thompson, A. J.; Porter, J. L.; Rochau, G. A.; Rovang, D. C.; Ruiz, C. L.; Savage, M. E.; Smith, I. C.; Stygar, W. A.; Herrmann, M. C.

    2015-05-01

    By magnetizing the fusion fuel in inertial confinement fusion (ICF) systems, the required stagnation pressure and density can be relaxed dramatically. This happens because the magnetic field insulates the hot fuel from the cold pusher and traps the charged fusion burn products. This trapping allows the burn products to deposit their energy in the fuel, facilitating plasma self-heating. Here, we report on a comprehensive theory of this trapping in a cylindrical DD plasma magnetized with a purely axial magnetic field. Using this theory, we are able to show that the secondary fusion reactions can be used to infer the magnetic field-radius product, BR, during fusion burn. This parameter, not ρR, is the primary confinement parameter in magnetized ICF. Using this method, we analyze data from recent Magnetized Liner Inertial Fusion experiments conducted on the Z machine at Sandia National Laboratories. We show that in these experiments BR ≈ 0.34 (+0.14/−0.06) MG·cm, a ~14× increase in BR from the initial value, confirming that the DD-fusion tritons are magnetized at stagnation. This is the first experimental verification of charged burn product magnetization facilitated by compression of an initial seed magnetic flux.

  6. Computed torque control of a free-flying cooperating-arm robot

    NASA Technical Reports Server (NTRS)

    Koningstein, Ross; Ullman, Marc; Cannon, Robert H., Jr.

    1989-01-01

    The unified approach to solving free-floating space robot manipulator end-point control problems is presented using a control formulation based on an extension of computed torque. Once the desired end-point accelerations have been specified, the kinematic equations are used with momentum conservation equations to solve for the joint accelerations in any of the robot's possible configurations: fixed base or free-flying with open/closed chain grasp. The joint accelerations can then be used to calculate the arm control torques and internal forces using a recursive order N algorithm. Initial experimental verification of these techniques has been performed using a laboratory model of a two-armed space robot. This fully autonomous spacecraft system experiences the drag-free, zero G characteristics of space in two dimensions through the use of an air cushion support system. Results of these initial experiments are included which validate the correctness of the proposed methodology. The further problem of control in the large where not only the manipulator tip positions but the entire system consisting of base and arms must be controlled is also presented. The availability of a physical testbed has brought a keener insight into the subtleties of the problem at hand.

  7. 40 CFR 1065.325 - Intake-flow calibration.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Section 1065.325 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Flow-Related Measurements § 1065.325 Intake-flow calibration. (a) Calibrate intake-air flow meters upon initial installation. Follow the...

  8. TETAM Model Verification Study. Volume I. Representation of Intervisibility, Initial Comparisons

    DTIC Science & Technology

    1976-02-01

    simulation models in terms of firings, engagements, and losses between tank and antitank as compared with the field data collected during the free play battles of Field Experiment 11.8 are found in Volume III. (Author)

  9. 75 FR 55799 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-14

    ... against polyomaviruses. Development Status: Pre-clinical. Inventors: Christopher B. Buck and Diana V... can serve as positive controls in chemokine receptor studies designed to identify novel... chemokine studies. Experimental verification of response to CXC family chemokines: The scientists have...

  10. National Centers for Environmental Prediction

    Science.gov Websites


  11. Motional timescale predictions by molecular dynamics simulations: case study using proline and hydroxyproline sidechain dynamics.

    PubMed

    Aliev, Abil E; Kulke, Martin; Khaneja, Harmeet S; Chudasama, Vijay; Sheppard, Tom D; Lanigan, Rachel M

    2014-02-01

    We propose a new approach to force field optimization which aims at reproducing dynamics characteristics in biomolecular MD simulations, in addition to improved prediction of motionally averaged structural properties available from experiment. As the source of experimental data for the dynamics fittings, we use 13C NMR spin-lattice relaxation times T1 of backbone and sidechain carbons, which allow determination of the correlation times of both overall molecular and intramolecular motions. For structural fittings, we use motionally averaged experimental values of NMR J couplings. The proline residue and its derivative 4-hydroxyproline, with relatively simple cyclic structure and sidechain dynamics, were chosen for the assessment of the new approach in this work. Initially, grid-search and simplexed MD simulations identified a large number of parameter sets that fit experimental J couplings equally well. Using the Arrhenius-type relationship between the force constant and the correlation time, the available MD data for a series of parameter sets were analyzed to predict the value of the force constant that best reproduces the experimental timescale of the sidechain dynamics. Verification of the new force field (termed AMBER99SB-ILDNP) against NMR J couplings and correlation times showed consistent and significant improvements over the original force field in reproducing both structural and dynamics properties. The results suggest that matching experimental timescales of motions together with motionally averaged characteristics is a valid approach for force field parameter optimization. Such a comprehensive approach is not restricted to cyclic residues and can be extended to other amino acid residues, as well as to the backbone. Copyright © 2013 Wiley Periodicals, Inc.
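    The Arrhenius-type interpolation step described above can be sketched as follows. All numbers (candidate force constants and the correlation times they produced) are invented for illustration; only the functional form τ = A·exp(B·k), linear in ln τ versus k, is taken from the abstract's description.

```python
import numpy as np

# Hypothetical grid-search results: torsional force constants k tried in the
# simulations and the sidechain correlation time tau (ps) each one produced.
k_grid = np.array([0.5, 1.0, 1.5, 2.0])
tau = np.array([18.0, 30.0, 52.0, 88.0])

# Arrhenius-type form tau = A * exp(B * k)  =>  ln(tau) is linear in k.
B, lnA = np.polyfit(k_grid, np.log(tau), 1)

# Invert the fit to predict the force constant that reproduces an
# (illustrative) experimental correlation time from 13C T1 relaxation.
tau_exp = 40.0
k_best = (np.log(tau_exp) - lnA) / B
```

With these toy numbers the predicted k_best falls between the grid points at 1.0 and 1.5, which is the practical payoff: the fit interpolates between expensive MD runs instead of requiring a finer grid.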

  12. Simple Experimental Verification of the Relation between the Band-Gap Energy and the Energy of Photons Emitted by LEDs

    ERIC Educational Resources Information Center

    Precker, Jurgen W.

    2007-01-01

    The wavelength of the light emitted by a light-emitting diode (LED) is intimately related to the band-gap energy of the semiconductor from which the LED is made. We experimentally estimate the band-gap energies of several types of LEDs, and compare them with the energies of the emitted light, which ranges from infrared to white. In spite of…
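    The relation being tested is E_g ≈ h·c/λ; in convenient units the photon energy in electron-volts is roughly 1239.84 divided by the wavelength in nanometres. A small sketch, with typical LED wavelengths chosen for illustration only:

```python
# Photon energy from emission wavelength:
# E[eV] = h*c / lambda = 1239.84 eV*nm / wavelength[nm].

def photon_energy_ev(wavelength_nm):
    return 1239.84 / wavelength_nm

# Approximate, illustrative emission wavelengths for common LED colours.
energies = {name: photon_energy_ev(wl)
            for name, wl in [("infrared", 940.0), ("red", 625.0),
                             ("green", 525.0), ("blue", 470.0)]}
```

Comparing these photon energies with band-gap estimates from the diodes' I-V characteristics is the experiment's core: shorter-wavelength LEDs should require semiconductors with larger band gaps.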

  13. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman radiation-reaction forces, holds, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his former graduate student T. Mahood, he also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near- to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  14. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, the standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide comparable information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper provides a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd(TM) cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results will be discussed.

  15. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently of each other. These processes are textured urban terrain reconstruction and road verification. The first process performs a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect) and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of geometrically and topologically inconsistent objects. Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
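
    The mapping of each method's two distributions onto the three-state frame {correct, incorrect, unknown} can be sketched as a Dempster-Shafer discounting step, with the masses of several methods then fused by Dempster's rule. A minimal sketch under an assumed discounting scheme (the paper's exact mapping is not reproduced here; the function names and example probabilities are illustrative):

```python
def discount(p_correct: float, p_applicable: float) -> dict:
    """Map a method's road-state and model-state distributions to
    masses over {correct, incorrect, unknown}; 'unknown' absorbs the
    mass withheld when the road model may not apply."""
    return {
        "correct": p_correct * p_applicable,
        "incorrect": (1.0 - p_correct) * p_applicable,
        "unknown": 1.0 - p_applicable,   # mass on the whole frame
    }

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule on the frame {correct, incorrect}, where
    'unknown' plays the role of the full frame Theta."""
    conflict = (m1["correct"] * m2["incorrect"] +
                m1["incorrect"] * m2["correct"])
    k = 1.0 - conflict                    # normalization constant
    return {
        "correct": (m1["correct"] * m2["correct"] +
                    m1["correct"] * m2["unknown"] +
                    m1["unknown"] * m2["correct"]) / k,
        "incorrect": (m1["incorrect"] * m2["incorrect"] +
                      m1["incorrect"] * m2["unknown"] +
                      m1["unknown"] * m2["incorrect"]) / k,
        "unknown": m1["unknown"] * m2["unknown"] / k,
    }

# Two hypothetical verification methods voting on one road object.
m = combine(discount(0.9, 0.8), discount(0.7, 0.5))
```

    A method whose road model does not apply contributes no evidence either way: its mass sits on "unknown" and leaves the combined belief unchanged.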

  16. Satellite Power System (SPS) concept definition study (exhibit C)

    NASA Technical Reports Server (NTRS)

    Haley, G. M.

    1979-01-01

    The major outputs of the study are the constructability studies which resulted in the definition of the concepts for satellite, rectenna, and satellite construction base construction. Transportation analyses resulted in definition of heavy-lift launch vehicle, electric orbit transfer vehicle, personnel orbit transfer vehicle, and intra-orbit transfer vehicle as well as overall operations related to transportation systems. The experiment/verification program definition resulted in the definition of elements for the Ground-Based Experimental Research and Key Technology plans. These studies also resulted in conceptual approaches for early space technology verification. The cost analysis defined the overall program and cost data for all program elements and phases.

  17. Palmprint verification using Lagrangian decomposition and invariant interest points

    NASA Astrophysics Data System (ADS)

    Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.

    2011-06-01

    This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. SIFT is employed to extract invariant feature points from the region of interest (ROI), which is extracted from the wide palm texture at the preprocessing stage. Finally, identity is established by finding the permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features; the permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.
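
    The matching step seeks the permutation that minimizes the total distance between corresponding keypoints of the reference and probe graphs. A toy sketch, solving the assignment exactly by brute force over permutations rather than by the paper's Lagrangian relaxation (which is what makes realistic keypoint counts tractable); the descriptor vectors are hypothetical:

```python
from itertools import permutations

def best_permutation(ref, probe):
    """Brute-force the permutation minimizing the total keypoint
    distance between two small, equal-sized descriptor sets.
    Feasible only for tiny n; stands in for Lagrangian relaxation."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    n = len(ref)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        cost = sum(dist(ref[i], probe[perm[i]]) for i in range(n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_perm, best_cost

# Probe keypoints are a shuffled copy of the reference keypoints,
# so a zero-cost assignment exists.
perm, cost = best_permutation([(0, 0), (1, 0), (0, 1)],
                              [(1, 0), (0, 1), (0, 0)])
```

    A verification decision would then threshold the minimized cost: a genuine pair yields a small residual distance, an impostor pair a large one.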

  18. Arithmetic Circuit Verification Based on Symbolic Computer Algebra

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo

    This paper presents a formal approach to verifying arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit descriptions can be verified effectively by polynomial reduction techniques using Gröbner bases. In this paper, we describe how symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantages of the proposed approach are demonstrated through experimental verification of arithmetic circuits such as a multiply-accumulator and an FIR filter. The results show that the proposed approach is promising for verifying practical arithmetic circuits.
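
    As a toy illustration of the word-level equivalence such a method certifies, consider a gate-level half adder checked against its arithmetic specification. The sketch below verifies the property by exhaustive simulation, not by the Gröbner-basis reduction of the specification polynomial that the paper performs; for one-bit inputs the two checks agree:

```python
def half_adder(a: int, b: int):
    """Gate-level netlist: XOR produces the sum bit, AND the carry."""
    s = a ^ b
    c = a & b
    return s, c

# Word-level specification the polynomial method would reduce to zero:
# 2*c + s == a + b for all Boolean inputs.
ok = all(2 * half_adder(a, b)[1] + half_adder(a, b)[0] == a + b
         for a in (0, 1) for b in (0, 1))
```

    Exhaustive checking scales exponentially in the input width, which is precisely why the paper's symbolic reduction approach is attractive for multipliers and filters.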

  19. Hydrogen and Storage Initiatives at the NASA JSC White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Maes, Miguel; Woods, Stephen S.

    2006-01-01

    NASA WSTF Hydrogen Activities: a) Aerospace Test; b) System Certification and Verification; c) Component, System, and Facility Hazard Assessment; d) Safety Training. Technical Transfer: a) Development of Voluntary Consensus Standards and Practices; b) Support of National Hydrogen Infrastructure Development.

  20. Transit Reliability Information Program : Reliability Verification Demonstration Plan for Rapid Rail Vehicles

    DOT National Transportation Integrated Search

    1981-08-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national Data Ban...

  1. The American College of Surgeons Children's Surgery Verification and Quality Improvement Program: implications for anesthesiologists.

    PubMed

    Houck, Constance S; Deshpande, Jayant K; Flick, Randall P

    2017-06-01

    The Task Force for Children's Surgical Care, an ad-hoc multidisciplinary group of invited leaders in pediatric perioperative medicine, was assembled in May 2012 to consider approaches to optimize delivery of children's surgical care in today's competitive national healthcare environment. Over the subsequent 3 years, with support from the American College of Surgeons (ACS) and Children's Hospital Association (CHA), the group established principles regarding perioperative resource standards, quality improvement and safety processes, data collection, and verification that were used to develop an ACS-sponsored Children's Surgery Verification and Quality Improvement Program (ACS CSV). The voluntary ACS CSV was officially launched in January 2017 and more than 125 pediatric surgical programs have expressed interest in verification. ACS CSV-verified programs have specific requirements for pediatric anesthesia leadership, resources, and the availability of pediatric anesthesiologists or anesthesiologists with pediatric expertise to care for infants and young children. The present review outlines the history of the ACS CSV, key elements of the program, and the standards specific to pediatric anesthesiology. As with the pediatric trauma programs initiated more than 40 years ago, this program has the potential to significantly improve surgical care for infants and children in the United States and Canada.

  2. Nuclear magnetic resonance diffusion pore imaging: Experimental phase detection by double diffusion encoding

    NASA Astrophysics Data System (ADS)

    Demberg, Kerstin; Laun, Frederik Bernd; Windschuh, Johannes; Umathum, Reiner; Bachert, Peter; Kuder, Tristan Anselm

    2017-02-01

    Diffusion pore imaging is an extension of diffusion-weighted nuclear magnetic resonance imaging enabling the direct measurement of the shape of arbitrarily formed, closed pores by probing diffusion restrictions using the motion of spin-bearing particles. Examples of such pores comprise cells in biological tissue or oil containing cavities in porous rocks. All pores contained in the measurement volume contribute to one reconstructed image, which reduces the problem of vanishing signal at increasing resolution present in conventional magnetic resonance imaging. It has been previously experimentally demonstrated that pore imaging using a combination of a long and a narrow magnetic field gradient pulse is feasible. In this work, an experimental verification is presented showing that pores can be imaged using short gradient pulses only. Experiments were carried out using hyperpolarized xenon gas in well-defined pores. The phase required for pore image reconstruction was retrieved from double diffusion encoded (DDE) measurements, while the magnitude could either be obtained from DDE signals or classical diffusion measurements with single encoding. The occurring image artifacts caused by restrictions of the gradient system, insufficient diffusion time, and by the phase reconstruction approach were investigated. Employing short gradient pulses only is advantageous compared to the initial long-narrow approach due to a more flexible sequence design when omitting the long gradient and due to faster convergence to the diffusion long-time limit, which may enable application to larger pores.

  3. Mathematical Capture of Human Crowd Behavioral Data for Computational Model Building, Verification, and Validation

    DTIC Science & Technology

    2011-03-21

    throughout the experimental runs. Reliable and validated measures of anxiety (Spielberger, 1983), as well as custom-constructed questionnaires about...Crowd modeling and simulation technologies. Transactions on Modeling and Computer Simulation, 20(4). Spielberger, C. D. (1983

  4. Verification of passive cooling techniques in the Super-FRS beam collimators

    NASA Astrophysics Data System (ADS)

    Douma, C. A.; Gellanki, J.; Najafi, M. A.; Moeini, H.; Kalantar-Nayestanaki, N.; Rigollet, C.; Kuiken, O. J.; Lindemulder, M. F.; Smit, H. A. J.; Timersma, H. J.

    2016-08-01

    The Super FRagment Separator (Super-FRS) at the FAIR facility will be the largest in-flight separator of heavy ions in the world. One of the essential steps in the separation procedure is to stop the unwanted ions with beam collimators. In one of the most common situations, the heavy ions are produced by a fission reaction of a primary 238U-beam (1.5 GeV/u) hitting a 12C target (2.5 g/cm2). In this situation, some of the produced ions are highly charged states of 238U. These ions can reach the collimators with energies of up to 1.3 GeV/u and a power of up to 500 W. Under these conditions, a cooling system is required to prevent damage to the collimators and to the corresponding electronics. Due to the highly radioactive environment, both the collimators and the cooling system must be suitable for robot handling. Therefore, an active cooling system is undesirable because of the increased possibility of malfunctioning and other complications. By using thermal simulations (performed with NX9 of Siemens PLM), the possibility of passive cooling is explored. The validity of these simulations is tested by independent comparison with other simulation programs and by experimental verification. The experimental verification is still under analysis, but preliminary results indicate that the explored passive cooling option provides sufficient temperature reduction.

  5. Global Characterization of Protein Altering Mutations in Prostate Cancer

    DTIC Science & Technology

    2011-08-01

    prevalence of candidate cancer genes observed here in prostate cancer. (3) Perform integrative analyses of somatic mutation with gene expression and copy...analyses of somatic mutation with gene expression and copy number change data collected on the same samples. Body This is a “synergy” project between...However, to perform initial verification/validation studies, we have evaluated the mutation calls for several genes discovered initially by the

  6. Dual-Tree Complex Wavelet Transform and Image Block Residual-Based Multi-Focus Image Fusion in Visual Sensor Networks

    PubMed Central

    Yang, Yong; Tong, Song; Huang, Shuying; Lin, Pan

    2014-01-01

    This paper presents a novel framework for the fusion of multi-focus images explicitly designed for visual sensor network (VSN) environments. Multi-scale based fusion methods can often obtain fused images with good visual effect. However, because of the defects of the fusion rules, it is almost impossible to completely avoid the loss of useful information in the thus obtained fused images. The proposed fusion scheme can be divided into two processes: initial fusion and final fusion. The initial fusion is based on a dual-tree complex wavelet transform (DTCWT). The Sum-Modified-Laplacian (SML)-based visual contrast and SML are employed to fuse the low- and high-frequency coefficients, respectively, and an initial composited image is obtained. In the final fusion process, the image block residuals technique and consistency verification are used to detect the focusing areas and then a decision map is obtained. The map is used to guide how to achieve the final fused image. The performance of the proposed method was extensively tested on a number of multi-focus images, including no-referenced images, referenced images, and images with different noise levels. The experimental results clearly indicate that the proposed method outperformed various state-of-the-art fusion methods, in terms of both subjective and objective evaluations, and is more suitable for VSNs. PMID:25587878
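
    The Sum-Modified-Laplacian focus measure driving the coefficient selection can be sketched as follows, assuming a unit step and a square summation window (a simplified stand-in for the paper's implementation; `np.roll` wraps at the image borders, which a production version would handle explicitly):

```python
import numpy as np

def modified_laplacian(img: np.ndarray) -> np.ndarray:
    """Per-pixel modified Laplacian: |2I - left - right| + |2I - up - down|."""
    img = img.astype(float)
    ml = np.zeros_like(img)
    ml[1:-1, :] += np.abs(2 * img[1:-1, :] - img[:-2, :] - img[2:, :])
    ml[:, 1:-1] += np.abs(2 * img[:, 1:-1] - img[:, :-2] - img[:, 2:])
    return ml

def sml(img: np.ndarray, r: int = 1) -> np.ndarray:
    """Sum of squared modified Laplacian over a (2r+1)x(2r+1) window."""
    ml2 = modified_laplacian(img) ** 2
    out = np.zeros_like(ml2)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(ml2, dy, axis=0), dx, axis=1)
    return out

# Synthetic sources: a defocused (flat) patch vs. an in-focus (textured) one.
flat = np.zeros((8, 8))
textured = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)

# Coefficient selection: keep the source with the larger focus measure.
fused = np.where(sml(textured) >= sml(flat), textured, flat)
```

    A sharp region produces large second differences and hence a large SML, so the selection map favors the in-focus source at every pixel.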

  7. Dual-tree complex wavelet transform and image block residual-based multi-focus image fusion in visual sensor networks.

    PubMed

    Yang, Yong; Tong, Song; Huang, Shuying; Lin, Pan

    2014-11-26

    This paper presents a novel framework for the fusion of multi-focus images explicitly designed for visual sensor network (VSN) environments. Multi-scale based fusion methods can often obtain fused images with good visual effect. However, because of the defects of the fusion rules, it is almost impossible to completely avoid the loss of useful information in the thus obtained fused images. The proposed fusion scheme can be divided into two processes: initial fusion and final fusion. The initial fusion is based on a dual-tree complex wavelet transform (DTCWT). The Sum-Modified-Laplacian (SML)-based visual contrast and SML are employed to fuse the low- and high-frequency coefficients, respectively, and an initial composited image is obtained. In the final fusion process, the image block residuals technique and consistency verification are used to detect the focusing areas and then a decision map is obtained. The map is used to guide how to achieve the final fused image. The performance of the proposed method was extensively tested on a number of multi-focus images, including no-referenced images, referenced images, and images with different noise levels. The experimental results clearly indicate that the proposed method outperformed various state-of-the-art fusion methods, in terms of both subjective and objective evaluations, and is more suitable for VSNs.

  8. SU-E-T-287: Robustness Study of Passive-Scattering Proton Therapy in Lung: Is Range and Setup Uncertainty Calculation On the Initial CT Enough to Predict the Plan Robustness?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, X; Dormer, J; Kenton, O

    Purpose: Plan robustness of passive-scattering proton therapy treatment of lung tumors has been studied previously using combined uncertainties of 3.5% in CT number and 3 mm geometric shifts. In this study, we investigate whether this method is sufficient to predict proton plan robustness by comparing to plans performed on weekly verification CT scans. Methods: Ten lung cancer patients treated with passive-scattering proton therapy were randomly selected. All plans were prescribed 6660 cGy in 37 fractions. Each initial plan was calculated using +/−3.5% range and +/−0.3 cm setup uncertainty in the x, y, and z directions in the Eclipse TPS (Method-A). Throughout the treatment course, patients received weekly verification CT scans to assess the daily treatment variation (Method-B). After contours and imaging registrations were verified by the physician, the initial plan with the same beamline and compensator was mapped onto the verification CT. Dose volume histograms (DVH) were evaluated for the robustness study. Results: Differences are observed between Methods A and B in terms of iCTV coverage and lung dose. Method-A shows all iCTV D95 values within +/−1%, while 20% of cases fall outside the +/−1% range in Method-B. In the worst-case scenario (WCS), the iCTV D95 is reduced by 2.5%. All lung V5 and V20 values are within +/−5% in Method-A, while 15% of V5 and 10% of V20 values fall outside +/−5% in Method-B. In the WCS, lung V5 increased by 15% and V20 increased by 9%. Methods A and B show good agreement with regard to cord maximum and esophagus mean dose. Conclusion: This study suggests that using a range and setup uncertainty calculation (+/−3.5% and +/−3 mm) may not be sufficient to predict the WCS. In the absence of regular verification scans, expanding the conventional uncertainty parameters (e.g., to +/−3.5% and +/−4 mm) may be needed to better reflect actual plan robustness.

  9. Experimental verification of an interpolation algorithm for improved estimates of animal position

    NASA Astrophysics Data System (ADS)

    Schell, Chad; Jaffe, Jules S.

    2004-07-01

    This article presents experimental verification of an interpolation algorithm that was previously proposed in Jaffe [J. Acoust. Soc. Am. 105, 3168-3175 (1999)]. The goal of the algorithm is to improve estimates of both target position and target strength by minimizing a least-squares residual between noise-corrupted target measurement data and the output of a model of the sonar's amplitude response to a target at a set of known locations. Although this positional estimator was shown to be a maximum likelihood estimator, in principle, experimental verification was desired because of interest in understanding its true performance. Here, the accuracy of the algorithm is investigated by analyzing the correspondence between a target's true position and the algorithm's estimate. True target position was measured by precise translation of a small test target (bead) or from the analysis of images of fish from a coregistered optical imaging system. Results with the stationary spherical test bead in a high signal-to-noise environment indicate that a large increase in resolution is possible, while results with commercial aquarium fish indicate a smaller increase is obtainable. However, in both experiments the algorithm provides improved estimates of target position over those obtained by simply accepting the angular positions of the sonar beam with maximum output as target position. In addition, increased accuracy in target strength estimation is possible by considering the effects of the sonar beam patterns relative to the interpolated position. A benefit of the algorithm is that it can be applied ``ex post facto'' to existing data sets from commercial multibeam sonar systems when only the beam intensities have been stored after suitable calibration.
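
    The estimator can be sketched as a grid search that, for each candidate position, fits the target strength in closed form and keeps the position with the smallest least-squares residual between the measured per-beam amplitudes and the modeled beam response. This is a hypothetical simplification: the Gaussian beam patterns, the noiseless target, and the candidate grid are illustrative assumptions, not the article's setup:

```python
import numpy as np

CENTERS = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # beam steering angles (assumed)

def beam_model(theta: float, width: float = 0.8) -> np.ndarray:
    """Modeled per-beam amplitude response to a unit target at angle theta."""
    return np.exp(-(CENTERS - theta) ** 2 / (2.0 * width ** 2))

def estimate_position(measured: np.ndarray, candidates: np.ndarray):
    """Least-squares grid search over candidate positions; the target
    strength a is fit in closed form at each candidate."""
    best = None
    for p in candidates:
        r = beam_model(p)
        a = (r @ measured) / (r @ r)          # optimal amplitude for this p
        resid = np.sum((measured - a * r) ** 2)
        if best is None or resid < best[0]:
            best = (resid, p, a)
    return best[1], best[2]

# Noiseless target at angle 0.3 with strength 2: the true position lies
# between beam centers, where a max-beam readout would report 0.0.
measured = 2.0 * beam_model(0.3)
theta_hat, a_hat = estimate_position(measured, np.linspace(-2, 2, 81))
```

    Interpolating across beams this way recovers sub-beamwidth position and a refined strength estimate, whereas simply taking the strongest beam quantizes the position to the steering grid.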

  10. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  11. Investigation of air cleaning system response to accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrae, R.W.; Bolstad, J.W.; Foster, R.D.

    1980-01-01

    Air cleaning system response to the stress of accident conditions is being investigated. A program overview and highlights of recent results of our investigation are presented. The program includes both analytical and experimental investigations. Computer codes for predicting the effects of tornadoes, explosions, fires, and material transport are described. The test facilities used to obtain supportive experimental data to define the structural integrity and confinement effectiveness of ventilation system components are described. Examples of experimental results for code verification, blower response to tornado transients, and filter response to tornado and explosion transients are reported.

  12. Simulation and Experimental Study on Cavitating Water Jet Nozzle

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; He, Kai; Cai, Jiannan; Hu, Shaojie; Li, Jiuhua; Du, Ruxu

    2017-01-01

    Cavitating water jet technology is a new kind of water jet technology with many advantages, such as being energy-saving, efficient, and environmentally friendly. Based on numerical simulation and experimental verification, this paper presents research on the cavitating nozzle, including a comparison of the cleaning ability of the cavitating jet and an ordinary jet, and a comparison of the cavitation effects of different cavitating nozzle structures.

  13. Rapid Verification of Candidate Serological Biomarkers Using Gel-based, Label-free Multiple Reaction Monitoring

    PubMed Central

    Tang, Hsin-Yao; Beer, Lynn A.; Barnhart, Kurt T.; Speicher, David W.

    2011-01-01

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves, quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1-D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μl serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers. PMID:21726088

  14. Rapid verification of candidate serological biomarkers using gel-based, label-free multiple reaction monitoring.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W

    2011-09-02

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.

  15. HiMAT highly maneuverable aircraft technology, flight report

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Flight verification of a primary flight control system designed to control the unstable HiMAT aircraft is presented, along with the initial flight demonstration of a maneuver autopilot in the level cruise mode and the gathering of a limited amount of airspeed calibration data.

  16. 78 FR 78257 - Verification of Statements of Account Submitted by Cable Operators and Satellite Carriers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... from nearly all of the parties in this proceeding. All of these parties raised issues of first impression that were not addressed in the initial phase of this proceeding. The Office is studying these new...

  17. Cooler-Lower Down

    ERIC Educational Resources Information Center

    Deeson, Eric

    1971-01-01

    Reports a verification that hot water begins to freeze sooner than cooler water. Includes the investigations that led to the conclusions that convection is a major influence, that water content may have some effect, and that the melting of the ice under the container makes no difference to the experimental results. (DS)

  18. Experimental verification of long-term evolution radio transmissions over dual-polarization combined fiber and free-space optics optical infrastructures.

    PubMed

    Bohata, J; Zvanovec, S; Pesek, P; Korinek, T; Mansour Abadi, M; Ghassemlooy, Z

    2016-03-10

    This paper describes the experimental verification of long-term evolution (LTE) radio over fiber (RoF) and radio over free-space optics (RoFSO) systems using dual-polarization signals for cloud radio access network applications, determining their specific utilization limits. A number of free-space optics configurations are proposed and investigated under different atmospheric turbulence regimes in order to recommend the best setup configuration. We show that the performance of the proposed link, based on the combination of RoF and RoFSO for 64-QAM at 2.6 GHz, is more affected by the turbulence, based on a measured error vector magnitude difference of 5.5%. It is further demonstrated that the proposed systems can offer higher noise immunity under particular scenarios, with a signal-to-noise ratio reliability limit of 5 dB in the radio frequency domain for RoF and 19.3 dB in the optical domain for a combination of RoF and RoFSO links.
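
    The error vector magnitude (EVM) figure used to compare the links is the RMS error power of the received symbols relative to the reference constellation's RMS power. A minimal sketch on a toy four-point constellation (the paper's measurements use 64-QAM LTE signals):

```python
import numpy as np

def evm_percent(rx: np.ndarray, ref: np.ndarray) -> float:
    """RMS error vector magnitude, as a percentage of the
    reference constellation's RMS power."""
    err_power = np.mean(np.abs(rx - ref) ** 2)
    ref_power = np.mean(np.abs(ref) ** 2)
    return 100.0 * np.sqrt(err_power / ref_power)

ref = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j])  # toy constellation
rx = ref + 0.1          # constant I-axis offset on every received symbol
evm = evm_percent(rx, ref)
```

    On this constellation the 0.1 offset against unit-variance-per-axis symbols gives an EVM of about 7.1%, so a 5.5% EVM difference between link configurations, as reported in the paper, is a substantial margin.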

  19. Design Considerations and Experimental Verification of a Rail Brake Armature Based on Linear Induction Motor Technology

    NASA Astrophysics Data System (ADS)

    Sakamoto, Yasuaki; Kashiwagi, Takayuki; Hasegawa, Hitoshi; Sasakawa, Takashi; Fujii, Nobuo

This paper describes the design considerations and experimental verification of an LIM rail brake armature. In order to generate power and maximize the braking force density despite the limited area between the armature and the rail and the limited space available for installation, we studied a design method suitable for an LIM rail brake armature and considered the adoption of a ring winding structure. To examine the validity of the proposed design method, we developed a prototype ring winding armature for the rail brakes and examined its electromagnetic characteristics in a dynamic test system with roller rigs. By repeating various tests, we confirmed that unnecessary magnetic field components, which were expected to be present under high-speed running conditions or when a ring winding armature was used, were not present. Further, the necessary magnetic field component and braking force attained the desired values. These studies have helped us to develop a basic design method suitable for LIM rail brake armatures.

  20. Experimental Analysis of Voltage Drop Compensation in a DC Electrified Railway by Introducing an Energy Storage System Incorporating EDLCs

    NASA Astrophysics Data System (ADS)

    Konishi, Takeshi; Hase, Shin-Ichi; Nakamichi, Yoshinobu; Nara, Hidetaka; Uemura, Tadashi

Interest has been shown in the concept of an energy storage system aimed at leveling load and improving energy efficiency by charging during vehicle regeneration and discharging during running. Such a system represents an efficient countermeasure against pantograph point voltage drop, power load fluctuation and regenerative power loss. We selected an EDLC model as the energy storage medium and a step-up/step-down chopper as the power converter to exchange power between the storage medium and the overhead lines. Basic verification was conducted using a mini-model for DC 400V, demonstrating characteristics suitable for use as an energy storage system. Based on these results, an energy storage system was built for DC 600V and a verification test was conducted in conjunction with the Enoshima Electric Railway Co. Ltd. This paper presents an experimental analysis of voltage drop compensation in a DC electrified railway and offers some discussion based on the test.
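The charge/discharge behavior described above amounts to a threshold decision on the overhead-line voltage. A minimal sketch of such a decision rule, with hypothetical thresholds for a DC 600 V line rather than the values used in the Enoshima test:

```python
def chopper_mode(line_voltage, v_charge=650.0, v_discharge=570.0):
    """Illustrative decision logic for a step-up/step-down chopper:
    absorb regenerated power when the overhead-line voltage rises,
    release stored energy when it sags. Thresholds are assumptions,
    not values from the study."""
    if line_voltage >= v_charge:
        return "charge"      # vehicle braking: store regenerated power
    if line_voltage <= v_discharge:
        return "discharge"   # vehicle running: compensate voltage drop
    return "idle"

print(chopper_mode(680.0))   # charge
print(chopper_mode(545.0))   # discharge
print(chopper_mode(600.0))   # idle
```

A real controller would add hysteresis, current limits, and state-of-charge management on top of this rule.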

  1. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include the development of new design concepts, such as active structures, and new tools, such as a combined structure and control optimization algorithm, and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computational capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  2. A Model based Examination of Conditions for Ignition of Turbidity Currents on Slopes

    NASA Astrophysics Data System (ADS)

    Mehta, A. J.; Krishna, G.

    2009-12-01

    Turbidity currents form a major mechanism for the movement of sediment in the natural environment. Self-accelerating turbidity currents over continental slopes are of considerable scientific and engineering interest due to their role as agents for submarine sediment transportation from the shelf to the seabed. Such currents are called ignitive provided they eventually reach a catastrophic state as acceleration results in high sediment loads due to erosion of the sloping bed. A numerical model, which treats the fluid and the particles as two separate phases, is applied to investigate the effects of particle size, initial flow friction velocity and mild bed slope on the ignitive condition. Laboratory experimental data have been included as part of the analysis for qualitative comparison purposes. Ignition for the smallest of the three selected sizes (0.21mm) of medium sand typical of Florida beaches was found to depend on the initial conditions at the head of the slope as determined by the pressure gradient. Bed slope seemed to be of secondary importance. For the two sands with larger grain sizes (0.28mm and 0.35mm) the slope was found to play a more important role when compared to the initial pressure gradient. For a given pressure gradient, increasing the slope increased the likelihood of self-acceleration. It is concluded that in general ignition cannot be defined merely in terms of positive values of the velocity gradient and the sediment flux gradient along the slope. Depending on particle size the initial pressure gradient can also play a role. For the selected initial conditions (grain size, pressure gradient and bed slope), out of the 54 combinations tested, all except three satisfied the Knapp-Bagnold criterion for auto-suspension irrespective of whether the turbid current was ignitive or non-ignitive. In all 54 cases the current was found to erode the bed. 
Further use of the model will require accommodation of wider ranges of sediment size and bed density, and a thorough verification against experimental data.
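The Knapp-Bagnold auto-suspension criterion mentioned above compares the product of flow velocity and bed slope against the grain settling velocity. A minimal check, with illustrative values that are not taken from the study:

```python
def autosuspends(flow_velocity, bed_slope, settling_velocity):
    """Knapp-Bagnold auto-suspension criterion: the current can sustain
    its suspended load when U * S exceeds the settling velocity w_s.
    All quantities in consistent units (m/s, slope as tan(theta))."""
    return flow_velocity * bed_slope >= settling_velocity

# Illustrative values only: w_s ~ 0.026 m/s is a plausible settling
# velocity for 0.21 mm quartz sand in water.
print(autosuspends(2.0, 0.02, 0.026))   # 0.04 >= 0.026 -> True
print(autosuspends(1.0, 0.02, 0.026))   # 0.02 <  0.026 -> False
```

As the abstract notes, satisfying this criterion does not by itself make a current ignitive; the initial pressure gradient and grain size also matter.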

  3. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  4. NDEC: A NEA platform for nuclear data testing, verification and benchmarking

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Michel-Sendis, F.; Cabellos, O.; Bossant, M.; Soppera, N.

    2017-09-01

The selection, testing, verification and benchmarking of evaluated nuclear data consist, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while physical constraints and agreement with experimental data are verified at the same time. At the NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims to provide, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. This diagnosis is based on the results of the different computational codes and routines that carry out the aforementioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, incorporating them into its working scheme. This paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.

  5. Status on the Verification of Combustion Stability for the J-2X Engine Thrust Chamber Assembly

    NASA Technical Reports Server (NTRS)

    Casiano, Matthew; Hinerman, Tim; Kenny, R. Jeremy; Hulka, Jim; Barnett, Greg; Dodd, Fred; Martin, Tom

    2013-01-01

Development is underway of the J-2X engine, a liquid oxygen/liquid hydrogen rocket engine for use on the Space Launch System. Engine E10001 began hot-fire testing in June 2011 and testing will continue with subsequent engines. The J-2X engine main combustion chamber contains both acoustic cavities and baffles. These stability aids are intended to dampen the acoustics in the main combustion chamber. Verification of the engine thrust chamber stability is determined primarily by examining experimental data using a dynamic stability rating technique; however, additional requirements were included to guard against any spontaneous instability or rough combustion. Startup and shutdown chug oscillations are also characterized for this engine. This paper details the stability requirements and verification, including low and high frequency dynamics, a discussion of sensor selection and sensor port dynamics, and the process developed to assess combustion stability. A status on the stability results is also provided and discussed.

  6. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    NASA Astrophysics Data System (ADS)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

This report presents the verification results for the combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP, which has been utilized in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), in particular for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature new frontier technologies. The verification of the code was done through comparison between analytical results and experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T1-M) with simulated fuel rods and fuel blocks.
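The flow-distribution problem such a network code solves can be illustrated in miniature: parallel coolant channels share one pressure drop, so with a loss law dP = K_i * Q_i^2 the channel flows scale as 1/sqrt(K_i). A hedged sketch with illustrative loss coefficients (not HTTR data):

```python
import math

def channel_flows(total_flow, loss_coeffs):
    """Split a total flow among parallel channels sharing one pressure
    drop. With dP = K_i * Q_i^2, each Q_i is proportional to
    1/sqrt(K_i); normalizing preserves the total flow."""
    inv = [1.0 / math.sqrt(k) for k in loss_coeffs]
    s = sum(inv)
    return [total_flow * v / s for v in inv]

# A blocked middle channel (higher K) starves itself of flow.
flows = channel_flows(12.0, [1.0, 4.0, 1.0])
print([round(q, 2) for q in flows])   # [4.8, 2.4, 4.8]
```

A full network solver iterates this balance over many interconnected branches together with the heat conduction solution.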

  7. Providing an empirical basis for optimizing the verification and testing phases of software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered; and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
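As an illustration of the two-class fault-density prediction described above: the paper builds multivariate stochastic models, but a plain logistic regression on synthetic component metrics conveys the same idea of separating low and high fault-density components from measurements (all data and metric names here are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical component metrics: [lines of code, cyclomatic complexity].
# Label 1 = high fault density. Synthetic data for illustration only.
X = rng.normal([200.0, 10.0], [80.0, 4.0], size=(200, 2))
y = (X[:, 0] + 15 * X[:, 1] + rng.normal(0, 40, 200) > 350).astype(float)

# Standardize the metrics, then fit logistic regression by gradient descent.
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xs @ w + b)))   # predicted P(high fault density)
    w -= 0.1 * Xs.T @ (p - y) / len(y)
    b -= 0.1 * (p - y).mean()

accuracy = float(((p > 0.5) == y).mean())
print(round(accuracy, 2))
```

Components predicted as high fault density would then receive a larger share of the testing and verification budget.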

  8. Development and experimental qualification of a calculation scheme for the evaluation of gamma heating in experimental reactors. Application to MARIA and Jules Horowitz (JHR) MTR Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarchalski, M.; Pytel, K.; Wroblewska, M.

    2015-07-01

Precise computational determination of nuclear heating, which consists predominantly of gamma heating (more than 80%), is one of the challenges in material testing reactor exploitation. Due to the sophisticated construction and conditions of the experimental programs planned in JHR, it became essential to use the most accurate and precise gamma heating model available. Before the JHR starts to operate, gamma heating evaluation methods need to be developed and qualified in other experimental reactor facilities. This is done inter alia using the OSIRIS, MINERVE or EOLE research reactors in France. Furthermore, MARIA, the Polish material testing reactor, has been chosen to contribute to the qualification of gamma heating calculation schemes and tools. This reactor has some characteristics close to those of JHR (beryllium usage, fuel element geometry). To evaluate gamma heating in the JHR and MARIA reactors, both simulation tools and an experimental program have been developed and carried out. For gamma heating simulation, a new calculation scheme and gamma heating model of MARIA have been developed using the TRIPOLI4 and APOLLO2 codes. The calculation outcome has been verified by comparison with experimental measurements in the MARIA reactor. To obtain more precise results, the model of MARIA in TRIPOLI4 was built using the whole geometry of the core. This was done for the first time in the history of the MARIA reactor and was complex due to the cut-cone shape of all its elements. The material composition of burnt fuel elements has been implemented from APOLLO2 calculations. An experiment for nuclear heating measurements and calculation verification was performed in September 2014. This involved neutron, photon and nuclear heating measurements at selected locations in the MARIA reactor using, in particular, a Rh SPND, an Ag SPND, an ionization chamber (all three from CEA), the KAROLINA calorimeter (NCBJ) and a gamma thermometer (CEA/SCK CEN). Measurements were done at forty points using four channels. The maximal nuclear heating evaluated from the measurements is of the order of 2.5 W/g at half of the possible MARIA power of 15 MW. The approach and the detailed program for experimental verification of the calculations are presented. The following points are discussed: - development of a gamma heating model of the MARIA reactor with TRIPOLI4 (coupled neutron-photon mode) and an APOLLO2 model taking into account key parameters such as the configuration of the core, experimental loading, control rod location, reactor power and fuel depletion; - design of specific measurement tools for the MARIA experiments, including a new single-cell calorimeter called the KAROLINA calorimeter; - description of the MARIA experimental program and a preliminary analysis of results; - comparison of calculations for the JHR and MARIA cores with experimental verification analysis, calculation behavior and n-γ 'environments'. (authors)

  9. Experimental verification of rank 1 chaos in switch-controlled Chua circuit.

    PubMed

    Oksasoglu, Ali; Ozoguz, Serdar; Demirkol, Ahmet S; Akgul, Tayfun; Wang, Qiudong

    2009-03-01

    In this paper, we provide the first experimental proof for the existence of rank 1 chaos in the switch-controlled Chua circuit by following a step-by-step procedure given by the theory of rank 1 maps. At the center of this procedure is a periodically kicked limit cycle obtained from the unforced system. Then, this limit cycle is subjected to periodic kicks by adding externally controlled switches to the original circuit. Both the smooth nonlinearity and the piecewise linear cases are considered in this experimental investigation. Experimental results are found to be in concordance with the conclusions of the theory.

  10. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging process of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging process of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their requirements, advantages and disadvantages with respect to security.
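The authentication record described above bundles a biometric hash, a validity timestamp and a document hash under one signature. A minimal stdlib-only sketch of that structure: a real TCPA would use an asymmetric signature with its private key, so the HMAC here is only a stand-in, and all key and field names are invented for the example:

```python
import hashlib
import hmac
import json

# Stand-in for the TCPA's private key; a real deployment would use an
# asymmetric signature scheme, not a shared-key HMAC.
TCPA_KEY = b"tcpa-demo-key"

def make_authentication_record(biometric_hash: bytes, doc_hash: bytes,
                               valid_until: int) -> dict:
    payload = {
        "bio": biometric_hash.hex(),
        "doc": doc_hash.hex(),
        "valid_until": valid_until,   # expiry forces re-enrollment
    }
    blob = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(TCPA_KEY, blob, hashlib.sha256).hexdigest()
    return payload

def verify(ar: dict, now: int) -> bool:
    sig = ar.pop("sig")
    blob = json.dumps(ar, sort_keys=True).encode()
    ok = hmac.compare_digest(
        sig, hmac.new(TCPA_KEY, blob, hashlib.sha256).hexdigest())
    ar["sig"] = sig
    return ok and now <= ar["valid_until"]

bio = hashlib.sha256(b"handwritten-signature-features").digest()
doc = hashlib.sha256(b"id-card-contents").digest()
ar = make_authentication_record(bio, doc, valid_until=2_000_000_000)
print(verify(ar, now=1_700_000_000))   # True: signature valid, not expired
print(verify(ar, now=2_100_000_000))   # False: timestamp expired
```

The expiry check is what implements the re-enrollment mechanism for aging biometric features.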

  11. Free microparticles—An inducing mechanism of pre-firing in high pressure gas switches for fast linear transformer drivers

    NASA Astrophysics Data System (ADS)

    Li, Xiaoang; Pei, Zhehao; Wu, Zhicheng; Zhang, Yuzhao; Liu, Xuandong; Li, Yongdong; Zhang, Qiaogen

    2018-03-01

Microparticle-initiated pre-firing of high pressure gas switches for fast linear transformer drivers (FLTDs) is experimentally and theoretically verified. First, a dual-electrode gas switch equipped with poly-methyl methacrylate baffles is used to capture and collect the microparticles. By analyzing the electrode surfaces and the collecting baffles with a laser scanning confocal microscope, microparticles ranging in size from tens of micrometers to over 100 μm are observed under the typical working conditions of FLTDs. The charging and movement of free microparticles in the switch cavity are studied, and the strong DC electric field drives the microparticles to bounce off the electrode. Three different modes of free microparticle motion appear to be responsible for switch pre-firing. (i) Microparticles adhere to the electrode surface and act as a fixed protrusion which distorts the local electric field and initiates the breakdown in the gap. (ii) One particle escapes toward the opposite electrode and causes a near-electrode microdischarge, inducing the breakdown of the residual gap. (iii) Multiple moving microparticles occasionally form a cascade, leading to pre-firing. Finally, as experimental verification, repetitive discharges at ±90 kV are conducted in a three-electrode field-distortion gas switch with two 8 mm gaps, pressurized with nitrogen. An ultrasonic probe is employed to monitor the bounce signals. In pre-firing incidents, the bounce is detected shortly before the collapse of the voltage waveform, which demonstrates that free microparticles contribute significantly to the mechanism that induces pre-firing in FLTD gas switches.

  12. Free microparticles-An inducing mechanism of pre-firing in high pressure gas switches for fast linear transformer drivers.

    PubMed

    Li, Xiaoang; Pei, Zhehao; Wu, Zhicheng; Zhang, Yuzhao; Liu, Xuandong; Li, Yongdong; Zhang, Qiaogen

    2018-03-01

Microparticle-initiated pre-firing of high pressure gas switches for fast linear transformer drivers (FLTDs) is experimentally and theoretically verified. First, a dual-electrode gas switch equipped with poly-methyl methacrylate baffles is used to capture and collect the microparticles. By analyzing the electrode surfaces and the collecting baffles with a laser scanning confocal microscope, microparticles ranging in size from tens of micrometers to over 100 μm are observed under the typical working conditions of FLTDs. The charging and movement of free microparticles in the switch cavity are studied, and the strong DC electric field drives the microparticles to bounce off the electrode. Three different modes of free microparticle motion appear to be responsible for switch pre-firing. (i) Microparticles adhere to the electrode surface and act as a fixed protrusion which distorts the local electric field and initiates the breakdown in the gap. (ii) One particle escapes toward the opposite electrode and causes a near-electrode microdischarge, inducing the breakdown of the residual gap. (iii) Multiple moving microparticles occasionally form a cascade, leading to pre-firing. Finally, as experimental verification, repetitive discharges at ±90 kV are conducted in a three-electrode field-distortion gas switch with two 8 mm gaps, pressurized with nitrogen. An ultrasonic probe is employed to monitor the bounce signals. In pre-firing incidents, the bounce is detected shortly before the collapse of the voltage waveform, which demonstrates that free microparticles contribute significantly to the mechanism that induces pre-firing in FLTD gas switches.
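The lift-off of a microparticle from the electrode can be estimated from the classical charge acquired by a conducting sphere resting on a plane electrode in a uniform field, q = (2/3)π³ε₀r²E: the particle bounces off when the electrostatic force qE exceeds its weight. A sketch with illustrative particle properties (the material and exact field profile are assumptions, not values from the paper):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def lifts_off(radius_m, density_kg_m3, field_V_m):
    """True if the electrostatic force on a conducting sphere resting on
    an electrode exceeds its weight. Uses the classical contact charge
    q = (2/3) * pi^3 * eps0 * r^2 * E for a sphere on a plane."""
    q = (2.0 / 3.0) * math.pi ** 3 * EPS0 * radius_m ** 2 * field_V_m
    weight = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3 * 9.81
    return q * field_V_m > weight

# Illustrative: a 50 um aluminum particle in ~90 kV across an 8 mm gap
# (about 11 MV/m) versus a much weaker 10 kV/m field.
print(lifts_off(50e-6, 2700.0, 90e3 / 8e-3))   # True: particle bounces
print(lifts_off(50e-6, 2700.0, 1e4))           # False: stays on electrode
```

Because the force scales as E² while the weight is fixed, DC-charged FLTD switches sit well inside the lift-off regime for the particle sizes reported above.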

  13. Calibration of region-specific gates pile driving formula for LRFD : final report 561.

    DOT National Transportation Integrated Search

    2016-05-01

    This research project proposes new DOTD pile driving formulas for pile capacity verification using pile driving blow : counts obtained at either end-of-initial driving (EOID) or at the beginning-of-restrike (BOR). The pile driving : formulas were dev...

  14. Initial study and verification of a distributed fiber optic corrosion monitoring system for transportation structures.

    DOT National Transportation Integrated Search

    2012-07-01

    For this study, a novel optical fiber sensing system was developed and tested for the monitoring of corrosion in : transportation systems. The optical fiber sensing system consists of a reference long period fiber gratings (LPFG) sensor : for corrosi...

  15. Testing of full-size reinforced concrete beams strengthened with FRP composites : experimental results and design methods verification

    DOT National Transportation Integrated Search

    2000-06-01

    In 1997, a load rating of an historic reinforced concrete bridge in Oregon, Horsetail Creek Bridge, indicated substandard shear and moment capacities of the beams. As a result, the Bridge was strengthened with fiber reinforced : polymer composites as...

  16. Experimental Study of an Assembly with Extreme Particulate, Molecular, and Biological Requirements in Different Environmental Scenarios from Quality Point of View

    NASA Astrophysics Data System (ADS)

    Müller, A.; Urich, D.; Kreck, G.; Metzmacher, M.; Lindner, R.

    2018-04-01

The presentation will cover results from an ESA-supported investigation to collect lessons learned for mechanism assembly, with a focus on the verification of quality and contamination requirements in exploration projects such as ExoMars.

  17. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF SEVEN TECHNOLOGIES DETECTING TOXICITY IN DRINKING WATER (R2)

    EPA Science Inventory

Rapid toxicity technologies can detect certain toxins, and with testing, their susceptibility to interfering chemicals in a controlled experimental matrix can be determined. Rapid toxicity technologies do not identify or determine the concentrations of specific contaminants, but s...

  18. Large-scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive unit, the fixture and the wheels. The control system design covers hardware and software: the hardware is mainly a single-chip microcomputer system, and the software implements the photoelectric autocollimation and automatic data acquisition processes. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  19. A fingerprint key binding algorithm based on vector quantization and error correction

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g. fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template to a cryptographic key, so that the key is protected and can be accessed only through fingerprint verification. In order to cope with the intrinsic fuzziness of varying fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template before binding it with the key, after a process of fingerprint registration and extraction of the global ridge pattern of the fingerprint. The key itself remains secure because only its hash value is stored, and the key is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our approach.
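The bind-and-release pattern described above is commonly realized as a fuzzy commitment: the error-correction-encoded key is XORed with the quantized template, and only that offset plus a hash of the key are stored. A minimal sketch using a simple 3x repetition code as the error-correcting code (the paper's actual quantizer and code are not specified here; this is an assumed, illustrative construction):

```python
import hashlib
import random

REP = 3  # repetition factor: tolerates a minority of flipped bits per group

def encode(key_bits):
    return [b for b in key_bits for _ in range(REP)]

def decode(code_bits):
    return [int(sum(code_bits[i:i + REP]) > REP // 2)
            for i in range(0, len(code_bits), REP)]

def bind(key_bits, template_bits):
    """Fuzzy commitment: XOR the ECC-encoded key with the quantized
    template; store only the offset and a hash of the key."""
    offset = [c ^ t for c, t in zip(encode(key_bits), template_bits)]
    return offset, hashlib.sha256(bytes(key_bits)).hexdigest()

def release(offset, query_bits, key_hash):
    key = decode([o ^ q for o, q in zip(offset, query_bits)])
    return key if hashlib.sha256(bytes(key)).hexdigest() == key_hash else None

rng = random.Random(0)
key = [rng.randrange(2) for _ in range(32)]
template = [rng.randrange(2) for _ in range(32 * REP)]
offset, key_hash = bind(key, template)

# A noisy re-reading with an occasional flipped bit still releases the key;
# an unrelated query does not.
noisy = template[:]
noisy[0] ^= 1
print(release(offset, noisy, key_hash) == key)           # True
print(release(offset, [0] * len(template), key_hash))    # None
```

Storing only the hash is what keeps the key secure: the offset alone reveals neither the key nor the template.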

  20. International Space Station Bus Regulation With NASA Glenn Research Center Flywheel Energy Storage System Development Unit

    NASA Technical Reports Server (NTRS)

    Kascak, Peter E.; Kenny, Barbara H.; Dever, Timothy P.; Santiago, Walter; Jansen, Ralph H.

    2001-01-01

    An experimental flywheel energy storage system is described. This system is being used to develop a flywheel based replacement for the batteries on the International Space Station (ISS). Motor control algorithms which allow the flywheel to interface with a simplified model of the ISS power bus, and function similarly to the existing ISS battery system, are described. Results of controller experimental verification on a 300 W-hr flywheel are presented.

  1. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

Efforts to prohibit the sale of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retail environment. Two study locations were selected: Tallahassee, Florida, and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group; stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV-installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusals. Surveyed clerks reported positive experiences using the electronic ID readers, and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.

  2. Comparing phase-sensitive and phase-insensitive echolocation target images using a monaural audible sonar.

    PubMed

    Kuc, Roman

    2018-04-01

This paper describes phase-sensitive and phase-insensitive processing of monaural echolocation waveforms to generate target maps. Composite waveforms containing both the emission and echoes are processed to estimate the target impulse response using an audible sonar. Phase-sensitive processing yields the composite signal envelope, while phase-insensitive processing, which starts from the composite waveform power spectrum, yields the envelope of the autocorrelation function. Analysis and experimental verification show that multiple echoes form an autocorrelation function that produces near-range phantom-reflector artifacts. These artifacts interfere with true target echoes when the first true echo occurs at a time that is less than the total duration of the target echoes. An initial comparison of phase-sensitive and phase-insensitive maps shows that both display important target features, suggesting that phase is not vital. A closer comparison illustrates the improved resolution of phase-sensitive processing, the near-range phantom reflectors produced by phase-insensitive processing, and echo interference and multiple reflection artifacts that were independent of the processing.
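The phantom-reflector mechanism can be reproduced numerically: the inverse FFT of the composite waveform's power spectrum is its autocorrelation, so a pair of echoes separated by Δ samples produces a near-range peak at lag Δ that the phase-sensitive envelope does not contain. A small numpy sketch with synthetic pulse parameters (not the audible-sonar hardware values):

```python
import numpy as np

def envelope(x):
    """Phase-sensitive: magnitude of the analytic signal (FFT Hilbert)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1.0
    h[1:len(x) // 2] = 2.0
    h[len(x) // 2] = 1.0
    return np.abs(np.fft.ifft(X * h))

def autocorr_envelope(x):
    """Phase-insensitive: the inverse FFT of the power spectrum is the
    autocorrelation of the composite waveform; return its envelope."""
    r = np.fft.ifft(np.abs(np.fft.fft(x)) ** 2).real
    return envelope(r[:len(x) // 2])

fs = 100_000.0
t = np.arange(0, 0.01, 1 / fs)
pulse = np.sin(2 * np.pi * 8000 * t) * np.exp(-((t - 0.0005) / 0.0001) ** 2)

# Composite waveform: emission plus two echoes 40 samples apart.
x = pulse + 0.5 * np.roll(pulse, 300) + 0.4 * np.roll(pulse, 340)

env = envelope(x)
ac = autocorr_envelope(x)

# The true envelope peaks at the echo positions; the autocorrelation
# shows a near-range peak at the 40-sample echo spacing: a phantom.
print(np.argmax(env[250:400]) + 250)        # near sample 350 (first echo)
print(ac[20:80].max() / ac[0] > 0.05)       # phantom present
```

The phantom peak interferes with genuine near-range echoes, which is exactly the artifact the paper analyzes.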

  3. Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couch, R; Wang, P

    2003-05-06

In this quarter, our primary effort has been focused on model verification, emphasizing consistency of results between parallel and serial simulation runs. Progress has been made in refining the parallel thermal algorithms and in diminishing discretization effects in the contact region between the rollers and the slab. We have received the metrology data for the ingot profile at the end of the fifth pass from Alcoa, and detailed comparisons between the data and the initial simulation results are being performed. Forthcoming from Alcoa are modifications to the fracture model based on additional experiments at lower strain rates. The original fracture model was implemented in the finite element code, but damage in the rolling simulation was not correct due to modeling errors at lower strain rates and high stress triaxiality. Validation simulations for the fracture model will continue when the experimentally based adjustments to the parameter values become available.

  4. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced ''best estimate'' predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  5. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  6. Novel inhibitors of dengue virus methyltransferase: discovery by in vitro-driven virtual screening on a desktop computer grid.

    PubMed

    Podvinec, Michael; Lim, Siew Pheng; Schmidt, Tobias; Scarsi, Marco; Wen, Daying; Sonntag, Louis-Sebastian; Sanschagrin, Paul; Shenkin, Peter S; Schwede, Torsten

    2010-02-25

    Dengue fever is a viral disease that affects 50-100 million people annually and is one of the most important emerging infectious diseases in many areas of the world. Currently, neither specific drugs nor vaccines are available. Here, we report on the discovery of new inhibitors of the viral NS5 RNA methyltransferase, a promising flavivirus drug target. We have used a multistage molecular docking approach to screen a library of more than 5 million commercially available compounds against the two binding sites of this enzyme. Among 263 compounds chosen for experimental verification, we found 10 inhibitors with IC50 values of <100 microM, of which four exhibited IC50 values of <10 microM in in vitro assays. The initial hit list also contained 25 nonspecific aggregators; we discuss why this likely occurred for this particular target. Following this finding, we also describe our attempts to use aggregation prediction to further guide the study.

  7. Spatiotemporal motion boundary detection and motion boundary velocity estimation for tracking moving objects with a moving camera: a level sets PDEs approach with concurrent camera motion compensation.

    PubMed

    Feghali, Rosario; Mitiche, Amar

    2004-11-01

    The purpose of this study is to investigate a method of tracking moving objects with a moving camera. This method estimates simultaneously the motion induced by camera movement. The problem is formulated as a Bayesian motion-based partitioning problem in the spatiotemporal domain of the image sequence. An energy functional is derived from the Bayesian formulation. The Euler-Lagrange descent equations determine simultaneously an estimate of the image motion field induced by camera motion and an estimate of the spatiotemporal motion boundary surface. The Euler-Lagrange equation corresponding to the surface is expressed as a level-set partial differential equation for topology independence and numerically stable implementation. The method can be initialized simply and can track multiple objects with nonsimultaneous motions. Velocities on motion boundaries can be estimated from geometrical properties of the motion boundary. Several examples of experimental verification are given using synthetic and real-image sequences.

  8. Monitoring of Air Pollution by Satellites (MAPS), phase 1

    NASA Technical Reports Server (NTRS)

    Ludwig, C. B.; Malkmus, W.; Griggs, M.; Bartle, E. R.

    1972-01-01

    Results are reported upon which the design of a satellite remote gas filter correlation (RGFC) instrument can be based. Although a final decision about the feasibility of measuring some of the pollutants with the required accuracy is still outstanding and subject to further theoretical and experimental verifications, viable concepts are presented which permit the initiation of the design phase. The pollutants which are of concern in the troposphere and stratosphere were selected. The infrared bands of these pollutants were identified, together with the bands of interfering gases, and the line parameters of the pollutants as well as interfering gases were generated through a computer program. Radiative transfer calculations (line-by-line) were made to establish the radiation levels at the top of the atmosphere and the signal levels at the detector of the RGFC instrument. Based upon these results the channels for the RGFC were selected. Finally, the problem areas, which need further investigations, were delineated and the supporting data requirements were established.

  9. Comparison of ISRU Excavation System Model Blade Force Methodology and Experimental Results

    NASA Technical Reports Server (NTRS)

    Gallo, Christopher A.; Wilkinson, R. Allen; Mueller, Robert P.; Schuler, Jason M.; Nick, Andrew J.

    2010-01-01

    An Excavation System Model has been written to simulate the collection and transportation of regolith on the Moon. The calculations in this model include an estimation of the forces on the digging tool as a result of excavation into the regolith. Verification testing has been performed, and the forces recorded from this testing were compared to the calculated theoretical data. A prototype lunar vehicle built at the NASA Johnson Space Center (JSC) was tested with a bulldozer-type blade, developed at the NASA Kennedy Space Center (KSC), attached to the front. This is the initial correlation of actual field test data with the blade forces calculated by the Excavation System Model; the test data followed trends similar to the predicted values. This testing occurred in soils developed at the NASA Glenn Research Center (GRC), which are a mixture of different types of sands and whose soil properties have been well characterized. Three separate analytical models are compared to the test data.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husain, Tausif; Hasan, Iftekhar; Sozer, Yilmaz

    This paper presents the design considerations of a double-sided transverse flux machine (TFM) for direct-drive wind turbine applications. The TFM has a modular structure with quasi-U stator cores and ring windings. The rotor is constructed with ferrite magnets in a flux-concentrating arrangement to achieve high air gap flux density. The design considerations for this TFM with respect to initial sizing, pole number selection, key design ratios, and pole shaping are presented in this paper. Pole number selection is critical in the design process of a TFM because it affects both the torque density and power factor under fixed magnetic and changing electrical loading. Several key design ratios are introduced to facilitate the design procedure. The effect of pole shaping on back-emf and inductance is also analyzed. These investigations provide guidance toward the required design of a TFM for direct-drive applications. The analyses are carried out using analytical and three-dimensional finite element analysis. A prototype is under construction for experimental verification.

  11. Motional timescale predictions by molecular dynamics simulations: Case study using proline and hydroxyproline sidechain dynamics

    PubMed Central

    Aliev, Abil E; Kulke, Martin; Khaneja, Harmeet S; Chudasama, Vijay; Sheppard, Tom D; Lanigan, Rachel M

    2014-01-01

    We propose a new approach for force field optimization which aims at reproducing dynamics characteristics using biomolecular MD simulations, in addition to improved prediction of motionally averaged structural properties available from experiment. As the source of experimental data for the dynamics fittings, we use 13C NMR spin-lattice relaxation times T1 of backbone and sidechain carbons, which allow determination of the correlation times of both overall molecular and intramolecular motions. For structural fittings, we use motionally averaged experimental values of NMR J couplings. The proline residue and its derivative 4-hydroxyproline, with relatively simple cyclic structures and sidechain dynamics, were chosen for the assessment of the new approach in this work. Initially, grid search and simplexed MD simulations identified a large number of parameter sets that fit the experimental J couplings equally well. Using the Arrhenius-type relationship between the force constant and the correlation time, the available MD data for a series of parameter sets were analyzed to predict the value of the force constant that best reproduces the experimental timescale of the sidechain dynamics. Verification of the new force field (termed AMBER99SB-ILDNP) against NMR J couplings and correlation times showed consistent and significant improvements over the original force field in reproducing both structural and dynamics properties. The results suggest that matching experimental timescales of motions together with motionally averaged characteristics is a valid approach for force field parameter optimization. Such a comprehensive approach is not restricted to cyclic residues and can be extended to other amino acid residues, as well as to the backbone. Proteins 2014; 82:195–215. © 2013 Wiley Periodicals, Inc. PMID:23818175

  12. 78 FR 68848 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-15

    .... Representatives may include registered nurses, social and community service managers, health educators, or social... per burden per respondents respondent response Initial Questionnaire Telephone Registered Nurses 100 1... and Human Service 400 1 15/60 Assistants. Telephone Verification Registered Nurses, Social 2,400 1 10...

  13. 31 CFR Appendix N to Subpart C of... - Financial Crimes Enforcement Network

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Post Office Box 39, Vienna, VA 22183. 3. Requests for amendments of records. Initial determinations... Request, Financial Crimes Enforcement Network, Post Office Box 39, Vienna, VA 22183. 4. Verification of... an accounting of disclosures, must satisfy one of the following identification requirements before...

  14. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  15. Investigation of Cleanliness Verification Techniques for Rocket Engine Hardware

    NASA Technical Reports Server (NTRS)

    Fritzemeier, Marilyn L.; Skowronski, Raymund P.

    1994-01-01

    Oxidizer propellant systems for liquid-fueled rocket engines must meet stringent cleanliness requirements for particulate and nonvolatile residue. These requirements were established to limit residual contaminants which could block small orifices or ignite in the oxidizer system during engine operation. Limiting organic residues in high pressure oxygen systems, such as in the Space Shuttle Main Engine (SSME), is particularly important. The current method of cleanliness verification for the SSME uses an organic solvent flush of the critical hardware surfaces. The solvent is filtered and analyzed for particulate matter followed by gravimetric determination of the nonvolatile residue (NVR) content of the filtered solvent. The organic solvents currently specified for use (1,1,1-trichloroethane and CFC-113) are ozone-depleting chemicals slated for elimination by December 1995. A test program is in progress to evaluate alternative methods for cleanliness verification that do not require the use of ozone-depleting chemicals and that minimize or eliminate the use of solvents regulated as hazardous air pollutants or smog precursors. Initial results from the laboratory test program to evaluate aqueous-based methods and organic solvent flush methods for NVR verification are provided and compared with results obtained using the current method. Evaluation of the alternative methods was conducted using a range of contaminants encountered in the manufacture of rocket engine hardware.

  16. An Educational Laboratory for Digital Control and Rapid Prototyping of Power Electronic Circuits

    ERIC Educational Resources Information Center

    Choi, Sanghun; Saeedifard, M.

    2012-01-01

    This paper describes a new educational power electronics laboratory that was developed primarily to reinforce experimentally the fundamental concepts presented in a power electronics course. The developed laboratory combines theoretical design, simulation studies, digital control, fabrication, and verification of power-electronic circuits based on…

  17. Experimental Verification of Multiple-input Multiple Output (MIMO) Beamforming Capabilities Using a Time-division Coherent MIMO Radar

    DTIC Science & Technology

    2015-04-01


  18. Experimental Verification of Pneumatic Transport System for the Rapid Excavation of Tunnels: Part 1. Installation of Test Facility

    DOT National Transportation Integrated Search

    1978-03-01

    This report deals with the selection of a test site, the design of a test installation, equipment selection, the installation and start-up of a pneumatic pipeline system for the transportation of tunnel muck. A review of prior pneumatic applications ...

  19. Self-Justification as a Determinant of Performance-Style Effectiveness

    ERIC Educational Resources Information Center

    McKenna, Ralph J.

    1971-01-01

    This study examined experimentally the effect of justification on role playing, attempting a more complete verification of the performance style type. Also of concern was whether scores on the Performance Style Test were generalizable to overt behavior on the part of females. Results supported both concerns. (Author/CG)

  20. Comment on Y.-H. Hsu et al., "electrical and mechanical fully coupled theory and experimental verification of Rosen-type piezoelectric transformers" [see reference [1

    PubMed

    Yang, Jiashi

    2007-04-01

    This letter discusses the difference between piezoelectric constitutive relations for the case of one-dimensional stress and the case of one-dimensional strain, and its implications in the modeling of Rosen piezoelectric transformers.

  1. National Centers for Environmental Prediction

    Science.gov Websites

    Operational Forecast Graphics Experimental Forecast Graphics Verification and Diagnostics Model Configuration consists of the following components: - The NOAA Environmental Modeling System (NEMS) version of the Non updates for the 12 km parent domain and the 3 km CONUS/Alaska nests. The non-cycled nests (Hawaii, Puerto

  2. Verification and Validation of Monte Carlo N-Particle 6 for Computing Gamma Protection Factors

    DTIC Science & Technology

    2015-03-26

    methods for evaluating RPFs, which it used for the subsequent 30 years. These approaches included computational modeling, radioisotopes, and a high...

  3. EXPERIMENTAL DESIGN CONSIDERATIONS WHEN VERIFYING THE PERFORMANCE OF MONITORING TECHNOLOGIES FOR DIOXIN AND DIOXIN-LIKE COMPOUNDS IN SOILS AND SEDIMENTS

    EPA Science Inventory

    A performance verification demonstration of technologies capable of detecting dioxin and dioxin-like compounds in soil and sediment samples was conducted in April 2004 under the U.S. Environmental Protection Agency's Superfund Innovative Technology Evaluation (SITE) Monitoring an...

  4. Issues of planning trajectory of parallel robots taking into account zones of singularity

    NASA Astrophysics Data System (ADS)

    Rybak, L. A.; Khalapyan, S. Y.; Gaponenko, E. V.

    2018-03-01

    A method for determining the design characteristics of a parallel robot necessary to provide specified parameters of its working space that satisfy the controllability requirement is developed. The experimental verification of the proposed method was carried out using an approximate planar 3-RPR mechanism.

  5. Experimental verification of the influence of time-dependent material properties on long-term bridge characteristics.

    DOT National Transportation Integrated Search

    2006-08-01

    Post-tensioned cast-in-place box girder bridges are commonly used in California. Losses in tension in : the steel prestressing tendons used in these bridges occur over time due to creep and shrinkage of : concrete and relaxation of the tendons. The u...

  6. Apparatus for Teaching Physics: Linearizing a Nonlinear Spring.

    ERIC Educational Resources Information Center

    Wagner, Glenn

    1995-01-01

    Describes a method to eliminate the nonlinearity from a spring that is used in experimental verification of Hooke's Law where students are asked to determine the force constant and the linear equation that describes the extension of the spring as a function of the mass placed on it. (JRH)

  7. Testing of full-size reinforced concrete beams strengthened with FRP composites : experimental results and design methods verification(appendices)

    DOT National Transportation Integrated Search

    2000-06-01

    In 1997, a load rating of an historic reinforced concrete bridge in Oregon, Horsetail Creek Bridge, indicated substandard shear and moment capacities of the beams. As a result, the Bridge was strengthened with fiber reinforced polymer composites as a...

  8. Prototype automated post-MECO ascent I-load Verification Data Table

    NASA Technical Reports Server (NTRS)

    Lardas, George D.

    1990-01-01

    A prototype automated processor for quality assurance of Space Shuttle post-Main Engine Cut Off (MECO) ascent initialization parameters (I-loads) is described. The processor incorporates CLIPS rules adapted from the quality assurance criteria for the post-MECO ascent I-loads. Specifically, the criteria are implemented for nominal and abort targets, as given in the 'I-load Verification Data Table, Part 3, Post-MECO Ascent, Version 2.1, December 1989.' This processor, ivdt, compares a given I-load set with the stated mission design and quality assurance criteria. It determines which I-loads violate the stated criteria and presents a summary of I-loads that pass or fail the tests.

  9. Verification of Three Dimensional Triangular Prismatic Discrete Ordinates Transport Code ENSEMBLE-TRIZ by Comparison with Monte Carlo Code GMVP

    NASA Astrophysics Data System (ADS)

    Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi

    2014-06-01

    This paper deals with verification of the three dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe electric power sodium cooled reactor. Nuclear characteristics are calculated at beginning of cycle of an initial core and at beginning and end of cycle of the equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity.

  10. Quantum blind dual-signature scheme without arbitrator

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., an initial phase, a signing phase and a verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. It does not rely heavily on an arbitrator in the verification phase, as previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology.

  11. Modeling of Pressure Drop During Refrigerant Condensation in Pipe Minichannels

    NASA Astrophysics Data System (ADS)

    Sikora, Małgorzata; Bohdal, Tadeusz

    2017-12-01

    Investigation of refrigerant condensation in pipe minichannels is a very challenging and complicated issue. Because of the multitude of influencing factors, mathematical and computer modeling is very important; it allows calculations to be performed for many different refrigerants under different flow conditions. The large number of experimental results published in the literature allows experimental verification of the correctness of such models. In this work a mathematical model for the calculation of flow resistance during condensation of refrigerants in a pipe minichannel is presented. The model was developed from conservation equations. The results of the calculations were verified against the authors' own experimental results.

  12. Square wave voltammetry at the dropping mercury electrode: Experimental

    USGS Publications Warehouse

    Turner, J.A.; Christie, J.H.; Vukovic, M.; Osteryoung, R.A.

    1977-01-01

    Experimental verification of earlier theoretical work for square wave voltammetry at the dropping mercury electrode is given. Experiments using ferric oxalate and cadmium(II) in HCl confirm excellent agreement with theory. Experimental peak heights and peak widths are found to be within 2% of calculated results. An example of trace analysis using square wave voltammetry at the DME is presented. The technique is shown to have the same order of sensitivity as differential pulse polarography but is much faster to perform. A detection limit for cadmium in 0.1 M HCl for the system used here was 7 × 10^-8 M.

  13. Sound absorption by a Helmholtz resonator

    NASA Astrophysics Data System (ADS)

    Komkin, A. I.; Mironov, M. A.; Bykov, A. I.

    2017-07-01

    Absorption characteristics of a Helmholtz resonator positioned at the end wall of a circular duct are considered. The absorption coefficient of the resonator is experimentally investigated as a function of the diameter and length of the resonator neck and the depth of the resonator cavity. Based on experimental data, the linear analytic model of a Helmholtz resonator is verified, and the results of verification are used to determine the dissipative attached length of the resonator neck so as to provide the agreement between experimental and calculated data. Dependences of sound absorption by a Helmholtz resonator on its geometric parameters are obtained.
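    The linear lumped-element model being verified here has a closed-form resonance frequency that depends on exactly the geometric parameters studied (neck diameter and length, cavity depth). A minimal sketch, using an assumed textbook end correction rather than the dissipative attached length fitted in the paper; the function name and example dimensions are invented for illustration:

    ```python
    import math

    def helmholtz_f0(neck_diameter, neck_length, cavity_volume,
                     c=343.0, end_correction=1.7):
        """Textbook lumped-element Helmholtz resonance frequency:
        f0 = (c / 2*pi) * sqrt(S / (V * L_eff)).

        Lengths in metres, volume in m^3. The attached length
        end_correction * neck_radius (~1.7a for two flanged neck ends)
        is an assumed textbook value; the paper instead determines a
        dissipative attached length from experimental data.
        """
        a = neck_diameter / 2.0
        S = math.pi * a * a                       # neck cross-sectional area
        L_eff = neck_length + end_correction * a  # neck length + end corrections
        return (c / (2.0 * math.pi)) * math.sqrt(S / (cavity_volume * L_eff))

    # Example: 10 mm diameter, 20 mm long neck on a 100 cm^3 cavity
    f0 = helmholtz_f0(0.010, 0.020, 1.0e-4)       # roughly 290 Hz
    ```

    The sensitivity of `f0` to the attached length is why fitting it against measured absorption data, as done here, is an effective way to capture dissipation at the neck.
    
    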

  14. Experimental verification of the rainbow trapping effect in adiabatic plasmonic gratings

    PubMed Central

    Gan, Qiaoqiang; Gao, Yongkang; Wagner, Kyle; Vezenov, Dmitri; Ding, Yujie J.; Bartoli, Filbert J.

    2011-01-01

    We report the experimental observation of a trapped rainbow in adiabatically graded metallic gratings, designed to validate theoretical predictions for this unique plasmonic structure. One-dimensional graded nanogratings were fabricated and their surface dispersion properties tailored by varying the grating groove depth, whose dimensions were confirmed by atomic force microscopy. Tunable plasmonic bandgaps were observed experimentally, and direct optical measurements on graded grating structures show that light of different wavelengths in the 500–700-nm region is “trapped” at different positions along the grating, consistent with computer simulations, thus verifying the “rainbow” trapping effect. PMID:21402936

  15. Experimental Verification of the Theory of Wind-Tunnel Boundary Interference

    NASA Technical Reports Server (NTRS)

    Theodorsen, Theodore; Silverstein, Abe

    1935-01-01

    The results of an experimental investigation on the boundary-correction factor are presented in this report. The values of the boundary-correction factor from the theory, which at the present time is virtually completed, are given in the report for all conventional types of tunnels. With the isolation of certain disturbing effects, the experimental boundary-correction factor was found to be in satisfactory agreement with the theoretically predicted values, thus verifying the soundness and sufficiency of the theoretical analysis. The establishment of a considerable velocity distortion, in the nature of a unique blocking effect, constitutes a principal result of the investigation.

  16. Nonideal Rayleigh–Taylor mixing

    PubMed Central

    Lim, Hyunkyung; Iwerks, Justin; Glimm, James; Sharp, David H.

    2010-01-01

    Rayleigh–Taylor mixing is a classical hydrodynamic instability that occurs when a light fluid pushes against a heavy fluid. The two main sources of nonideal behavior in Rayleigh–Taylor (RT) mixing are regularizations (physical and numerical), which produce deviations from a pure Euler equation, scale invariant formulation, and nonideal (i.e., experimental) initial conditions. The Kolmogorov theory of turbulence predicts stirring at all length scales for the Euler fluid equations without regularization. We interpret mathematical theories of existence and nonuniqueness in this context, and we provide numerical evidence for dependence of the RT mixing rate on nonideal regularizations; in other words, indeterminacy when modeled by Euler equations. Operationally, indeterminacy shows up as nonunique solutions for RT mixing, parametrized by Schmidt and Prandtl numbers, in the large Reynolds number (Euler equation) limit. Verification and validation evidence is presented for the large eddy simulation algorithm used here. Mesh convergence depends on breaking the nonuniqueness with explicit use of the laminar Schmidt and Prandtl numbers and their turbulent counterparts, defined in terms of subgrid scale models. The dependence of the mixing rate on the Schmidt and Prandtl numbers and other physical parameters will be illustrated. We demonstrate numerically the influence of initial conditions on the mixing rate. Both the dominant short wavelength initial conditions and long wavelength perturbations are observed to play a role. By examination of two classes of experiments, we observe the absence of a single universal explanation, with long and short wavelength initial conditions, and the various physical and numerical regularizations contributing in different proportions in these two different contexts. PMID:20615983

  17. Benchmarking on Tsunami Currents with ComMIT

    NASA Astrophysics Data System (ADS)

    Sharghi vand, N.; Kanoglu, U.

    2015-12-01

    There were no standards for the validation and verification of tsunami numerical models before the 2004 Indian Ocean tsunami. Even so, a number of numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which are used in the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental and field benchmark problems aimed at estimating maximum runup, which are widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held on February 9-10, 2015 at Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on validation and verification of tsunami numerical models for tsunami currents. Three of the benchmark problems were: current measurements of the Japan 2011 tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316) developed by NCTR. The modeling results are compared with the required benchmark data, showing good agreement; the results are discussed. 
Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe)

  18. Near Wall Dynamics in Colloidal Suspensions Studied by Evanescent Wave Dynamic Light Scattering

    NASA Astrophysics Data System (ADS)

    Lang, Peter R.

    2011-03-01

    The dynamics of dispersed colloidal particles is slowed down, and becomes anisotropic, in the immediate vicinity of a flat wall due to the wall drag effect. Although theoretically predicted in the early 20th century, experimental verification of this effect for Brownian particles became possible only in the late 1980s. Since then a variety of experimental investigations of near-wall Brownian dynamics by evanescent wave dynamic light scattering (EWDLS) have been published. In this contribution the method of EWDLS is briefly introduced, and experiments at low and high colloid concentration for hard-sphere suspensions, together with the theoretical prediction for the measured initial slopes of correlation functions, are discussed. On increasing the particle concentration, the influence of the wall drag effect is found to diminish gradually, until it becomes negligible at volume fractions above ϕ ≈ 0.35. The effect that a wall exerts on the orientational dynamics was investigated for different kinds of colloids. Experiments, simulations and a virial expansion theory show that rotational dynamics is slowed down as well. However, the effect is prominent in EWDLS only if the particles' short axis is of the order of the evanescent wave penetration depth. The author acknowledges financial support from the EU through FP7, project Nanodirect (Grant No. NMP4-SL-2008-213948).

  19. Multi-Response Optimization of Granaticinic Acid Production by Endophytic Streptomyces thermoviolaceus NT1, Using Response Surface Methodology

    PubMed Central

    Roy, Sudipta; Halder, Suman Kumar; Banerjee, Debdulal

    2016-01-01

    Streptomyces thermoviolaceus NT1, an endophytic isolate, was studied for optimization of granaticinic acid production. It is an antimicrobial metabolite active even against drug-resistant bacteria. Different media, optimum glucose concentration, initial media pH, incubation temperature, incubation period, and inoculum size were among the selected parameters optimized in the one-variable-at-a-time (OVAT) approach, where glucose concentration, pH, and temperature were found to play a critical role in antibiotic production by this strain. Finally, the Box–Behnken experimental design (BBD) was employed with three key factors (selected after OVAT studies) for response surface methodological (RSM) analysis of this optimization study. RSM analysis revealed a multifactorial combination of glucose 0.38%, pH 7.02, and temperature 36.53 °C as the optimum conditions for maximum antimicrobial yield. Experimental verification of the model analysis led to 3.30-fold (61.35 mg/L as compared to 18.64 mg/L produced in the un-optimized condition) enhanced granaticinic acid production in ISP2 medium with 5% inoculum and a suitable incubation period of 10 days. Thus, the combined optimization study for maximum antibiotic production from Streptomyces thermoviolaceus NT1 was found to result in significantly higher yield, which might be exploited in industrial applications. PMID:28952581

  20. The Feasibility of Applying AC Driven Low-Temperature Plasma for Multi-Cycle Detonation Initiation

    NASA Astrophysics Data System (ADS)

    Zheng, Dianfeng

    2016-11-01

    Ignition is a key system in pulse detonation engines (PDE). As an advanced ignition method, nanosecond pulse discharge low-temperature plasma ignition is used in some combustion systems, and continuous alternating current (AC) driven low-temperature plasma using dielectric barrier discharge (DBD) is used for combustion assistance. However, continuous AC driven plasmas had not previously been used for ignition in pulse detonation engines. In this paper, experimental and numerical studies of a pneumatic valve PDE using an AC driven low-temperature plasma igniter are described. The pneumatic valve was jointly designed with the low-temperature plasma igniter, and numerical simulation of the cold-state flow field in the pneumatic valve showed that a complex, low-speed flow in the discharge area was beneficial for successful ignition. In the experiments, ethylene was used as the fuel and air as the oxidizing agent. Ignition by an AC driven low-temperature plasma achieved multi-cycle intermittent detonation combustion on a PDE; the working frequency of the PDE reached 15 Hz and the peak pressure of the detonation wave was approximately 2.0 MPa. This experimental verification of feasibility for PDE ignition expands the application field of AC driven low-temperature plasma. Supported by National Natural Science Foundation of China (No. 51176001)

  1. Optimization of the fabrication of novel stealth PLA-based nanoparticles by dispersion polymerization using D-optimal mixture design

    PubMed Central

    Adesina, Simeon K.; Wight, Scott A.; Akala, Emmanuel O.

    2015-01-01

    Purpose Nanoparticle size is important in drug delivery. Clearance of nanoparticles by cells of the reticuloendothelial system has been reported to increase with increasing particle size. Further, nanoparticles should be small enough to avoid lung or spleen filtering effects. Endocytosis and accumulation in tumor tissue by the enhanced permeability and retention effect are also processes that are influenced by particle size. We present the results of studies designed to optimize crosslinked biodegradable stealth polymeric nanoparticles fabricated by dispersion polymerization. Methods Nanoparticles were fabricated using different amounts of macromonomer, initiators, crosslinking agent and stabilizer in a dioxane/DMSO/water solvent system. Confirmation of nanoparticle formation was by scanning electron microscopy (SEM). Particle size was measured by dynamic light scattering (DLS). D-optimal mixture statistical experimental design was used for the experimental runs, followed by model generation (Scheffe polynomial) and optimization with the aid of computer software. Model verification was done by comparing particle size data of some suggested solutions to the predicted particle sizes. Results and Conclusion Data showed that average particle sizes follow the same trend as predicted by the model. Negative terms in the model corresponding to the crosslinking agent and stabilizer indicate the important factors for minimizing particle size. PMID:24059281
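
    The Scheffe polynomial named above can be sketched as follows. This is a hypothetical illustration, not the authors' fitted model: a quadratic mixture form over component fractions that sum to 1, with invented coefficients whose negative binary terms mimic the reported size-reducing effect of crosslinking agent and stabilizer.

```python
# Hypothetical sketch of a Scheffe quadratic mixture model, the model family named
# in the abstract. All coefficient values below are invented for illustration.

def scheffe_quadratic(x, beta_linear, beta_binary):
    """Predict y = sum_i b_i*x_i + sum_{i<j} b_ij*x_i*x_j
    from mixture component fractions x (which must sum to 1)."""
    assert abs(sum(x) - 1.0) < 1e-9, "mixture fractions must sum to 1"
    y = sum(b * xi for b, xi in zip(beta_linear, x))
    k = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            y += beta_binary[k] * x[i] * x[j]
            k += 1
    return y

# Three illustrative components (e.g. macromonomer, crosslinking agent, stabilizer);
# negative binary coefficients shrink the predicted particle size, mirroring the
# paper's observation that the crosslinker and stabilizer terms were negative.
beta_linear = [250.0, 180.0, 160.0]   # nm, hypothetical
beta_binary = [-120.0, -90.0, -60.0]  # nm, hypothetical (b12, b13, b23)
size = scheffe_quadratic([0.5, 0.3, 0.2], beta_linear, beta_binary)  # ~180.4 nm
```

    A D-optimal design chooses the mixture points at which to run experiments so that coefficients of exactly this kind of model are estimated with minimal variance.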

  2. Optimization of the fabrication of novel stealth PLA-based nanoparticles by dispersion polymerization using D-optimal mixture design.

    PubMed

    Adesina, Simeon K; Wight, Scott A; Akala, Emmanuel O

    2014-11-01

    Nanoparticle size is important in drug delivery. Clearance of nanoparticles by cells of the reticuloendothelial system has been reported to increase with increasing particle size. Further, nanoparticles should be small enough to avoid lung or spleen filtering effects. Endocytosis and accumulation in tumor tissue by the enhanced permeability and retention effect are also processes that are influenced by particle size. We present the results of studies designed to optimize cross-linked biodegradable stealth polymeric nanoparticles fabricated by dispersion polymerization. Nanoparticles were fabricated using different amounts of macromonomer, initiators, crosslinking agent and stabilizer in a dioxane/DMSO/water solvent system. Confirmation of nanoparticle formation was by scanning electron microscopy (SEM). Particle size was measured by dynamic light scattering (DLS). D-optimal mixture statistical experimental design was used for the experimental runs, followed by model generation (Scheffe polynomial) and optimization with the aid of computer software. Model verification was done by comparing particle size data of some suggested solutions to the predicted particle sizes. Data showed that average particle sizes follow the same trend as predicted by the model. Negative terms in the model corresponding to the cross-linking agent and stabilizer indicate the important factors for minimizing particle size.

  3. Liquid droplet radiator program at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Presler, A. F.; Coles, C. E.; Diem-Kirsop, P. S.; White, K. A., III

    1985-01-01

    The NASA Lewis Research Center and the Air Force Rocket Propulsion Laboratory (AFRPL) are jointly engaged in a program for technical assessment of the Liquid Droplet Radiator (LDR) concept as an advanced high performance heat ejection component for future space missions. NASA Lewis has responsibility for the technology needed for the droplet generator, for working fluid qualification, and for investigating the physics of droplets in space; NASA Lewis is also conducting systems/mission analyses for potential LDR applications with candidate space power systems. For the droplet generator technology task, both micro-orifice fabrication techniques and droplet stream formation processes have been experimentally investigated. High quality micro-orifices (to 50 micron diameter) are routinely fabricated with automated equipment. Droplet formation studies have established operating boundaries for the generation of controlled and uniform droplet streams. A test rig is currently being installed for the experimental verification, under simulated space conditions, of droplet radiation heat transfer performance analyses and the determination of the effective radiative emissivity of multiple droplet streams. Initial testing has begun in the NASA Lewis Zero-Gravity Facility for investigating droplet stream behavior in microgravity conditions. This includes the effect of orifice wetting on jet dynamics and droplet formation. Results for both Brayton and Stirling power cycles have identified favorable mass and size comparisons of the LDR with conventional radiator concepts.

  4. Influences of operational parameters on phosphorus removal in batch and continuous electrocoagulation process performance.

    PubMed

    Nguyen, Dinh Duc; Yoon, Yong Soo; Bui, Xuan Thanh; Kim, Sung Su; Chang, Soon Woong; Guo, Wenshan; Ngo, Huu Hao

    2017-11-01

    Performance of an electrocoagulation (EC) process in batch and continuous operating modes was thoroughly investigated and evaluated for enhancing wastewater phosphorus removal under various operating conditions, individually or combined: initial phosphorus concentration, wastewater conductivity, current density, and electrolysis time. The results revealed excellent phosphorus removal (72.7-100%) for both processes within 3-6 min of electrolysis, with relatively low energy requirements, i.e., less than 0.5 kWh/m3 of treated wastewater. However, the removal efficiency of phosphorus in the continuous EC operation mode was better than that in batch mode within the scope of the study. Additionally, the rate and efficiency of phosphorus removal strongly depended on operational parameters, including wastewater conductivity, initial phosphorus concentration, current density, and electrolysis time. Based on experimental data, statistical model verification using response surface methodology (RSM) (multiple-factor optimization) was also established to provide further insights and accurately describe the interactive relationship between the process variables, thus optimizing the EC process performance. The EC process using iron electrodes is promising for improving wastewater phosphorus removal efficiency, and RSM can be a sustainable tool for predicting the performance of the EC process and explaining the influence of the process variables.
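
    The specific-energy figure quoted above follows from the basic electrical relation E = U·I·t/V. As a hedged sketch with invented operating values (not the paper's cell parameters):

```python
# Illustrative check of an electrocoagulation specific-energy figure:
# E = U * I * t / V, expressed in kWh per cubic metre of treated water.
# All operating values below are hypothetical.

def specific_energy_kwh_per_m3(voltage_v, current_a, time_s, volume_m3):
    """Electrical energy per unit volume of treated wastewater."""
    energy_kwh = voltage_v * current_a * time_s / 3.6e6  # joules -> kWh
    return energy_kwh / volume_m3

# Hypothetical bench-scale run: 12 V cell, 0.5 A, 4 min electrolysis, 1 L batch.
e = specific_energy_kwh_per_m3(12.0, 0.5, 240.0, 0.001)  # -> 0.4 kWh/m^3
consistent = e < 0.5  # within the "< 0.5 kWh/m3" envelope reported above
```
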

  5. Minimum information about a biofilm experiment (MIABiE): standards for reporting experiments and data on sessile microbial communities living at interfaces.

    PubMed

    Lourenço, Anália; Coenye, Tom; Goeres, Darla M; Donelli, Gianfranco; Azevedo, Andreia S; Ceri, Howard; Coelho, Filipa L; Flemming, Hans-Curt; Juhna, Talis; Lopes, Susana P; Oliveira, Rosário; Oliver, Antonio; Shirtliff, Mark E; Sousa, Ana M; Stoodley, Paul; Pereira, Maria Olivia; Azevedo, Nuno F

    2014-04-01

    The minimum information about a biofilm experiment (MIABiE) initiative has arisen from the need to find an adequate and scientifically sound way to control the quality of the documentation accompanying the public deposition of biofilm-related data, particularly those obtained using high-throughput devices and techniques. Thereby, the MIABiE consortium has initiated the identification and organization of a set of modules containing the minimum information that needs to be reported to guarantee the interpretability and independent verification of experimental results and their integration with knowledge coming from other fields. MIABiE does not intend to propose specific standards on how biofilms experiments should be performed, because it is acknowledged that specific research questions require specific conditions which may deviate from any standardization. Instead, MIABiE presents guidelines about the data to be recorded and published in order for the procedure and results to be easily and unequivocally interpreted and reproduced. Overall, MIABiE opens up the discussion about a number of particular areas of interest and attempts to achieve a broad consensus about which biofilm data and metadata should be reported in scientific journals in a systematic, rigorous and understandable manner. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  6. Effects of magnetization on fusion product trapping and secondary neutron spectra

    DOE PAGES

    Knapp, Patrick F.; Schmit, Paul F.; Hansen, Stephanie B.; ...

    2015-05-14

    In magnetizing the fusion fuel in inertial confinement fusion (ICF) systems, we found that the required stagnation pressure and density can be relaxed dramatically. This happens because the magnetic field insulates the hot fuel from the cold pusher and traps the charged fusion burn products. This trapping allows the burn products to deposit their energy in the fuel, facilitating plasma self-heating. Here, we report on a comprehensive theory of this trapping in a cylindrical DD plasma magnetized with a purely axial magnetic field. Using this theory, we are able to show that the secondary fusion reactions can be used to infer the magnetic field-radius product, BR, during fusion burn. This parameter, not ρR, is the primary confinement parameter in magnetized ICF. Using this method, we analyze data from recent Magnetized Liner Inertial Fusion experiments conducted on the Z machine at Sandia National Laboratories. Furthermore, we show that in these experiments BR ≈ 0.34(+0.14/-0.06) MG · cm, a ~14× increase in BR from the initial value, confirming that the DD-fusion tritons are magnetized at stagnation. Lastly, this is the first experimental verification of charged burn product magnetization facilitated by compression of an initial seed magnetic flux.

  7. Modeling human response errors in synthetic flight simulator domain

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. The models will be verified experimentally in a flight handling-qualities simulation.

  8. Pyroelectric effect in triglycine sulphate single crystals - Differential measurement method

    NASA Astrophysics Data System (ADS)

    Trybus, M.

    2018-06-01

    A simple mathematical model of the pyroelectric phenomenon was used to explain the electric response of TGS (triglycine sulphate) samples during linear heating in the ferroelectric and paraelectric phases. Experimental verification of the mathematical model was carried out: TGS single crystals were grown and four-electrode samples were fabricated. Differential measurements of the pyroelectric response of two different regions of the samples were performed and the results were compared with data obtained from the model. Experimental results are in good agreement with model calculations.

  9. Proceedings of the Annual Symposium on Frequency Control (41st) Held in Philadelphia, Pennsylvania on 27-29 May 1987

    DTIC Science & Technology

    1987-05-29

    Controller A. Fig. 1: Experimental setup. P.S.D.: phase sensitive detector. V.C.X.O.: voltage controlled crystal oscillator. A: post-detector amplifier... the sampling period used in the experimental verification... samples were obtained using a pair of frequency counters controlled by a desk-top... reduce the effect of group delay changes. The first method can be implemented by actively or passively controlling the environment around

  10. Hawking radiation in an electromagnetic waveguide?

    PubMed

    Schützhold, Ralf; Unruh, William G

    2005-07-15

    It is demonstrated that the propagation of electromagnetic waves in an appropriately designed waveguide is (for large wavelengths) analogous to that within a curved space-time--such as around a black hole. As electromagnetic radiation (e.g., microwaves) can be controlled, amplified, and detected (with present-day technology) much easier than sound, for example, we propose a setup for the experimental verification of the Hawking effect. Apart from experimentally testing this striking prediction, this would facilitate the investigation of the trans-Planckian problem.

  11. Continuing Issues (FY 1979) Regarding DoD Use of the Space Transportation System.

    DTIC Science & Technology

    1979-12-01

    estimates) to the cost of launching experimental payloads in the sortie mode is the analytical verification of compatibility ("integration") of the experiment...with the Shuttle; the integration cost may be reduced by the Air Force by their proposed "class cargo" generalized integration analysis that, once...compactness and light weight (for a given experimental weight) rather than on intrinsic cost, to minimize launch costs as computed by the NASA volume and weight

  12. Toward Ada Verification: A Collection of Relevant Topics

    DTIC Science & Technology

    1986-06-01

    presumably it is this: if there are no default values, a programming error which results in failure to initialize a variable is more likely to advertise ... disadvantages to using AVID. First, TDL is a more complicated interface than first-order logic (as used in the CSG). Second, AVID is unsupported and

  13. Verification and Trust: Background Investigations Preceding Faculty Appointment

    ERIC Educational Resources Information Center

    Finkin, Matthew W.; Post, Robert C.; Thomson, Judith J.

    2004-01-01

    Many employers in the United States have responded to the terrorist attacks of September 11, 2001, by initiating or expanding policies requiring background checks of prospective employees. Their ability to perform such checks has been abetted by the growth of computerized databases and of commercial enterprises that facilitate access to personal…

  14. Verification and Trust: Background Investigations Preceding Faculty Appointment

    ERIC Educational Resources Information Center

    Academe, 2004

    2004-01-01

    Many employers in the United States have been initiating or expanding policies requiring background checks of prospective employees. The ability to perform such checks has been abetted by the growth of computerized databases and of commercial enterprises that facilitate access to personal information. Employers now have ready access to public…

  15. In vivo dose verification method in catheter based high dose rate brachytherapy.

    PubMed

    Jaselskė, Evelina; Adlienė, Diana; Rudžianskas, Viktoras; Urbonavičius, Benas Gabrielis; Inčiūra, Arturas

    2017-12-01

    In vivo dosimetry is a powerful tool for dose verification in radiotherapy. Its application in high dose rate (HDR) brachytherapy is usually limited to the estimation of gross errors, due to the inability of the dosimetry system/method to record non-uniform dose distributions in steep dose gradient fields close to the radioactive source. In vivo dose verification in interstitial catheter based HDR brachytherapy is crucial since the treatment is performed by inserting the radioactive source at certain positions within catheters that are pre-implanted into the tumour. We propose an in vivo dose verification method for this type of brachytherapy treatment which is based on the comparison between experimentally measured and theoretical dose values calculated at well-defined locations corresponding to dosemeter positions in the catheter. Dose measurements were performed using TLD 100-H rods (6 mm long, 1 mm diameter) inserted in certain sequences into an additionally pre-implanted dosimetry catheter. The dosemeter positioning in the catheter was adjusted using reconstructed CT scans of the patient with pre-implanted catheters. Doses to three head and neck cancer patients and one breast cancer patient were measured during several randomly selected treatment fractions. It was found that the average experimental dose error varied from 4.02% to 12.93% during independent in vivo dosimetry control measurements for the selected head and neck cancer patients and from 7.17% to 8.63% for the breast cancer patient. The average experimental dose error was below the AAPM recommended margin of 20% and did not exceed the measurement uncertainty of 17.87% estimated for this type of dosemeter. A tendency of slightly increasing average dose error was observed in each following treatment fraction of the same patient. 
It was linked to changes in the theoretically estimated dosemeter positions due to possible organ movement between treatment fractions, since catheter reconstruction was performed for the first treatment fraction only. These findings indicate potential for further average dose error reduction in catheter based brachytherapy by at least 2-3% if catheter locations are adjusted before each subsequent treatment fraction; however, this requires more detailed investigation. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
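
    The per-fraction error metric implied above can be sketched simply: the relative deviation of the measured (TLD) dose from the calculated dose at each dosemeter position, averaged over positions. The dose values below are invented for illustration, not the study's data.

```python
# Minimal sketch of an in vivo dose-error metric for HDR brachytherapy
# verification. All dose values are hypothetical.

def dose_errors_percent(measured, calculated):
    """Percent error per dosemeter position: 100*|D_meas - D_calc| / D_calc."""
    return [100.0 * abs(m - c) / c for m, c in zip(measured, calculated)]

def average_dose_error(measured, calculated):
    errs = dose_errors_percent(measured, calculated)
    return sum(errs) / len(errs)

# Hypothetical doses (Gy) at four TLD positions along the dosimetry catheter:
measured = [4.21, 3.90, 3.55, 3.10]
calculated = [4.00, 4.10, 3.40, 3.00]
avg = average_dose_error(measured, calculated)   # ~4.47 %
gross_error = avg > 20.0                         # AAPM-style 20% action level
```
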

  16. Experimental Verification of the Structural Glass Beam-Columns Strength

    NASA Astrophysics Data System (ADS)

    Pešek, Ondřej; Melcher, Jindřich; Balázs, Ivan

    2017-10-01

    This paper deals with experimental research on axially and laterally loaded members made of structural (laminated) glass. The purpose of the research is the evaluation of buckling strength and the actual behaviour of beam-columns, motivated by the absence of standards for the design of glass load-bearing structures. The experimental research follows previous work focused on measuring the initial geometrical imperfections of glass members and on testing glass beams and columns. Within the frame of the research, 9 specimens were tested. All of them were of the same geometry (length 2000 mm, width 200 mm and thickness 16 mm) but different composition: laminated double glass made of annealed glass or fully tempered glass panes bonded together by PVB or EVASAFE foil. Specimens were first loaded by an axial force and then by a constantly increasing bending moment up to failure. During testing, lateral deflections, vertical deflection and normal stresses at mid-span were measured. The maximum load achieved during testing was adopted as the flexural-lateral-torsional buckling strength. The results of the experiments were statistically evaluated according to the European standard for design of structures EN 1990, Annex D. There are significant differences between specimens made of annealed glass and those made of fully tempered glass. Differences between specimens loaded by axial forces of 1 kN and 2 kN are negligible. The next step is to determine the design strength by a calculation procedure based on the buckling curves approach intended for the design of steel columns and to develop an interaction criterion for glass beam-columns.

  17. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-02

    The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop, as well as equations for the resolution between single solute peaks of the neighboring cycles, for the resolution of peaks in the recycling chromatogram and for the resolution between the chromatograms of the neighboring cycles are presented. It is shown that, unlike conventional chromatography, increasing of the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment. Copyright © 2016 Elsevier B.V. All rights reserved.
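
    The resolution quantities discussed above can be illustrated with the standard chromatographic definition Rs = 2(t2 - t1)/(w1 + w2). The sketch below is a hedged, textbook-level idealization (retention-time separation growing linearly with cycle number while band widths grow with its square root), not the authors' cell model, and all numbers are invented.

```python
# Hedged sketch of why closed-loop recycling can improve separation, using the
# standard resolution Rs = 2*(t2 - t1)/(w1 + w2). The sqrt-of-cycles band
# broadening is a textbook idealization; all values are hypothetical.

def resolution(t1, w1, t2, w2):
    """Resolution between two peaks (retention times t, baseline widths w)."""
    return 2.0 * (t2 - t1) / (w1 + w2)

def resolution_after_cycles(n, dt, w):
    """After n cycles the retention-time gap grows ~n, widths grow ~sqrt(n)."""
    width_n = w * n ** 0.5
    return resolution(0.0, width_n, n * dt, width_n)

rs1 = resolution_after_cycles(1, 1.0, 2.0)  # single pass
rs4 = resolution_after_cycles(4, 1.0, 2.0)  # four cycles: Rs scales ~ sqrt(n)
```
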

  18. [Optimization of vacuum belt drying process of Gardeniae Fructus in Reduning injection by Box-Behnken design-response surface methodology].

    PubMed

    Huang, Dao-sheng; Shi, Wei; Han, Lei; Sun, Ke; Chen, Guang-bo; Wu Jian-xiong; Xu, Gui-hong; Bi, Yu-an; Wang, Zhen-zhong; Xiao, Wei

    2015-06-01

    To optimize the vacuum belt drying process conditions for Gardeniae Fructus extract from Reduning injection by Box-Behnken design-response surface methodology, a three-factor, three-level Box-Behnken experimental design was employed on the basis of single-factor experiments. With drying temperature, drying time and feeding speed as independent variables and the content of geniposide as the dependent variable, the experimental data were fitted to a second-order polynomial equation, establishing the mathematical relationship between the content of geniposide and the respective variables. With the experimental data analyzed by Design-Expert 8.0.6, the optimal drying parameters were as follows: drying temperature 98.5 degrees C, drying time 89 min, and feeding speed 99.8 r x min(-1). Three verification experiments were carried out under these conditions and the measured average content of geniposide was 564.108 mg x g(-1), which was close to the model prediction of 563.307 mg x g(-1). According to the verification tests, the Gardeniae Fructus belt drying process is stable and feasible. Thus, single-factor experiments combined with response surface methodology (RSM) could be used to optimize the drying technology of Gardeniae Fructus extract from Reduning injection.
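
    The second-order model fitted in such Box-Behnken studies can be searched for its optimum once the coefficients are known. The sketch below uses invented coefficients on coded factors in [-1, 1] (temperature, time, feeding speed), not the study's fitted equation, and a coarse grid search rather than Design-Expert's analytic stationary-point solution.

```python
# Hedged sketch: locating the optimum of a fitted Box-Behnken second-order
# response-surface model by grid search over the coded design space.
# All coefficients below are invented for illustration.

from itertools import product

def quadratic_model(x, b0, lin, quad, inter):
    """Second-order RSM model: y = b0 + Σ bi·xi + Σ bii·xi² + Σ bij·xi·xj."""
    y = b0 + sum(b * xi for b, xi in zip(lin, x))
    y += sum(b * xi * xi for b, xi in zip(quad, x))
    pairs = [(0, 1), (0, 2), (1, 2)]
    y += sum(b * x[i] * x[j] for b, (i, j) in zip(inter, pairs))
    return y

# Negative pure-quadratic terms give the model an interior maximum:
b0, lin = 560.0, [6.0, 4.0, 2.0]
quad, inter = [-10.0, -8.0, -5.0], [1.0, 0.5, 0.25]

grid = [i / 10 for i in range(-10, 11)]  # coded levels -1.0 ... 1.0
best = max(product(grid, repeat=3),
           key=lambda x: quadratic_model(x, b0, lin, quad, inter))
y_best = quadratic_model(best, b0, lin, quad, inter)  # optimum on the grid
```

    In practice the coefficients come from least-squares fitting of the Box-Behnken runs, and software such as Design-Expert solves for the stationary point analytically; the grid search is only a transparent stand-in.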

  19. Partial defect verification of spent fuel assemblies by PDET: Principle and field testing in Interim Spent fuel Storage Facility (CLAB) in Sweden

    DOE PAGES

    Ham, Y.; Kerr, P.; Sitaraman, S.; ...

    2016-05-05

    Here, the need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities, as diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly more important and even urgent as many countries have started to transfer spent fuel to so called "difficult-to-access" areas such as dry storage casks, reprocessing or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into "difficult-to-access" areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies without use of operator data, and further reported the validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust as it is relatively invariant to the characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17×17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development is presented, followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed at a commercial spent fuel storage site known as the Central Interim Spent Fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator declared average burnup of the assembly as well as intra-assembly burnup levels.

  20. Partial Defect Verification of Spent Fuel Assemblies by PDET: Principle and Field Testing in Interim Spent Fuel Storage Facility (CLAB) in Sweden

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ham, Y.S.; Kerr, P.; Sitaraman, S.

    The need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities, as diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly more important and even urgent as many countries have started to transfer spent fuel to so called 'difficult-to-access' areas such as dry storage casks, reprocessing or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into 'difficult-to-access' areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies without use of operator data, and further reported the validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust as it is relatively invariant to the characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17x17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development is presented, followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed at a commercial spent fuel storage site known as the Central Interim Spent Fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator declared average burnup of the assembly as well as intra-assembly burnup levels. (authors)

  2. Mineral Bionization - Surface Chemical Modeling of the Emergence of Life

    NASA Astrophysics Data System (ADS)

    Arrhenius, G.

    2001-12-01

    The earliest stages in entering an RNA-world require natural mechanisms that are capable of selective concentration of simple aldehydes from dilute solution in the environment (4), followed by phosphorylation of the sequestered aldehydes (2) and their catalytic condensation to form, selectively, tetrose- (threose) or pentose- (ribose) phosphate (3); the latter representing the R in RNA. A variety of common positively charged sheet structure minerals (mixed valence double layer metal hydroxide minerals such as hydrotalcite and green rust) have proven to be remarkably capable of performing these crucial tasks under simplified natural conditions (1). These prebiotic model reactions have demonstrated plausible closure of the gap, previously thought to preclude the natural formation of nucleoside phosphates, the backbone components of the information carrying genetic material. Pioneering research by other workers (5) has demonstrated the feasibility of necessary further steps in the chain toward functional RNA; mineral (montmorillonite) catalyzed oligomerization of nucleotides, the formation of complementary RNA strands (6) and the enzymatic activity of RNA (ribozymes). These contributions have placed the initially conjectural concept of an initial RNA-world on an experimental footing. Remaining problems include the initial transfer of information to spontaneously forming RNA, sufficient to convey biofunctionality (7). In this central problem, too, mineral surface interactions may be speculated to play a natural role; a question that is open to experimental verification. References: 1. Pitsch, S.; Eschenmoser, A.; Gedulin, B.; Hui, S.; Arrhenius, G. Origins Life Evol. Biosphere 1994, 24 (5), 389. 2. Kolb, V.; Zhang, S.; Xu, Y.; Arrhenius, G. Origins Life Evol. Biosphere 1997, 27, 485. 3. Krishnamurthy, R.; Pitsch, S.; Arrhenius, G. Origins Life Evol. Biosphere 1999, 29, 139. 4. Pitsch, S.; Krishnamurthy, R.; Arrhenius, G. Helv. Chim. Acta 2000, 83, 2398. 5. Ferris, J. P.; Ertem, G. J. Am. Chem. Soc. 1993, 115, 1270. 6. Orgel, L. E. J. Theoretical Biol. 1986, 123, 127-149. 7. Arrhenius, G. Life out of Chaos. In: Palyi et al., eds. Fundamentals of Life. Elsevier, Paris, 2001.

  3. Numerical Simulation of the Fluid-Structure Interaction of a Surface Effect Ship Bow Seal

    NASA Astrophysics Data System (ADS)

    Bloxom, Andrew L.

    Numerical simulations of fluid-structure interaction (FSI) problems were performed in an effort to verify and validate a commercially available FSI tool. This tool uses an iterative partitioned coupling scheme between CD-adapco's STAR-CCM+ finite volume fluid solver and Simulia's Abaqus finite element structural solver to simulate the FSI response of a system. Preliminary verification and validation (V&V) work was carried out to understand the numerical behavior of the codes individually and together as an FSI tool. The completed V&V work included code order verification of the respective fluid and structural solvers with Couette-Poiseuille flow and Euler-Bernoulli beam theory. These results confirmed the second-order accuracy of the spatial discretizations used. Following that, a mixture of solution verifications and model calibrations was performed with the inclusion of the physics models implemented in the solution of the FSI problems. Solution verifications were completed for fluid and structural stand-alone models as well as for the coupled FSI solutions. These results re-confirmed the spatial order of accuracy, but for more complex flows and physics models, as well as the order of accuracy of the temporal discretizations. In lieu of a good material definition, model calibration was performed to reproduce the experimental results. This work used model calibration for both instances of hyperelastic materials presented in the literature as validation cases, because those materials had been defined only as linear elastic. Calibrated, three-dimensional models of the bow seal on the University of Michigan bow seal test platform showed the ability to reproduce the experimental results qualitatively through averaging of the forces and seal displacements. These simulations represent the only current 3D results for this case.
One significant result of this study is the ability to visualize the flow around the seal and to directly measure the seal resistances at varying cushion pressures, seal immersions, forward speeds, and different seal materials. SES design analysis could greatly benefit from the inclusion of flexible seals in simulations, and this work is a positive step in that direction. In future work, the inclusion of more complex seal geometries and contact will further enhance the capability of this tool.
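
    The code order verification described above rests on a standard grid-refinement estimate of the observed order of accuracy. A minimal sketch (the solver values below are made up for illustration; the formula itself is the standard Richardson-style estimate):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three systematically
    refined grids with constant refinement ratio r."""
    return (math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine))
            / math.log(r))

# Hypothetical solution values converging at second order with r = 2:
# the error shrinks by r**2 = 4 with each refinement.
coarse, medium, fine = 1.04, 1.01, 1.0025
p = observed_order(coarse, medium, fine, r=2.0)
print(round(p, 3))  # → 2.0
```

    An observed order matching the scheme's formal order (here, 2) is the acceptance criterion in such code verification exercises.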

  4. Development of an automated on-line pepsin digestion-liquid chromatography-tandem mass spectrometry configuration for the rapid analysis of protein adducts of chemical warfare agents.

    PubMed

    Carol-Visser, Jeroen; van der Schans, Marcel; Fidder, Alex; Hulst, Albert G; van Baar, Ben L M; Irth, Hubertus; Noort, Daan

    2008-07-01

    Rapid monitoring and retrospective verification are key issues in protection against and non-proliferation of chemical warfare agents (CWA). Such monitoring and verification are adequately accomplished by the analysis of persistent protein adducts of these agents. Liquid chromatography-mass spectrometry (LC-MS) is the tool of choice in the analysis of such protein adducts, but the overall experimental procedure is quite elaborate. Therefore, an automated on-line pepsin digestion-LC-MS configuration has been developed for the rapid determination of CWA protein adducts. The utility of this configuration is demonstrated by the analysis of specific adducts of sarin and sulfur mustard to human butyryl cholinesterase and human serum albumin, respectively.

  5. Concept Verification Test - Evaluation of Spacelab/Payload operation concepts

    NASA Technical Reports Server (NTRS)

    Mcbrayer, R. O.; Watters, H. H.

    1977-01-01

    The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory, a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and a Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed, two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.

  6. Verification of the CFD simulation system SAUNA for complex aircraft configurations

    NASA Astrophysics Data System (ADS)

    Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.

    1994-04-01

    This paper is concerned with the verification, for complex aircraft configurations, of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.

  7. GPS and Galileo Developments on Board the International Space Station With the Space Communications and Navigation (SCaN) Testbed

    NASA Technical Reports Server (NTRS)

    Pozzobon, Oscar; Fantinato, Samuele; Dalla Chiara, Andrea; Gamba, Giovanni; Crisci, Massimo; Giordana, Pietro; Enderle, Werner; Chelmins, David; Sands, Obed S.; Clapper, Carolyn J.; hide

    2016-01-01

    The Space Communications and Navigation (SCaN) testbed is a facility developed by NASA and hosted on board the International Space Station (ISS) on an external truss since 2013. It has the objective of testing navigation and communication experiments with a Software Defined Radio (SDR) approach, which permits software updates for new experiments. NASA has developed the Space Telecommunications Radio System (STRS) architecture standard for SDRs used in space and ground-based platforms to provide commonality among radio developments and enhanced capability. The hardware is equipped with L-band front-end radios, and the NASA space network communicates with it using S-band, Ku-band and Ka-band links. In May 2016 Qascom started GARISS (GPS and Galileo Receiver for the ISS), an experimentation activity in collaboration with ESA and NASA that has the objective to develop and validate the acquisition and processing of combined GPS and Galileo signals on board the ISS SCaN testbed. This paper presents the mission and provides preliminary details about the challenges in the design, development and verification of the waveform that will be installed on equipment with limited resources. GARISS is also the first attempt to develop a waveform for the ISS as part of an international collaboration between the US and Europe. Although the final mission objective is to target dual-frequency processing, initial operations will foresee single-frequency processing. Initial results and the trade-off between the two options, as well as the final decision, will be presented and discussed. The limited resources on board the SCaN with respect to the challenging requirements to acquire and track two satellite navigation systems contemporaneously, with different modulations and data structures, led to the need to assess the possibility of aiding from the ground through the S-band.
    This option would allow assistance to the space receiver in order to provide knowledge of GNSS orbits and reduce the processing on board. Trade-offs and various options for telemetry and uplink data are presented and discussed. Finally, integration and validation of the waveform are among the major challenges of GARISS: the Experiment Development System (EDS) and the Ground Integration Unit (GIU) will be used for verification and validation prior to conducting the experiment on the ISS. The EDS can be used in a lab environment and allows prototyping and verification activities with the simulator, but does not include all hardware components. The GIU, on the other hand, is the flight model which replicates the flying equipment, but has limited flexibility for testing. In conclusion, the project is now approaching the Preliminary Design Review (PDR) and only preliminary results are available. This paper is an opportunity to present the GARISS mission as part of an international cooperation between ESA, NASA and Qascom. The preliminary results include GPS and Galileo processing from space signals, the challenges and trade-off decisions, the high-level STRS architecture and the foreseen experimentation campaign. Detailed results from the test campaigns are expected in 2017.

  8. Fatigue and fracture: Overview

    NASA Technical Reports Server (NTRS)

    Halford, G. R.

    1984-01-01

    A brief overview of the status of the fatigue and fracture programs is given. The programs involve the development of appropriate analytic material behavior models for cyclic stress-strain-temperature-time response, cyclic crack initiation, and cyclic crack propagation. The underlying thrust of these programs is the development and verification of workable engineering methods for the calculation, in advance of service, of the local cyclic stress-strain response at the critical life-governing location in hot section components, and the resultant crack initiation and crack growth lifetimes.

  9. Case-Study of the High School Student's Family Values Formation

    ERIC Educational Resources Information Center

    Valeeva, Roza A.; Korolyeva, Natalya E.; Sakhapova, Farida Kh.

    2016-01-01

    The aim of the research is the theoretical justification and experimental verification of content, complex forms and methods to ensure effective development of the high school students' family values formation. 93 lyceum students from Kazan took part in the experiment. To study students' family values we have applied method of studying personality…

  10. Bullying in School: Case Study of Prevention and Psycho-Pedagogical Correction

    ERIC Educational Resources Information Center

    Ribakova, Laysan A.; Valeeva, Roza A.; Merker, Natalia

    2016-01-01

    The purpose of the study was the theoretical justification and experimental verification of content, complex forms and methods to ensure effective prevention and psycho-pedagogical correction of bullying in school. 53 teenage students from Kazan took part in the experiment. A complex of diagnostic techniques for the detection of violence and…

  11. Experimental Verification of a Pneumatic Transport System for the Rapid Evacuation of Tunnels, Part II - Test Program

    DOT National Transportation Integrated Search

    1978-12-01

    This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...

  12. Conservation of Mechanical and Electric Energy: Simple Experimental Verification

    ERIC Educational Resources Information Center

    Ponikvar, D.; Planinsic, G.

    2009-01-01

    Two similar experiments on conservation of energy and transformation of mechanical into electrical energy are presented. Both can be used in classes, as they offer numerous possibilities for discussion with students and are simple to perform. Results are presented and are precise within 20% for the version of the experiment where measured values…

  13. Shuttle structural dynamics characteristics: The analysis and verification

    NASA Technical Reports Server (NTRS)

    Modlin, C. T., Jr.; Zupp, G. A., Jr.

    1985-01-01

    The space shuttle introduced a new dimension in the complexity of the structural dynamics of a space vehicle. The four-body configuration exhibited structural frequencies as low as 2 hertz with a modal density on the order of 10 modes per hertz. In the verification process, certain mode shapes and frequencies were identified by the users as more important than others and, as such, the test objectives were oriented toward experimentally extracting those modes and frequencies for analysis and test correlation purposes. To provide the necessary experimental data, a series of ground vibration tests (GVT's) was conducted using test articles ranging from the 1/4-scale structural replica of the space shuttle to the full-scale vehicle. The vibration test and analysis program revealed that the mode shape and frequency correlations below 10 hertz were good. The quality of correlation of modes between 10 and 20 hertz ranged from good to fair and that of modes above 20 hertz ranged from poor to good. Since the most important modes, based on user preference, were below 10 hertz, it was judged that the shuttle structural dynamic models were adequate for flight certification.
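
    The record does not name the correlation measure used between test and analysis modes; a standard choice for such comparisons is the Modal Assurance Criterion (MAC), sketched here with hypothetical mode shapes:

```python
def mac(phi_a, phi_x):
    """Modal Assurance Criterion between an analytical and an experimental
    mode shape: 1.0 means perfectly correlated, 0.0 means orthogonal."""
    dot_ax = sum(a * x for a, x in zip(phi_a, phi_x))
    dot_aa = sum(a * a for a in phi_a)
    dot_xx = sum(x * x for x in phi_x)
    return dot_ax ** 2 / (dot_aa * dot_xx)

# Hypothetical first bending mode from the model and from a ground
# vibration test (GVT), sampled at six stations:
analytical   = [0.0, 0.31, 0.59, 0.81, 0.95, 1.0]
experimental = [0.0, 0.30, 0.61, 0.79, 0.96, 1.0]
print(round(mac(analytical, experimental), 4))  # close to 1: well correlated
```

    In practice a MAC near 1 for the low-frequency modes would correspond to the "good" correlation below 10 hertz reported above.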

  14. Mechanical verification of soft-tissue attachment on bioactive glasses and titanium implants.

    PubMed

    Zhao, Desheng; Moritz, Niko; Vedel, Erik; Hupa, Leena; Aro, Hannu T

    2008-07-01

    Soft-tissue attachment is a desired feature of many clinical biomaterials. The aim of the current study was to design a suitable experimental method for tensile testing of implant incorporation with soft tissues. Conical implants were made of three compositions of bioactive glass (SiO2-P2O5-B2O3-Na2O-K2O-CaO-MgO) or titanium fiber mesh (porosity 84.7%). The implants were surgically inserted into the dorsal subcutaneous soft tissue or back muscles in the rat. Soft-tissue attachment was evaluated by pull-out testing using a custom-made jig 8 weeks after implantation. Titanium fiber mesh implants had developed a relatively high pull-out force in subcutaneous tissue (12.33 ± 5.29 N, mean ± SD) and also measurable attachment with muscle tissue (2.46 ± 1.33 N). The bioactive glass implants failed to show mechanically relevant soft-tissue bonding. The experimental set-up of mechanical testing seems to be feasible for verification studies of soft-tissue attachment. The inexpensive small animal model is beneficial for large-scale in vivo screening of new biomaterials.

  15. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel in complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
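
    The automated statistical tests are not detailed in the abstract; one common consistency check between a vectorized model's output histogram and a reference Geant4 histogram is a Pearson chi-square statistic, sketched here with made-up counts:

```python
def chi_square(observed, expected):
    """Pearson chi-square statistic between two binned distributions
    (bins with zero expected count are skipped)."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

# Hypothetical energy-deposit histograms: vectorized model vs. reference.
reference  = [100, 250, 400, 250, 100]
vectorized = [ 98, 255, 395, 252, 104]
stat = chi_square(vectorized, reference)
dof = len(reference) - 1
print(round(stat, 4), dof)  # a statistic well below dof suggests consistency
```

    A full test would convert the statistic to a p-value and flag distributions whose p-value falls below a chosen significance level.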

  16. [Study on balance group in steady-state extraction process of Chinese medicine and experimental verification to Houttuynia cordata].

    PubMed

    Liu, Wenlong; Zhang, Xili; He, Fuyuan; Zhang, Ping; Wang, Haiqin; Wu, Dezhi; Chen, Zuohong

    2011-11-01

    To establish and experimentally verify a mathematical model of the balance groups underlying the steady state of traditional Chinese medicine extraction. Using entropy and the genetic principles of statistics, and taking as a pivot the coefficient of variation of the GC fingerprint of the volatile oil of Houttuynia cordata between strains from the same GAP site, a mathematical model of the balance groups was established and verified. A mathematical model suitable for the balance groups of the steady state of traditional Chinese medicine extraction and preparation was obtained, and a balance group of 29 683 strains (approximately 118.7 kg) was identified with H. cordata of the same origin as the model drug. Under the GAP quality-control model, controlling quality stability through the Hardy-Weinberg balance groups of H. cordata between strains establishes a new theoretical and experimental foundation for the steady state and quality control of traditional Chinese medicine extraction.
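
    The Hardy-Weinberg balance invoked in the abstract can be illustrated numerically; this sketch (with hypothetical genotype counts) computes the genotype distribution expected at equilibrium from observed counts:

```python
def hardy_weinberg_expected(n_AA, n_Aa, n_aa):
    """Expected genotype counts under Hardy-Weinberg equilibrium given
    observed counts of the three genotypes at a biallelic locus."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # frequency of allele A
    q = 1 - p
    return p * p * n, 2 * p * q * n, q * q * n

# Hypothetical genotype counts in a population of 1000 plants:
exp_AA, exp_Aa, exp_aa = hardy_weinberg_expected(360, 480, 160)
print(round(exp_AA), round(exp_Aa), round(exp_aa))  # → 360 480 160
```

    Observed counts close to the expected ones (as in this constructed example) indicate a population in equilibrium, i.e. a stable "balance group".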

  17. Learning Experience on Transformer Using HOT Lab for Pre-service Physics Teacher’s

    NASA Astrophysics Data System (ADS)

    Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.

    2017-09-01

    This study investigated pre-service physics teachers' improvement in critical thinking skills through a Higher Order Thinking (HOT) Lab on transformer learning. The research used a mixed-methods design with an embedded experimental model. The subjects were 60 Physics Education students at UIN Sunan Gunung Djati Bandung. Analysis of the practicum reports and observation sheets showed that students in the experimental group carried out the practicum better and could solve the real problem, whereas the control group could not. The critical thinking skills of students in the HOT Lab were higher than those in the verification lab. Critical thinking skills could increase because the HOT Lab's problem-solving basis develops higher-order thinking through laboratory activities. It was therefore concluded that the HOT Lab is more effective than a verification lab at improving students' thinking skills in transformer topic learning. Finally, the HOT Lab can be implemented in other subjects and could be used to improve other higher-order thinking skills.

  18. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    NASA Astrophysics Data System (ADS)

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

    Underactuated systems are characterized by fewer control inputs than degrees of freedom, m < n. The determination of an input control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on differential flatness of the problem. The flatness-based solution denotes that all the 2n states and m control inputs can be algebraically expressed in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc, n = 2 and m = 1. Experimental verification of the inverse simulation control methodology is reported.
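
    For the two-disc example the flatness-based solution admits a compact numeric sketch. The parameter values, trajectory, and finite-difference scheme below are illustrative assumptions, not the paper's index-three DAE formulation:

```python
import math

def control_torque(theta2_fn, t, J1=1.0, J2=1.0, k=2.0, h=1e-2):
    """Flatness-based inverse dynamics for two discs coupled by a torsional
    spring. Disc 2 must follow theta2_fn; the control torque acts on disc 1.

      disc 2:  J2*th2'' = k*(th1 - th2)       ->  th1 = th2 + J2*th2''/k
      disc 1:  J1*th1'' = tau + k*(th2 - th1) ->  tau = J1*th1'' - k*(th2 - th1)
    """
    d2 = lambda f, x: (f(x + h) - 2.0 * f(x) + f(x - h)) / h ** 2
    theta1 = lambda x: theta2_fn(x) + J2 * d2(theta2_fn, x) / k
    return J1 * d2(theta1, t) - k * (theta2_fn(t) - theta1(t))

# Prescribed output theta2(t) = sin(t); for these parameters the analytic
# torque is -1.5*sin(t), so tau at t = pi/2 should be close to -1.5.
tau = control_torque(math.sin, math.pi / 2)
print(round(tau, 3))  # → -1.5
```

    Note that the torque depends on derivatives of the specified output up to fourth order, which is why the approach stops being practical for more complex systems.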

  19. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    PubMed

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor-filtering-based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can substantially reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score-level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without substantially increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
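
    The paper's exact fusion rule is not reproduced in the abstract; the following is a hypothetical sketch of a score-level binary fusion that keeps the post-reconstruction distance only when reconstruction helps markedly (the margin parameter is an assumption of this sketch):

```python
def fused_distance(d_before, d_after, margin=0.15):
    """Hypothetical score-level binary fusion: keep the post-reconstruction
    matching distance only when reconstruction shrank the distance by more
    than a relative margin (suggesting a genuine sample distorted by pose
    variation); otherwise keep the original distance."""
    if d_before - d_after > margin * d_before:
        return d_after
    return d_before

# Genuine query whose distance drops sharply after reconstruction:
print(fused_distance(0.62, 0.35))  # → 0.35
# Impostor query barely helped by reconstruction:
print(fused_distance(0.80, 0.76))  # → 0.8
```

    The binary choice reflects the idea in the abstract: benefit genuine queries hurt by pose variation while leaving impostor distances largely untouched.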

  20. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing.

    PubMed

    Yassin, Ali A

    2014-01-01

    The security of digital images is considered more and more essential, and the fingerprint plays a main role in image-based authentication. Fingerprint recognition is a scheme of biometric verification that applies pattern recognition techniques to the image of an individual's fingerprint. In the cloud environment, an adversary has the ability to intercept information, which must therefore be secured from eavesdroppers. Unfortunately, encryption and decryption functions are slow and often costly. Fingerprint techniques require extra hardware and software and can be masqueraded by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new scheme of fingerprint verification. Our proposed scheme can overcome these problems; it requires little cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme has good performance for user fingerprint verification.
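
    The scheme's construction is not detailed in the abstract; below is a minimal sketch of the core idea only: a one-level 2D Haar wavelet transform in which just the approximation (LL) subband is ciphered, with an XOR keystream standing in for a real cipher. All values are hypothetical:

```python
import random

def haar_subbands(img):
    """One-level 2D Haar transform over 2x2 blocks of an even-sized
    grayscale image: returns approximation LL and details LH, HL, HH."""
    LL, LH, HL, HH = [], [], [], []
    for i in range(0, len(img), 2):
        ll, lh, hl, hh = [], [], [], []
        for j in range(0, len(img[0]), 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            ll.append((a + b + c + d) / 4)   # approximation
            lh.append((a + b - c - d) / 4)   # vertical detail
            hl.append((a - b + c - d) / 4)   # horizontal detail
            hh.append((a - b - c + d) / 4)   # diagonal detail
        LL.append(ll); LH.append(lh); HL.append(hl); HH.append(hh)
    return LL, LH, HL, HH

def encrypt_ll(LL, seed):
    """Partial encryption: cipher only the LL coefficients with an XOR
    keystream (a stand-in for a real cipher); details stay in the clear."""
    rng = random.Random(seed)
    return [[int(v) ^ rng.getrandbits(8) for v in row] for row in LL]

img = [[52, 55, 61, 59],
       [79, 61, 76, 41],
       [38, 48, 10, 24],
       [90, 60, 68, 12]]
LL, LH, HL, HH = haar_subbands(img)
print(LL)  # only these 4 of 16 coefficients get ciphered
enc = encrypt_ll(LL, seed=7)
```

    Encrypting only the LL subband protects the visually dominant content while ciphering a quarter of the coefficients, which is the efficiency argument behind partial encryption.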

  2. Quantum money with nearly optimal error tolerance

    NASA Astrophysics Data System (ADS)

    Amiri, Ryan; Arrazola, Juan Miguel

    2017-06-01

    We present a family of quantum money schemes with classical verification which display a number of benefits over previous proposals. Our schemes are based on hidden matching quantum retrieval games and they tolerate noise up to 23%, which we conjecture reaches 25% asymptotically as the dimension of the underlying hidden matching states is increased. Furthermore, we prove that 25% is the maximum tolerable noise for a wide class of quantum money schemes with classical verification, meaning our schemes are almost optimally noise tolerant. We use methods in semidefinite programming to prove security in a substantially different manner to previous proposals, leading to two main advantages: first, coin verification involves only a constant number of states (with respect to coin size), thereby allowing for smaller coins; second, the reusability of coins within our scheme grows linearly with the size of the coin, which is known to be optimal. Last, we suggest methods by which the coins in our protocol could be implemented using weak coherent states and verified using existing experimental techniques, even in the presence of detector inefficiencies.

  3. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modernized society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for a given person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark signature databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves an equal error rate of 5.91%, 5.62% and 3.91% on skilled forgeries, respectively. PMID:24696797
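
    The pole-zero modeling step is not reproduced here; this sketch shows only the transform stage, computing a type-II DCT of a hypothetical stroke-velocity sequence and keeping the leading coefficients as compact features:

```python
import math

def dct2(signal):
    """Type-II DCT of a sampled velocity sequence (pure-Python reference
    implementation; real systems would use a fast transform)."""
    N = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n, x in enumerate(signal)) for k in range(N)]

# Hypothetical pen-velocity samples from one stroke; the low-order DCT
# coefficients compactly summarize the stroke's velocity shape.
velocity = [0.0, 0.8, 1.5, 1.9, 1.6, 1.0, 0.4, 0.1]
features = dct2(velocity)[:4]   # keep the 4 leading coefficients as features
print([round(f, 3) for f in features])
```

    Truncating to the low-order coefficients is what makes such features stable across repeated signatures by the same person, since fine-grained jitter lands in the discarded high-order terms.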

  4. Matrix Dominated Failure of Fiber-Reinforced Composite Laminates Under Static and Dynamic Loading

    NASA Astrophysics Data System (ADS)

    Schaefer, Joseph Daniel

    Hierarchical material systems provide the unique opportunity to connect material knowledge to solving specific design challenges. Representing the fastest-growing class of hierarchical materials in use, fiber-reinforced polymer composites (FRPCs) offer superior strength- and stiffness-to-weight ratios, damage tolerance, and decreasing production costs compared to metals and alloys. However, the implementation of FRPCs has historically been hampered by inadequate knowledge of material failure behavior, owing to incomplete verification of recent computational constitutive models and improper (or nonexistent) experimental validation, which has severely slowed development. As noted by the recent Materials Genome Initiative and the Worldwide Failure Exercise, current state-of-the-art qualification programs endure a 20-year gap between material conceptualization and implementation due to the lack of effective partnership between computational coding (simulation) and experimental characterization. Qualification processes are primarily experiment driven; the anisotropic nature of composites predisposes matrix-dominant properties to be sensitive to strain rate, which necessitates extensive testing. To decrease the qualification time, a framework that practically combines theoretical prediction of material failure with limited experimental validation is required. In this work, the Northwestern Failure Theory (NU Theory) for composite lamina is presented as the theoretical basis from which the failure of unidirectional and multidirectional composite laminates is investigated. From an initial experimental characterization of basic lamina properties, the NU Theory is employed to predict the matrix-dependent failure of composites under any state of biaxial stress from quasi-static to 1000 s-1 strain rates. 
It was found that the number of experiments required to characterize the strain-rate-dependent failure of a new composite material was reduced by an order of magnitude, and the resulting strain-rate dependence was applicable to a large class of materials. The presented framework provides engineers with the capability to quickly identify fiber and matrix combinations for a given application and determine the failure behavior over the range of practical loading cases. The failure-mode-based NU Theory may be especially useful when partnered with computational approaches (which often employ micromechanics to determine constituent and constitutive response) to provide accurate validation of the matrix-dominated failure modes experienced by laminates during progressive failure.

  5. SU-F-J-25: Position Monitoring for Intracranial SRS Using BrainLAB ExacTrac Snap Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, S; McCaw, T; Huq, M

    2016-06-15

    Purpose: To determine the accuracy of position monitoring with BrainLAB ExacTrac snap verification following couch rotations during intracranial SRS. Methods: A CT scan of an anthropomorphic head phantom was acquired using 1.25mm slices. The isocenter was positioned near the centroid of the frontal lobe. The head phantom was initially aligned on the treatment couch using cone-beam CT, then repositioned using ExacTrac x-ray verification with residual errors less than 0.2mm and 0.2°. Snap verification was performed over the full range of couch angles in 15° increments with known positioning offsets of 0–3mm applied to the phantom along each axis. At each couch angle, the smallest tolerance was determined for which no positioning deviation was detected. Results: For couch angles 30°–60° from the center position, where the longitudinal axis of the phantom is approximately aligned with the beam axis of one x-ray tube, snap verification consistently detected positioning errors exceeding the maximum 8mm tolerance. Defining localization error as the difference between the known offset and the minimum tolerance for which no deviation was detected, the RMS error is mostly less than 1mm outside of couch angles 30°–60° from the central couch position. Given separate measurements of patient position from the two imagers, whether to proceed with treatment can be determined by the criterion of a reading within tolerance from just one (OR criterion) or both (AND criterion) imagers. Using a positioning tolerance of 1.5mm, snap verification has sensitivity and specificity of 94% and 75%, respectively, with the AND criterion, and 67% and 93%, respectively, with the OR criterion. If readings exceeding maximum tolerance are excluded, the sensitivity and specificity are 88% and 86%, respectively, with the AND criterion. 
Conclusion: With a positioning tolerance of 1.5mm, ExacTrac snap verification can be used during intracranial SRS with sensitivity and specificity between 85% and 90%.

  6. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to the clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
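The workshop's statistical framework is not reproduced in the abstract; as a hedged illustration of the kind of calculation such a framework involves, here is a textbook two-sample normal-approximation sample-size formula. The effect size, variance, and error rates below are placeholders, not values from the study.

```python
from math import ceil
from statistics import NormalDist

def per_group_n(delta, sigma, alpha=0.05, power=0.80):
    """Biospecimens per group needed to detect a mean difference `delta`
    between two groups with common standard deviation `sigma`, using the
    standard two-sample normal approximation."""
    z = NormalDist().inv_cdf
    z_a = z(1 - alpha / 2)   # two-sided type-I error threshold
    z_b = z(power)           # quantile delivering the requested power
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)
```

For a one-standard-deviation effect this gives 16 specimens per group; halving the effect size roughly quadruples the requirement.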

  7. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to the clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.

  8. Experimental demonstration of an isotope-sensitive warhead verification technique using nuclear resonance fluorescence.

    PubMed

    Vavrek, Jayson R; Henderson, Brian S; Danagoulian, Areg

    2018-04-24

    Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618-8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy "genuine" and "hoax" objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
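The abstract does not state which statistical test was used; one common, simple approximation for deciding whether two Poisson photon counts (e.g., NRF peak counts near 2.2 MeV from a template and a candidate object) differ significantly is sketched below. The counts in the usage note are made up for illustration.

```python
from math import sqrt

def poisson_z(n_template, n_candidate):
    """Approximate z statistic for the difference of two Poisson counts,
    e.g. NRF peak counts from template and candidate measurements."""
    return (n_template - n_candidate) / sqrt(n_template + n_candidate)
```

A |z| above roughly 3 would flag the candidate as inconsistent with the template at high confidence.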

  9. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors

    PubMed Central

    Ravikumar, Balaguru; Parri, Elina; Timonen, Sanna; Airola, Antti; Wennerberg, Krister

    2017-01-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process by predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. 
These results demonstrate that the kernel-based modeling approach offers practical benefits for probing novel insights into the mode of action of investigational compounds, and for the identification of new target selectivities for drug repurposing applications. PMID:28787438
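The paper's "well-established kernel-based regression algorithm" is not specified beyond that phrase; as a purely illustrative sketch of one such algorithm, here is minimal kernel ridge regression with an RBF kernel in numpy. The kernel choice, bandwidth, and regularization constant are assumptions, not the study's settings.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelRidge:
    """Minimal kernel ridge regression: alpha = (K + lam*I)^-1 y."""
    def __init__(self, lam=1e-6, gamma=1.0):
        self.lam, self.gamma = lam, gamma

    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self

    def predict(self, Xnew):
        return rbf_kernel(Xnew, self.X, self.gamma) @ self.alpha
```

In the paper's setting, X would encode compound-kinase pairs and y the measured binding affinities; the sketch above only shows the regression machinery.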

  10. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  11. NAS Grid Benchmarks. 1.0

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob; Frumkin, Michael; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We provide a paper-and-pencil specification of a benchmark suite for computational grids. It is based on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks (NPB) and is called the NAS Grid Benchmarks (NGB). NGB problems are presented as data flow graphs encapsulating an instance of a slightly modified NPB task in each graph node, which communicates with other nodes by sending/receiving initialization data. Like NPB, NGB specifies several different classes (problem sizes). In this report we describe classes S, W, and A, and provide verification values for each. The implementor has the freedom to choose any language, grid environment, security model, fault tolerance/error correction mechanism, etc., as long as the resulting implementation passes the verification test and reports the turnaround time of the benchmark.
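A toy version of the NGB idea, nodes of a data-flow graph each running a task and passing output forward as initialization data, can be sketched with Python's stdlib `graphlib`. The graph shape and the trivial `run` function are illustrative assumptions; only the node names (BT, SP, LU, MG) come from the NPB kernel set.

```python
from graphlib import TopologicalSorter

# Toy data-flow graph in the spirit of NGB: each node depends on the
# output of its predecessors (node -> set of prerequisite nodes).
graph = {"BT": {"SP"}, "LU": {"SP"}, "MG": {"BT", "LU"}}

def run(node, inputs):
    """Stand-in for executing an NPB-like task; records its inputs."""
    return f"{node}({','.join(sorted(inputs)) or 'init'})"

results = {}
order = list(TopologicalSorter(graph).static_order())
for node in order:
    results[node] = run(node, [results[p] for p in graph.get(node, ())])
```

A real implementation would replace `run` with the modified NPB task for that node and report the overall turnaround time, as the specification requires.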

  12. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Wieman, Carl

    2015-09-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and unsatisfying, and "cookbook." Here, both to help understand the reason for such discrepant views and to aid in the design of instructional lab courses, I compare the mental tasks or types of thinking ("cognitive task analysis") associated with a physicist doing tabletop experimental research with the cognitive tasks of students in an introductory physics instructional lab involving traditional verification/confirmation exercises.

  13. Experimental verification of steerability via geometric Bell-like inequalities

    NASA Astrophysics Data System (ADS)

    Li, Jian; Wang, Cen-Yang; Liu, Tong-Jun; Wang, Qin

    2018-03-01

    Quantum steering is one form of quantum correlations interpolating between entanglement and Bell nonlocality, which in some cases can be detected by various steering inequalities. Recently, a remarkable and useful steerability criterion via geometric Bell-like inequalities was established [M. Zukowski, A. Dutta, and Z. Yin, Phys. Rev. A 91, 032107 (2015), 10.1103/PhysRevA.91.032107]. We report an experimental investigation of this steering criterion and experimentally verify the geometric Bell-like steering inequality using Werner states. The results demonstrate that the geometric Bell-like steering inequality is a convenient tool for detecting quantum steering, both theoretically and practically.

  14. Experimental demonstration of three-dimensional broadband underwater acoustic carpet cloak

    NASA Astrophysics Data System (ADS)

    Bi, Yafeng; Jia, Han; Sun, Zhaoyong; Yang, Yuzhen; Zhao, Han; Yang, Jun

    2018-05-01

    We present the design, architecture, and detailed performance of a three-dimensional (3D) underwater acoustic carpet cloak (UACC). The proposed system of the 3D UACC is an octahedral pyramid composed of periodic steel strips. This underwater acoustic device, placed over the target to hide, is able to manipulate the scattered wavefront to mimic a reflecting plane. The effectiveness of the prototype is experimentally demonstrated in an anechoic tank. The measured acoustic pressure distributions show that the 3D UACC can work in all directions over a wide frequency range. This experimental verification of a 3D device paves the way for future practical applications.

  15. Experimental verification of low sonic boom configuration

    NASA Technical Reports Server (NTRS)

    Ferri, A.; Wang, H. H.; Sorensen, H.

    1972-01-01

    A configuration designed to produce a near-field signature has been tested at M = 2.71 and the results are analyzed, taking into account three-dimensional and second-order effects. The configuration has an equivalent total area distribution that corresponds to an airplane flying at 60,000 ft with a weight of 460,000 lbs and a length of 300 ft. A maximum overpressure of 0.95 lb/square foot has been obtained experimentally. The experimental results agree well with the analysis. The investigation indicates that the three-dimensional effects are very important when measurements in wind tunnels are taken at small distances from the airplane.

  16. Stepwise verification of bone regeneration using recombinant human bone morphogenetic protein-2 in rat fibula model

    PubMed Central

    2017-01-01

    Objectives The purpose of this study was to introduce our three experiments on bone morphogenetic protein (BMP) and its carriers performed using the critical sized segmental defect (CSD) model in rat fibula and to investigate development of animal models and carriers for more effective bone regeneration. Materials and Methods For the experiments, 14, 16, and 24 rats with CSDs on both fibulae were used in Experiments 1, 2, and 3, respectively. BMP-2 with absorbable collagen sponge (ACS) (Experiments 1 and 2), autoclaved autogenous bone (AAB) and fibrin glue (FG) (Experiment 3), and xenogenic bone (Experiment 2) were used in the experimental groups. Radiographic and histomorphological evaluations were performed during the follow-up period of each experiment. Results Significant new bone formation was commonly observed in all experimental groups using BMP-2 compared to control and xenograft (porcine bone) groups. Although there was some difference based on BMP carrier, regenerated bone volume was typically reduced by remodeling after initially forming excessive bone. Conclusion BMP-2 demonstrates excellent ability for bone regeneration because of its osteoinductivity, but efficacy can be significantly different depending on its delivery system. ACS and FG showed relatively good bone regeneration capacity, satisfying the essential conditions of localization and release-control when used as BMP carriers. AAB could not provide release-control as a BMP carrier, but its space-maintenance role was remarkable. Carriers and scaffolds that can provide sufficient support to the BMP/carrier complex are necessary for large bone defects, and AAB is thought to be able to act as an effective scaffold. The CSD model of rat fibula is simple and useful for initial estimate of bone regeneration by agents including BMPs. PMID:29333367

  17. Effect of electron contamination on in vivo dosimetry for lung block shielding during TBI

    PubMed Central

    Narayanasamy, Ganesh; Cruz, Wilbert; Saenz, Daniel L.; Stathakis, Sotirios; Papanikolaou, Niko

    2016-01-01

    Our institution performs in vivo verification measurement for each of our total body irradiation (TBI) patients with optically stimulated luminescent dosimeters (OSLD). The lung block verification measurements were commonly higher than expected. The aim of this work is to understand this discrepancy and improve the accuracy of these lung block verification measurements. Initially, the thickness of the lung block was increased to provide adequate lung sparing. Further tests revealed the increase was due to electron contamination dose emanating from the lung block. The thickness of the bolus material covering the OSLD behind the lung block was increased to offset the electron contamination. In addition, the distance from the lung block to the dosimeter was evaluated for its effect on the OSLD reading and found to be clinically insignificant over the range of variability in our clinic. The results show that the improved TBI treatment technique provides for better accuracy of measured dose in vivo and consistency of patient setup. PACS number(s): 87.53.Bn, 87.53.Kn, 87.55.N‐, 87.55.Qr PMID:27167290

  18. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that automates the mechanical portions of the analysis process. This paper discusses: the need for formal analysis to assure software systems with respect to security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.
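Model-based verification, as described above, explores an abstract model's state space rather than the implementation itself. The following is a deliberately tiny, hypothetical sketch of that idea (not the FMF or any real SSL model): a breadth-first search over a two-bit handshake model, checking the property that data is never sent before a key is established.

```python
from collections import deque

# Toy abstract model: state = (key_established, data_sent).
def transitions(state):
    key, data = state
    succs = []
    if not key:
        succs.append((True, data))   # complete the key exchange
    if key:                          # guard: data may flow only after the key
        succs.append((key, True))
    return succs

def check(initial=(False, False)):
    """Explicit-state check: return a violating state, or None if the
    property 'no data before key' holds over all reachable states."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        key, data = state
        if data and not key:
            return state             # counterexample found
        for s in transitions(state):
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return None
```

A model checker does essentially this over vastly larger state spaces, with the model extracted from the software's abstract design.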

  19. Real time radiotherapy verification with Cherenkov imaging: development of a system for beamlet verification

    NASA Astrophysics Data System (ADS)

    Pogue, B. W.; Krishnaswamy, V.; Jermyn, M.; Bruza, P.; Miao, T.; Ware, William; Saunders, S. L.; Andreozzi, J. M.; Gladstone, D. J.; Jarvis, L. A.

    2017-05-01

    Cherenkov imaging has been shown to allow near real-time imaging of the beam entrance and exit on patient tissue, with the appropriate intensified camera and associated image processing. A dedicated system has been developed for research into full-torso imaging of whole breast irradiation, where the dual-camera system captures the beam shape for all beamlets used in this treatment protocol. Particularly challenging verification measurements exist in dynamic wedge, field-in-field, and boost delivery, and the system was designed to capture these as they are delivered. Two intensified CMOS (ICMOS) cameras were developed and mounted in a breast treatment room, and pilot studies for intensity and stability were completed. Software tools to contour the treatment area have been developed and are being tested prior to initiation of the full trial. At present, it is possible to record delivery of individual beamlets as small as a single MLC thickness, and readout at 20 frames per second is achieved. Statistical analysis of system repeatability and stability is presented, as well as pilot human studies.

  20. UNSCOM faces entirely new verification challenges in Iraq

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trevan, T.

    1993-04-01

    Starting with the very first declarations and inspections, it became evident that Iraq was not acting in good faith, would use every possible pretext to reinterpret UNSCOM's inspection rights, and occasionally would use harassment tactics to make inspections as difficult as possible. Topics considered in detail include initial assumptions, outstanding issues, and UNSCOM's future attitude.

  1. A New Era in Medical Training Through Simulation-Based Training Systems

    DTIC Science & Technology

    2010-04-01

    next steps is to seek initial verification against published data. 2.2 Ultra-High Resolution Display for Army Medicine (UHRDARM) The eMagin ...view between 60 and 80 degrees while consuming less than 2 watts of total power. Figures 1 and 2: Photos courtesy of eMagin

  2. 40 CFR 75.63 - Initial certification or recertification application.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... requirement of § 75.59(e), for DAHS (missing data and formula) verifications, no hardcopy submittal is... test data and results to the Administrator. (B) To the applicable EPA Regional Office and the... part) tests or any CEMS data analysis used to derive a fuel-and-unit-specific default NOX emission rate...

  3. 19 CFR 181.72 - Verification scope and method.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... written questionnaire sent to an exporter or a producer, including a producer of a material, in Canada or Mexico. The questionnaire: (A) May be sent by certified or registered mail, or by any other method that... treatment on the good. (d) Failure to respond to letter or questionnaire—(1) Nonresponse to initial letter...

  4. Soundscapes

    DTIC Science & Technology

    2013-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes. Michael B... models to provide hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on... APPROACH: The research has two principal thrusts: 1) modeling of the soundscape, and 2) verification using datasets that have been collected

  5. Soundscapes

    DTIC Science & Technology

    2012-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes. Michael B... models to provide hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on... APPROACH: The research has two principal thrusts: 1) modeling of the soundscape, and 2) verification using datasets that have been collected

  6. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Technical Reports Server (NTRS)

    Watts, A. W.

    1982-01-01

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  7. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools, including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
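The check-case comparison idea, running the same scenario through independent propagators and differencing the resulting trajectories against a known answer, can be illustrated with a deliberately simple problem. The scenario, step size, and choice of integrators below are illustrative assumptions, not the report's actual check-cases.

```python
# Compare two propagators on a scenario with an analytic answer:
# vertical free fall without drag, dh/dt = v, dv/dt = -g.
G = 9.80665  # standard gravity, m/s^2

def deriv(state):
    h, v = state
    return (v, -G)

def euler_step(state, dt):
    d = deriv(state)
    return (state[0] + dt * d[0], state[1] + dt * d[1])

def rk4_step(state, dt):
    def add(s, k, f):
        return (s[0] + f * k[0], s[1] + f * k[1])
    k1 = deriv(state)
    k2 = deriv(add(state, k1, dt / 2))
    k3 = deriv(add(state, k2, dt / 2))
    k4 = deriv(add(state, k3, dt))
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def propagate(step, dt=0.1, t_end=2.0):
    state, t = (0.0, 0.0), 0.0
    while t < t_end - 1e-12:
        state = step(state, dt)
        t += dt
    return state[0]          # altitude change after t_end seconds

exact = -0.5 * G * 2.0 ** 2  # analytic altitude change after 2 s
```

Differencing each propagator's output against `exact` plays the role of the report's difference plots: independent implementations should bracket the reference to within their truncation error.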

  8. New generalized Noh solutions for HEDP hydrocode verification

    NASA Astrophysics Data System (ADS)

    Velikovich, A. L.; Giuliani, J. L.; Zalesak, S. T.; Tangri, V.

    2017-10-01

    The classic Noh solution describing stagnation of a cold ideal gas in a strong accretion shock wave has been the workhorse of compressible hydrocode verification for over three decades. We describe a number of its generalizations available for HEDP code verification. First, for an ideal gas, we have obtained self-similar solutions that describe adiabatic convergence either of a finite-pressure gas into an empty cavity or of a finite-amplitude sound wave into a uniform resting gas surrounding the center or axis of symmetry. At the moment of collapse such a flow produces a uniform gas whose velocity at each point is constant and directed towards the axis or the center, i.e., an initial condition similar to the classic solution but with a finite pressure of the converging gas. After that, a constant-velocity accretion shock propagates into the incident gas, whose pressure and velocity profiles are not flat, in contrast with the classic solution. Second, for an arbitrary equation of state, we demonstrate the existence of self-similar solutions of the Noh problem in cylindrical and spherical geometry. Examples of such solutions with a three-term equation of state that includes cold, thermal ion/lattice, and thermal electron contributions are presented for aluminum and copper. These analytic solutions are compared to our numerical simulation results as an example of their use for code verification. Work supported by the US DOE/NNSA.
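The classic Noh solution's post-shock density is one of the standard values hydrocode verification compares against. A minimal sketch for the usual setup (ideal gas, γ = 5/3, unit initial density and inflow speed): the strong-shock compression ratio is (γ+1)/(γ-1) = 4, and geometric convergence contributes a further factor of that ratio per dimension.

```python
def noh_postshock_density(gamma=5.0 / 3.0, nu=3, rho0=1.0):
    """Constant density behind the accretion shock in the classic Noh
    problem; nu = 1 (planar), 2 (cylindrical), 3 (spherical)."""
    compression = (gamma + 1.0) / (gamma - 1.0)  # strong-shock limit
    return rho0 * compression ** nu
```

For γ = 5/3 this reproduces the standard verification values 4, 16, and 64 in planar, cylindrical, and spherical geometry.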

  9. Neutron Source Facility Training Simulator Based on EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.

    A plant operator training simulator has been developed for training plant operators as well as for design verification of the plant control system (PCS) and plant protection system (PPS) of the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented with the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, EPICS has been widely adopted in the experimental physics community, e.g., for the control of accelerator facilities. This work is the first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and the plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS using EPICS and related software tools, e.g., the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced: a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g., start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.
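    EPICS is organized around named process variables (PVs) whose value changes trigger monitor callbacks; panels and trip logic observe the plant through such monitors. The sketch below mimics that publish/subscribe idea in plain Python only; it does not use the real EPICS channel-access API, and the PV name and trip limit are hypothetical.

```python
# Plain-Python sketch of the process-variable (PV) concept that EPICS
# is built around: named channels whose value changes fire monitor
# callbacks. This imitates the idea only; it is not the EPICS API,
# and the PV name and trip limit below are hypothetical.

class ProcessVariable:
    def __init__(self, name, value=0.0):
        self.name = name
        self.value = value
        self._monitors = []

    def add_monitor(self, callback):
        self._monitors.append(callback)

    def put(self, value):
        self.value = value
        for cb in self._monitors:
            cb(self.name, value)

# A plant-protection-style trip check expressed as a monitor on a PV.
trips = []
loop_temp = ProcessVariable("NSF:PCS:LoopTemp", 20.0)
loop_temp.add_monitor(
    lambda name, v: trips.append(name) if v > 90.0 else None)
loop_temp.put(50.0)   # normal value: no trip
loop_temp.put(95.0)   # exceeds the limit: trip recorded
```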

  10. Oxygen beams for therapy: advanced biological treatment planning and experimental verification

    NASA Astrophysics Data System (ADS)

    Sokol, O.; Scifoni, E.; Tinganelli, W.; Kraft-Weyrather, W.; Wiedemann, J.; Maier, A.; Boscolo, D.; Friedrich, T.; Brons, S.; Durante, M.; Krämer, M.

    2017-10-01

    There is rising interest in exploiting new therapeutic beams beyond carbon ions and protons. In particular, 16O ions are being widely discussed because of their increased LET distribution. In this contribution, we report the first experimental verification of biologically optimized treatment plans, accounting for different biological effects, generated with the TRiP98 planning system for 16O beams and performed at HIT and GSI. This involved measurements of 3D profiles of absorbed dose as well as several biological measurements. The latter include measurements of relative biological effectiveness over a range of linear energy transfer values from about 20 up to about 750 keV/μm, measurements of oxygen enhancement ratio values, and the verification of the kill-painting approach for overcoming hypoxia, using a phantom imitating an unevenly oxygenated target. With the present implementation, our treatment planning system is able to perform a comparative analysis of different ions according to any given condition of the target. For the particular case of low target oxygenation, 16O ions demonstrate a higher peak-to-entrance dose ratio for the same cell killing in the target region compared to 12C ions. Based on this phenomenon, we performed a short computational analysis to identify the range of treatment plans in which 16O can offer a benefit over lighter modalities. It emerges that for more hypoxic target regions (partial oxygen pressure of about 0.15% or lower) and relatively low doses (about 4 Gy or lower), the choice of 16O over 12C or 4He may be justified.
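    The kill-painting idea amounts to rescaling the locally prescribed dose by the local oxygen enhancement ratio (OER) so that cell kill stays uniform across an unevenly oxygenated target. The sketch below uses an illustrative OER parametrization; the functional form and constants are assumptions for illustration, not the actual TRiP98 model.

```python
# Sketch of the kill-painting idea: scale the dose needed for a given
# cell-kill level by the local oxygen enhancement ratio (OER). The
# parametrization and constants are illustrative assumptions, not the
# TRiP98 model.

def oer(p_o2_percent, oer_max=3.0, k=0.3):
    # OER falls from oer_max (anoxic) toward 1 (well oxygenated);
    # k sets the half-effect oxygen tension in % pO2 (assumed value).
    return 1.0 + (oer_max - 1.0) * k / (p_o2_percent + k)

def required_dose(aerobic_dose, p_o2_percent):
    # Dose producing the same cell kill that aerobic_dose would
    # produce under full oxygenation.
    return aerobic_dose * oer(p_o2_percent)
```

    With these assumed constants, a fully anoxic voxel needs three times the aerobic dose, and the scaling fades as oxygenation rises, which is why hypoxic targets reward ions with lower OER.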

  11. The Experimental Regional Ensemble Forecast System (ExREF): Its Use in NWS Forecast Operations and Preliminary Verification

    NASA Technical Reports Server (NTRS)

    Reynolds, David; Rasch, William; Kozlowski, Daniel; Burks, Jason; Zavodsky, Bradley; Bernardet, Ligia; Jankov, Isidora; Albers, Steve

    2014-01-01

    The Experimental Regional Ensemble Forecast (ExREF) system is a tool for the development and testing of new Numerical Weather Prediction (NWP) methodologies. ExREF is run in near-realtime by the Global Systems Division (GSD) of the NOAA Earth System Research Laboratory (ESRL), and its products are made available through a website, an FTP site, and the Unidata Local Data Manager (LDM). The ExREF domain covers most of North America with 9-km horizontal grid spacing. The ensemble has eight members, all employing WRF-ARW. It uses a variety of initial conditions from LAPS and the Global Forecast System (GFS) and multiple boundary conditions from the GFS ensemble. Additionally, a diversity of physical parameterizations is used to increase ensemble spread and to account for the uncertainty in forecasting extreme precipitation events. ExREF has been a component of the Hydrometeorology Testbed (HMT) NWP suite in the 2012-2013 and 2013-2014 winters. A smaller domain covering just the West Coast was created to minimize bandwidth consumption for the NWS. This smaller domain has been and is being distributed to the National Weather Service (NWS) Weather Forecast Office and the California Nevada River Forecast Center in Sacramento, California, where it is ingested into the Advanced Weather Interactive Processing System (AWIPS I and II) to provide guidance on the forecasting of extreme precipitation events. This paper reviews the cooperative effort of NOAA ESRL, NASA SPoRT (Short-term Prediction Research and Transition Center), and the NWS to facilitate the ingest and display of ExREF data using the AWIPS I and II D2D and GFE (Graphical Forecast Editor) software. Within GFE is a verification software package called BoiVer that allows the NWS to use the River Forecast Center's 4-km gridded QPE to compare all operational NWP models' 6-hr QPF, along with the ExREF mean 6-hr QPF, so that forecasters can build confidence in the use of ExREF when preparing their rainfall forecasts. Preliminary results will be presented.
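    A gridded QPF-versus-QPE comparison of the kind described reduces to forming the ensemble-mean forecast grid and scoring it against the observed grid. The sketch below computes a simple frequency bias at a precipitation threshold; it illustrates the type of score involved, is not BoiVer's actual implementation, and uses made-up grids.

```python
# Sketch of a gridded QPF-vs-QPE check: form the ensemble-mean 6-hr
# QPF, threshold both forecast and observed grids, and compute a
# frequency bias. Illustrative only; not BoiVer's implementation.

def ensemble_mean(members):
    # members: list of equally shaped 2-D grids (lists of lists)
    n = len(members)
    rows, cols = len(members[0]), len(members[0][0])
    return [[sum(m[i][j] for m in members) / n for j in range(cols)]
            for i in range(rows)]

def frequency_bias(qpf, qpe, threshold):
    # (forecast points >= threshold) / (observed points >= threshold)
    fc = sum(v >= threshold for row in qpf for v in row)
    ob = sum(v >= threshold for row in qpe for v in row)
    return fc / ob if ob else float("nan")

# Hypothetical 2x2 grids (precipitation in mm) for two members.
members = [
    [[0.0, 10.0], [20.0, 30.0]],
    [[0.0, 14.0], [24.0, 26.0]],
]
qpe = [[0.0, 13.0], [25.0, 5.0]]   # hypothetical observed grid
qpf_mean = ensemble_mean(members)
```

    A bias above 1 indicates the ensemble mean exceeds the threshold over a larger area than was observed, the kind of signal forecasters use to calibrate confidence in heavy-rain guidance.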

  12. Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens

    PubMed Central

    Lucon, Enrico; McCowan, Chris N.; Santoyo, Ray L.

    2015-01-01

    The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of −40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at −40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator’s skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at −40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses. PMID:26958453
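    The variability comparison at the heart of the study reduces to comparing the scatter of absorbed-energy results for the same lot at the two test temperatures, e.g., via the coefficient of variation. The sketch below uses made-up energy values for illustration; they are not NIST data.

```python
import statistics

# Sketch of the scatter comparison underlying the study: the
# coefficient of variation (CV) of absorbed-energy results for one
# lot at each test temperature. The values below are hypothetical,
# not NIST data.

def coefficient_of_variation(energies_joules):
    return (statistics.stdev(energies_joules) /
            statistics.mean(energies_joules))

room_temp_energies = [16.1, 16.3, 16.0, 16.2, 16.1]   # hypothetical, 21 C
low_temp_energies = [15.8, 16.5, 15.6, 16.4, 16.7]    # hypothetical, -40 C
```

    A lower CV at room temperature, as found for eight of the nine lots, is the kind of result that supports certifying at 21 °C.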

  14. Coalescence induced dislocation reduction in selectively grown lattice-mismatched heteroepitaxy: Theoretical prediction and experimental verification

    NASA Astrophysics Data System (ADS)

    Yako, Motoki; Ishikawa, Yasuhiko; Wada, Kazumi

    2018-05-01

    A method for reducing the threading dislocation density (TDD) in lattice-mismatched heteroepitaxy is proposed, and the reduction is experimentally verified for Ge on Si. Flat-top epitaxial layers are formed through the coalescence of non-planar selectively grown epitaxial layers, enabling TDD reduction by means of the image force. Numerical calculations and experiments for Ge on Si verify the TDD reduction achieved by this method. The method should be applicable not only to Ge on Si but also to other lattice-mismatched heteroepitaxy such as III-V on Si.

  15. The decay of 'mesotrons' (1939-1943), experimental particle physics in the age of innocence

    NASA Astrophysics Data System (ADS)

    Rossi, B.

    An account is given of the experimental work carried out by the author and his associates during the years 1939 through 1943, which produced the first unambiguous evidence of the spontaneous decay of 'mesotrons', showed that this decay occurred according to an exponential law, as expected, and measured the mean life with 3 percent accuracy. A byproduct of this work was a verification of the relativistic equation for the dilation of time intervals. Previously announced in STAR as N81-76151.
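    The quantities these experiments tied together can be stated compactly: the surviving fraction follows the exponential law N/N0 = exp(-t/(γτ)), with the mean life τ dilated by the Lorentz factor γ. The sketch below evaluates this for an illustrative path length, using the modern muon mean life of about 2.2 μs rather than the historical measured value.

```python
import math

# Exponential survival of fast 'mesotrons' (muons) with and without
# relativistic time dilation: N/N0 = exp(-t / (gamma * tau)). The
# path length is illustrative; tau is the modern muon mean life.

C = 299_792_458.0   # speed of light, m/s
TAU = 2.2e-6        # muon proper mean life, s

def survival_fraction(path_m, beta):
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    t = path_m / (beta * C)              # lab-frame time of flight
    return math.exp(-t / (gamma * TAU))

def survival_without_dilation(path_m, beta):
    t = path_m / (beta * C)
    return math.exp(-t / TAU)
```

    Over a 10 km path at β = 0.995, dilation raises the surviving fraction from a vanishing value to roughly a fifth, which is the size of effect that made the time-dilation verification possible.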

  16. Experimental device for measuring the dynamic properties of diaphragm motors

    NASA Astrophysics Data System (ADS)

    Fojtášek, Kamil; Dvořák, Lukáš; Mejzlík, Jan

    The subject of this paper is the design and description of an experimental device for determining the dynamic properties of diaphragm pneumatic motors. These motors are structurally quite different from conventional pneumatic linear cylinders: the working fluid is typically compressed air, the piston of the motor is replaced by an elastic part, and during the working cycle two elastic environments are in contact. The manufacturers' catalogs for these motors do not give any working characteristics. The description of the dynamic behavior of the diaphragm motor will be used for the verification of mathematical models.

  17. Cymatics for the cloaking of flexural vibrations in a structured plate

    PubMed Central

    Misseroni, D.; Colquitt, D. J.; Movchan, A. B.; Movchan, N. V.; Jones, I. S.

    2016-01-01

    Based on rigorous theoretical findings, we present a proof-of-concept design for a structured square cloak enclosing a void in an elastic lattice. We implement high-precision fabrication and experimental testing of an elastic invisibility cloak for flexural waves in a mechanical lattice. This is accompanied by verification and numerical modelling performed through finite-element simulations. The primary advantage of our square lattice cloak over other designs is its straightforward implementation and ease of construction. The elastic lattice cloak, implemented experimentally, shows high efficiency. PMID:27068339

  18. Experimental verification of ‘waveguide’ plasmonics

    NASA Astrophysics Data System (ADS)

    Prudêncio, Filipa R.; Costa, Jorge R.; Fernandes, Carlos A.; Engheta, Nader; Silveirinha, Mário G.

    2017-12-01

    Surface plasmon polaritons are collective excitations of an electron gas that occur at an interface between negative-ɛ and positive-ɛ media. Here, we report the experimental observation of such surface waves using simple waveguide metamaterials filled only with readily available positive-ɛ media at microwave frequencies. In contrast to optical designs, in our setup the propagation length of the surface plasmons can be rather long because low-loss conventional dielectrics are chosen, avoiding the losses typical of negative-ɛ media. Plasmonic phenomena have potential applications in enhancing light-matter interactions, implementing nanoscale photonic circuits, and integrated photonics.
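    For context, the conventional relation being emulated is the textbook propagation constant of a surface plasmon polariton at a single interface, β = k0·sqrt(ε1ε2/(ε1 + ε2)), where a bound mode requires ε1ε2 < 0 and ε1 + ε2 < 0. The sketch below evaluates it for illustrative lossless permittivities.

```python
import math

# Textbook propagation constant (normalized to the free-space
# wavenumber k0) of a surface plasmon polariton at a single interface
# between permittivities eps1 and eps2. A bound mode requires
# eps1 * eps2 < 0 and eps1 + eps2 < 0. Example permittivities below
# are illustrative lossless values.

def spp_beta_over_k0(eps_metal, eps_dielectric):
    return math.sqrt(eps_metal * eps_dielectric /
                     (eps_metal + eps_dielectric))
```

    For example, with eps_metal = -20 and eps_dielectric = 2.25 (glass), β/k0 ≈ 1.59, exceeding the dielectric's light line of 1.5, which is what makes the mode bound to the interface.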

  19. Experimental verification of an eddy-current bearing

    NASA Technical Reports Server (NTRS)

    Nikolajsen, Jorgen L.

    1989-01-01

    A new type of electromagnetic bearing was built and tested. It consists of fixed AC-electromagnets in a star formation surrounding a conducting rotor. The bearing works by repulsion due to eddy-currents induced in the rotor. A single bearing is able to fully support a short rotor. The rotor support is inherently stable in all five degrees of freedom. No feedback control is needed. The bearing is also able to accelerate the rotor up to speed and decelerate the rotor back to standstill. The bearing design and the experimentation to verify its capabilities are described.

  20. Comment on "Protein sequences from mastodon and Tyrannosaurus rex revealed by mass spectrometry".

    PubMed

    Pevzner, Pavel A; Kim, Sangtae; Ng, Julio

    2008-08-22

    Asara et al. (Reports, 13 April 2007, p. 280) reported sequencing of Tyrannosaurus rex proteins and used them to establish the evolutionary relationships between birds and dinosaurs. We argue that the reported T. rex peptides may represent statistical artifacts and call for complete data release to enable experimental and computational verification of their findings.
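    Concerns that peptide-spectrum matches are statistical artifacts are commonly assessed with the target-decoy strategy: search the spectra against both real (target) and reversed or shuffled (decoy) sequences, and estimate the false discovery rate from decoy hits above a score threshold. The sketch below illustrates the estimate with made-up scores; it is not the analysis pipeline of either paper.

```python
# Sketch of target-decoy false discovery rate (FDR) estimation for
# peptide-spectrum matches: decoy hits above a score threshold
# approximate the number of spurious target hits. Scores below are
# made up for illustration; this is neither paper's actual pipeline.

def estimated_fdr(target_scores, decoy_scores, threshold):
    targets = sum(s >= threshold for s in target_scores)
    decoys = sum(s >= threshold for s in decoy_scores)
    return decoys / targets if targets else 0.0

target_scores = [12.0, 9.5, 8.1, 7.7, 3.2, 2.9]   # hypothetical
decoy_scores = [4.0, 3.5, 3.1, 2.8, 2.2, 1.9]     # hypothetical
```

    Lowering the threshold admits decoy hits and drives the estimated FDR up, which is the quantitative form of the concern that low-scoring matches may be artifacts.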
