NASA Astrophysics Data System (ADS)
Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong
2011-04-01
As IC design complexity keeps increasing, it becomes more and more difficult to ensure correct pattern transfer after optical proximity correction (OPC) because of the continuous reduction of layout dimensions and the lithographic limit imposed by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), various types of phase-shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the simulated contour image against the target layout continue to be developed: methods for generating contours and matching them to the target structures, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors while excluding false ones is the most important requirement for an accurate and fast verification process, since it saves not only review time and engineering resources but also overall wafer-processing time. In the typical case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification tool reports a huge number of errors because of borderless design, so it is impractical to review and correct all of them. This can cause OPC engineers to miss real defects and, at the very least, delays time to market. In this paper, we study a method for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows a bias that depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. By optimizing the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and reduced the review time needed to find real errors.
In this paper, we present a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of a full etch-model application.
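The pitch-dependent biasing rule described above can be sketched as follows; the pitch thresholds, bias values, and enclosure rule here are illustrative assumptions, not the calibrated rules from the paper.

```python
# Sketch of a pitch-dependent metal biasing rule for CC (contact/via
# coverage) checks. The pitch thresholds and bias values below are
# illustrative assumptions, not the calibrated rules from the paper.

# (min_pitch_nm, bias_nm): wider-spaced lines etch differently, so each
# pitch range gets its own per-edge CD bias.
BIAS_TABLE = [
    (0,   -4.0),   # dense lines: shrink each edge by 4 nm
    (120, -2.0),   # semi-dense
    (250,  0.0),   # isolated: no bias
]

def bias_for_pitch(pitch_nm: float) -> float:
    """Return the per-edge CD bias for a given pitch to the neighbor line."""
    bias = BIAS_TABLE[0][1]
    for min_pitch, b in BIAS_TABLE:
        if pitch_nm >= min_pitch:
            bias = b
    return bias

def biased_width(drawn_width_nm: float, pitch_nm: float) -> float:
    """Apply the bias on both edges to estimate the post-etch metal width."""
    return drawn_width_nm + 2.0 * bias_for_pitch(pitch_nm)

def covers(metal_width_nm: float, via_width_nm: float, enclosure_nm: float) -> bool:
    """CC check: does the metal still enclose the via by the enclosure rule?"""
    return metal_width_nm >= via_width_nm + 2.0 * enclosure_nm
```

A verification flow along these lines would apply `biased_width` to each metal segment before running the coverage check, so borderless vias on dense metal are flagged only when the estimated post-etch shape truly under-encloses them.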
NASA Technical Reports Server (NTRS)
Ponchak, George E.; Chun, Donghoon; Katehi, Linda P. B.; Yook, Jong-Gwan
1999-01-01
Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior 3D-FEM electromagnetic simulations have shown that metal-filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3D-FEM simulations is demonstrated for commercially fabricated LTCC packages.
Cleanup Verification Package for the 300-18 Waste Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
This cleanup verification package documents completion of remedial action for the 300-18 waste site. This site was identified as containing radiologically contaminated soil, metal shavings, nuts, bolts, and concrete.
Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas
2008-01-01
The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring.
This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158
Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...
Verification study of an emerging fire suppression system
Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...
2016-01-01
Self-contained fire extinguishers are a robust, reliable, and minimally invasive means of fire suppression for gloveboxes. Plutonium gloveboxes present harsh environmental conditions for polymer materials, including radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass-loss calorimeter tests. In addition, compatibility issues with the chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass-loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.
Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje
2012-05-01
The purpose of this retrospective study was to assess whether there was a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. This investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients, a verification jig was made (group J), while for the remaining 14 patients, a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit, while when a verification jig was not used (group NJ, n = 14), only two frameworks fit passively. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete dentures. © 2011 Wiley Periodicals, Inc.
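The reported significance can be checked with a two-sided Fisher's exact test on the 2×2 fit table (16 passive / 0 nonpassive with a jig vs. 2 / 12 without); a minimal stdlib sketch, assuming this is the test behind the quoted p-value:

```python
# Two-sided Fisher's exact test for a 2x2 table, using only the stdlib.
# Applied to the jig study's fit counts: [[16, 0], [2, 12]].
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table [[a,b],[c,d]]."""
    row1 = a + b
    col1 = a + c
    n = a + b + c + d
    total = comb(n, row1)

    def p_table(x: int) -> float:  # P(top-left cell == x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / total

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Group J: 16 passive, 0 nonpassive; group NJ: 2 passive, 12 nonpassive.
p = fisher_exact_two_sided(16, 0, 2, 12)
```

With these counts the two-sided p-value is on the order of 10^-6, comfortably below the p < .001 reported in the abstract.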
NASA Technical Reports Server (NTRS)
Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan
1997-01-01
The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R), MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1, 7 and 30 day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.
ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM
The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...
ETV REPORT AND VERIFICATION STATEMENT; EVALUATION OF LOBO LIQUIDS RINSE WATER RECOVERY SYSTEM
The Lobo Liquids Rinse Water Recovery System (Lobo Liquids system) was tested, under actual production conditions, processing metal finishing wastewater, at Gull Industries in Houston, Texas. The verification test evaluated the ability of the ion exchange (IX) treatment system t...
The USEPA has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The ETV P2 Metal Finishing Technologies (ETV-MF) Prog...
NASA Technical Reports Server (NTRS)
Ponchak, George E.; Chun, Donghoon; Yook, Jong-Gwan; Katehi, Linda P. B.
2001-01-01
Coupling between microstrip lines in dense RF packages is a common problem that degrades circuit performance. Prior three-dimensional finite-element method (3-D-FEM) electromagnetic simulations have shown that metal-filled via hole fences between two adjacent microstrip lines actually increase coupling between the lines; however, if the tops of the via posts are connected by a metal strip, coupling is reduced. In this paper, experimental verification of the 3-D-FEM simulations is demonstrated for commercially fabricated low temperature cofired ceramic (LTCC) packages. In addition, measured attenuation of microstrip lines surrounded by the shielding structures is presented and shows that shielding structures do not change the attenuation characteristics of the line.
Experimental Verification of Boyle's Law and the Ideal Gas Law
ERIC Educational Resources Information Center
Ivanov, Dragia Trifonov
2007-01-01
Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…
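The correction the abstract mentions, subtracting the saturated water-vapour pressure before testing P·V = const, can be sketched as follows; the pressures and volumes are illustrative classroom numbers, not data from the article.

```python
# Boyle's-law check with a water-vapour correction, as in the classroom
# experiment described above. All readings below are illustrative
# assumptions, not the article's data.

P_SAT_25C_KPA = 3.17  # saturated water-vapour pressure at 25 deg C, in kPa

def air_pressure(total_kpa: float, p_vap_kpa: float = P_SAT_25C_KPA) -> float:
    """Partial pressure of the trapped air: total minus saturated vapour."""
    return total_kpa - p_vap_kpa

def boyle_product(total_kpa: float, volume_ml: float) -> float:
    """P*V for the dry air alone; constant at fixed T if Boyle's law holds."""
    return air_pressure(total_kpa) * volume_ml

# Two states of the same trapped air column over water (illustrative):
pv1 = boyle_product(101.3, 50.0)    # near-atmospheric, 50 mL
pv2 = boyle_product(199.43, 25.0)   # compressed to half the volume
```

The point of the correction is visible in the numbers: without subtracting the vapour pressure, the raw products 101.3·50 and 199.43·25 differ noticeably, while the air-only products agree.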
Ac electronic tunneling at optical frequencies
NASA Technical Reports Server (NTRS)
Faris, S. M.; Fan, B.; Gustafson, T. K.
1974-01-01
Rectification characteristics of non-superconducting metal-barrier-metal junctions deduced from electronic tunneling have been observed experimentally for optical frequency irradiation of the junction. The results provide verification of optical frequency Fermi level modulation and electronic tunneling current modulation.
Thermodynamic Analysis of Oxygen-Enriched Direct Smelting of Jamesonite Concentrate
NASA Astrophysics Data System (ADS)
Zhang, Zhong-Tang; Dai, Xi; Zhang, Wen-Hai
2017-12-01
Thermodynamic analysis of oxygen-enriched direct smelting of jamesonite concentrate is reported in this article. First, the occurrence state of lead, antimony and other metallic elements in the smelting process was investigated theoretically. Then, the verification test was carried out. The results indicate that lead and antimony mainly exist in the alloy in the form of metallic lead and metallic antimony. Simultaneously, lead and antimony were also oxidized into the slag in the form of lead-antimony oxide. Iron and copper could be oxidized into the slag in the form of oxides in addition to combining with antimony in the alloy, while zinc was mainly oxidized into the slag in the form of zinc oxide. The verification test indicates that the main phases in the alloy contain metallic lead, metallic antimony and a small amount of Cu2Sb, FeSb2 intermetallic compounds, and the slag is mainly composed of kirschsteinite, fayalite and zinc oxide, in agreement with the thermodynamic analysis.
Sheet Metal Contract. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Kirkpatrick, Thomas; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of sheet metal, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train sheet metal workers. Section 1 contains general information: purpose of Phase I; description…
Predicted and tested performance of durable TPS
NASA Technical Reports Server (NTRS)
Shideler, John L.
1992-01-01
The development of thermal protection systems (TPS) for aerospace vehicles involves combining material selection, concept design, and verification tests to evaluate the effectiveness of the system. The present paper reviews verification tests of two metallic and one carbon-carbon thermal protection system. The test conditions are, in general, representative of Space Shuttle design flight conditions which may be more or less severe than conditions required for future space transportation systems. The results of this study are intended to help establish a preliminary data base from which the designers of future entry vehicles can evaluate the applicability of future concepts to their vehicles.
Cleanup Verification Package for the 600-47 Waste Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. J. Cutlip
This cleanup verification package documents completion of interim remedial action for the 600-47 waste site. This site consisted of several areas of surface debris and contamination near the banks of the Columbia River across from Johnson Island. Contaminated material identified in field surveys included four areas of soil, wood, nuts, bolts, and other metal debris.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-14
... respondents. See ``Verification of the Sales and Factors Response of Foshan Shunde (Guangzhou) Co., Ltd. in... ``Verification of the Sales and Factors Response of Since Hardware (Guangzhou) Co. Ltd. in the Antidumping Review... (Surrogate Country List). On August 17, 2010, the Department received information to value factors of...
FINAL REPORT FOR VERIFICATION OF THE METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (MFFPPT)
The United States Environmental Protection Agency (USEPA) has prepared a computer process simulation package for the metal finishing industry that enables users to predict process outputs based upon process inputs and other operating conditions. This report documents the developm...
Cleanup Verification Package for the 118-F-6 Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
H. M. Sulloway
2008-10-02
This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.
MOD-1 Wind Turbine Generator Analysis and Design Report, Volume 2
NASA Technical Reports Server (NTRS)
1979-01-01
The MOD-1 detail design is appended. The supporting analyses presented include a parametric system trade study, a verification of the computer codes used for rotor loads analysis, a metal blade study, and a definition of the design loads at each principal wind turbine generator interface for critical loading conditions. Shipping and assembly requirements, composite blade development, and electrical stability are also discussed.
2008-07-01
Table 24, Smelter Site Soil Lettuce Germination Percentage, and leachate data are referenced. The result for the sand soil (Table 22) was contrary to the hypothesized results. Archived samples of leachate from each treatment were examined, but after further investigation the pH and EC of the New Jersey leachate showed no remarkable differences between the unamended and sand-unamended treatments.
Evaluating DFT for Transition Metals and Binaries: Developing the V/DM-17 Test Set
NASA Astrophysics Data System (ADS)
Decolvenaere, Elizabeth; Mattsson, Ann
We have developed the V/DM-17 test set to evaluate the experimental accuracy of DFT calculations of transition metals. When simulation and experiment disagree, the disconnect in length scales and temperatures makes determining ``who is right'' difficult. However, methods to evaluate the experimental accuracy of functionals in the context of solid-state materials science, especially for transition metals, are lacking. As DFT undergoes a shift from a descriptive to a predictive tool, these issues of verification are becoming increasingly important. With undertakings like the Materials Project leading the way in high-throughput predictions and discoveries, the development of a one-size-fits-most approach to verification is critical. Our test set evaluates 26 transition metal elements and 80 transition metal alloys across three physical observables: lattice constants, elastic coefficients, and formation energies of alloys. Whether the formation energy can be reproduced measures whether the relevant physics is captured in a calculation. This is an especially important question in transition metals, where active d-electrons can thwart commonly used techniques. In testing the V/DM-17 test set, we offer new views into the performance of existing functionals. Sandia National Labs is a multi-mission laboratory managed and operated by Sandia Corp., a wholly owned subsidiary of Lockheed Martin Corp., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
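A comparison of the kind the test set performs can be sketched as a mean absolute percent error between computed and experimental observables; the element names and lattice constants below are invented placeholders, not values from the V/DM-17 set.

```python
# Sketch of scoring a DFT functional against experiment: mean absolute
# percent error (MAPE) over lattice constants. All values are invented
# placeholders, not data from the V/DM-17 test set.

# element -> (computed lattice constant, experimental lattice constant), in angstroms
RESULTS = {
    "Cu": (3.632, 3.615),
    "W":  (3.171, 3.165),
    "Ti": (2.939, 2.951),
}

def mape(results: dict) -> float:
    """Mean absolute percent error of computed vs. experimental values."""
    errors = [abs(calc - expt) / expt * 100.0
              for calc, expt in results.values()]
    return sum(errors) / len(errors)

score = mape(RESULTS)
```

A real benchmark would repeat this per functional and per observable (lattice constants, elastic coefficients, formation energies) and compare the scores across functionals.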
EPA ENVIRONMENTAL TECHNOLOGY EXPERIENCE
THE USEPA's Environmental Technology Verification for Metal Finishing Pollution Prevention Technologies (ETV-MF) Program verifies the performance of innovative, commercial-ready technologies designed to improve industry performance and achieve cost-effective pollution prevention ...
1985-09-01
The presence of high concentrations of polychlorinated biphenyls, polynuclear aromatic hydrocarbons, and heavy metals including Cu is noted. Cited works include "Measurement of the Responses of Individuals to Environmental Stress and Pollution: Studies With Bivalve Molluscs," Philosophical Transactions of the Royal Society, and Gilfillan, E.S. 1980, "The Use of Scope-for-growth Measurements in Monitoring Petroleum
NASA Astrophysics Data System (ADS)
Slamet, Bachtiar, B. M.; Wulan, P. P. D. K.; Setiadi, Sari, D. P.
2017-05-01
The development of a Ti6Al4V-based antibacterial dental implant, modified with silver (Ag)-doped TiO2 nanotube arrays (TiNTAs), is studied in this research. Because little photon energy is available inside the mouth, the dental implant material needs to be modified with silver-doped TiNTAs. TiNTAs modified with silver by the photo-assisted deposition (PAD) method can act as an electron trapper and produce hydroxyl radicals, which gives the material antibacterial properties. The antibacterial properties were verified with a static biofilm test using Streptococcus mutans as the model bacterium, with 3- and 16-hour incubations, and the material was characterized by XRD and SEM-EDX. The sample that resisted biofilm growth most effectively was TiNTAs/Ag/0.15, with 97.62% bacterial disinfection.
NASA Astrophysics Data System (ADS)
Lubina, A. S.; Subbotin, A. S.; Sedov, A. A.; Frolov, A. A.
2016-12-01
The fast sodium reactor fuel assembly (FA) with U-Pu-Zr metallic fuel is described. In comparison with a "classical" fast reactor, this FA contains thin fuel rods and a wider fuel rod grid. Studies of the fluid dynamics and the heat transfer were carried out for such a new FA design. The verification of the ANSYS CFX code was provided for determination of the velocity, pressure, and temperature fields in the different channels. The calculations in the cells and in the FA were carried out using the model of shear stress transport (SST) selected at the stage of verification. The results of the hydrodynamics and heat transfer calculations have been analyzed.
Cleanup Verification Package for the 118-F-1 Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. J. Farris and H. M. Sulloway
2008-01-10
This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.
NASA Technical Reports Server (NTRS)
Williamson, Steve; Aman, Bob; Aurigema, Andrew; Melendez, Orlando
1999-01-01
The Wiltech Component Cleaning & Refurbishment Facility (WT-CCRF) at NASA Kennedy Space Center performs precision cleaning on approximately 200,000 metallic and non-metallic components every year. WT-CCRF has developed a CFC elimination plan consisting of aqueous cleaning and verification and an economical dual-solvent strategy for alternative solvent solutions. Aqueous verification methodologies were implemented two years ago on a variety of Ground Support Equipment (GSE) components and sampling equipment. Today, 50% of the current workload is verified using aqueous methods, and 90% of the total workload is degreased aqueously using Zonyl and Brulin surfactants in ultrasonic baths. An additional estimated 20% solvent savings could be achieved if the proposed expanded use of aqueous methods is approved. Aqueous cleaning has been shown to be effective, environmentally friendly, and economical (i.e., cost of materials, equipment, facilities, and labor).
NASA Technical Reports Server (NTRS)
Cramer, B. A.; Davis, J. W.
1975-01-01
Analysis methods for predicting cyclic creep deflection in stiffened metal panel structures were applied to full-size panels. Results were compared with measured deflections from cyclic tests of thin-gage L605, Rene' 41, and TDNiCr full-size corrugation-stiffened panels. Design criteria were then formulated for metallic thermal protection panels subjected to creep. A computer program was developed to calculate creep deflections.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.
The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. M. Dittmer
2008-01-31
The 116-C-3 waste site consisted of two underground storage tanks designed to receive mixed waste from the 105-C Reactor Metals Examination Facility chemical dejacketing process. Confirmatory evaluation and subsequent characterization of the site determined that the southern tank contained approximately 34,000 L (9,000 gal) of dejacketing wastes and that the northern tank was unused. In accordance with this evaluation, the verification sampling and modeling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrate that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also show that residual contaminant concentrations are protective of groundwater and the Columbia River.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, M; Kang, S; Lee, S
Purpose: Implant-supported dentures seem particularly appropriate for patients who become edentulous, and cancer patients are no exception. As the number of people with dental implants has increased across ages, critical dosimetric verification of metal-artifact effects is required for more accurate head-and-neck radiation therapy. The purpose of this study is to verify the theoretical analysis of metal (streak and dark) artifacts and to evaluate the dosimetric effects caused by dental implants in CT images, using a humanoid phantom with patient teeth and implants inserted. Methods: The phantom comprises a cylinder shaped to simulate the anatomical structures of a human head and neck. By applying various clinical cases, the phantom was made to closely resemble a human; it can be configured in two states: (i) closed mouth and (ii) opened mouth. RapidArc plans of 4 cases were created in the Eclipse planning system. A total dose of 2000 cGy in 10 fractions was prescribed to the whole planning target volume (PTV) using 6-MV photon beams. The Acuros XB (AXB) advanced dose calculation algorithm, the Analytical Anisotropic Algorithm (AAA), and the progressive resolution optimizer were used in dose optimization and calculation. Results: In both the closed- and opened-mouth phantoms, because dark artifacts formed extensively around the metal implants, dose variation was higher than that for streak artifacts. When the PTV was delineated on the dark regions or large streak-artifact regions, a maximum 7.8% dose error and an average 3.2% difference were observed. The averaged minimum dose to the PTV predicted by AAA was about 5.6% higher, and OAR doses were also 5.2% higher, compared to AXB. Conclusion: The results of this study showed that AXB dose calculation involving high-density materials is more accurate than AAA calculation, and AXB was superior to AAA in dose predictions beyond the dark-artifact/air-cavity portion when compared against the measurements.
The U.S. Environmental Protection Agency (EPA), through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report describes ...
Microwelding of various metallic materials under ultravacuum (A0138-10)
NASA Technical Reports Server (NTRS)
Assie, J. P.
1984-01-01
In the space vacuum environment, spacecraft mechanisms are liable to sustain damaging effects from microwelds due to molecular diffusion of the spacecraft constituent metals. Such microwelds result in a continuing increase in the friction factors and are even liable to jam the mechanisms altogether. The object of this experiment is to check metal surfaces representative of the mechanism constituent metals (treated or untreated, lubricated or unlubricated) for microwelds after an extended stay in the space environment. The experimental approach is to passively expose inert metal specimens to the space vacuum and to conduct end-of-mission verification of the significance of microwelds between various pairs of metal washers. The experiment will be located in one of the FRECOPA boxes in a 12-in.-deep peripheral tray that contains nine other experiments from France.
NASA Technical Reports Server (NTRS)
1974-01-01
A monograph is presented which establishes structural design criteria and recommends practices to ensure the design of sound composite structures, including composite-reinforced metal structures. (It does not discuss design criteria for fiber-glass composites and such advanced composite materials as beryllium wire or sapphire whiskers in a matrix material.) Although the criteria were developed for aircraft applications, they are general enough to be applicable to space vehicles and missiles as well. The monograph covers four broad areas: (1) materials, (2) design, (3) fracture control, and (4) design verification. The materials portion deals with such subjects as material system design, material design levels, and material characterization. The design portion includes panel, shell, and joint design, applied loads, internal loads, design factors, reliability, and maintainability. Fracture control includes such items as stress concentrations, service-life philosophy, and the management plan for control of fracture-related aspects of structural design using composite materials. Design verification discusses ways to prove flightworthiness.
Probabilistic Sizing and Verification of Space Ceramic Structures
NASA Astrophysics Data System (ADS)
Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit
2012-07-01
Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
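The probabilistic sizing approach this abstract describes rests on the two-parameter Weibull strength distribution. A minimal Python sketch of how a probability of failure and the risk reduction from proof testing can be computed; the Weibull modulus m and characteristic strength used below are illustrative assumptions, not values from the paper:

```python
import math

def weibull_failure_probability(sigma, sigma0, m):
    """Two-parameter Weibull probability of failure for a ceramic part
    under a uniform applied stress sigma (same units as sigma0)."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

def failure_probability_after_proof_test(sigma, sigma_proof, sigma0, m):
    """Conditional failure probability at service stress sigma, given that
    the part survived a proof test at sigma_proof. The proof test removes
    the weakest flaws, truncating the strength distribution."""
    if sigma <= sigma_proof:
        return 0.0  # screened out deterministically in this idealised model
    p_survive_proof = math.exp(-((sigma_proof / sigma0) ** m))
    p_survive_service = math.exp(-((sigma / sigma0) ** m))
    return 1.0 - p_survive_service / p_survive_proof

# Illustrative values (not from the paper): m = 10, sigma0 = 300 MPa.
p_untested = weibull_failure_probability(250.0, 300.0, 10)
p_proofed = failure_probability_after_proof_test(250.0, 200.0, 300.0, 10)
```

The proof-tested probability is always at or below the untested one, which is the "risk reduction" role the abstract assigns to proof testing.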
The Environmental Technology Verification report discusses the technology and performance of the Lubrizol Engine Control Systems Purifilter SC17L manufactured by Lubrizol Engine Control Systems. The technology is a precious and base metal, passively regenerated particulate filter...
Hariri, Azian; Paiman, Nuur Azreen; Leman, Abdul Mutalib; Md Yusof, Mohammad Zainal
2014-08-01
This study aimed to develop an index that can rank welding workplaces in a way that associates well with the possible health risk of welders. The Welding Fumes Health Index (WFHI) was developed based on data from case studies conducted in Plant 1 and Plant 2. Personal sampling of welding fumes to assess the concentrations of metal constituents was conducted, along with a series of lung function tests. Fifteen metal constituents were investigated in each case study. Index values were derived from aggregation analysis of metal constituent concentrations, while significant lung functions were recognized through statistical analysis in each plant. The results showed that none of the metal constituent concentrations exceeded the permissible exposure limit (PEL) in any plant. However, statistical analysis showed significant mean differences in lung functions between welders and non-welders. The index was then applied to a third welding facility (Plant 3) for verification purposes. The developed index showed a promising ability to rank welding workplaces according to the multiple constituent concentrations of welding fumes, associating well with the lung functions of the investigated welders. There was a possibility that some of the metal constituents were below the detection limit, leading to a '0' value of a sub-index; thus, the multiplicative form of the aggregation model was not suitable for analysis. On the other hand, maximum- or minimum-operator forms suffer from compensation issues and were not considered in this study.
Verification of Ceramic Structures
NASA Astrophysics Data System (ADS)
Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit
2012-07-01
In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
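The transfer of Weibull statistics "from elementary data to a full-scale structure" mentioned above is commonly done via weakest-link volume scaling. A hedged one-function sketch; the function name and the numeric values are illustrative assumptions, not taken from the guideline:

```python
def scale_characteristic_strength(sigma0_ref, v_ref, v_part, m):
    """Weakest-link (Weibull) volume scaling: transfer the characteristic
    strength measured on test coupons of effective volume v_ref to a
    full-scale part of effective volume v_part. A larger stressed volume
    samples more flaws, so the expected strength drops."""
    return sigma0_ref * (v_ref / v_part) ** (1.0 / m)

# Illustrative numbers: coupons of unit volume, a part 1000x larger, m = 10.
sigma0_part = scale_characteristic_strength(300.0, 1.0, 1000.0, 10)
```

With a Weibull modulus of 10, a thousandfold volume increase reduces the characteristic strength by a factor of 1000^(1/10), roughly half.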
Mapping the Spatial Distribution of Metal-Bearing Oxides in VY Canis Majoris
NASA Astrophysics Data System (ADS)
Burkhardt, Andrew; Booth, S. Tom; Remijan, Anthony; Carroll, Brandon; Ziurys, Lucy M.
2015-06-01
The formation of silicate-based dust grains is not well constrained. Despite this, grain surface chemistry is essential to modern astrochemical formation models. In carbon-poor stellar envelopes, such as that of the red hypergiant VY Canis Majoris (VY CMa), metal-bearing oxides, the building blocks of silicate grains, dominate grain formation and are thus a key place to study dust chemistry. TiO2, which was only recently first detected at radio wavelengths (Kaminski et al., 2013a), has been proposed to be a critical molecule for silicate grain formation, rather than oxides containing more abundant metals (e.g., Si, Fe, and Mg) (Gail and Sedlmayr, 1998). In addition, other molecules, such as SO2, have been found to trace shells produced by numerous outflows pushing through the expanding envelope, resulting in a complex velocity structure (Ziurys et al., 2007). With the advanced capabilities of ALMA, it is now possible to individually resolve the velocity structure of each of these outflows and constrain the underlying chemistry of the region. Here, we present high-resolution maps of rotational transitions of several metal-bearing oxides in VY CMa from the ALMA Band 7 and Band 9 Science Verification observations. With these maps, the physical parameters of the region and the formation chemistry of metal-bearing oxides will be studied.
Predicting Shear Transformation Events in Metallic Glasses
NASA Astrophysics Data System (ADS)
Xu, Bin; Falk, Michael L.; Li, J. F.; Kong, L. T.
2018-03-01
Shear transformation is the elementary process of plastic deformation in metallic glasses; predicting the occurrence of shear transformation events is therefore of vital importance for understanding the mechanical behavior of metallic glasses. In this Letter, from the viewpoint of the potential energy landscape, we find that the protocol-dependent behavior of a shear transformation is governed by the stress gradient along its minimum energy path, and we propose a framework as well as an atomistic approach to predict the triggering strains, locations, and structural transformations of shear transformation events under different shear protocols in metallic glasses. Verification with a model Cu64Zr36 metallic glass reveals that the prediction agrees well with athermal quasistatic shear simulations. The proposed framework is believed to provide an important tool for developing a quantitative understanding of the deformation processes that control the mechanical behavior of metallic glasses.
Predicting Shear Transformation Events in Metallic Glasses.
Xu, Bin; Falk, Michael L; Li, J F; Kong, L T
2018-03-23
Shear transformation is the elementary process of plastic deformation in metallic glasses; predicting the occurrence of shear transformation events is therefore of vital importance for understanding the mechanical behavior of metallic glasses. In this Letter, from the viewpoint of the potential energy landscape, we find that the protocol-dependent behavior of a shear transformation is governed by the stress gradient along its minimum energy path, and we propose a framework as well as an atomistic approach to predict the triggering strains, locations, and structural transformations of shear transformation events under different shear protocols in metallic glasses. Verification with a model Cu64Zr36 metallic glass reveals that the prediction agrees well with athermal quasistatic shear simulations. The proposed framework is believed to provide an important tool for developing a quantitative understanding of the deformation processes that control the mechanical behavior of metallic glasses.
This test simulated shipments of hazardous waste contained in polyethylene (poly) drums, metal drums, and corrugated boxes through routine land transportation routes and across international ports of entry in the El Paso/Ciudad Juarez trade area. RFID tags were attached to four ...
The TraceDetect's SafeGuard is designed to automatically measure total arsenic concentrations in drinking water samples (including raw water and treated water) over a range from 1 ppb to over 100 ppb. Once the operator has introduced the sample vial and selected "measure&qu...
Mechanisms and dynamics of protonation and lithiation of ferrocene.
Sharma, Nishant; Ajay, Jayanth K; Venkatasubbaiah, Krishnan; Lourderaj, Upakarasamy
2015-09-14
Verification of the mechanism of protonation, the simplest electrophilic substitution reaction of ferrocene, has long been a difficulty. In the work reported here, ab initio chemical dynamics simulations were performed at the B3LYP/DZVP level of theory to understand the atomic-level mechanisms of protonation and lithiation of ferrocene. Protonation of ferrocene resulted in the agostic and metal-protonated forms. Trajectory calculations revealed that protonation of ferrocene occurs by exo and endo mechanisms, with exo being the major path. H(+) was found to be mobile and hopped from the Cp ring to the metal center and vice versa after the initial attack on ferrocene, with the metal complex having a shorter lifetime. These results remove the ambiguity surrounding the mechanism proposed in earlier experimental and computational studies. Lithiation of ferrocene resulted in the formation of cation-π and metal-lithiated complexes. Similar to protonation, trajectory results revealed that both exo and endo paths were followed, with the exo path being the major one. In addition, lithiated ferrocene exhibited planetary motion. The major path (exo) followed in the protonation and lithiation of ferrocene is consistent with observations in earlier experimental studies of other hard electrophiles.
Study on numerical simulation of asymmetric structure aluminum profile extrusion based on ALE method
NASA Astrophysics Data System (ADS)
Chen, Kun; Qu, Yuan; Ding, Siyi; Liu, Changhui; Yang, Fuyong
2018-05-01
Using the HyperXtrude module based on the Arbitrary Lagrangian-Eulerian (ALE) finite element method, this paper successfully simulates the steady extrusion process of an asymmetric-structure aluminum die. A verification experiment was carried out to check the simulation results. After obtaining and analyzing the stress-strain field, temperature field, and extrusion velocity of the metal, it is confirmed that the simulation predictions and the experimental results are consistent. The scheme of die correction and optimization is discussed last. By adjusting the bearing length and core thickness, and by adopting a feeder-plate protection structure and a short shunt bridge in the upper die and a three-level bonding container in the lower die to control the metal flow, a qualified aluminum profile can be obtained.
Self-Organized Defects of Half-Metallic Nanowires in MgO-Based Magnetic Tunnel Junctions
NASA Astrophysics Data System (ADS)
Seike, Masayoshi; Fukushima, Tetsuya; Sato, Kazunori; Katayama-Yoshida, Hiroshi
2013-03-01
The purpose of this study is to examine the possibility of self-organization of defects and defect-induced properties in MgO-based magnetic tunnel junctions (MTJs). Using the Heyd-Scuseria-Ernzerhof (HSE06) hybrid functional, first-principles calculations were performed to estimate the electronic structures and total energies of MgO with various defects. From our thorough evaluation of the calculated results and previously reported experimental data, we propose that self-organized half-metallic nanowires of magnesium vacancies can be formed in MgO-based MTJs. This self-organization may provide the foundation for a comprehensive understanding of the conductivity, tunnel barriers and quantum oscillations of MgO-based MTJs. Further experimental verification is needed before firm conclusions can be drawn.
Non-destructive forensic latent fingerprint acquisition with chromatic white light sensors
NASA Astrophysics Data System (ADS)
Leich, Marcus; Kiltz, Stefan; Dittmann, Jana; Vielhauer, Claus
2011-02-01
Non-destructive latent fingerprint acquisition is an emerging field of research, which, unlike traditional methods, makes latent fingerprints available for additional verification or further analysis like tests for substance abuse or age estimation. In this paper a series of tests is performed to investigate the overall suitability of a high resolution off-the-shelf chromatic white light sensor for the contact-less and non-destructive latent fingerprint acquisition. Our paper focuses on scanning previously determined regions with exemplary acquisition parameter settings. 3D height field and reflection data of five different latent fingerprints on six different types of surfaces (HDD platter, brushed metal, painted car body (metallic and non-metallic finish), blued metal, veneered plywood) are experimentally studied. Pre-processing is performed by removing low-frequency gradients. The quality of the results is assessed subjectively; no automated feature extraction is performed. Additionally, the degradation of the fingerprint during the acquisition period is observed. While the quality of the acquired data is highly dependent on surface structure, the sensor is capable of detecting the fingerprint on all sample surfaces. On blued metal the residual material is detected; however, the ridge line structure dissolves within minutes after fingerprint placement.
Digital video system for on-line portal verification
NASA Astrophysics Data System (ADS)
Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott
1990-07-01
A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress the noise. Some elements of automated radiotherapy treatment verification have been introduced.
Surface plasmon resonances in liquid metal nanoparticles
NASA Astrophysics Data System (ADS)
Ershov, A. E.; Gerasimov, V. S.; Gavrilyuk, A. P.; Karpov, S. V.
2017-06-01
We have shown significant suppression of resonant properties of metallic nanoparticles at the surface plasmon frequency during the phase transition "solid-liquid" in the basic materials of nanoplasmonics (Ag, Au). Using experimental values of the optical constants of liquid and solid metals, we have calculated nanoparticle plasmonic absorption spectra. The effect was demonstrated for single particles, dimers and trimers, as well as for the large multiparticle colloidal aggregates. Experimental verification was performed for single Au nanoparticles heated to the melting temperature and above up to full suppression of the surface plasmon resonance. It is emphasized that this effect may underlie the nonlinear optical response of composite materials containing plasmonic nanoparticles and their aggregates.
Hydrogen and helium under high pressure - A case for a classical theory of dense matter
NASA Astrophysics Data System (ADS)
Celebonovic, Vladan
1989-06-01
When subjected to high pressure, H2 and He-3 are expected to undergo phase transitions and to become metallic at a sufficiently high pressure. Using the semiclassical theory of dense matter proposed by Savic and Kasanin, calculations of the phase transition and metallization pressures have been performed for these two materials. In hydrogen, metallization occurs at p(M) = (3.0 ± 0.2) Mbar, while for helium the corresponding value is (106 ± 1) Mbar. A phase transition occurs in helium at p(tr) = (10.0 ± 0.4) Mbar. These values are close to the results obtainable by more rigorous methods. Possibilities of experimental verification of the calculations are briefly discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-31
... facilities and the examination of relevant sales and financial records. Our verification results are outlined... financial statements of Ratnamani Metals & Tubes, Ltd. (``Ratnamani'') and Jindal SAW, Ltd. (``Jindal'') for... submission at Exhibits 10 and 11. Ratnamani's and Jindal's financial statements at 2, 39, 41, 42, and 44 and...
Quality Assurance Project Plan For Verification of ANDalyze Lead100 Test Kit and AND1000 Fluorimeter
Lead (Pb) is a naturally occurring metal in the aquatic environment; however, most Pb contamination of concern arises from anthropogenic sources (such as deposition of Pb dust from combustion processes in natural waterways or due to its use in plumbing materials). Although Pb is...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... subsequent soil samples showed levels of metals at or below generic residential criteria or background values... 1994- 1996 and additional sampling between 1998 and 2007. Area A--Site Entrance: Soil boring samples... verification samples. Additional soil samples were collected from the same location as the previous collection...
Structural Deterministic Safety Factors Selection Criteria and Verification
NASA Technical Reports Server (NTRS)
Verderaime, V.
1992-01-01
Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratios are rooted in the resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard-deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index of the combined safety factors was derived, from which the corresponding reliability proved the deterministic method is not reliability sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
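A minimal sketch of the three-factor deterministic check described above; this is an illustrative reading of the abstract with assumed stress values, not the paper's exact formulation:

```python
def deterministic_margin(applied_mean, applied_std, n_sigma,
                         sigma_ultimate_basis, sigma_yield):
    """Illustrative three-factor deterministic check:
      1) the design applied stress is mean + n_sigma * std of the
         applied-stress distribution;
      2) sigma_ultimate_basis is the A- or B-basis (K-factored) material
         ultimate stress;
      3) the conventional safety factor is the ultimate-to-yield ratio,
         keeping operation out of the inelastic zone.
    Returns a margin of safety (> 0 means the check passes)."""
    design_stress = applied_mean + n_sigma * applied_std
    safety_factor = sigma_ultimate_basis / sigma_yield
    allowable = sigma_ultimate_basis / safety_factor  # reduces to sigma_yield
    return allowable / design_stress - 1.0

# Assumed values in MPa: applied 200 +/- 10, 3-sigma design, basis
# ultimate 400, yield 300.
margin = deterministic_margin(200.0, 10.0, 3.0, 400.0, 300.0)
```

Dividing the basis ultimate by the ultimate-to-yield ratio deliberately collapses the allowable to the yield stress, which is the point of the conventional factor: the 3-sigma design stress must stay elastic.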
HARIRI, Azian; PAIMAN, Nuur Azreen; LEMAN, Abdul Mutalib; MD. YUSOF, Mohammad Zainal
2014-01-01
Abstract Background This study aimed to develop an index that can rank welding workplaces in a way that associates well with the possible health risk of welders. Methods The Welding Fumes Health Index (WFHI) was developed based on data from case studies conducted in Plant 1 and Plant 2. Personal sampling of welding fumes to assess the concentrations of metal constituents was conducted, along with a series of lung function tests. Fifteen metal constituents were investigated in each case study. Index values were derived from aggregation analysis of metal constituent concentrations, while significant lung functions were recognized through statistical analysis in each plant. Results The results showed that none of the metal constituent concentrations exceeded the permissible exposure limit (PEL) in any plant. However, statistical analysis showed significant mean differences in lung functions between welders and non-welders. The index was then applied to a third welding facility (Plant 3) for verification purposes. The developed index showed a promising ability to rank welding workplaces according to the multiple constituent concentrations of welding fumes, associating well with the lung functions of the investigated welders. Conclusion There was a possibility that some of the metal constituents were below the detection limit, leading to a '0' value of a sub-index; thus, the multiplicative form of the aggregation model was not suitable for analysis. On the other hand, maximum- or minimum-operator forms suffer from compensation issues and were not considered in this study. PMID:25927034
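The aggregation problem noted in the conclusion, where a single below-detection-limit constituent forces a multiplicative index to zero, can be illustrated directly; the sub-index values and equal weights below are hypothetical, not the WFHI's actual parameters:

```python
def multiplicative_index(sub_indices, weights):
    """Weighted geometric-mean aggregation: a single zero sub-index
    (a constituent below the detection limit) forces the whole index
    to zero, which is the unsuitability noted in the conclusion."""
    result = 1.0
    for s, w in zip(sub_indices, weights):
        result *= s ** w
    return result

def max_operator_index(sub_indices):
    """Maximum-operator aggregation: driven entirely by the single worst
    constituent, so the remaining constituents cannot influence the
    result (the compensation issue mentioned in the conclusion)."""
    return max(sub_indices)

# Hypothetical sub-index values; the first constituent is below detection.
subs = [0.0, 0.8, 0.5]
w = [1 / 3, 1 / 3, 1 / 3]
print(multiplicative_index(subs, w))  # 0.0, regardless of the other values
print(max_operator_index(subs))       # 0.8, ignores all but the maximum
```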
Implications of Artefacts Reduction in the Planning CT Originating from Implanted Fiducial Markers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kassim, Iskandar, E-mail: i.binkassim@erasmusmc.n; Joosten, Hans; Barnhoorn, Jaco C.
The efficacy of metal artefact reduction (MAR) software to suppress artefacts in reconstructed computed tomography (CT) images originating from small metal objects, like tumor markers and surgical clips, was evaluated. In addition, possible implications of using digitally reconstructed radiographs (DRRs), based on the MAR CT images, for setup verification were analyzed. A phantom and 15 patients with different tumor sites and implanted markers were imaged with a multislice CT scanner. The raw image data were reconstructed both with the clinically used filtered-backprojection (FBP) algorithm and with the MAR software. Using the MAR software, improvements in image quality were often observed in CT slices with markers or clips. Especially when several markers were located near to each other, fewer streak artefacts were observed than with the FBP algorithm. In addition, the shape and size of markers could be identified more accurately, reducing the contoured marker volumes by a factor of 2. For the phantom study, the CT numbers measured near the markers corresponded more closely to the expected values. However, the MAR images were slightly more smoothed compared with the images reconstructed with FBP. For the 8 prostate cancer patients in this study, the interobserver variation in 3D marker definition was similar (<0.4 mm) when using DRRs based on either FBP or MAR CT scans. Automatic marker matches also showed a similar success rate. However, differences in automatic match results of up to 1 mm, caused by differences in the marker definition, were observed, which turned out to be (borderline) statistically significant (p = 0.06) for 2 patients. In conclusion, the MAR software might improve image quality by suppressing metal artefacts, probably allowing for a more reliable delineation of structures. When implanted markers or clips are used for setup verification, the accuracy may be slightly improved as well, which is relevant when using very tight clinical target volume (CTV) to planning target volume (PTV) margins for planning.
A sigmoidal model for biosorption of heavy metal cations from aqueous media.
Özen, Rümeysa; Sayar, Nihat Alpagu; Durmaz-Sam, Selcen; Sayar, Ahmet Alp
2015-07-01
A novel multi-input single-output (MISO) black-box sigmoid model is developed to simulate the biosorption of heavy metal cations by the fission yeast from aqueous medium. Validation and verification of the model are done through statistical chi-squared hypothesis tests, and the model is evaluated by uncertainty and sensitivity analyses. The simulated results are in agreement with the data of the studied system, in which Schizosaccharomyces pombe biosorbs Ni(II) cations at various process conditions. Experimental data were obtained originally for this work using dead cells of an adapted variant of S. pombe and represented by Freundlich isotherms. A process optimization scheme is proposed using the present model to build a novel application of a cost-merit objective function, which would be useful for predicting optimal operation conditions. Copyright © 2015. Published by Elsevier Inc.
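The Freundlich-isotherm representation of the biosorption data mentioned above can be sketched as follows; the constants K_F and n here are illustrative placeholders, not the paper's fitted values:

```python
import math

def freundlich_uptake(c_eq, k_f, n):
    """Freundlich isotherm q = K_F * C**(1/n): equilibrium metal uptake q
    (e.g. mg Ni(II) per g dry biomass) at equilibrium concentration c_eq.
    K_F and n are empirical constants fitted per biosorbent and condition."""
    return k_f * c_eq ** (1.0 / n)

def fit_freundlich(c_values, q_values):
    """Estimate K_F and n by linearising log q = log K_F + (1/n) log C
    and applying ordinary least squares on the logs."""
    xs = [math.log(c) for c in c_values]
    ys = [math.log(q) for q in q_values]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), 1.0 / slope  # K_F, n
```

On noiseless synthetic data the log-linear fit recovers the generating constants exactly, which makes it a convenient self-check before fitting real isotherm measurements.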
Self-verification motives at the collective level of self-definition.
Chen, Serena; Chen, Karen Y; Shaw, Lindsay
2004-01-01
Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.
NASA Astrophysics Data System (ADS)
Wu, Yu; Zhang, Hongpeng
2017-12-01
A new microfluidic chip is presented to enhance the sensitivity of a micro inductive sensor, and an approach to coil inductance change calculation is introduced for metal particle detection in lubrication oil. Electromagnetic knowledge is used to establish a mathematical model of an inductive sensor for metal particle detection, and the analytic expression of coil inductance change is obtained by a magnetic vector potential. Experimental verification is carried out. The results show that copper particles 50-52 µm in diameter have been detected; the relative errors between the theoretical and experimental values are 7.68% and 10.02% at particle diameters of 108-110 µm and 50-52 µm, respectively. The approach presented here can provide a theoretical basis for an inductive sensor in metal particle detection in oil and other areas of application.
Standards for the calibration of extensometers
NASA Astrophysics Data System (ADS)
Loveday, Malcolm S.
1991-10-01
The consequences of the impending publication of a new European Standard, BS EN 10002 Pt 4 'Metallic Materials: Verification of Extensometers Used in Uniaxial Testing', which was based on the equivalent International Standard ISO 9513, are considered within the context of the new standard superseding the present British Standard, BS 3846. The three standards are compared and the differences are highlighted.
NASA Astrophysics Data System (ADS)
Fatemi, Javad
2011-05-01
The thermal protection system of the EXPERT re-entry vehicle is subjected to accelerations, vibrations, acoustic and shock loads during launch and aero-heating loads and aerodynamic forces during re-entry. To fully understand the structural and thermomechanical performances of the TPS, heat transfer analysis, thermal stress analysis, and thermal buckling analysis must be performed. This requires complex three-dimensional thermal and structural models of the entire TPS including the insulation and sensors. Finite element (FE) methods are employed to assess the thermal and structural response of the TPS to the mechanical and aerothermal loads. The FE analyses results are used for the design verification and design improvement of the EXPERT thermal protection system.
NASA Technical Reports Server (NTRS)
Sanchez, A.; Davis, C. F., Jr.; Liu, K. C.; Javan, A.
1978-01-01
A theoretical analysis of the metal-oxide-metal (MOM) antenna/diode as a detector of microwave and infrared radiation is presented, with experimental verification conducted in the far infrared. It is shown that the detectivity at room temperature can be as high as 10^10 Hz^(1/2)/W at frequencies of 10^14 Hz in the infrared. As a result, design guidelines are obtained for the lithographic fabrication of thin-film MOM structures that are to operate in the 10-micron region of the infrared spectrum.
Shen, Xiaomei; Liu, Wenqi; Gao, Xuejiao; Lu, Zhanghui; Wu, Xiaochun; Gao, Xingfa
2015-12-23
Metal and alloy nanomaterials have intriguing oxidase- and superoxide dismutation-like (SOD-like) activities. However, the origins of these activities remain to be studied. Using density functional theory (DFT) calculations, we investigate the mechanisms of the oxidase- and SOD-like properties of the metals Au, Ag, Pd, and Pt and the alloys Au4-xMx (x = 1, 2, 3; M = Ag, Pd, Pt). We find that the simple dissociation reaction of O2 adsorbed on metal surfaces can profoundly account for the oxidase-like activities of the metals. The activation (Eact) and reaction energies (Er) calculated by DFT can be used to effectively predict the activity. As verification, the calculated activity orders for a series of metal and alloy nanomaterials are in excellent agreement with those obtained by experiments. Briefly, the activity is critically dependent on two factors: metal composition and exposed facets. On the basis of these results, an energy-based model is proposed to account for the activation of molecular oxygen. As for the SOD-like activities, the mechanisms mainly consist of protonation of O2(•-) and adsorption and rearrangement of HO2(•) on metal surfaces. Our results provide atomistic-level insights into the oxidase- and SOD-like activities of metals and pave the way to the rational design of mimetic enzymes based on metal nanomaterials. In particular, the O2 dissociative adsorption mechanism will serve as a general route to the activation of molecular oxygen by nanosurfaces and help in understanding the catalytic role of nanomaterials as pro-oxidants and antioxidants.
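The claim that DFT activation energies predict activity order can be illustrated with a simple Arrhenius-type ranking; the activation energies below are hypothetical placeholders for illustration only, not the paper's DFT results:

```python
import math

KB_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def relative_rate(e_act_ev, temperature_k=298.15):
    """Arrhenius-type relative rate factor exp(-Eact / (kB*T)): a lower
    activation energy for O2 dissociation implies a higher oxidase-like
    activity, which is the ordering principle described above."""
    return math.exp(-e_act_ev / (KB_EV * temperature_k))

# Hypothetical activation energies in eV (placeholders, not DFT values).
e_act = {"Pt": 0.30, "Pd": 0.50, "Au": 0.90}
ranking = sorted(e_act, key=lambda metal: e_act[metal])  # most active first
```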
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Ryan M.; Suter, Jonathan D.; Jones, Anthony M.
2014-09-12
This report documents FY14 efforts for two instrumentation subtasks under storage and transportation. These instrumentation tasks relate to developing effective nondestructive evaluation (NDE) methods and techniques to (1) verify the integrity of metal canisters for the storage of used nuclear fuel (UNF) and to (2) verify the integrity of dry storage cask internals.
Advances in Additive Manufacturing
2016-07-14
Analysis examples will include quantification of tolerance differences between the designed and manufactured parts and detection of voids in 3D-printed structures. Subject terms: 3-D printing, validation and verification, nondestructive inspection, print-on-the-move, prototyping. Related efforts include 1) researching the formation of AM-grade metal powder from battlefield scrap and operating-base waste, and 2) the potential of 3-D printing with sand to make …
The Results of 45 Years of Atmospheric Corrosion Study in the Czech Republic.
Kreislova, Katerina; Knotkova, Dagmar
2017-04-07
Atmospheric corrosion poses a significant problem with regard to destruction of various materials, especially metals. Observations made over the past decades suggest that the world's climate is changing. Besides global warming, there are also changes in other parameters. For example, average annual precipitation increased by nearly 10% over the course of the 20th century. In Europe, the most significant change, from the atmospheric corrosion point of view, was an increase in SO₂ pollution in the 1970s through the 1980s and a subsequent decrease in this same industrial air pollution and an increase in other types of air pollution, which created a so-called multi-pollutant atmospheric environment. Exposed metals react to such changes immediately, even if the corrosion attack started in highly corrosive atmospheres. This paper presents a comprehensive evaluation of the effect of air pollution and other environmental parameters and a verification of dose/response equations for conditions in the Czech Republic.
Masuda, Shumpei; Tan, Kuan Y; Partanen, Matti; Lake, Russell E; Govenius, Joonas; Silveri, Matti; Grabert, Hermann; Möttönen, Mikko
2018-03-02
We experimentally study nanoscale normal-metal-insulator-superconductor junctions coupled to a superconducting microwave resonator. We observe that bias-voltage-controllable single-electron tunneling through the junctions gives rise to a direct conversion between the electrostatic energy and that of microwave photons. The measured power spectral density of the microwave radiation emitted by the resonator exceeds, at high bias voltages, that of an equivalent single-mode radiation source at 2.5 K, although the phonon and electron reservoirs are at subkelvin temperatures. Measurements of the generated power quantitatively agree with a theoretical model over a wide range of bias voltages. Thus, we have developed a microwave source which is compatible with low-temperature electronics and offers convenient in situ electrical control of the incoherent photon emission rate at a predetermined frequency, without relying on intrinsic voltage fluctuations of heated normal-metal components or suffering from unwanted losses in room-temperature cables. Importantly, our observation of negative generated power at relatively low bias voltages provides a novel type of verification of the working principles of the recently discovered quantum-circuit refrigerator.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Intervalence transfer of ferrocene moieties adsorbed on electrode surfaces by a conjugated linkage
NASA Astrophysics Data System (ADS)
Chen, Wei; Brown, Lauren E.; Konopelski, Joseph P.; Chen, Shaowei
2009-03-01
Effective intervalence transfer occurred between the metal centers of ferrocene moieties that were adsorbed onto a ruthenium thin film surface by ruthenium-carbene π bonds, a direct verification of Hush's four-decade-old prediction. Electrochemical measurements showed two pairs of voltammetric peaks where the separation of the formal potentials suggested a Class II behavior. Additionally, the potential spacing increased with increasing ferrocene surface coverage, most probably as a consequence of the enhanced contribution from through-space electronic interactions between the metal centers. In contrast, the incorporation of an sp3 carbon spacer into the ferrocene-ruthenium linkage led to the diminishment of interfacial electronic communication.
Toward Paradoxical Inconsistency in Electrostatics of Metallic Conductors
Current developments show a clear trend toward more serious efforts in validation and verification (V and V) of the physical and engineering models underlying the physics. Naturally, when dealing with fundamental problems, the V and V effort should include careful exploration and, if necessary, revision of the fundamentals. With this understanding in mind, we review some fundamentals of the models of crystalline electric conductors and find a …
A Nonvolume Preserving Plasticity Theory with Applications to Powder Metallurgy
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1983-01-01
A plasticity theory has been developed to predict the mechanical response of powder metals during hot isostatic pressing. The theory parameters were obtained through an experimental program consisting of hydrostatic pressure tests, uniaxial compression and uniaxial tension tests. A nonlinear finite element code was modified to include the theory, and the results of the modified code compared favorably to the results from a verification experiment.
NASA Astrophysics Data System (ADS)
Fay, Aurélien; Browning, Clyde; Brandt, Pieter; Chartoire, Jacky; Bérard-Bergery, Sébastien; Hazart, Jérôme; Chagoya, Alexandre; Postnikov, Sergei; Saib, Mohamed; Lattard, Ludovic; Schavione, Patrick
2016-03-01
Massively parallel mask-less electron beam lithography (MP-EBL) offers large intrinsic flexibility at a low cost of ownership in comparison to conventional optical lithography tools. This attractive direct-write technique needs a dedicated data preparation flow to correct both electronic and resist processes. Moreover, Data Prep has to be completed in a short enough time to preserve the flexibility advantage of MP-EBL. While MP-EBL tools have currently entered an advanced stage of development, this paper focuses on the data preparation side of the work, specifically for the MAPPER Lithography FLX-1200 tool [1]-[4], using the ASELTA Nanographics Inscale software. The complete flow, as well as the methodology used to achieve full-field layout data preparation within an acceptable cycle time, is presented. The layout used for Data Prep evaluation was a 28 nm technology node Metal1 chip with a field size of 26x33 mm2, compatible with typical stepper/scanner field sizes and wafer stepping plans. Proximity Effect Correction (PEC) was applied to the entire field, which was then exported as a single file in MAPPER Lithography's machine format, containing fractured shapes and dose assignments. The Soft Edge beam-to-beam stitching method was employed in the specific overlap regions defined by the machine format as well. In addition to PEC, verification of the correction was included as part of the overall data preparation cycle time. This verification step was executed on the machine file format to ensure pattern fidelity and accuracy as late in the flow as possible. Verification over the full chip, involving billions of evaluation points, is performed both at nominal conditions and at Process Window corners in order to ensure proper exposure and process latitude. The complete MP-EBL data preparation flow was demonstrated for a 28 nm node Metal1 layout in 37 hours. 
The final verification step shows that the Edge Placement Error (EPE) is kept below 2.25 nm over an exposure dose variation of 8%.
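In passing, the pass/fail criterion such a verification step applies amounts to a tolerance check over the evaluated edge-placement errors. A minimal sketch with a hypothetical function name and sample values (this is not the Inscale implementation):

```python
def epe_within_spec(epe_values_nm, tol_nm=2.25):
    """Return True if every edge-placement error sample is within tolerance.

    epe_values_nm: EPE samples (nm) collected at contour evaluation points,
    at both nominal conditions and Process Window corners.
    """
    return all(abs(e) <= tol_nm for e in epe_values_nm)

# Hypothetical EPE samples (nm) from a verification run
samples = [0.4, -1.1, 2.0, -2.2, 1.7]
print(epe_within_spec(samples))  # True: all samples inside the 2.25 nm budget
```

In a full-chip run this check is applied to billions of evaluation points, so the early-exit behavior of `all` matters in practice.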
Ghorbani Moghaddam, Masoud; Achuthan, Ajit; Bednarcyk, Brett A; Arnold, Steven M; Pineda, Evan J
2016-05-04
A multiscale computational model is developed for determining the elasto-plastic behavior of polycrystal metals by employing a single crystal plasticity constitutive model that can capture the microstructural scale stress field on a finite element analysis (FEA) framework. The generalized method of cells (GMC) micromechanics model is used for homogenizing the local field quantities. At first, the stand-alone GMC is applied for studying simple material microstructures such as a repeating unit cell (RUC) containing a single grain or two grains under uniaxial loading conditions. For verification, the results obtained by the stand-alone GMC are compared to those from an analogous FEA model incorporating the same single crystal plasticity constitutive model. This verification is then extended to samples containing tens to hundreds of grains. The results demonstrate that GMC homogenization combined with the crystal plasticity constitutive framework is a promising approach for failure analysis of structures, as it allows for properly predicting the von Mises stress in the entire RUC, in an average sense, as well as at the local microstructural level, i.e., in each individual grain. Two to three orders of magnitude of savings in computational cost, at the expense of some accuracy in prediction, especially in the prediction of the components of local tensor field quantities and the quantities near the grain boundaries, were obtained with GMC. Finally, the capability of the developed multiscale model linking FEA and GMC to solve real-life-sized structures is demonstrated by successfully analyzing an engine disc component and determining the microstructural scale details of the field quantities.
Laboratory simulation of heat exchange for liquids with Pr > 1: Heat transfer
NASA Astrophysics Data System (ADS)
Belyaev, I. A.; Zakharova, O. D.; Krasnoshchekova, T. E.; Sviridov, V. G.; Sukomel, L. A.
2016-02-01
Liquid metals are promising heat-transfer agents for new-generation nuclear power plants, such as fast-neutron reactors and hybrid tokamaks (fusion neutron sources, FNSs). We have been investigating the hydrodynamics and heat exchange of liquid metals for many years, trying to reproduce conditions close to those in fast reactors and fusion neutron sources. In the latter case, the liquid-metal flow takes place in a strong magnetic field under strong thermal loads, resulting in the development of thermogravitational convection in the flow. In this case, quite dangerous regimes of magnetohydrodynamic (MHD) heat exchange, not known earlier, may occur; in combination with other long-known effects, such as the growth of hydraulic drag in a strong magnetic field, they make the creation of a reliable FNS cooling system with a liquid-metal heat carrier problematic. A reasonable alternative to liquid metals in FNSs is molten salts, namely the melt of lithium and beryllium fluorides (Flibe) and the melt of fluorides of alkali metals (Flinak). Molten salts, however, are poorly studied media, and their application requires detailed scientific substantiation. We analyze the modern state of the art of studies in this field. Our contribution is to answer the following question: can the above-mentioned extremely dangerous regimes of MHD heat exchange detected in liquid metals also exist in molten salts? Experiments and numerical simulation were performed in order to answer this question. The experimental test facility is a water circuit, since water (or water with additives for increasing its electrical conductivity) is a convenient medium for laboratory simulation of salt heat exchange under FNS conditions. Local heat-transfer coefficients along the heated tube and three-dimensional (along the length and in the cross section, including the viscous sublayer) fields of averaged temperature and temperature pulsations are studied. 
The probe method for measurements in a flow is described in detail. The experimental data are intended for verification of codes simulating heat exchange of molten salts.
LMFBR system-wide transient analysis: the state of the art and US validation needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khatib-Rahbar, M.; Guppy, J.G.; Cerbone, R.J.
1982-01-01
This paper summarizes the computational capabilities in the area of liquid metal fast breeder reactor (LMFBR) system-wide transient analysis in the United States, identifies various numerical and physical approximations, the degree of empiricism, range of applicability, model verification and experimental needs for a wide class of protected transients, in particular, natural circulation shutdown heat removal for both loop- and pool-type plants.
An advanced selective liquid-metal plating technique for stretchable biosensor applications.
Li, Guangyong; Lee, Dong-Weon
2017-10-11
This paper presents a novel stretchable pulse sensor fabricated by a selective liquid-metal plating process (SLMP), which can be conveniently attached to human skin to monitor a patient's heartbeat. The liquid metal-based stretchable pulse sensor consists of polydimethylsiloxane (PDMS) thin films and liquid metal functional circuits with electronic elements that are embedded into the PDMS substrate. In order to verify the utility of the fabrication process, various complex liquid-metal patterns are achieved by using the selective wetting behavior of the reduced liquid metal on the Cu patterns of the PDMS substrate. The smallest liquid-metal pattern is approximately 2 μm in width with a uniform surface. After verification, a transparent flowing LED light with programmed circuits is realized and exhibits stable mechanical and electrical properties under various deformations (bending, twisting and stretching). Finally, based on SLMP, a wireless pulse measurement system is developed which is composed of the liquid metal-based stretchable pulse sensor, a Bluetooth module, an Arduino development board, a laptop computer and a self-programmed visualization software program. The experimental results reveal that the portable non-invasive pulse sensor has the potential to reduce costs, simplify biomedical diagnostic procedures and help patients to improve their lives in the future.
Trace Elements and Healthcare: A Bioinformatics Perspective.
Zhang, Yan
2017-01-01
Biological trace elements are essential for human health. Imbalance in trace element metabolism and homeostasis may play an important role in a variety of diseases and disorders. While the majority of previous research focused on experimental verification of genes involved in trace element metabolism and those encoding trace element-dependent proteins, bioinformatics research on trace elements is relatively rare and still at an early stage. This chapter offers an overview of recent progress in bioinformatics analyses of trace element utilization, metabolism, and function, especially comparative genomics of several important metals. The relationship between individual elements and several diseases, based on recent large-scale systematic studies such as genome-wide association studies and case-control studies, is discussed. Lastly, the development of ionomics and its recent application in human health are also introduced.
Strain-dependent activation energy of shear transformation in metallic glasses
NASA Astrophysics Data System (ADS)
Xu, Bin; Falk, Michael; Li, Jinfu; Kong, Lingti
2017-04-01
Shear transformation (ST) plays a decisive role in determining the mechanical behavior of metallic glasses and is believed to be a stress-assisted, thermally activated process. Understanding the dependence of its activation energy on the stress imposed on the material is of central importance for modeling the deformation of metallic glasses and other amorphous solids. Here a theoretical model is proposed to predict the variation of the minimum energy path (MEP) associated with a particular ST event upon further deformation. Verification based on atomistic simulations and calculations is also conducted. The proposed model reproduces the MEP and activation energy of an ST event under different imposed macroscopic strains based on a known MEP at a reference strain. Moreover, an analytical approach is proposed based on the atomistic calculations, which works well when the stress varies linearly along the MEP. These findings provide the necessary background for understanding the activation processes and, in turn, the mechanical behavior of metallic glasses.
Biomining of metals: how to access and exploit natural resource sustainably.
Jerez, Carlos A
2017-09-01
Mining activities have been carried out for thousands of years and are nowadays conducted worldwide on an enormous scale to obtain industrially important metals. These include copper, iron, gold and several others. Although modern mining companies have sustainable mining programs that include tailings management and external verifications, it is recognized that these industrial activities are responsible for significant damage to the environment. In particular, technologies such as smelting and roasting generate very toxic emissions, including solid particles in the air, produce very large tailings and contribute to acid mine drainage (AMD), which affects human health and all kinds of living plants, animals and microorganisms. Consequently, due to environmental restrictions, these methods are being replaced in many countries by less contaminating processes. On the other hand, the microbial solubilization of metals by bioleaching or biomining is successfully used in industrial operations to extract several metals such as copper, gold and uranium. © 2017 The Author. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C
2012-01-01
US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests, formerly performed for research purposes only, into much wider use in clinical microbiology laboratories. Our objective was to provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B against the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards, and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.
Fabrication of U-10 wt.%Zr Metallic Fuel Rodlets for Irradiation Test in BOR-60 Fast Reactor
Kim, Ki-Hwan; Kim, Jong-Hwan; Oh, Seok-Jin; ...
2016-01-01
Fabrication technology for metallic fuel has been under development in Korea since 2007 to produce the driver fuel for the PGSFR. In order to evaluate the irradiation integrity and validate the in-reactor performance of the starting metallic fuel with FMS cladding, U-10 wt.%Zr fuel rodlets were fabricated and evaluated for verification of the starting driver fuel through an irradiation test in the BOR-60 fast reactor. The injection casting method was applied to U-10 wt.%Zr fuel slugs with a diameter of 5.5 mm. Consequently, fuel slugs without casting defects were fabricated in each melting batch through the development of advanced casting technology and evaluation tests. The optimal GTAW welding conditions were also established through a number of experiments. In addition, a qualification test was carried out to prove the weld quality of the end-plug welding of the metallic fuel rodlets. The wire wrapping of the metallic fuel rodlets was successfully accomplished for the irradiation test. Thus, PGSFR fuel rodlets have been soundly fabricated for the irradiation test in the BOR-60 fast reactor.
NASA Astrophysics Data System (ADS)
Na, Jeong K.; Kuhr, Samuel J.; Jata, Kumar V.
2008-03-01
Thermal Protection Systems (TPS) can be subjected to impact damage during flight and during ground maintenance and repair. AFRL/RXLP is developing a reliable and robust on-board sensing/monitoring capability for next-generation thermal protection systems to detect and assess impact damage. This study focused on two classes of metallic thermal protection tiles, to determine the threshold for impact damage and to develop the capability to sense impacts. Sensors made of PVDF piezoelectric film were employed and tested to evaluate the detectability of impact signals and to assess the onset, or threshold, of impact damage. Testing was performed over a range of impact energy levels, with the sensors adhered to the back of the specimens. The PVDF signal levels were analyzed and compared to assess damage, and digital microscopy, visual inspection, and white-light interferometry were used for damage verification. Based on the impact test results, an assessment of the impact damage thresholds for each type of metallic TPS system was made.
Russian-US collaboration on implementation of the active well coincidence counter (AWCC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mozhajev, V.; Pshakin, G.; Stewart, J.
The feasibility of using a standard AWCC at the Obninsk IPPE has been demonstrated through active measurements of single UO2 (36% enriched) disks and through passive measurements of plutonium metal disks used for simulating reactor cores. The role of the measurements is to verify passport values assigned to the disks by the facility, and thereby facilitate the mass accountability procedures developed for the very large inventory of fuel disks at the facility. The AWCC is a very flexible instrument for verification measurements of the large variety of nuclear material items at the Obninsk IPPE and other Russian facilities. Future work at the IPPE will include calibration and verification measurements for other materials, both in individual disks and in multi-disk storage tubes; it will also include training in the use of the AWCC.
Battery algorithm verification and development using hardware-in-the-loop testing
NASA Astrophysics Data System (ADS)
He, Yongsheng; Liu, Wei; Koch, Brian J.
Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs.
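For context, the simplest SOC algorithm that such a cell-level HIL rig might exercise is open-loop Coulomb counting; the sketch below is a generic textbook illustration under an assumed sign convention (discharge current positive), not the embedded algorithm verified in the study:

```python
def coulomb_count_soc(soc0, currents_a, dt_s, capacity_ah):
    """Open-loop state-of-charge estimate by Coulomb counting.

    soc0: initial SOC (0..1); currents_a: sampled current (A, discharge > 0);
    dt_s: sample period (s); capacity_ah: rated cell capacity (Ah).
    """
    soc = soc0
    for i in currents_a:
        soc -= i * dt_s / (capacity_ah * 3600.0)  # convert Ah to ampere-seconds
    return soc

# One hour of constant 2 A discharge on a 10 Ah cell lowers SOC by 20%
print(round(coulomb_count_soc(0.9, [2.0] * 3600, 1.0, 10.0), 3))  # 0.7
```

Production algorithms add model-based correction (e.g. voltage feedback) on top of this integration, which is exactly what the HIL comparison against reference measurements is meant to validate.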
Dosimetric evaluation of a MOSFET detector for clinical application in photon therapy.
Kohno, Ryosuke; Hirano, Eriko; Nishio, Teiji; Miyagishi, Tomoko; Goka, Tomonori; Kawashima, Mitsuhiko; Ogino, Takashi
2008-01-01
The dosimetric characteristics of a metal-oxide-semiconductor field-effect transistor (MOSFET) detector are studied with megavoltage photon beams for patient dose verification. The major advantages of this detector are its size, which makes it a point dosimeter, and its ease of use. In order to use the MOSFET detector for dose verification of intensity-modulated radiation therapy (IMRT) and for in-vivo dosimetry in radiation therapy, we need to evaluate its dosimetric properties. Therefore, we investigated the reproducibility, dose-rate effect, accumulated-dose effect, angular dependence, and accuracy in tissue-maximum ratio measurements. Then, because an actual IMRT treatment takes about 20 min per patient, we evaluated the fading effect of the MOSFET response. When the MOSFETs were read out 20 min after irradiation, we observed a fading effect of 0.9% with a 0.9% standard error of the mean. Further, we applied the MOSFET to the measurement of the small-field total scatter factor. For dose measurements of small field sizes, the MOSFET performed better than the reference pinpoint chamber in the vertical direction. In conclusion, we assessed the accuracy, reliability, and usefulness of the MOSFET detector in clinical applications such as pinpoint absolute dosimetry for small fields.
Working Memory Mechanism in Proportional Quantifier Verification
ERIC Educational Resources Information Center
Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria
2014-01-01
The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
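The verification task itself is computationally trivial for a machine; what participants must implicitly compute can be sketched as follows (an illustration only, not part of the study):

```python
def verify_more_than_half(dots, color):
    """Verify the sentence 'More than half of the dots are <color>'."""
    return 2 * sum(d == color for d in dots) > len(dots)

dots = ["blue"] * 8 + ["yellow"] * 7
print(verify_more_than_half(dots, "blue"))  # True: 8 of 15 is more than half
```

The cognitive interest lies in the working-memory demand of tracking the two counts, not in the comparison itself.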
Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.
de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M
2012-04-15
A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
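For a single primary study, the classical non-Bayesian correction for partial verification bias (Begg and Greenes) recovers sensitivity and specificity by estimating disease probabilities from the verified subsets and applying Bayes' rule; the Bayesian meta-analytic method described above generalizes this idea across studies. A sketch with hypothetical counts:

```python
def begg_greenes(n_pos, n_neg, v_pos_d, v_pos_nd, v_neg_d, v_neg_nd):
    """Verification-bias-corrected sensitivity and specificity.

    n_pos, n_neg: all subjects by index-test result (positive/negative).
    v_pos_d, v_pos_nd: verified test-positives with/without disease.
    v_neg_d, v_neg_nd: verified test-negatives with/without disease.
    Assumes verification depends only on the index-test result.
    """
    # Disease probability given each test result, from the verified subsets
    p_d_pos = v_pos_d / (v_pos_d + v_pos_nd)
    p_d_neg = v_neg_d / (v_neg_d + v_neg_nd)
    # Bayes' rule restores sensitivity P(T+|D+) and specificity P(T-|D-)
    se = p_d_pos * n_pos / (p_d_pos * n_pos + p_d_neg * n_neg)
    sp = (1 - p_d_neg) * n_neg / ((1 - p_d_neg) * n_neg + (1 - p_d_pos) * n_pos)
    return se, sp

# Hypothetical study: 100 test-positives (60 verified: 45 D+, 15 D-),
# 400 test-negatives (40 verified: 4 D+, 36 D-)
se, sp = begg_greenes(100, 400, 45, 15, 4, 36)
print(f"Se = {se:.3f}, Sp = {sp:.3f}")  # Se = 0.652, Sp = 0.935
```

Note that the naive estimates from the verified subjects alone (45/49 and 36/51 here) would be badly biased, which is the effect the meta-analytic method corrects for.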
NASA Astrophysics Data System (ADS)
Xie, Wen-Xiong; Li, Jian-Sheng; Gong, Jian; Zhu, Jian-Yu; Huang, Po
2013-10-01
Based on the time-dependent coincidence method, a preliminary experiment was performed on uranium metal castings of similar mass (about 8-10 kg) and shape (hemispherical shell) but different enrichments, using neutrons from a Cf fast-fission chamber and a timed DT accelerator. Groups of related parameters can be obtained by analyzing the features of the time-dependent coincidence counts between source and detector and between two detectors to characterize the fission signal. These parameters are highly sensitive to enrichment; the sensitivity coefficient (defined as (ΔR/Δm)/R¯) can reach 19.3% per kg of 235U. Uranium castings with different enrichments can thus be distinguished to support nuclear weapon verification.
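The sensitivity coefficient quoted above, (ΔR/Δm)/R¯, is straightforward to compute from two measurements. A minimal sketch with hypothetical values (not the experiment's data):

```python
def sensitivity_coefficient(r1, r2, m1, m2):
    """Relative change in a fission-signal parameter R per kg of 235U,
    (ΔR/Δm)/R̄, where R̄ is the mean of the two measurements."""
    r_bar = (r1 + r2) / 2.0
    return (r2 - r1) / (m2 - m1) / r_bar

# Two hypothetical castings: 1.0 kg vs 2.0 kg of 235U,
# with the measured parameter rising from 1.00 to 1.21.
s = sensitivity_coefficient(r1=1.00, r2=1.21, m1=1.0, m2=2.0)
print(f"{s:.1%} per kg 235U")
```

A coefficient near 19% per kg, as in the abstract, means a one-kilogram difference in 235U content shifts the measured parameter by roughly a fifth of its mean value.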
High-entropy alloys in hexagonal close-packed structure
Gao, Michael C.; Zhang, B.; Guo, S. M.; ...
2015-08-28
The microstructures and properties of high-entropy alloys (HEAs) based on the face-centered cubic and body-centered cubic structures have been studied extensively in the literature, but reports on HEAs in the hexagonal close-packed (HCP) structure are very limited. Using an efficient strategy combining phase-diagram inspection, CALPHAD modeling, and ab initio molecular dynamics simulations, a variety of new compositions are suggested that may hold great potential for forming single-phase HCP HEAs comprising rare earth elements and transition metals, respectively. Lastly, experimental verification was carried out on CoFeReRu and CoReRuV using X-ray diffraction, scanning electron microscopy, and energy-dispersive spectroscopy.
Micromechanics of composite laminate compression failure
NASA Technical Reports Server (NTRS)
Guynn, E. Gail; Bradley, Walter L.
1986-01-01
The Dugdale analysis for metals loaded in tension was adapted to model the failure of notched composite laminates loaded in compression. Compression testing details, MTS alignment verification, and equipment needs were resolved. Thus far, only 2 ductile material systems, HST7 and F155, were selected for study. A Wild M8 Zoom Stereomicroscope and necessary attachments for video taping and 35 mm pictures were purchased. Currently, this compression test system is fully operational. A specimen is loaded in compression, and load vs shear-crippling zone size is monitored and recorded. Data from initial compression tests indicate that the Dugdale model does not accurately predict the load vs damage zone size relationship of notched composite specimens loaded in compression.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep
The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials, and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain-of-custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear non-sensitive modalities may provide significant new verification tools which do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetics and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes.
For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.
Metal flow of a tailor-welded blank in deep drawing process
NASA Astrophysics Data System (ADS)
Yan, Qi; Guo, Ruiquan
2005-01-01
Tailor welded blanks are used in the automotive industry to consolidate parts, reduce weight, and increase safety. In recent years, this technology has been developing rapidly in China, where tailor welded blanks have been applied in many automobile parts such as rails, door inners, bumpers, and floor panels. Concern about the properties of tailor welded blanks has become more and more important for the automobile industry. A lot of research has shown that the strength of the welded seam is higher than that of the base metal, so weld failure in terms of strength is not a critical issue. However, the formability of tailor welded blanks in the stamping process is complex. In particular, the metal flow of tailor welded blanks during stamping must be investigated thoroughly in order to reduce the scrap rate in automobile factories. In this paper, the metal flow behavior of tailor welded blanks made by the laser welding process with two different thickness combinations was studied in the deep drawing process. Simulation and experimental verification of the movement of the weld line in tailor welded blanks are discussed in detail. Results showed that controlling the movement of the welded seam during stamping by taking measures on the blank holder side was effective.
Linking dwarf galaxies to halo building blocks with the most metal-poor star in Sculptor.
Frebel, Anna; Kirby, Evan N; Simon, Joshua D
2010-03-04
Current cosmological models indicate that the Milky Way's stellar halo was assembled from many smaller systems. On the basis of the apparent absence of the most metal-poor stars in present-day dwarf galaxies, recent studies claimed that the true Galactic building blocks must have been vastly different from the surviving dwarfs. The discovery of an extremely iron-poor star (S1020549) in the Sculptor dwarf galaxy based on a medium-resolution spectrum cast some doubt on this conclusion. Verification of the iron-deficiency, however, and measurements of additional elements, such as the alpha-element Mg, are necessary to demonstrate that the same type of stars produced the metals found in dwarf galaxies and the Galactic halo. Only then can dwarf galaxy stars be conclusively linked to early stellar halo assembly. Here we report high-resolution spectroscopic abundances for 11 elements in S1020549, confirming its iron abundance of less than 1/4,000th that of the Sun, and showing that the overall abundance pattern follows that seen in low-metallicity halo stars, including the alpha-elements. Such chemical similarity indicates that the systems destroyed to form the halo billions of years ago were not fundamentally different from the progenitors of present-day dwarfs, and suggests that the early chemical enrichment of all galaxies may be nearly identical.
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. A clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meier, Horst; Laurischkat, Roman; Zhu Junhong
One main influence on the dimensional accuracy in robot based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate for these deviations offline, a model based approach has been developed, consisting of a finite element approach to simulate the sheet forming and a multi body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi body system model and its included compensation method.
NASA Astrophysics Data System (ADS)
Wei, T. B.; Chen, Y. L.; Lin, H. R.; Huang, S. Y.; Yeh, T. C. J.; Wen, J. C.
2016-12-01
In groundwater studies, many researchers have used hydraulic tomography (HT) with field-site pumping tests to estimate the heterogeneous spatial distribution of hydraulic properties, demonstrating that most field aquifers have heterogeneous hydrogeological parameter fields. Huang et al. [2011] performed a non-redundant verification analysis, with the pumping well locations changed and the observation wells fixed, for both inverse and forward modeling, to show the feasibility of estimating the heterogeneous spatial distribution of hydraulic properties of a field-site aquifer with a steady-state model. The existing literature, however, covers only this steady-state, non-redundant case (pumping wells moved, observation wells fixed). The other combinations of pumping wells fixed or moved with observation wells fixed (redundant verification) or moved (non-redundant verification) have not yet been explored for their influence on the HT method. In this study, we carried out both redundant and non-redundant verification in forward analysis to examine their influence on the hydraulic tomography method under transient conditions. We applied the approach to an actual case at the NYUST campus site to demonstrate the effectiveness of the hydraulic tomography method and to confirm the feasibility of both inverse and forward analyses from the results. Keywords: Hydraulic Tomography, Redundant Verification, Heterogeneous, Inverse, Forward
Structural tailoring of engine blades (STAEBL)
NASA Technical Reports Server (NTRS)
Platt, C. E.; Pratt, T. K.; Brown, K. W.
1982-01-01
A mathematical optimization procedure was developed for the structural tailoring of engine blades and was used to structurally tailor two engine fan blades constructed of composite materials without midspan shrouds. The first was a solid blade made from superhybrid composites, and the second was a hollow blade with metal matrix composite inlays. Three major computerized functions were needed to complete the procedure: approximate analysis with the established input variables, optimization of an objective function, and refined analysis for design verification.
Temporal and modal characterization of DoD source air toxic ...
This project tested three real-time/near-real-time monitoring techniques to develop air toxic emission factors for Department of Defense (DoD) platform sources. These techniques included: resonance enhanced multi photon ionization time of flight mass spectrometry (REMPI-TOFMS) for organic air toxics, laser induced breakdown spectroscopy (LIBS) for metallic air toxics, and optical remote sensing (ORS) methods for measurement of criteria pollutants and other hazardous air pollutants (HAPs). Conventional emission measurements were used for verification of the real-time monitoring results. The REMPI-TOFMS system was demonstrated on the following: a U.S. Marine Corps (USMC) diesel generator, a U.S. Air Force auxiliary power unit (APU), the waste combustor at the Portsmouth Naval Shipyard during a multi-monitor environmental technology verification (ETV) test for dioxin monitoring systems, two dynamometer-driven high mobility multi-purpose wheeled vehicles (HMMWVs), an idling Abrams battle tank, a Bradley infantry fighting vehicle (IFV), and an F-15 and multiple F-22 U.S. Air Force aircraft engines. LIBS was tested and applied solely to the U.S. Marine Corps diesel generator; the high detection limits of LIBS for toxic metals limited its usefulness as a real-time analyzer for most DoD sources. ORS was tested only on the APU, with satisfactory results for non-condensable combustion products (carbon monoxide [CO], carbon dioxide
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128
Wright, Kevin B; King, Shawn; Rosenberg, Jenny
2014-01-01
This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.
Dunbar, Robert C; Berden, Giel; Martens, Jonathan K; Oomens, Jos
2015-09-24
Conformational preferences have been surveyed for divalent metal cation complexes with the dipeptide ligands AlaPhe, PheAla, GlyHis, and HisGly. Density functional theory results for a full set of complexes are presented, and previous experimental infrared spectra, supplemented by a number of newly recorded spectra obtained with infrared multiple photon dissociation spectroscopy, provide experimental verification of the preferred conformations in most cases. The overall structural features of these complexes are shown, and attention is given to comparisons involving peptide sequence, nature of the metal ion, and nature of the side-chain anchor. A regular progression is observed as a function of binding strength, whereby the weakly binding metal ions (Ba(2+) to Ca(2+)) transition from carboxylate zwitterion (ZW) binding to charge-solvated (CS) binding, while the stronger binding metal ions (Ca(2+) to Mg(2+) to Ni(2+)) transition from CS binding to metal-ion-backbone binding (Iminol) by direct metal-nitrogen bonds to the deprotonated amide nitrogens. Two new sequence-dependent reversals are found between ZW and CS binding modes, such that Ba(2+) and Ca(2+) prefer ZW binding in the GlyHis case but prefer CS binding in the HisGly case. The overall binding strength for a given metal ion is not strongly dependent on the sequence, but the histidine peptides are significantly more strongly bound (by 50-100 kJ mol(-1)) than the phenylalanine peptides.
Finite element simulation and Experimental verification of Incremental Sheet metal Forming
NASA Astrophysics Data System (ADS)
Kaushik Yanamundra, Krishna; Karthikeyan, R., Dr.; Naranje, Vishal, Dr
2018-04-01
Incremental sheet metal forming is now a proven manufacturing technique that can be employed to obtain application-specific, customized, symmetric or asymmetric shapes required by the automobile or biomedical industries, such as car body parts, dental implants, or knee implants. Finite element simulation of the metal forming process can be performed using explicit dynamics analysis in commercial FE software; such simulation is useful mainly for optimizing the process and designing the final product. This paper focuses on simulating the incremental sheet metal forming process in ABAQUS and validating the results experimentally. The test shapes are trapezoidal, dome, and elliptical; their G-codes are written and fed into a CNC milling machine fitted with a forming tool with a hemispherical bottom. The same pre-generated coordinates are used to simulate similar machining conditions in ABAQUS, and the tool forces, stresses, and strains in the workpiece during forming are obtained as output data. The forces were recorded experimentally using a dynamometer. The experimental and simulated results were then compared and conclusions drawn.
Design technology co-optimization for 14/10nm metal1 double patterning layer
NASA Astrophysics Data System (ADS)
Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi
2016-03-01
Design and technology co-optimization (DTCO) can satisfy the needs of the design, generate robust design rules, and avoid unfriendly patterns at the early stage of design to ensure a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification, and design rule optimization. Combining DTCO with double patterning (DPT) can optimize the related design rules and generate a friendlier layout that meets the requirements of the 14/10nm technology node. The experiment demonstrates this DPT-compliant DTCO methodology applied to a metal1 layer of the 14/10nm node. The DTCO workflow proposed in our work is an efficient solution for optimizing the design rules for the 14/10nm tech node Metal1 layer. The paper also discusses and verifies how to tune the design rules for U-shape and L-shape structures in a DPT-aware metal layer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luke, S J
2011-12-20
This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime.
To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach, coined Information Loss Analysis, might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
Study on verifying the angle measurement performance of the rotary-laser system
NASA Astrophysics Data System (ADS)
Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui
2018-04-01
An angle verification method to verify the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on the verification of angle measuring uncertainty for the rotary-laser system, it still has limitations. High-precision reference angles are used in this method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also probes the error that has the biggest influence on the verification system. Some errors of the verification system are avoided via the experimental method, and some are compensated through the computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirement for coordinate measurement. The verification platform can evaluate the uncertainty of angle measurement for the rotary-laser system efficiently.
NASA Astrophysics Data System (ADS)
Mailfert, Julien; Van de Kerkhove, Jeroen; De Bisschop, Peter; De Meyer, Kristin
2014-03-01
A Metal1-layer (M1) patterning study is conducted on 20nm node (N20) for random-logic applications. We quantified the printability performance on our test vehicle for N20, corresponding to Poly/M1 pitches of 90/64nm, and with a selected minimum M1 gap size of 70nm. The Metal1 layer is patterned with 193nm immersion lithography (193i) using Negative Tone Developer (NTD) resist, and a double-patterning Litho-Etch-Litho-Etch (LELE) process. Our study is based on Logic test blocks that we OPCed with a combination of calibrated models for litho and for etch. We report the Overlapping Process Window (OPW), based on a selection of test structures measured after-etch. We find that most of the OPW limiting structures are EOL (End-of-Line) configurations. Further analysis of these individual OPW limiters will reveal that they belong to different types, such as Resist 3D (R3D) and Mask 3D (M3D) sensitive structures, limiters related to OPC (Optical Proximity Corrections) options such as assist placement, or the choice of CD metrics and tolerances for calculation of the process windows itself. To guide this investigation, we will consider a `reference OPC' case to be compared with other solutions. In addition, rigorous simulations and OPC verifications will complete the after-etch measurements to help us to validate our experimental findings.
Richardson, Michael L; Petscavage, Jonelle M
2011-11-01
The sensitivity and specificity of magnetic resonance imaging (MRI) for the diagnosis of meniscal tears have been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase sensitivity and decrease specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating the sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients and, hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy.
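When verification depends only on the index-test result, a standard Begg-Greenes-style correction can recover unbiased accuracy estimates from partially verified data. The sketch below is not the authors' web calculator; the counts and names are made up for illustration:

```python
def begg_greenes(n_pos, n_neg, v_pos_d, v_pos_nd, v_neg_d, v_neg_nd):
    """Verification-bias-corrected sensitivity and specificity.

    n_pos, n_neg: all patients testing positive/negative on the index test.
    v_*_d / v_*_nd: verified patients with/without disease in each group.
    Assumes verification depends only on the index-test result.
    """
    p1 = v_pos_d / (v_pos_d + v_pos_nd)   # P(disease | test+), from verified
    p0 = v_neg_d / (v_neg_d + v_neg_nd)   # P(disease | test-), from verified
    se = p1 * n_pos / (p1 * n_pos + p0 * n_neg)
    sp = (1 - p0) * n_neg / ((1 - p1) * n_pos + (1 - p0) * n_neg)
    return se, sp

# Hypothetical example: 100 MRI-positive and 200 MRI-negative knees;
# 80 of the positives but only 40 of the negatives went to surgery.
se, sp = begg_greenes(n_pos=100, n_neg=200,
                      v_pos_d=60, v_pos_nd=20,
                      v_neg_d=4, v_neg_nd=36)
print(se, sp)
```

With these counts the naive estimates from verified patients alone (sensitivity 60/64, specificity 36/56) are corrected downward and upward respectively, which is exactly the direction of verification bias the article describes.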
A Module Experimental Process System Development Unit (MEPSDU)
NASA Technical Reports Server (NTRS)
1981-01-01
The purpose of this program is to demonstrate the technical readiness of a cost effective process sequence that has the potential for the production of flat plate photovoltaic modules which meet the 1986 price goal of $0.70 or less per peak watt. Program efforts included: preliminary design review, preliminary cell fabrication using the proposed process sequence, verification of sandblasting back cleanup, study of resist parameters, evaluation of pull strength of the proposed metallization, measurement of contact resistance of electroless Ni contacts, optimization of process parameters, design of the MEPSDU module, identification and testing of insulator tapes, development of a lamination process sequence, identification, discussions, demonstrations, and visits with candidate equipment vendors, and evaluation of proposals for a tabbing and stringing machine.
Investigation of new semiinsulating behavior of III-V compounds
NASA Technical Reports Server (NTRS)
Lagowski, Jacek
1990-01-01
The investigation of defect interactions and properties related to the semiinsulating behavior of III-V semiconductors resulted in about twenty original publications, six doctoral theses, one master's thesis, and numerous conference presentations. The studies of new compensation mechanisms involving transition metal impurities have defined direct effects associated with deep donor/acceptor levels acting as compensating centers. Electrical and optical properties of vanadium and titanium levels were determined in GaAs, InP, and the ternary compound InGaAs. The experimental data provided a basis for the verification of chemical trends and the VRBE method. They also defined the compositional range for III-V mixed crystals whereby semiinsulating behavior can be achieved using transition-element deep levels and suitable codoping with shallow donor/acceptor impurities.
Verification of EPA's "Preliminary Remediation Goals for Radionuclides" (PRG) electronic calculator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stagich, B. H.
The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their "Preliminary Remediation Goals for Radionuclides" (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User's Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems in obtaining solutions, as well as to ensure that the equations are programmed correctly.
Fingerprint changes and verification failure among patients with hand dermatitis.
Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba
2013-03-01
To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective against verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226
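Odds ratios of the kind reported above can be reproduced directly from the underlying 2×2 counts. A minimal sketch (Python; the 27/100 vs 2/100 verification-failure split is taken from the abstract, while the Wald interval on the log odds ratio is a standard textbook construction, not necessarily the exact method used in the paper):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI on the log scale for the
    2x2 table [[a, b], [c, d]] (exposed/unexposed x event/no event)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# 27 of 100 patients vs 2 of 100 controls failed verification
or_, lo, hi = odds_ratio_ci(27, 73, 2, 98)
```

For these counts the patient-vs-control odds ratio of failing verification comes out near 18, with a wide interval reflecting the small number of control failures.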
NASA Astrophysics Data System (ADS)
McConkey, M. L.
1984-12-01
A complete CMOS/BULK design cycle has been implemented and fully tested to evaluate its effectiveness as a viable set of computer-aided design tools for the layout, verification, and simulation of CMOS/BULK integrated circuits. This design cycle is applicable to p-well, n-well, or twin-well structures, although currently available fabrication techniques limit it to p-well only. BANE, an integrated layout program from Stanford, is at the center of this design cycle and was shown to be simple to use in the layout of CMOS integrated circuits (it can also be used to lay out NMOS integrated circuits). A flowchart was developed showing the design cycle from initial layout, through design verification, to circuit simulation using NETLIST, PRESIM, and RNL from the University of Washington. A CMOS/BULK library was designed; it includes logic gates that were designed and completely tested by following this flowchart. An arithmetic logic unit was also designed as a more complex test of the CMOS/BULK design cycle.
In vivo proton range verification: a review
NASA Astrophysics Data System (ADS)
Knopf, Antje-Christin; Lomax, Antony
2013-08-01
Protons are an interesting modality for radiotherapy because of their well-defined range and favourable depth-dose characteristics. On the other hand, these same characteristics lead to added uncertainties in their delivery. This is particularly the case at the distal end of proton dose distributions, where the dose gradient can be extremely steep. In practice, however, this gradient is rarely used to spare critical normal tissues because of worries about its exact position in the patient. Reasons for this uncertainty are inaccuracies and non-uniqueness of the calibration from CT Hounsfield units to proton stopping powers, imaging artefacts (e.g. due to metal implants) and anatomical changes of the patient during treatment. To improve the precision of proton therapy, therefore, it would be extremely desirable to verify proton range in vivo, either prior to, during, or after therapy. In this review, we describe and compare state-of-the-art in vivo proton range verification methods currently being proposed, developed or clinically implemented.
Zhu, Ling-Ling; Lv, Na; Zhou, Quan
2016-12-01
We read, with great interest, the study by Baldwin and Rodriguez (2016), which described the role of the verification nurse and details the verification process in identifying errors related to chemotherapy orders. We strongly agree with their findings that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.
Katz, Jennifer; Joiner, Thomas E
2002-02-01
We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with and somewhat more committed to partners when they perceived that partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.
Compromises produced by the dialectic between self-verification and self-enhancement.
Morling, B; Epstein, S
1997-12-01
Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: enhancement and verification were established by calibrating evaluative feedback against self-appraisals, and degree of enhancement and of verification were varied along a continuum rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.
Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.
Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M
2013-05-21
This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging of up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic technique but retains fast imaging speed through the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm s^-1 with an automatic edge-tracking algorithm with an accuracy better than 1.0 mm, even at the fastest imaging speed. The measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for the delivery of complex treatments such as VMAT.
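The quoted 96% pass rate at 3%/3 mm refers to the gamma index, which blends a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D global-gamma sketch (Python/numpy; a brute-force search over evaluated points, rather than the optimized implementations used in clinical software):

```python
import numpy as np

def gamma_1d(x, d_ref, d_eval, dta_mm=3.0, dd_frac=0.03):
    """Global 1D gamma index: for each reference point, minimize the
    combined distance-to-agreement / dose-difference metric over all
    evaluated points. Dose criterion is relative to the reference max."""
    dd_abs = dd_frac * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x, d_ref)):
        g2 = ((x - xr) / dta_mm) ** 2 + ((d_eval - dr) / dd_abs) ** 2
        gammas[i] = np.sqrt(g2.min())
    return gammas

def pass_rate(gammas):
    """Fraction of points passing the criterion (gamma <= 1)."""
    return float(np.mean(gammas <= 1.0))
```

Identical reference and evaluated distributions give gamma = 0 everywhere and a 100% pass rate; disagreements in either dose or position push individual gamma values above 1 and lower the rate.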
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
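The MMS idea can be illustrated in a few lines: choose an exact solution, derive its forcing term analytically, and confirm that the discrete solver reproduces its design order of accuracy under grid refinement. A sketch for a 1D Poisson problem with second-order central differences (Python/numpy; an illustration of the method, not the LAVA solver itself):

```python
import numpy as np

def mms_error(n):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0 by second-order
    central differences, where f is manufactured from the chosen exact
    solution u(x) = sin(pi x); return the max-norm discretization error."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])   # f = -u'' of the manufactured u
    A = (2.0 * np.eye(n - 1)
         - np.eye(n - 1, k=1)
         - np.eye(n - 1, k=-1)) / h**2       # tridiagonal -u'' operator
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

# Observed order of accuracy from two grid levels; expect ~2
e_coarse, e_fine = mms_error(32), mms_error(64)
observed_order = np.log2(e_coarse / e_fine)
```

Halving the grid spacing should cut the error by about a factor of four for a second-order scheme; a lower observed order is exactly the kind of coding or discretization defect MMS is designed to expose.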
You Can't See the Real Me: Attachment Avoidance, Self-Verification, and Self-Concept Clarity.
Emery, Lydia F; Gardner, Wendi L; Carswell, Kathleen L; Finkel, Eli J
2018-03-01
Attachment shapes people's experiences in their close relationships and their self-views. Although attachment avoidance and anxiety both undermine relationships, past research has primarily emphasized detrimental effects of anxiety on the self-concept. However, as partners can help people maintain stable self-views, avoidant individuals' negative views of others might place them at risk for self-concept confusion. We hypothesized that avoidance would predict lower self-concept clarity and that less self-verification from partners would mediate this association. Attachment avoidance was associated with lower self-concept clarity (Studies 1-5), an effect that was mediated by low self-verification (Studies 2-3). The association between avoidance and self-verification was mediated by less self-disclosure and less trust in partner feedback (Study 4). Longitudinally, avoidance predicted changes in self-verification, which in turn predicted changes in self-concept clarity (Study 5). Thus, avoidant individuals' reluctance to trust or become too close to others may result in hidden costs to the self-concept.
Applications of Materials Selection For Joining Composite/Alloy Piping Systems
NASA Technical Reports Server (NTRS)
Crosby, Karen E.; Smith, Brett H.; Mensah, Patrick F.; Stubblefield, Michael A.
2001-01-01
A study in collaboration between investigators at Southern University and Louisiana State University in Baton Rouge, Louisiana and NASA/MSFC is examining materials for modeling and analysis of heat-activated thermal coupling for joining composite to composite/alloy structures. The short-term objectives of this research are to develop a method for joining composite or alloy structures, as well as to study the effects of thermal stress on composite-to-alloy joints. This investigation will result in the selection of a suitable metallic alloy. Al-Li alloys have potential for this purpose in aerospace applications due to their excellent strength-to-weight ratio. The study of Al-Li and other alloys is of significant importance to this and other aerospace as well as offshore related interests. Further research will incorporate the use of computer aided design and rapid prototype hardware for conceptual design and verification of a potential composite piping delivery system.
Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy, covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.
A Roadmap for the Implementation of Continued Process Verification.
Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin
2016-01-01
In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance, and the practice associated with it, is known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
The study of fixing composite panels and steel plates on a testing stand
NASA Astrophysics Data System (ADS)
Wróbel, A.; Płaczek, M.; Wachna, M.
2016-08-01
In this paper the practical possibilities of strength verification of composite materials used in the manufacture of selected components of railway wagons are presented. A real, scaled laboratory test stand, controlled by a PLC, was built for the measurements. The study covered different types of connections between composite materials and sheet metal. One chapter of this paper presents the design principles of the test stand with its pneumatic cylinders. The work mainly examined the displacements and stresses generated in the sheet as a result of loading the composite boards with the pneumatic actuators. The controller with its operating panel allows the testing cycle to be programmed easily: the user can define the force generated by an actuator by changing the air pressure in its cylinder, and the operator can also change the location of the acting cylinders and their stroke. Displacements were measured with a displacement sensor, and strains with a tensile strain gauge; all parameters were recorded in the CatmanEasy data acquisition software. This article presents the study of stresses and displacements in composite plates joined with sheet metal; in the summary, the authors compare the obtained results with the computer simulation results from the article "Simulation of stresses in an innovative combination of composite with sheet".
Verification of operational solar flare forecast: Case of Regional Warning Center Japan
NASA Astrophysics Data System (ADS)
Kubo, Yûki; Den, Mitsue; Ishii, Mamoru
2017-08-01
In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. 
For multi-categorical forecasts, we propose the following set of verification measures: the marginal distributions of forecast and observation for bias, proportion correct for accuracy, the correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
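For the dichotomous case, every measure in the proposed set is a function of a single 2×2 contingency table with hits a, false alarms b, misses c and correct negatives d. A minimal sketch (Python; the symmetric extremal dependence index is omitted for brevity, though it too is built from the hit and false-detection rates below):

```python
def flare_scores(a, b, c, d):
    """Dichotomous verification measures from a 2x2 contingency table:
    a = hits, b = false alarms, c = misses, d = correct negatives."""
    n = a + b + c + d
    pod = a / (a + c)    # probability of detection (hit rate)
    pofd = b / (b + d)   # probability of false detection
    return {
        "frequency_bias": (a + b) / (a + c),
        "proportion_correct": (a + d) / n,
        "critical_success_index": a / (a + b + c),
        "probability_of_detection": pod,
        "false_alarm_ratio": b / (a + b),
        "peirce_skill_score": pod - pofd,
    }
```

For example, a forecaster with 20 hits, 10 false alarms, 5 misses and 65 correct negatives over-forecasts slightly (frequency bias 1.2) while detecting 80% of events.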
Metal Solidification Imaging Process by Magnetic Induction Tomography.
Ma, Lu; Spagnul, Stefano; Soleimani, Manuchehr
2017-11-06
There is a growing number of important applications that require a contactless method for monitoring an object enclosed within a metallic structure. Imaging metal solidification is a prime example, for which no real-time monitoring technique exists at present. This paper introduces a technique, magnetic induction tomography, for real-time in-situ imaging of the metal solidification process. Rigorous experimental verifications are presented. Firstly, a single inductive coil is placed on top of a melting Wood's alloy to examine the changes in its inductance during the solidification process. Secondly, an array of magnetic induction coils is designed to investigate the feasibility of a tomographic approach, i.e., one coil is driven by an alternating current as a transmitter and a vector of phase changes is measured from the remaining coils as receivers. Phase changes are observed when the Wood's alloy changes state from liquid to solid. Thirdly, a series of static cold phantoms is created to represent various liquid/solid interfaces to verify the system performance. Finally, a powerful temporal reconstruction method is applied to realise real-time in-situ visualisation of the solidification and measurement of the solidified shell thickness, a first report of its kind.
Identity Verification, Control, and Aggression in Marriage
ERIC Educational Resources Information Center
Stets, Jan E.; Burke, Peter J.
2005-01-01
In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…
Students' Verification Strategies for Combinatorial Problems
ERIC Educational Resources Information Center
Mashiach Eizenberg, Michal; Zaslavsky, Orit
2004-01-01
We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…
Hua, Xijin; Wang, Ling; Al-Hajjar, Mazen; Jin, Zhongmin; Wilcox, Ruth K; Fisher, John
2014-07-01
Finite element models are becoming increasingly useful tools to conduct parametric analysis, design optimisation and pre-clinical testing for hip joint replacements. However, the verification of the finite element model is critically important. The purposes of this study were to develop a three-dimensional anatomic finite element model for a modular metal-on-polyethylene total hip replacement for predicting its contact mechanics and to conduct experimental validation for a simple finite element model which was simplified from the anatomic finite element model. An anatomic modular metal-on-polyethylene total hip replacement model (anatomic model) was first developed and then simplified with reasonable accuracy to a simple modular total hip replacement model (simplified model) for validation. The contact areas on the articulating surface of three polyethylene liners of modular metal-on-polyethylene total hip replacement bearings with different clearances were measured experimentally in the Leeds ProSim hip joint simulator under a series of loading conditions and different cup inclination angles. The contact areas predicted from the simplified model were then compared with that measured experimentally under the same conditions. The results showed that the simplification made for the anatomic model did not change the predictions of contact mechanics of the modular metal-on-polyethylene total hip replacement substantially (less than 12% for contact stresses and contact areas). Good agreements of contact areas between the finite element predictions from the simplified model and experimental measurements were obtained, with maximum difference of 14% across all conditions considered. This indicated that the simplification and assumptions made in the anatomic model were reasonable and the finite element predictions from the simplified model were valid. © IMechE 2014.
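As a rough cross-check on contact-area predictions of this kind (not part of the study above), a textbook Hertzian estimate for a ball in a conforming spherical cup is sometimes used; the material and geometry values below are illustrative, not taken from the paper. Note that for highly conforming hip bearings the Hertzian small-contact assumption becomes marginal, which is one reason finite element models are preferred:

```python
import math

def hertz_ball_in_cup(w, r_head, radial_clearance, e1, nu1, e2, nu2):
    """Hertzian contact radius and area for a ball of radius r_head in a
    spherical cup of radius r_head + radial_clearance under load w (N).
    Valid only while the contact patch stays small versus the head."""
    r_cup = r_head + radial_clearance
    r_eff = r_head * r_cup / (r_cup - r_head)               # effective radius
    e_star = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)  # contact modulus
    a = (3.0 * w * r_eff / (4.0 * e_star)) ** (1.0 / 3.0)   # contact radius
    return a, math.pi * a**2

# Illustrative values: 28 mm metal head, polyethylene cup, 100 um clearance
a_lo, _ = hertz_ball_in_cup(500.0, 0.014, 1e-4, 210e9, 0.3, 1e9, 0.4)
a_hi, _ = hertz_ball_in_cup(2000.0, 0.014, 1e-4, 210e9, 0.3, 1e9, 0.4)
```

The estimate captures the study's qualitative trends: contact radius grows with load (as the cube root) and with reduced clearance, since a smaller clearance raises the effective radius.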
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only in knowing which scores best quantify a particular skill of a model but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can yield conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, a QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be obtained depending on the procedures used to produce the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied for a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
Automated Array Assembly, Phase 2
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1979-01-01
The Automated Array Assembly Task, Phase 2 of the Low Cost Silicon Solar Array Project is a process development task. The contract provides for the fabrication of modules from large area tandem junction cells (TJC). During this quarter, effort was focused on the design of a large area, approximately 36 sq cm, TJC and process verification runs. The large area TJC design was optimized for minimum I²R power losses. In the TJM activity, the cell-module interfaces were defined, module substrates were formed and heat treated and clad metal interconnect strips were fabricated.
Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics
NASA Technical Reports Server (NTRS)
Vary, A.
1980-01-01
Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.
Self-verification and contextualized self-views.
Chen, Serena; English, Tammy; Peng, Kaiping
2006-07-01
Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.
An experimental verification of metamaterial coupled enhanced transmission for antenna applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pushpakaran, Sarin V.; Raj, Rohith K.; Pradeep, Anju
2014-02-10
Inspired by the work of Bethe on electromagnetic transmission through a subwavelength hole, there has been immense interest in extraordinary transmission through subwavelength slots/slits on metal plates. The invention of metamaterials has boosted extraordinary transmission through subwavelength slots. We examine computationally and experimentally the concept of a metamaterial cover, using an array of split ring resonators (SRRs), for enhancing the transmission in a stacked dipole antenna working in the S band. The front-to-back ratio is considerably improved by enhancing the magnetic resonant strength in close proximity to the slit of the upper parasitic dipole. The effect of the stacking height of the SRR monolayer on the resonant characteristics of the split ring resonators, and its effect on the antenna radiation characteristics, has been studied.
Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A
2011-10-01
A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.
Systematic study of source mask optimization and verification flows
NASA Astrophysics Data System (ADS)
Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi
2012-06-01
Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows, and of their possible unification, has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, to understand their commonalities and differences, and to provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.
The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases
KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM
2011-01-01
Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874
The effect of mystery shopper reports on age verification for tobacco purchases.
Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William
2011-09-01
Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention. Copyright © Taylor & Francis Group, LLC
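The logit regression described in this abstract, when reduced to a single binary predictor (treatment vs. control), has a coefficient equal to the log odds ratio of the 2x2 verification table, so it can be computed in closed form. The sketch below is a generic illustration of that equivalence; all counts are invented, not taken from the study.

```python
# Hedged sketch of a one-predictor logit comparison: the model coefficient
# equals the log odds ratio of the 2x2 table, with the usual Wald standard
# error sqrt(1/a + 1/b + 1/c + 1/d). Counts below are hypothetical.
import math

def log_odds_ratio(a: int, b: int, c: int, d: int):
    """2x2 table [[a, b], [c, d]]: returns (log OR, Wald SE, z statistic)."""
    lor = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return lor, se, lor / se

# Hypothetical verified / not-verified counts: treatment group vs. control.
lor, se, z = log_odds_ratio(120, 30, 90, 60)
print(f"log OR = {lor:.3f}, SE = {se:.3f}, z = {z:.2f}")
```

A z statistic above 1.96 would correspond to the "significantly greater improvement" language used in the abstract (two-sided 5% level).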
NASA Astrophysics Data System (ADS)
Szcześniak, Dominik; Hoehn, Ross D.; Kais, Sabre
2018-05-01
The transition metal dichalcogenide (MX2, where M = Mo, W and X = S, Se, Te) monolayers are of high interest for semiconducting applications at the nanoscale; this interest is due to both their direct band gaps and high charge mobilities. In this regard, an in-depth understanding of the related Schottky barrier heights, associated with the incorporation of MX2 sheets into novel low-dimensional metal-semiconductor junctions, is of crucial importance. Herein, we generate and analyze the behavior of the Schottky barrier heights, invoking the metal-induced gap states concept as its explanation. In particular, the present investigations concentrate on estimating the charge neutrality levels directly by employing the primary theoretical model, i.e., the cell-averaged Green's function formalism combined with the complex band structure technique. The results presented herein place the charge neutrality levels in the vicinity of the midgap; this is in agreement with previous reports and analogous to the behavior of three-dimensional semiconductors. The calculated canonical Schottky barrier heights are also found to agree with other computational and experimental values in cases where the difference between the electronegativities of the semiconductor and the metal contact is small. Moreover, the influence of spin-orbit effects is considered; the results support the conclusion that the Schottky barrier heights have a metal-induced gap state-derived character regardless of whether spin-orbit coupling interactions are included. The results presented within this report constitute a direct and vital verification of the importance of metal-induced gap states in explaining the observed Schottky barrier heights at MX2-metal junctions.
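For context, the textbook MIGS picture this abstract builds on expresses the n-type barrier as Phi_B = S*(W_m - Phi_CNL) + (Phi_CNL - chi), where S is the pinning slope, W_m the metal work function, Phi_CNL the charge neutrality level, and chi the electron affinity. The sketch below uses this standard expression with placeholder numbers, not the paper's fitted values.

```python
# Textbook MIGS expression for an n-type Schottky barrier (not the paper's
# Green's-function calculation). All numbers are illustrative placeholders.

def schottky_barrier_n(work_fn: float, cnl: float, affinity: float,
                       slope: float) -> float:
    """Barrier height in eV: S*(W_m - Phi_CNL) + (Phi_CNL - chi)."""
    return slope * (work_fn - cnl) + (cnl - affinity)

# Weak pinning (S -> 1) recovers the Schottky-Mott limit, W_m - chi:
print(f"S=1: {schottky_barrier_n(5.2, 4.6, 4.0, 1.0):.2f} eV")
# Strong pinning (S -> 0) pins the barrier at Phi_CNL - chi:
print(f"S=0: {schottky_barrier_n(5.2, 4.6, 4.0, 0.0):.2f} eV")
```

A near-midgap charge neutrality level, as reported here, means the strongly pinned limit yields barriers of roughly half the gap regardless of the metal.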
Trojanowicz, Karol; Wójcik, Włodzimierz
2011-01-01
The article presents a case study on the calibration and verification of mathematical models of organic carbon removal kinetics in biofilm. The chosen Harremöes and Wanner & Reichert models were calibrated with a set of model parameters obtained both from dedicated pilot- and lab-scale studies under petrochemical wastewater conditions and from the literature. Next, the models were successfully verified through studies carried out with a pilot ASFBBR-type bioreactor installed in an oil-refinery wastewater treatment plant. During verification the pilot biofilm reactor worked under varying surface organic loading rates (SOL), dissolved oxygen concentrations, and temperatures. The verification proved that the models can be applied in practice to petrochemical wastewater treatment engineering, e.g., for biofilm bioreactor dimensioning.
Verification bias: an underrecognized source of error in assessing the efficacy of medical imaging.
Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B
2011-03-01
Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias" and is a common problem in imaging research. The purpose of our study was to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and were searched for author acknowledgment of verification bias in the study design. During these 3 years, the journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was 36.4%, 23.4%, 29.5%, and 13.4% for AJR, Academic Radiology, Radiology, and EJR, respectively. The fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged. Published by Elsevier Inc.
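The mechanism this abstract describes is easy to see with a toy calculation: if test-positive patients are verified more often than test-negative patients, a complete-case analysis inflates sensitivity. All numbers below are invented for illustration.

```python
# Toy numerical illustration of verification bias (numbers invented).
# True status: 100 diseased patients, of whom 80 test positive and 20 test
# negative. Verification rates: 90% of test-positives, 20% of test-negatives.

true_sens = 80 / 100          # with complete verification: 0.80

tp_verified = 80 * 0.90       # diseased, test-positive, verified
fn_verified = 20 * 0.20       # diseased, test-negative, verified
naive_sens = tp_verified / (tp_verified + fn_verified)

print(f"true sensitivity   = {true_sens:.3f}")
print(f"biased (naive)     = {naive_sens:.3f}")   # inflated by selective verification
```

The complete-case estimate here is well above the true value, which is exactly the error pattern the article reports as underrecognized.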
He, Hua; McDermott, Michael P.
2012-01-01
Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
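The authors' method stratifies the verified sample on an estimated propensity score; a simpler cousin of the same idea is inverse-probability weighting, where each verified subject is weighted by 1/Pr(verified). The sketch below uses that simpler weighting with verification probabilities assumed known, and invented data, so it illustrates the principle rather than the paper's exact estimator.

```python
# Hedged sketch of verification-bias correction by inverse-probability
# weighting (a Begg-Greenes-style simplification of the authors'
# propensity-score stratification). Data and probabilities are invented.

def ipw_sensitivity(records):
    """records: (test_positive, diseased, p_verify) tuples, VERIFIED subjects only.
    Each subject counts with weight 1 / Pr(verified)."""
    w_tp = sum(1 / p for tpos, dis, p in records if dis and tpos)
    w_dis = sum(1 / p for tpos, dis, p in records if dis)
    return w_tp / w_dis

# Verified diseased subjects: 72 test-positives verified w.p. 0.9,
# 4 test-negatives verified w.p. 0.2 (so most false negatives are unobserved).
verified = ([(True, True, 0.9)] * 72
            + [(False, True, 0.2)] * 4)
print(f"IPW-corrected sensitivity = {ipw_sensitivity(verified):.3f}")
```

The weights restore the under-verified test-negative diseased subjects to their correct share, recovering an unbiased estimate when the verification model is correct.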
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R
Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic composition, and metallic and non-metallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME), and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities, and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also on comparative data between LANL and peer groups for Pu assay and isotopic analysis.
Sharma, Sunil D; Kumar, Rajesh; Akhilesh, Philomina; Pendse, Anil M; Deshpande, Sudesh; Misra, Basant K
2012-01-01
To verify the dose delivered to the cochlea, using metal oxide semiconductor field effect transistor (MOSFET) dosimeters in a specially designed multi-slice head and neck phantom, during treatment of acoustic schwannoma on a Gamma Knife radiosurgery unit. A multi-slice polystyrene head phantom was designed and fabricated for measurement of the dose to the cochlea during treatment of acoustic schwannoma. The phantom has provision to position the MOSFET dosimeters precisely at the desired locations. MOSFET dosimeters of 0.2 mm x 0.2 mm x 0.5 μm were used to measure the dose to the cochlea. CT scans of the phantom with the MOSFETs in situ were taken along with the Leksell frame. The treatment plans of five patients treated earlier for acoustic schwannoma were transferred to the phantom. The dose and coordinates of the maximum-dose point inside the cochlea were derived. The phantom with the MOSFET dosimeters was irradiated to deliver the planned treatment, and the doses received by the cochlea were measured. The treatment planning system (TPS)-estimated and measured doses to the cochlea were in the range of 7.4-8.4 Gy and 7.1-8.0 Gy, respectively. The maximum variation between TPS-calculated and measured dose to the cochlea was 5%. The measured dose values were in good agreement with the values calculated by the TPS. The MOSFET dosimeter can be a suitable choice for routine dose verification in Gamma Knife radiosurgery.
Electronic cigarette sales to minors via the internet.
Williams, Rebecca S; Derrick, Jason; Ribisl, Kurt M
2015-03-01
Electronic cigarettes (e-cigarettes) entered the US market in 2007 and, with little regulatory oversight, grew into a $2-billion-a-year industry by 2013. The Centers for Disease Control and Prevention has reported a trend of increasing e-cigarette use among teens, with use rates doubling from 2011 to 2012. While several studies have documented that teens can and do buy cigarettes online, to our knowledge, no studies have yet examined age verification among Internet tobacco vendors selling e-cigarettes. To estimate the extent to which minors can successfully purchase e-cigarettes online and assess compliance with North Carolina's 2013 e-cigarette age-verification law. In this cross-sectional study conducted from February 2014 to June 2014, 11 nonsmoking minors aged 14 to 17 years made supervised e-cigarette purchase attempts from 98 Internet e-cigarette vendors. Purchase attempts were made at the University of North Carolina Internet Tobacco Vendors Study project offices using credit cards. Rate at which minors can successfully purchase e-cigarettes on the Internet. Minors successfully received deliveries of e-cigarettes from 76.5% of purchase attempts, with no attempts by delivery companies to verify their ages at delivery and 95% of delivered orders simply left at the door. All delivered packages came from shipping companies that, according to company policy or federal regulation, do not ship cigarettes to consumers. Of the total orders, 18 failed for reasons unrelated to age verification. Only 5 of the remaining 80 youth purchase attempts were rejected owing to age verification, resulting in a youth buy rate of 93.7%. None of the vendors complied with North Carolina's e-cigarette age-verification law. Minors are easily able to purchase e-cigarettes from the Internet because of an absence of age-verification measures used by Internet e-cigarette vendors. 
Federal law should require and enforce rigorous age verification for all e-cigarette sales as with the federal PACT (Prevent All Cigarette Trafficking) Act's requirements for age verification in Internet cigarette sales.
NASA Technical Reports Server (NTRS)
Scheick, Leif
2010-01-01
The vertical metal oxide semiconductor field-effect transistor (MOSFET) is a widely used power transistor onboard spacecraft. The MOSFET is typically employed in power supplies and high-current switching applications. Due to the inherent high electric fields in the device, power MOSFETs are sensitive to heavy-ion irradiation and can fail catastrophically as a result of single event gate rupture (SEGR) or single event burnout (SEB). Manufacturers have designed radiation-hardened power MOSFETs for space applications. These radiation-hardened devices are not immune to SEGR or SEB but, rather, exhibit them only under much more damaging ion exposure than their non-radiation-hardened counterparts. See [1] through [5] for more information. This effort was to investigate the SEGR and SEB responses of two power MOSFETs from IR (the IRHN57133SE and the IRHN57250SE) that have recently been produced on a new fabrication line. These tests will serve as a limited verification of these parts, but it is acknowledged that further testing on the respective parts may be needed for some mission profiles.
Hafnium transistor design for neural interfacing.
Parent, David W; Basham, Eric J
2008-01-01
A design methodology is presented that uses the EKV model and the g(m)/I(D) biasing technique to design hafnium oxide field effect transistors that are suitable for neural recording circuitry. The DC gain of a common source amplifier is correlated to the structural properties of a Field Effect Transistor (FET) and a Metal Insulator Semiconductor (MIS) capacitor. This approach allows a transistor designer to use a design flow that starts with simple and intuitive 1-D equations for gain that can be verified in 1-D MIS capacitor TCAD simulations, before final TCAD process verification of transistor properties. The DC gain of a common source amplifier is optimized by using fast 1-D simulations and using slower, complex 2-D simulations only for verification. The 1-D equations are used to show that the increased dielectric constant of hafnium oxide allows a higher DC gain for a given oxide thickness. An additional benefit is that the MIS capacitor can be employed to test additional performance parameters important to an open gate transistor such as dielectric stability and ionic penetration.
Fast regional readout CMOS Image Sensor for dynamic MLC tracking
NASA Astrophysics Data System (ADS)
Zin, H.; Harris, E.; Osmond, J.; Evans, P.
2014-03-01
Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype complementary metal-oxide-semiconductor image sensor (CIS) for tracking these complex treatments by utilising fast, region-of-interest (ROI) read-out functionality. An automatic edge-tracking algorithm was used to locate MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion to within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). The CIS provides the basis of an essential real-time verification tool, useful in assessing accurate delivery of complex high-energy radiation to the tumour and, ultimately, in achieving better cure rates for cancer patients.
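The abstract does not give the edge-tracking algorithm's details, but a common approach to sub-pixel leaf-edge localization in a 1D intensity profile is the half-maximum crossing with linear interpolation, sketched below on synthetic data as an assumption about how such a tracker might work.

```python
# Illustrative sketch (not the authors' algorithm): locate an MLC leaf edge
# in a 1D beam-intensity profile as the half-maximum crossing, refined to
# sub-pixel precision by linear interpolation. Profile data is synthetic.

def edge_position(profile, pixel_mm: float = 0.5) -> float:
    """Return the falling half-maximum crossing position in mm."""
    half = (max(profile) + min(profile)) / 2.0
    for i in range(len(profile) - 1):
        a, b = profile[i], profile[i + 1]
        if a >= half > b:                      # falling edge crosses half-max here
            frac = (a - half) / (a - b)        # linear sub-pixel interpolation
            return (i + frac) * pixel_mm
    raise ValueError("no falling edge found")

profile = [100, 100, 98, 60, 12, 10, 10]       # synthetic penumbra profile
print(f"edge at {edge_position(profile):.2f} mm")
```

With a sufficiently sharp penumbra, this kind of interpolation is what makes sub-millimetre accuracy plausible at coarse pixel pitches.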
Application of additive laser technologies in the gas turbine blades design process
NASA Astrophysics Data System (ADS)
Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.
2017-11-01
The emergence of modern innovative technologies requires developing new design and production processes and modernizing existing ones. This is especially relevant for designing the high-temperature turbines of gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. A design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems, by testing a blade prototype fabricated using selective laser melting, is presented in this article. The technique was proven during development of the first-stage blade cooling system for the high-pressure turbine. An experimental procedure for verification of a thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat, was developed. The technique makes it possible to obtain an experimentally tested blade version and to exclude its experimental adjustment after the start of mass production.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks. We check at the end that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer-owned tooling) business and new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We would like to show the scheme of scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection, and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
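In the spirit of the rule-based checks described above, the toy sketch below matches each mark against a small rule table (required size and clearance). Everything here is invented for illustration: real scribe-frame verification runs inside a commercial DRC tool, and the rule values and mark names are hypothetical.

```python
# Hypothetical DRC-style mark check (invented rules and data). Each mark is
# matched against a rule table: required width/height and minimum clearance
# to neighboring marks along x (a 1D simplification of a real layout check).

RULES = {                     # mark type -> (width_um, height_um, clearance_um)
    "scanner_align": (80.0, 80.0, 10.0),
    "wafer_inspect": (40.0, 40.0, 5.0),
}

def check_mark(mark, neighbors):
    """Return a list of human-readable rule violations for one mark."""
    w, h, clr = RULES[mark["type"]]
    errors = []
    if (mark["w"], mark["h"]) != (w, h):
        errors.append(f"{mark['name']}: size {mark['w']}x{mark['h']} != {w}x{h}")
    for n in neighbors:
        gap = abs(n["x"] - mark["x"]) - (n["w"] + mark["w"]) / 2
        if gap < clr:
            errors.append(f"{mark['name']}: clearance {gap:.1f} um < {clr} um")
    return errors

m = {"name": "AL1", "type": "scanner_align", "x": 0.0, "w": 80.0, "h": 80.0}
n = {"name": "IN1", "x": 95.0, "w": 40.0}
print(check_mark(m, [n]))     # no violations for this spacing
```

Encoding the specification as data (the rule table) rather than as a visual checklist is what makes the batch check fast and repeatable, which matches the speed gains the abstract reports.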
Verification, Validation and Sensitivity Studies in Computational Biomechanics
Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.
2012-01-01
Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646
Van Hoof, Joris J
2017-04-01
Currently, two different age verification systems (AVS) are implemented to enhance compliance with legal age limits for the sale of alcohol in the Netherlands. In this study, we tested the operational procedures and effectiveness of ID readers and remote age verification technology in supermarkets during the sale of alcohol. Following a trained alcohol purchase protocol, eight mystery shoppers (both underage and of the branch's reference age) conducted 132 alcohol purchase attempts in stores that were equipped with ID readers or remote age verification or were part of a control group. In stores equipped with an ID reader, 34% of the purchases were conducted without any mistakes (full compliance). In stores with remote age verification, full compliance was achieved in 87% of the cases. The control group reached 57% compliance, which is in line with the national average. Stores with ID readers thus performed worse than stores with remote age verification, and also worse than stores without any AVS. For both systems, public support and user friendliness need to be investigated in addition to effectiveness. This study shows that remote age verification is a promising intervention that increases vendor compliance during the sale of age-restricted products. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?
Schaun, Gustavo Z
2017-12-08
Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. Several criteria have been proposed to validate such tests. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, previously suggested secondary criteria, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and are often achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method, termed the verification phase, was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature, and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g., intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
NASA Astrophysics Data System (ADS)
Arai, Yukiko; Aoki, Hitoshi; Abe, Fumitaka; Todoroki, Shunichiro; Khatami, Ramin; Kazumi, Masaki; Totsuka, Takuya; Wang, Taifeng; Kobayashi, Haruo
2015-04-01
1/f noise is one of the most important characteristics for designing analog/RF circuits, including operational amplifiers and oscillators. We have analyzed and developed a novel 1/f noise model for the strong-inversion, saturation, and sub-threshold regions based on the SPICE2-type model used in the public metal-oxide-semiconductor field-effect transistor (MOSFET) models developed by the University of California, Berkeley. Our model contains two noise generation mechanisms: mobility fluctuation and interface trap number fluctuation. Noise variability dependent on gate voltage is also newly implemented in our model. The proposed model has been implemented in the BSIM4 model of a SPICE3-compatible circuit simulator. Parameters of the proposed model were extracted from 1/f noise measurements for simulation verification. The simulation results show excellent agreement between measurements and simulations.
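For reference, the classic SPICE2 flicker-noise expression that models of this family start from gives the drain-current noise PSD as S_id(f) = KF * Id^AF / (Cox * Leff^2 * f). The sketch below evaluates that expression with placeholder parameter values (not values extracted in the paper) just to show the 1/f roll-off.

```python
# Illustrative only: the SPICE2-style flicker-noise drain-current PSD,
# S_id(f) = KF * Id^AF / (Cox * Leff^2 * f). Parameter values below are
# generic placeholders, not the paper's extracted parameters.

def flicker_psd(f_hz: float, i_d: float, kf: float = 1e-24, af: float = 1.0,
                cox: float = 8.6e-3, leff: float = 0.18e-6) -> float:
    """Drain-current noise PSD in A^2/Hz (Cox in F/m^2, Leff in m)."""
    return kf * i_d ** af / (cox * leff ** 2 * f_hz)

# The PSD falls as 1/f: each decade of frequency drops it by 10x.
for f in (1e2, 1e3, 1e4):
    print(f"f = {f:8.0f} Hz : S_id = {flicker_psd(f, 1e-4):.3e} A^2/Hz")
```

The abstract's contribution is to layer the two fluctuation mechanisms and gate-voltage-dependent variability on top of this baseline shape.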
Design and experimental verification of a water-like pentamode material
NASA Astrophysics Data System (ADS)
Zhao, Aiguo; Zhao, Zhigao; Zhang, Xiangdong; Cai, Xuan; Wang, Lei; Wu, Tao; Chen, Hong
2017-01-01
Pentamode materials approximate tailorable artificial liquids. Recently, microscopic versions of these intricate structures have been fabricated, and the static mechanical experiments reveal that the ratio of bulk modulus to shear modulus as large as 1000 can be obtained. However, no direct acoustic experimental characterizations have been reported yet. In this paper, a water-like two-dimensional pentamode material sample is designed and fabricated with a single metallic material, which is a hollow metallic foam-like structure at centimeter scale. Acoustic simulation and experimental testing results indicate that the designed pentamode material mimics water in acoustic properties over a wide frequency range, i.e., it exhibits transparency when surrounded by water. This work contributes to the development of microstructural design of materials with specific modulus and density distribution, thus paving the way for the physical realization of special acoustic devices such as metamaterial lenses and vibration isolation.
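The "water-like" criterion above reduces to matching water's sound speed c = sqrt(B/rho) and acoustic impedance Z = rho*c with the pentamode's effective bulk modulus and density. The quick check below uses nominal water values, not figures from the paper.

```python
# Quick check of the acoustic-transparency criterion (nominal water values,
# not from the paper): a structure is water-like when its effective bulk
# modulus B and density rho reproduce water's sound speed and impedance.
import math

def sound_speed(bulk_pa: float, rho: float) -> float:
    """Longitudinal sound speed c = sqrt(B / rho) in m/s."""
    return math.sqrt(bulk_pa / rho)

B_WATER, RHO_WATER = 2.2e9, 1000.0            # Pa, kg/m^3 (nominal)
c = sound_speed(B_WATER, RHO_WATER)
print(f"c = {c:.0f} m/s, Z = {RHO_WATER * c:.3e} Pa*s/m")
```

A metal microstructure can hit both targets simultaneously because its effective B and rho are set by geometry (strut thickness, fill fraction) rather than by the bulk metal alone.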
NASA Technical Reports Server (NTRS)
Lissenden, Cliff J.; Arnold, Steven M.
1996-01-01
Guidance for the formulation of robust, multiaxial, constitutive models for advanced materials is provided by addressing theoretical and experimental issues using micromechanics. The multiaxial response of metal matrix composites, depicted in terms of macro flow/damage surfaces, is predicted at room and elevated temperatures using an analytical micromechanical model that includes viscoplastic matrix response as well as fiber-matrix debonding. Macro flow/damage surfaces (i.e., debonding envelopes, matrix threshold surfaces, macro 'yield' surfaces, surfaces of constant inelastic strain rate, and surfaces of constant dissipation rate) are determined for silicon carbide/titanium in three stress spaces. Residual stresses are shown to offset the centers of the flow/damage surfaces from the origin and their shape is significantly altered by debonding. The results indicate which type of flow/damage surfaces should be characterized and what loadings applied to provide the most meaningful experimental data for guiding theoretical model development and verification.
Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K
2013-03-04
The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each region, composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study was to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D measurements was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
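The gamma evaluation with 5%/3 mm criteria used in this abstract can be sketched in one dimension: a measured point passes when the minimum, over all reference points, of sqrt((distance/3 mm)^2 + (dose difference/5%)^2) is at most 1. The profiles below are invented; real evaluations run on 2D film or 3D reconstructed dose.

```python
# Sketch (invented data) of a 1D gamma evaluation with 5% dose-difference and
# 3 mm distance-to-agreement criteria, on dose profiles normalized to 1.0.
import math

def gamma_1d(ref, meas, spacing_mm=1.0, dta_mm=3.0, dd_frac=0.05):
    """Per-point gamma index for two equally sampled, normalized profiles."""
    out = []
    for i, dm in enumerate(meas):
        g = min(
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + ((dm - dr) / dd_frac) ** 2)
            for j, dr in enumerate(ref)
        )
        out.append(g)
    return out

ref = [1.00, 0.98, 0.95, 0.60, 0.20, 0.05]     # synthetic planned profile
meas = [1.02, 0.97, 0.93, 0.62, 0.22, 0.06]    # synthetic measured profile
passing = sum(g <= 1.0 for g in gamma_1d(ref, meas)) / len(meas)
print(f"gamma passing rate = {passing:.0%}")
```

The reported 98.2% (film) and 90.7% (portal) figures are passing rates of exactly this kind, aggregated over the measurement planes.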
The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...
Verification of chemistry reference ranges using a simple method in sub-Saharan Africa
Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania
2016-01-01
Background Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. Objectives We report on an easy and cost-saving method to verify CRRs. Methods Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory. PMID:28879112
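The verification logic described above (screen a small panel of healthy-subject results against the candidate CRR, and recalculate from more subjects on failure) can be sketched as follows. The pass threshold, the candidate interval, and the ALT values are all illustrative assumptions, not the Sigma Diagnostics rule itself:

```python
import statistics

def verify_crr(results, low, high, max_outside=1):
    """Pass the candidate reference interval if at most `max_outside`
    healthy-subject results fall outside it (threshold is an assumption)."""
    outside = sum(1 for r in results if r < low or r > high)
    return outside <= max_outside

def recalc_crr(results):
    """Parametric re-estimate: mean +/- 1.96 SD of healthy-subject
    results (the study used 40 subjects; 10 shown here for brevity)."""
    m = statistics.mean(results)
    s = statistics.stdev(results)
    return (m - 1.96 * s, m + 1.96 * s)

# hypothetical ALT results (U/L) from 10 clinically healthy participants
alt = [18, 22, 25, 31, 28, 35, 48, 19, 24, 52]
passed = verify_crr(alt, 9, 43)          # candidate CRR 9-43 U/L (invented)
low, high = recalc_crr(alt)
if not passed:
    print(f"CRR failed verification; recalculated: {low:.1f}-{high:.1f}")
```

The same pattern (count out-of-range results, re-derive the interval on failure) is what makes the method cheap enough to run at any laboratory without statistical software.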
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification of non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned at Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm was designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and was coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created in the TPSs based on IAEA TRS-430 recommendations and were also calculated with the VA; point measurements were collected in a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
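A TG-114-style point-dose check of the kind such a VA implements can be sketched as below. All beam data values (the TMR table, calibration output, and scatter factor) are hypothetical placeholders, not the commissioned data from the study:

```python
def interp(x, xs, ys):
    """Simple linear interpolation within a small lookup table."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("depth outside table")

# hypothetical 6 MV beam data (all values illustrative, not clinical)
TMR_DEPTHS = [5.0, 10.0, 15.0]           # depth, cm
TMR_10X10 = [0.92, 0.78, 0.65]           # TMR for a 10x10 cm field
OUTPUT_CGY_PER_MU = 1.0                  # calibration at reference conditions
SCP_10X10 = 1.0                          # total scatter factor, 10x10 cm

def mu_for_point_dose(dose_cgy, depth_cm):
    """SAD-setup manual check: MU = D / (calibration * Scp * TMR(d))."""
    tmr = interp(depth_cm, TMR_DEPTHS, TMR_10X10)
    return dose_cgy / (OUTPUT_CGY_PER_MU * SCP_10X10 * tmr)

mu = mu_for_point_dose(200.0, 10.0)      # 200 cGy prescribed at 10 cm depth
print(round(mu, 1))  # → 256.4
```

A spreadsheet VA is essentially this chain of lookup tables and products, which is why it can serve as an independent secondary check of the TPS.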
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems make sound decisions is a physician's concern. As a result, verification of system behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of such systems has received attention. This verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study presents a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we use the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.
NASA Astrophysics Data System (ADS)
Peng, Edwin
In recent decades, there has been much interest in functionalized surfaces produced by ultrafast laser processing. Using pulsed lasers with nanosecond to femtosecond pulse durations, a wide range of micro/nanoscale structures can be produced on virtually all metal surfaces. These surface structures create special optoelectronic, wetting, and tribological properties with a diverse range of potential applications. The formation mechanisms of these surface structures, especially microscale, mound-like structures, are not fully understood. Ultrafast laser processing of metals has been widely studied, yet the formation models proposed in the current literature often lack sufficient experimental verification. Specifically, many studies are limited to surface characterization, e.g. scanning electron microscopy of the surfaces of these micro/nanoscale structures. Valuable insight into the physical processes responsible for formation can be obtained if standard materials science characterization methods are performed across the entire mound. In our study, we examined mound-like structures formed on three metal alloys. Using cross-section and 3D slice-and-view operations with a dual-beam scanning electron microscope-focused ion beam, the interior microstructures of these mounds were revealed. Taking advantage of amorphous phase formation during laser processing of Ni60Nb40, we verified the fluence-dependent formation model: mounds formed at low fluence are primarily the result of ablation, while mounds formed at high fluence are formed by both ablation and rapid resolidification driven by hydrodynamic fluid flow. For the first time, we revealed the cross sections of a wide variety of mound-like structures on titanium surfaces. An increased contribution to mound formation by fluid flow with increasing fluence was observed. Finally, a 3D scanning electron microscopy technique was applied to mounds produced on silver surfaces by delayed-pulse laser processing.
The interior microstructure demonstrated that most of the volume consisted of resolidified silver grains with about 1% porosity.
Characterizing proton-activated materials to develop PET-mediated proton range verification markers
NASA Astrophysics Data System (ADS)
Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.
2016-06-01
Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.
Advances in Studies of Electrode Kinetics and Mass Transport in AMTEC Cells (abstract)
NASA Technical Reports Server (NTRS)
Williams, R. M.; Jeffries-Nakamura, B.; Ryan, M. A.; Underwood, M. L.; Kisor, A.; O'Connor, D.; Kikkert, S.
1993-01-01
Previous work reported from JPL has included characterization of electrode kinetics and alkali atom transport from electrodes including Mo, W, WRh(sub x), and WPt(sub x)(Mn) in sodium AMTEC cells and vapor exposure cells, and Mo in potassium vapor exposure cells. These studies were generally performed in cells with small-area electrodes (about 1 to 5 cm(sup 2)), and device geometry had little effect on transport. Alkali diffusion coefficients through these electrodes have been characterized, and approximate surface diffusion coefficients derived in cases of activated transport. A basic model of electrode kinetics at the alkali metal vapor/porous metal electrode/alkali beta'-alumina solid electrolyte three-phase boundary has been proposed which accounts for electrochemical reaction rates with a collision frequency near the three-phase boundary and tunneling from the porous electrode partially covered with adsorbed alkali metal atoms. The small electrode effect in AMTEC cells has been discussed in several papers, but quantitative investigations have described only the overall effect and the important contribution of electrolyte resistance. The quantitative characterization of transport losses in cells with large-area electrodes has been limited to simulations of large-area electrode effects, or characterization of transport losses from large-area electrodes with significant longitudinal temperature gradients. This paper describes new investigations of electrochemical kinetics and transport, particularly with WPt(sub 3.5) electrodes, including the influence of electrode size on the mass transport loss in the AMTEC cell. These electrodes possess excellent sodium transport properties, making verification of device limitations on transport much more readily attained.
ASRM process development in aqueous cleaning
NASA Technical Reports Server (NTRS)
Swisher, Bill
1992-01-01
Viewgraphs are included on process development in aqueous cleaning, which is taking place at the Aerojet Advanced Solid Rocket Motor (ASRM) Division under a NASA Marshall Space Flight Center contract for design, development, test, and evaluation of the ASRM, including new production facilities. The ASRM will utilize aqueous cleaning in several manufacturing process steps to clean case segments, nozzle metal components, and igniter closures. ASRM manufacturing process development is underway, including agent selection, agent characterization, subscale process optimization, bonding verification, and scale-up validation. Process parameters are currently being tested for optimization utilizing a Taguchi matrix, including agent concentration, cleaning solution temperature, agitation and immersion time, rinse water amount and temperature, and use/non-use of drying air. Based on the results of process development testing to date, several observations are offered: aqueous cleaning appears effective for steels and SermeTel-coated metals in ASRM processing; aqueous cleaning agents may stain and/or attack bare aluminum metals to various extents; aqueous cleaning appears unsuitable for thermally sprayed aluminum-coated steel; aqueous cleaning appears to adequately remove a wide range of contaminants from flat metal surfaces, but supplementary assistance may be needed to remove clumps of tenacious contaminants embedded in holes, etc.; and hot rinse water appears to be beneficial in aiding the drying of bare steel and retarding the oxidation rate.
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
NASA Technical Reports Server (NTRS)
Masiulaniec, K. Cyril; Vanfossen, G. James, Jr.; Dewitt, Kenneth J.; Dukhan, Nihad
1995-01-01
A technique was developed to cast frozen ice shapes that had been grown on a metal surface. This technique was applied to a series of ice shapes that were grown in the NASA Lewis Icing Research Tunnel on flat plates. Nine flat plates, 18 inches square, were obtained, from which aluminum castings were made that gave good ice shape characterizations. Test strips taken from these plates were outfitted with heat flux gages such that, when placed in a dry wind tunnel, they can be used to experimentally map the convective heat transfer coefficient in the direction of flow over the roughened surfaces. The effects on the heat transfer coefficient of both parallel and accelerating flow will be studied. The smooth-plate model verification baseline data as well as one ice-roughened test case are presented.
Heat transport through atomic contacts.
Mosso, Nico; Drechsler, Ute; Menges, Fabian; Nirmalraj, Peter; Karg, Siegfried; Riel, Heike; Gotsmann, Bernd
2017-05-01
Heat transport and dissipation at the nanoscale severely limit the scaling of high-performance electronic devices and circuits. Metallic atomic junctions serve as model systems to probe electrical and thermal transport down to the atomic level as well as quantum effects that occur in one-dimensional (1D) systems. Whereas charge transport in atomic junctions has been studied intensively in the past two decades, heat transport remains poorly characterized because it requires the combination of a high sensitivity to small heat fluxes and the formation of stable atomic contacts. Here we report heat-transfer measurements through atomic junctions and analyse the thermal conductance of single-atom gold contacts at room temperature. Simultaneous measurements of charge and heat transport reveal the proportionality of electrical and thermal conductance, quantized with the respective conductance quanta. This constitutes a verification of the Wiedemann-Franz law at the atomic scale.
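The proportionality reported above can be checked numerically from first principles: under the Wiedemann-Franz law, the thermal conductance of a spin-degenerate single-channel contact should equal L0·T·G0, with L0 the Sommerfeld value of the Lorenz number. This is a back-of-envelope sketch using exact SI constants, not the paper's measurement analysis:

```python
import math

# CODATA exact values (2019 SI redefinition)
e = 1.602176634e-19    # elementary charge, C
h = 6.62607015e-34     # Planck constant, J s
kB = 1.380649e-23      # Boltzmann constant, J/K

T = 300.0                                # room temperature, K
G0 = 2 * e ** 2 / h                      # electrical conductance quantum, S
L0 = (math.pi ** 2 / 3) * (kB / e) ** 2  # Sommerfeld Lorenz number, W Ohm/K^2
kappa = L0 * T * G0                      # Wiedemann-Franz prediction for the
                                         # thermal conductance of one spin-
                                         # degenerate channel, W/K
print(f"L0 = {L0:.3e} W Ohm/K^2, kappa = {kappa:.2e} W/K")
```

The predicted sub-nW/K conductance per atomic channel illustrates why such measurements demand both very high heat-flux sensitivity and stable atomic contacts.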
Material Selection for Cable Gland to Improved Reliability of the High-hazard Industries
NASA Astrophysics Data System (ADS)
Vashchuk, S. P.; Slobodyan, S. M.; Deeva, V. S.; Vashchuk, D. S.
2018-01-01
Sealed cable glands (SCGs) are used to ensure the safe connection of sheathed single wires, as well as pilot cables, control cables, radio-frequency cables, and others, at hazardous production facilities (nuclear power plants and others). In this paper, we investigate the specifics of material selection for SCGs intended expressly for hazardous man-made facilities. We discuss the safe working conditions for cable glands. The research indicates that sintered powdered metals provide reliability growth for cables due to their properties. A number of studies have demonstrated the verification of material selection. Our findings indicate that double-glazed sealed units could enhance reliability. We evaluated sample reliability under fire conditions, seismic load, and pressure-containment failure. The samples used mineral-insulated thermocouple cable.
2017-08-01
comparable with MARATHON 1 in terms of output. Rather, the MARATHON 2 verification cases were designed to ensure correct implementation of the new algorithms... for employment against demands. This study is a comparative verification of the functionality of MARATHON 4 (our newest implementation of MARATHON
NASA Technical Reports Server (NTRS)
1975-01-01
The findings of investigations on concepts and techniques in automated performance verification are presented. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study explores the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative, and community verification. Results show that the verification processes are complex, costly, and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward.
Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to modify it iteratively during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
A field study of the accuracy and reliability of a biometric iris recognition system.
Latman, Neal S; Herb, Emily
2013-06-01
The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A subset of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that could potentially circumvent the use of iris recognition systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus.
NASA Technical Reports Server (NTRS)
Levine, S. R.
1982-01-01
A first-cut integrated environmental attack life prediction methodology for hot section components is addressed. The HOST program is concerned with oxidation and hot corrosion attack of metallic coatings as well as their degradation by interdiffusion with the substrate. The effects of the environment and coatings on creep/fatigue behavior are being addressed through a joint effort with the Fatigue sub-project. An initial effort will attempt to scope the problem of thermal barrier coating life prediction. Verification of models will be carried out through benchmark rig tests including a 4 atm. replaceable blade turbine and a 50 atm. pressurized burner rig.
Experimental verification of the rainbow trapping effect in adiabatic plasmonic gratings
Gan, Qiaoqiang; Gao, Yongkang; Wagner, Kyle; Vezenov, Dmitri; Ding, Yujie J.; Bartoli, Filbert J.
2011-01-01
We report the experimental observation of a trapped rainbow in adiabatically graded metallic gratings, designed to validate theoretical predictions for this unique plasmonic structure. One-dimensional graded nanogratings were fabricated and their surface dispersion properties tailored by varying the grating groove depth, whose dimensions were confirmed by atomic force microscopy. Tunable plasmonic bandgaps were observed experimentally, and direct optical measurements on graded grating structures show that light of different wavelengths in the 500–700-nm region is “trapped” at different positions along the grating, consistent with computer simulations, thus verifying the “rainbow” trapping effect. PMID:21402936
Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia
2014-11-01
Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. 
Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required.
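The p=0.038 reported for the syringe-volume task can be reproduced with a one-sided Fisher's exact test on the 2x2 table (16/18 errors pre-intervention vs 11/19 post). The abstract does not state the test's sidedness; a one-sided hypergeometric tail is assumed here because it matches the quoted value:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test: probability, under the
    hypergeometric null, of a table at least as extreme as observed
    (first cell equal to `a` or larger, margins fixed)."""
    r1, r2, c1 = a + b, c + d, a + c
    n = r1 + r2
    denom = comb(n, c1)
    hi = min(r1, c1)
    return sum(comb(r1, x) * comb(r2, c1 - x) for x in range(a, hi + 1)) / denom

# syringe-volume verification task: 16/18 nurses erred when interrupted
# pre-intervention vs 11/19 post-intervention
p = fisher_exact_one_sided(16, 2, 11, 8)
print(round(p, 3))  # → 0.038
```

Fisher's exact test is the natural choice here because several cells are small, making the chi-squared approximation unreliable.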
SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Itano, M; Yamazaki, T; Tachibana, R
Purpose: In general, the beam data of an individual linac are measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. One was the individually measured data (Original Beam Data, OBD). The others were generated from all measurements for the same linac model (Model-GBD) and for all linac models (All-GBD). The three beam data sets were registered in the independent verification software program at each institute. Subsequently, patients' plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD, the variation using the Model-GBD-based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. The plans with variation over 1% had reference points located away from the central axis, with/without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
Zhang, Ying; Alonzo, Todd A
2016-11-01
In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
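The inverse-probability-weighted VUS estimator described above can be sketched in a few lines. The function below is an illustrative reconstruction, not the authors' implementation: the names and the brute-force triple loop are my own. Each verified subject is weighted by the inverse of its verification probability, and the VUS is the weighted fraction of correctly ordered triples across the three disease states.

```python
import numpy as np

def ipw_vus(scores, disease, verified, p_verify):
    """Inverse-probability-weighted estimate of the volume under the
    ROC surface (VUS) under partial disease verification.

    `disease` codes the three states as 0 < 1 < 2; `p_verify` is the
    (known or modeled) probability that each subject was verified.
    """
    w = verified / p_verify                     # IPW weight; zero for unverified
    idx = [np.flatnonzero((disease == k) & (verified == 1)) for k in (0, 1, 2)]
    num = den = 0.0
    for i in idx[0]:
        for j in idx[1]:
            for k in idx[2]:
                wt = w[i] * w[j] * w[k]
                den += wt
                # concordant triple: scores increase with disease severity
                num += wt * float(scores[i] < scores[j] < scores[k])
    return num / den
```

When every subject is verified with probability one, this reduces to the ordinary empirical VUS; in practice `p_verify` would be estimated from the test results and clinical covariates that drove the selection for verification.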
Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B
2015-03-01
Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios were generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
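As a concrete illustration of how these two biases interact, the short simulation below (all numbers are arbitrary assumptions for illustration, not values from the study) applies an imperfect reference standard together with preferential verification of test-positives, and reproduces the reported pattern: apparent sensitivity inflated, apparent specificity deflated.

```python
import numpy as np

def apparent_accuracy(n=200_000, seed=0):
    """Simulate combined verification bias and classification bias.

    Assumed true test: sensitivity 0.80, specificity 0.90; imperfect
    reference standard: 95%/95%; test-positives are verified three
    times more often than test-negatives.
    """
    rng = np.random.default_rng(seed)
    disease = rng.random(n) < 0.30                        # true disease status
    test_pos = np.where(disease, rng.random(n) < 0.80,    # index test result
                        rng.random(n) < 0.10)
    ref_pos = np.where(disease, rng.random(n) < 0.95,     # misclassified "gold standard"
                       rng.random(n) < 0.05)
    verified = rng.random(n) < np.where(test_pos, 0.90, 0.30)  # partial verification
    sens = (test_pos & ref_pos & verified).sum() / (ref_pos & verified).sum()
    spec = (~test_pos & ~ref_pos & verified).sum() / (~ref_pos & verified).sum()
    return sens, spec
```

Under these assumptions the apparent sensitivity comes out near 0.89 (true value 0.80) and the apparent specificity near 0.72 (true value 0.90), a larger distortion than either bias produces alone.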
Electronic Cigarette Sales to Minors via the Internet
Williams, Rebecca S.; Derrick, Jason; Ribisl, Kurt M.
2015-01-01
Importance Electronic cigarettes (e-cigarettes) entered the US market in 2007 and, with little regulatory oversight, grew into a $2-billion-a-year industry by 2013. The Centers for Disease Control and Prevention has reported a trend of increasing e-cigarette use among teens, with use rates doubling from 2011 to 2012. While several studies have documented that teens can and do buy cigarettes online, to our knowledge, no studies have yet examined age verification among Internet tobacco vendors selling e-cigarettes. Objective To estimate the extent to which minors can successfully purchase e-cigarettes online and assess compliance with North Carolina's 2013 e-cigarette age-verification law. Design, Setting, and Participants In this cross-sectional study conducted from February 2014 to June 2014, 11 nonsmoking minors aged 14 to 17 years made supervised e-cigarette purchase attempts from 98 Internet e-cigarette vendors. Purchase attempts were made at the University of North Carolina Internet Tobacco Vendors Study project offices using credit cards. Main Outcome and Measure Rate at which minors can successfully purchase e-cigarettes on the Internet. Results Minors successfully received deliveries of e-cigarettes from 76.5% of purchase attempts, with no attempts by delivery companies to verify their ages at delivery and 95% of delivered orders simply left at the door. All delivered packages came from shipping companies that, according to company policy or federal regulation, do not ship cigarettes to consumers. Of the total orders, 18 failed for reasons unrelated to age verification. Only 5 of the remaining 80 youth purchase attempts were rejected owing to age verification, resulting in a youth buy rate of 93.7%. None of the vendors complied with North Carolina's e-cigarette age-verification law. 
Conclusions and Relevance Minors are easily able to purchase e-cigarettes from the Internet because of an absence of age-verification measures used by Internet e-cigarette vendors. Federal law should require and enforce rigorous age verification for all e-cigarette sales as with the federal PACT (Prevent All Cigarette Trafficking) Act's requirements for age verification in Internet cigarette sales. PMID:25730697
NASA Technical Reports Server (NTRS)
Srivas, Mandayam; Bickford, Mark
1991-01-01
The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...
Qi, Zhen-Yu; Deng, Xiao-Wu; Huang, Shao-Min; Shiu, Almon; Lerch, Michael; Metcalfe, Peter; Rosenfeld, Anatoly; Kron, Tomas
2011-08-01
A real-time dose verification method using a recently designed metal oxide semiconductor field effect transistor (MOSFET) dosimetry system was evaluated for quality assurance (QA) of intensity-modulated radiation therapy (IMRT). Following the investigation of key parameters that might affect the accuracy of MOSFET measurements (i.e., source-surface distance [SSD], field size, beam incident angle, and radiation energy spectrum), the feasibility of this detector in IMRT dose verification was demonstrated by comparison with ion chamber measurements taken in an IMRT QA phantom. Real-time in vivo measurements were also performed with the MOSFET system during serial tomotherapy treatments administered to 8 head and neck cancer patients. MOSFET sensitivity did not change with SSD. For field sizes smaller than 20 × 20 cm(2), MOSFET sensitivity varied within 1.0%. The detector angular response was isotropic within 2% over 360°, and the observed sensitivity variation due to changes in the energy spectrum was negligible for 6-MV photons. MOSFET system measurements and ion chamber measurements agreed at all points in IMRT phantom plan verification, within 5%. The mean difference between 48 IMRT MOSFET-measured doses and calculated values in 8 patients was 3.33%, ranging from -2.20% to 7.89%. More than 90% of the total measurements deviated by less than 5% from the planned doses. The MOSFET dosimetry system has been proven to be an effective tool for evaluating the actual dose within individual patients during IMRT treatment. Copyright © 2011 Elsevier Inc. All rights reserved.
INF and IAEA: A comparative analysis of verification strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheinman, L.; Kratzer, M.
1992-07-01
This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.
DOT National Transportation Integrated Search
2008-06-30
The following Independent Verification and Validation (IV&V) report documents and presents the results of a study of the Washington State Ferries Prototype Wireless High Speed Data Network. The purpose of the study was to evaluate and determine if re...
The development, verification, and comparison study between LC-MS libraries for two manufacturers’ instruments and a verified protocol are discussed. The LC-MS library protocol was verified through an inter-laboratory study that involved Federal, State, and private laboratories. ...
A verification library for multibody simulation software
NASA Technical Reports Server (NTRS)
Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.
1989-01-01
A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
Generic Verification Protocol for Verification of Online Turbidimeters
This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...
Verification Assessment of Flow Boundary Conditions for CFD Analysis of Supersonic Inlet Flows
NASA Technical Reports Server (NTRS)
Slater, John W.
2002-01-01
Boundary conditions for subsonic inflow, bleed, and subsonic outflow as implemented into the WIND CFD code are assessed with respect to verification for steady and unsteady flows associated with supersonic inlets. Verification procedures include grid convergence studies and comparisons to analytical data. The objective is to examine errors, limitations, capabilities, and behavior of the boundary conditions. Computational studies were performed on configurations derived from a "parameterized" supersonic inlet. These include steady supersonic flows with normal and oblique shocks, steady subsonic flow in a diffuser, and unsteady flow with the propagation and reflection of an acoustic disturbance.
Verification Games: Crowd-Sourced Formal Verification
2016-03-01
Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016 final technical report. Dates covered: June 2012 – September 2015. Contract number: FA8750... Abstract: Over the more than three years of the project Verification Games: Crowd-sourced
Lightweight Carbon-Carbon High-Temperature Space Radiator
NASA Technical Reports Server (NTRS)
Miller, W.O.; Shih, Wei
2008-01-01
A document summarizes the development of a carbon-carbon composite radiator for dissipating waste heat from a spacecraft nuclear reactor. The radiator is to be bonded to metal heat pipes and to operate in conjunction with them at temperatures approximately between 500 and 1,000 K. A goal of this development is to reduce the average areal mass density of a radiator to about 2 kg/m(exp 2) from the current value of approximately 10 kg/m(exp 2) characteristic of spacecraft radiators made largely of metals. Accomplishments thus far include: (1) bonding of metal tubes to carbon-carbon material by a carbonization process that includes heating to a temperature of 620 C; (2) verification of the thermal and mechanical integrity of the bonds through pressure-cycling, axial-shear, and bending tests; and (3) construction and testing of two prototype heat-pipe/carbon-carbon-radiator units having different radiator areas, numbers of heat pipes, and areal mass densities. On the basis of the results achieved thus far, it is estimated that optimization of design could yield an areal mass density of 2.2 kg/m(exp 2), close to the goal of 2 kg/m(exp 2).
Direct observation of how the heavy-fermion state develops in CeCoIn5
NASA Astrophysics Data System (ADS)
Chen, Q. Y.; Xu, D. F.; Niu, X. H.; Jiang, J.; Peng, R.; Xu, H. C.; Wen, C. H. P.; Ding, Z. F.; Huang, K.; Shu, L.; Zhang, Y. J.; Lee, H.; Strocov, V. N.; Shi, M.; Bisti, F.; Schmitt, T.; Huang, Y. B.; Dudin, P.; Lai, X. C.; Kirchner, S.; Yuan, H. Q.; Feng, D. L.
2017-07-01
Heavy-fermion systems share some of the strange metal phenomenology seen in other unconventional superconductors, providing a unique opportunity to set strange metals in a broader context. Central to understanding heavy-fermion systems is the interplay of localization and itinerancy. These materials acquire high electronic masses and a concomitant Fermi volume increase as the f electrons delocalize at low temperatures. However, despite the widespread acceptance of this view, a direct microscopic verification has been lacking. Here we report high-resolution angle-resolved photoemission measurements on CeCoIn5, a prototypical heavy-fermion compound, which spectroscopically resolve the development of band hybridization and the Fermi surface expansion over a wide temperature region. Unexpectedly, the localized-to-itinerant transition occurs at high temperatures, yet f electrons are still largely localized even at the lowest temperature. These findings point to an unanticipated role played by crystal-field excitations in the strange metal behavior of CeCoIn5. Our results offer a comprehensive experimental picture of the heavy-fermion formation, setting the stage for understanding the emergent properties, including unconventional superconductivity, in this and related materials.
Verification of E-Beam direct write integration into 28nm BEOL SRAM technology
NASA Astrophysics Data System (ADS)
Hohle, Christoph; Choi, Kang-Hoon; Gutsch, Manuela; Hanisch, Norbert; Seidel, Robert; Steidel, Katja; Thrun, Xaver; Werner, Thomas
2015-03-01
Electron beam direct write lithography (EBDW) potentially offers advantages for low-volume semiconductor manufacturing, rapid prototyping, or design verification due to its high flexibility without the need for costly masks. However, the integration of this advanced patterning technology into complex CMOS manufacturing processes remains challenging. The low throughput of today's single e-beam tools limits high-volume manufacturing applications, and the maturity of parallel (multi-)beam systems is still insufficient [1,2]. Additional concerns, such as transistor or material damage of underlying layers during exposure at high electron density or acceleration voltage, have to be addressed for advanced technology nodes. In the past we successfully proved that potential degradation effects of high-k materials or ULK shrink can be neglected, demonstrating integrated electrical results of 28nm-node transistor and BEOL performance following 50kV electron beam exposure [3]. Here we give an update on the integration of EBDW into the 300mm CMOS manufacturing processes of advanced integrated circuits at the 28nm SRAM node of GLOBALFOUNDRIES Dresden. The work is an update to what has been previously published [4]. E-beam patterning results of BEOL full-chip metal and via layers with a dual damascene integration scheme, using a 50kV VISTEC SB3050DW variable-shaped electron beam direct writer at Fraunhofer IPMS-CNT, are demonstrated. For the patterning of the metal layer, a mix-and-match concept based on the litho-etch-litho-etch (LELE) sequence was developed and evaluated, wherein several exposure fields were blanked out during the optical exposure. Etch results are shown and compared to the process of record (POR). Results are also shown on overlay performance and optimized e-beam exposure time using the most advanced data prep solutions and resist processes. The patterning results have been verified using fully integrated electrical measurements of metal lines and vias on wafer level.
In summary, we demonstrate the integration capability of EBDW into a productive CMOS process flow using the example of the 28nm SRAM technology node.
Swann, William B; Kwan, Virginia S Y; Polzer, Jeffrey T; Milton, Laurie P
2003-11-01
A longitudinal study examined the interplay of identity negotiation processes and diversity in small groups of master's of business administration (MBA) students. When perceivers formed relatively positive impressions of other group members, higher diversity predicted more individuation of targets. When perceivers formed relatively neutral impressions of other group members, however, higher diversity predicted less individuation of targets. Individuation at the outset of the semester predicted self-verification effects several weeks later, and self-verification, in turn, predicted group identification and creative task performance. The authors conclude that contrary to self-categorization theory, fostering individuation and self-verification in diverse groups may maximize group identification and productivity.
Developing a Test for Assessing Elementary Students' Comprehension of Science Texts
ERIC Educational Resources Information Center
Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien
2012-01-01
This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…
Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo
2012-01-01
Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. Availability and Implementation: www.sbvimprover.com. This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.
Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.
Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H
2018-01-01
The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (VO2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (VO2) during a RI test to that obtained during a verification phase aimed at confirming attainment of VO2max. Sixty-one healthy males [31 older (O), 65 ± 5 yrs; 30 younger (Y), 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant-load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest VO2 after the RI test (39.8 ± 11.5 mL·kg−1·min−1) and the verification phase (40.1 ± 11.2 mL·kg−1·min−1) were not different (p = 0.33) and they were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg−1·min−1, not different from 0) and a precision of ±1.56 mL·kg−1·min−1 between measures. This study indicated that a verification phase does not highlight an under-estimation of VO2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal VO2 elicited during the RI test and the verification phase. Thus, a verification phase does not add any validation to the determination of VO2max. Therefore, the recommendation that a verification phase should become a gold-standard procedure, although initially appealing, is not supported by the experimental data.
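The Bland-Altman quantities quoted above (the bias and the precision band around it) follow directly from the paired differences between the two measurements; a minimal sketch, with a hypothetical function name, is:

```python
import numpy as np

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement
    between two paired measurement methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)               # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Applied to paired VO2max values from the RI test and the verification phase, `bias` corresponds to the -0.25 mL·kg−1·min−1 figure and `1.96 * sd` to the ±1.56 mL·kg−1·min−1 precision reported in the abstract.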
Design and verification of a novel hollow vibrating module for laser machining.
Wang, Zhaozhao; Jang, Seungbong; Kim, EunHee; Jeon, Yongho; Lee, Soo-Hun; Lee, Moon G
2015-04-01
If a vibration module is added to a laser machining system, the quality of surface finish and the aspect ratio achievable on metals can be significantly enhanced. In this study, a single-mobility model of vibrating the laser along the path of the laser beam was put forward. In order to realize the desired unidirectional motion, a resonance-type vibration module with an optical lens was designed and manufactured. This cylindrical module was composed of curved-beam flexure elements. A cylindrical coordinate system was established to describe the relationship between a curved-beam flexure element's motion and deformation. In addition, the stiffness matrix of the curved-beam element was obtained. Finite element analysis and dynamical modeling were used to analyze the resonance frequency and the displacement of the motion. The feasibility of the design was demonstrated with the help of experiments on frequency response. Experimental results show good agreement with theoretical analysis and simulation predictions.
NASA Astrophysics Data System (ADS)
Matsuura, Masahiro; Mano, Takaaki; Noda, Takeshi; Shibata, Naokazu; Hotta, Masahiro; Yusa, Go
2018-02-01
Quantum energy teleportation (QET) is a proposed protocol related to quantum vacuum. The edge channels in a quantum Hall system are well suited for the experimental verification of QET. For this purpose, we examine a charge-density wave packet excited and detected by capacitively coupled front gate electrodes. We observe the waveform of the charge packet, which is proportional to the time derivative of the applied square voltage wave. Further, we study the transmission and reflection behaviors of the charge-density wave packet by applying a voltage to another front gate electrode to control the path of the edge state. We show that the threshold voltages where the dominant direction is switched in either transmission or reflection for dense and sparse wave packets are different from the threshold voltage where the current stops flowing in an equilibrium state.
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Results from an Independent View on The Validation of Safety-Critical Space Systems
NASA Astrophysics Data System (ADS)
Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.
2013-08-01
Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the tester's work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages and disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.
Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D
2014-03-01
Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm
Hashimoto, Koichi
2017-01-01
Bin picking refers to picking randomly piled objects from a bin for industrial production purposes, and robotic bin picking is widely used in automated assembly lines. In order to achieve higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly piled parts. This paper proposes a pose estimation algorithm for bin picking tasks using point cloud data. A novel descriptor, Curve Set Feature (CSF), is proposed to describe a point by the surface fluctuation around it, and is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSF efficiently. The matching process combines the idea of matching in the 2D space of the original Point Pair Feature (PPF) algorithm with nearest neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and is proven to be more than 30 times faster than the kd-tree-based verification method. Our algorithm is evaluated against a large number of synthetic and real scenes and proven to be robust to noise, able to detect metal parts, and more accurate and more than 10 times faster than PPF and Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216
Double patterning from design enablement to verification
NASA Astrophysics Data System (ADS)
Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya
2011-11-01
Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions. ∘ We examine DP design methodologies and current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts. ∘ In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design, and use placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE and the cut allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines. ∘ We discuss why LELE DP cuts and overlaps are critical to optical process correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution. ∘ With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.
Wong, James Min-Leong; Liu, Yen-Liang; Graves, Stephen; de Steiger, Richard
2015-11-01
More than 15,000 primary hip resurfacing arthroplasties have been recorded by the Australian Orthopaedic Association National Joint Replacement Registry (AOANJRR), with 884 primary procedures requiring revision for reasons other than infection, a cumulative percent revision rate at 12 years of 11%. However, few studies have reported the survivorship of these revision procedures. (1) What is the cumulative percent rerevision rate for revision procedures for failed hip resurfacings? (2) Is there a difference in rerevision rate among different types of revision or bearing surfaces? The AOANJRR collects data on all primary and revision hip joint arthroplasties performed in Australia and, after verification against health department data, checking of unmatched procedures, and subsequent retrieval of unreported procedures, is able to obtain an almost complete data set relating to hip arthroplasty in Australia. Revision procedures are linked to the known primary hip arthroplasty. There were 15,360 primary resurfacing hip arthroplasties recorded, of which 884 had undergone revision, and this was the cohort available to study. The types of revisions were acetabular only, femoral only, or revision of both acetabular and femoral components. With the exception of the acetabular-only revisions, all revisions converted hip resurfacing arthroplasties to conventional (stemmed) total hip arthroplasties (THAs). All initial revisions for infection were excluded. The survivorship of the different types of revisions and that of the different bearing surfaces used were estimated using the Kaplan-Meier method and compared using Cox proportional hazard models. Cumulative percent revision was calculated by determining the complement of the Kaplan-Meier survivorship function at that time multiplied by 100. Of the 884 revisions recorded, 102 underwent further revision, a cumulative percent rerevision at 10 years of 26% (95% confidence interval, 19.6-33.5).
There was no difference in the rate of rerevision between acetabular revision and combined femoral and acetabular revision (hazard ratio [HR], 1.06 [0.47-2], p = 0.888), femoral revision and combined femoral and acetabular revision (HR, 1.00 [0.65-2], p = 0.987), and acetabular revision and femoral revision (HR, 1.06 [0.47-2], p = 0.893). There was no difference in the rate of rerevision when comparing different bearing surfaces (metal-on-metal versus ceramic-on-ceramic HR, 0.46 [0.16-1.29], p = 0.141; metal-on-metal versus ceramic-on-crosslinked polyethylene HR, 0.51 [0.15-1.76], p = 0.285; metal-on-metal versus metal-on-crosslinked polyethylene HR, 0.62 [0.20-1.89], p = 0.399; and metal-on-metal versus oxinium-on-crosslinked polyethylene HR, 0.53 [0.14-2.05], p = 0.356). Revision of a primary hip resurfacing arthroplasty is associated with a high risk of rerevision. This study may help surgeons guide their patients about the outcomes in the longer term after the first revision of hip resurfacing arthroplasty. Level III, therapeutic study.
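The "cumulative percent revision" statistic described above is simply the complement of the Kaplan-Meier survivorship function multiplied by 100. A minimal sketch of that calculation (illustrative only; registry analyses use full statistical packages and Cox proportional hazard models):

```python
# Kaplan-Meier survivorship and cumulative percent revision (sketch).

def km_survival(times, events, t):
    """Kaplan-Meier estimate of S(t).
    times: follow-up time per procedure; events: 1 = rerevision, 0 = censored."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    i = 0
    while i < len(data) and data[i][0] <= t:
        t_i = data[i][0]
        d = leaving = 0
        # group ties at the same event time
        while i < len(data) and data[i][0] == t_i:
            d += data[i][1]
            leaving += 1
            i += 1
        if d:
            s *= 1.0 - d / at_risk
        at_risk -= leaving
    return s

def cumulative_percent_revision(times, events, t):
    """Complement of the KM survivorship function at time t, times 100."""
    return 100.0 * (1.0 - km_survival(times, events, t))
```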
Spacecraft attitude calibration/verification baseline study
NASA Technical Reports Server (NTRS)
Chen, L. C.
1981-01-01
A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.
Using expansive grasses for monitoring heavy metal pollution in the vicinity of roads.
Vachová, Pavla; Vach, Marek; Najnarová, Eva
2017-10-01
We propose a method for monitoring heavy metal deposition in the vicinity of roads using the leaf surfaces of two expansive grass species which are greatly abundant. A principle of the proposed procedure is to minimize the number of operations in collecting and preparing samples for analysis. The monitored elements are extracted from the leaf surfaces using dilute nitric acid directly in the sample-collection bottle. The ensuing steps, then, are only to filter the extraction solution and the elemental analysis itself. The verification results indicate that the selected grasses Calamagrostis epigejos and Arrhenatherum elatius are well suited to the proposed procedure. Selected heavy metals (Zn, Cu, Pb, Ni, Cr, and Cd) in concentrations appropriate for direct determination using methods of elemental analysis can be extracted from the surface of leaves of these species collected in the vicinity of roads with medium traffic loads. Comparing the two species showed that each had a different relationship between the amounts of deposited heavy metals and distance from the road. This disparity can be explained by specific morphological properties of the two species' leaf surfaces. Due to the abundant occurrence of the two species and the method's general simplicity and ready availability, we regard the proposed approach to constitute a broadly usable and repeatable one for producing reproducible results. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Candelone, Jean-Pierre; Hong, Sungmin; Pellone, Christian; Boutron, Claude F.
1995-08-01
Pb, Zn, Cd and Cu have been measured using ultraclean procedures in various sections of a 70.3-m snow/ice core covering the past 220 years (including the Industrial Revolution) drilled at Summit, central Greenland. These time series are the first reliable ones ever published for Zn, Cd, and Cu; for Pb they are the first verification of the pioneering data published more than two decades ago by C. Patterson and his coworkers [Murozumi et al., 1969]. For all four heavy metals, concentrations are found to have markedly increased up until the 1960s and 1970s before decreasing significantly during the following few decades. The timing and the amplitude of the observed changes differ significantly, however, from one metal to another. Comparison with concentration values obtained by analyzing ancient Holocene ice dated 7760 years B.P., that is, before humans started to impact the atmosphere, shows that no detectable increase occurred for Zn, Cd, and Cu before the Industrial Revolution. On the other hand, Pb concentrations were already one order of magnitude above natural values in late 18th century ice. Cumulative deposition of heavy metals to the whole Greenland ice cap since the Industrial Revolution ranges from 3200 t for Pb to 60 t for Cd.
NASA Astrophysics Data System (ADS)
Huzak, M.; Deleuze, M. S.; Hajgató, B.
2011-09-01
An analysis using the formalism of crystalline orbitals for extended systems with periodicity in one dimension demonstrates that any antiferromagnetic and half-metallic spin-polarization of the edge states in n-acenes, and more generally in zigzag graphene nanoislands and nanoribbons of finite width, would imply a spin contamination ⟨S²⟩ that increases proportionally to system size, in sharp and clear contradiction with the implications of Lieb's theorem for compensated bipartite lattices and the expected value for a singlet (S = 0) electronic ground state. Verifications on naphthalene, larger n-acenes (n = 3-10), and rectangular nanographene islands of increasing size, as well as a comparison of unrestricted Hartree-Fock theory, with basis sets of improving quality, against various many-body treatments, demonstrate altogether that antiferromagnetism and half-metallicity in extended graphene nanoribbons will be quenched by an exact treatment of electron correlation, at the confines of non-relativistic many-body quantum mechanics. Indeed, for singlet states, symmetry-breakings in spin-densities are necessarily the outcome of a too approximate treatment of static and dynamic electron correlation in single-determinantal approaches, such as unrestricted Hartree-Fock or Density Functional Theory. In this context, half-metallicity, like the size-extensive spin contamination to which it relates, is thus nothing more than a methodological artefact.
Huzak, M; Deleuze, M S; Hajgató, B
2011-09-14
An analysis using the formalism of crystalline orbitals for extended systems with periodicity in one dimension demonstrates that any antiferromagnetic and half-metallic spin-polarization of the edge states in n-acenes, and more generally in zigzag graphene nanoislands and nanoribbons of finite width, would imply a spin contamination
NASA Technical Reports Server (NTRS)
1981-01-01
The technical readiness of a cost-effective process sequence with the potential to produce flat-plate photovoltaic modules meeting the 1986 price goal of $0.70 or less per peak watt was demonstrated. The proposed process sequence was reviewed and laboratory verification experiments were conducted. The preliminary process includes the following features: semicrystalline silicon (10 cm by 10 cm) as the silicon input material; spray-on dopant diffusion source; Al paste BSF formation; spray-on AR coating; electroless Ni plate/solder dip metallization; laser-scribed edges; K & S tabbing and stringing machine; and laminated EVA modules.
Advanced composite vertical fin for L-1011 aircraft
NASA Technical Reports Server (NTRS)
Jackson, A. C.
1984-01-01
The structural box of the L-1011 vertical fin was redesigned using advanced composite materials. The box was fabricated and ground tested to verify its structural integrity. This report summarizes the complete program, starting with the design and analysis and proceeding through process development, the ancillary test program, production readiness verification testing, fabrication of the full-scale fin boxes, and the full-scale ground testing. The program showed that advanced composites can be used economically and effectively in the design and fabrication of medium primary structures for commercial aircraft. Static-strength variability was demonstrated to be comparable to that of metal structures, and the long-term durability of advanced composite components was demonstrated.
Quantitative Hydrocarbon Surface Analysis
NASA Technical Reports Server (NTRS)
Douglas, Vonnie M.
2000-01-01
The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES
The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2013-12-01
The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and correct implementation of initial/boundary conditions. In the computational PDE context, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even the test results need to be analyzed mathematically to distinguish between an inherent limitation of the algorithm and a coding error. Code verification therefore remains something of an art, in which innovative methods and case-based tricks are very common. This study presents full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors that were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, starting from simple cases and building to the most sophisticated level.
Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective, qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests. These tests compare the observed order of accuracy of the results with the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize Symmetry, Complete Richardson Extrapolation, and the Method of False Injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
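Two of the verification tools this abstract relies on, mesh convergence (observed order of accuracy) and Richardson extrapolation, reduce to short formulas. A minimal sketch, assuming a smooth solution, a known refinement ratio, and a single leading error term (the function names are illustrative, not from the code under discussion):

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed convergence order p from errors on grids h and h/refinement:
    p = log(e_coarse / e_fine) / log(refinement).
    Compared against the formal order of the discretization in a mesh
    convergence test."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

def richardson(f_coarse, f_fine, p, refinement=2.0):
    """Richardson extrapolation: combine solutions on grids h and h/refinement,
    assuming a leading error term of order p, to cancel that term."""
    r = refinement ** p
    return (r * f_fine - f_coarse) / (r - 1.0)
```

For a second-order scheme, halving the grid spacing should cut the error by a factor of about four; a measured order well below the formal order signals a bug or an unresolved scale (e.g., a high Peclet number masking the asymptotic regime).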
Separating stages of arithmetic verification: An ERP study with a novel paradigm.
Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes
2015-08-01
In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
NASA Technical Reports Server (NTRS)
Cramer, B. A.; Davis, J. W.
1975-01-01
A method for predicting permanent cyclic creep deflections in stiffened panel structures was developed. The resulting computer program may be applied to either the time-hardening or strain-hardening theories of creep accumulation. Iterative techniques were used to determine structural rotations, creep strains, and stresses as a function of time. Deflections were determined by numerical integration of structural rotations along the panel length. The analytical approach was developed for analyzing thin-gage entry vehicle metallic-thermal-protection-system panels subjected to cyclic bending loads at high temperatures, but may be applied to any panel subjected to bending loads. Predicted panel creep deflections were compared with results from cyclic tests of subsize corrugation and rib-stiffened panels. Empirical equations were developed for each material based on correlation with tensile cyclic creep data, and both the subsize panels and tensile specimens were fabricated from the same sheet material. For Vol. 1, see N75-21431.
NASA Technical Reports Server (NTRS)
Ricks, Glen A.
1988-01-01
The assembly test article (ATA) consisted of two live loaded redesigned solid rocket motor (RSRM) segments which were assembled and disassembled to simulate the actual flight segment stacking process. The test assembly joint was flight RSRM design, which included the J-joint insulation design and metal capture feature. The ATA test was performed mid-November through 24 December 1987, at Kennedy Space Center (KSC), Florida. The purpose of the test was: certification that vertical RSRM segment mating and separation could be accomplished without any damage; verification and modification of the procedures in the segment stacking/destacking documents; and certification of various GSE to be used for flight assembly and inspection. The RSRM vertical segment assembly/disassembly is possible without any damage to the insulation, metal parts, or seals. The insulation J-joint contact area was very close to the predicted values. Numerous deviations and changes to the planning documents were made to ensure the flight segments are effectively and correctly stacked. Various GSE were also certified for use on flight segments, and are discussed in detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Logsdon, W.A.; Begley, J.A.; Gottshall, C.L.
1978-03-01
The ASME Boiler and Pressure Vessel Code, Section III, Article G-2000, requires that dynamic fracture toughness data be developed for materials with specified minimum yield strengths greater than 50 ksi to provide verification and utilization of the ASME specified minimum reference toughness K_IR curve. In order to qualify ASME SA508 Class 2a and ASME SA533 Grade A Class 2 pressure vessel steels (minimum yield strengths of 65 ksi and 70 ksi, respectively) per this requirement, dynamic fracture toughness tests were performed on these materials. All dynamic fracture toughness values of SA508 Class 2a base and HAZ material, SA533 Grade A Class 2 base and HAZ material, and applicable weld metals exceeded the ASME specified minimum reference toughness K_IR curve.
NASA Astrophysics Data System (ADS)
Wang, Fang; Liu, Chang; Liu, Xiaoning; Niu, Tiaoming; Wang, Jing; Mei, Zhonglei; Qin, Jiayong
2017-06-01
In this paper, a flat, incident-angle-independent absorbing material is proposed and numerically verified in the optical spectrum. A homogeneous and anisotropic dielectric slab acting as a non-reflecting layer is first reviewed, and a feasible realization strategy for the slab is then given using layered isotropic materials. When the loss components of the constitutive materials are nonzero, the slab works as an angle-insensitive absorbing layer, and the absorption rate increases with the losses. As numerical verification, the field distributions of a metallic cylinder and a triangular metallic object, each covered by the designed absorbing layer, are demonstrated. The simulation results show that the designed absorbing layer can efficiently absorb the incident waves, independent of incident angle, at the operation frequency. This homogeneous slab can be used in one- and two-dimensional situations for the realization of an invisibility cloak, a carpet cloak, and even a skin cloak, if it is used to conformally cover target objects.
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.
Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes
The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
Yamaoka, S
1995-06-01
Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subjects factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.
The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.
The National Human Exposure Assessment Sur...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS
This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...
Metal artifact suppression in megavoltage computed tomography
NASA Astrophysics Data System (ADS)
Schreiner, L. John; Rogers, Myron; Salomons, Greg; Kerr, Andrew
2005-04-01
There has been considerable interest in megavoltage CT (MVCT) imaging associated with the development of image guided radiation therapy. It is clear that MVCT can provide good image quality for patient setup verification, with soft tissue contrast much better than noted in conventional megavoltage portal imaging. In addition, it has been observed that MVCT images exhibit considerably reduced artifacts surrounding metal implants (e.g., surgical clips, hip implants, dental fillings) compared to conventional diagnostic CT images (kVCT). When encountered, these artifacts greatly limit the usefulness of kVCT images, and a variety of solutions have been proposed to remove the artifacts, but these have met with only partial success. In this paper, we investigate the potential for CT imaging in regions surrounding metal implants using high-energy photons from a Cobalt-60 source and from a 4 MV linear accelerator. MVCT and kVCT images of contrast phantoms and a phantom containing a hip prosthesis are compared and analysed. We show that MVCT scans provide good fidelity for CT number quantification in the high-density regions of the images, and in the regions immediately adjacent to the metal implants. They also provide structural details within the high-density inserts and implants. Calculations show that practical clinical MVCT imaging, able to detect 3% contrast objects, should be achievable with doses of about 2.5 cGy. This suggests that MVCT not only has a role in radiotherapy treatment planning and guidance, but may also be indicated for surgical guidance and follow-up in regions where metal implants cannot be avoided.
GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER
The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2011 CFR
2011-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2010 CFR
2010-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
NASA Astrophysics Data System (ADS)
Ravindran, Paul B.; Ebenezer, Suman Babu S.; Winfred, Michael Raj; Amalan, S.
2017-05-01
The radiochromic FX gel with Optical CT readout has been investigated by several authors and has shown promising results for 3D dosimetry. One of the applications of the gel dosimeters is their use in 3D dose verification for IMRT and RapidArc quality assurance. Though polymer gel has been used successfully for clinical dose verification, the use of FX gel for clinical dose verification with optical cone beam CT needs further validation. In this work, we have used FX gel and an in-house optical readout system for gamma analysis between the dose matrices of measured dose distribution and a treatment planning system (TPS) calculated dose distribution for a few test cases.
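The gamma analysis mentioned above compares a measured dose matrix against the TPS-calculated one, combining a dose-difference and a distance-to-agreement (DTA) criterion per point. As a minimal sketch (not the authors' code, and with hypothetical 3%/3 mm criteria), a brute-force 2-D global gamma pass rate might look like:

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm=1.0, dose_tol=0.03, dta_mm=3.0):
    """Global gamma analysis between two 2-D dose matrices.

    For each measured point, gamma is the minimum over reference points of
    sqrt((dist/DTA)^2 + (dose_diff/(dose_tol*Dmax))^2); a point passes if
    gamma <= 1. Brute-force O(N^2) search -- fine for small demo grids.
    """
    ref = np.asarray(ref, dtype=float)
    meas = np.asarray(meas, dtype=float)
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    coords = np.stack([yy.ravel() * spacing_mm, xx.ravel() * spacing_mm], axis=1)
    dose_norm = dose_tol * ref.max()  # global dose criterion
    passed = 0
    for i, (y, x) in enumerate(coords):
        dist2 = ((coords - [y, x]) ** 2).sum(axis=1) / dta_mm ** 2
        dose2 = (ref.ravel() - meas.ravel()[i]) ** 2 / dose_norm ** 2
        gamma = np.sqrt((dist2 + dose2).min())
        passed += gamma <= 1.0
    return passed / coords.shape[0]
```

Production tools interpolate the reference grid rather than searching only grid points, so this sketch slightly underestimates the pass rate on coarse grids.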
Alloy, L B; Lipman, A J
1992-05-01
In this commentary we examine Swann, Wenzlaff, Krull, and Pelham's (1992) findings with respect to each of 5 central propositions in self-verification theory. We conclude that although the data are consistent with self-verification theory, none of the 5 components of the theory have been demonstrated convincingly as yet. Specifically, we argue that depressed subjects' selection of social feedback appears to be balanced or evenhanded rather than biased toward negative feedback and that there is little evidence to indicate that depressives actively seek negative appraisals. Furthermore, we suggest that the studies are silent with respect to the motivational postulates of self-verification theory and that a variety of competing cognitive and motivational models can explain Swann et al.'s findings as well as self-verification theory.
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify, in part, the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
The formal verification of generic interpreters
NASA Technical Reports Server (NTRS)
Windley, P.; Levitt, K.; Cohen, G. C.
1991-01-01
Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports
NASA Technical Reports Server (NTRS)
Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.
1982-01-01
Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.
Developing a NASA strategy for the verification of large space telescope observatories
NASA Astrophysics Data System (ADS)
Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie
2006-06-01
In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...
This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...
Feasibility of biochemical verification in a web-based smoking cessation study.
Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L
2017-10-01
Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance of self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
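The abstract's yield and cost figures follow directly from its reported counts. In the sketch below, the 66 returned kits and 24 discordant samples are inferred from the stated 71% return and 36.4% discordance rates (they are not published counts), so treat them as assumptions:

```python
def verification_summary(n_enrolled, n_responded, kits_returned,
                         n_discordant, total_cost_usd):
    """Summarize yield and cost of biochemical verification in a remote study.

    Mirrors the quantities reported in the abstract: response rate,
    discordance rate among returned kits, and cost per analyzed sample.
    """
    return {
        "response_rate": round(n_responded / n_enrolled, 3),
        "discordance_rate": round(n_discordant / kits_returned, 3),
        "cost_per_sample_usd": round(total_cost_usd / kits_returned, 2),
    }

# Counts from the abstract; 66 and 24 are back-calculated assumptions.
summary = verification_summary(600, 247, 66, 24, 8280)
```

Running this reproduces the abstract's 41.2% response rate, 36.4% discordance, and roughly $125 per sample.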
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
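At its core, checking a property such as "the network can eventually reach a starvation-response state" against a qualitative model reduces to search over an explicit state graph. The following minimal sketch (not GNA, NUSMV, or CADP code; names and the toy graph are illustrative) checks the temporal property EF p by breadth-first search:

```python
from collections import deque

def ef_reachable(transitions, init, prop):
    """Check EF prop ("prop holds on some reachable state") on an explicit
    state graph, the core question a model checker answers for qualitative
    network models. `transitions` maps each state to its successor states."""
    seen, frontier = {init}, deque([init])
    while frontier:
        state = frontier.popleft()
        if prop(state):
            return True
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False
```

Real model checkers handle far richer temporal logics (CTL, LTL) and symbolic state spaces, but the reachability fragment above is the simplest instance of the technique.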
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.
The U.S.-Mexico Border Program is sponsored ...
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda
2003-01-01
Efforts to prohibit the sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retailer environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing.
A novel numerical framework for self-similarity in plasticity: Wedge indentation in single crystals
NASA Astrophysics Data System (ADS)
Juul, K. J.; Niordson, C. F.; Nielsen, K. L.; Kysar, J. W.
2018-03-01
A novel numerical framework for analyzing self-similar problems in plasticity is developed and demonstrated. Self-similar problems of this kind include processes such as stationary cracks, void growth, indentation etc. The proposed technique offers a simple and efficient method for handling this class of complex problems by avoiding issues related to traditional Lagrangian procedures. Moreover, the proposed technique allows for focusing the mesh in the region of interest. In the present paper, the technique is exploited to analyze the well-known wedge indentation problem of an elastic-viscoplastic single crystal. However, the framework may be readily adapted to any constitutive law of interest. The main focus herein is the development of the self-similar framework, while the indentation study serves primarily as verification of the technique by comparing to existing numerical and analytical studies. In this study, the three most common metal crystal structures will be investigated, namely the face-centered cubic (FCC), body-centered cubic (BCC), and hexagonal close packed (HCP) crystal structures, where the stress and slip rate fields around the moving contact point singularity are presented.
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...
Large Engine Technology Program. Task 21: Rich Burn Liner for Near Term Experimental Evaluations
NASA Technical Reports Server (NTRS)
Hautman, D. J.; Padget, F. C.; Kwoka, D.; Siskind, K. S.; Lohmann, R. P.
2005-01-01
The objective of the task reported herein, which was conducted as part of the NASA sponsored Large Engine Technology program, was to define and evaluate a near-term rich-zone liner construction based on currently available materials and fabrication processes for a Rich-Quench-Lean combustor. This liner must be capable of operation at the temperatures and pressures of simulated HSCT flight conditions but only needs sufficient durability for limited duration testing in combustor rigs and demonstrator engines in the near future. This must be achieved at realistic cooling airflow rates since the approach must not compromise the emissions, performance, and operability of the test combustors, relative to the product engine goals. The effort was initiated with an analytical screening of three different liner construction concepts. These included a full cylinder metallic liner and one with multiple segments of monolithic ceramic, both of which incorporated convective cooling on the external surface using combustor airflow that bypassed the rich zone. The third approach was a metallic platelet construction with internal convective cooling. These three metal liner/jacket combinations were tested in a modified version of an existing Rich-Quench-Lean combustor rig to obtain data for heat transfer model refinement and durability verification.
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while reducing simulation and debugging overheads.
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
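A minimal sketch of the screen-file idea described above, assuming each criterion is a simple inclusive range per parameter (the actual verification criteria in a system like WATSTORE would be richer, with statistical and cross-parameter checks):

```python
def screen_records(records, criteria):
    """Flag hydrologic records that violate a 'screen file' of criteria.

    `criteria` maps a parameter name to an inclusive (low, high) range,
    standing in for the verification limits the abstract describes;
    records failing any applicable limit are routed to review rather
    than released to user-accessible files.
    """
    flagged = []
    for rec in records:
        reasons = [
            f"{param}={rec[param]} outside [{lo}, {hi}]"
            for param, (lo, hi) in criteria.items()
            if param in rec and not (lo <= rec[param] <= hi)
        ]
        if reasons:
            flagged.append((rec["site"], reasons))
    return flagged
```

A record passing every applicable range check would be stored directly; a flagged one carries human-readable reasons for the reviewer.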
The 871 keV gamma ray from 17O and the identification of plutonium oxide
NASA Astrophysics Data System (ADS)
Peurrung, Anthony; Arthur, Richard; Elovich, Robert; Geelhood, Bruce; Kouzes, Richard; Pratt, Sharon; Scheele, Randy; Sell, Richard
2001-12-01
Disarmament agreements and discussions between the United States and the Russian Federation for reducing the number of stockpiled nuclear weapons require verification of the origin of materials as having come from disassembled weapons. This has resulted in the identification of measurable "attributes" that characterize such materials. It has been proposed that the 871 keV gamma ray of 17O can be observed as an indicator of the unexpected presence of plutonium oxide, as opposed to plutonium metal, in such materials. We have shown that the observation of the 871 keV gamma ray is not a specific indicator of the presence of the oxide, but rather indicates the presence of nitrogen.
NASA Astrophysics Data System (ADS)
Belyaev, I. A.; Sviridov, V. G.; Batenin, V. M.; Biryukov, D. A.; Nikitina, I. S.; Manchkha, S. P.; Pyatnitskaya, N. Yu.; Razuvanov, N. G.; Sviridov, E. V.
2017-11-01
The results are presented of experimental investigations into liquid metal heat transfer performed by the joint research group consisting of specialists in heat transfer and hydrodynamics from NRU MPEI and JIHT RAS. The program of experiments has been prepared considering the concept of development of the nuclear power industry in Russia. This concept calls for, in addition to extensive application of water-cooled, water-moderated (VVER-type) power reactors and BN-type sodium cooled fast reactors, development of the new generation of BREST-type reactors, fusion power reactors, and thermonuclear neutron sources. The basic coolants for these nuclear power installations will be heavy liquid metals, such as lead and lithium-lead alloy. The team of specialists from NRU MPEI and JIHT RAS commissioned a new RK-3 mercury MHD test facility. The major components of this test facility are a unique electrical magnet constructed at the Budker Nuclear Physics Institute and a pressurized liquid metal circuit. The test facility is designed for investigating upward and downward liquid metal flows in channels of various cross-sections in a transverse magnetic field. A probe procedure will be used for experimental investigation into heat transfer and hydrodynamics as well as for measuring temperature, velocity, and flow parameter fluctuations. It is generally accepted that liquid metals are the best coolants for Tokamak reactors. However, alternative coolants should also be sought. As an alternative to liquid metal coolants, molten salts, such as fluorides of lithium and beryllium (so-called FLiBe) or fluorides of alkali metals (so-called FLiNaK) doped with uranium fluoride, can be used. That is why the team of specialists from NRU MPEI and JIHT RAS, in parallel with development of a mercury MHD test facility, is designing a test facility for simulating molten salt heat transfer and hydrodynamics.
Since development of this test facility requires numerical predictions and verification of numerical codes, all examined configurations of the MHD flow are also investigated numerically.
SU-F-T-407: Artifact Reduction with Dual Energy Or IMAR: Who’s Winning?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elder, E; Schreibmann, E; Dhabaan, A
2016-06-15
Purpose: The purpose of this abstract was to evaluate the performance of commercial strategies for artifact reduction in radiation oncology settings. The iterative metal artifact reduction (Siemens iMAR) algorithm and monoenergetic virtual datasets reconstructed from dual energy scans are compared side-by-side in their ability to image in the presence of metal inserts. Methods: A CIRS ATOM Dosimetry Verification Phantom was scanned with and without a metal insert on a SOMATOM Definition AS dual energy scanner. Images with the metal insert were reconstructed with (a) a traditional single-energy CT scan with the iMAR option implemented, using different artifact reduction settings, and (b) a monoenergetic scan calculated from dual energy scans by recovering differences in the energy-dependence of the attenuation coefficients of different materials and then creating a virtual monoenergetic scan from these coefficients. The iMAR and monoenergetic scans were then compared with the metal-free scan to assess changes in HU numbers and noise within a region around the metal insert. Results: Both the iMAR and dual energy scans reduced artifacts produced by the metal insert. However, the iMAR results are dependent on the selected algorithm settings, with a mean HU difference ranging from 0.65 to 90.40 for different options. The mean difference without the iMAR correction was 38.74. When using the dual energy scan, the mean difference was 4.53, which is, however, attributed to increased noise rather than artifacts, as the dual energy scan had the lowest skewness (2.52) compared to the iMAR scans (ranging from 3.90 to 4.88) and the lowest kurtosis (5.72 for dual energy, range of 18.19 to 27.36 for iMAR). Conclusion: Both approaches accurately recovered HU numbers; however, the dual energy method produced smaller residual artifacts.
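The comparison above reduces to summary statistics of the HU difference map between an artifact-corrected scan and the metal-free reference. A small NumPy sketch of those statistics follows; the non-excess kurtosis convention (normal distribution near 3) is an assumption, since the abstract does not state which convention was used:

```python
import numpy as np

def roi_artifact_metrics(corrected_hu, reference_hu):
    """Compare a metal-artifact-corrected scan to a metal-free reference.

    Returns the mean HU difference plus the skewness and kurtosis of the
    difference map within the region of interest -- the same summary
    statistics the abstract uses to separate residual artifacts (skewed,
    heavy-tailed differences) from plain noise.
    """
    diff = np.asarray(corrected_hu, float) - np.asarray(reference_hu, float)
    d = diff.ravel()
    mu, sigma = d.mean(), d.std()
    z = (d - mu) / sigma
    return {
        "mean_hu_diff": mu,
        "skewness": (z ** 3).mean(),  # biased Fisher-Pearson estimator
        "kurtosis": (z ** 4).mean(),  # non-excess convention (normal ~ 3)
    }
```

High kurtosis in the difference map signals localized streaks (artifacts); near-Gaussian statistics with a small mean suggest residual noise instead.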
Process Sensitivity, Performance, and Direct Verification Testing of Adhesive Locking Features
NASA Technical Reports Server (NTRS)
Golden, Johnny L.; Leatherwood, Michael D.; Montoya, Michael D.; Kato, Ken A.; Akers, Ed
2012-01-01
Phase I: The use of adhesive locking features or liquid locking compounds (LLCs) (e.g., Loctite) as a means of providing a secondary locking feature has been used on NASA programs since the Apollo program. In many cases Loctite was used as a last resort when (a) self-locking fasteners were no longer functioning per their respective drawing specification, (b) access was limited for removal and replacement, or (c) replacement could not be accomplished without severe impact to schedule. Long-term use of Loctite became inevitable in cases where removal and replacement of worn hardware was not cost effective and Loctite was assumed to be fully cured and working. The NASA Engineering & Safety Center (NESC) and United Space Alliance (USA) recognized the need for more extensive testing of Loctite grades to better understand their capabilities and limitations as a secondary locking feature. These tests, identified as Phase I, were designed to identify processing sensitivities and to determine the proper cure time, the correct primer to use on aerospace nutplate, insert, and bolt materials such as A286 and MP35N, and the minimum amount of Loctite required to achieve optimum breakaway torque values. The .1900-32 fastener size was tested due to its wide usage in the aerospace industry. Three different grades of Loctite were tested. Results indicate that, with proper controls, adhesive locking features can be successfully used in the repair of locking features and should be considered for design. Phase II: Threaded fastening systems used in aerospace programs typically have a requirement for a redundant locking feature. The primary locking method is the fastener preload, and the traditional redundant locking feature is a self-locking mechanical device that may include deformed threads, non-metallic inserts, split beam features, or other methods that impede movement between threaded members.
The self-locking resistance of traditional locking features can be directly verified during assembly by measuring the dynamic prevailing torque. Adhesive locking features or LLCs are another method of providing redundant locking, but a direct verification method has not been used in aerospace applications to verify proper installation when using LLCs because of concern for damage to the adhesive bond. The reliability of LLCs has also been questioned due to failures observed during testing with coupons for process verification, although the coupon failures have often been attributed to a lack of proper procedures. It is highly desirable to have a direct method of verifying the LLC cure or bond integrity. The purpose of the Phase II test program was to determine if the torque applied during direct verification of an adhesive locking feature degrades that locking feature. This report documents the test program used to investigate the viability of such a direct verification method. Results of the Phase II testing were positive, and additional investigation of direct verification of adhesive locking features is merited.
Cross-Language Phonological Activation of Meaning: Evidence from Category Verification
ERIC Educational Resources Information Center
Friesen, Deanna C.; Jared, Debra
2012-01-01
The study investigated phonological processing in bilingual reading for meaning. English-French and French-English bilinguals performed a category verification task in either their first or second language. Interlingual homophones (words that share phonology across languages but not orthography or meaning) and single language control words served…
Academic Self-Esteem and Perceived Validity of Grades: A Test of Self-Verification Theory.
ERIC Educational Resources Information Center
Okun, Morris A.; Fournet, Lee M.
1993-01-01
The hypothesis derived from self-verification theory that semester grade point average would be positively related to perceived validity of grade scores among high self-esteem undergraduates and inversely related for low self-esteem students was not supported in a study with 281 undergraduates. (SLD)
Implementation of Precision Verification Solvents on the External Tank
NASA Technical Reports Server (NTRS)
Campbell, M.
1998-01-01
This paper presents the Implementation of Precision Verification Solvents on the External Tank. The topics include: 1) Background; 2) Solvent Usages; 3) TCE (Trichloroethylene) Reduction; 4) Solvent Replacement Studies; 5) Implementation; 6) Problems Occuring During Implementation; and 7) Future Work. This paper is presented in viewgraph form.
NASA Astrophysics Data System (ADS)
Yung, Lai Chin; Fei, Cheong Choke; Mandeep, Jit Singh; Amin, Nowshad; Lai, Khin Wee
2015-11-01
The leadframe fabrication process normally involves additional thin-metal layer plating on the bulk copper substrate surface for wire bonding purposes. Silver, tin, and copper flakes are commonly adopted as plating materials. It is critical to assess the density of the plated metal layer, and in particular to look for porosity or voids underneath the layer, which may reduce reliability during high-temperature stress. A fast, reliable inspection technique is needed to assess the porosity or void weakness. To this end, the characteristics of x-rays generated from bulk samples were examined using an energy-dispersive x-ray (EDX) detector to determine the porosity percentage. Monte Carlo modeling was integrated with Castaing's formula to verify the integrity of the experimental data. Samples with different porosity percentages were considered to test the correlation between the intensity of the collected x-ray signal and the material density. To further verify the integrity of the model, conventional cross-sectional samples were also taken to measure the porosity percentage using ImageJ software. A breakthrough in bulk substrate assessment was achieved by applying EDX for the first time to nonelemental analysis. The experimental data showed that the EDX features were not only useful for elemental analysis, but also applicable to thin-film metal layer thickness measurement and bulk material density determination. A detailed experiment was conducted using EDX to assess the plating metal layer and bulk material porosity.
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
Design of verification platform for wireless vision sensor networks
NASA Astrophysics Data System (ADS)
Ye, Juanjuan; Shang, Fei; Yu, Chuang
2017-08-01
At present, the majority of research on wireless vision sensor networks (WVSNs) remains at the software simulation stage, and few verification platforms for WVSNs are available. This situation seriously restricts the transformation of WVSN theory into practical application, so it is necessary to study the construction of WVSN verification platforms. This paper combines a wireless transceiver module, a visual information acquisition module, and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform, called AdvanWorks, for WVSNs. Experiments show that AdvanWorks can successfully achieve image acquisition, coding, and wireless transmission, and can obtain effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.
SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo
2016-06-15
Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been available for a few years. Its treatment planning system (TPS) is dedicated, so measurement has been the only method of dose verification; there have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of a secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the other techniques. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected at our institute. The treatments comprised conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from a general-purpose linac. Results: The CLs for conventional irradiation (lung), conventional irradiation (prostate), SBRT (lung), and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%), and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac.
This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
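The confidence-limit metric reported above (CL = mean ± 2SD of the percent dose differences between the TPS and the independent calculation) is simple to compute. A minimal sketch with hypothetical per-plan differences:

```python
# Sketch of the CL metric used in independent dose verification:
# CL = mean ± 2 * sample standard deviation of percent dose differences.
# The per-plan differences below are hypothetical.

def confidence_limits(diffs_percent):
    n = len(diffs_percent)
    mean = sum(diffs_percent) / n
    var = sum((d - mean) ** 2 for d in diffs_percent) / (n - 1)  # sample variance
    sd = var ** 0.5
    return mean, 2.0 * sd

# Hypothetical TPS-vs-independent dose differences (%) for one treatment site.
diffs = [1.0, 2.5, 3.0, 1.5, 3.5, 1.7, 2.2, 2.0]
mean, two_sd = confidence_limits(diffs)
print(f"CL = {mean:.1f} +/- {two_sd:.1f} %")
```

A site whose CL stays inside the tolerance established on the general-purpose linac would pass the secondary check.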
Impact of radiation attenuation by a carbon fiber couch on patient dose verification
NASA Astrophysics Data System (ADS)
Yu, Chun-Yen; Chou, Wen-Tsae; Liao, Yi-Jen; Lee, Jeng-Hung; Liang, Ji-An; Hsu, Shih-Ming
2017-02-01
The aim of this study was to understand the difference between the measured and calculated irradiation attenuations obtained using two algorithms and to identify the influence of couch attenuation on patient dose verification. We performed eight tests of couch attenuation with two photon energies, two longitudinal couch positions, and two rail positions. The couch attenuation was determined using a radiation treatment planning system. The measured and calculated attenuations were compared. We also performed 12 verifications of head-and-neck and rectum cases by using a Delta phantom. The dose deviation (DD), distance to agreement (DTA), and gamma index of pencil-beam convolution (PBC) verifications were nearly the same. The agreement was least consistent for the anisotropic analytical algorithm (AAA) without the couch for the head-and-neck case, in which the DD, DTA, and gamma index were 74.4%, 99.3%, and 89%, respectively; for the rectum case, the corresponding values were 56.2%, 95.1%, and 92.4%. We suggest that dose verification should be performed using the following three metrics simultaneously: DD, DTA, and the gamma index.
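The three metrics named above (dose difference, distance to agreement, and the gamma index) combine into a single per-point score in the standard gamma formulation. A hedged 1D sketch with hypothetical 3%/3 mm criteria and hypothetical profiles; clinical systems evaluate this on full 2D/3D dose grids:

```python
# Sketch of a 1D gamma-index evaluation (Low et al. formulation) combining
# dose difference (DD) and distance to agreement (DTA). Criteria and profile
# values are hypothetical.

def gamma_1d(ref, meas, spacing_mm, dd_crit=0.03, dta_mm=3.0):
    """Per-point gamma: minimum over measured points of the combined
    dose-difference / distance metric. ref and meas share the same grid."""
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dm in enumerate(meas):
            ddose = (dm - dr) / dr if dr else 0.0   # relative dose difference
            dist = abs(j - i) * spacing_mm          # spatial distance (mm)
            g2 = (ddose / dd_crit) ** 2 + (dist / dta_mm) ** 2
            best = min(best, g2)
        gammas.append(best ** 0.5)
    return gammas

ref = [1.00, 0.98, 0.95, 0.90, 0.80]    # hypothetical reference profile
meas = [1.01, 0.99, 0.94, 0.91, 0.79]   # hypothetical measurement
g = gamma_1d(ref, meas, spacing_mm=1.0)
pass_rate = sum(1 for x in g if x <= 1.0) / len(g)
print(f"gamma pass rate: {pass_rate:.0%}")  # → gamma pass rate: 100%
```

Points with gamma ≤ 1 satisfy the combined criterion, which is what the percentage "gamma index" figures in the abstract summarize.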
ERIC Educational Resources Information Center
Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.
2012-01-01
Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…
Advanced in-production hotspot prediction and monitoring with micro-topography
NASA Astrophysics Data System (ADS)
Fanton, P.; Hasan, T.; Lakcher, A.; Le-Gratiet, B.; Prentice, C.; Simiz, J.-G.; La Greca, R.; Depre, L.; Hunsche, S.
2017-03-01
At 28nm technology node and below, hot spot prediction and process window control across production wafers have become increasingly critical to prevent hotspots from becoming yield-limiting defects. We previously established proof of concept for a systematic approach to identify the most critical pattern locations, i.e. hotspots, in a reticle layout by computational lithography and combining process window characteristics of these patterns with across-wafer process variation data to predict where hotspots may become yield impacting defects [1,2]. The current paper establishes the impact of micro-topography on a 28nm metal layer, and its correlation with hotspot best focus variations across a production chip layout. Detailed topography measurements are obtained from an offline tool, and pattern-dependent best focus (BF) shifts are determined from litho simulations that include mask-3D effects. We also establish hotspot metrology and defect verification by SEM image contour extraction and contour analysis. This enables detection of catastrophic defects as well as quantitative characterization of pattern variability, i.e. local and global CD uniformity, across a wafer to establish hotspot defect and variability maps. Finally, we combine defect prediction and verification capabilities for process monitoring by on-product, guided hotspot metrology, i.e. with sampling locations determined from the defect prediction model, achieving a prediction accuracy (capture rate) of around 75%.
NASA Astrophysics Data System (ADS)
Yoon, K. J.; Park, K. H.; Lee, S. K.; Goo, N. S.; Park, H. C.
2004-06-01
This paper describes an analytical design model for a layered piezo-composite unimorph actuator and its numerical and experimental verification using a LIPCA (lightweight piezo-composite curved actuator) that is lighter than other conventional piezo-composite type actuators. The LIPCA is composed of top fiber composite layers with high modulus and low CTE (coefficient of thermal expansion), a middle PZT ceramic wafer, and base layers with low modulus and high CTE. The advantages of the LIPCA design are to replace the heavy metal layer of THUNDER by lightweight fiber-reinforced plastic layers without compromising the generation of high force and large displacement and to have design flexibility by selecting the fiber direction and the number of prepreg layers. In addition to the lightweight advantage and design flexibility, the proposed device can be manufactured without adhesive layers when we use a resin prepreg system. A piezo-actuation model for a laminate with piezo-electric material layers and fiber composite layers is proposed to predict the curvature and residual stress of the LIPCA. To predict the actuation displacement of the LIPCA with curvature, a finite element analysis method using the proposed piezo-actuation model is introduced. The predicted deformations are in good agreement with the experimental ones.
A VST and VISTA study of globular clusters in NGC 253
NASA Astrophysics Data System (ADS)
Cantiello, Michele; Grado, Aniello; Rejkuba, Marina; Arnaboldi, Magda; Capaccioli, Massimo; Greggio, Laura; Iodice, Enrica; Limatola, Luca
2018-03-01
Context: Globular clusters (GCs) are key to our understanding of the Universe, as laboratories of stellar evolution, fossil tracers of the past formation epochs of their host galaxies, and effective distance indicators from local to cosmological scales. Aims: We analyze the properties of the sources in the field of NGC 253 with the aim of defining an up-to-date catalog of GC candidates in the galaxy. Given the distance of the galaxy, GCs in NGC 253 are ideal targets for resolved color-magnitude diagram studies of extragalactic GCs with next-generation diffraction-limited ground-based telescopes. Methods: Our analysis is based on the science verification data of two ESO survey telescopes, VST and VISTA. Using ugri photometry from VST and JKs from VISTA, GC candidates were selected using as reference the morpho-photometric and color properties of spectroscopically confirmed GCs available in the literature. The robustness of the results was verified against available archival HST/ACS data from the GHOSTS survey: all but two of the selected GC candidates appear as star clusters in the HST footprints. Results: The adopted GC selection leads to a sample of ˜350 GC candidates. At visual inspection, we find that 82 objects match all the requirements for GC candidates and 155 are flagged as uncertain GC candidates; however, 110 are unlikely GCs, being most likely background galaxies. Furthermore, our analysis shows that four of the previously spectroscopically confirmed GCs, i.e., ˜20% of the total spectroscopic sample, are more likely either background galaxies or high-velocity Milky Way stars. The radial density profile of the selected best candidates shows the typically observed r^(1/4)-law radial profile. The analysis of the color distributions reveals only marginal evidence of color bimodality, which is normally observed in galaxies of similar luminosity.
The GC luminosity function does not show the typical symmetry, mainly because of the lack of bright GCs. Part of the missing bright GCs might lie at very large galactocentric distances or along the line of sight of the galaxy's dusty disk. As an alternative possibility, we speculate that a fraction of the low-luminosity GC candidates might instead be metal-rich, intermediate-age clusters that fall in a color interval similar to that of old, metal-poor GCs. Conclusions: Defining a contaminant-free sample of GCs in extragalactic systems is not a straightforward exercise. Using optical and near-IR photometry, we purged the list of spectroscopically confirmed GCs and photometric GC candidates in NGC 253. Our results show that the use of either spectroscopic or photometric data alone does not generally ensure a contaminant-free sample, and a combination of both spectroscopy and photometry is preferred. Table 3 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/611/A21. This work is based on observations taken at the ESO La Silla Paranal Observatory within the VST Science Verification Programme ID 60.A-9286(A) and the VISTA Science Verification Programme ID 60.A-9285(A).
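The selection described above keeps sources whose colors match those of spectroscopically confirmed GCs. An illustrative sketch of such a color-cut filter; the color ranges and catalog rows are hypothetical placeholders, not the authors' actual selection criteria:

```python
# Illustrative color-cut selection (not the authors' pipeline): keep sources
# whose colors fall inside ranges spanned by confirmed GCs. All numerical
# cuts and catalog entries below are hypothetical.

U_R_RANGE = (1.0, 3.0)   # hypothetical u-r range of confirmed GCs
G_I_RANGE = (0.5, 1.3)   # hypothetical g-i range of confirmed GCs

def is_gc_candidate(u_r, g_i):
    return (U_R_RANGE[0] <= u_r <= U_R_RANGE[1]
            and G_I_RANGE[0] <= g_i <= G_I_RANGE[1])

catalog = [
    {"id": 1, "u_r": 1.8, "g_i": 0.9},   # GC-like colors
    {"id": 2, "u_r": 3.6, "g_i": 1.6},   # likely background galaxy
    {"id": 3, "u_r": 1.2, "g_i": 0.6},   # GC-like colors
]
candidates = [s["id"] for s in catalog if is_gc_candidate(s["u_r"], s["g_i"])]
print(candidates)  # → [1, 3]
```

The real selection also uses morpho-photometric parameters and near-IR colors; the point of the sketch is only that purely photometric cuts pass some contaminants, which is why the abstract recommends combining photometry with spectroscopy.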
Space shuttle engineering and operations support. Avionics system engineering
NASA Technical Reports Server (NTRS)
Broome, P. A.; Neubaur, R. J.; Welsh, R. T.
1976-01-01
The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
NASA Technical Reports Server (NTRS)
Windley, P. J.
1991-01-01
In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level Boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
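The verification-plan structure described above (Verification Requirement, Success Criteria, Method, Level, Owner, with Methods mapped into Verification Events) can be sketched as plain data structures. A minimal sketch using Python dataclasses instead of SysML; field names follow the paper's terminology, while the example content is hypothetical:

```python
# Minimal data-model sketch of the LSST verification-planning structure.
# Field names follow the abstract; example values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class VerificationPlan:
    requirement: str
    verification_requirement: str
    success_criteria: str
    methods: list          # e.g. ["Test"], ["Analysis", "Inspection"]
    level: str             # e.g. "Subsystem", "System"
    owner: str

@dataclass
class VerificationEvent:
    """A collection of verification activities executed concurrently."""
    name: str
    activities: list = field(default_factory=list)

plan = VerificationPlan(
    requirement="REQ-0421 (hypothetical)",
    verification_requirement="Demonstrate tracking jitter below spec",
    success_criteria="Jitter < 0.1 arcsec RMS over 30 s",
    methods=["Test"],
    level="System",
    owner="Telescope Mount Team",
)
event = VerificationEvent(name="Mount commissioning night 1")
# Each Verification Method becomes an Activity mapped into an Event.
event.activities.append(plan)
print(len(event.activities))  # → 1
```

In the actual project these elements live in the SysML model and each activity is further mapped to a Primavera P6 task for cost and resource loading; the sketch shows only the traceability chain from requirement to event.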
NASA Astrophysics Data System (ADS)
La Rosa, Vanessa; Kacperek, Andrzej; Royle, Gary; Gibson, Adam
2014-06-01
Metal fiducial markers are often implanted on the back of the eye before proton therapy to improve target localization and reduce patient setup errors. We aim to detect characteristic x-ray emissions from metal targets during proton therapy to verify the treatment range accuracy. Initially, gold was chosen for its biocompatibility. Proton-induced x-ray emissions (PIXE) from a 15 mm diameter gold marker were detected at different penetration depths of a 59 MeV proton beam at the CATANA proton facility at INFN-LNS (Italy). The Monte Carlo code Geant4 was used to reproduce the experiment and to investigate the effect of different marker sizes and materials, and the response to both mono-energetic and fully modulated beams. The intensity of the emitted x-rays decreases with decreasing proton energy and thus with depth. If we assume the range to be the depth at which the dose is reduced to 10% of its maximum value, and we define the residual range as the distance between the marker and the range of the beam, then the minimum residual range that can be detected with 95% confidence is the depth at which the PIXE peak equals 1.96 σbkg, where σbkg is the standard deviation of the background noise. With our system and experimental setup this value is 3 mm when 20 GyE are delivered to a gold marker of 15 mm diameter. Results from silver are more promising: even when a 5 mm diameter silver marker is placed at a depth equal to the range, the PIXE peak is 2.1 σbkg. Although these quantitative results depend on the experimental setup used in this study, they demonstrate that real-time analysis of the PIXE emitted by fiducial metal markers can be used to derive the beam range. Further analyses are needed to demonstrate the feasibility of the technique in a clinical setup.
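The detection criterion quoted above is a standard one-sided 95% threshold: a PIXE peak counts as detected when it exceeds 1.96 σbkg. A minimal sketch with hypothetical counts:

```python
# Sketch of the 95%-confidence detection criterion from the abstract:
# a peak is detected when it exceeds 1.96 * sigma_bkg, the standard
# deviation of the background noise. Count values are hypothetical.

def detectable(peak_counts, sigma_bkg, z=1.96):
    return peak_counts > z * sigma_bkg

sigma_bkg = 100.0                    # hypothetical background noise SD (counts)
print(detectable(250.0, sigma_bkg))  # well above the 196-count threshold
print(detectable(150.0, sigma_bkg))  # below threshold
```

The silver result in the abstract (peak at 2.1 σbkg with the marker at the range) passes this test where the equivalent gold configuration would not, which is why silver is described as more promising.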
Al-Mohammed, Huda I; Mahyoub, Fareed H; Moftah, Belal A
2010-07-01
The object of this study was to compare skin dose measured in patients with acute lymphatic leukemia (ALL) treated with total body irradiation (TBI) using metal oxide semiconductor field-effect transistors (mobile MOSFET dose verification system, TN-RD-70-W) and thermoluminescent dosimeters (TLD-100 chips, Harshaw/Bicron, OH, USA). Because TLD has been the most commonly used technique for skin dose measurement in TBI, the aim of the present study was to demonstrate the benefit of the mobile MOSFET dosimeter over TLDs for entrance dose measurements during TBI. The measurements involved 10 pediatric patients aged between 3 and 14 years. TLD and MOSFET dosimetry were performed at 9 different anatomic sites on each patient. The results show a variation between skin dose measured with MOSFET and TLD in all patients; for every anatomic site selected, there was no significant difference between the dose measured with MOSFET and the prescribed dose. However, for every anatomic site there was a significant difference between TLD and either the prescribed dose or MOSFET. The results indicate that MOSFET measurements precisely reflected the prescribed dose, whereas TLD showed a significantly increased skin dose compared with either the prescribed dose or the MOSFET group. MOSFET dosimeters provide superior dose accuracy for skin dose measurement in TBI compared with TLD.
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
1981-03-01
overcome the shortcomings of this system. A phase III study develops the breakup model of the Space Shuttle cluster at various times into flight. [Garbled table-of-contents residue; recoverable section headings: Rocket Model (2-5); Combustion Chamber Operation (2-5); Results (2-22).]
Why Verifying Diagnostic Decisions with a Checklist Can Help: Insights from Eye Tracking
ERIC Educational Resources Information Center
Sibbald, Matthew; de Bruin, Anique B. H.; Yu, Eric; van Merrienboer, Jeroen J. G.
2015-01-01
Making a diagnosis involves ratifying or verifying a proposed answer. Formalizing this verification process with checklists, which highlight key variables involved in the diagnostic decision, is often advocated. However, the mechanisms by which a checklist might allow clinicians to improve their verification process have not been well studied. We…
38 CFR 21.7652 - Certification of enrollment and verification of pursuit.
Code of Federal Regulations, 2011 CFR
2011-07-01
... maintain daily attendance records for any course leading to a standard college degree. (a) Content of... breaks between school years. (3) When a reservist enrolls in independent study leading to a standard...) Verification of pursuit. (1) A reservist who is pursuing a course leading to a standard college degree must...
ERIC Educational Resources Information Center
Bramao, Ines; Faisca, Luis; Forkstam, Christian; Inacio, Filomena; Araujo, Susana; Petersson, Karl Magnus; Reis, Alexandra
2012-01-01
In this study, we used event-related potentials (ERPs) to evaluate the contribution of surface color and color knowledge information in object identification. We constructed two color-object verification tasks--a surface and a knowledge verification task--using high color diagnostic objects; both typical and atypical color versions of the same…
38 CFR 21.7652 - Certification of enrollment and verification of pursuit.
Code of Federal Regulations, 2010 CFR
2010-07-01
... maintain daily attendance records for any course leading to a standard college degree. (a) Content of... breaks between school years. (3) When a reservist enrolls in independent study leading to a standard...) Verification of pursuit. (1) A reservist who is pursuing a course leading to a standard college degree must...
NASA Astrophysics Data System (ADS)
Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.
2008-02-01
IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) to design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) to verify our MLC model in heterogeneous geometries that mimic an actual patient geometry for IMRT treatment. The measurements were performed using an IMRT head-and-neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model was carried out using point doses measured with an A14 slim line (SL) ion chamber inside tissue-equivalent and bone-equivalent materials in the CIRS phantom. Planar doses using lung- and bone-equivalent slabs were measured and compared using EDR films (Kodak, Rochester, NY).
Online 3D EPID-based dose verification: Proof of concept.
Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel
2016-07-01
Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
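The verification described above compares the mean dose in two volumes and the near-maximum dose D2 in the nontarget volume. A minimal sketch of those two metrics on a flat list of voxel doses; the dose values are hypothetical, and real systems operate on full 3D dose grids:

```python
# Sketch of the two dose-comparison metrics from the abstract: mean dose
# and the near-maximum dose D2, here taken as the dose exceeded by only
# 2% of the voxels (nearest-rank 98th percentile). Values are hypothetical.

def mean_dose(doses):
    return sum(doses) / len(doses)

def d2(doses):
    """Near-maximum dose: nearest-rank 98th-percentile dose."""
    s = sorted(doses)
    k = max(0, min(len(s) - 1, int(round(0.98 * len(s))) - 1))
    return s[k]

# 100 hypothetical voxel doses (cGy): 97 voxels near 200, 3 hot voxels.
doses = [200.0] * 97 + [215.0, 220.0, 230.0]
print(mean_dose(doses))  # → 200.65
print(d2(doses))         # → 215.0
```

Comparing D2 rather than the single hottest voxel makes the check robust to isolated noisy voxels in the reconstructed dose, which matters when the comparison must finish within one EPID frame time.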
Correlation Between Intercritical Heat-Affected Zone and Type IV Creep Damage Zone in Grade 91 Steel
NASA Astrophysics Data System (ADS)
Wang, Yiyu; Kannan, Rangasayee; Li, Leijun
2018-04-01
A soft zone in Cr-Mo steel weldments has been reported to accompany the infamous Type IV cracking, the highly localized creep damage in the heat-affected zone of creep-resistant steels. However, the microstructural features and formation mechanism of this soft zone are not well understood. In this study, using microhardness profiling and microstructural verification, the initial soft zone in the as-welded condition was identified to be located in the intercritical heat-affected zone of P91 steel weldments. It has a mixed structure, consisting of Cr-rich re-austenitized prior austenite grains and fine Cr-depleted, tempered martensite grains retained from the base metal. The presence of these further-tempered retained grains, originating from the base metal, is directly responsible for the hardness reduction of the identified soft zone in the as-welded condition. The identified soft zone exhibits a high location consistency at three thermal stages. Local chemistry analysis and thermodynamic calculation show that the lower chromium concentrations inside these retained grains thermodynamically decrease their potentials for austenitic transformation during welding. Heterogeneous grain growth is observed in the soft zone during postweld heat treatment. The mismatch of strengths between the weak Cr-depleted grains and strong Cr-rich grains enhances the creep damage. Local deformation of the weaker Cr-depleted grains accelerates the formation of creep cavities.
Mai, Hang-Nga; Lee, Kyeong Eun; Lee, Kyu-Bok; Jeong, Seung-Mi; Lee, Seok-Jae; Lee, Cheong-Hee; An, Seo-Young; Lee, Du-Hyeong
2017-10-01
The purpose of this study was to evaluate the reliability of the computer-aided replica technique (CART) by calculating its agreement with the replica technique (RT), using statistical agreement analysis. A prepared metal die and a metal crown were fabricated. The gap between the restoration and abutment was replicated using silicone indicator paste (n = 25). The gap-measurement method differed between the control (RT) and experimental (CART) groups. In the RT group, the silicone replica was manually sectioned, and the marginal and occlusal gaps were measured using a microscope. In the CART group, the gap was digitized using optical scanning and image superimposition, and the gaps were measured using a software program. The agreement between the measurement techniques was evaluated using the 95% Bland-Altman limits of agreement and concordance correlation coefficients (CCC). The least acceptable CCC was 0.90. The RT and CART groups showed a linear association, with a strong positive correlation in gap measurements but without significant differences. The 95% limits of agreement between the paired gap measurements were 3.84% and 7.08% of the mean. The lower 95% confidence limits of the CCC were 0.9676 and 0.9188 for the marginal and occlusal gap measurements, respectively, both greater than the allowed limit. The CART is a reliable digital approach for evaluating the fit accuracy of fixed dental prostheses.
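The two agreement statistics used above are standard and easy to sketch: Bland-Altman 95% limits of agreement (mean difference ± 1.96 SD) and Lin's concordance correlation coefficient. The paired gap measurements below are hypothetical, not the study's data:

```python
# Sketch of the agreement analysis from the abstract: Bland-Altman limits
# of agreement and Lin's concordance correlation coefficient (CCC).
# Paired measurements below are hypothetical.

def bland_altman(x, y):
    """95% limits of agreement: mean difference +/- 1.96 SD."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    m = sum(diffs) / n
    sd = (sum((d - m) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return m - 1.96 * sd, m + 1.96 * sd

def ccc(x, y):
    """Lin's concordance correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

rt = [50.0, 62.0, 55.0, 70.0, 58.0]     # hypothetical RT gap measurements (um)
cart = [51.0, 61.5, 56.0, 69.0, 58.5]   # hypothetical CART measurements (um)
lo, hi = bland_altman(rt, cart)
print(round(ccc(rt, cart), 3))  # → 0.991
```

Unlike the Pearson correlation, the CCC penalizes both scatter and systematic offset between the two methods, which is why the study sets an acceptability floor (0.90) on it rather than on a plain correlation.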
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bojechko, Casey; Phillps, Mark; Kalet, Alan
Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a "defense in depth" system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of detectable incidents divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense-in-depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
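The "defense in depth" combination above is a union: an incident is covered if any verification layer can catch it, and detectability is the covered fraction. A minimal sketch with hypothetical incident sets sized to echo the reported percentages:

```python
# Sketch of the detectability metric from the abstract: each verification
# layer detects a subset of incidents, layers combine as a set union, and
# detectability = detected / total. Incident IDs below are hypothetical.

incidents = list(range(100))            # 100 hypothetical incident IDs
epid_pre = set(range(0, 6))             # caught by pre-treatment EPID dosimetry
epid_invivo = set(range(0, 74))         # caught by first-fraction in vivo dosimetry
rules_bayes = set(range(60, 91))        # caught by rules-based/Bayesian checks

def detectability(detected, total):
    return len(detected) / len(total)

combined = epid_pre | epid_invivo | rules_bayes
print(f"{detectability(combined, incidents):.0%}")  # → 91%
```

The union structure is the point of defense in depth: layers with individually modest coverage still raise the combined detectability when their detected sets overlap only partially.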
Impact of a quality-assessment dashboard on the comprehensive review of pharmacist performance.
Trinh, Long D; Roach, Erin M; Vogan, Eric D; Lam, Simon W; Eggers, Garrett G
2017-09-01
The impact of a quality-assessment dashboard and individualized pharmacist performance feedback on the adherence of order verification was evaluated. A before-and-after study was conducted at a 1,440-bed academic medical center. Adherence of order verification was defined as orders verified according to institution-derived, medication-related guidelines and policies. Formulas were developed to assess the adherence of verified orders to dosing guidelines using patient-specific height, weight, and serum creatinine clearance values from the electronic medical record at the time of pharmacist verification. A total of 5 medications were assessed by the formulas for adherence and displayed on the dashboard: ampicillin-sulbactam, ciprofloxacin, piperacillin-tazobactam, acyclovir, and enoxaparin. Adherence of order verification was assessed before (May 1-July 31, 2015) and after (November 1, 2015-January 31, 2016) individualized performance feedback was given based on trends identified by the quality-assessment dashboard. There was a significant increase in the overall adherence rate postintervention (90.1% versus 91.9%, p = 0.040). Among the 34 pharmacists who participated, the percentage of pharmacists with at least 90% overall adherence increased postintervention (52.9% versus 70.6%, p = 0.103). Time to verification was similar before and after the study intervention (median, 6.0 minutes; interquartile range, 3-13 minutes). The rate of documentation for nonadherent orders increased significantly postintervention (57.1% versus 68.5%, p = 0.019). The implementation of the quality-assessment dashboard, educational sessions, and individualized performance feedback significantly improved pharmacist order-verification adherence to institution-derived, medication-related guidelines and policies and the documentation rate of nonadherent orders. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Numazaki, M; Kudo, E
1995-04-01
The present study examined determinants of information-gathering behavior regarding one's own characteristics. Four tasks with different self-congruent and self-incongruent diagnosticity were presented to subjects. As self-assessment theory predicts, high-diagnosticity tasks were preferred to low-diagnosticity tasks. And as self-verification theory predicts, self-congruent diagnosticity had a stronger effect on task preference than self-incongruent diagnosticity. In addition, subjects who perceived the relevant characteristics as important were more inclined to choose self-assessment behavior than those who did not, and subjects who were certain of their self-concept were more inclined to choose self-verification behavior than those who were not. These results suggest that both self-assessment and self-verification motivations play important roles in information-gathering behavior regarding one's own characteristics, and that the strength of these motivations is determined by the importance of the relevant characteristics and the certainty of the self-concept.
Knoeferle, Pia; Urbach, Thomas P.; Kutas, Marta
2010-01-01
To re-establish picture-sentence verification (possibly discredited for its over-reliance on post-sentence response time (RT) measures) as a task for situated comprehension, we collected event-related brain potentials (ERPs) as participants read a subject-verb-object sentence, together with RTs indicating whether or not the verb matched a previously depicted action. For mismatches (vs. matches), speeded RTs were longer, verb N400s over centro-parietal scalp were larger, and ERPs to the object noun were more negative. RT congruence effects correlated inversely with the centro-parietal verb N400s, and positively with the object ERP congruence effects. Verb N400s, object ERPs, and verbal working memory scores predicted more variance in RT effects (50%) than N400s alone. Thus, (1) verification processing is not all post-sentence; (2) simple priming cannot account for these results; and (3) verification tasks can inform studies of situated comprehension. PMID:20701712
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how these might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods for applying mathematical techniques to the verification of rule bases and on techniques for capturing information about the software development process. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
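In HOL-style hardware verification, a structural circuit description is typically rendered as a relational predicate: each component becomes a relation over its ports, the structure is their conjunction, and internal wires are existentially quantified. As a rough illustration of that idea only (this is not the paper's translator; the netlist format and names are invented), a toy renderer might look like:

```python
# Toy netlist entries: (instance_name, component, output, inputs) -- hypothetical format
netlist = [
    ("u1", "AND2", "w", ["a", "b"]),
    ("u2", "NOT",  "y", ["w"]),
]

def to_hol(name, ports, netlist, internal):
    """Render a HOL-style structural predicate: the circuit is the
    conjunction of its component relations, with internal wires
    existentially quantified."""
    body = " /\\ ".join(
        f"{comp}({', '.join(ins + [out])})" for _, comp, out, ins in netlist
    )
    quant = f"?{' '.join(internal)}. " if internal else ""
    return f"{name}({', '.join(ports)}) = {quant}{body}"

print(to_hol("AND_NOT", ["a", "b", "y"], netlist, ["w"]))
# AND_NOT(a, b, y) = ?w. AND2(a, b, w) /\ NOT(w, y)
```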
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plotkowski, A.; Kirka, M. M.; Babu, S. S.
A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e., transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.
42 CFR 457.380 - Eligibility verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...
The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program�s goal is to further environmental protection by a...
Force on a storage ring vacuum chamber after sudden turn-off of a magnet power supply
NASA Astrophysics Data System (ADS)
Sinha, Gautam; Prabhu, S. S.
2011-10-01
We are commissioning a 2.5 GeV synchrotron radiation source (SRS) in which electrons travel in high vacuum inside chambers made of aluminum alloys. These chambers are kept between the pole gaps of magnets and are designed to carry the radiation from the storage ring out to the experimental stations. The chambers are connected by metallic bellows. During the commissioning phase of the SRS, the metallic bellows ruptured due to frequent tripping of the dipole magnet power supply, and the machine was down for quite some time. When a power supply trips, the current in the magnets decays exponentially. It was observed experimentally that the fast decay of the B field generates a large eddy current in the chambers, and consequently the chambers are subjected to a large Lorentz force. This motivated us to develop a theoretical model to study the force acting on a metallic plate exposed to an exponentially decaying field, and then to extend it to a rectangular vacuum chamber. The problem is formulated using Maxwell's equations and converted to an inhomogeneous Helmholtz equation. After taking the Laplace transform, the equation is solved with appropriate boundary conditions, and the final results are obtained by taking the inverse Laplace transform. Expressions for the eddy current contours and the magnetic field produced by the eddy current are also derived. Variations of the force on chambers of different wall thickness due to a spatially varying and exponentially time-decaying field are presented. The result is a general theory that can be applied to different geometries and to the calculation of power loss as well. Comparisons are made with results obtained by simulation using a finite element based code, for quick verification of the theoretical model.
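The route described (Maxwell's equations, then an inhomogeneous Helmholtz equation via the Laplace transform) follows, in outline, the standard quasi-static eddy-current treatment; a sketch of the likely form, with symbols assumed here rather than taken from the paper, is:

```latex
% Quasi-static eddy-current (diffusion) equation inside a conductor of
% permeability \mu and conductivity \sigma, driven by a decaying field:
\nabla^2 \mathbf{B} = \mu\sigma\,\frac{\partial \mathbf{B}}{\partial t},
\qquad
\mathbf{B}_{\mathrm{applied}}(t) = \mathbf{B}_0\, e^{-t/\tau}.
% Laplace transform in t turns this into an inhomogeneous Helmholtz equation:
\nabla^2 \tilde{\mathbf{B}}(\mathbf{r},s) - \mu\sigma s\,\tilde{\mathbf{B}}(\mathbf{r},s)
  = -\mu\sigma\,\mathbf{B}(\mathbf{r},0).
% The force density on the chamber wall is then
\mathbf{f} = \mathbf{J}_{\mathrm{eddy}} \times \mathbf{B}.
```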
NASA Astrophysics Data System (ADS)
Grujicic, M.; Ramaswami, S.; Snipes, J. S.; Yavari, R.; Arakere, A.; Yen, C.-F.; Cheeseman, B. A.
2013-05-01
A fully coupled (two-way), transient, thermal-mechanical finite-element procedure is developed to model conventional gas metal arc welding (GMAW) butt-joining process. Two-way thermal-mechanical coupling is achieved by making the mechanical material model of the workpiece and the weld temperature-dependent and by allowing the potential work of plastic deformation resulting from large thermal gradients to be dissipated in the form of heat. To account for the heat losses from the weld into the surroundings, heat transfer effects associated with natural convection and radiation to the environment and thermal-heat conduction to the adjacent workpiece material are considered. The procedure is next combined with the basic physical-metallurgy concepts and principles and applied to a prototypical (plain) low-carbon steel (AISI 1005) to predict the distribution of various crystalline phases within the as-welded material microstructure in different fusion zone and heat-affected zone locations, under given GMAW-process parameters. The results obtained are compared with available open-literature experimental data to provide validation/verification for the proposed GMAW modeling effort.
New digital anti-copy/scan and verification technologies
NASA Astrophysics Data System (ADS)
Phillips, George K.
2004-06-01
This white paper reviews a method for making bearer-printed information on a non-copyable substrate indistinguishable when a copy attempt is made on either an analog or digital electrostatic photocopier. In 1995 we received patent number 5,704,651 for a non-copyable technology trademarked MetallicSafe. The patent abstract describes the use of a reflective layer formed on a complex pattern region, with graphic or font-size shapes and type coordinated to particular patterns in the complex pattern region. The technology in this patent has since been improved and evolved into new methods of creating a non-copyable substrate trademarked CopySafe+. CopySafe+ is formed of a metallic specular light reflector, a white camouflaged diffuse light reflector, and the content-information 'light absorption' layer. Synthesizing these layers on a substrate creates dynamic camouflaged interference patterns and the phenomenon of image chaos on a copy. In short, the orientation of a plurality of specular and diffuse light-reflection camouflage layers, mixed and coordinated with light-absorbing printed information, inhibits the copying device from reproducing the printed content.
Development of steady-state model for MSPT and detailed analyses of receiver
NASA Astrophysics Data System (ADS)
Yuasa, Minoru; Sonoda, Masanori; Hino, Koichi
2016-05-01
The molten salt parabolic trough system (MSPT) uses molten salt as the heat transfer fluid (HTF) instead of synthetic oil. A demonstration plant of the MSPT was constructed by Chiyoda Corporation and Archimede Solar Energy in Italy in 2013. Chiyoda Corporation developed a steady-state model for predicting the theoretical behavior of the demonstration plant. The model calculates the concentrated solar power and heat loss using ray tracing of the incident solar light and finite element modeling of the thermal energy transferred into the medium. This report describes the verification of the model using test data from the demonstration plant, along with detailed analyses of the relation between flow rate and temperature difference on the metal tube of the receiver and of the effect of defocus angle on the concentrated power rate, for solar collector assembly (SCA) development. The model is accurate to within 2.0% (systematic error) and 4.2% (random error). The relationships between flow rate and temperature difference on the metal tube, and the effect of defocus angle on the concentrated power rate, are shown.
A Modeling Approach for Plastic-Metal Laser Direct Joining
NASA Astrophysics Data System (ADS)
Lutey, Adrian H. A.; Fortunato, Alessandro; Ascari, Alessandro; Romoli, Luca
2017-09-01
Laser processing has been identified as a feasible approach to the direct joining of metal and plastic components without the need for adhesives or mechanical fasteners. The present work develops a modeling approach for conduction and transmission laser direct joining of these materials based on multi-layer optical propagation theory and numerical heat flow simulation. The aim of this methodology is to predict process outcomes based on the calculated joint interface and upper surface temperatures. Three representative cases are considered for model verification: conduction joining of PBT and aluminum alloy, transmission joining of optically transparent PET and stainless steel, and transmission joining of semi-transparent PA 66 and stainless steel. Conduction direct laser joining experiments are performed on black PBT and 6082 anticorodal aluminum alloy, achieving shear loads of over 2000 N with specimens of 2 mm thickness and 25 mm width. Comparison with simulation results shows that consistently high strength is achieved where the peak interface temperature is above the plastic degradation temperature. Comparison of transmission joining simulations with published experimental results confirms these findings and highlights the influence of plastic layer optical absorption on process feasibility.
Computational materials design for energy applications
NASA Astrophysics Data System (ADS)
Ozolins, Vidvuds
2013-03-01
General adoption of sustainable energy technologies depends on the discovery and development of new high-performance materials. For instance, waste heat recovery and electricity generation via the solar thermal route require bulk thermoelectrics with a high figure of merit (ZT) and thermal stability at high temperatures. Energy recovery applications (e.g., regenerative braking) call for the development of rapidly chargeable systems for electrical energy storage, such as electrochemical supercapacitors. Similarly, the use of hydrogen as a vehicular fuel depends on the ability to store hydrogen at high volumetric and gravimetric densities, as well as on the ability to extract it at ambient temperatures at sufficiently rapid rates. We discuss how first-principles computational methods based on quantum mechanics and statistical physics can drive the understanding, improvement, and prediction of new energy materials. We cover the prediction and experimental verification of new earth-abundant thermoelectrics, transition metal oxides for electrochemical supercapacitors, and the kinetics of mass transport in complex metal hydrides. Research has been supported by the US Department of Energy under grant Nos. DE-SC0001342, DE-SC0001054, DE-FG02-07ER46433, and DE-FC36-08GO18136.
Backward spoof surface wave in plasmonic metamaterial of ultrathin metallic structure.
Liu, Xiaoyong; Feng, Yijun; Zhu, Bo; Zhao, Junming; Jiang, Tian
2016-02-04
Backward wave with anti-parallel phase and group velocities is one of the basic properties associated with negative refraction and sub-diffraction imaging that have attracted considerable interest in the context of photonic metamaterials. It has been predicted theoretically that some plasmonic structures can also support backward-wave propagation of surface plasmon polaritons (SPPs); however, to the best of our knowledge, a direct experimental demonstration has not been reported. In this paper, a specially designed plasmonic metamaterial of corrugated metallic strips is proposed that can support backward spoof SPP wave propagation. The dispersion analysis, full electromagnetic field simulation, and transmission measurement of the plasmonic metamaterial waveguide clearly validate backward wave propagation, with a dispersion relation possessing negative slope and opposite directions of group and phase velocities. As a further verification and application, a contra-directional coupler is designed and tested that can route a microwave signal to opposite terminals at different operating frequencies, indicating new application opportunities for plasmonic metamaterials in integrated functional devices and circuits for microwave and terahertz radiation.
NASA Technical Reports Server (NTRS)
Meyers, Valerie; James, John T.; McCoy, Torin; Garcia, Hector
2010-01-01
Many lamps used in various spacecraft contain elemental mercury, which is efficiently absorbed through the lungs as a vapor. The liquid metal vaporizes slowly at room temperature, but may be completely vaporized when lamps are operating. Because current spacecraft environmental control systems are unable to remove mercury vapors, we considered short-term and long-term exposures. Using an existing study, we estimated mercury vapor releases from lamps that are not in operation during missions lasting less than or equal to 30 days, whereas we conservatively assumed complete vaporization from lamps that are operating or being used during missions lasting more than 30 days. Based on mercury toxicity, the Johnson Space Center's Toxicology Group recommends stringent safety controls and verifications for any hardware containing elemental mercury that could yield airborne mercury vapor concentrations greater than 0.1 mg/m3 in the total spacecraft atmosphere for exposures lasting less than or equal to 30 days, or concentrations greater than 0.01 mg/m3 for exposures lasting more than 30 days.
Design for pressure regulating components
NASA Technical Reports Server (NTRS)
Wichmann, H.
1973-01-01
The design development for Pressure Regulating Components included a regulator component trade-off study with analog computer performance verification to arrive at a final optimized regulator configuration for the Space Storable Propulsion Module, under development for a Jupiter Orbiter mission. This application requires the pressure regulator to be capable of long-term fluorine exposure. In addition, individual but basically identical (for purposes of commonality) units are required for separate oxidizer and fuel pressurization. The need for dual units requires improvement in the regulation accuracy over present designs. An advanced regulator concept was prepared featuring redundant bellows, all metallic/ceramic construction, friction-free guidance of moving parts, gas damping, and the elimination of coil springs normally used for reference forces. The activities included testing of actual size seat/poppet components to determine actual discharge coefficients and flow forces. The resulting data was inserted into the computer model of the regulator. Computer simulation of the propulsion module performance over two mission profiles indicated satisfactory minimization of propellant residual requirements imposed by regulator performance uncertainties.
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
2009-01-01
In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...
40 CFR 1066.240 - Torque transducer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...
Study of Measurement Strategies of Geometric Deviation of the Position of the Threaded Holes
NASA Astrophysics Data System (ADS)
Drbul, Mário; Martikan, Pavol; Sajgalik, Michal; Czan, Andrej; Broncek, Jozef; Babik, Ondrej
2017-12-01
Verification of products and quality control is an integral part of the current production process. In terms of functional requirements and product interoperability, it is necessary to analyze their dimensional and geometric specifications. Threaded holes, a substantial part of detachable screw connections with a broad presence in engineering products, are among the verified elements. This paper deals with the analysis of measurement strategies for verifying the geometric deviation of the position of threaded holes, using the indirect method of measuring threaded pins; applying different measurement strategies can affect the result of the verification of the product.
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
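The workshop's framework ties biospecimen counts to statistical power at each pipeline stage; as a minimal illustration of a power-based sample-size calculation (the specific normal-approximation formula below is a textbook sketch, not the workshop's framework), one might compute:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.90):
    """Normal-approximation biospecimen count per group to detect a mean
    biomarker shift `delta` between two groups with common SD `sigma`,
    at one-sided significance `alpha` and the given power."""
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha) + z(power)) * sigma / delta) ** 2
    return ceil(n)
```

For example, a smaller detectable shift `delta` (relative to the biological variability `sigma`) drives the required biospecimen count up quadratically, which is why underpowered discovery studies are so common.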
High-resolution face verification using pore-scale facial features.
Li, Dong; Zhou, Huiling; Lam, Kin-Man
2015-08-01
Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, robust to alignment errors, that uses HR information based on pore-scale facial features. A new keypoint descriptor, the pore Principal Component Analysis (PCA) Scale-Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods and achieves excellent accuracy even when the faces are under large variations in expression and pose.
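Matching compact keypoint descriptors between two face regions, as the PPCASIFT pipeline above does before robust fitting, typically relies on a nearest-neighbour search with a distance-ratio test. The sketch below shows that generic matching step in pure Python; the toy descriptors, the Euclidean metric, and the 0.8 ratio threshold are illustrative assumptions, not details taken from the paper.

```python
import math

def l2(a, b):
    """Euclidean distance between two equal-length descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_match(desc_a, desc_b, ratio=0.8):
    """Match each descriptor in desc_a to its nearest neighbour in desc_b,
    keeping only matches whose nearest/second-nearest distance ratio is
    below `ratio` (Lowe's test), which suppresses ambiguous matches."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = sorted((l2(da, db), j) for j, db in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches

a = [(0.0, 1.0), (5.0, 5.0)]
b = [(0.1, 1.1), (9.0, 9.0), (5.1, 5.0)]
print(ratio_match(a, b))  # [(0, 0), (1, 2)]
```

The surviving match list would then feed a robust-fitting stage that rejects remaining outliers geometrically.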
Geometrical verification system using Adobe Photoshop in radiotherapy.
Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige
2005-02-01
Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films: images can be scanned and then viewed simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured the distances between the isocenter on simulation films and that on portal films by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured the same distances on screen. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes (p < 0.007). The geometrical verification system using Adobe Photoshop is available on any PC with this software and is useful for improving the accuracy of verification.
NASA Astrophysics Data System (ADS)
Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong
2007-03-01
As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group involved, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and can extract whole-chip CD variation information. From these results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are being pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed over the full chip area using a well-calibrated model; its object is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a large amount of data from wafer results was classified and analyzed by statistical methods and sorted into OPC-feedback and design-feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.
Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G
2014-08-01
In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO for patient positioning and setup verification has been assessed using a laser tracking device. The accuracy of calibration and image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and the patient verification system motion proved to be below 0.5 mm under all examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2012 CFR
2012-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2011 CFR
2011-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2013 CFR
2013-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler
Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
A Comparative Study of Two Azimuth Based Non Standard Location Methods
2017-03-23
Rongsong Jih, U.S. Department of State / Arms Control, Verification, and Compliance Bureau, 2201 C Street, NW, Washington...A COMPARATIVE STUDY OF TWO AZIMUTH-BASED NON-STANDARD LOCATION METHODS, R. Jih, Department of State / Arms Control, Verification, and Compliance Bureau...cable. The so-called "Yin Zhong Xian" ("引中线" in Chinese) algorithm, hereafter the YZX method, is an Oriental version of the IPB-based procedure. It
A Methodology for Formal Hardware Verification, with Application to Microprocessors.
1993-08-29
concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19...restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS: the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
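The install-time check described above (rerunning sample problems and diffing the new outputs against stored reference outputs) can be sketched generically. The function below is a hypothetical illustration, not the actual MCNP/ANSYS tooling: it diffs each reference file against its freshly generated counterpart and, mirroring the document's caveat, reports differences for review rather than treating every mismatch as a failure.

```python
import difflib
from pathlib import Path

def verify_outputs(new_dir, ref_dir, pattern="*.out"):
    """Compare each newly generated output file in `new_dir` against the
    stored reference copy of the same name in `ref_dir`. Returns a dict
    mapping filename to a list of unified-diff lines; an empty list means
    the outputs match exactly. Non-empty diffs need human review, since a
    difference does not necessarily indicate a real problem."""
    report = {}
    for ref in Path(ref_dir).glob(pattern):
        new = Path(new_dir) / ref.name
        report[ref.name] = list(difflib.unified_diff(
            ref.read_text().splitlines(),
            new.read_text().splitlines(),
            lineterm=""))
    return report
```

Usage: point `ref_dir` at the shipped sample-problem outputs and `new_dir` at the outputs from the freshly compiled build, then inspect any entries with non-empty diffs.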
Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test Systems, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...
USDA-ARS?s Scientific Manuscript database
This study investigated rice irrigation water use in the University of Arkansas Rice Research Verification Program between the years of 2003 and 2011. Irrigation water use averaged 747 mm (29.4 inches) over the nine years. A significant 40% water savings was reported for rice grown under a zero gr...
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH VErification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that relevant complementary approaches exist for the security analysis of network and device S&D Patterns and can be used.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and not all potential interactions can be tested, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution for addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification; these were grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...
78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2008-06-12
Conduct verification surveys of available grids at the DWI 1630 in Knoxville, Tennessee. A representative with the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.
40 CFR 1065.395 - Inertial PM balance verifications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...
40 CFR 1065.395 - Inertial PM balance verifications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...
22 CFR 123.14 - Import certificate/delivery verification procedure.
Code of Federal Regulations, 2010 CFR
2010-04-01
... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...
22 CFR 123.14 - Import certificate/delivery verification procedure.
Code of Federal Regulations, 2011 CFR
2011-04-01
... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
24 CFR 5.512 - Verification of eligible immigration status.
Code of Federal Regulations, 2010 CFR
2010-04-01
... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...
Quantitative assessment of the physical potential of proton beam range verification with PET/CT.
Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T
2008-08-07
A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
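A common way to reduce a measured activity depth profile to a single range-verification number, consistent with the millimetre-level comparisons described above, is to locate the distal position where the profile falls to a fixed fraction of its peak. The sketch below is a generic illustration rather than the study's actual analysis; the 50% level and the sample profile are assumptions.

```python
def distal_falloff(depths, activity, level=0.5):
    """Estimate the depth (same units as `depths`) at which the activity
    profile falls to `level` x peak on its distal side, by linear
    interpolation between the bracketing samples. Comparing this depth
    between measured and simulated PET profiles yields a simple
    range-verification metric. Returns None if no distal crossing exists."""
    peak = max(activity)
    i_peak = activity.index(peak)
    target = level * peak
    for i in range(i_peak, len(activity) - 1):
        if activity[i] >= target > activity[i + 1]:
            # linear interpolation within the bracketing interval
            frac = (activity[i] - target) / (activity[i] - activity[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    return None

depths = [0, 10, 20, 30, 40, 50]   # mm
act = [2, 5, 9, 10, 4, 0]          # arbitrary activity units
print(distal_falloff(depths, act))  # about 38.33 mm
```

A measured-versus-simulated difference in this falloff depth larger than the roughly 1 mm physical accuracy quoted above would then flag a potential range deviation.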
Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation
NASA Technical Reports Server (NTRS)
Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna
2000-01-01
This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.
An Efficient Location Verification Scheme for Static Wireless Sensor Networks.
Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok
2017-01-24
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect location estimation process from attacks, they are not enough to eliminate the wrong location estimations in some situations. The location verification can be the solution to the situations or be the second-line defense. The problem of most of the location verifications is the explicit involvement of many sensors in the verification process and requirements, such as special hardware, a dedicated verifier and the trusted third party, which causes more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSN called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
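The geometric core of shared-region location verification can be illustrated with a simple plausibility check: a claimed position is accepted only if it falls inside the intersection of the communication disks of two nodes that both heard the claimant. This is a hypothetical sketch of that idea, not the full MSRLV protocol, which also covers the message exchange and attacker model.

```python
import math

def in_shared_region(claimed, verifier_pos, neighbor_pos, r):
    """Accept a claimed 2-D position only if it lies inside the intersection
    of two radius-r communication disks: one centred on the verifier and one
    on a neighbour that also heard the claimant. Nodes outside this
    mutually-shared region could not have been heard by both, so such a
    claim is rejected without involving additional sensors."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(claimed, verifier_pos) <= r and dist(claimed, neighbor_pos) <= r

print(in_shared_region((1, 1), (0, 0), (2, 0), r=2.0))  # True
print(in_shared_region((5, 5), (0, 0), (2, 0), r=2.0))  # False
```

Because the neighbour participates only implicitly (it already heard the claimant during normal operation), no extra verification messages are needed, which is the source of the overhead savings the abstract reports.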
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
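Since an FFM is a directed graph of failure-effect propagation paths, one natural automated verification check is to compute the set of effects reachable from each failure mode and compare it against the expected set for that mode. The sketch below shows that reachability step; the adjacency-dict encoding and the node names are illustrative assumptions, not NASA's actual tooling.

```python
from collections import deque

def failure_effects(graph, failure_mode):
    """Given a functional fault model as a directed adjacency dict
    (node -> list of immediate downstream nodes), return every node
    reachable from `failure_mode`, i.e. the full set of propagated failure
    effects. An automated verifier can compare this set against the
    expected effects recorded for each failure mode."""
    seen, queue = set(), deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

ffm = {"valve_stuck": ["low_flow"],
       "low_flow": ["pump_cavitation", "low_thrust"]}
print(sorted(failure_effects(ffm, "valve_stuck")))
# ['low_flow', 'low_thrust', 'pump_cavitation']
```

Running this check for every failure mode replaces the error-prone manual walk through each component model that the paper identifies as the key pitfall.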
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings,' which includes the general-purpose HOL theories and definitions that support the PIU verification as well as the tactics used in the proofs.
NASA Astrophysics Data System (ADS)
Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich
2007-03-01
Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/recommended rules is primarily litho-driven, but cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Using model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis. This analysis can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: 1. Achieve a more aggressive layout through limited usage of litho-related recommended design rules. A 10% to 15% area reduction is achieved by using more aggressive design rules. DFM/recommended design rules are used only if there is no impact on cell size. 2. Identify and fix hotspots using a model-based layout printability checker. Model-based litho and etch simulation are done at the cell level to identify hotspots. Violations of recommended rules may cause additional hotspots, which are then fixed. The resulting design is ready for step 3. 3. Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect. Contours of diffusion, poly and metal layers are used for parametric analysis. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm. We describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology.
Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.
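A contour-based coverage check of the kind described above can be illustrated with a toy example: approximate the simulated metal contour by axis-aligned rectangles and compute what fraction of each via they cover. This is a hypothetical sketch (the function names and the 90% threshold are invented for illustration), not the commercial checker's algorithm.

```python
def rect_overlap_area(a, b):
    """Overlap area of two axis-aligned rectangles (x0, y0, x1, y1)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0.0) * max(h, 0.0)

def via_coverage(via, metal_contour_rects):
    """Fraction of the via area covered by the simulated metal contour,
    approximated by a set of non-overlapping axis-aligned rectangles."""
    via_area = (via[2] - via[0]) * (via[3] - via[1])
    covered = sum(rect_overlap_area(via, m) for m in metal_contour_rects)
    return covered / via_area

def is_cc_hotspot(via, contour, threshold=0.9):
    """Flag a contact/via-coverage hotspot when the simulated metal
    coverage drops below the chosen threshold."""
    return via_coverage(via, contour) < threshold
```

Using simulated contours here, rather than the drawn polygons, is what distinguishes the model-based check from a rule-based one.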
Verification of the Skorohod-Olevsky Viscous Sintering (SOVS) Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lester, Brian T.
2017-11-16
Sintering refers to a manufacturing process through which mechanically pressed bodies of ceramic (and sometimes metal) powders are heated to drive densification, thereby removing the inherent porosity of green bodies. As the body densifies through the sintering process, the ensuing material flow leads to macroscopic deformations of the specimen, and as such the final configuration differs from the initial one. Therefore, as with any manufacturing step, there is substantial interest in understanding and being able to model the sintering process to predict deformation and residual stress. Efforts in this regard have been pursued for face seals, gear wheels, and consumer products like wash-basins. To understand the sintering process, a variety of modeling approaches have been pursued at different scales.
Mass spectrometer vacuum housing and pumping system
Coutts, G.W.; Bushman, J.F.; Alger, T.W.
1996-07-23
A vacuum housing and pumping system is described for a portable gas chromatograph/mass spectrometer (GC/MS). The vacuum housing section of the system has minimum weight for portability while designed and constructed to utilize metal gasket sealed stainless steel to be compatible with high vacuum operation. The vacuum pumping section of the system consists of a sorption (getter) pump to remove atmospheric leakage and outgassing contaminants as well as the gas chromatograph carrier gas (hydrogen) and an ion pump to remove the argon from atmospheric leaks. The overall GC/MS system has broad application to contaminants, hazardous materials, illegal drugs, pollution monitoring, etc., as well as for use by chemical weapon treaty verification teams, due to the light weight and portability thereof. 7 figs.
Mass spectrometer vacuum housing and pumping system
Coutts, Gerald W.; Bushman, John F.; Alger, Terry W.
1996-01-01
A vacuum housing and pumping system for a portable gas chromatograph/mass spectrometer (GC/MS). The vacuum housing section of the system has minimum weight for portability while designed and constructed to utilize metal gasket sealed stainless steel to be compatible with high vacuum operation. The vacuum pumping section of the system consists of a sorption (getter) pump to remove atmospheric leakage and outgassing contaminants as well as the gas chromatograph carrier gas (hydrogen) and an ion pump to remove the argon from atmospheric leaks. The overall GC/MS system has broad application to contaminants, hazardous materials, illegal drugs, pollution monitoring, etc., as well as for use by chemical weapon treaty verification teams, due to the light weight and portability thereof.
Collected Papers in Structural Mechanics Honoring Dr. James H. Starnes, Jr.
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr. (Compiler); Nemeth, Michael P. (Compiler); Malone, John B. (Compiler)
2006-01-01
This special publication contains a collection of structural mechanics papers honoring Dr. James H. Starnes, Jr. presented at the 46th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference held in Austin, Texas, April 18-21, 2005. Contributors to this publication represent a small number of those influenced by Dr. Starnes' technical leadership, his technical prowess and diversity, and his technical breadth and depth in engineering mechanics. These papers cover some of the research areas Dr. Starnes investigated, which included buckling, postbuckling, and collapse of structures; composite structural mechanics; residual strength and damage tolerance of metallic and composite structures; and aircraft structural design, certification and verification. He actively pursued technical understanding and clarity, championed technical excellence, and modeled humility and perseverance.
Development of 1-m primary mirror for a spaceborne camera
NASA Astrophysics Data System (ADS)
Kihm, Hagyong; Yang, Ho-Soon; Rhee, Hyug-Gyo; Lee, Yun-Woo
2015-09-01
We present the development of a 1-m lightweight mirror system for a spaceborne electro-optical camera. The mirror design was optimized to satisfy the performance requirements under launch loads and the space environment. The mirror, made of Zerodur®, has pockets at the back surface and three square bosses at the rim. Metallic bipod flexures support the mirror at the bosses and adjust the mirror's surface distortion due to gravity. We also show an analytical formulation of the bipod flexure, where the compliance and stiffness matrices of the flexure are derived to estimate theoretical performance and to provide initial design guidelines. Optomechanical performance, such as surface distortion due to gravity, is explained. Environmental verification of the mirror is achieved by vibration tests.
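The compliance/stiffness relationship used in such a formulation is plain matrix algebra: if the compliance matrix C maps applied loads to deflections (delta = C F), the stiffness matrix is its inverse, K = C^-1. A minimal 2x2 sketch follows; the matrix entries are made-up illustrative values, not the paper's derived flexure properties.

```python
def invert_2x2(c):
    """Stiffness matrix K = C^-1 for a 2x2 compliance matrix C,
    given as [[c11, c12], [c21, c22]]."""
    (a, b), (d, e) = c
    det = a * e - b * d
    if det == 0:
        raise ValueError("singular compliance matrix")
    return [[ e / det, -b / det],
            [-d / det,  a / det]]

def deflection(c, force):
    """delta = C F: deflections produced by an applied load vector."""
    return [sum(cij * fj for cij, fj in zip(row, force)) for row in c]
```

In a flexure design loop, one typically derives C analytically, inverts it to get K, and checks the predicted deflections against finite-element results.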
7 CFR 272.8 - State income and eligibility verification system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...
24 CFR 985.3 - Indicators, HUD verification methods and ratings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...
78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...
30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?
Code of Federal Regulations, 2010 CFR
2010-07-01
... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
Verification test report on a solar heating and hot water system
NASA Technical Reports Server (NTRS)
1978-01-01
Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiency, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.
46 CFR 61.40-3 - Design verification testing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...
Hsu, Shih-Ming; Hung, Chao-Hsiung; Liao, Yi-Jen; Fu, Hsiao-Mei; Tsai, Jo-Ting
2017-01-01
CyberKnife is one of multiple modalities for stereotactic radiosurgery (SRS). Due to the nature of CyberKnife and the characteristics of SRS, dose evaluation of the CyberKnife procedure is critical. A radiophotoluminescent glass dosimeter was used to verify the dose accuracy of the CyberKnife procedure and to validate a viable dose verification system for CyberKnife treatment. A radiophotoluminescent glass dosimeter, a thermoluminescent dosimeter, and Kodak EDR2 film were used to measure the lateral dose profile and percent depth dose of CyberKnife. A Monte Carlo simulation for dose verification was performed using BEAMnrc to verify the measured results. This study also used a radiophotoluminescent glass dosimeter coupled with an anthropomorphic phantom to evaluate the accuracy of the dose delivered by CyberKnife. Measurements from the radiophotoluminescent glass dosimeter were compared with the results of the thermoluminescent dosimeter and EDR2 film, and the differences found were less than 5%. The radiophotoluminescent glass dosimeter has some advantages for CyberKnife dose measurements, such as repeatability, stability, and small effective size. These advantages make radiophotoluminescent glass dosimeters a potential candidate dosimeter for the CyberKnife procedure. This study concludes that radiophotoluminescent glass dosimeters are a promising and reliable dosimeter for CyberKnife dose verification with clinically acceptable accuracy within 5%. PMID:28046056
Hsieh, Ling-Ling; Shieh, Jiunn-I; Wei, Li-Ju; Wang, Yi-Chun; Cheng, Kai-Yuan; Shih, Cheng-Ting
2017-05-01
Polymer gel dosimeters (PGDs) have been widely studied for use in the pretreatment verification of clinical radiation therapy. However, the readability of PGDs in three-dimensional (3D) dosimetry remains unclear. In this study, pretreatment verifications of clinical radiation therapy were performed using an N-isopropyl-acrylamide (NIPAM) PGD, and the results were used to evaluate the performance of the NIPAM PGD for 3D dose measurement. A gel phantom was used to measure the dose distribution of a clinical case of intensity-modulated radiation therapy. Magnetic resonance imaging scans were performed for dose readouts. The measured dose volumes were compared with the planned dose volume. The relative volume histograms showed that relative volumes with a negative percent dose difference decreased as time elapsed. Furthermore, the histograms revealed few changes after 24 h postirradiation. Under both the 3%/3 mm and 2%/2 mm criteria, the pass rates of the 12- and 24-h dose volumes were higher than 95%. This study thus concludes that the pass rate map can be used to evaluate the dose-temporal readability of PGDs and that the NIPAM PGD can be used for clinical pretreatment verifications. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
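The pass rates above come from gamma analysis, which scores each reference dose point by its best match in the evaluated distribution under combined dose-difference (e.g. 3%) and distance-to-agreement (e.g. 3 mm) criteria. A minimal 1-D sketch, using local dose normalization; the profile data in the usage example are invented:

```python
import math

def gamma(ref_x, ref_d, eval_profile, dd=0.03, dta=3.0):
    """1-D gamma index of one reference point against an evaluated
    profile of (position_mm, dose) pairs; dd is a fraction of the
    local reference dose, dta is in mm."""
    return min(math.hypot((x - ref_x) / dta, (d - ref_d) / (dd * ref_d))
               for x, d in eval_profile)

def pass_rate(ref_profile, eval_profile, dd=0.03, dta=3.0):
    """Fraction of reference points with gamma <= 1."""
    passed = sum(1 for x, d in ref_profile
                 if gamma(x, d, eval_profile, dd, dta) <= 1.0)
    return passed / len(ref_profile)
```

Comparing a measured profile against itself yields a 100% pass rate, while a uniform 10% dose error fails every point under a 3% local criterion. Clinical tools extend the same idea to 3D dose grids.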
Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Brantley
2016-01-01
A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visually examining the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study has shown solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent® and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
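A mesh refinement study of this kind typically reports an observed order of accuracy computed from errors on successively refined meshes. The formula is standard; the numbers in the usage note are illustrative, not Aria results.

```python
import math

def observed_order(e_coarse, e_fine, r=2.0):
    """Observed order of accuracy p from discretization errors on two
    meshes related by refinement ratio r: p = log(e_coarse/e_fine)/log(r)."""
    return math.log(e_coarse / e_fine) / math.log(r)
```

For example, halving the mesh spacing and seeing the error drop by a factor of four indicates second-order convergence: `observed_order(4e-2, 1e-2)` returns 2.0. Matching the observed order to the scheme's formal order is the usual acceptance criterion for such studies.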
An unattended verification station for UF6 cylinders: Field trial findings
NASA Astrophysics Data System (ADS)
Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.
2017-12-01
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
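The "NDA Fingerprint" concept relies on a cylinder's radiation signature being highly reproducible, so that a revisit can be compared against the signature enrolled at first verification. One simple way to score such a comparison (a hypothetical illustration, not the study's actual metric; the 0.99 threshold is invented) is the Pearson correlation between stored and re-measured count vectors:

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length count vectors,
    e.g. a stored fingerprint and a new measurement."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def fingerprint_matches(stored, measured, threshold=0.99):
    """A drop in similarity below the threshold could indicate that
    material has been diverted or substituted since enrollment."""
    return correlation(stored, measured) >= threshold
```

In practice such a score would be combined with uncertainty estimates and detector stability corrections before declaring an anomaly.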
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2014-12-01
Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of advection-diffusion-reaction (ADR) solvers, such as nonlinear advection, diffusion or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of the tests' complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. Examples for checking the suite start by testing a simple case of unidirectional advection; then bidirectional advection and tidal flow; and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed.
All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
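MES-style verification in its simplest form: run the solver on a problem with a known exact solution and confirm that the discretization error shrinks under mesh refinement. The self-contained sketch below uses a first-order upwind scheme for linear advection as a stand-in "code under verification"; it is not the paper's solver, and all parameter choices (CFL 0.4, final time 0.2) are illustrative.

```python
import math

def solve_advection(u0, c, dx, dt, steps):
    """First-order upwind scheme for u_t + c*u_x = 0 (c > 0) on a
    periodic grid -- the 'code under verification' in this sketch."""
    u = list(u0)
    for _ in range(steps):
        u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

def l2_error(u, exact):
    """Discrete L2 norm of the error against the exact solution."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, exact)) / len(u))

def advection_error(n):
    """Error at t = 0.2 for a sine wave advected at c = 1 on n cells;
    the exact solution is the translated profile sin(2*pi*(x - c*t))."""
    dx, c = 1.0 / n, 1.0
    dt, steps = 0.4 * dx, n // 2          # CFL 0.4, final time 0.2
    u0 = [math.sin(2 * math.pi * i * dx) for i in range(n)]
    u = solve_advection(u0, c, dx, dt, steps)
    exact = [math.sin(2 * math.pi * (i * dx - c * steps * dt)) for i in range(n)]
    return l2_error(u, exact)
```

A passing MES test requires the error to drop under refinement, e.g. `advection_error(100) < advection_error(50)`; the paper's hierarchical suite layers many such tests of increasing complexity.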
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nose, Takayuki, E-mail: nose-takayuki@nms.ac.jp; Chatani, Masashi; Otani, Yuki
Purpose: High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Methods and Materials: Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed at quarter frame rates. Results: Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. Conclusions: With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use.
Nose, Takayuki; Chatani, Masashi; Otani, Yuki; Teshima, Teruki; Kumita, Shinichirou
2017-03-15
High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed at quarter frame rates. Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use. Copyright © 2016 Elsevier Inc. All rights reserved.
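The dose bookkeeping behind the modified fluoroscopy is simple: average exposure rate is dose per pulse times pulse rate, so quadrupling one while quartering the other leaves the average unchanged. A sketch of that arithmetic; the 30 pulses/s baseline is an invented illustrative value, not from the paper.

```python
def average_exposure(dose_per_pulse, pulses_per_second):
    """Time-averaged exposure rate of a pulsed fluoroscopy beam."""
    return dose_per_pulse * pulses_per_second

# Conventional setting (illustrative numbers):
conventional = average_exposure(1.0, 30.0)
# Modified setting: 4x dose per pulse at 1/4 the pulse rate.
modified = average_exposure(4.0 * 1.0, 30.0 / 4.0)
assert modified == conventional  # average exposure is unchanged
```

The per-pulse dose increase improves the signal relative to the Iridium 192 noise, while the unchanged average keeps the technique within regulatory exposure limits.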
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
NASA Astrophysics Data System (ADS)
Gao, Anran; Lu, Na; Dai, Pengfei; Fan, Chunhai; Wang, Yuelin; Li, Tie
2014-10-01
Sensitive and quantitative analysis of proteins is central to disease diagnosis, drug screening, and proteomic studies. Here, a label-free, real-time, simultaneous and ultrasensitive prostate-specific antigen (PSA) sensor was developed using CMOS-compatible silicon nanowire field effect transistors (SiNW FET). Highly responsive n- and p-type SiNW arrays were fabricated and integrated on a single chip with a complementary metal oxide semiconductor (CMOS) compatible anisotropic self-stop etching technique which eliminated the need for a hybrid method. The incorporated n- and p-type nanowires revealed complementary electrical response upon PSA binding, providing a unique means of internal control for sensing signal verification. The highly selective, simultaneous and multiplexed detection of PSA marker at attomolar concentrations, a level useful for clinical diagnosis of prostate cancer, was demonstrated. The detection ability was corroborated to be effective by comparing the detection results at different pH values. Furthermore, the real-time measurement was also carried out in a clinically relevant sample of blood serum, indicating the practicable development of rapid, robust, high-performance, and low-cost diagnostic systems.
Physical properties of star clusters in the outer LMC as observed by the DES
Pieres, A.; Santiago, B.; Balbinot, E.; ...
2016-05-26
The Large Magellanic Cloud (LMC) harbors a rich and diverse system of star clusters, whose ages, chemical abundances, and positions provide information about the LMC history of star formation. We use Science Verification imaging data from the Dark Energy Survey to increase the census of known star clusters in the outer LMC and to derive physical parameters for a large sample of such objects using a spatially and photometrically homogeneous data set. Our sample contains 255 visually identified cluster candidates, of which 109 were not listed in any previous catalog. We quantify the crowding effect for the stellar sample produced by the DES Data Management pipeline and conclude that the stellar completeness is < 10% inside typical LMC cluster cores. We therefore develop a pipeline to sample and measure stellar magnitudes and positions around the cluster candidates using DAOPHOT. We also implement a maximum-likelihood method to fit individual density profiles and colour-magnitude diagrams. For 117 (from a total of 255) of the cluster candidates (28 uncatalogued clusters), we obtain reliable ages, metallicities, distance moduli and structural parameters, confirming their nature as physical systems. The distribution of cluster metallicities shows a radial dependence, with no clusters more metal-rich than [Fe/H] ~ -0.7 beyond 8 kpc from the LMC center. Furthermore, the age distribution has two peaks at ≃ 1.2 Gyr and ≃ 2.7 Gyr.
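A maximum-likelihood fit of a cluster density profile can be sketched in miniature: assume an exponential surface-density profile Sigma(r) ~ exp(-r/r0), so the probability density of a member star's projected radius is p(r) = r exp(-r/r0) / r0**2, and pick the scale radius r0 that maximizes the log-likelihood of the observed radii. This is an illustrative simplification with an invented profile family, not the paper's actual pipeline (which also fits colour-magnitude diagrams).

```python
import math

def log_likelihood(radii, r0):
    """Log-likelihood of projected radii under an exponential
    surface-density profile: p(r) = r * exp(-r / r0) / r0**2."""
    return sum(math.log(r) - r / r0 - 2.0 * math.log(r0) for r in radii)

def fit_scale_radius(radii, grid):
    """Maximum-likelihood scale radius by 1-D grid search."""
    return max(grid, key=lambda r0: log_likelihood(radii, r0))
```

For this profile the analytic maximum is r0 = mean(r)/2, so radii of 1, 2 and 3 (mean 2) yield a best-fit scale radius of 1 on any grid containing it. Real fits add background contamination and completeness corrections to the likelihood.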
Development of oil canning index model for sheet metal forming products with large curvature
NASA Astrophysics Data System (ADS)
Kim, Honglae; Lee, Seonggi; Murugesan, Mohanraj; Hong, Seokmoo; Lee, Shanghun; Ki, Juncheol; Jung, Hunchul; Kim, Naksoo
2017-09-01
Oil canning is predominantly caused by unequal stretches and heterogeneous stress distributions in steel sheets, which affect the appearance of components and cause noise and vibration problems. This paper proposes an oil canning index (OCI) model that can predict the occurrence of oil canning in sheet metal. To investigate the influence of material properties, we used electro-galvanized (EGI) and galvanized (GI) steel sheets with different thicknesses and processing conditions. Furthermore, this paper presents an appropriate experimental and numerical procedure for determining the sheet stiffness and indentation properties to evaluate the oil canning results. Experiments were carried out by varying the tensile force over different materials, thicknesses, and bead forces. Comparison of the discrete results obtained from these experiments confirmed that product shape characteristics, such as curvature, have a significant influence on oil canning occurrence. Based on these results, we propose a new OCI model, which can effectively predict oil canning occurrence arising from the shape curvature. The accuracy and usability of the model were verified by simulating the sheet-metal experiments, and the authors observed good agreement between the experimental and numerical results. This work provides an effective method for eliminating appearance defects from automotive products.
Rapid Freeform Sheet Metal Forming: Technology Development and System Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiridena, Vijitha; Verma, Ravi; Gutowski, Timothy
The objective of this project is to develop a transformational RApid Freeform sheet metal Forming Technology (RAFFT) in an industrial environment, which has the potential to increase manufacturing energy efficiency up to ten times, at a fraction of the cost of conventional technologies. The RAFFT technology is a flexible and energy-efficient process that eliminates the need for having geometry-specific forming dies. The innovation lies in the idea of using the energy resource at the local deformation area, which provides greater formability, process control, and process flexibility relative to traditional methods. Double-Sided Incremental Forming (DSIF), the core technology in RAFFT, is a new concept for sheet metal forming. A blank sheet is clamped around its periphery and gradually deformed into a complex 3D freeform part by two strategically aligned stylus-type tools that follow a pre-described toolpath. The two tools, one on each side of the blank, can form a part with sharp features for both concave and convex shapes. Since deformation happens locally, the forming force at any instant is significantly decreased when compared to traditional methods. The key advantages of DSIF are its high process flexibility, high energy-efficiency, low capital investment, and the elimination of the need for massive amounts of die casting and machining. Additionally, the enhanced formability and process flexibility of DSIF can open up design spaces and result in greater weight savings.
Characterization of Electron Beam Free-Form Fabricated 2219 Aluminum and 316 Stainless Steel
NASA Technical Reports Server (NTRS)
Ekrami, Yasamin; Forth, Scott C.; Waid, Michael C.
2011-01-01
Researchers at NASA Langley Research Center have developed an additive manufacturing technology for ground-based and future space-based applications. Electron beam freeform fabrication (EBF3) is a rapid metal fabrication process that utilizes an electron beam gun in a vacuum environment to replicate a CAD drawing of a part. The electron beam gun creates a molten pool on a metal substrate and translates with respect to the substrate to deposit metal in designated regions through a layer-additive process. Prior to demonstration and certification of a final EBF3 part for space flight, it is imperative to conduct a series of materials validation and verification tests on the ground in order to evaluate the mechanical and microstructural properties of EBF3-manufactured parts. Part geometries of EBF3 2219 aluminum and 316 stainless steel specimens were metallographically inspected and tested for strength, fatigue crack growth, and fracture toughness. Upon comparing the results to conventionally welded material, 2219 aluminum in the as-fabricated condition demonstrated a 30% and 16% decrease in fracture toughness and ductility, respectively. The strength properties of the 316 stainless steel material in the as-deposited condition were comparable to annealed stainless steel alloys. Future fatigue crack growth tests will integrate various stress ranges and maximum-to-minimum stress ratios needed to fully characterize EBF3-manufactured specimens.
Formal Verification at System Level
NASA Astrophysics Data System (ADS)
Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.
2009-05-01
System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained in an ESA/ESTEC study carried out in collaboration between INTECS and La Sapienza University of Rome. The study focuses on SysML-based system-level functional requirements techniques.
Full-chip level MEEF analysis using model based lithography verification
NASA Astrophysics Data System (ADS)
Kim, Juhwan; Wang, Lantian; Zhang, Daniel; Tang, Zongwu
2005-11-01
MEEF (Mask Error Enhancement Factor) has become a critical factor in CD uniformity control since the optical lithography process moved into the sub-resolution era. Many studies have quantified the impact of mask CD (Critical Dimension) errors on wafer CD errors [1-2]. However, the benefits from those studies were restricted to small pattern areas of the full-chip data due to long simulation times. As linearly scalable distributed processing technology now achieves fast turnaround times for complicated verifications on very large data, model-based lithography verification becomes feasible for various types of applications, such as post-mask-synthesis data sign-off for mask tape-out in production and lithography process development with full-chip data [3-5]. In this study, we introduce two useful methodologies for full-chip level verification of mask error impact on the wafer lithography patterning process. One methodology is to check the MEEF distribution in addition to the CD distribution through the process window, which can be used for RET/OPC optimization at the R&D stage. The other is to check mask error sensitivity on potential pinch and bridge hotspots through lithography process variation, where the outputs can be passed on to mask CD metrology to add CD measurements at those hotspot locations. Two different OPC data sets were compared using these two methodologies.
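MEEF itself is a simple quantity: the derivative of wafer CD with respect to mask CD, with the mask error expressed at wafer scale (i.e., divided by the reticle magnification). A hedged finite-difference sketch follows; `simulate_wafer_cd` is an invented toy stand-in, where a production flow would call a calibrated optical/resist model:

```python
# Finite-difference MEEF estimate: perturb the mask CD by ±delta,
# re-simulate the wafer CD, and take MEEF = d(CD_wafer)/d(CD_mask)
# with both CDs expressed at wafer (1x) scale.

def simulate_wafer_cd(mask_cd_nm: float) -> float:
    # Toy stand-in: a mildly nonlinear mask-to-wafer transfer curve.
    return 0.25 * mask_cd_nm + 0.004 * mask_cd_nm ** 2 / 10.0

def meef(mask_cd_nm: float, delta_nm: float = 1.0, mag: float = 4.0) -> float:
    # Central difference; the mask perturbation is divided by the reticle
    # magnification so the mask error is measured at wafer scale.
    hi = simulate_wafer_cd(mask_cd_nm + delta_nm)
    lo = simulate_wafer_cd(mask_cd_nm - delta_nm)
    return (hi - lo) / (2 * delta_nm / mag)
```

A full-chip MEEF distribution of the kind described would repeat this per gauge/site and histogram the results across the chip and the process window.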
Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk
2015-12-01
The purpose of this study is to verify a relationship model between the recovery resilience of Korea's new elderly class and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging is β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, it was found that the proposed basic model on the direct path between recovery resilience and productive aging was a good fit.
Online 3D EPID-based dose verification: Proof of concept
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda
Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution.
The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
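The precompute/real-time split described in this abstract can be sketched as follows. All names, the 20% halt criterion, and the toy dose reconstruction are illustrative assumptions, not the authors' software; the only number taken from the abstract is the 420 ms frame budget:

```python
import time

FRAME_BUDGET_S = 0.420  # acquisition time of a single EPID frame

def precompute(plan):
    # One-off work per treatment arc: everything not dependent on the
    # portal image is computed before irradiation starts (stand-in values).
    return {"scale": 0.01, "plan_dose": plan["planned_dose"]}

def process_frame(frame_pixels, pre):
    # Real-time path: must finish within one EPID frame time.
    start = time.perf_counter()
    reconstructed = sum(frame_pixels) * pre["scale"]  # toy "3D reconstruction"
    deviation = abs(reconstructed - pre["plan_dose"]) / pre["plan_dose"]
    elapsed = time.perf_counter() - start
    # Halt decision: a large over/underdosage would trigger a linac hold.
    return {"halt": deviation > 0.20, "within_budget": elapsed < FRAME_BUDGET_S}
```

The design point is that the per-frame path contains only image-dependent work, which is what made the 266 ms per-frame figure (well under the 420 ms budget) achievable in the real system.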
NASA Astrophysics Data System (ADS)
Honnell, Kevin; Burnett, Sarah; Yorke, Chloe'; Howard, April; Ramsey, Scott
2017-06-01
The Noh problem is a classic verification problem in the field of compressible flows. Simple to conceptualize, it is nonetheless difficult for numerical codes to predict correctly, making it an ideal code-verification test bed. In its original incarnation, the fluid is a simple ideal gas; once validated, however, these codes are often used to study highly non-ideal fluids and solids. In this work the classic Noh problem is extended beyond the commonly studied polytropic ideal gas to more realistic equations of state (EOS), including the stiff gas, the Noble-Abel gas, and the Carnahan-Starling hard-sphere fluid, thus enabling verification studies to be performed on more physically realistic fluids. Exact solutions are compared with numerical results obtained from the Lagrangian hydrocode FLAG, developed at Los Alamos. For these more realistic EOSs, the simulation errors decreased in magnitude both at the origin and at the shock, but also spread more broadly about these points compared to the ideal EOS. The overall spatial convergence rate remained first order.
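The first-order spatial convergence statement rests on a standard observed-order calculation: with discretization errors e at grid spacings h and h/r, the observed order is p = log(e_h / e_{h/r}) / log(r). A minimal sketch with made-up error values (not FLAG results):

```python
import math

# Observed convergence rate of the kind used in code verification:
# compare errors at successively refined grids against an exact solution.

def observed_order(err_coarse: float, err_fine: float, refinement: float = 2.0) -> float:
    return math.log(err_coarse / err_fine) / math.log(refinement)

# Errors from a hypothetical first-order scheme (e ~ C*h) under halving.
errors = [0.08, 0.04, 0.02]
orders = [observed_order(errors[i], errors[i + 1]) for i in range(2)]
```

An order near 1 across refinements is what "remained first order" means quantitatively; problems with shocks typically cap the observed order at one regardless of the scheme's formal order.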
Verified compilation of Concurrent Managed Languages
2017-11-01
designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these...ideas to prove the correctness of a state-of-the-art concurrent garbage collector. Subject terms: program verification, compiler design. ...Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs.
Pharmacy Automation in Navy Medicine: A Study of Naval Medical Center San Diego
2015-09-01
to pharmacist verification. ...Figure 3. Robotic Delivery System Installed at Naval...medication, caps the vial, and affixes the label. This completed prescription is then placed on the conveyor belt for routing to pharmacist...performing all steps, including transportation, up to pharmacist verification via the conveyor belt. Manual fills are located along the conveyor system
Two-Black Box Concept for Warhead Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Cameron Russell; Frame, Katherine Chiyoko; Mckigney, Edward Allen
2017-03-06
We have created a possible solution for meeting the requirements of certification/authentication while still employing complicated criteria. Technical solutions for protecting information from the host in an inspection environment need to be assessed by those with specific expertise, but LANL can still study the verification problem. The two-black-box framework developed provides another potential solution to the confidence vs. certification paradox.
Verification testing of the Polymem UF120 S2 Ultrafiltration Membrane Module was conducted over a 46-day period at the Green Bay Water Utility Filtration Plant, Luxemburg, Wisconsin. The ETV testing described herein was funded in conjunction with a 12-month membrane pilot study f...
Review of waste package verification tests. Semiannual report, October 1982-March 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soo, P.
1983-08-01
The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled-release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.
The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...
This verification study was a special project designed to determine the efficacy of a draft standard operating procedure (SOP) developed by US EPA Region 3 for the determination of selected glycols in drinking waters that may have been impacted by active unconventional oil and ga...
ERIC Educational Resources Information Center
Kwon, Junehee; Lee, Yee Ming; Park, Eunhye; Wang, Yujia; Rushing, Keith
2017-01-01
Purpose/Objectives: This study assessed current practices and attitudes of school nutrition program (SNP) management staff regarding free and reduced-price (F-RP) meal application and verification in SNPs. Methods: Stratified, randomly selected 1,500 SNP management staff in 14 states received a link to an online questionnaire and/or a printed…
Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas
NASA Astrophysics Data System (ADS)
Bonfiglio, D.; Cappello, S.; Chacon, L.
2010-11-01
A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal nontrivial test bed for 3D nonlinear verification. Perspectives for future application of the fully implicit parallel code PIXIE3D to RFP physics, in particular to address open issues on RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010). [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996). [3] L. Chacón, Phys. Plasmas 15, 056103 (2008). [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010).
Code of Federal Regulations, 2010 CFR
2010-07-01
... confidentiality of, Statements of Account, Verification Auditor's Reports, and other verification information... GENERAL PROVISIONS § 201.29 Access to, and confidentiality of, Statements of Account, Verification Auditor... Account, including the Primary Auditor's Reports, filed under 17 U.S.C. 1003(c) and access to a Verifying...
Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.
1987-06-01
...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat...signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold
Code of Federal Regulations, 2011 CFR
2011-10-01
...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...
Code of Federal Regulations, 2013 CFR
2013-10-01
...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...) 3206-0215, Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...
25 CFR 61.8 - Verification forms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM based audio visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT-based extracted features of the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM based classifier. Fusion of both audio and video modalities for audio visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with the prospect of experimenting on the new PDAtabase developed within the scope of the SecurePhone project.
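GMM-based verification of the kind described typically scores a trial as the log-likelihood of the observed features under the client model minus that under a world (background) model, and accepts when the score clears a threshold. A minimal numpy sketch with invented parameters (not the BECARS implementation):

```python
import numpy as np

def gmm_logpdf(x, weights, means, var):
    # Diagonal-covariance GMM with a shared scalar variance, for brevity.
    d = x.shape[-1]
    comp = [
        np.log(w) - 0.5 * (np.sum((x - m) ** 2) / var + d * np.log(2 * np.pi * var))
        for w, m in zip(weights, means)
    ]
    return np.logaddexp.reduce(comp)  # log-sum-exp over mixture components

def verify(x, client, world, threshold=0.0):
    # Log-likelihood ratio test: client model vs. world/background model.
    score = gmm_logpdf(x, *client) - gmm_logpdf(x, *world)
    return score > threshold

# Toy 2D models: (weights, means, variance). Real systems train these
# on DCT face features or speech features via EM.
client = ([0.5, 0.5], [np.zeros(2), np.ones(2)], 0.5)
world = ([1.0], [np.full(2, 5.0)], 4.0)
```

The same scoring scheme serves both modalities; fusion then combines the audio and video scores before thresholding.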
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
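The verification criteria surveyed (internal quality control, instrument warnings, verification limits, delta checks) can be composed into a minimal autoverification rule, sketched below. The limits and delta thresholds are made-up examples, not values from the survey:

```python
# Illustrative autoverification pass: a result is released only if it
# clears QC/instrument-flag checks, verification limits, and a delta
# check against the patient's previous value.

VERIFICATION_LIMITS = {"glucose": (2.0, 25.0), "potassium": (2.5, 6.5)}  # mmol/L
DELTA_THRESHOLD = {"glucose": 0.50, "potassium": 0.25}  # max relative change

def autoverify(analyte, value, previous=None, qc_ok=True, flags=()):
    if not qc_ok or flags:            # internal QC failure or instrument warning
        return "hold"
    lo, hi = VERIFICATION_LIMITS[analyte]
    if not (lo <= value <= hi):       # outside verification limits
        return "hold"
    if previous is not None:          # delta check vs. prior result
        if abs(value - previous) / previous > DELTA_THRESHOLD[analyte]:
            return "hold"
    return "release"
```

Results returning "hold" would go to technical or medical review, which is the tiered structure (automatic, technical, medical) the survey distinguishes.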
Simulation validation and management
NASA Astrophysics Data System (ADS)
Illgen, John D.
1995-06-01
Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique application of computer-aided software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper describes the ISTI-developed methodology and how CASE tools are used in its support. Case studies are discussed.
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often aims to optimize the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedules and costs is giving rise to a dialectic process inside the project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, coming from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The model philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted.
Similar considerations are summarized for the mechanical testing with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage in combination with the effectiveness of Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. Fig. 1. Model philosophy, Verification and Test Programme definition The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).
Self-verification and depression among youth psychiatric inpatients.
Joiner, T E; Katz, J; Lew, A S
1997-11-01
According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the first test of self-verification theory in a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.
Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon; Bours, Patrick
In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5%, and the identification rate at rank 1 was 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over a previous study using the same data set.
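The reported EER is the operating point where the false accept rate (FAR) equals the false reject rate (FRR) as the decision threshold is swept. A minimal sketch of computing it from distance scores; the score lists below are synthetic, not the paper's data:

```python
# Distance-based verification: accept a comparison when its distance
# score is at or below the threshold.

def far_frr(threshold, genuine, impostor):
    frr = sum(g > threshold for g in genuine) / len(genuine)    # false rejects
    far = sum(i <= threshold for i in impostor) / len(impostor)  # false accepts
    return far, frr

def eer(genuine, impostor):
    # Sweep candidate thresholds and report the rate where FAR ≈ FRR.
    best = min(
        (abs(far - frr), (far + frr) / 2)
        for far, frr in (far_frr(t, genuine, impostor)
                         for t in sorted(genuine + impostor))
    )
    return best[1]

genuine = [0.1, 0.2, 0.3, 0.4]    # distances for same-subject comparisons
impostor = [0.35, 0.5, 0.6, 0.7]  # distances for cross-subject comparisons
```

In the paper's setting, the distance scores would come from matching two sets of detected gait cycles; the 7.5% EER figure summarizes exactly this trade-off curve.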
Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies
NASA Technical Reports Server (NTRS)
Shum, C. K.
2000-01-01
This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification And Improvement Of ERS-1/2 Altimeter Geophysical Data Records For Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing accuracy for the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added ERS-1 mission data (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvement in the global mean sea surface, marine gravity anomaly and bathymetry models, and a study of Antarctica mass balance, which was published in Science in 1998.
Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Miller, Karen A.; Garner, James R.
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF 6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans.
A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.
The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...
49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?
Code of Federal Regulations, 2010 CFR
2010-10-01
... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...
40 CFR 1065.550 - Gas analyzer range verification and drift verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the...
Automated verification of flight software. User's manual
NASA Technical Reports Server (NTRS)
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision of the verification regulation, but also improve the reliability and efficiency of the verification system.
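The "pixel equivalent" named above is the physical length represented by one camera pixel, determined from a reference length imaged by the system. A minimal sketch of that calibration and its use for measuring a graduation interval follows; the function names and numbers are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of pixel-equivalent calibration for vision-based
# length measurement. Values are hypothetical, not from the paper.

def pixel_equivalent(reference_length_mm, measured_pixels):
    """Millimetres represented by one pixel, from a known reference
    length and the number of pixels it spans in the image."""
    return reference_length_mm / measured_pixels

def rule_interval_mm(px_start, px_end, mm_per_px):
    """Measured distance between two graduation-line centres (in px)
    converted to millimetres."""
    return (px_end - px_start) * mm_per_px

mm_per_px = pixel_equivalent(10.0, 2000)        # a 10 mm reference spans 2000 px
interval = rule_interval_mm(150, 2150, mm_per_px)
print(mm_per_px, interval)  # 0.005 10.0
```

In a real system the graduation-line centres would come from sub-pixel edge detection, and the calibration would be repeated across the field of view to correct lens distortion.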
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers' DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
Tegel, Hanna; Yderland, Louise; Boström, Tove; Eriksson, Cecilia; Ukkonen, Kaisa; Vasala, Antti; Neubauer, Peter; Ottosson, Jenny; Hober, Sophia
2011-08-01
Protein production and analysis in a parallel fashion are today applied in laboratories worldwide and there is a great need to improve the techniques and systems used for this purpose. In order to save time and money, a fast and reliable screening method for analysis of protein production and also verification of the protein product is desired. Here, a micro-scale protocol for the parallel production and screening of 96 proteins in plate format is described. Protein capture was achieved using immobilized metal affinity chromatography and the product was verified using matrix-assisted laser desorption ionization time-of-flight MS. In order to obtain sufficiently high cell densities and product yield in the small-volume cultivations, the EnBase® cultivation technology was applied, which enables cultivation in volumes as small as 150 μL. Here, the efficiency of the method is demonstrated by producing 96 human, recombinant proteins, both in micro-scale and using a standard full-scale protocol, and comparing the results in regard to both protein identity and sample purity. The results obtained are highly comparable to those acquired through employing standard full-scale purification protocols, thus validating this method as a successful initial screening step before protein production at a larger scale. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cutting More than Metal: Breaking the Development Cycle
NASA Technical Reports Server (NTRS)
Singer, Chris
2014-01-01
New technology is changing the way we do business at NASA. The ability to use these new tools is made possible by a learning culture able to embrace innovation, flexibility, and prudent risk tolerance, while retaining the hard-won lessons learned from other successes and failures. Technologies such as 3-D manufacturing and structured light scanning are re-shaping the entire product life cycle, from design and analysis, through production, verification, logistics and operations. New fabrication techniques, verification techniques, integrated analysis, and models that follow the hardware from initial concept through operation are reducing the cost and time of building space hardware. Using these technologies to be more efficient, reliable and affordable requires that we bring them to a level safe for NASA systems, maintain appropriate rigor in testing and acceptance, and transition new technology. Maximizing these technologies also requires cultural acceptance and understanding, and balancing rules with creativity. Evolved systems engineering processes at NASA are increasingly more flexible than they have been in the past, enabling the implementation of new techniques and approaches. This paper provides an overview of NASA Marshall Space Flight Center's new approach to development, as well as examples of how that approach has been incorporated into NASA's Space Launch System (SLS) Program, which counts safety, affordability, and sustainability among its key tenets. One of the 3D technologies that will be discussed in this paper is the design and testing of various rocket engine components.
NASA Astrophysics Data System (ADS)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through these shared experiments. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved the forecast errors compared with individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods. The significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
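The two quantities verified above, the error of the ensemble mean versus a control forecast and the growth of ensemble spread, can be sketched on toy gridded values as below. This is an illustrative assumption of the standard verification arithmetic, not the B08RDP verification code, and the numbers are made up.

```python
# Illustrative sketch: RMSE of an ensemble-mean forecast vs. a single
# control forecast, plus ensemble spread, on toy grid-point values.
import math

def rmse(forecast, analysis):
    """Root-mean-square error of a forecast against the analysis."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, analysis))
                     / len(analysis))

def ensemble_mean(members):
    """Point-wise mean over ensemble members."""
    return [sum(vals) / len(vals) for vals in zip(*members)]

def spread(members):
    """Standard deviation of members about the ensemble mean,
    averaged over grid points."""
    mean = ensemble_mean(members)
    n = len(members)
    var = [sum((m[i] - mean[i]) ** 2 for m in members) / n
           for i in range(len(mean))]
    return math.sqrt(sum(var) / len(var))

analysis = [1.0, 2.0, 3.0, 4.0]
control  = [1.5, 2.5, 2.0, 4.5]
members  = [[1.2, 2.4, 2.4, 4.2], [0.8, 1.8, 3.2, 4.4], [1.3, 2.2, 2.8, 3.8]]

print(rmse(control, analysis), rmse(ensemble_mean(members), analysis))
```

With these toy values the ensemble mean beats the control, matching the qualitative result reported for the B08RDP systems; the ME (bias) check discussed in the abstract would simply average the signed differences instead of squaring them.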
Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program
NASA Technical Reports Server (NTRS)
Manobianco, John; Nutter, Paul
1997-01-01
The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts, are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically-consistent thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model is capable of forecasting a majority of cold frontal passages through east central Florida to within ±1 h of observed frontal passage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, J; Hu, W; Xing, Y
Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y direction respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve safety of clinical treatments.
Evaluation of the 29-km Eta Model. Part 1; Objective Verification at Three Selected Stations
NASA Technical Reports Server (NTRS)
Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)
1998-01-01
This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. 
Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
Scheuermann, Taneisha S; Richter, Kimber P; Rigotti, Nancy A; Cummins, Sharon E; Harrington, Kathleen F; Sherman, Scott E; Zhu, Shu-Hong; Tindle, Hilary A; Preacher, Kristopher J
2017-12-01
To estimate the prevalence and predictors of failed biochemical verification of self-reported abstinence among participants enrolled in trials of hospital-initiated smoking cessation interventions. Comparison of characteristics between participants who verified and those who failed to verify self-reported abstinence. Multi-site randomized clinical trials conducted between 2010 and 2014 in hospitals throughout the United States. Recently hospitalized smokers who reported tobacco abstinence 6 months post-randomization and provided a saliva sample for verification purposes (n = 822). Outcomes were salivary cotinine-verified smoking abstinence at 10 and 15 ng/ml cut-points. Predictors and correlates included participant demographics and tobacco use; hospital diagnoses and treatment; and study characteristics collected via surveys and electronic medical records. Usable samples were returned by 69.8% of the 1178 eligible trial participants who reported 7-day point prevalence abstinence. The proportion of participants verified as quit was 57.8% [95% confidence interval (CI) = 54.4, 61.2; 10 ng/ml cut-off] or 60.6% (95% CI = 57.2, 63.9; 15 ng/ml). Factors associated independently with verification at 10 ng/ml were education beyond high school [odds ratio (OR) = 1.51; 95% CI = 1.07, 2.11], continuous abstinence since hospitalization (OR = 2.82; 95% CI = 2.02, 3.94), mailed versus in-person sample (OR = 3.20; 95% CI = 1.96, 5.21) and race. African American participants were less likely to verify abstinence than white participants (OR = 0.64; 95% CI = 0.44, 0.93). Findings were similar for verification at 15 ng/ml. Verification rates did not differ by treatment group. In the United States, a high proportion (40%) of recently hospitalized smokers enrolled in smoking cessation trials fail biochemical verification of their self-reported abstinence. © 2017 Society for the Study of Addiction.
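The verification outcome above is a simple threshold rule: self-reported abstinence counts as biochemically verified when salivary cotinine falls below the cut-point. A minimal sketch on made-up sample values, showing how the verified proportion shifts between the two cut-points used in the study:

```python
# Minimal sketch (not the study's analysis code): classifying saliva
# cotinine results against the 10 and 15 ng/ml cut-points on
# hypothetical sample values.

def verified_abstinent(cotinine_ng_ml, cutoff=10.0):
    """Self-reported abstinence is biochemically verified when
    salivary cotinine is below the cut-point."""
    return cotinine_ng_ml < cutoff

samples = [3.2, 8.9, 12.4, 45.0, 14.1]  # hypothetical ng/ml readings
rate_10 = sum(verified_abstinent(s, 10.0) for s in samples) / len(samples)
rate_15 = sum(verified_abstinent(s, 15.0) for s in samples) / len(samples)
print(rate_10, rate_15)  # 0.4 0.8
```

As in the study, a higher cut-point verifies more participants (here the 12.4 and 14.1 ng/ml samples flip from failed to verified); the reported odds ratios would come from a logistic regression on the resulting binary outcome.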
DOE Office of Scientific and Technical Information (OSTI.GOV)
Capozzi, D.; et al.
We present the first study of the evolution of the galaxy luminosity and stellar-mass functions (GLF and GSMF) carried out by the Dark Energy Survey (DES). We describe the COMMODORE galaxy catalogue selected from Science Verification images. This catalogue is made of ~4×10^6 galaxies at 0…
Face Verification across Age Progression using Discriminative Methods
2008-01-01
progression. The most related study to our work is [30], where the probabilistic eigenspace framework [22] is adapted for face identification across...solution has the same CAR and CRR, is frequently used to measure verification performance, B. Gradient Orientation and Gradient Orientation Pyramid Now we...proposed GOP representation. The other five approaches are different from our method in both representations and classification frameworks. For
Comments for A Conference on Verification in the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, James E.
2012-06-12
The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is ''are they effective in supporting the objectives of the treaty or agreement?'' In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep. That is ''how does one verify limitations on nuclear warheads in national stockpiles?'' (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.
Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...
Model Based Verification of Cyber Range Event Environments
2015-12-10
Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error
The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-03-01
The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.
Simple method to verify OPC data based on exposure condition
NASA Astrophysics Data System (ADS)
Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu
2006-03-01
In a world where sub-100 nm lithography tools are everyday equipment for device makers, devices are shrinking at a rate no one ever imagined. With device shrinkage at such a high rate, the demand placed on optical proximity correction (OPC) is greater than ever before. Meeting this demand at the current shrink rate requires more aggressive OPC tactics. Aggressive OPC tactics are a must for sub-100 nm lithography technology, but they eventually leave greater room for OPC error and increase the complexity of the OPC data. Until now, optical rule check (ORC) or design rule check (DRC) was used to verify these complex OPC errors, but each of these methods has its pros and cons. ORC verification of OPC data is rather accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC, however, has no such disadvantage, but its verification accuracy is poor process-wise. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: it inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new verification method was applied to the 80 nm-tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.
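A rule-based check like the one described, verifying that the applied OPC bias is consistent with the exposure condition, can be sketched as a lookup of the expected bias band for each feature's pitch. The rule table, pitch bands, and bias values below are hypothetical illustrations, not the authors' production rules.

```python
# Illustrative sketch (assumptions, not the authors' tool): flag OPC
# features whose applied bias falls outside the band expected for
# their pitch under a given illumination condition.

RULE_TABLE = [  # (min_pitch_nm, max_pitch_nm, min_bias_nm, max_bias_nm)
    (0,   200,    2.0,  8.0),   # dense lines: small bias expected
    (200, 500,    4.0, 12.0),   # semi-dense
    (500, 10**9,  6.0, 16.0),   # isolated lines: larger bias expected
]

def check_bias(pitch_nm, bias_nm):
    """Return True if the OPC bias is inside the rule band for this pitch."""
    for lo, hi, bmin, bmax in RULE_TABLE:
        if lo <= pitch_nm < hi:
            return bmin <= bias_nm <= bmax
    return False

# Hypothetical (pitch, applied bias) pairs extracted from OPC data
features = [(180, 5.0), (350, 3.0), (800, 10.0)]
flags = [(p, b, check_bias(p, b)) for p, b in features]
print(flags)  # the 350 nm-pitch feature is out of band
```

Because this is a table lookup per feature rather than a simulation per edge, it runs at DRC-like speed while the table itself encodes process (illumination-dependent) knowledge, which is the trade-off the abstract describes.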
The 2014 Sandia Verification and Validation Challenge: Problem statement
Hu, Kenneth; Orient, George
2016-01-18
This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.
A new technique for measuring listening and reading literacy in developing countries
NASA Astrophysics Data System (ADS)
Greene, Barbara A.; Royer, James M.; Anzalone, Stephen
1990-03-01
One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.
An unattended verification station for UF 6 cylinders: Field trial findings
Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...
2017-08-26
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
An ultrasonic flaw detector with a remote-control interface was studied and an automatic verification system for it was developed. By using Extensible Markup Language to build the instruction-set and data-analysis method databases, the system software makes the controller design configurable and accommodates the diversity of unpublished device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error-compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification
NASA Technical Reports Server (NTRS)
Melton, D. M.
1998-01-01
Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing of components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specification. The approach consisted of (1) selection of a supersonic gas-liquid cleaning system; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-dichloroethylene), and HFE 7100DE (HFE/1,2-dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.
Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S R; Bihari, B L; Salari, K
As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su-Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
NASA Technical Reports Server (NTRS)
1978-01-01
The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.
Bias in estimating accuracy of a binary screening test with differential disease verification
Brinton, John T.; Ringham, Brandy M.; Glueck, Deborah H.
2011-01-01
Sensitivity, specificity, and positive and negative predictive value are typically used to quantify the accuracy of a binary screening test. In some studies it may not be ethical or feasible to obtain definitive disease ascertainment for all subjects using a gold standard test. When a gold standard test cannot be used, an imperfect reference test that is less than 100% sensitive and specific may be used instead. In breast cancer screening, for example, follow-up for cancer diagnosis is used as an imperfect reference test for women for whom it is not possible to obtain gold standard results. This incomplete ascertainment of true disease, or differential disease verification, can result in biased estimates of accuracy. In this paper, we derive the apparent accuracy values for studies subject to differential verification. We determine how the bias is affected by the accuracy of the imperfect reference test, the percentage of subjects who receive the imperfect reference test rather than the gold standard, the prevalence of the disease, and the correlation between the results of the screening test and the imperfect reference test. It is shown that designs with differential disease verification can yield biased estimates of accuracy. Estimates of sensitivity in cancer screening trials may be substantially biased. However, careful design decisions, including selection of the imperfect reference test, can help to minimize bias. A hypothetical breast cancer screening study is used to illustrate the problem. PMID:21495059
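One simple way the bias described above can arise can be sketched in closed form under a conditional-independence assumption between the screening test and the imperfect reference test. This assumption and the parameter names are ours, for illustration only; they are not the authors' model:

```python
def apparent_sensitivity(prev, se_s, sp_s, se_r, sp_r):
    """Apparent sensitivity of a screening test when screen-positives receive a
    gold standard and screen-negatives receive only an imperfect reference test.
    Assumes conditional independence of the two tests given disease status.

    prev        -- disease prevalence
    se_s, sp_s  -- true sensitivity/specificity of the screening test
    se_r, sp_r  -- sensitivity/specificity of the imperfect reference test
    """
    tp = prev * se_s                           # screen+, confirmed by gold standard
    fn_true = prev * (1 - se_s) * se_r         # diseased screen-, caught by reference
    fn_false = (1 - prev) * sp_s * (1 - sp_r)  # healthy screen-, miscalled diseased
    return tp / (tp + fn_true + fn_false)

# A perfect reference (se_r = sp_r = 1) recovers the true sensitivity:
print(apparent_sensitivity(0.01, 0.8, 0.9, 1.0, 1.0))  # -> 0.8
# A specific but insensitive reference hides missed cases, inflating it:
print(apparent_sensitivity(0.01, 0.8, 0.9, 0.6, 1.0))
```

With an insensitive follow-up (se_r < 1, sp_r = 1), diseased screen-negatives go uncounted, so the apparent sensitivity exceeds the true value, consistent with the inflation the abstract warns about in cancer screening trials.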
A calibration method for patient specific IMRT QA using a single therapy verification film
Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.
2013-01-01
Aim The aim of the present study is to develop and verify the single-film calibration procedure used in intensity-modulated radiation therapy (IMRT) quality assurance. Background Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Radiation dosimetry based on radiographic film can give absolute two-dimensional dose distributions and is preferred for IMRT quality assurance; the single therapy verification film in particular provides a quick and reliable method for IMRT verification. Materials and methods A single extended dose range (EDR 2) film was used to generate the sensitometric curve of film optical density versus radiation dose. The EDR 2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam obtained from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation, scanned using a VIDAR film scanner, and the value of optical density was noted for each region. Ten IMRT plans of head and neck carcinoma were used for verification using a dynamic IMRT technique, and evaluated using the gamma index method against the TPS-calculated dose distribution. Results A sensitometric curve was generated using a single film exposed at nine field regions to enable quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans verified against the calibration curve using the gamma index method were found to be within acceptance criteria.
Conclusion The single-film method proved to be superior to the traditional calibration method and produces fast daily film calibration for highly accurate IMRT verification. PMID:24416558
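The calibration workflow above reduces to a lookup from measured optical density to dose via the sensitometric curve. A minimal sketch with hypothetical nine-point calibration values (the paper's measured densities are not reproduced here; a piecewise-linear lookup stands in for the fitted curve):

```python
# Hypothetical calibration data for the nine film regions: delivered dose
# (cGy, spanning the paper's 10-362 cGy range) and measured net optical
# density. Illustrative values only, not the authors' measurements.
dose_cgy = [10.0, 30.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0, 362.0]
optical_density = [0.08, 0.21, 0.39, 0.60, 0.83, 1.02, 1.18, 1.31, 1.45]

def od_to_dose(od: float) -> float:
    """Convert a scanned optical density to dose by piecewise-linear
    interpolation along the sensitometric calibration points."""
    if od <= optical_density[0]:
        return dose_cgy[0]
    for (od0, d0), (od1, d1) in zip(zip(optical_density, dose_cgy),
                                    zip(optical_density[1:], dose_cgy[1:])):
        if od <= od1:
            return d0 + (d1 - d0) * (od - od0) / (od1 - od0)
    return dose_cgy[-1]  # clamp above the calibrated range

print(od_to_dose(0.60))  # at a calibration point -> 100.0
```

Applying this mapping pixel-by-pixel to a scanned verification film yields the absolute 2D dose map that is then compared to the TPS calculation with the gamma index.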
SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tachibana, H; Tachibana, R
2015-06-15
Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study for volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the difference shows a systematic shift (4.5% ± 1.9%) in comparison with the AC with the inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
Using Teamcenter engineering software for a successive punching tool lifecycle management
NASA Astrophysics Data System (ADS)
Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.
2015-11-01
The paper presents the results of studies and research on the implementation of Teamcenter (TC) integrated product lifecycle management in a virtual enterprise. The results can also be implemented in a real enterprise. The product considered was a successive punching and cutting tool, designed to produce a sheet-metal part. The paper defines the technical documentation flow (flow of information) in the process of computer-aided constructive design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and can be transferred to other departments of the company in order to supply the necessary materials and resources to achieve the final product. This paper describes the procedure to modify or change certain dimensions of the sheet-metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the following lifecycle phase, the manufacturing process. For each operation of the technological process the corresponding phases are described in detail. Teamcenter enables the description of the manufacturing company structure, including the workstations that carry out the various operations of the manufacturing process. The paper shows that the implementation of Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching for, verifying, and correcting documentation, while ensuring the uniqueness and completeness of the product data.
From Verified Models to Verifiable Code
NASA Technical Reports Server (NTRS)
Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
NASA Technical Reports Server (NTRS)
1989-01-01
The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.
Quantum money with classical verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavinsky, Dmitry
We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
Towards composition of verified hardware devices
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, G. C.
1991-01-01
Computers are being used where no affordable level of testing is adequate. Safety and life critical systems must find a replacement for exhaustive testing to guarantee their correctness: a mathematical proof. Hardware verification research has focused on device verification and has largely ignored system composition verification. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.
Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report
NASA Technical Reports Server (NTRS)
Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.
2017-01-01
This report provides an overview and results from the verification of the specifications that defines the operational capabilities of the airborne and ground, L Band and C Band, Command and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radios operation.
40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification. (c...
Formal verification of an oral messages algorithm for interactive consistency
NASA Technical Reports Server (NTRS)
Rushby, John
1992-01-01
The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
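The solution-verification step based on Richardson extrapolation can be illustrated on a manufactured example. A minimal sketch, assuming a constant grid-refinement ratio r and a single dominant error term (the standard textbook setting, not the GBS-specific procedure):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three grids with a
    constant refinement ratio r: p = ln((f1 - f2)/(f2 - f3)) / ln(r),
    where f1, f2, f3 are the coarse, medium, and fine solutions."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Richardson estimate of the grid-converged solution from the two
    finest grids; the correction term estimates the numerical error."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Manufactured solution with a pure second-order error term: f(h) = 1 + 0.5 h^2
f = lambda h: 1.0 + 0.5 * h**2
p = observed_order(f(0.4), f(0.2), f(0.1), r=2.0)
print(round(p, 6))                                               # -> 2.0
print(round(richardson_extrapolate(f(0.2), f(0.1), 2.0, p), 6))  # -> 1.0
```

Recovering the formal order of accuracy (here p = 2) and an extrapolated value matching the exact solution is precisely the evidence that solution verification seeks.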
ERIC Educational Resources Information Center
Scholes, Robert J.; And Others
The effects of sentence imitation and picture verification on the recall of subsequent digits were studied. Stimuli consisted of 20 sentences, each sentence followed by a string of five digit names, and five structural types of sentences were presented. Subjects were instructed to listen to the sentence and digit string and then either immediately…
ADVANCED SURVEILLANCE OF ENVIRONMENTAL RADIATION IN AUTOMATIC NETWORKS.
Benito, G; Sáez, J C; Blázquez, J B; Quiñones, J
2018-06-01
The objective of this study is the verification of the operation of a radiation monitoring network composed of several sensors. The malfunction of a surveillance network has security and economic consequences, which derive from its maintenance and could be avoided with early detection. The proposed method is based on a kind of multivariate distance, and the methodology has been verified on CIEMAT's local radiological early warning network.
The verification of LANDSAT data in the geographical analysis of wetlands in west Tennessee
NASA Technical Reports Server (NTRS)
Rehder, J.; Quattrochi, D. A.
1978-01-01
The reliability of LANDSAT imagery as a medium for identifying, delimiting, monitoring, measuring, and mapping wetlands in west Tennessee was assessed to verify LANDSAT as an accurate, efficient cartographic tool that could be employed by a wide range of users to study wetland dynamics. The verification procedure was based on the visual interpretation and measurement of multispectral imagery. The accuracy testing procedure was predicated on surrogate ground truth data gleaned from medium altitude imagery of the wetlands. Fourteen sites or case study areas were selected from individual 9 x 9 inch photo frames on the aerial photography. These sites were then used as data control calibration parameters for assessing the cartographic accuracy of the LANDSAT imagery. An analysis of results obtained from the verification tests indicated that 1:250,000 scale LANDSAT data were the most reliable scale of imagery for visually mapping and measuring wetlands using the area grid technique. The mean areal percentage of accuracy was 93.54 percent (real) and 96.93 percent (absolute). As a test of accuracy, the LANDSAT 1:250,000 scale overall wetland measurements were compared with an area cell mensuration of the swamplands from 1:130,000 scale color infrared U-2 aircraft imagery. The comparative totals substantiated the results from the LANDSAT verification procedure.
Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker
Aguilar, Juan José
2014-01-01
This paper presents a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for a particular machine, it presents a methodology that can be used on any machine type. The paper presents the schema and kinematic model of a machine with three axes of movement, two linear and one rotary, including the measurement system and the nominal rotation matrix of the rotary axis. Using this, the machine tool volumetric error is obtained and nonlinear optimization techniques are employed to improve the accuracy of the machine tool. The verification provides a mathematical, not physical, compensation, in less time than other verification methods, by means of the indirect measurement of the geometric errors of the machine from the linear and rotary axes. This paper presents an extensive study of the appropriateness and drawbacks of the regression function employed depending on the types of movement of the axes of any machine. In the same way, strengths and weaknesses of measurement methods and optimization techniques depending on the space available to place the measurement system are presented. These studies provide the most appropriate strategies to verify each machine tool taking into consideration its configuration and its available work space. PMID:25202744
NASA Technical Reports Server (NTRS)
Kellogg, E.; Brissenden, R.; Flanagan, K.; Freeman, M.; Hughes, J.; Jones, M.; Ljungberg, M.; Mckinnon, P.; Podgorski, W.; Schwartz, D.
1992-01-01
Advanced X-ray Astrophysics Facility (AXAF) X-ray optics testing is conducted by VETA-I, which consists of six nested Wolter type I grazing-incidence mirrors; VETA's X-ray Detection System (VXDS) in turn measures the imaging properties of VETA-I, yielding FWHM and encircled energy of the X-ray image obtained, as well as its effective area. VXDS contains a high resolution microchannel plate imaging X-ray detector and a pinhole scanning system in front of proportional-counter detectors. VETA-I's X-ray optics departs from the AXAF flight configuration in that it uses a temporary holding fixture; its mirror elements are not cut to final length, and are not coated with the metal film used to maximize high-energy reflection.
Fracture mechanics concepts in reliability analysis of monolithic ceramics
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.; Gyekenyesi, John P.
1987-01-01
Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
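The two- and three-parameter Weibull reliability calculation referred to above can be sketched directly. A minimal illustration for uniaxial stress (the parameter values are hypothetical; the multiaxial formulation in the paper involves an integral over the stressed volume that is omitted here):

```python
import math

def weibull_failure_probability(stress, sigma_0, m, sigma_u=0.0):
    """Weibull failure probability for a brittle component under uniaxial
    stress: Pf = 1 - exp(-((s - su)/s0)^m).

    sigma_0 -- characteristic strength (scale parameter)
    m       -- Weibull modulus (shape parameter; higher m = less scatter)
    sigma_u -- threshold strength, the third Weibull parameter, below which
               failure does not occur (often taken as zero for simplicity,
               giving the two-parameter form).
    """
    if stress <= sigma_u:
        return 0.0
    return 1.0 - math.exp(-(((stress - sigma_u) / sigma_0) ** m))

# At stress equal to the characteristic strength, Pf = 1 - 1/e for any m:
print(round(weibull_failure_probability(300.0, 300.0, 10.0), 4))  # -> 0.6321
# A nonzero threshold strength guarantees survival below it:
print(weibull_failure_probability(50.0, 300.0, 10.0, sigma_u=60.0))  # -> 0.0
```

Component reliability is then 1 - Pf, and nondestructive evaluation data from verification testing supply the statistical estimates of sigma_0, m, and sigma_u.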
Verification of the ASTM G-124 Purge Equation
NASA Technical Reports Server (NTRS)
Robbins, Katherine E.; Davis, Samuel Eddie
2009-01-01
ASTM G-124 seeks to evaluate combustion characteristics of metals in high-purity (greater than 99%) oxygen atmospheres. ASTM G-124 provides the following equation to determine the minimum number of purges required to reach this level of purity in a test chamber: n = -4/log10(Pa/Ph), where "n" is the total number of purge cycles required, Ph is the absolute pressure used for the purge on each cycle and Pa is the atmospheric pressure or the vent pressure. The origin of this equation is not known and has been the source of frequent questions as to its accuracy and reliability. This paper shows the derivation of the G-124 purge equation, and experimentally explores the equation to determine if it accurately predicts the number of cycles required.
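The purge equation follows from repeated dilution: each vent-and-repressurize cycle multiplies the residual contaminant fraction by Pa/Ph, so n cycles leave (Pa/Ph)^n, and requiring a residual of 10^-4 gives n = -4/log10(Pa/Ph). A minimal sketch of the equation (the pressure values are illustrative, not from the paper):

```python
import math

def purge_cycles(p_atm: float, p_purge: float) -> int:
    """Minimum number of purge cycles per the ASTM G-124 purge equation,
    n = -4 / log10(Pa/Ph), rounded up to a whole cycle.

    p_atm   -- Pa, atmospheric/vent absolute pressure
    p_purge -- Ph, absolute purge pressure each cycle (same units, > p_atm)
    """
    n = -4.0 / math.log10(p_atm / p_purge)
    return math.ceil(n)

# A 10:1 purge-to-vent pressure ratio dilutes by 0.1 per cycle, so four
# cycles reach the 1e-4 residual target:
print(purge_cycles(14.7, 147.0))  # -> 4
print(purge_cycles(14.7, 45.0))   # ~3:1 ratio needs more cycles -> 9
```

The exponent 4 is thus the assumed purity requirement (four decades of dilution, i.e. 99.99% purge gas), consistent with the greater-than-99% oxygen purity the test method targets.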
NASA Astrophysics Data System (ADS)
Petric, Martin Peter
This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems were compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. 
Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199, School of Computer Science, Carnegie Mellon University. Keywords: real-time systems, formal verification, symbolic
Kleene Algebra and Bytecode Verification
2016-04-27
computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations ... potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java, bytecode, verification, static
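The matrix-star construction this record refers to can be made concrete on a simple Kleene algebra. A sketch using the tropical (min,+) semiring, where the star of a weight matrix yields least-weight paths, a stand-in for composing dataflow transfer functions over all program paths (this instance is ours, not the paper's bytecode formulation):

```python
def matrix_star(m):
    """Kleene star of a square matrix over the tropical (min,+) semiring,
    via the Floyd-Warshall-Kleene recurrence. In this algebra a* = 0 for
    any a >= 0, so the closure adds a zero-weight identity on the diagonal;
    entry (i, j) of the result is the least total weight over all paths
    from i to j (the dataflow analogue of the combined transfer function)."""
    n = len(m)
    a = [row[:] for row in m]
    for i in range(n):
        a[i][i] = min(a[i][i], 0.0)  # empty path contributes the identity
    for k in range(n):
        for i in range(n):
            for j in range(n):
                a[i][j] = min(a[i][j], a[i][k] + a[k][j])
    return a

inf = float("inf")  # semiring zero: no edge
weights = [[inf, 1.0, 4.0],
           [inf, inf, 2.0],
           [inf, inf, inf]]
print(matrix_star(weights)[0][2])  # -> 3.0 (path 0->1->2 beats the direct edge)
```

Replacing (min, +) with (join, compose) over transfer functions gives the fixed-point computation that the abstract compares against the standard worklist algorithm.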
Security Verification of Secure MANET Routing Protocols
2012-03-22
SECURITY VERIFICATION OF SECURE MANET ROUTING PROTOCOLS. Thesis, Matthew F. Steele, Captain, USAF, AFIT/GCS/ENG/12-03. Department of the Air Force, Air Force Institute of Technology; presented to the Faculty, Department of Electrical and Computer Engineering. Distribution unlimited.
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
Radiation loss of planar surface plasmon polaritons transmission lines at microwave frequencies.
Xu, Zhixia; Li, Shunli; Yin, Xiaoxing; Zhao, Hongxin; Liu, Leilei
2017-07-21
Radiation loss of a typical spoof surface plasmon polaritons (SSPPs) transmission line (TL) is investigated in this paper. A 325 mm-long SSPPs TL is designed and fabricated. Simulated results show that radiation loss contributes more to transmission loss than dielectric loss and conductor loss from 2 GHz to 10 GHz. Radiation loss of the SSPPs TL can be divided into two parts: one caused by the input mode converter and the other caused by the corrugated metallic strip. This paper explains the mechanisms of radiation loss from these different parts. A loaded SSPPs TL with a series of resistors is designed to absorb electromagnetic energy on the corrugated metallic strip, allowing the radiation loss of the input mode converter to be isolated. The concept of average radiation length (ARL) is proposed to evaluate radiation loss from SSPPs of finite length, and it is concluded that radiation loss is dominated by the corrugated structure of finite length at the low frequency band and by the input mode converter at the high frequency band. To suppress radiation loss, a mixed slow wave TL based on the combination of coplanar waveguides (CPWs) and SSPPs is presented. The designed structure, sample fabrication, and experimental verification are discussed.
NASA Astrophysics Data System (ADS)
Fujiwara, Kohei; Nishihara, Kazuki; Shiogai, Junichi; Tsukazaki, Atsushi
2017-05-01
Wide-bandgap oxides exhibiting high electron mobility hold promise for the development of useful electronic and optoelectronic devices as well as for basic research on two-dimensional electron transport phenomena. A perovskite-type tin oxide, BaSnO3, is currently one such target owing to its distinctly high mobility at room temperature. The challenge to overcome before BaSnO3 thin films can be used in applications is the suppression of dislocation scattering, one of the dominant scattering origins for electron transport. Here, we show that the mobility of the BaSnO3 electric-double-layer transistor reaches 300 cm2 V-1 s-1 at 50 K. The improved mobility indicates that charged dislocation scattering is effectively screened by electrostatically doped high-density charge carriers. We also observed metallic conduction persisting down to 2 K, which is attributed to the transition to a degenerate semiconductor. The experimental verification of bulk-level mobility at the densely accumulated surface sheds more light on the importance of suppressing dislocation scattering by interface engineering in doped BaSnO3 thin films for transparent electrode applications.
NASA Astrophysics Data System (ADS)
Lansey, Eli
Optical or photonic metamaterials that operate in the infrared and visible frequency regimes show tremendous promise for solving problems in renewable energy, infrared imaging, and telecommunications. However, many of the theoretical and simulation techniques used at lower frequencies are not applicable to this higher-frequency regime. Furthermore, the technological and financial limitations of photonic metamaterial fabrication increase the importance of reliable theoretical models and computational techniques for predicting the optical response of photonic metamaterials. This thesis focuses on aperture array metamaterials: rectangular, circular, or other shaped cavities or holes embedded in, or penetrating through, a metal film. The research in the first portion of this dissertation reflects our interest in developing a fundamental, theoretical understanding of the behavior of light's interaction with these aperture arrays, specifically regarding enhanced optical transmission. We develop an approximate boundary condition for metals at optical frequencies, and a comprehensive, analytical explanation of the physics underlying this effect. These theoretical analyses are augmented by computational techniques in the second portion of this thesis, used both for verification of the theoretical work and for solving more complicated structures. Finally, the last portion of this thesis discusses the results from designing, fabricating, and characterizing a light-splitting metamaterial.
A Practitioners Perspective on Verification
NASA Astrophysics Data System (ADS)
Steenburgh, R. A.
2017-12-01
NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and the Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.
The Environmental Technology Verification Program, established by the EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance.
14 CFR 460.17 - Verification program.
Code of Federal Regulations, 2011 CFR
2011-01-01
... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...
14 CFR 460.17 - Verification program.
Code of Federal Regulations, 2010 CFR
2010-01-01
... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...
14 CFR 460.17 - Verification program.
Code of Federal Regulations, 2012 CFR
2012-01-01
... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...
14 CFR 460.17 - Verification program.
Code of Federal Regulations, 2013 CFR
2013-01-01
... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...
14 CFR 460.17 - Verification program.
Code of Federal Regulations, 2014 CFR
2014-01-01
... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems in face verification with subspace learning techniques: selection of the verification threshold, automatic determination of the subspace dimension, and deduction of the feature-fusing weights. In contrast to previous algorithms, which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). For a given verification threshold, this matrix is learned by a semidefinite programming approach, subject to the constraints that kindred pairs have similarity larger than the threshold and inhomogeneous pairs have similarity smaller than the threshold. Then, the subspace dimension and the feature-fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
Limitations in learning: How treatment verifications fail and what to do about it?
Richardson, Susan; Thomadsen, Bruce
The purposes of this study were to examine why classic incident learning systems have been insufficient for patient safety improvement, to discuss failures in treatment verification, and to provide context for the lessons that can be learned from these failures. Historically, incident learning in brachytherapy is performed via database mining, which might include reading event reports and incidents and then incorporating verification procedures to prevent similar incidents. A description of both classic event reporting databases and current incident learning and reporting systems is given. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described and analyzed by outlining potential pitfalls and problems; these include both under-verification and over-verification of various treatment processes. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Database mining is an insufficient method for effecting substantial improvements in the practice of brachytherapy, and new incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied, showing through a practical example how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
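The method of manufactured solutions named in this abstract can be illustrated with a toy example: choose an exact solution, derive the matching source term analytically, solve the discretized equation, and confirm that the observed order of accuracy matches the scheme's formal order. A minimal sketch with a 1-D Poisson problem and second-order central differences (an illustration of the general technique, not the GBS plasma model):

```python
import math

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with the manufactured
    solution u(x) = sin(pi*x), so f(x) = pi^2 * sin(pi*x).
    Second-order central differences; returns the max nodal error."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]
    # tridiagonal system (-u[i-1] + 2u[i] - u[i+1]) = h^2 f[i], i = 1..n-1
    m = n - 1
    a, b, c = [-1.0] * m, [2.0] * m, [-1.0] * m   # sub-, main, super-diagonal
    d = [f[i + 1] * h * h for i in range(m)]
    for i in range(1, m):                          # Thomas forward sweep
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * m                                  # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(u[i] - math.sin(math.pi * x[i + 1])) for i in range(m))

# halving h should divide the error by ~4 for a second-order scheme
e1, e2 = solve_poisson(16), solve_poisson(32)
order = math.log(e1 / e2, 2)
```

An observed `order` near 2 verifies that the discrete equations are solved at the scheme's formal order of accuracy, which is exactly the check the code-verification step performs, only on far richer model equations.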
Verification in Referral-Based Crowdsourcing
Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.
2012-01-01
Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
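The abstract notes that the optimal compensation scheme coincides with the winning strategy of the DARPA Red Balloon Challenge. That strategy's recursive payout rule, in which the finder receives a fixed reward and each person up the referral chain receives half of the reward of the person below, can be sketched as follows (the dollar figures are the publicly reported MIT team values, assumed here for illustration; the paper derives the optimal scheme formally):

```python
def referral_payouts(chain_length, finder_reward=2000.0):
    """Geometric halving along a referral chain: the finder gets the
    full finder_reward, the finder's inviter half of that, and so on.
    Returns payouts from the finder up to the chain's root."""
    payouts = []
    reward = finder_reward
    for _ in range(chain_length):
        payouts.append(reward)
        reward /= 2.0
    return payouts
```

A useful property of the halving rule is that the total paid along any chain, however long, is bounded by twice the finder's reward, so the organizer's cost stays capped while every participant in the chain retains an incentive to recruit.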
Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach
NASA Technical Reports Server (NTRS)
Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip
2017-01-01
While many widely accepted methods and techniques exist for validation and verification of traditional controllers, no solutions have yet been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the inability to verify FLCs against such requirements limits the applications of the technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specifying requirements for a complex system, converting a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.
Corrigan, Damion K; Piletsky, Sergey; McCrossen, Sean
2009-01-01
This article compares the technical performance of several commercially available swabbing materials for the purpose of cleaning verification. A steel surface was soiled with solutions of acetaminophen, nicotinic acid, diclofenac, and benzamidine and wiped with each swabbing material. The compounds were extracted with water or ethanol (depending on the polarity of the analyte) and their concentrations in the extracts were quantified spectrophotometrically. The study also investigated swab debris left on the wiped surface. The swab performances were compared and the best swab material was identified.