38 CFR 74.15 - What length of time may a business participate in VetBiz VIP Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... business participate in VetBiz VIP Verification Program? 74.15 Section 74.15 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) VETERANS SMALL BUSINESS REGULATIONS Application Guidelines § 74.15 What length of time may a business participate in VetBiz VIP Verification Program? (a) A...
NASA Technical Reports Server (NTRS)
Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.
1993-01-01
To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.
Time-space modal logic for verification of bit-slice circuits
NASA Astrophysics Data System (ADS)
Hiraishi, Hiromi
1996-03-01
The major goal of this paper is to propose a new modal logic aimed at formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is shown. This is applicable to verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.
Manikandan, A.; Biplab, Sarkar; David, Perianayagam A.; Holla, R.; Vivek, T. R.; Sujatha, N.
2011-01-01
For high dose rate (HDR) brachytherapy, independent treatment verification is needed to ensure that the treatment is performed as prescribed. This study demonstrates dosimetric quality assurance of HDR brachytherapy using a commercially available two-dimensional ion chamber array, IMatriXX, which has a detector separation of 0.7619 cm. The reference isodose length, step size, and source dwell positional accuracy were verified. A total of 24 dwell positions verified for positional accuracy gave a total error (systematic and random) of –0.45 mm, with a standard deviation of 1.01 mm and a maximum error of 1.8 mm. Using a step size of 5 mm, the reference isodose length (the length of the 100% isodose line) was verified for single and multiple catheters with the same and different source loadings. An error ≤1 mm was measured in 57% of the tests analyzed. Step size verification for 2, 3, 4, and 5 cm was performed; 70% of the step size errors were below 1 mm, with a maximum of 1.2 mm. Step sizes ≤1 cm could not be verified by the IMatriXX because it could not resolve the peaks in the dose profile. PMID:21897562
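The positional-accuracy figures quoted above (total/systematic error, standard deviation, maximum) are summary statistics over the measured dwell-position offsets. A minimal sketch of that computation, using hypothetical offset values rather than the study's data:

```python
import math

# Hypothetical dwell-position errors (measured - planned), in mm.
errors_mm = [-0.4, 0.6, -1.2, 0.3, -0.9, 1.8, -0.5, 0.0]

mean_error = sum(errors_mm) / len(errors_mm)            # systematic component
sd_error = math.sqrt(sum((e - mean_error) ** 2 for e in errors_mm)
                     / (len(errors_mm) - 1))            # random component
max_error = max(abs(e) for e in errors_mm)

print(f"mean={mean_error:.2f} mm, sd={sd_error:.2f} mm, max={max_error:.1f} mm")
```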
NASA Astrophysics Data System (ADS)
Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong
2007-03-01
As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group involved, including process and design, is interested in the full-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and can thus extract full-chip CD variation. From these results, OPC abnormalities were identified and design feedback items were also disclosed. A complementary approach is pursued by EDA companies: model-based OPC verification, performed over the full chip area using a well-calibrated model. Its object is to predict potential weak points on the wafer and to feed back quickly to OPC and design before reticle fabrication. To achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. We therefore evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a large amount of wafer data was classified and analyzed by statistical methods and sorted into OPC-feedback and design-feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.
Determining the hospital trauma financial impact in a statewide trauma system.
Mabry, Charles D; Kalkwarf, Kyle J; Betzold, Richard D; Spencer, Horace J; Robertson, Ronald D; Sutherland, Michael J; Maxson, Robert T
2015-04-01
There have been no comprehensive studies across an organized statewide trauma system using a standardized method to determine cost. Trauma financial impact includes the following costs: verification, response, and patient care cost (PCC). We conducted a survey of participating trauma centers (TCs) for federal fiscal year 2012, including separate accounting for verification and response costs. Patient care cost was merged with their trauma registry data. Seventy-five percent of the 2012 state trauma registry had data submitted. Each TC's reasonable cost from the Medicare Cost Report was adjusted to remove embedded costs for response and verification. Cost-to-charge ratios were used to give uniform PCC across the state. Median (mean ± SD) costs per patient for TC response and verification were $1,689 ($1,492 ± $647) for Level I and II centers and $450 ($636 ± $431) for Level III and IV centers. Median (mean ± SD) PCC for patients with a length of stay >2 days rose with increasing Injury Severity Score (ISS): ISS <9: $6,787 ($8,827 ± $8,165); ISS 9 to 15: $10,390 ($14,340 ± $18,395); ISS 16 to 25: $15,698 ($23,615 ± $21,883); and ISS 25+: $29,792 ($41,407 ± $41,621); and with higher level of TC: Level I: $13,712 ($23,241 ± $29,164); Level II: $8,555 ($13,515 ± $15,296); and Levels III and IV: $8,115 ($10,719 ± $11,827). PCC rose with increasing ISS, length of stay, ICU days, and ventilator days for patients with length of stay >2 days and ISS 9+. Level I centers had the highest mean ISS, length of stay, ICU days, and ventilator days, along with the highest PCC. Lesser trauma accounted for lower charges, payments, and PCC for Level II, III, and IV TCs, and the margin was variable. Verification and response costs per patient were highest for Level I and II TCs. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
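The cost-to-charge ratio (CCR) step described above is a simple scaling: billed charges multiplied by a hospital-specific ratio yield a patient care cost comparable across centers. A minimal sketch with hypothetical numbers (the ratio and charges below are illustrative, not from the study):

```python
# Hypothetical hospital-specific cost-to-charge ratio (CCR), e.g. derived
# from the Medicare Cost Report. Multiplying billed charges by the CCR
# gives a uniform patient care cost (PCC) across trauma centers.
ccr = 0.35

charges = [24500.0, 61200.0, 9800.0]   # hypothetical billed charges, USD
pcc = [round(c * ccr, 2) for c in charges]
print(pcc)  # uniform PCC estimates for the three hypothetical patients
```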
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
NASA Astrophysics Data System (ADS)
Kim, A. A.; Klochkov, D. V.; Konyaev, M. A.; Mihaylenko, A. S.
2017-11-01
The article considers the problem of monitoring and verifying the basic performance parameters of laser ceilometers and describes an alternative method based on a multi-length fiber-optic delay line that simulates an atmospheric path. The results of the described experiment demonstrate the great potential of this method for inspection and verification of laser ceilometers.
Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario
2016-01-01
Coordinate measuring machines (CMM) are main instruments of measurement in laboratories and in industrial quality control. A compensation error model was formulated in Part I; it integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to the CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features is accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table, and roundness of a precision glass hemisphere are presented under repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as is its practical use and its capability to contribute to the improvement of current standard CMM measuring capabilities. PMID:27754441
NASA Astrophysics Data System (ADS)
Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes
2017-04-01
A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.
Interferometric step gauge for CMM verification
NASA Astrophysics Data System (ADS)
Hemming, B.; Esala, V.-P.; Laukkanen, P.; Rantanen, A.; Viitala, R.; Widmaier, T.; Kuosmanen, P.; Lassila, A.
2018-07-01
The verification of the measurement capability of coordinate measuring machines (CMM) is usually performed using gauge blocks or step gauges as reference standards. Gauge blocks and step gauges are robust and easy to use, but have some limitations such as finite lengths and uncertainty of thermal expansion. This paper describes the development, testing and uncertainty evaluation of an interferometric step gauge (ISG) for CMM verification. The idea of the ISG is to move a carriage bearing a gauge block along a rail and to measure the position with an interferometer. For a displacement of 1 m the standard uncertainty of the position of the gauge block is 0.2 µm. A short range periodic error of CMM can also be detected.
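The verification principle described above reduces to comparing the CMM's indicated positions against the interferometer reference at each carriage stop. A minimal sketch with hypothetical readings (positions and values below are illustrative, not measured data from the paper):

```python
# Hypothetical verification run: the carriage carrying the gauge block is
# stepped along the rail; its position is read both by the CMM under test
# and by the interferometer (taken as the reference). The CMM error at each
# stop is the difference between the two, here expressed in micrometres.
positions_ref_mm = [100.0000, 200.0002, 300.0001, 400.0003]  # interferometer
positions_cmm_mm = [100.0012, 200.0009, 300.0019, 400.0011]  # CMM indication

errors_um = [(c - r) * 1000.0
             for c, r in zip(positions_cmm_mm, positions_ref_mm)]
print([round(e, 1) for e in errors_um])
```

Plotting such errors against position is what would reveal the short-range periodic error mentioned in the abstract.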
GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER
The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...
Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart
2010-07-01
High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.
Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers
NASA Astrophysics Data System (ADS)
Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille
This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to progress in mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, making the multipliers quickly obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers, and of the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which amount to a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
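The single-size primitive the paper builds on is Montgomery multiplication: it computes a·b·R⁻¹ mod n, so operands are first mapped into the "Montgomery domain" and the modulus-dependent constant n′ must be precomputed. A minimal sketch of the textbook REDC algorithm (not the paper's double-size construction; names and parameters are illustrative):

```python
# Textbook Montgomery multiplication (REDC). This is the standard single-size
# primitive, not the paper's double-size technique built on top of it.
def montgomery_setup(n: int, r_bits: int):
    r = 1 << r_bits
    assert n % 2 == 1 and n < r
    n_prime = -pow(n, -1, r) % r   # modulus-dependent: n * n' ≡ -1 (mod r)
    return r, n_prime

def mont_mul(a: int, b: int, n: int, r_bits: int, n_prime: int) -> int:
    """Return a*b*R^-1 mod n, where R = 2**r_bits."""
    r_mask = (1 << r_bits) - 1
    t = a * b
    m = (t * n_prime) & r_mask     # m chosen so t + m*n is divisible by R
    u = (t + m * n) >> r_bits
    return u - n if u >= n else u

# Example: 7 * 11 mod 13 computed via the Montgomery domain.
n, r_bits = 13, 8
r, n_prime = montgomery_setup(n, r_bits)
a_bar, b_bar = (7 * r) % n, (11 * r) % n             # map into Montgomery form
prod_bar = mont_mul(a_bar, b_bar, n, r_bits, n_prime)
result = mont_mul(prod_bar, 1, n, r_bits, n_prime)   # map back out
print(result)  # 77 mod 13 = 12
```

The `montgomery_setup` step is exactly the modulus-dependent precomputation the abstract says previous double-size techniques must repeat and the proposal makes essentially free.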
Assessment of the adequacy of the monitoring method in the activity of a verification laboratory
NASA Astrophysics Data System (ADS)
Ivanov, R. N.; Grinevich, V. A.; Popov, A. A.; Shalay, V. V.; Malaja, L. D.
2018-04-01
Questions of assessing the adequacy of a risk-monitoring technique for verification laboratory operation, concerning conformity to the accreditation criteria and aimed at decision-making on the advisability of the verification laboratory's activities in the declared area of accreditation, are considered.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
...-0008; OMB Number 1014-0009] Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION: 30-Day notice. SUMMARY... the Notice to Lessees (NTL) on the Legacy Data Verification Process (LDVP). This notice also provides...
The CHANDRA X-Ray Observatory: Thermal Design, Verification, and Early Orbit Experience
NASA Technical Reports Server (NTRS)
Boyd, David A.; Freeman, Mark D.; Lynch, Nicolie; Lavois, Anthony R. (Technical Monitor)
2000-01-01
The CHANDRA X-ray Observatory (formerly AXAF), one of NASA's "Great Observatories", was launched aboard the Shuttle in July 1999. CHANDRA comprises a grazing-incidence X-ray telescope of unprecedented focal length, collecting area and angular resolution -- better than two orders of magnitude improvement in imaging performance over any previous soft X-ray (0.1-10 keV) mission. Two focal-plane instruments, one with a 150 K passively-cooled detector, provide celestial X-ray images and spectra. Thermal control of CHANDRA includes active systems for the telescope mirror and environment and the optical bench, and largely passive systems for the focal-plane instruments. Performance testing of these thermal control systems required 1-1/2 years at increasing levels of integration, culminating in thermal-balance testing of the fully-configured observatory during the summer of 1998. This paper outlines details of thermal design tradeoffs and methods for both the Observatory and the two focal-plane instruments, the thermal verification philosophy of the Chandra program (what to test and at what level), and summarizes the results of the instrument, optical system and observatory testing.
ADEN ALOS PALSAR Product Verification
NASA Astrophysics Data System (ADS)
Wright, P. A.; Meadows, P. J.; Mack, G.; Miranda, N.; Lavalle, M.
2008-11-01
Within the ALOS Data European Node (ADEN) the verification of PALSAR products is an important and continuing activity, to ensure data utility for the users. The paper will give a summary of the verification activities, the status of the ADEN PALSAR processor and the current quality issues that are important for users of ADEN PALSAR data.
78 FR 6849 - Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-31
... (Verification of VA Benefits) Activity Under OMB Review AGENCY: Veterans Benefits Administration, Department of... ``OMB Control No. 2900-0406.'' SUPPLEMENTARY INFORMATION: Title: Verification of VA Benefits, VA Form 26... eliminate unlimited versions of lender- designed forms. The form also informs the lender whether or not the...
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
Equations for estimating Clark Unit-hydrograph parameters for small rural watersheds in Illinois
Straub, Timothy D.; Melching, Charles S.; Kocher, Kyle E.
2000-01-01
Simulation of the measured discharge hydrographs for the verification storms utilizing TC and R obtained from the estimation equations yielded good results. The error in peak discharge for 21 of the 29 verification storms was less than 25 percent, and the error in time-to-peak discharge for 18 of the 29 verification storms also was less than 25 percent. Therefore, applying the estimation equations to determine TC and R for design-storm simulation may result in reliable design hydrographs, as long as the physical characteristics of the watersheds under consideration are within the range of those characteristics for the watersheds in this study [area: 0.02-2.3 mi2, main-channel length: 0.17-3.4 miles, main-channel slope: 10.5-229 feet per mile, and insignificant percentage of impervious cover].
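The verification criterion above is the percent error between simulated and measured peak discharge for each storm. A minimal sketch of that check, with hypothetical discharge values in place of the study's storm data:

```python
# Hypothetical measured vs. simulated peak discharges (cfs) for a few storms,
# where the simulated values come from hydrographs built with TC and R from
# the estimation equations.
measured_qp  = [1200.0, 850.0, 430.0]
simulated_qp = [1100.0, 1020.0, 400.0]

pct_err = [abs(s - m) / m * 100.0
           for s, m in zip(simulated_qp, measured_qp)]
within_25 = sum(e < 25.0 for e in pct_err)   # storms meeting the <25% criterion
print([round(e, 1) for e in pct_err], within_25)
```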
NASA Astrophysics Data System (ADS)
Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.
2015-11-01
High-range measuring equipment such as laser trackers needs large-dimension calibrated reference artifacts in calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points and the reference lengths derived from them relies on the indexed metrology platform and on knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. The measuring instrument together with the indexed metrology platform remains still; the virtual mesh rotates around them. As a first step, the virtual distances technique is applied here to a laser tracker. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for defining reference distances in these procedures.
Video Vehicle Detector Verification System (V2DVS) operators manual and project final report.
DOT National Transportation Integrated Search
2012-03-01
The accurate detection of the presence, speed and/or length of vehicles on roadways is recognized as critical for : effective roadway congestion management and safety. Vehicle presence sensors are commonly used for traffic : volume measurement and co...
Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael
This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, David B.; Gibbons, Steven J.; Rodgers, Arthur J.
In this approach, small scale-length medium perturbations not modeled in the tomographic inversion might be described as random fields, characterized by particular distribution functions (e.g., normal with specified spatial covariance). Conceivably, random field parameters (scatterer density or scale length) might themselves be the targets of tomographic inversions of the scattered wave field. As a result, such augmented models may provide processing gain through the use of probabilistic signal subspaces rather than deterministic waveforms.
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of the audio and video modalities for audio-visual speaker verification is compared with the face verification and speaker verification systems alone. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with the prospect of experimenting on the newly developed PDAtabase created within the scope of the SecurePhone project.
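GMM-based verification of the kind described above typically scores a claim by the log-likelihood ratio between a client model and a background (world) model, accepting when it exceeds a threshold. A minimal 1-D sketch of that scoring step; the mixture parameters and features below are tiny hypothetical stand-ins, not the paper's DCT/speech features or the BECARS system:

```python
import math

def gmm_loglik(x, weights, means, variances):
    """Log-likelihood of scalar x under a 1-D Gaussian mixture model."""
    total = 0.0
    for w, m, v in zip(weights, means, variances):
        total += w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
    return math.log(total)

# Hypothetical trained models: (weights, means, variances).
client = ([0.6, 0.4], [0.0, 2.0], [1.0, 0.5])   # claimed speaker's GMM
world  = ([1.0], [5.0], [4.0])                   # background/world GMM

features = [0.1, 1.8, -0.3]                      # hypothetical test features
llr = sum(gmm_loglik(x, *client) - gmm_loglik(x, *world) for x in features)
decision = "accept" if llr > 0.0 else "reject"   # threshold 0 for illustration
print(decision)
```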
Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...
Kelm-Nelson, Cynthia A; Stevenson, Sharon A; Ciucci, Michelle R
2016-09-01
Datasets provided in this article represent the Rattus norvegicus primer design and verification used in Pink1 -/- and wildtype Long Evans brain tissue. The accompanying tables list the relevant information (accession numbers, sequences, temperatures, and product lengths) describing the primer design specific to each transcript amplification. Additionally, results of Sanger sequencing of qPCR reaction products (FASTA-aligned sequences) are presented for genes of interest. Results, further interpretation, and discussion can be found in the original research article "Atp13a2 expression in the periaqueductal gray is decreased in the Pink1 -/- rat model of Parkinson disease" [1].
Results from an Independent View on The Validation of Safety-Critical Space Systems
NASA Astrophysics Data System (ADS)
Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.
2013-08-01
Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the "ESA ISVV Guide" is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the tester's work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages/disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.
Verification of Ceramic Structures
NASA Astrophysics Data System (ADS)
Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit
2012-07-01
In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on the relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, use cases and implementation are given, and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2009-02-17
Conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids, as completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.
A validation of 11 body-condition indices in a giant snake species that exhibits positive allometry.
Falk, Bryan G; Snow, Ray W; Reed, Robert N
2017-01-01
Body condition is a gauge of the energy stores of an animal, and though it has important implications for fitness, survival, competition, and disease, it is difficult to measure directly. Instead, body condition is frequently estimated as a body condition index (BCI) using length and mass measurements. A desirable BCI should accurately reflect true body condition and be unbiased with respect to size (i.e., mean BCI estimates should not change across different length or mass ranges), and choosing the most-appropriate BCI is not straightforward. We evaluated 11 different BCIs in 248 Burmese pythons (Python bivittatus), organisms that, like other snakes, exhibit simple body plans well characterized by length and mass. We found that the length-mass relationship in Burmese pythons is positively allometric, where mass increases rapidly with respect to length, and this allowed us to explore the effects of allometry on BCI verification. We employed three alternative measures of 'true' body condition: percent fat, scaled fat, and residual fat. The latter two measures mostly accommodated allometry in true body condition, but percent fat did not. Our inferences of the best-performing BCIs depended heavily on our measure of true body condition, with most BCIs falling into one of two groups. The first group contained most BCIs based on ratios, and these were associated with percent fat and body length (i.e., were biased). The second group contained the scaled mass index and most of the BCIs based on linear regressions, and these were associated with both scaled and residual fat but not body length (i.e., were unbiased). Our results show that potential differences in measures of true body condition should be explored in BCI verification studies, particularly in organisms undergoing allometric growth. Furthermore, the caveats of each BCI and similarities to other BCIs are important to consider when determining which BCI is appropriate for any particular taxon.
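The two regression-based measures that performed well above, the scaled mass index and residual indices, can be computed from length and mass data alone. A minimal sketch of the common formulations (function names are mine; the SMA slope is taken as the OLS slope divided by the Pearson correlation, a standard estimate, not necessarily the authors' exact implementation):

```python
import math

def _slope_intercept_r(xs, ys):
    """OLS slope/intercept and Pearson r for y regressed on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx, sxy / math.sqrt(sxx * syy)

def scaled_mass_index(lengths, masses, l0=None):
    """Standardize each mass to a reference length l0 using the
    standardized major axis (SMA) slope of ln(mass) on ln(length)."""
    xs = [math.log(l) for l in lengths]
    ys = [math.log(m) for m in masses]
    b_ols, _, r = _slope_intercept_r(xs, ys)
    b_sma = b_ols / r
    if l0 is None:
        l0 = sum(lengths) / len(lengths)
    return [m * (l0 / l) ** b_sma for l, m in zip(lengths, masses)]

def residual_index(lengths, masses):
    """Residuals of the OLS regression of ln(mass) on ln(length)."""
    xs = [math.log(l) for l in lengths]
    ys = [math.log(m) for m in masses]
    b, a, _ = _slope_intercept_r(xs, ys)
    return [y - (a + b * x) for x, y in zip(xs, ys)]
```

Under exact allometric growth (mass proportional to a power of length) the scaled mass index is the same for every individual and the residuals vanish, which is the size-independence property a BCI is asked to have.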
SITE CHARACTERIZATION AND MONITORING TECHNOLOGY VERIFICATION: PROGRESS AND RESULTS
The Site Characterization and Monitoring Technology Pilot of the U.S. Environmental Protection Agency's Environmental Technology Verification Program (ETV) has been engaged in verification activities since the fall of 1994 (U.S. EPA, 1997). The purpose of the ETV is to promote th...
75 FR 51821 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... Verification form, the Employment Verification and Community Site Information form, the Payment Information Form, the Authorization to Release Information form and the Self-Certification Form. Once health...,035 Employment Verification and 5,175 1 5,175 .75 3,881 Community Site Information Form Loan...
75 FR 17923 - Agency Information Collection Activities: Proposed Collection: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-08
... Verification and Community Site Information form, the Loan Information and Verification form, the Authorization to Release Information form, the Applicant Checklist, and the Self-Certification form. The annual... respondent responses response hours NHSC LRP Application 5,175 1 5,175 0.30 1,553 Employment Verification-- 5...
78 FR 58492 - Generator Verification Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-24
... Control Functions), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or...), MOD-027-1 (Verification of Models and Data for Turbine/Governor and Load Control or Active Power... Category B and C contingencies, as required by wind generators in Order No. 661, or that those generators...
20 CFR 212.5 - Verification of military service.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Verification of military service. 212.5... MILITARY SERVICE § 212.5 Verification of military service. Military service may be verified by the... armed forces that shows the beginning and ending dates of the individual's active military service; or a...
Magnetic cleanliness verification approach on tethered satellite
NASA Technical Reports Server (NTRS)
Messidoro, Piero; Braghin, Massimo; Grande, Maurizio
1990-01-01
Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.
ERDEC Contribution to the 1993 International Treaty Verification Round Robin Exercise 4
1994-07-01
COLUMN: phase AT-5 (Alltech); length 25 m; inner diameter 0.25 mm. Detector: MS (Finnigan 5100). GC CONDITIONS: carrier gas, helium; Carrier ... ionized in the ion source. The resulting CH5+ then chemically reacts with the analyte. The advantage of this technique is that because less energy is
Solar cycle length hypothesis appears to support the ipcc on global warming
NASA Astrophysics Data System (ADS)
Laut, P.; Gundermann, J.
1998-12-01
Since the discovery of a striking correlation between 1-2-2-2-1 filtered solar cycle lengths and the 11-year running average of northern hemisphere land air temperatures, there have been widespread speculations as to whether these findings would rule out any significant contributions to global warming from the enhanced concentrations of greenhouse gases. The solar hypothesis (as we shall term this assumption) claims that solar activity causes a significant component of the global mean temperature to vary in phase opposite to the filtered solar cycle lengths. In an earlier article we have demonstrated that for data covering the period 1860-1980 the solar hypothesis does not rule out any significant contribution from man-made greenhouse gases and sulphate aerosols. The present analysis goes a step further. We analyse the period 1579-1987 and find that the solar hypothesis, instead of contradicting, appears to support the assumption of a significant warming due to human activities. We have tentatively corrected the historical northern hemisphere land air temperature anomalies by removing the assumed effects of human activities. These are represented by northern hemisphere land air temperature anomalies calculated as the contributions from man-made greenhouse gases and sulphate aerosols by using an upwelling diffusion-energy balance model similar to the model of Wigley and Raper [1993] employed in the Second Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). It turns out that the agreement of the filtered solar cycle lengths with the corrected temperature anomalies is substantially better than with the historical anomalies, with the mean square deviation reduced by 36% for a climate sensitivity of 2.5°C, the central value of the IPCC assessment, and by 43% for the best-fit value of 1.7°C.
Therefore our findings support a total reversal of the common assumption that a verification of the solar hypothesis would challenge the IPCC assessment of man-made global warming.
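The 1-2-2-2-1 filter applied to the solar cycle lengths above is a centered weighted moving average with weights (1, 2, 2, 2, 1)/8. A minimal sketch; the edge handling, which the abstract does not specify, is my assumption (edge values are left unfiltered):

```python
def filter_12221(values):
    """Centered 1-2-2-2-1 weighted moving average (weights sum to 8).
    The two values at each edge, where the window does not fit, are
    returned unchanged."""
    w = (1, 2, 2, 2, 1)
    out = list(values)
    for i in range(2, len(values) - 2):
        window = values[i - 2:i + 3]
        out[i] = sum(wi * vi for wi, vi in zip(w, window)) / 8.0
    return out
```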
The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the U.S. EPA's Office of Research and Development. It...
Verification of micro-scale photogrammetry for smooth three-dimensional object measurement
NASA Astrophysics Data System (ADS)
Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard
2017-05-01
By using sub-millimetre laser speckle pattern projection we show that photogrammetry systems are able to measure smooth three-dimensional objects with surface height deviations less than 1 μm. The projection of laser speckle patterns allows correspondences on the surface of smooth spheres to be found, and as a result, verification artefacts with low surface height deviations were measured. A combination of VDI/VDE and ISO standards were also utilised to provide a complete verification method, and determine the quality parameters for the system under test. Using the proposed method applied to a photogrammetry system, a 5 mm radius sphere was measured with an expanded uncertainty of 8.5 μm for sizing errors, and 16.6 μm for form errors with a 95 % confidence interval. Sphere spacing lengths between 6 mm and 10 mm were also measured by the photogrammetry system, and were found to have expanded uncertainties of around 20 μm with a 95 % confidence interval.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Phyllis C.
A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.
Requirement Specifications for a Design and Verification Unit.
ERIC Educational Resources Information Center
Pelton, Warren G.; And Others
A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…
SU-E-T-455: Impact of Different Independent Dose Verification Software Programs for Secondary Check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Itano, M; Yamazaki, T; Kosaka, M
2015-06-15
Purpose: There have been many reports on different dose calculation algorithms for treatment planning systems (TPS). An independent dose verification program (IndpPro) is essential to verify clinical plans from the TPS. However, the accuracy of different independent dose verification programs was not evident. We conducted a multi-institutional study to reveal the impact of different IndpPros used with different TPSs. Methods: Three institutes participated in this study. They used two different IndpPros (RADCALC and Simple MU Analysis (SMU)), both of which implemented the Clarkson algorithm. RADCALC needed the input of the radiological path length (RPL) computed by the TPSs (Eclipse or Pinnacle3); SMU used CT images to compute the RPL independently of the TPS. An ion-chamber measurement in a water-equivalent phantom was performed to evaluate the accuracy of the two IndpPros and the TPS in each institute. Next, the accuracy of dose calculation using the two IndpPros compared to the TPS was assessed in clinical plans. Results: The accuracy of the IndpPros and the TPSs in the homogeneous phantom was within ±1% of the measurement. 1543 treatment fields were collected from the patients treated in the institutes. RADCALC showed better accuracy (0.9 ± 2.2%) than SMU (1.7 ± 2.1%). However, the accuracy was dependent on the TPS (Eclipse: 0.5%, Pinnacle3: 1.0%). The accuracy of RADCALC with Eclipse was similar to that of SMU in one of the institutes. Conclusion: Depending on the independent dose verification program, the accuracy shows a systematic variation even though the measurement comparison showed a similar variation. The variation was affected by the radiological path length calculation. An IndpPro with Pinnacle3 has a different variation because Pinnacle3 computes the RPL using physical density, whereas Eclipse and SMU use electron density.
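The radiological path length that both programs depend on is, at its core, a density-weighted line integral along the ray from source to calculation point. A minimal sketch assuming uniform ray sampling; the actual RADCALC and SMU implementations are not shown in the abstract, and this function is illustrative only:

```python
def radiological_path_length(rel_densities, step_mm):
    """Water-equivalent (radiological) path length of a ray sampled at
    uniform geometric steps: each step is weighted by the relative
    density of the voxel it crosses, so water (1.0) contributes its full
    geometric length and lung (~0.3) much less."""
    return step_mm * sum(rel_densities)
```

Whether `rel_densities` holds physical density (as in Pinnacle3) or electron density (as in Eclipse and SMU) is exactly the source of the systematic variation the study reports.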
NASA Astrophysics Data System (ADS)
Roed-Larsen, Trygve; Flach, Todd
The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit of best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely on these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes for their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of the efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts at securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, David A.
2012-08-16
Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2008-03-19
Conduct verification surveys of available grids at the David Witherspoon Incorporated 1630 Site (DWI 1630) in Knoxville, Tennessee. The IVT conducted verification activities of partial grids H19, J21, J22, X20, and X21.
Kröger, Niklas; Schlobohm, Jochen; Pösch, Andreas; Reithmeier, Eduard
2017-09-01
In Michelson interferometer setups, the standard way to generate different optical path lengths between the measurement arm and the reference arm relies on expensive high-precision linear stages such as piezo actuators. We present an alternative approach based on the refraction of light at optical interfaces, using a cheap stepper motor with a high gearing ratio to control the rotation of a glass plate. The beam path is examined and a relation between the angle of rotation and the change in optical path length is derived. As verification, an experimental setup is presented, and reconstruction results from a measurement standard are shown. The reconstructed step height from this setup lies within 1.25% of the expected value.
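The abstract does not reproduce the derived relation, but for an ideal plane-parallel plate the textbook result gives the extra optical path as a function of tilt angle. A sketch under that assumption (function names are mine, not the paper's notation):

```python
import math

def plate_extra_opl(theta, t, n):
    """Optical path added (single pass) by a plane-parallel plate of
    thickness t and refractive index n at incidence angle theta,
    relative to the same geometric path in air:
    t * (sqrt(n^2 - sin^2(theta)) - cos(theta))."""
    return t * (math.sqrt(n * n - math.sin(theta) ** 2) - math.cos(theta))

def opl_change_on_rotation(theta, t, n, passes=2):
    """Change in optical path length when the plate is rotated from
    normal incidence to theta; in a Michelson arm the beam typically
    traverses the plate twice (passes=2)."""
    return passes * (plate_extra_opl(theta, t, n) - plate_extra_opl(0.0, t, n))
```

At normal incidence the plate adds t(n-1) of optical path; rotating it away from normal adds a smoothly increasing amount, which is what makes a geared stepper motor a workable substitute for a piezo stage.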
Active alignment/contact verification system
Greenbaum, William M.
2000-01-01
A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.
Integration and verification testing of the Large Synoptic Survey Telescope camera
NASA Astrophysics Data System (ADS)
Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.
2016-08-01
We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5-degree-diameter field of view. LSST Camera Integration and Test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.
Sound absorption by a Helmholtz resonator
NASA Astrophysics Data System (ADS)
Komkin, A. I.; Mironov, M. A.; Bykov, A. I.
2017-07-01
Absorption characteristics of a Helmholtz resonator positioned at the end wall of a circular duct are considered. The absorption coefficient of the resonator is experimentally investigated as a function of the diameter and length of the resonator neck and the depth of the resonator cavity. Based on experimental data, the linear analytic model of a Helmholtz resonator is verified, and the results of verification are used to determine the dissipative attached length of the resonator neck so as to provide the agreement between experimental and calculated data. Dependences of sound absorption by a Helmholtz resonator on its geometric parameters are obtained.
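The role of the dissipative attached length determined above can be seen in the standard lumped-element model, where it simply extends the geometric neck length. A sketch using this generic formula, not the paper's verified model; the symbol names are mine:

```python
import math

def helmholtz_frequency(c, neck_area, neck_length, cavity_volume, attached_length=0.0):
    """Lumped-element resonance frequency of a Helmholtz resonator:
    f = (c / 2*pi) * sqrt(S / (V * l_eff)), where the effective neck
    length l_eff adds the attached (end-correction) length to the
    geometric neck length."""
    l_eff = neck_length + attached_length
    return (c / (2.0 * math.pi)) * math.sqrt(neck_area / (cavity_volume * l_eff))
```

Increasing either the cavity depth (volume) or the attached length lowers the resonance frequency, which is the qualitative dependence on geometric parameters the experiments probe.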
Verification of operational solar flare forecast: Case of Regional Warning Center Japan
NASA Astrophysics Data System (ADS)
Kubo, Yûki; Den, Mitsue; Ishii, Mamoru
2017-08-01
In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. 
For multi-categorical forecasts, we propose the following set of verification measures: the marginal distributions of forecast and observation for bias, proportion correct for accuracy, the correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
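For the dichotomous case, every measure in the proposed set derives from the 2x2 contingency table of hits, false alarms, misses, and correct negatives. A sketch of the standard definitions (variable names are mine):

```python
def dichotomous_scores(a, b, c, d):
    """Standard 2x2 forecast verification measures:
    a = hits, b = false alarms, c = misses, d = correct negatives."""
    n = a + b + c + d
    pod = a / (a + c)      # probability of detection
    pofd = b / (b + d)     # probability of false detection
    return {
        "frequency_bias": (a + b) / (a + c),
        "proportion_correct": (a + d) / n,
        "critical_success_index": a / (a + b + c),
        "pod": pod,
        "false_alarm_ratio": b / (a + b),
        "peirce_skill_score": pod - pofd,
    }
```

The Peirce skill score is the only entry that rewards discrimination between event and non-event days, which is why it is paired with the symmetric extremal dependence index for rare, extreme flare classes.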
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... Collection Activities: Form G-845 and Form G-845 Supplement, Revision of a Currently Approved Information Collection; Comment Request ACTION: 30-Day Notice of Information Collection under Review: Form G-845 and Form G-845 Supplement, Document Verification Request and Document Verification Request Supplement; OMB...
NASA Astrophysics Data System (ADS)
Tiwary, C. S.; Chakraborty, S.; Mahapatra, D. R.; Chattopadhyay, K.
2014-05-01
This paper attempts to gain an understanding of the effect of lamellar length scale on the mechanical properties of a two-phase metal-intermetallic eutectic structure. We first develop a molecular dynamics model for the in-situ grown eutectic interface, followed by a model of deformation of the Al-Al2Cu lamellar eutectic. Leveraging the insights obtained from the simulation on the behaviour of dislocations at different length scales of the eutectic, we present and explain experimental results on Al-Al2Cu eutectics with various lamellar spacings. The physics behind the mechanism is further quantified with the help of an atomic-level energy model for different length scales as well as different strains. An atomic-level energy partitioning of the lamellae and the interface regions reveals that energy in the lamella cores accumulates mainly through dislocations irrespective of the length scale, whereas energy at the interface accumulates mainly through dislocations only at smaller length scales; the trend is reversed when the length scale grows beyond a critical size of about 80 nm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.
Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.
Three-step method for menstrual and oral contraceptive cycle verification.
Schaumberg, Mia A; Jenkins, David G; Janse de Jonge, Xanne A K; Emmerton, Lynne M; Skinner, Tina L
2017-11-01
Fluctuating endogenous and exogenous ovarian hormones may influence exercise parameters; yet control and verification of ovarian hormone status is rarely reported and limits current exercise science and sports medicine research. The purpose of this study was to determine the effectiveness of an individualised three-step method in identifying the mid-luteal or high hormone phase in endogenous and exogenous hormone cycles in recreationally-active women and determine hormone and demographic characteristics associated with unsuccessful classification. Cross-sectional study design. Fifty-four recreationally-active women who were either long-term oral contraceptive users (n=28) or experiencing regular natural menstrual cycles (n=26) completed step-wise menstrual mapping, urinary ovulation prediction testing and venous blood sampling for serum/plasma hormone analysis on two days, 6-12days after positive ovulation prediction to verify ovarian hormone concentrations. Mid-luteal phase was successfully verified in 100% of oral contraceptive users, and 70% of naturally-menstruating women. Thirty percent of participants were classified as luteal phase deficient; when excluded, the success of the method was 89%. Lower age, body fat and longer menstrual cycles were significantly associated with luteal phase deficiency. A step-wise method including menstrual cycle mapping, urinary ovulation prediction and serum/plasma hormone measurement was effective at verifying ovarian hormone status. Additional consideration of age, body fat and cycle length enhanced identification of luteal phase deficiency in physically-active women. These findings enable the development of stricter exclusion criteria for female participants in research studies and minimise the influence of ovarian hormone variations within sports and exercise science and medicine research. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tachibana, H; Tachibana, R
2015-06-15
Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study for volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the comparison with the AC with inhomogeneity correction showed a systematic shift (4.5 ± 1.9%); on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9 ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account.
INF and IAEA: A comparative analysis of verification strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheinman, L.; Kratzer, M.
1992-07-01
This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter
2017-01-01
Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…
Sex Determination of Carolina Wrens (Thryothorus ludovicianus) in the Mississippi Alluvial Valley
Twedt, D.J.
2004-01-01
I identified sexual dimorphism in wing length (unflattened chord) of Carolina Wrens (Thryothorus ludovicianus) within the central Mississippi Alluvial Valley (northeast Louisiana and west-central Mississippi) and used this difference to assign a sex to captured wrens. Wrens were identified as female when wing length was less than 57.5 mm or male when wing length was greater than 58.5 mm. Verification of predicted sex was obtained from recaptures of banded individuals where sex was ascertained from the presence of a cloacal protuberance or brood patch. Correct prediction of sex was 81% for adult females and 95% for adult males. An alternative model, which categorized wrens with wing lengths of 58 and 59 mm as birds of unknown sex, increased correct prediction of females to 93% but reduced the number of individuals to which sex was assigned. These simple, predictive, wing-length-based models also correctly assigned sex for more than 88% of young (hatching-year) birds.
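The two wing-length models reported above are simple threshold rules and can be sketched directly (the function name and the `conservative` flag are illustrative, not from the paper):

```python
def classify_wren_sex(wing_length_mm, conservative=False):
    """Assign sex from unflattened wing chord (mm) using the thresholds
    reported in the abstract. With conservative=True, birds measuring
    58-59 mm are left unsexed (the alternative model)."""
    if conservative and 58 <= wing_length_mm <= 59:
        return "unknown"
    if wing_length_mm < 57.5:
        return "female"
    if wing_length_mm > 58.5:
        return "male"
    return "unknown"

print(classify_wren_sex(55))                     # female
print(classify_wren_sex(60))                     # male
print(classify_wren_sex(58, conservative=True))  # unknown
```

The conservative variant trades coverage for accuracy, matching the reported jump from 81% to 93% correct prediction for females at the cost of leaving the 58-59 mm birds unassigned.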
Frame synchronization methods based on channel symbol measurements
NASA Technical Reports Server (NTRS)
Dolinar, S.; Cheung, K.-M.
1989-01-01
The current DSN frame synchronization procedure is based on monitoring the decoded bit stream for the appearance of a sync marker sequence that is transmitted once every data frame. The possibility of obtaining frame synchronization by processing the raw received channel symbols rather than the decoded bits is explored. Performance results are derived for three channel symbol sync methods, and these are compared with results for decoded bit sync methods reported elsewhere. It is shown that each class of methods has advantages or disadvantages under different assumptions on the frame length, the global acquisition strategy, and the desired measure of acquisition timeliness. It is shown that the sync statistics based on decoded bits are superior to the statistics based on channel symbols, if the desired operating region utilizes a probability of miss many orders of magnitude higher than the probability of false alarm. This operating point is applicable for very large frame lengths and minimal frame-to-frame verification strategy. On the other hand, the statistics based on channel symbols are superior if the desired operating point has a miss probability only a few orders of magnitude greater than the false alarm probability. This happens for small frames or when frame-to-frame verifications are required.
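The core of a channel-symbol sync statistic is a correlation of the raw soft symbols against the known marker pattern. A toy sketch under idealized noise-free conditions (the marker pattern and offsets are invented for illustration; real DSN markers and statistics differ):

```python
def find_frame_offset(symbols, marker):
    """Return the offset maximizing the correlation of received soft
    channel symbols with a known +/-1 sync marker pattern."""
    scores = [
        sum(m * s for m, s in zip(marker, symbols[off:off + len(marker)]))
        for off in range(len(symbols) - len(marker) + 1)
    ]
    return scores.index(max(scores))

# Idealized noise-free stream: an 8-symbol marker embedded at offset 5.
marker = [1, 1, 1, -1, -1, 1, -1, 1]
stream = [0.0] * 5 + [float(m) for m in marker] + [0.0] * 10
print(find_frame_offset(stream, marker))  # 5
```

Operating on soft symbols before decoding, as here, is what distinguishes the channel-symbol methods from the decoded-bit methods compared in the paper.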
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-06
... solicits comments on information needed to issue a Personal Identity Verification (PIV) identification card... Personnel Security and Identity Management (07C), Department of Veterans Affairs, 810 Vermont Avenue NW...
Contraction of gut smooth muscle cells assessed by fluorescence imaging.
Tokita, Yohei; Akiho, Hirotada; Nakamura, Kazuhiko; Ihara, Eikichi; Yamamoto, Masahiro
2015-03-01
Here we discuss the development of a novel cell imaging system for the evaluation of smooth muscle cell (SMC) contraction. SMCs were isolated from the circular and longitudinal muscular layers of mouse small intestine by enzymatic digestion. SMCs were stimulated by test agents, thereafter fixed in acrolein. Actin in fixed SMCs was stained with phalloidin and cell length was determined by measuring diameter at the large end of phalloidin-stained strings within the cells. The contractile response was taken as the decrease in the average length of a population of stimulated-SMCs. Various mediators and chemically identified compounds of daikenchuto (DKT), pharmaceutical-grade traditional Japanese prokinetics, were examined. Verification of the integrity of SMC morphology by phalloidin and DAPI staining and semi-automatic measurement of cell length using an imaging analyzer was a reliable method by which to quantify the contractile response. Serotonin, substance P, prostaglandin E2 and histamine induced SMC contraction in concentration-dependent manner. Two components of DKT, hydroxy-α-sanshool and hydroxy-β-sanshool, induced contraction of SMCs. We established a novel cell imaging technique to evaluate SMC contractility. This method may facilitate investigation into SMC activity and its role in gastrointestinal motility, and may assist in the discovery of new prokinetic agents. Copyright © 2015 Japanese Pharmacological Society. Production and hosting by Elsevier B.V. All rights reserved.
Distilling the Verification Process for Prognostics Algorithms
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai
2013-01-01
The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.
ERIC Educational Resources Information Center
Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil
2016-01-01
It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…
Kraus, Michael W; Chen, Serena
2009-07-01
Extending research on the automatic activation of goals associated with significant others, the authors hypothesized that self-verification goals typically pursued with significant others are automatically elicited when a significant-other representation is activated. Supporting this hypothesis, the activation of a significant-other representation through priming (Experiments 1 and 3) or through a transference encounter (Experiment 2) led participants to seek feedback that verifies their preexisting self-views. Specifically, significant-other primed participants desired self-verifying feedback, in general (Experiment 1), from an upcoming interaction partner (Experiment 2), and relative to acquaintance-primed participants and favorable feedback (Experiment 3). Finally, self-verification goals were activated, especially for relational self-views deemed high in importance to participants' self-concepts (Experiment 2) and held with high certainty (Experiment 3). Implications for research on self-evaluative goals, the relational self, and the automatic goal activation literature are discussed, as are consequences for close relationships. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
International Cooperative for Aerosol Prediction Workshop on Aerosol Forecast Verification
NASA Technical Reports Server (NTRS)
Benedetti, Angela; Reid, Jeffrey S.; Colarco, Peter R.
2011-01-01
The purpose of this workshop was to reinforce the working partnership between centers who are actively involved in global aerosol forecasting, and to discuss issues related to forecast verification. Participants included representatives from operational centers with global aerosol forecasting requirements, a panel of experts on Numerical Weather Prediction and Air Quality forecast verification, data providers, and several observers from the research community. The presentations centered on a review of current NWP and AQ practices with subsequent discussion focused on the challenges in defining appropriate verification measures for the next generation of aerosol forecast systems.
7 CFR 62.206 - Access to program documents and activities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (CONTINUED) LIVESTOCK, MEAT, AND OTHER AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.206 Access to program documents and... SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS...
NASA Technical Reports Server (NTRS)
Defeo, P.; Doane, D.; Saito, J.
1982-01-01
A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.
Multidimensional Multiphysics Simulation of TRISO Particle Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. D. Hales; R. L. Williamson; S. R. Novascone
2013-11-01
Multidimensional multiphysics analysis of TRISO-coated particle fuel using the BISON finite-element-based nuclear fuels code is described. The governing equations and material models applicable to particle fuel and implemented in BISON are outlined. Code verification based on a recent IAEA benchmarking exercise is described, and excellent comparisons are reported. Multiple TRISO-coated particles of increasing geometric complexity are considered. It is shown that the code's ability to perform large-scale parallel computations permits application to complex 3D phenomena, while very efficient solutions for either 1D spherically symmetric or 2D axisymmetric geometries are straightforward. Additionally, the flexibility to easily include new physical and material models and the uncomplicated ability to couple to lower-length-scale simulations make BISON a powerful tool for simulation of coated-particle fuel. Future code development activities and potential applications are identified.
INS/EKF-based stride length, height and direction intent detection for walking assistance robots.
Brescianini, Dario; Jung, Jun-Young; Jang, In-Hun; Park, Hyun Sub; Riener, Robert
2011-01-01
We propose an algorithm to obtain information on stride length, height difference, and direction based on the user's intent during walking. For exoskeleton robots used to assist paraplegic patients' walking, this information is used to generate gait patterns autonomously on-line. To obtain this information, we attach an inertial measurement unit (IMU) to the crutches and apply an extended Kalman filter (EKF)-based error correction method to reduce drift due to the bias of the IMU. The proposed method is verified in real walking scenarios, including walking, climbing stairs, and changing the direction of walking, with normal subjects. © 2011 IEEE
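The bias-drift correction described above can be illustrated, in heavily simplified scalar form, by a Kalman filter that estimates a sensor bias from readings taken while the IMU is known to be at rest (so each reading measures bias plus noise). This is a generic sketch of the idea, not the paper's actual EKF; all names and noise parameters are assumptions:

```python
import random

def estimate_bias(measurements, q=1e-6, r=0.01):
    """Scalar Kalman filter estimating a slowly drifting sensor bias
    from stationary readings (true rate = 0, so z = bias + noise).
    q: process noise (bias random walk), r: measurement noise variance."""
    x, p = 0.0, 1.0          # bias estimate and its variance
    for z in measurements:
        p += q               # predict: bias modeled as a random walk
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update with the stationary reading
        p *= (1 - k)
    return x

random.seed(0)
true_bias = 0.05
readings = [true_bias + random.gauss(0, 0.1) for _ in range(500)]
print(round(estimate_bias(readings), 3))
```

Subtracting such a bias estimate before integrating the IMU signal is the standard way to curb drift in stride-length and height estimates.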
Hubble Space Telescope high speed photometer orbital verification
NASA Technical Reports Server (NTRS)
Richards, Evan E.
1991-01-01
The purpose of this report is to provide a summary of the results of the HSP (High Speed Photometer) Orbital Verification (OV) tests and to report conclusions and lessons learned from the initial operations of the HSP. The HSP OV plan covered the activities through fine (phase 3) alignment. This report covers all activities (OV, SV, and SAO) from launch to the completion of phase 3 alignment. Those activities in this period that are not OV tests are described to the extent that they relate to OV activities.
Evaluating DFT for Transition Metals and Binaries: Developing the V/DM-17 Test Set
NASA Astrophysics Data System (ADS)
Decolvenaere, Elizabeth; Mattsson, Ann
We have developed the V/DM-17 test set to evaluate the experimental accuracy of DFT calculations for transition metals. When simulation and experiment disagree, the disconnect in length scales and temperatures makes determining ``who is right'' difficult. However, methods to evaluate the experimental accuracy of functionals in the context of solid-state materials science, especially for transition metals, are lacking. As DFT undergoes a shift from a descriptive to a predictive tool, these issues of verification are becoming increasingly important. With undertakings like the Materials Project leading the way in high-throughput predictions and discoveries, the development of a one-size-fits-most approach to verification is critical. Our test set evaluates 26 transition metal elements and 80 transition metal alloys across three physical observables: lattice constants, elastic coefficients, and formation energies of alloys. Whether or not the formation energy can be reproduced measures whether the relevant physics is captured in a calculation. This is an especially important question for transition metals, where active d-electrons can thwart commonly used techniques. In testing the V/DM-17 test set, we offer new views into the performance of existing functionals. Sandia National Labs is a multi-mission laboratory managed and operated by Sandia Corp., a wholly owned subsidiary of Lockheed Martin Corp., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
NASA Astrophysics Data System (ADS)
Amyay, Omar
A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue in the design of computer networks. The aim is to obtain an operational specification of the protocol-service couple of a given layer. The planned synthesis and verification steps constitute a specification trajectory, based on the progressive integration of the 'initial data' constraints and on verification of the specification produced by each synthesis step against validity constraints that characterize an admissible solution. Two types of trajectory are proposed, according to the style of the initial specification of the service-protocol couple: an operational type, from the service supplier's viewpoint, and a knowledge-property-oriented type, from the service viewpoint. The synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic, and epistemic logic. The originality of the second specification trajectory and the use of epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge-based system implementing the proposed method. It is structured in three levels, representing the knowledge of the domain, the reasoning characterizing synthesis and verification activities, and the planning of the steps of a specification trajectory.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
... Protocol Gas Verification Program; EPA ICR No. 2375.01, OMB Control Number 2060-NEW AGENCY: Environmental... Air Protocol Gas Verification Program. ICR numbers: EPA ICR No. 2375.01, OMB Control No. 2060-NEW. ICR...
Towards the Verification of Human-Robot Teams
NASA Technical Reports Server (NTRS)
Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.
2005-01-01
Human-Agent collaboration is increasingly important. Not only do high-profile activities such as NASA missions to Mars intend to employ such teams, but our everyday activities involving interaction with computational devices fall into this category. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.
7 CFR 981.70 - Records and verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 8 2010-01-01 2010-01-01 false Records and verification. 981.70 Section 981.70 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing..., inventories, reserve disposition, advertising and promotion activities, as well as other pertinent information...
Applying Independent Verification and Validation to Automatic Test Equipment
NASA Technical Reports Server (NTRS)
Calhoun, Cynthia C.
1997-01-01
This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur or of all development and maintenance items that can be validated and verified during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle is described.
International Space Station Major Constituent Analyzer On-Orbit Performance
NASA Technical Reports Server (NTRS)
Gardner, Ben D.; Erwin, Philip M.; Thoresen, Souzan; Granahan, John; Matty, Chris
2010-01-01
The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic changeout, including the analyzer (ORU 02) and the verification gas assembly (ORU 08). The longest lasting ORU 02 was recently replaced after a record service length of 1033 days. The comparatively high performance duration may be attributable to a reduced inlet flow rate into the analyzer, resulting in increased ion pump lifetime; however, there may be other factors as well. A recent schedule slip for delivery of replacement verification gas led to a demonstration that the calibration interval could be extended on a short-term basis. An analysis of ORU 08 performance characteristics indicates that it is possible to temporarily extend the calibration interval from 6 weeks to 12 weeks if necessary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, D.; Yang, L. J., E-mail: yanglj@mail.xjtu.edu.cn; Ma, J. B.
This paper proposes a new triggering method for a long spark gap based on capillary plasma ejection and presents experimental verification under an extremely low working coefficient, i.e., a particularly low ratio of the spark-gap charging voltage to the breakdown voltage. Quasi-neutral plasma is ejected from the capillary and develops along the axial direction of the spark gap; the electric field in the gap is thus changed and its breakdown is incurred. The experiments prove that capillary plasma ejection is effective in triggering the long spark gap under an extremely low working coefficient in air. The study also indicates that the breakdown probability, the breakdown delay, and the delay dispersion are all mainly determined by the characteristics of the ejected plasma, including the length of the plasma flow, the speed of the plasma ejection, and the ionization degree of the plasma. Moreover, the breakdown delay and the delay dispersion increase with the length of the spark gap, and a polarity effect exists in the triggering process. Lastly, compared with installing the triggering device in a single electrode, installing devices in both electrodes, though with the same breakdown process, achieves ignition at longer gap distances. Specifically, at a gap length of 14 cm and a working coefficient of less than 2%, the spark gap is still ignited accurately.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Camargo, J. C. G.
1982-01-01
The potential of using LANDSAT MSS imagery for morphometric and topological studies of drainage basins was verified, using the Tiete and Aguapei watersheds (Western Plateau) as the test site because of their homogeneous landscape. Morphometric variables collected for ten drainage basins include: circularity index; river density; drainage density; topographic texture; area and length index; basin perimeter; and main river, 1st-order, and 2nd-order channel lengths. The topological variables determined were: order; magnitude; bifurcation ratio; weighted bifurcation ratio; number of segments; number of links; trajectory length; and topological diameter. Data were collected on topographic maps at the scales of 1:250,000 and 1:59,000 and on LANDSAT imagery at the scale of 1:250,000. The results, which were summarized in tables for further analysis, show that LANDSAT imagery can compensate for the lack of topographic charts in drainage studies.
NASA Technical Reports Server (NTRS)
Stowell, Elbridge Z; Schwartz, Edward B; Houbolt, John C
1945-01-01
A theoretical investigation was made of the behavior of a cantilever beam in rotational motion about a transverse axis through the root, determining the stresses, the deflections, and the accelerations that occur in the beam as a result of the arrest of motion. The equations for bending and shear stress reveal that, at a given percentage of the distance from root to tip and at a given tip velocity, the bending stresses for a particular mode are independent of the length of the beam and the shear stresses vary inversely with the length. When examined with respect to a given angular velocity instead of a given tip velocity, the equations reveal that the bending stress is proportional to the length of the beam whereas the shear stress is independent of the length. Sufficient experimental verification of the theory has previously been given in connection with another problem of the same type.
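The scaling statements in the abstract can be summarized compactly. Writing $v = \omega L$ for the tip velocity of a beam of length $L$ rotating at angular velocity $\omega$, the reported proportionalities for a given mode at a fixed fraction of the span are (a sketch inferred from the abstract's wording, not the paper's own notation):

```latex
\text{At fixed tip velocity } v:\qquad
\sigma_{\mathrm{bend}} \propto v \ \ (\text{independent of } L),
\qquad \tau \propto \frac{v}{L}.

\text{At fixed angular velocity } \omega = v/L:\qquad
\sigma_{\mathrm{bend}} \propto \omega L,
\qquad \tau \propto \omega \ \ (\text{independent of } L).
```

The two regimes are the same relations re-expressed through $v = \omega L$, which is why fixing $v$ versus fixing $\omega$ swaps which stress appears length-independent.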
SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baba, H; Tachibana, H; Kamima, T
2015-06-15
Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for the prostate and head-and-neck (HN) sites were collected from the institutes, where planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based; CT images were used to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement between doses computed in patient CT images using the TPS and using the SMU was assessed. The dose of the composite beams in each plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3 ± 1.9% and −5.6 ± 3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1 ± 1.9% and −3.0 ± 3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and thus underestimates, the dose under the MLC. Conclusion: Accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.
NASA Astrophysics Data System (ADS)
Hoesl, M.; Deepak, S.; Moteabbed, M.; Jassens, G.; Orban, J.; Park, Y. K.; Parodi, K.; Bentefour, E. H.; Lu, H. M.
2016-04-01
The purpose of this work is the clinical commissioning of a recently developed in vivo range verification system (IRVS) for treatment of prostate cancer with anterior and anterior-oblique proton beams. The IRVS is designed to perform a complete workflow for pre-treatment range verification and adjustment. It contains specifically designed dosimetry and electronic hardware and dedicated workflow-control software with database connections to the treatment and imaging systems. An essential part of the IRVS is an array of Si-diode detectors designed to be mounted on the endorectal water balloon routinely used for prostate immobilization. The diodes measure dose rate as a function of time, from which the water-equivalent path length (WEPL) and the dose received are extracted. The former is used for pre-treatment beam range verification and correction, if necessary, while the latter monitors the dose delivered to the patient's rectum during treatment and serves as an additional verification. The entire IRVS workflow was tested for anterior and 30-degree-inclined proton beams in both solid-water and anthropomorphic pelvic phantoms, with the measured WEPL and rectal doses compared to the treatment plan. Gafchromic films were also used to measure the rectal dose and were compared to the IRVS results. The WEPL measurement accuracy was on the order of 1 mm, and after beam range correction the doses received by the rectal wall differed from treatment planning by 1.6% and 0.4%, respectively, for the anterior and anterior-oblique fields. We believe the implementation of the IRVS would make the treatment of the prostate with anterior proton beams more accurate and reliable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zdarek, J.; Pecinka, L.
Leak-before-break (LBB) analysis of WWER-type reactors in the Czech and Slovak Republics is summarized in this paper. Legislative bases, required procedures, and validation and verification of procedures are discussed. A list of significant issues identified during the application of LBB analysis is presented. The results of statistical evaluation of crack length characteristics are presented and compared for the WWER 440 Type 230 and 213 reactors and for the WWER 1000 Type 302, 320 and 338 reactors.
Teichmuller Space Resolution of the EPR Paradox
NASA Astrophysics Data System (ADS)
Winterberg, Friedwardt
2013-04-01
The mystery of Newton's action-at-a-distance law of gravity was resolved by Einstein with Riemann's non-Euclidean geometry, which permitted the explanation of the departure from Newton's law for the motion of Mercury. It is here proposed that the similarly mysterious non-local EPR-type quantum correlations may be explained by a Teichmuller space geometry below the Planck length; an experiment for its verification is proposed.
NASA Technical Reports Server (NTRS)
McDonald, K. C.; Kimball, J. S.; Zimmerman, R.
2002-01-01
We employ daily surface radar backscatter data from the SeaWinds Ku-band scatterometer onboard QuikSCAT to estimate landscape freeze-thaw state and the associated length of the seasonal non-frozen period as a surrogate for determining the annual growing season across boreal and subalpine regions of North America for 2000 and 2001.
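The season-length surrogate reduces to classifying each day's backscatter as frozen or thawed and counting thawed days. A toy sketch with an invented threshold rule standing in for the paper's actual detection scheme (all values and the sign convention are hypothetical):

```python
def nonfrozen_season_length(backscatter_db, threshold_db):
    """Classify each daily backscatter value (dB) as frozen or thawed
    by a simple threshold (a stand-in for the actual detection scheme)
    and return the number of non-frozen days."""
    return sum(1 for sigma0 in backscatter_db if sigma0 > threshold_db)

# Hypothetical year: 100 winter days, 180 thawed days, 85 winter days.
days = [-14.0] * 100 + [-9.5] * 180 + [-14.0] * 85
print(nonfrozen_season_length(days, threshold_db=-12.0))  # 180
```

In practice the threshold would be calibrated per location and the daily series smoothed, but the growing-season proxy is this count of days on the thawed side of the decision boundary.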
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., the fringes (“e”) are at least half the length of one bonded cell wall (“d”) (in the ribbon direction... width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls of incomplete cells around the edge of the sample are trimmed as follows (See Figure 3). In the width...
7 CFR 932.62 - Verification of reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 8 2010-01-01 2010-01-01 false Verification of reports. 932.62 Section 932.62 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing... any and all records of such handler with respect to advertising and promotion activities subject to...
7 CFR 982.69 - Verification of reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 8 2010-01-01 2010-01-01 false Verification of reports. 982.69 Section 982.69 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing... respect to promotion and advertising activities conducted pursuant to § 982.58. Each handler shall furnish...
Small Angle Neutron Scattering Observation of Chain Retraction after a Large Step Deformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanchard, A.; Heinrich, M.; Pyckhout-Hintzen, W.
The process of retraction in entangled linear chains after a fast nonlinear stretch was detected from time-resolved but quenched small angle neutron scattering (SANS) experiments on long, well-entangled polyisoprene chains. The statically obtained SANS data cover the relevant time regime for retraction, and they provide a direct, microscopic verification of this nonlinear process as predicted by the tube model. Clear, quantitative agreement is found with recent theories of contour length fluctuations and convective constraint release, using parameters obtained mainly from linear rheology. The theory captures the full range of scattering vectors once the crossover to fluctuations on length scales below the tube diameter is accounted for.
Proton Therapy Verification with PET Imaging
Zhu, Xuping; Fakhri, Georges El
2013-01-01
Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiwary, C. S., E-mail: cst.iisc@gmail.com; Chattopadhyay, K.; Chakraborty, S.
2014-05-28
This paper attempts to gain an understanding of the effect of lamellar length scale on the mechanical properties of a two-phase metal-intermetallic eutectic structure. We first develop a molecular dynamics model of the in-situ grown eutectic interface, followed by a model of deformation of the Al-Al{sub 2}Cu lamellar eutectic. Leveraging the insights obtained from the simulation on the behaviour of dislocations at different length scales of the eutectic, we present and explain experimental results on Al-Al{sub 2}Cu eutectics with various lamellar spacings. The physics behind the mechanism is further quantified with the help of an atomic-level energy model for different length scales as well as different strains. An atomic-level energy partitioning of the lamellae and the interface regions reveals that the energy of the lamellae cores accumulates mainly through dislocations irrespective of the length scale. The energy of the interface, by contrast, accumulates mainly through dislocations when the length scale is small, and the trend reverses when the length scale grows beyond a critical size of about 80 nm.
NASA Technical Reports Server (NTRS)
Harvill, W. E.; Kizer, J. A.
1976-01-01
The advantageous structural uses of advanced filamentary composites are demonstrated by design, fabrication, and test of three boron-epoxy reinforced C-130 center wing boxes. The advanced development work necessary to support detailed design of a composite reinforced C-130 center wing box was conducted. Activities included the development of a basis for structural design, selection and verification of materials and processes, manufacturing and tooling development, and fabrication and test of full-scale portions of the center wing box. Detailed design drawings, and necessary analytical structural substantiation including static strength, fatigue endurance, flutter, and weight analyses are considered. Some additional component testing was conducted to verify the design for panel buckling, and to evaluate specific local design areas. Development of the cool tool restraint concept was completed, and bonding capabilities were evaluated using full-length skin panel and stringer specimens.
New generation of universal modeling for centrifugal compressors calculation
NASA Astrophysics Data System (ADS)
Galerkin, Y.; Drozdov, A.
2015-08-01
The Universal Modeling method has been in constant use since the mid-1990s. The newest, 6th version of the method is presented below. The flow path configuration of 3D impellers is presented in detail. It is possible to optimize the meridian configuration, including hub/shroud curvatures, axial length, leading edge position, etc. The new model of the vaned diffuser includes a flow non-uniformity coefficient based on CFD calculations. The loss model was built from the results of 37 experiments with compressor stages of different flow rates and loading factors. One common set of empirical coefficients in the loss model guarantees efficiency definition within an accuracy of 0.86% at the design point and 1.22% along the performance curve. For model verification, the performances of four multistage compressors with vaned and vaneless diffusers were calculated. Two of these compressors have quite unusual flow paths, yet the modeling results were quite satisfactory in spite of these peculiarities. One sample of the verification calculations is presented in the text. This 6th version of the developed computer program is already being applied successfully in design practice.
Space Weather Models and Their Validation and Verification at the CCMC
NASA Technical Reports Server (NTRS)
Hesse, Michael
2010-01-01
The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and its partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.
77 FR 64596 - Proposed Information Collection (Income Verification) Activity: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-22
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0518] Proposed Information Collection (Income... to income- dependent benefits. DATES: Written comments and recommendations on the proposed collection... techniques or the use of other forms of information technology. Title: Income Verification, VA Form 21-0161a...
ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR AIR POLLUTION CONTROL TECHNOLOGIES
The report describes the activities and progress of the pilot Air Pollution Control Technologies (APCT) portion of the Environmental Technology Verification (ETV) Program during the period from 09/15/97 to 09/15/02. The objective of the ETV Program is to verify the performance of...
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification cycle.
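UVM testbenches are written in SystemVerilog, but the CDV loop the abstract describes — constrained-random stimulus, a self-checking scoreboard, and coverage monitors that signal when the goals are met — can be sketched in Python against a toy 8-bit adder DUT. All names and the binning scheme here are illustrative assumptions, not taken from the paper:

```python
import random

class Scoreboard:
    """Self-checking reference model: flags undesired DUT behaviour."""
    def check(self, a, b, dut_out):
        assert dut_out == (a + b) & 0xFF, f"mismatch for {a}+{b}"

class Coverage:
    """Coverage monitor: records which operand bins were exercised."""
    def __init__(self, bins=4):
        self.bins = bins
        self.hit = set()
    def sample(self, a, b):
        self.hit.add((a * self.bins // 256, b * self.bins // 256))
    def percent(self):
        return 100.0 * len(self.hit) / (self.bins * self.bins)

def dut_adder(a, b):
    return (a + b) & 0xFF   # stand-in for the device under test

def run_cdv(goal=100.0, max_iters=10_000):
    """Drive random legal stimuli until the coverage goal is reached."""
    sb, cov = Scoreboard(), Coverage()
    iters = 0
    while cov.percent() < goal and iters < max_iters:
        a, b = random.randrange(256), random.randrange(256)  # legal stimuli
        sb.check(a, b, dut_adder(a, b))
        cov.sample(a, b)
        iters += 1
    return cov.percent(), iters
```

The point of the structure, as in UVM proper, is that the stopping criterion is coverage, not a fixed directed-test list: uncovered bins expose functionality the stimuli never exercised.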
Cross-Language Phonological Activation of Meaning: Evidence from Category Verification
ERIC Educational Resources Information Center
Friesen, Deanna C.; Jared, Debra
2012-01-01
The study investigated phonological processing in bilingual reading for meaning. English-French and French-English bilinguals performed a category verification task in either their first or second language. Interlingual homophones (words that share phonology across languages but not orthography or meaning) and single language control words served…
77 FR 291 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-04
... Verification System (IEVS) Reporting and Supporting Regulations Contained in 42 CFR 431.17, 431.306, 435.910... verifications; Form Number: CMS-R-74 (OCN 0938-0467); Frequency: Monthly; Affected Public: State, Local, or..., issuers offering group health insurance coverage, and self-insured nonfederal governmental plans (through...
76 FR 45902 - Agency Information Collection Activities: Proposed Request and Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... will allow our users to maintain one User ID, consisting of a self-selected Username and Password, to...) Registration and identity verification; (2) enhancement of the User ID; and (3) authentication. The...- person identification verification process for individuals who cannot or are not willing to register...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-12
... (original and update), and verification audit; names of the person(s) who completed the self-assessment... of the self assessment, date of the verification audit report, name of the auditor, signature and... self assessment, (2) conducting a baseline survey of the regulated industry, and (3) obtaining an...
78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...
76 FR 9020 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
... -------- Preparation and Submission of Data 54 1 640 34,560 Verification Procedures--Sec. Sec. 261.60-261.63 Caseload... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-30
... for OMB Review; Comment Request; Placement Verification and Follow-Up of Job Corps Participants ACTION... Training Administration (ETA) sponsored information collection request (ICR) titled, ``Placement Verification and Follow-up of Job Corps Participants,'' to the Office of Management and Budget (OMB) for review...
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft, the launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints.
The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activity flow and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper focuses mainly on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified by analysis alone; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in a Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design), up to the final verification close-out of the above requirements with the final verification reports.
Hubble Space Telescope high speed photometer science verification test report
NASA Technical Reports Server (NTRS)
Richards, Evan E.
1992-01-01
The purpose of this report is to summarize the results of the HSP Science Verification (SV) tests, the status of the HSP at the end of the SV period, and the work remaining to be done. The HSP OV report (November 1991) covered all activities (OV, SV, and SAO) from launch to the completion of phase three alignment, OV 3233 performed in the 91154 SMS, on June 8, 1991. This report covers subsequent activities through May 1992.
Svoboda, A; Lo, Y; Sheu, R; Dumane, V; Rosenzweig, K
2012-06-01
To present our experience using CT to plan and verify intraluminal HDR treatment for a patient with obstructive jaundice. Due to the obstruction's proximity to the small bowel, along with small bowel adhesions from the patient's past surgical history, it was imperative to verify the source position relative to the bowel before each treatment. Treatment was administered to a total dose of 2000 cGy in 5 fractions via a 6F intraluminal catheter inserted into the patient's 14F percutaneous drainage catheter. Graduations on the intraluminal catheter were used to measure the exact length of catheter inserted into the patient's drainage tube, allowing reproducibility. Dummy seeds inserted during CT were identified by iteratively aligning the planning system's 3D reconstruction axis to the catheter at multiple points as it snaked through the liver. Taking into account the known offset between actual dwell positions and dummy source positions, we determined which dwell positions to activate for planning. CT verification was performed prior to each treatment to ensure that the drainage catheter had not moved and that the distance from the treatment site to the small bowel was adequate. Dummy seeds and anatomical landmarks were identified on the scout image and correlated to the CT. Verification CTs showed remarkable consistency in the day-to-day drainage catheter position. The physician was able to easily identify the small bowel of concern on the CT and determine whether a safe distance existed for treatment. The method outlined in this work provides a safe means by which to treat bile duct obstructions using HDR when critical structures are nearby. We were prepared to make real-time adjustments to our treatment plan to account for significant variation, but found it unnecessary to do so in this particular case. © 2012 American Association of Physicists in Medicine.
Bauer, Julia; Chen, Wenjing; Nischwitz, Sebastian; Liebl, Jakob; Rieken, Stefan; Welzel, Thomas; Debus, Juergen; Parodi, Katia
2018-04-24
A reliable Monte Carlo prediction of proton-induced brain tissue activation used for comparison to particle therapy positron-emission-tomography (PT-PET) measurements is crucial for in vivo treatment verification. Major limitations of current approaches to overcome include the CT-based patient model and the description of activity washout due to tissue perfusion. Two approaches were studied to improve the activity prediction for brain irradiation: (i) a refined patient model using tissue classification based on MR information and (ii) a PT-PET data-driven refinement of washout model parameters. Improvements of the activity predictions compared to post-treatment PT-PET measurements were assessed in terms of activity profile similarity for six patients treated with a single or two almost parallel fields delivered by active proton beam scanning. The refined patient model yields a generally higher similarity for most of the patients, except in highly pathological areas leading to tissue misclassification. Using washout model parameters deduced from clinical patient data could considerably improve the activity profile similarity for all patients. Current methods used to predict proton-induced brain tissue activation can be improved with MR-based tissue classification and data-driven washout parameters, thus providing a more reliable basis for PT-PET verification. Copyright © 2018 Elsevier B.V. All rights reserved.
Laser interferometric high-precision geometry (angle and length) monitor for JASMINE
NASA Astrophysics Data System (ADS)
Niwa, Y.; Arai, K.; Ueda, A.; Sakagami, M.; Gouda, N.; Kobayashi, Y.; Yamada, Y.; Yano, T.
2008-07-01
The telescope geometry of JASMINE should be stabilized and monitored with an accuracy of about 10 to 100 pm, or 10 to 100 prad, rms over about 10 hours. For this purpose, a high-precision laser interferometric metrology system is employed. Useful techniques for measuring displacements on extremely small scales are the wave-front sensing method and the heterodyne interferometric method. Experiments to verify the measurement principles are well advanced.
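As a rough illustration of the scales involved, a heterodyne interferometer converts a measured optical phase into a displacement. The sketch below assumes a double-pass (reflective) geometry, where a displacement d changes the optical path by 2d, and a 1064 nm laser wavelength; neither assumption comes from the abstract:

```python
import math

def phase_to_displacement(phase_rad, wavelength_m):
    """Convert a measured interferometer phase to mirror displacement.

    Assumes a double-pass geometry: phase = 4*pi*d / wavelength,
    hence d = phase * wavelength / (4*pi).
    """
    return phase_rad * wavelength_m / (4.0 * math.pi)

# One microradian of resolvable phase at an assumed 1064 nm wavelength
# corresponds to a displacement of roughly 85 femtometres:
d = phase_to_displacement(1e-6, 1064e-9)
```

This shows why microradian-level phase resolution is ample headroom for the 10-100 pm monitoring requirement quoted in the abstract.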
NASA Astrophysics Data System (ADS)
Peleshko, V. A.
2016-06-01
The deviator constitutive relation of the proposed theory of plasticity has a three-term form (the stress, stress rate, and strain rate vectors formed from the deviators are collinear) and, in the specialized (applied) version, in addition to the simple loading function, contains four dimensionless constants of the material determined from experiments along a two-link strain trajectory with an orthogonal break. The proposed simple mechanism is used to calculate the constants of the model for four metallic materials that differ significantly in composition and mechanical properties; the obtained constants do not deviate much from their average values (over the four materials). The latter are taken as universal constants in the engineering version of the model, which thus requires only one basic experiment, i.e., a simple loading test. If the material exhibits the strengthening property in cyclic circular deformation, then the model contains an additional constant determined from the experiment along a strain trajectory of this type. (In the engineering version of the model, the cyclic strengthening effect is not taken into account, which imposes a certain upper bound on the difference between the length of the strain trajectory arc and the modulus of the strain vector.) We present the results of model verification using the experimental data available in the literature on combined loading along two- and multi-link strain trajectories with various lengths of links and angles of breaks, with plane curvilinear segments of various constant and variable curvature, and with three-dimensional helical segments of various curvature and twist. (In all, we use more than 80 strain programs; the materials are low- and medium-carbon steels, brass, and stainless steel.)
These results prove that the model can be used to describe the process of arbitrary active (in the sense of nonnegative shear capacity) combined loading and final unloading of originally quasi-isotropic elastoplastic materials. In practical calculations, in the absence of experimental data on the properties of a material under combined loading, the use of the engineering version of the model is quite acceptable. The simple identification, wide verifiability, and the availability of a software implementation of the method for solving initial-boundary value problems permit treating the proposed theory as an applied theory.
Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.
Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna
2016-06-27
This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants
Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna
2016-01-01
This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949
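The S/V (surface/volume) ratio the authors compute requires surface area and enclosed volume taken from the triangle mesh. A minimal sketch, assuming a closed and consistently outward-oriented mesh (the paper's DAVID Laserscanner pipeline is not reproduced here), uses the signed-tetrahedron form of the divergence theorem:

```python
def mesh_surface_and_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangle mesh.

    vertices: list of (x, y, z) tuples; faces: list of (i, j, k) index
    triples. The volume formula sums signed tetrahedra against the origin,
    so the mesh must be closed and consistently oriented.
    """
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def cross(a, b):
        return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    area = volume = 0.0
    for i, j, k in faces:
        p, q, r = vertices[i], vertices[j], vertices[k]
        c = cross(sub(q, p), sub(r, p))
        area += 0.5 * dot(c, c) ** 0.5          # triangle area
        volume += dot(p, cross(q, r)) / 6.0     # signed tetrahedron
    return area, abs(volume)

# Hypothetical test shape: unit right tetrahedron, volume 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
area, vol = mesh_surface_and_volume(verts, faces)
sv_ratio = area / vol
```

For real scans the mesh resolution matters, which is exactly the triangle-mesh resolution question the abstract raises: too coarse a mesh biases both area and volume.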
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-14
... require additional verification to identify inappropriate or inaccurate rental assistance, and may provide... Affordable Housing Act, the Native American Housing Assistance and Self-Determination Act of 1996, and the... matching activities. The computer matching program will also provide for the verification of social...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-14
... and natural gas resources in a manner that is consistent with the need to make such resources... to prevent or minimize the likelihood of blowouts, loss of well control, fires, spillages, physical... the environment or to property, or endanger life or health.'' BSEE's Legacy Data Verification Process...
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2012 CFR
2012-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2013 CFR
2013-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2014 CFR
2014-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
49 CFR 587.15 - Verification of aluminum honeycomb crush strength.
Code of Federal Regulations, 2011 CFR
2011-10-01
... material being tested. Four samples, each measuring 300 mm (11.8 in) × 300 mm (11.8 in) × 25 mm (1 in.... Samples of the following size are used for testing. The length is 150 mm (5.9 in) ±6 mm (0.24 in), the width is 150 mm (5.9 in) ±6 mm (0.24 in), and the thickness is 25 mm (1 in) ±2 mm (0.08 in). The walls...
Impact of prospective verification of intravenous antibiotics in an ED.
Hunt, Allyson; Nakajima, Steven; Hall Zimmerman, Lisa; Patel, Manav
2016-12-01
Delay in appropriate antibiotic therapy is associated with increased mortality and prolonged length of stay. Automatic dispensing machines decrease the delivery time of intravenous (IV) antibiotics to patients in the emergency department (ED). However, when IV antibiotics are not reviewed by pharmacists before being administered, patients are at risk of receiving inappropriate antibiotic therapy. The objective of this study was to determine whether a difference exists in the time to administration of appropriate antibiotic therapy before and after implementation of prospective verification of antibiotics in the ED. This retrospective, institutional review board-approved pre- vs postimplementation study evaluated patients 18 years or older who were started on IV antibiotics in the ED. Patients were excluded if they were pregnant or prisoners, if no cultures were drawn, or if they were transferred from an outside facility. Appropriate antibiotic therapy was based on empiric, source-specific, evidence-based guidelines; appropriate pharmacokinetic and pharmacodynamic properties; and microbiologic data. The primary end point was the time from ED arrival to administration of appropriate antibiotic therapy. Of the 1628 patients evaluated, 128 met the inclusion criteria (64 pre vs 64 post). Patients were aged 65.2±17.0 years, with most infections being pneumonia (44%) and urinary tract infections (18%) and most patients being noncritically ill. Time to appropriate antibiotic therapy was reduced in the postgroup vs the pregroup (8.1±8.6 vs 15.2±22.8 hours, respectively; P=.03). In addition, appropriate empiric antibiotics were initiated more frequently after the implementation (92% post vs 66% pre; P=.0001). There was no difference in mortality or length of stay between the 2 groups. Prompt administration of appropriate antibiotics is imperative in patients presenting to the ED with infections.
Prospective verification of antibiotics by pharmacists led to significant improvements in both empiric selection of and time to appropriate antibiotic therapy. Copyright © 2016 Elsevier Inc. All rights reserved.
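The headline comparison above (post vs pre time to appropriate antibiotics) can be approximately reproduced from the published summary statistics. This is a hedged sketch using a Welch t-test on the reported means, standard deviations, and group sizes; the study's own statistical method is not stated in the abstract, so the exact P value may differ slightly from the reported .03.

```python
# Sketch: Welch t-test from the summary statistics reported in the abstract.
# This is an approximation of the published comparison, not the authors'
# actual analysis.
from scipy.stats import ttest_ind_from_stats

# Reported values (hours), n = 64 per group.
pre_mean, pre_sd, pre_n = 15.2, 22.8, 64
post_mean, post_sd, post_n = 8.1, 8.6, 64

result = ttest_ind_from_stats(
    pre_mean, pre_sd, pre_n,
    post_mean, post_sd, post_n,
    equal_var=False,  # Welch's test: the group variances differ markedly
)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```

The computed two-sided P value lands near the reported .03, consistent with a significant reduction in time to therapy.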
Sexing California gulls using morphometrics and discriminant function analysis
Herring, Garth; Ackerman, Joshua T.; Eagles-Smith, Collin A.; Takekawa, John Y.
2010-01-01
A discriminant function analysis (DFA) model was developed with DNA sex verification so that external morphology could be used to sex 203 adult California Gulls (Larus californicus) in San Francisco Bay (SFB). The best model was 97% accurate and included head-to-bill length, culmen depth at the gonys, and wing length. Using an iterative process, the model was simplified to a single measurement (head-to-bill length) that still assigned sex correctly 94% of the time. A previous California Gull sex determination model developed for a population in Wyoming was then assessed by fitting SFB California Gull measurement data to the Wyoming model; this new model failed to converge on the same measurements as those originally used by the Wyoming model. Results from the SFB discriminant function model were compared to the Wyoming model results (by using SFB data with the Wyoming model); the SFB model was 7% more accurate for SFB California Gulls. The simplified DFA model (head-to-bill length only) provided highly accurate results (94%) and minimized the measurements and time required to accurately sex California Gulls.
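The single-measurement approach described above can be sketched with scikit-learn's `LinearDiscriminantAnalysis`. The measurements below are synthetic stand-ins (not the San Francisco Bay data), chosen only to illustrate how a one-variable DFA assigns sex from head-to-bill length.

```python
# Illustrative sketch of a single-measurement discriminant function
# analysis (DFA) for sexing birds. Data are simulated, not the study's.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 100
# Hypothetical head-to-bill lengths (mm); males tend to be larger.
male = rng.normal(loc=106.0, scale=2.0, size=(n, 1))
female = rng.normal(loc=98.0, scale=2.0, size=(n, 1))
X = np.vstack([male, female])
y = np.array(["M"] * n + ["F"] * n)

dfa = LinearDiscriminantAnalysis()
dfa.fit(X, y)
accuracy = dfa.score(X, y)  # resubstitution accuracy, as in the study
print(f"single-measurement DFA accuracy: {accuracy:.2f}")
```

With well-separated class means, a one-variable DFA reaches accuracies comparable to the 94% reported for head-to-bill length alone; in practice, cross-validation rather than resubstitution gives a less optimistic estimate.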
Alloy, L B; Lipman, A J
1992-05-01
In this commentary we examine Swann, Wenzlaff, Krull, and Pelham's (1992) findings with respect to each of 5 central propositions in self-verification theory. We conclude that although the data are consistent with self-verification theory, none of the 5 components of the theory have been demonstrated convincingly as yet. Specifically, we argue that depressed subjects' selection of social feedback appears to be balanced or evenhanded rather than biased toward negative feedback and that there is little evidence to indicate that depressives actively seek negative appraisals. Furthermore, we suggest that the studies are silent with respect to the motivational postulates of self-verification theory and that a variety of competing cognitive and motivational models can explain Swann et al.'s findings as well as self-verification theory.
40 CFR 1065.378 - NO2-to-NO converter conversion verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Measurements § 1065.378 NO2-to-NO converter conversion verification. (a) Scope and frequency. If you use an... catalytic activity of the NO2-to-NO converter has not deteriorated. (b) Measurement principles. An NO2-to-NO.... Allow for stabilization, accounting only for transport delays and instrument response. (ii) Use an NO...
Cooperative Networked Control of Dynamical Peer-to-Peer Vehicle Systems
2007-12-28
dynamic deployment and task allocation; verification and hybrid systems; and information management for cooperative control. [Report table-of-contents fragments: 5.3 Decidability Results on Discrete and Hybrid Systems; 5.4 Switched Systems.] Verification and hybrid systems: the program has produced significant advances in the theory of hybrid input-output automata (HIOA).
Partner verification: restoring shattered images of our intimates.
De La Ronde, C; Swann, W B
1998-08-01
When spouses received feedback that disconfirmed their impressions of their partners, they attempted to undermine that feedback during subsequent interactions with these partners. Such partner verification activities occurred whether partners construed the feedback as overly favorable or overly unfavorable. Furthermore, because spouses tended to see their partners as their partners saw themselves, their efforts to restore their impressions of partners often worked hand-in-hand with partners' efforts to verify their own views. Finally, support for self-verification theory emerged in that participants were more intimate with spouses who verified their self-views, whether their self-views happened to be positive or negative.
Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin
2016-05-13
This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward.
Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the scheme's actual implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme represents a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often aims to optimize the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedules and costs is giving rise to a dialectical process inside the project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, coming from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities) as shown in Fig. 1 (from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering the thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted.
Similar considerations are summarized for the mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. (Fig. 1: Model philosophy, Verification and Test Programme definition.) The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bachner K. M.; Pepper, S.; Gomera, J.
BNL has offered "Nuclear Nonproliferation, Safeguards and Security in the 21st Century," referred to as NNSS, every year since 2009 for graduate students in technical and policy fields related to nuclear safeguards and nonproliferation. The course focuses on relevant policy issues, in addition to technical components, and is part of a larger NGSI short course initiative that includes separate courses that are delivered at three other national laboratories and NNSA headquarters. [SCHOLZ and ROSENTHAL] The course includes lectures from esteemed nonproliferation experts, tours of various BNL facilities and laboratories, and in-field and table-top exercises on both technical and policy subjects. Topics include the history of the Treaty on the Non-proliferation of Nuclear Weapons (NPT) and other relevant treaties, the history of and advances in international nuclear safeguards, current relevant political situations in countries such as Iran, Iraq, and the Democratic People's Republic of Korea (DPRK), nuclear science and technology, instrumentation and techniques used for verification activities, and associated research and development. The students conduct a mock Design Information Verification (DIV) at BNL's decommissioned Medical Research Reactor. The capstone of the course includes a series of student presentations in which students act as policy advisors and provide recommendations in response to scenarios involving a current nonproliferation-related event that are prepared by the course organizers. The course is open to domestic and foreign students, and caters to students in, entering, or recently having completed graduate school. Interested students must complete an application and provide a resume and a statement describing their interest in the course. Eighteen to 22 students attend annually; 165 students have completed the course to date. A stipend helps to defray students' travel and subsistence expenses.
In 2015, the course was shortened from three weeks to two weeks to streamline the material, standardize NGSI course length, and draw in a larger applicant pool. The international and interdisciplinary mix of students attending the course encourages discussions of the topics presented during the course. Information about the course is available at https://www.bnl.gov/nnsscourse/. While a complete analysis of course students has not been undertaken, BNL is aware of three individuals who worked at national laboratories after attending the NNSS course, one who worked at a national laboratory prior to attending NNSS, two who worked as federal employees after attending NNSS, three who were Nonproliferation Graduate Fellows before or after attending NNSS, and three who have participated in other NGSI activities. Design Information Verification is an IAEA inspection activity that is implemented for the purpose of ensuring that the facility design is consistent with the declared use of a facility.
NASA Astrophysics Data System (ADS)
Chen, Xiangqun; Huang, Rui; Shen, Liman; Chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng
2018-03-01
In this paper, semi-active RFID tags on watt-hour meters are applied to automatic test lines and intelligent warehouse management. Through the transmission, test, auxiliary and monitoring systems, the scheme realizes scheduling, binding, control, data exchange and other functions for watt-hour meters, yielding more accurate positioning, more efficient management, rapid data updates, and at-a-glance access to all information. It effectively improves the quality, efficiency and automation of verification, and realizes more efficient data and warehouse management.
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Lo, R. Y.
1987-01-01
Modeling of SEU has been done in a CMOS static RAM containing 1-micron-channel-length transistors fabricated from a p-well epilayer process using both circuit-simulation and numerical-simulation techniques. The modeling results have been experimentally verified with the aid of heavy-ion beams obtained from a three-stage tandem van de Graaff accelerator. Experimental evidence for a novel SEU mode in an ON n-channel device is presented.
National Center for Nuclear Security - NCNS
None
2018-01-16
As the United States embarks on a new era of nuclear arms control, the tools for treaty verification must be accurate and reliable, and must work at stand-off distances. The National Center for Nuclear Security, or NCNS, at the Nevada National Security Site, is poised to become the proving ground for these technologies. The center is a unique test bed for non-proliferation and arms control treaty verification technologies. The NNSS is an ideal location for these kinds of activities because of its multiple environments, its cadre of experienced nuclear personnel, and the artifacts of atmospheric and underground nuclear weapons explosions. The NCNS will provide future treaty negotiators with solid data on verification and inspection regimes and a realistic environment in which future treaty verification specialists can be trained. Work on warhead monitoring at the NCNS will also support future arms reduction treaties.
45 CFR 261.60 - What hours of participation may a State report for a work-eligible individual?
Code of Federal Regulations, 2010 CFR
2010-10-01
... days that it wishes to count as holidays for those in unpaid activities in its Work Verification Plan... policies and definitions as part of its Work Verification Plan, specified at § 261.62. (c) For unsubsidized... toward the participation rate for a self-employed individual than the number derived by dividing the...
Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. D. Habel
2008-05-20
This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.
This verification study was a special project designed to determine the efficacy of a draft standard operating procedure (SOP) developed by US EPA Region 3 for the determination of selected glycols in drinking waters that may have been impacted by active unconventional oil and ga...
Mantziaras, I D; Stamou, A; Katsiri, A
2011-06-01
This paper addresses nitrogen removal optimization in an alternating oxidation ditch system through the use of a mathematical model and pilot testing. The pilot system in which measurements were made has a total volume of 120 m(3) and consists of two ditches operating in four phases during one cycle, performing carbon oxidation, nitrification, denitrification and settling. The mathematical model consists of one-dimensional mass balance (convection-dispersion) equations based on the IAWPRC ASM 1 model. After calibration and verification of the model, system performance was simulated. Optimization is achieved by testing operational cycles and phases with different time lengths. The limits of EU directive 91/271 for nitrogen removal have been used for comparison. The findings show that operational cycles with shorter time lengths can achieve higher nitrogen removal and that an "equilibrium" between phase time percentages in the whole cycle, for a given inflow, must be achieved.
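The one-dimensional convection-dispersion mass balances referred to above take a generic form of the following kind (the notation here is assumed for illustration, not taken from the paper):

```latex
\frac{\partial C_i}{\partial t}
  = -u \,\frac{\partial C_i}{\partial x}
  + D \,\frac{\partial^2 C_i}{\partial x^2}
  + r_i(C_1, \dots, C_n)
```

where $C_i$ is the concentration of component $i$ along the ditch coordinate $x$, $u$ is the flow velocity, $D$ is the dispersion coefficient, and $r_i$ collects the biokinetic reaction rates (growth, decay, hydrolysis) supplied by the ASM 1 model.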
Verification and Validation of Multisegmented Mooring Capabilities in FAST v8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersen, Morten T.; Wendt, Fabian F.; Robertson, Amy N.
2016-07-01
The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.
JPL control/structure interaction test bed real-time control computer architecture
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1989-01-01
The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm), and their verification in ground and possibly flight test. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.
Fabrication and optimization of 1.55-μm InGaAsP/InP high-power semiconductor diode laser
NASA Astrophysics Data System (ADS)
Qing, Ke; Shaoyang, Tan; Songtao, Liu; Dan, Lu; Ruikang, Zhang; Wei, Wang; Chen, Ji
2015-09-01
A comprehensive design optimization of 1.55-μm high-power InGaAsP/InP broad-area lasers is performed, aiming at increasing the internal quantum efficiency (ηi) while maintaining the low internal loss (αi) of the device, thereby achieving high-power operation. Four different waveguide structures of broad-area lasers were fabricated and characterized in depth. Through theoretical analysis and experimental verification, we show that laser structures with a stepped waveguide and a thin upper separate confinement layer result in high ηi and overall slope efficiency. A continuous-wave (CW) single-side output power of 160 mW was obtained for an uncoated laser with a 50-μm active area width and 1-mm cavity length. Project supported by the National Natural Science Foundation of China (Nos. 61274046, 61201103) and the National High Technology Research and Development Program of China (No. 2013AA014202).
Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A
2015-12-08
Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry, anisotropic and spatially varying material properties similar to those seen in the left ventricle, and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions of approximately 0.5%, and consistent results between linear, quadratic and cubic finite elements as well as different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be effectively used in verification of future cardiac mechanics software.
Doebling, Scott William
2016-10-22
This paper documents the escape of high explosive (HE) products problem. The problem, first presented by Fickett & Rivard, tests the implementation and numerical behavior of a high explosive detonation and energy release model and its interaction with an associated compressible hydrodynamics simulation code. The problem simulates the detonation of a finite-length, one-dimensional piece of HE that is driven by a piston from one end and adjacent to a void at the other end. The HE equation of state is modeled as a polytropic ideal gas. The HE detonation is assumed to be instantaneous with an infinitesimal reaction zone. Via judicious selection of the material specific heat ratio, the problem has an exact solution with linear characteristics, enabling a straightforward calculation of the physical variables as a function of time and space. Lastly, implementation of the exact solution in the Python code ExactPack is discussed, as are verification cases for the exact solution code.
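The polytropic ideal-gas equation of state mentioned above can be sketched in a few lines. This is an illustrative helper, not ExactPack's implementation; the numerical values are hypothetical, and the choice gamma = 3 is the kind of "judicious" specific-heat ratio that makes the characteristics linear in such problems.

```python
# Sketch of the polytropic ideal-gas EOS used to model the HE products.
# Values below are illustrative, not the problem's actual initial data.
import math

GAMMA = 3.0  # assumed specific-heat ratio giving linear characteristics

def pressure(rho: float, e: float, gamma: float = GAMMA) -> float:
    """Polytropic EOS: p = (gamma - 1) * rho * e."""
    return (gamma - 1.0) * rho * e

def sound_speed(rho: float, p: float, gamma: float = GAMMA) -> float:
    """Isentropic sound speed: c = sqrt(gamma * p / rho)."""
    return math.sqrt(gamma * p / rho)

rho0, e0 = 1.6, 4.0        # hypothetical density and specific internal energy
p0 = pressure(rho0, e0)    # (gamma - 1) * rho * e = 2 * 1.6 * 4.0 = 12.8
c0 = sound_speed(rho0, p0)
print(f"p0 = {p0}, c0 = {c0:.3f}")
```

With these relations, the characteristic speeds u ± c follow directly, which is what the exact solution traces through time and space.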
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for the accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1 g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
Workgroup for Hydraulic laboratory Testing and Verification of Hydroacoustic Instrumentation
Fulford, Janice M.; Armstrong, Brandy N.; Thibodeaux, Kirk G.
2015-01-01
An international workgroup was recently formed for hydraulic laboratory testing and verification of hydroacoustic instrumentation used for water velocity measurements. The activities of the workgroup have included one face-to-face meeting, conference calls, and an inter-laboratory exchange of two acoustic meters among participating laboratories. Good agreement was found among the four laboratories at higher tow speeds, and poorer agreement at the lowest tow speed.
Quantitative assessment of the physical potential of proton beam range verification with PET/CT.
Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T
2008-08-07
A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
NASA Technical Reports Server (NTRS)
Bernstein, Karen S.; Kujala, Rod; Fogt, Vince; Romine, Paul
2011-01-01
This document establishes the structural requirements for human-rated spaceflight hardware including launch vehicles, spacecraft and payloads. These requirements are applicable to Government Furnished Equipment activities as well as all related contractor, subcontractor and commercial efforts. These requirements are not imposed on systems other than human-rated spacecraft, such as ground test articles, but may be tailored for use in specific cases where it is prudent to do so such as for personnel safety or when assets are at risk. The requirements in this document are focused on design rather than verification. Implementation of the requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The SVP may also document unique verifications that meet or exceed these requirements with NASA Technical Authority approval.
Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA
NASA Astrophysics Data System (ADS)
Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo
2014-06-01
In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite will be presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO Earth-observation satellite that will provide high-resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical and electromagnetic compatibility tests for the FM. In addition, this paper includes centre-of-gravity and moment-of-inertia measurements for both models, and other verification activities carried out to ensure the satellite's health during launch and its in-orbit performance.
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault-tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed in the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Performing Verification and Validation in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1999-01-01
The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, but also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process in order to adapt V&V to reuse-based software engineering.
Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersen, Morten T.; Wendt, Fabian; Robertson, Amy
2016-08-01
The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.
Leakage effect analysis on the performance of a cylindrical adjustable inertance tube
NASA Astrophysics Data System (ADS)
Zhou, Wenjie; Pfotenhauer, John M.; Zhi, Xiaoqin
2018-04-01
The inertance tube plays a significant role in improving the performance of the Stirling-type pulse tube cryocooler by providing the desired phase angle between the mass flow and the pressure wave. The phase angle is highly dependent on the inertance tube geometry, such as its diameter and length. A cylindrical threaded-root device with variable thread depth on the outer and inner screws creates an adjustable inertance tube whose diameter and length can be adjusted in real time. However, due to its geometric imperfections, the performance of this threaded inertance tube is reduced by leaks through the roots between the two screws. Its phase-angle shift ability is decreased by 30% at a leakage clearance thickness of 15.5 μm, according to both theoretical prediction and experimental verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
ADAMS, WADE C
At the Pennsylvania Department of Environmental Protection's request, ORAU's IEAV program conducted verification surveys on the excavated surfaces of Section 3, SUs 1, 4, and 5 at the Whittaker site on March 13 and 14, 2013. The survey activities included visual inspections, gamma radiation surface scans, gamma activity measurements, and soil sampling activities. Verification activities also included the review and assessment of the licensee's project documentation and methodologies. Surface scans identified four areas of elevated direct gamma radiation distinguishable from background; one area within SUs 1 and 4 and two areas within SU5. One area within SU5 was remediated by removing a golf-ball-sized piece of slag while ORAU staff was onsite. With the exception of the golf-ball-sized piece of slag within SU5, a review of the ESL Section 3 EXS data packages for SUs 1, 4, and 5 indicated that these locations of elevated gamma radiation were also identified by the ESL gamma scans and that ESL personnel performed additional investigations and soil sampling within these areas. The investigative results indicated that the areas met the release criteria.
NASA Astrophysics Data System (ADS)
Podkościelny, P.; Nieszporek, K.
2007-01-01
Surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting an integral equation. The experimental data on phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) have been taken from the literature. AED functions for phenol adsorption, generated by application of the regularization method, have been verified. The Grand Canonical Monte Carlo (GCMC) simulation technique has been used as the verification tool. The definitive stage of verification was a comparison of the experimental adsorption data with those obtained by GCMC simulations. The necessary information for performing the simulations was provided by the parameters of the AED functions calculated by the regularization method.
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul
2002-01-01
This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)
2001-01-01
This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
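Of the three classical benchmarks, Poiseuille flow has a closed-form solution against which a numerical scheme can be checked. The sketch below is a minimal version of such a check; the pipe radius, viscosity and pressure gradient are illustrative values, not ones taken from the paper.

```python
# Analytic laminar pipe (Poiseuille) flow, usable as a benchmark reference
# for a multi-dimensional network-flow solver. Parameter values are made up.
import math

def poiseuille_profile(r, R, dp_dx, mu):
    """Axial velocity at radius r for fully developed laminar pipe flow."""
    return (-dp_dx) * (R**2 - r**2) / (4.0 * mu)

R, mu, dp_dx = 0.01, 1.0e-3, -100.0   # radius [m], viscosity [Pa s], pressure gradient [Pa/m]

u_max = poiseuille_profile(0.0, R, dp_dx, mu)        # centerline velocity
Q = math.pi * R**4 * (-dp_dx) / (8.0 * mu)           # volumetric flow rate
u_mean = Q / (math.pi * R**2)                        # area-averaged velocity

# Classical consistency check: the centerline velocity is twice the mean.
assert abs(u_max - 2.0 * u_mean) < 1e-9
```

A numerical solver's computed profile can be compared point by point against `poiseuille_profile` to quantify discretization error.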
Optical stabilization for time transfer infrastructure
NASA Astrophysics Data System (ADS)
Vojtech, Josef; Altmann, Michal; Skoda, Pavel; Horvath, Tomas; Slapak, Martin; Smotlacha, Vladimir; Havlis, Ondrej; Munster, Petr; Radil, Jan; Kundrat, Jan; Altmannova, Lada; Velc, Radek; Hula, Miloslav; Vohnout, Rudolf
2017-08-01
In this paper, we propose and present verification of all-optical methods for stabilization of the end-to-end delay of an optical fiber link. These methods are verified for deployment within an infrastructure for accurate time and stable frequency distribution, based on sharing fibers with a research and educational network carrying live data traffic. The methods range from path-length control, through temperature conditioning, to transmit-wavelength control. Attention is given to achieving continuous control over a relatively broad range of delays. We summarize design rules for delay stabilization based on the character and the total jitter of the delay.
Experimental verification of ‘waveguide’ plasmonics
NASA Astrophysics Data System (ADS)
Prudêncio, Filipa R.; Costa, Jorge R.; Fernandes, Carlos A.; Engheta, Nader; Silveirinha, Mário G.
2017-12-01
Surface plasmon polaritons are collective excitations of an electron gas that occur at an interface between negative-ɛ and positive-ɛ media. Here, we report the experimental observation of such surface waves using simple waveguide metamaterials filled only with available positive-ɛ media at microwave frequencies. In contrast to optical designs, in our setup the propagation length of the surface plasmons can be rather long, as low-loss conventional dielectrics are chosen to avoid the typical losses of negative-ɛ media. Plasmonic phenomena have potential applications in enhancing light-matter interactions, implementing nanoscale photonic circuits and integrated photonics.
Pulse power applications of silicon diodes in EML capacitive pulsers
NASA Astrophysics Data System (ADS)
Dethlefsen, Rolf; McNab, Ian; Dobbie, Clyde; Bernhardt, Tom; Puterbaugh, Robert; Levine, Frank; Coradeschi, Tom; Rinaldi, Vito
1993-01-01
Crowbar diodes are used for increasing the energy transfer from capacitive pulse-forming networks. They also prevent voltage reversal on the energy storage capacitors. Diodes of 52 mm diameter with a 5 kV reverse blocking voltage, rated at 40 kA, were successfully used for the 32 MJ SSG rail gun. An uprated diode with increased current capability and a 15 kV reverse blocking voltage has been developed. Transient thermal analysis has predicted the current ratings for different pulse lengths. Verification of the analysis is obtained from destructive testing.
Active Interrogation for Spent Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swinhoe, Martyn Thomas; Dougan, Arden
2015-11-05
The DDA instrument for nuclear safeguards is a fast, non-destructive assay technique based on active neutron interrogation, using an external 14 MeV DT neutron generator for characterization and verification of spent nuclear fuel assemblies.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current-generation aviation software, which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.
Hybrid Gamma Emission Tomography (HGET): FY16 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Smith, Leon E.; Wittman, Richard S.
2017-02-01
Current International Atomic Energy Agency (IAEA) methodologies for the verification of fresh low-enriched uranium (LEU) and mixed oxide (MOX) fuel assemblies are volume-averaging methods that lack sensitivity to individual pins. Further, as fresh fuel assemblies become more and more complex (e.g., heavy gadolinium loading, high degrees of axial and radial variation in fissile concentration), the accuracy of current IAEA instruments degrades and measurement time increases. Particularly in light of the fact that no special tooling is required to remove individual pins from modern fuel assemblies, the IAEA needs new capabilities for the verification of unirradiated (i.e., fresh LEU and MOX) assemblies to ensure that fissile material has not been diverted. Passive gamma emission tomography has demonstrated potential to provide pin-level verification of spent fuel, but gamma-ray emission rates from unirradiated fuel are significantly lower, precluding purely passive tomography methods. The work presented here introduces the concept of Hybrid Gamma Emission Tomography (HGET) for verification of unirradiated fuels, in which a neutron source is used to actively interrogate the fuel assembly and the resulting gamma-ray emissions are imaged using tomographic methods to provide pin-level verification of fissile material concentration.
Separating stages of arithmetic verification: An ERP study with a novel paradigm.
Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes
2015-08-01
In studies of arithmetic verification, participants typically encounter two operands and carry out an operation on these (e.g. adding them). The operands are followed by a proposed answer, and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. To overcome this problem, here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus property mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail.
Acelam, Philip A
2015-01-01
To determine and verify how anthropometric variables correlate to ureteric lengths and how well statistical models approximate the actual ureteric lengths. In this work, 129 charts of endourological patients (71 females and 58 males) were studied retrospectively. Data were gathered from various research centers from North and South America. Continuous data were studied using descriptive statistics. Anthropometric variables (age, body surface area, body weight, obesity, and stature) were utilized as predictors of ureteric lengths. Linear regressions and correlations were used for studying relationships between the predictors and the outcome variables (ureteric lengths); P-value was set at 0.05. To assess how well statistical models were capable of predicting the actual ureteric lengths, percentages (or ratios of matched to mismatched results) were employed. The results of the study show that anthropometric variables do not correlate well to ureteric lengths. Statistical models can partially estimate ureteric lengths. Out of the five anthropometric variables studied, three of them: body frame, stature, and weight, each with a P<0.0001, were significant. Two of the variables: age (R2=0.01; P=0.20) and obesity (R2=0.03; P=0.06), were found to be poor estimators of ureteric lengths. None of the predictors reached the expected (match:above:below) ratio of 1:0:0 to qualify as reliable predictors of ureteric lengths. There is not sufficient evidence to conclude that anthropometric variables can reliably predict ureteric lengths. These variables appear to lack adequate specificity as they failed to reach the expected (match:above:below) ratio of 1:0:0. Consequently, selections of ureteral stents continue to remain a challenge. However, height (R2=0.68) with the (match:above:below) ratio of 3:3:4 appears suited for use as estimator, but on the basis of decision rule. Additional research is recommended for stent improvements and ureteric length determinations.
Acelam, Philip A
2015-01-01
Objective To determine and verify how anthropometric variables correlate to ureteric lengths and how well statistical models approximate the actual ureteric lengths. Materials and methods In this work, 129 charts of endourological patients (71 females and 58 males) were studied retrospectively. Data were gathered from various research centers from North and South America. Continuous data were studied using descriptive statistics. Anthropometric variables (age, body surface area, body weight, obesity, and stature) were utilized as predictors of ureteric lengths. Linear regressions and correlations were used for studying relationships between the predictors and the outcome variables (ureteric lengths); P-value was set at 0.05. To assess how well statistical models were capable of predicting the actual ureteric lengths, percentages (or ratios of matched to mismatched results) were employed. Results The results of the study show that anthropometric variables do not correlate well to ureteric lengths. Statistical models can partially estimate ureteric lengths. Out of the five anthropometric variables studied, three of them: body frame, stature, and weight, each with a P<0.0001, were significant. Two of the variables: age (R2=0.01; P=0.20) and obesity (R2=0.03; P=0.06), were found to be poor estimators of ureteric lengths. None of the predictors reached the expected (match:above:below) ratio of 1:0:0 to qualify as reliable predictors of ureteric lengths. Conclusion There is not sufficient evidence to conclude that anthropometric variables can reliably predict ureteric lengths. These variables appear to lack adequate specificity as they failed to reach the expected (match:above:below) ratio of 1:0:0. Consequently, selections of ureteral stents continue to remain a challenge. However, height (R2=0.68) with the (match:above:below) ratio of 3:3:4 appears suited for use as estimator, but on the basis of decision rule. 
Additional research is recommended for stent improvements and ureteric length determinations. PMID:26317082
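The analysis described in the two records above amounts to ordinary least-squares regression and the coefficient of determination R2 for each predictor. A minimal sketch follows; the height and ureteric-length values are invented for illustration and are not the study's data.

```python
# Least-squares fit of ureteric length on stature, with R^2.
# All data values below are hypothetical illustrations.
import statistics

heights = [150, 160, 165, 170, 175, 180, 185]          # cm (hypothetical)
lengths = [22.0, 23.1, 23.5, 24.2, 24.6, 25.4, 25.9]   # cm (hypothetical)

mx, my = statistics.mean(heights), statistics.mean(lengths)
sxx = sum((x - mx) ** 2 for x in heights)
sxy = sum((x - mx) * (y - my) for x, y in zip(heights, lengths))
slope = sxy / sxx
intercept = my - slope * mx

# Coefficient of determination: fraction of variance explained by the fit.
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(heights, lengths))
ss_tot = sum((y - my) ** 2 for y in lengths)
r2 = 1.0 - ss_res / ss_tot
```

An R2 near 0 (as the study reports for age and obesity) means the predictor explains almost none of the variation in ureteric length, which is why those variables fail as estimators.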
Wang, Guoqiang; Zhang, Honglin; Zhao, Jiyang; Li, Wei; Cao, Jia; Zhu, Chengjian; Li, Shuhua
2016-05-10
Density functional theory (DFT) investigations revealed that 4-cyanopyridine was capable of homolytically cleaving the B-B σ bond of diborane via cooperative coordination to the two boron atoms of the diborane to generate pyridine boryl radicals. Our experimental verification provides supportive evidence for this new B-B activation mode. With this novel activation strategy, we have experimentally realized the catalytic reduction of azo compounds to hydrazine derivatives, the deoxygenation of sulfoxides to sulfides, and the reduction of quinones with B2(pin)2 under mild conditions.
NASA Technical Reports Server (NTRS)
Neogi, Natasha A.
2016-01-01
There is a current drive towards enabling the deployment of increasingly autonomous systems in the National Airspace System (NAS). However, shifting the traditional roles and responsibilities between humans and automation for safety-critical tasks must be managed carefully, otherwise the current emergent safety properties of the NAS may be disrupted. In this paper, a verification activity is conducted to assess the emergent safety properties of a clearly defined, safety-critical, operational scenario that possesses tasks that can be fluidly allocated between human and automated agents. Task allocation role sets were proposed for a human-automation team performing a contingency maneuver in a reduced-crew context. A safety-critical contingency procedure (engine out on takeoff) was modeled in the Soar cognitive architecture, then translated into the Hybrid Input Output formalism. Verification activities were then performed to determine whether or not the safety properties held over the increasingly autonomous system. The verification activities led to the development of several key insights regarding the implicit assumptions on agent capability. They subsequently illustrated the usefulness of task annotations associated with specialized requirements (e.g., communication, timing, etc.) and demonstrated the feasibility of this approach.
Yi, Yan; Li, Xihong; Ouyang, Yan; Lin, Ge; Lu, Guangxiu; Gong, Fei
2016-05-01
To investigate a forecasting method developed to predict first trimester pregnancy outcomes using the first routine ultrasound scan for early pregnancy on days 27-29 after ET and to determine whether to perform a repeated scan several days later based on this forecasting method. Prospective analysis. Infertile patients at an assisted reproductive technology center. A total of 9,963 patients with an early singleton pregnancy after in vitro fertilization (IVF)-ET. None. Ongoing pregnancy >12 weeks of gestation. The classification score of ongoing pregnancy was equal to (1.57 × Maternal age) + (1.01 × Mean sac diameter) + (-0.19 × Crown-rump length) + 25.15 (if cardiac activity is present) + 1.30 (if intrauterine hematomas are present) - 47.35. The classification score of early pregnancy loss was equal to (1.66 × Maternal age) + (0.84 × Mean sac diameter) + (-0.38 × Crown-rump length) + 8.69 (if cardiac activity is present) + 1.60 (if intrauterine hematomas are present) - 34.77. In verification samples, 94.44% of cases were correctly classified using these forecasting models. The discriminant forecasting models are accurate in predicting first trimester pregnancy outcomes based on the first scan for early pregnancy after ET. When the predictive result is ongoing pregnancy, a second scan can be postponed until 11-14 weeks if no symptoms of abdominal pain or vaginal bleeding are present. When the predictive results suggest early pregnancy loss, repeated scans are imperative to avoid a misdiagnosis before evacuating the uterus.
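The two classification scores above are linear discriminant functions and can be computed directly. The sketch below uses the coefficients quoted in the abstract; assigning the class with the higher score is the standard discriminant-analysis convention and is an assumption here, as are the example patient values.

```python
# Discriminant scores from the abstract (age in years, measurements in mm).
def ongoing_score(age, mean_sac_diameter, crown_rump_length, cardiac, hematoma):
    return (1.57 * age + 1.01 * mean_sac_diameter - 0.19 * crown_rump_length
            + (25.15 if cardiac else 0.0)
            + (1.30 if hematoma else 0.0)
            - 47.35)

def loss_score(age, mean_sac_diameter, crown_rump_length, cardiac, hematoma):
    return (1.66 * age + 0.84 * mean_sac_diameter - 0.38 * crown_rump_length
            + (8.69 if cardiac else 0.0)
            + (1.60 if hematoma else 0.0)
            - 34.77)

def predict(age, msd, crl, cardiac, hematoma):
    """Assumed decision rule: the class with the higher score wins."""
    og = ongoing_score(age, msd, crl, cardiac, hematoma)
    lo = loss_score(age, msd, crl, cardiac, hematoma)
    return "ongoing pregnancy" if og > lo else "early pregnancy loss"
```

For example, a hypothetical 30-year-old with a 15 mm sac, 5 mm crown-rump length, visible cardiac activity and no hematoma scores higher on the ongoing-pregnancy function, so the model would defer the second scan.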
NASA Astrophysics Data System (ADS)
Rieben, James C., Jr.
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.
Verification of space weather forecasts at the UK Met Office
NASA Astrophysics Data System (ADS)
Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.
2017-12-01
The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users, both to understand the performance of these forecasts and to identify strengths and weaknesses that enable further development. Met Office terrestrial near-real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves, reliability diagrams and rolling Ranked Probability Skill Scores (RPSSs), thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC-issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
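The Ranked Probability Skill Score mentioned above compares a probabilistic forecast over ordered categories against a reference such as climatology. A minimal sketch of the standard definition, with no connection to the Met Office's internal implementation, is:

```python
# Ranked probability score (RPS) and skill score (RPSS) for ordered categories.
def rps(forecast, outcome_category):
    """RPS for one forecast: sum of squared cumulative-probability errors."""
    cum_f, cum_o, score = 0.0, 0.0, 0.0
    for k, p in enumerate(forecast):
        cum_f += p
        cum_o += 1.0 if k == outcome_category else 0.0
        score += (cum_f - cum_o) ** 2
    return score

def rpss(forecasts, outcomes, climatology):
    """Skill relative to a fixed climatological forecast; >0 means skill."""
    rps_f = sum(rps(f, o) for f, o in zip(forecasts, outcomes)) / len(outcomes)
    rps_c = sum(rps(climatology, o) for o in outcomes) / len(outcomes)
    return 1.0 - rps_f / rps_c
```

A perfect forecast scores RPSS = 1, a forecast matching climatology scores 0, and negative values indicate the forecast is worse than the climatological benchmark, which is the comparison the abstract reports for the X-ray flare guidance.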
NASA Technical Reports Server (NTRS)
Gravitz, Robert M.; Hale, Joseph
2006-01-01
NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on a model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation will discuss NASA's approach to verification and validation (V&V) of its models and simulations supporting space exploration. It will describe NASA's V&V process and the associated M&S V&V activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD will also be illustrated.
Discriminative Features Mining for Offline Handwritten Signature Verification
NASA Astrophysics Data System (ADS)
Neamah, Karrar; Mohamad, Dzulkifli; Saba, Tanzila; Rehman, Amjad
2014-03-01
Signature verification is an active research area in the field of pattern recognition. It is employed to identify a particular person from the characteristics of his or her signature, such as pen pressure, the shape of loops, writing speed and the up-and-down motion of the pen. In the entire process, the feature extraction and selection stage is of prime importance, since several signatures may share similar strokes, characteristics and sizes. Accordingly, this paper presents a combination of skeleton orientation and the gravity centre point to extract accurate pattern features of signature data in an offline signature verification system. Promising results have proved the success of the integration of the two methods.
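The gravity centre point mentioned in this abstract is, in the usual image-processing sense, the centroid of the "on" pixels of the binarized signature. The sketch below is a generic illustration of that idea, not the authors' implementation.

```python
# Centroid ("gravity centre") of a binary image, a common global feature
# in offline signature verification. The 3x3 image below is illustrative.
def gravity_centre(image):
    """image: 2-D list of 0/1 pixels; returns the (row, col) centroid."""
    total = rsum = csum = 0
    for r, row in enumerate(image):
        for c, px in enumerate(row):
            if px:
                total += 1
                rsum += r
                csum += c
    return (rsum / total, csum / total)

img = [[0, 1, 0],
       [1, 1, 1],
       [0, 1, 0]]
# The cross is symmetric, so the centroid falls on the central pixel (1, 1).
```

In a real system the centroid is typically combined with local features, such as the skeleton orientations the paper pairs it with, before being fed to a classifier.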
Aoki, Kimiko; Tanaka, Hiroyuki; Kawahara, Takashi
2018-07-01
The standard method for personal identification and verification of urine samples in doping control is short tandem repeat (STR) analysis using nuclear DNA (nDNA). The DNA concentration of urine is very low and decreases under most conditions used for sample storage; therefore, the amount of DNA from cryopreserved urine samples may be insufficient for STR analysis. We aimed to establish a multiplexed assay for urine mitochondrial DNA typing containing only trace amounts of DNA, particularly for Japanese populations. A multiplexed suspension-array assay using oligo-tagged microspheres (Luminex MagPlex-TAG) was developed to measure C-stretch length in hypervariable region 1 (HV1) and 2 (HV2), five single nucleotide polymorphisms (SNPs), and one polymorphic indel. Based on these SNPs and the indel, the Japanese population can be classified into five major haplogroups (D4, B, M7a, A, D5). The assay was applied to DNA samples from urine cryopreserved for 1 - 1.5 years (n = 63) and fresh blood (n = 150). The assay with blood DNA enabled Japanese subjects to be categorized into 62 types, exhibiting a discriminatory power of 0.960. The detection limit for cryopreserved urine was 0.005 ng of nDNA. Profiling of blood and urine pairs revealed that 5 of 63 pairs showed different C-stretch patterns in HV1 or HV2. The assay described here yields valuable information in terms of the verification of urine sample sources employing only trace amounts of recovered DNA. However, blood cannot be used as a reference sample.
Applications of square-related theorems
NASA Astrophysics Data System (ADS)
Srinivasan, V. K.
2014-04-01
The square centre of a given square is the point of intersection of its two diagonals. When two squares of different side lengths share the same square centre, there are in general four diagonals that pass through that centre. The Two Squares Theorem developed in this paper summarizes some nice theoretical conclusions that can be obtained when two squares of different side lengths share the same square centre. These results provide the theoretical basis for two of the constructions given in the book by H.S. Hall and F.H. Stevens, 'A Shorter School Geometry, Part 1, Metric Edition'. On page 134 of this book, the authors present, in exercise 4, a practical construction which leads to a verification of the Pythagorean theorem. Subsequently, in Theorems 29 and 30, the authors present the standard proofs of the Pythagorean theorem and its converse. On page 140, the authors present, in exercise 15, what amounts to a geometric construction whose verification involves a simple algebraic identity. Both constructions are of great importance and can be replicated using the standard equipment provided in a 'geometry toolbox' carried by students in high schools. The author hopes that the results proved in this paper, in conjunction with the two constructions from the above-mentioned book, will give high school students an appreciation of the celebrated theorem of Pythagoras. The diagrams that accompany this document are based on the free software GeoGebra. The author formally acknowledges his indebtedness to the creators of this free software at the end of this document.
Precision segmented reflector, figure verification sensor
NASA Technical Reports Server (NTRS)
Manhart, Paul K.; Macenka, Steve A.
1989-01-01
The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), designed to monitor the active control system of the segments, is described; a best-fit surface is defined, and the image and wavefront quality of the assembled array of reflecting panels is assessed.
A Conceptual Working Paper on Arms Control Verification,
1981-08-01
AD-A110 748 OPERATIONAL RESEARCH AND ANALYSIS ESTABLISHMENT OTTAWA--ETC F/S 5/4 - A CONCEPTUAL WORKING PAPER ON ARMS CONTROL VERIFICATION (U) AUG 81 F R... researched for the paper comes from ORAE Report No. R73, Compendium of Arms Control Verification Proposals, submitted simultaneously to the Committee on... nuclear activities within the territory of the non-nuclear weapon state, or carried out under its control anywhere. Parties also undertake not to
An evaluation of SEASAT-A candidate ocean industry economic verification experiments
NASA Technical Reports Server (NTRS)
1977-01-01
A description of the candidate economic verification experiments which could be performed with SEASAT is provided. Experiments have been identified in each of the areas of ocean-based activity that are expected to show an economic impact from the use of operational SEASAT data. Experiments have been identified in the areas of Arctic operations, the ocean fishing industry, the offshore oil and natural gas industry, as well as ice monitoring and coastal zone applications.
Some considerations in the evaluation of Seasat-A scatterometer /SASS/ measurements
NASA Technical Reports Server (NTRS)
Halberstam, I.
1980-01-01
A study is presented of the geophysical algorithms relating the Seasat-A scatterometer (SASS) backscatter measurements with a wind parameter. Although these measurements are closely related to surface features, an identification with surface layer parameters such as friction velocity or the roughness length is difficult. It is shown how surface truth in the form of wind speeds and coincident stability can be used to derive friction velocity or the equivalent neutral wind at an arbitrary height; it is also shown that the derived friction velocity values are sensitive to contested formulations relating friction velocity to the roughness length, while the derived values of the equivalent neutral wind are not. Examples of geophysical verification are demonstrated using values obtained from the Gulf of Alaska Seasat Experiment; these results show very little sensitivity to the type of wind parameter employed, suggesting that this insensitivity is mainly due to a large scatter in the SASS and surface truth data.
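The equivalent neutral wind at an arbitrary height follows from the logarithmic surface-layer profile U(z) = (u*/kappa) ln(z/z0); a sketch with illustrative values (the choice of roughness length z0 is exactly the contested formulation the abstract notes, so the numbers below are placeholders):

```python
import math

KAPPA = 0.4  # von Karman constant

def neutral_wind(u_star, z, z0):
    """Equivalent neutral wind at height z (m) from the logarithmic
    surface-layer profile U(z) = (u*/kappa) * ln(z/z0)."""
    return (u_star / KAPPA) * math.log(z / z0)

# Illustrative values: u* = 0.4 m/s, z = 10 m, z0 = 1e-4 m (smooth sea)
u10 = neutral_wind(0.4, 10.0, 1e-4)
```

The sensitivity the abstract describes is visible here: the derived u* for a given U(z) depends directly on the assumed z0, while the equivalent neutral wind at a fixed height does not require committing to a particular u*-z0 relation.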
Authentication of Botanical Origin in Herbal Teas by Plastid Noncoding DNA Length Polymorphisms.
Uncu, Ali Tevfik; Uncu, Ayse Ozgur; Frary, Anne; Doganlar, Sami
2015-07-01
The aim of this study was to develop a DNA barcode assay to authenticate the botanical origin of herbal teas. To reach this aim, we tested the efficiency of a PCR-capillary electrophoresis (PCR-CE) approach on commercial herbal tea samples using two noncoding plastid barcodes, the trnL intron and the intergenic spacer between trnL and trnF. Barcode DNA length polymorphisms proved successful in authenticating the species origin of herbal teas. We verified the validity of our approach by sequencing species-specific barcode amplicons from herbal tea samples. Moreover, we displayed the utility of PCR-CE assays coupled with sequencing to identify the origin of undeclared plant material in herbal tea samples. The PCR-CE assays proposed in this work can be applied as routine tests for the verification of botanical origin in herbal teas and can be extended to authenticate all types of herbal foodstuffs.
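The length-matching step of such a PCR-CE assay can be sketched as a lookup of measured capillary-electrophoresis peak sizes against reference amplicon lengths. The species names and lengths below are hypothetical placeholders, not values from the study:

```python
# Hypothetical reference table of barcode amplicon lengths (bp) per
# (species, marker) pair; real trnL / trnL-trnF lengths would come
# from sequenced reference material.
REFERENCE_LENGTHS = {
    ("Mentha x piperita", "trnL"): 498,
    ("Matricaria chamomilla", "trnL"): 512,
}

def match_species(marker, measured_bp, tolerance_bp=2):
    """Return species whose reference amplicon length matches a
    measured CE peak within a sizing tolerance."""
    return [sp for (sp, m), length in REFERENCE_LENGTHS.items()
            if m == marker and abs(length - measured_bp) <= tolerance_bp]

hits = match_species("trnL", 511)
```

As in the paper, an unmatched or unexpected peak would then be flagged for sequencing to identify undeclared plant material.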
Linear and nonlinear verification of gyrokinetic microstability codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, R. V.; Candy, J.; Barnes, M.
2011-12-15
Verification of nonlinear microstability codes is a necessary step before comparisons or predictions of turbulent transport in toroidal devices can be justified. By verification we mean demonstrating that a code correctly solves the mathematical model upon which it is based. Some degree of verification can be accomplished indirectly from analytical instability threshold conditions, nonlinear saturation estimates, etc., for relatively simple plasmas. However, verification for experimentally relevant plasma conditions and physics is beyond the realm of analytical treatment and must rely on code-to-code comparisons, i.e., benchmarking. The premise is that the codes are verified for a given problem or set of parameters if they all agree within a specified tolerance. True verification requires comparisons for a number of plasma conditions, e.g., different devices, discharges, times, and radii. Running the codes and keeping track of linear and nonlinear inputs and results for all conditions could be prohibitive unless there was some degree of automation. We have written software to do just this and have formulated a metric for assessing agreement of nonlinear simulations. We present comparisons, both linear and nonlinear, between the gyrokinetic codes GYRO [J. Candy and R. E. Waltz, J. Comput. Phys. 186, 545 (2003)] and GS2 [W. Dorland, F. Jenko, M. Kotschenreuther, and B. N. Rogers, Phys. Rev. Lett. 85, 5579 (2000)]. We do so at the mid-radius for the same discharge as in earlier work [C. Holland, A. E. White, G. R. McKee, M. W. Shafer, J. Candy, R. E. Waltz, L. Schmitz, and G. R. Tynan, Phys. Plasmas 16, 052301 (2009)]. The comparisons include electromagnetic fluctuations, passing and trapped electrons, plasma shaping, one kinetic impurity, and finite Debye-length effects. Results neglecting and including electron collisions (Lorentz model) are presented.
We find that the linear frequencies with or without collisions agree well between codes, as do the time averages of the nonlinear fluxes without collisions. With collisions, the differences between the time-averaged fluxes are larger than the uncertainties defined as the oscillations of the fluxes, with the GS2 fluxes consistently larger (or more positive) than those from GYRO. However, the electrostatic fluxes are much smaller than those without collisions (the electromagnetic energy flux is negligible in both cases). In fact, except for the electron energy fluxes, the absolute magnitudes of the differences in fluxes with collisions are the same or smaller than those without. None of the fluxes exhibit large absolute differences between codes. Beyond these results, the specific linear and nonlinear benchmarks proposed here, as well as the underlying methodology, provide the basis for a wide variety of future verification efforts.
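A hypothetical version of the agreement metric described above, comparing time-averaged fluxes from two codes against an uncertainty defined by the oscillation (standard deviation) of each flux trace; the authors' actual metric may differ:

```python
import numpy as np

def fluxes_agree(flux_a, flux_b, n_sigma=1.0):
    """Compare two nonlinear flux time traces: agreement means the
    time-averaged fluxes differ by less than the combined oscillation
    (standard deviation) of the traces."""
    mean_a, mean_b = np.mean(flux_a), np.mean(flux_b)
    sigma = np.hypot(np.std(flux_a), np.std(flux_b))
    return abs(mean_a - mean_b) <= n_sigma * sigma

t = np.linspace(0.0, 100.0, 1000)
a = 1.00 + 0.1 * np.sin(t)   # saturated flux trace from code A
b = 1.02 + 0.1 * np.cos(t)   # code B: slightly offset time average
ok = fluxes_agree(a, b)      # small offset vs. ~0.1 oscillation
```

This reproduces the paper's premise in miniature: the codes "agree" when the difference in time averages falls within the oscillation-based uncertainty.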
MAGIC polymer gel for dosimetric verification in boron neutron capture therapy
Heikkinen, Sami; Kotiluoto, Petri; Serén, Tom; Seppälä, Tiina; Auterinen, Iiro; Savolainen, Sauli
2007-01-01
Radiation‐sensitive polymer gels are among the most promising three‐dimensional dose verification tools developed to date. We tested the normoxic polymer gel dosimeter known by the acronym MAGIC (methacrylic and ascorbic acid in gelatin initiated by copper) to evaluate its use in boron neutron capture therapy (BNCT) dosimetry. We irradiated a large cylindrical gel phantom (diameter: 10 cm; length: 20 cm) in the epithermal neutron beam of the Finnish BNCT facility at the FiR 1 nuclear reactor. Neutron irradiation was simulated with a Monte Carlo radiation transport code MCNP. To compare dose–response, gel samples from the same production batch were also irradiated with 6 MV photons from a medical linear accelerator. Irradiated gel phantoms then underwent magnetic resonance imaging to determine their R2 relaxation rate maps. The measured and normalized dose distribution in the epithermal neutron beam was compared with the dose distribution calculated by computer simulation. The results support the feasibility of using MAGIC gel in BNCT dosimetry. PACS numbers: 87.53.Qc, 87.53.Wz, 87.66.Ff PMID:17592463
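Converting the measured R2 relaxation-rate maps to dose typically assumes a linear gel response R2 = R2_0 + k*D over the working range; a sketch with made-up calibration numbers (not the paper's):

```python
import numpy as np

def dose_from_r2(r2_map, r2_background, sensitivity):
    """Convert an MRI R2 relaxation-rate map (1/s) to dose (Gy),
    assuming the usual linear gel response R2 = R2_0 + k * D."""
    return (np.asarray(r2_map) - r2_background) / sensitivity

# Illustrative calibration only: background R2_0 = 2.0 1/s,
# sensitivity k = 0.25 1/(s*Gy)
r2 = np.array([2.0, 2.5, 3.0])
dose = dose_from_r2(r2, r2_background=2.0, sensitivity=0.25)
# -> [0.0, 2.0, 4.0] Gy
```

In the study, the same-batch 6 MV photon irradiations provide exactly this kind of dose-response calibration for the neutron-beam phantom.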
Direct Numerical Simulations of a Full Stationary Wind-Turbine Blade
NASA Astrophysics Data System (ADS)
Qamar, Adnan; Zhang, Wei; Gao, Wei; Samtaney, Ravi
2014-11-01
Direct numerical simulation of flow past a full stationary wind-turbine blade is carried out at Reynolds number Re = 10,000 at 0 and 5 degree angles of attack. The study is targeted at creating a DNS database for verification of solvers and turbulence models that are utilized in wind-turbine modeling applications. The full blade comprises a circular cylinder base attached to a spanwise varying airfoil cross-section profile (without twist). An overlapping composite grid technique, which permits block structure in the mapped computational space, is utilized to perform these DNS computations. Different flow shedding regimes are observed along the blade length. Von Kármán shedding is observed in the cylindrical shaft region of the turbine blade. Along the airfoil cross-section of the blade, near-body shear layer breakdown is observed. A long tip vortex originates from the blade tip region and exits the computational domain without being perturbed. Laminar-to-turbulent flow transition is observed along the blade length. The amplitude of the turbulent fluctuations decreases along the blade length, and the flow remains laminar in the vicinity of the blade tip. The Strouhal number is found to decrease monotonically along the blade length. Average lift and drag coefficients are also reported for the cases investigated. Supported by funding under a KAUST OCRF-CRG grant.
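The Strouhal number reported along the blade is the dimensionless shedding frequency St = fL/U; a one-line sketch with illustrative values:

```python
def strouhal(shedding_freq_hz, length_scale_m, velocity_m_s):
    """Strouhal number St = f L / U for vortex shedding, with f the
    shedding frequency, L the characteristic length (e.g., cylinder
    diameter or chord), and U the freestream velocity."""
    return shedding_freq_hz * length_scale_m / velocity_m_s

# A circular cylinder at moderate Re sheds near St ~ 0.2; e.g.,
# 20 Hz shedding from a 1 cm cylinder in a 1 m/s stream:
st = strouhal(20.0, 0.01, 1.0)
```

The spanwise decrease of St reported above then corresponds to a falling shedding frequency (for fixed local length scale and freestream speed) toward the blade tip.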
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
TORMES-BEXUS 17 and 19: Precursor of the 6U CubeSat 3CAT-2
NASA Astrophysics Data System (ADS)
Carreno-Luengo, H.; Amezaga, A.; Bolet, A.; Vidal, D.; Jane, J.; Munoz, J. F.; Olive, R.; Camps, A.; Carola, J.; Catarino, N.; Hagenfeldt, M.; Palomo, P.; Cornara, S.
2015-09-01
3Cat-2 Assembly, Integration and Verification (AIV) activities of the Engineering Model (EM) and the Flight Model (FM) are being carried out at present. The Attitude Determination and Control System (ADCS) and Flight Software (FSW) validation campaigns will be performed at Universitat Politècnica de Catalunya (UPC) during the coming months. An analysis and verification of the 3Cat-2 key mission requirements has been performed. The main results are summarized in this work.
45 CFR 261.63 - When is a State's Work Verification Plan due?
Code of Federal Regulations, 2010 CFR
2010-10-01
... Plan for validating work activities reported in the TANF Data Report and, if applicable, the SSP-MOE... procedures for TANF or SSP-MOE work activities or its internal controls for ensuring a consistent measurement...
NASA Astrophysics Data System (ADS)
Taylor, John R.; Stolz, Christopher J.
1993-08-01
Laser system performance and reliability depend on the performance and reliability of the optical components that define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information for better understanding the long-term performance and reliability of the laser system.
Multiparty Quantum Blind Signature Scheme Based on Graph States
NASA Astrophysics Data System (ADS)
Jian-Wu, Liang; Xiao-Shu, Liu; Jin-Jing, Shi; Ying, Guo
2018-05-01
A multiparty quantum blind signature scheme is proposed based on the principle of graph states, in which the unitary operations on graph-state particles can be applied to generate the quantum blind signature and achieve verification. Unlike classical blind signatures, which rest on mathematical difficulty, the scheme guarantees not only anonymity but also unconditional security. The analysis shows that the length of the signature generated in our scheme does not grow as the number of signers increases, and it is easy to increase or decrease the number of signers.
A Study of the Access to the Scholarly Record from a Hospital Health Science Core Collection *
Williams, James F.; Pings, Vern M.
1973-01-01
This study is an effort to determine possible service performance levels in hospital libraries based on access to the scholarly record of medicine through selected lists of clinical journals and indexing and abstracting journals. The study was designed to test a methodology as well as to provide data for planning and management decisions for health science libraries. Findings and conclusions cover the value of a core collection of journals, length of journal files, performance of certain bibliographic instruments in citation verification, and the implications of study data for library planning and management. PMID:4744345
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toltz, A; Seuntjens, J; Hoesl, M
Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time signal to WEPL fit for a "scout" beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage.
A Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
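The time-signal-to-WEPL calibration described in the Methods amounts to fitting measured detector signal features against known water-equivalent path lengths; a toy sketch assuming a linear relationship and made-up numbers:

```python
import numpy as np

# Hypothetical calibration pairs: detector time-signal feature
# (arbitrary units) vs. known WEPL (cm) in a solid-water phantom.
signal = np.array([10.0, 20.0, 30.0, 40.0])
wepl_cm = np.array([5.0, 10.0, 15.0, 20.0])

# Linear fit mapping a measured signal to a WEPL estimate
slope, intercept = np.polyfit(signal, wepl_cm, 1)

def wepl_from_signal(s):
    """Estimate WEPL (cm) from a measured time-signal feature."""
    return slope * s + intercept

w = wepl_from_signal(25.0)   # 12.5 cm for this toy calibration
```

Comparing such a WEPL estimate against the treatment-planning value is what would trigger (or rule out) a range correction before delivery.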
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucconi, G; Department of Radiation Oncology, Massachusetts General Hospital, Boston, MA; Bentefour, E
Purpose: The clinical commissioning of a workflow for pre-treatment range verification/adjustment for the head treatment of pediatric medulloblastoma patients, including dose monitoring during treatment. Methods: An array of Si-diodes (DIODES Incorporated) is placed on the patient skin on the opposite side to the beam entrance. A "scout" SOBP beam, with a longer beam range to cover the diodes in its plateau, is delivered; the measured signal is analyzed and the extracted water equivalent path lengths (WEPL) are compared to the expected values, revealing if a range correction is needed. Diodes stay in place during treatment to measure dose. The workflow was tested in solid water and head phantoms and validated against independent WEPL measurements. Both measured WEPL and skin doses were compared to computed values from the TPS (XiO); a Markus chamber was used for reference dose measurements. Results: The WEPL accuracy of the method was verified by comparing it with the dose extinction method. It resulted, for both solid water and head phantom, in the sub-millimeter range, with a deviation less than 1% to the value extracted from the TPS. The accuracy of dose measurements in the fall-off part of the dose profile was validated against the Markus chamber. The entire range verification workflow was successfully tested for the mock-treatment of head phantom with the standard delivery of 90 cGy per field per fraction. The WEPL measurement revealed no need for range correction. The dose measurements agreed to better than 4% with the prescription dose. The robustness of the method and workflow, including detector array, hardware set and software functions, was successfully stress-tested with multiple repetitions. Conclusion: The performance of the in-vivo range verification system and related workflow meet the clinical requirements in terms of the needed WEPL accuracy for pretreatment range verification with acceptable dose to the patient.
Effect of Data Assimilation Parameters on The Optimized Surface CO2 Flux in Asia
NASA Astrophysics Data System (ADS)
Kim, Hyunjung; Kim, Hyun Mee; Kim, Jinwoong; Cho, Chun-Ho
2018-02-01
In this study, CarbonTracker, an inverse modeling system based on the ensemble Kalman filter, was used to evaluate the effects of data assimilation parameters (assimilation window length and ensemble size) on the estimation of surface CO2 fluxes in Asia. Several experiments with different parameters were conducted, and the results were verified using CO2 concentration observations. The assimilation window lengths tested were 3, 5, 7, and 10 weeks, and the ensemble sizes were 100, 150, and 300; a total of 12 experiments using combinations of these parameters were conducted. The experimental period was from January 2006 to December 2009. Differences between the optimized surface CO2 fluxes of the experiments were largest in the Eurasian Boreal (EB) area, followed by Eurasian Temperate (ET) and Tropical Asia (TA), and were larger in boreal summer than in boreal winter. Over Asia as a whole, the effect of ensemble size on the optimized biosphere flux is larger than that of the assimilation window length, but their relative importance varies by region: the optimized biosphere flux was more sensitive to the assimilation window length in EB, whereas it was sensitive to both the ensemble size and the assimilation window length in ET. The larger the ensemble size and the shorter the assimilation window length, the larger the uncertainty (i.e., spread of the ensemble) of the optimized surface CO2 fluxes. The 10-week assimilation window and 300-member ensemble were the optimal configuration for CarbonTracker in the Asian region, based on several verifications using CO2 concentration measurements.
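The ensemble spread used above as the uncertainty of the optimized fluxes is just the standard deviation across ensemble members; a minimal sketch (the member values are illustrative, not CarbonTracker output):

```python
import numpy as np

def ensemble_mean_and_spread(flux_members):
    """Mean and spread (sample standard deviation across members)
    of an ensemble of surface CO2 flux estimates."""
    members = np.asarray(flux_members)
    return members.mean(axis=0), members.std(axis=0, ddof=1)

# 3 ensemble members, each estimating the flux in 2 grid cells
members = np.array([[1.0, 2.0],
                    [3.0, 2.0],
                    [2.0, 2.0]])
mean, spread = ensemble_mean_and_spread(members)
# mean -> [2.0, 2.0]; spread -> [1.0, 0.0]
```

A larger spread (as found here for large ensembles with short windows) means the analysis is less tightly constrained by the assimilated observations.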
Is identity per se irrelevant? A contrarian view of self-verification effects.
Gregg, Aiden P
2009-01-01
Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, thereby hindering therapeutic recovery. Despite the widespread acceptance of SVT, I contend that the evidence for it is weak and circumstantial. In particular, I contend that most or all major findings cited in support of SVT can be more economically explained in terms of raison oblige theory (ROT). ROT posits that people with negative self-views solicit critical feedback, not because they want it, but because their self-view inclines them to regard it as probative, a necessary condition for considering it worth obtaining. Relevant findings are reviewed and reinterpreted with an emphasis on depression, and some new empirical data are reported. (c) 2008 Wiley-Liss, Inc.
Validation and verification of a virtual environment for training naval submarine officers
NASA Astrophysics Data System (ADS)
Zeltzer, David L.; Pioch, Nicholas J.
1996-04-01
A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation and the derivation of application requirements are briefly described, followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire
Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the "power of the people," harnessed via online games, communities of interest, and other platforms, to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.
Joumaa, Venus; Bertrand, Fanny; Liu, Shuyue; Poscente, Sophia; Herzog, Walter
2018-05-16
The aim of this study was to determine the role of titin in preventing the development of sarcomere length non-uniformities following activation and after active and passive stretch, by determining the effect of partial titin degradation on sarcomere length non-uniformities and force in passive and active myofibrils. Selective partial titin degradation was performed using a low dose of trypsin. Myofibrils were set at a sarcomere length of 2.4 µm and then passively stretched to sarcomere lengths of 3.4 µm and 4.4 µm. In the active condition, myofibrils were set at a sarcomere length of 2.8 µm, activated, and actively stretched by 1 µm/sarcomere. The extent of sarcomere length non-uniformity was calculated for each sarcomere as the absolute difference between its length and the mean sarcomere length of the myofibril. Our main finding is that partial titin degradation does not increase sarcomere length non-uniformities after passive stretch and activation compared to when titin is intact, but increases the extent of sarcomere length non-uniformities after active stretch. Furthermore, when titin was partially degraded, active and passive stresses were substantially reduced. These results suggest that titin plays a crucial role in actively stretched myofibrils and is likely involved in active and passive force production.
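The non-uniformity measure defined above, |SL_i - mean(SL)| per sarcomere, can be sketched directly:

```python
def sl_nonuniformity(sarcomere_lengths):
    """Per-sarcomere non-uniformity as defined in the study: the
    absolute difference between each sarcomere length (um) and the
    mean sarcomere length of the myofibril."""
    mean_sl = sum(sarcomere_lengths) / len(sarcomere_lengths)
    return [abs(sl - mean_sl) for sl in sarcomere_lengths]

# Three sarcomeres of a toy myofibril, mean length 2.4 um
devs = sl_nonuniformity([2.2, 2.4, 2.6])
# approximately [0.2, 0.0, 0.2] um
```

Averaging these deviations over the myofibril gives a single scalar with which intact and titin-degraded conditions can be compared.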
Description of a Website Resource for Turbulence Modeling Verification and Validation
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.
2010-01-01
The activities of the Turbulence Model Benchmarking Working Group, a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee, are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.
Long Rest Interval Promotes Durable Testosterone Responses in High-Intensity Bench Press.
Scudese, Estevão; Simão, Roberto; Senna, Gilmar; Vingren, Jakob L; Willardson, Jeffrey M; Baffi, Matheus; Miranda, Humberto
2016-05-01
The purpose of this study was to examine the influence of rest period duration (1 vs. 3 minutes between sets) on acute hormone responses to a high-intensity and equal-volume bench press workout. Ten resistance-trained men (25.2 ± 5.6 years; 78.2 ± 5.7 kg; 176.7 ± 5.4 cm; bench press relative strength: 1.3 ± 0.1 kg per kilogram of body mass) performed 2 bench press workouts separated by 1 week. Each workout consisted of 5 sets of 3 repetitions performed at 85% of 1 repetition maximum, with either 1- or 3-minute rest between sets. Circulating concentrations of total testosterone (TT), free testosterone (FT), cortisol (C), testosterone/cortisol ratio (TT/C), and growth hormone (GH) were measured at preworkout (PRE), and immediately (T0), 15 minutes (T15), and 30 minutes (T30) postworkout. Rating of perceived exertion was recorded before and after each set. For TT, both rest lengths produced elevations at all postexercise time points (T0, T15, and T30) compared with PRE, with the 1-minute condition showing decreases at T15 and T30 compared with T0. For FT, the 1- and 3-minute rest protocols produced elevations at distinct postexercise time points (T0 and T15 for 1 minute; T15 and T30 for 3 minutes) compared with PRE. The C values did not change at any postexercise time point for either rest length. The TT/C ratio was significantly elevated for both rest lengths at all postexercise time points compared with PRE. Finally, GH values did not change for either rest length. In conclusion, although both short and long rest periods enhanced acute testosterone values, the longer rest promoted a longer-lasting elevation of both TT and FT.
Radiation loss of planar surface plasmon polaritons transmission lines at microwave frequencies.
Xu, Zhixia; Li, Shunli; Yin, Xiaoxing; Zhao, Hongxin; Liu, Leilei
2017-07-21
Radiation loss of a typical spoof surface plasmon polaritons (SSPPs) transmission line (TL) is investigated in this paper. A 325 mm-long SSPPs TL is designed and fabricated. Simulated results show that radiation loss contributes more to transmission loss than dielectric loss and conductor loss from 2 GHz to 10 GHz. The radiation loss of the SSPPs TL can be divided into two parts: one caused by the input mode converter, and the other caused by the corrugated metallic strip. This paper explains the mechanisms of radiation loss from these different parts. A loaded SSPPs TL with a series of resistors is designed to absorb electromagnetic energy on the corrugated metallic strip, which allows the radiation loss of the input mode converter to be isolated. The concept of average radiation length (ARL) is proposed to evaluate radiation loss from SSPPs of finite length, and it is concluded that radiation loss is mainly caused by the corrugated structure of finite length in the low frequency band and by the input mode converter in the high frequency band. To suppress radiation loss, a mixed slow-wave TL based on the combination of coplanar waveguides (CPWs) and SSPPs is presented. The designed structure, sample fabrication and experimental verification are discussed.
NASA Technical Reports Server (NTRS)
Nicks, Oran W.; Korkan, Kenneth D.
1991-01-01
Two reports on student activities to determine the properties of a new laminar airfoil, delivered at a conference on soaring technology, are presented. The papers discuss a wind tunnel investigation and analysis of the SM701 airfoil and verification of the SM701 airfoil aerodynamic characteristics utilizing theoretical techniques. The papers are based on a combination of analytical design, hands-on model fabrication, wind tunnel calibration and testing, data acquisition and analysis, and comparison of test results and theory.
Cleanup Verification Package for the 116-K-2 Effluent Trench
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. M. Capron
2006-04-04
This cleanup verification package documents completion of remedial action for the 116-K-2 effluent trench, also referred to as the 116-K-2 mile-long trench and the 116-K-2 site. During its period of operation, the 116-K-2 site was used to dispose of cooling water effluent from the 105-KE and 105-KW Reactors by percolation into the soil. This site also received mixed liquid wastes from the 105-KW and 105-KE fuel storage basins, reactor floor drains, and miscellaneous decontamination activities.
Tethered satellite system dynamics and control review panel and related activities, phase 3
NASA Technical Reports Server (NTRS)
1991-01-01
Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.
Theory of polyelectrolytes in solvents.
Chitanvis, Shirish M
2003-12-01
Using a continuum description, we account for fluctuations in the ionic solvent surrounding a Gaussian, charged chain and derive an effective short-ranged potential between the charges on the chain. This potential is repulsive at short separations and attractive at longer distances. The chemical potential can be derived from this potential. When the chemical potential is positive, it leads to a meltlike state. For a vanishingly low concentration of segments, this state exhibits scaling behavior for long chains. The Flory exponent characterizing the radius of gyration for long chains is calculated to be approximately 0.63, close to the classical value obtained for second order phase transitions. For short chains, the radius of gyration varies linearly with N, the chain length, and is sensitive to the parameters in the interaction potential. The linear dependence on the chain length N indicates a stiff behavior. The chemical potential associated with this interaction changes sign, when the screening length in the ionic solvent exceeds a critical value. This leads to condensation when the chemical potential is negative. In this state, it is shown using the mean-field approximation that spherical and toroidal condensed shapes can be obtained. The thickness of the toroidal polyelectrolyte is studied as a function of the parameters of the model, such as the ionic screening length. The predictions of this theory should be amenable to experimental verification.
Active Member Design, Modeling, and Verification
NASA Technical Reports Server (NTRS)
Umland, Jeffrey W.; Webster, Mark; John, Bruce
1993-01-01
The design and development of active members intended for use in structural control applications is presented. The use of three different solid-state actuation materials, namely piezoelectric, electrostrictive, and magnetostrictive, is discussed. Test data are given in order to illustrate the actuator and device characteristics and performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, B.; Stern, W.; Colley, J.
International Atomic Energy Agency (IAEA) safeguards involves verification activities at a wide range of facilities in a variety of operational phases (e.g., under construction, start-up, operating, shutdown, closed-down, and decommissioned). Safeguards optimization for each different facility type and operational phase is essential for the effectiveness of safeguards implementation. The IAEA's current guidance regarding safeguards for the different facility types in the various lifecycle phases is provided in its Design Information Examination (DIE) and Verification (DIV) procedure. Greater efficiency in safeguarding facilities that are shut down or closed down, including those being decommissioned, could allow the IAEA to use a greater portion of its effort to conduct other verification activities. Consequently, the National Nuclear Security Administration's Office of International Nuclear Safeguards sponsored this study to evaluate whether there is an opportunity to optimize safeguards approaches for facilities that are shut down or closed down. The purpose of this paper is to examine existing safeguards approaches for shutdown and closed-down facilities, including facilities being decommissioned, and to identify whether they may be optimized.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When must I resubmit Platform Verification...
The 2014 Sandia Verification and Validation Challenge: Problem statement
Hu, Kenneth; Orient, George
2016-01-18
This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.
NEXT Thruster Component Verification Testing
NASA Technical Reports Server (NTRS)
Pinero, Luis R.; Sovey, James S.
2007-01-01
Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.
Metzger, Nicole L; Chesson, Melissa M; Momary, Kathryn M
2015-09-25
Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient's medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist's role in order verification and medication reconciliation, as well as improve clinical decision-making.
[Validation and verification of microbiology methods].
Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción
2015-01-01
Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool to provide these guarantees. The activities related to verification and validation of analytical methods have become very important, as there is continuous development and updating of techniques, increasingly complex analytical equipment, and an interest among professionals in ensuring the quality of processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in Microbiology. The importance of promoting the use of reference strains as controls in Microbiology and the use of standard controls is stressed, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
STELLAR: fast and exact local alignments
2011-01-01
Background Large-scale comparison of genomic sequences requires reliable tools for the search of local alignments. Practical local aligners are in general fast, but heuristic, and hence sometimes miss significant matches. Results We present here the local pairwise aligner STELLAR that has full sensitivity for ε-alignments, i.e. guarantees to report all local alignments of a given minimal length and maximal error rate. The aligner is composed of two steps, filtering and verification. We apply the SWIFT algorithm for lossless filtering, and have developed a new verification strategy that we prove to be exact. Our results on simulated and real genomic data confirm and quantify the conjecture that heuristic tools like BLAST or BLAT miss a large percentage of significant local alignments. Conclusions STELLAR is very practical and fast on very long sequences which makes it a suitable new tool for finding local alignments between genomic sequences under the edit distance model. Binaries are freely available for Linux, Windows, and Mac OS X at http://www.seqan.de/projects/stellar. The source code is freely distributed with the SeqAn C++ library version 1.3 and later at http://www.seqan.de. PMID:22151882
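STELLAR's full-sensitivity guarantee for ε-alignments, as stated above, amounts to a simple predicate: report every local alignment of at least a given minimal length whose error rate does not exceed ε. The helper below is a sketch of that criterion only; the function name and parameters are invented for illustration and are not STELLAR's actual interface.

```python
def is_epsilon_match(errors, length, eps, min_length):
    """Sketch of the epsilon-alignment criterion described in the STELLAR
    abstract (hypothetical helper, not the tool's API): an alignment
    qualifies if it is at least min_length long and its error rate
    (errors / length) is at most eps."""
    return length >= min_length and errors <= eps * length

print(is_epsilon_match(errors=4, length=100, eps=0.05, min_length=50))  # True
print(is_epsilon_match(errors=8, length=100, eps=0.05, min_length=50))  # False
```

A lossless filter such as SWIFT may only discard candidate regions that provably contain no alignment satisfying this predicate, which is why the subsequent exact verification step preserves full sensitivity.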
Measurement of radiation damage of water-based liquid scintillator and liquid scintillator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bignell, L. J.; Diwan, M. V.; Hans, S.
2015-10-19
Liquid scintillating phantoms have been proposed as a means to perform real-time 3D dosimetry for proton therapy treatment plan verification. We have studied what effect radiation damage to the scintillator will have upon this application. We have performed measurements of the degradation of the light yield and optical attenuation length of liquid scintillator and water-based liquid scintillator after irradiation by 201 MeV proton beams that deposited doses of approximately 52 Gy, 300 Gy, and 800 Gy in the scintillator. Liquid scintillator and water-based liquid scintillator (composed of 5% scintillating phase) exhibit light yield reductions of 1.74 ± 0.55% and 1.31 ± 0.59% after ≈ 800 Gy of proton dose, respectively. Some increased optical attenuation was observed in the irradiated samples; however, the measured reduction in light yield is also due to damage to the scintillation light production itself. Based on our results and conservative estimates of the expected dose in a clinical context, a scintillating phantom used for proton therapy treatment plan verification would exhibit a systematic light yield reduction of approximately 0.1% after a year of operation.
Investigation of the trajectories and length of combustible gas jet flames in a sweeping air stream
NASA Astrophysics Data System (ADS)
Polezhaev, Yu. V.; Mostinskii, I. L.; Lamden, D. I.; Stonik, O. G.
2011-05-01
The trajectories of round gas jets and jet flames introduced into a sweeping air stream are studied. The influence of various initial conditions and of the physical properties of gases on the trajectory is considered. Experimental verification of the available approximation relations for the trajectories of flames in a wide range of the values of the blowing ratio has been carried out. It is shown that the newly obtained experimental approximation of the trajectory shape differs from the existing ones by about 20%. At small values of the blowing ratio (smaller than ~4.5) the flame trajectories cease to depend on it.
Development of LOX/LH2 tank system for H-I launch vehicle
NASA Astrophysics Data System (ADS)
Nozaki, Y.; Takamatsu, H.; Morino, Y.; Imagawa, K.
Design features of the second stage of the prospective Japanese H-I launch vehicle are described. The stage will use a LOX/LH2-fueled engine. The propellants will be contained in a 2219 aluminum alloy tank insulated with sprayed polyurethane foam. The total stage length will be 5.5 m, the tank volume 6.8 m³, and the tank pressures 3.2 kg/sq cm (LOX) and 2.5 kg/sq cm (LH2). The diameter is 2.5 m and the total propellant mass is 8.7 tons. Design verification tests, consisting of burning tests and thermal evaluation, are scheduled for the near future.
Experimental verification of low sonic boom configuration
NASA Technical Reports Server (NTRS)
Ferri, A.; Wang, H. H.; Sorensen, H.
1972-01-01
A configuration designed to produce a near-field signature has been tested at M = 2.71 and the results are analyzed, taking into account three-dimensional and second-order effects. The configuration has an equivalent total area distribution that corresponds to an airplane flying at 60,000 ft, having a weight of 460,000 lb and a length of 300 ft. A maximum overpressure of 0.95 lb/square foot was obtained experimentally. The experimental results agree well with the analysis. The investigation indicates that the three-dimensional effects are very important when measurements in wind tunnels are taken at small distances from the airplane.
Apparatus and method for classifying fuel pellets for nuclear reactor
Wilks, Robert S.; Sternheim, Eliezer; Breakey, Gerald A.; Sturges, Jr., Robert H.; Taleff, Alexander; Castner, Raymond P.
1984-01-01
Control for the operation of a mechanical handling and gauging system for nuclear fuel pellets. The pellets are inspected for diameters, lengths, surface flaws and weights in successive stations. The control includes a computer for commanding the operation of the system and its electronics and for storing and processing the complex data derived at the required high rate. In measuring the diameter, the computer enables the measurement of a calibration pellet, stores that calibration data, and computes and stores diameter-correction factors and their addresses along a pellet. To each diameter measurement a correction factor is applied at the appropriate address. The computer commands verification that all critical parts of the system and control are set for inspection and that each pellet is positioned for inspection. During each cycle of inspection, the measurement operation proceeds normally irrespective of whether or not a pellet is present in each station. If a pellet is not positioned in a station, a measurement is recorded, but the recorded measurement indicates maloperation. In measuring diameter and length, a light pattern including successive shadows of slices, transverse for diameter or longitudinal for length, is projected on a photodiode array. The light pattern is scanned electronically by a train of pulses. The pulses are counted during the scan of the lighted diodes. For evaluation of diameter, the maximum diameter count and the number of slices for which the diameter exceeds a predetermined minimum are determined. For acceptance, the maximum must be less than a maximum level and the minimum count must exceed a set number. For evaluation of length, the maximum length is determined. For acceptance, the length must be within maximum and minimum limits.
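The acceptance rules quoted in this abstract (maximum slice diameter below a ceiling, enough slices above a diameter floor, and length within limits) can be paraphrased as a small predicate. All names and thresholds below are illustrative, not taken from the patent.

```python
def accept_pellet(slice_diameters, length,
                  d_max, d_min, min_slices, len_min, len_max):
    """Acceptance logic paraphrased from the patent abstract (names and
    units are hypothetical): the largest slice diameter must stay below
    d_max, at least min_slices slices must exceed d_min, and the measured
    length must fall within [len_min, len_max]."""
    diameter_ok = (max(slice_diameters) < d_max
                   and sum(d > d_min for d in slice_diameters) >= min_slices)
    length_ok = len_min <= length <= len_max
    return diameter_ok and length_ok

# Hypothetical pellet: three slice diameters (mm) and a length (mm)
print(accept_pellet([9.1, 9.2, 9.15], 12.0,
                    d_max=9.5, d_min=9.0, min_slices=3,
                    len_min=11.5, len_max=12.5))  # True
```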
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, Larry K.; Kassianov, Evgueni I.; Long, Charles N.
2006-03-30
In previous work, Berg and Stull (2005) developed a new parameterization for Fair-Weather Cumuli (FWC). Preliminary testing of the new scheme used data collected during a field experiment conducted during the summer of 1996. This campaign included a few research flights conducted over three locations within the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF) Southern Great Plains (SGP) site. A more comprehensive verification of the new scheme requires a detailed climatology of FWC. Several cloud climatologies have been completed for the ACRF SGP, but these efforts have focused on either broad categories of clouds grouped by height and season (e.g., Lazarus et al. 1999) or height and time of day (e.g., Dong et al. 2005). In these two examples, the low clouds were not separated by the type of cloud, either stratiform or cumuliform, nor were the horizontal chord length (the length of the cloud slice that passed directly overhead) or cloud aspect ratio (defined as the ratio of the cloud thickness to the cloud chord length) reported. Lane et al. (2002) presented distributions of cloud chord length, but only for one year. The work presented here addresses these shortcomings by looking explicitly at cases with FWC over five summers. Specifically, we will address the following questions:
• Does the cloud fraction (CF), cloud-base height (CBH), and cloud-top height (CTH) of FWC change with the time of day or the year?
• What is the distribution of FWC chord lengths?
• Is there a relationship between the cloud chord length and the cloud thickness?
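The cloud aspect ratio defined in this abstract (cloud thickness divided by horizontal chord length) is a one-line computation; the sample heights and chord length below are made up for illustration.

```python
def aspect_ratio(cloud_top, cloud_base, chord_length):
    """Cloud aspect ratio as defined in the abstract: cloud thickness
    (top minus base) divided by the horizontal chord length.
    All inputs in the same length unit (e.g., meters)."""
    return (cloud_top - cloud_base) / chord_length

# Hypothetical fair-weather cumulus: 300 m thick, 1200 m chord overhead
print(aspect_ratio(cloud_top=1800.0, cloud_base=1500.0, chord_length=1200.0))  # 0.25
```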
Building the Qualification File of EGNOS with DOORS
NASA Astrophysics Data System (ADS)
Fabre, J.
2008-08-01
EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is getting to its final deployment and being initially operated towards qualification and certification to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS System design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at QR is a consolidated, consistent and complete Qualification file. The information included shall give confidence to the QR reviewers that the performed qualification activities are completed. Therefore, an important issue for the project team is to focus on synthetic and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, reference to details shall be available, and the reviewer shall have the possibility to link automatically to the documents including this detailed information. In that frame, Thales Alenia Space has implemented a strong support in terms of methodology and tool, to provide to System Engineering and Verification teams a single reference technical database, in which all team members consult the applicable requirements, compliance, justification, design data and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.
Numerical Weather Predictions Evaluation Using Spatial Verification Methods
NASA Astrophysics Data System (ADS)
Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.
2014-12-01
During recent years, high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate those high-resolution simulations. To address those limitations, new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver. 3.5.1) to reproduce selected days with high convective activity during the year 2010 using those feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03), are used at horizontal grid spacings of 15 km, 5 km and 1 km respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of those simulations are evaluated against data obtained using a C-band (5 cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured, but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).
Lucas, Todd; Pierce, Jennifer; Lumley, Mark A; Granger, Douglas A; Lin, Jue; Epel, Elissa S
2017-12-01
This experiment demonstrates that chromosomal telomere length (TL) moderates response to injustice among African Americans. Based on worldview verification theory - an emerging psychosocial framework for understanding stress - we predicted that acute stress responses would be most pronounced when individual-level expectancies for justice were discordant with justice experiences. Healthy African Americans (N=118; 30% male; M age=31.63 years) provided dried blood spot samples that were assayed for TL, and completed a social-evaluative stressor task during which high versus low levels of distributive (outcome) and procedural (decision process) justice were simultaneously manipulated. African Americans with longer telomeres appeared more resilient (in emotional and neuroendocrine response, i.e., higher DHEAs:cortisol ratio) to receiving an unfair outcome when a fair decision process was used, whereas African Americans with shorter telomeres appeared more resilient when an unfair decision process was used. TL may indicate personal histories of adversity and associated stress-related expectancies that influence responses to injustice. Copyright © 2017 Elsevier Ltd. All rights reserved.
Optimization of injection molding process parameters for a plastic cell phone housing component
NASA Astrophysics Data System (ADS)
Rajalingam, Sokkalingam; Vasant, Pandian; Khe, Cheng Seong; Merican, Zulkifli; Oo, Zeya
2016-11-01
To produce thin-walled plastic items, injection molding is one of the most widely used processes. However, setting optimal process parameters is difficult, as poor settings can produce defects such as shrinkage in the molded items. This study aims to determine optimal injection molding process parameters that reduce the shrinkage defect in a plastic cell phone cover. The currently used machine settings produced shrinkage and mis-specified lengths, with dimensions below the lower limit. Thus, additional experiments were needed to identify optimal process parameters that keep length and width close to their target values with minimal variation. The mold temperature, injection pressure and screw rotation speed are used as process parameters in this research. Response Surface Methodology (RSM) is applied to find the optimal molding process parameters. The major factors influencing the responses were identified using the analysis of variance (ANOVA) technique. Verification runs showed that the shrinkage defect can be minimized with the optimal settings found by RSM.
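The RSM workflow this abstract describes (fit a second-order response surface to experimental runs, then locate its optimum) can be sketched for a single factor; the temperatures and shrinkage values below are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Minimal single-factor RSM-style sketch with made-up data:
# fit a second-order model, shrinkage ~ b0 + b1*T + b2*T^2, for mold
# temperature T, then take the vertex of the parabola as the optimum.
T = np.array([40.0, 50.0, 60.0, 70.0, 80.0])       # mold temperature (C), hypothetical
shrinkage = np.array([0.8, 0.5, 0.4, 0.5, 0.9])    # shrinkage (%), hypothetical

b2, b1, b0 = np.polyfit(T, shrinkage, 2)           # least-squares quadratic fit
T_opt = -b1 / (2.0 * b2)                           # vertex: temperature minimizing shrinkage
print(round(T_opt, 1))                             # near the center of the tested range
```

A full RSM study would fit a second-order model in all three factors (mold temperature, injection pressure, screw speed) from a designed experiment and confirm the optimum with verification runs, as the abstract reports.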
Development of the 15 meter diameter hoop column antenna
NASA Technical Reports Server (NTRS)
1986-01-01
The building of a deployable 15-meter engineering model of the 100-meter antenna based on the point design of an earlier task of this contract, complete with an RF-capable surface, is described. The 15-meter diameter was selected so that the model could be tested in existing manufacturing, near-field RF, thermal vacuum, and structural dynamics facilities. The antenna was designed with four offset paraboloidal reflector surfaces with a focal length of 366.85 in and a primary surface accuracy goal of 0.069 in rms. Surface adjustment capability was provided by manually resetting the length of 96 surface control cords which emanated from the lower column extremity. A detailed description of the 15-meter Hoop/Column Antenna, major subassemblies, and a history of its fabrication, assembly, deployment testing, and verification measurements are given. The deviation for one aperture surface (except the outboard extremity) was measured after adjustments in follow-on tests at the Martin Marietta Near-Field Facility to be 0.061 in; thus the primary surface goal was achieved.
Definition of ground test for verification of large space structure control
NASA Technical Reports Server (NTRS)
Glaese, John R.
1994-01-01
Under this contract, the Large Space Structure Ground Test Verification (LSSGTV) Facility at the George C. Marshall Space Flight Center (MSFC) was developed. Planning in coordination with NASA was finalized and implemented. The contract was modified and extended with several increments of funding to procure additional hardware and to continue support for the LSSGTV facility. Additional tasks were defined for the performance of studies in the dynamics, control and simulation of tethered satellites. When the LSSGTV facility development task was completed, support and enhancement activities were funded through a new competitive contract won by LCD. All work related to LSSGTV performed under NAS8-35835 has been completed and documented. No further discussion of these activities will appear in this report. This report summarizes the tether dynamics and control studies performed.
Characterization of infrasound from lightning
NASA Astrophysics Data System (ADS)
Assink, J. D.; Evers, L. G.; Holleman, I.; Paulssen, H.
2008-08-01
During thunderstorm activity in the Netherlands, electromagnetic and infrasonic signals are emitted by the processes of lightning and thunder. It is shown that correlating infrasound detections with results from an electromagnetic lightning detection network is successful up to distances of 50 km from the infrasound array. Infrasound recordings clearly show blast-wave characteristics that can be related to cloud-to-ground (CG) discharges, with a dominant frequency between 1 and 5 Hz. Amplitude measurements of CG discharges can partly be explained by the beam pattern of a line source with a dominant frequency of 3.9 Hz, up to a distance of 20 km. The ability to measure lightning activity with infrasound arrays has both positive and negative implications for CTBT verification purposes. As a scientific application, lightning studies can benefit from the worldwide infrasound verification system.
Stochastic Approaches Within a High Resolution Rapid Refresh Ensemble
NASA Astrophysics Data System (ADS)
Jankov, I.
2017-12-01
It is well known that global and regional numerical weather prediction (NWP) ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviating this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system is the use of stochastic physics. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), and Stochastic Perturbation of Physics Tendencies (SPPT). The focus of this study is to assess model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) using a variety of stochastic approaches. A single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model was utilized, and ensemble members were produced by employing stochastic methods. Parameter perturbations (using SPP) for select fields were employed in the Rapid Update Cycle (RUC) land surface model (LSM) and Mellor-Yamada-Nakanishi-Niino (MYNN) Planetary Boundary Layer (PBL) schemes. Within MYNN, SPP was applied to sub-grid cloud fraction, mixing length, roughness length, mass fluxes, and Prandtl number. In the RUC LSM, SPP was applied to hydraulic conductivity, and perturbing soil moisture at the initial time was also tested. Iterative testing was first conducted to assess the initial performance of several configuration settings (e.g., a variety of spatial and temporal de-correlation lengths). Upon selection of the most promising candidate configurations using SPP, a 10-day time period was run and more robust statistics were gathered. SKEB and SPPT were included in additional retrospective tests to assess the impact of using all three stochastic approaches to address model uncertainty. Results from the stochastic perturbation testing were compared to a baseline multi-physics control ensemble. Probabilistic forecast performance was evaluated with the Model Evaluation Tools (MET) verification package.
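The temporal de-correlation lengths tested for SPP are commonly realized with red-noise (first-order autoregressive) random patterns. A one-dimensional sketch of such a perturbation time series follows; the function and its arguments are illustrative assumptions, not the HRRR implementation:

```python
import math
import random

def ar1_perturbation(n_steps, dt, tau, sigma, seed=0):
    """Red-noise (AR(1)) perturbation series with temporal de-correlation
    scale tau: lag-one autocorrelation exp(-dt/tau), stationary std sigma."""
    rng = random.Random(seed)
    phi = math.exp(-dt / tau)                  # lag-one autocorrelation
    amp = sigma * math.sqrt(1.0 - phi * phi)   # keeps stationary std = sigma
    x, series = 0.0, []
    for _ in range(n_steps):
        x = phi * x + amp * rng.gauss(0.0, 1.0)
        series.append(x)
    return series
```

A perturbed parameter would then be formed multiplicatively, e.g. `mixing_length * math.exp(x_t)`, clamped to physical bounds.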
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANN, F.M.
Data package supporting the 2001 Immobilized Low-Activity Waste Performance Analysis. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigation are provided. Verification and benchmarking packages for selected software codes are provided.
33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).
Code of Federal Regulations, 2010 CFR
2010-07-01
... Water Management District to conduct assessment, evaluation, and planning and integration activities... ensuring that the goals and purposes of the Plan are achieved. RECOVER has been organized into a Leadership Group that provides management and coordination for the activities of RECOVER and teams that accomplish...
Trajectory Based Behavior Analysis for User Verification
NASA Astrophysics Data System (ADS)
Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah
Many of our activities on computers require a verification step for authorized access. The goal of verification is to distinguish the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by unauthorized users or even automated programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learned tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
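As a rough sketch of this kind of trajectory scoring (a deliberate simplification: i.i.d. Gaussian steps with diagonal covariance stand in for the paper's Markov chain with Gaussian transitions, and the manifold-learned tuning is omitted), a likelihood-based dissimilarity between two trajectories might look like:

```python
import math

def fit_step_model(traj):
    # traj: list of (x, y) points. Model each step displacement as an
    # independent Gaussian per coordinate (a simplification of a
    # Gaussian-transition Markov chain).
    steps = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(traj, traj[1:])]
    n = len(steps)
    mx = sum(s[0] for s in steps) / n
    my = sum(s[1] for s in steps) / n
    vx = sum((s[0] - mx) ** 2 for s in steps) / n + 1e-9  # regularized variance
    vy = sum((s[1] - my) ** 2 for s in steps) / n + 1e-9
    return (mx, my, vx, vy)

def avg_loglik(traj, model):
    # Average per-step Gaussian log-likelihood of a trajectory under a model.
    mx, my, vx, vy = model
    steps = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(traj, traj[1:])]
    total = 0.0
    for sx, sy in steps:
        total += (-0.5 * math.log(2 * math.pi * vx) - (sx - mx) ** 2 / (2 * vx)
                  - 0.5 * math.log(2 * math.pi * vy) - (sy - my) ** 2 / (2 * vy))
    return total / len(steps)

def dissimilarity(a, b):
    # Symmetric measure: how much each trajectory's likelihood drops when
    # scored under the other trajectory's model.
    ma, mb = fit_step_model(a), fit_step_model(b)
    return ((avg_loglik(a, ma) - avg_loglik(a, mb)) +
            (avg_loglik(b, mb) - avg_loglik(b, ma)))
```

Trajectories from the same behavioral model score a small dissimilarity; intruder or bot trajectories with different step statistics score a large one, which any downstream classifier or clustering method can then exploit.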
NASA Technical Reports Server (NTRS)
Saito, Jim
1987-01-01
The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included to augment the information in NASA TM-84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 print file.
Generic Verification Protocol for Verification of Online Turbidimeters
This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...
Verification Games: Crowd-Sourced Formal Verification
2016-03-01
Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016, final technical report (contract FA8750..., dates covered June 2012 – September 2015). Abstract: Over the more than three years of the project Verification Games: Crowd-sourced
NASA Astrophysics Data System (ADS)
Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.
2016-12-01
Accurate representation of ice sheets and glaciers is essential for robust predictions of Arctic climate within Earth system models. Verification and validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process, because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to integrate easily into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible Python package for V&V, with components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, surface mass balance, etc. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework that lets ice-sheet modelers easily extend the model V&V analyses. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and the Earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website, allowing evaluations to be easily shared, published, and analysed throughout the Arctic and Earth system communities.
Earth Science Activities: A Guide to Effective Elementary School Science Teaching.
ERIC Educational Resources Information Center
Kanis, Ira B.; Yasso, Warren E.
The primary emphasis of this book is on new or revised earth science activities that promote concept development rather than mere verification of concepts learned by passive means. Chapter 2 describes philosophies, strategies, methods, and techniques to guide preservice and inservice teachers, school building administrators, and curriculum…
Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Brantley
2016-01-01
A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent(R) and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort. His advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb, and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.
Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis
NASA Technical Reports Server (NTRS)
Montgomery, Todd L.
1995-01-01
This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed, so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense; RMP discounts this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently, which has allowed the verification to maintain a high fidelity among the design, implementation, and verification models. The restrictions of implementation have influenced the design earlier than in normal sequential approaches, and the protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.
National Center for Nuclear Security: The Nuclear Forensics Project (F2012)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klingensmith, A. L.
These presentation visuals introduce the National Center for Nuclear Security. Its chartered mission is to enhance the Nation's verification and detection capabilities in support of nuclear arms control and nonproliferation through R&D activities at the NNSS. It has three focus areas: Treaty Verification Technologies, Nonproliferation Technologies, and Technical Nuclear Forensics. The objectives of nuclear forensics are to reduce uncertainty in the nuclear forensics process and improve the scientific defensibility of nuclear forensics conclusions when applied to near-surface nuclear detonations. Research is in four key areas: nuclear physics, debris collection and analysis, prompt diagnostics, and radiochemistry.
Apollo experience report: Communications system flight evaluation and verification
NASA Technical Reports Server (NTRS)
Travis, D.; Royston, C. L., Jr.
1972-01-01
Flight tests of the synergetic operation of the spacecraft and Earth-based communications equipment were accomplished during Apollo missions AS-202 through Apollo 12. The primary goals of these tests were to verify that the communications system would adequately support lunar landing missions and to establish the in-flight communications system performance characteristics. To attain these goals, a communications system flight verification and evaluation team was established. The concept of the team operations, the evolution of the evaluation processes, synopses of the team activities associated with each mission, and major conclusions and recommendations resulting from the performance evaluation are presented.
A training paradigm to enhance motor recovery in contused rats: effects of staircase training.
Singh, Anita; Murray, Marion; Houle, John D
2011-01-01
Ambulating on stairs is an important aspect of daily activities for many individuals with incomplete spinal cord injury (SCI), yet little is known about the effect of training for this specific task. The goal of this study was to determine whether staircase ascent training enhances motor recovery in animals with a contusion injury. Rats received a midthoracic contusion lesion of moderate severity and were randomly divided into two groups, one receiving staircase ascent training for up to 8 weeks and the other receiving no training. To assess the direct effect of training, a task-specific staircase climbing test was performed. An open field test (BBB) and gait analysis (CatWalk) assessed overground recovery, and a grid test was used to assess improvement in sensorimotor tasks. Changes in the mass of forelimb and hindlimb muscles were also measured, and the extent of spared white matter was determined for lesion verification and anatomical correlations. Staircase training improved task-specific performance of ascent. Gait parameters, including base of support, stride length, regularity index (RI), and step sequence, also improved. Overground locomotion and the grid test both showed a trend of improved performance. Finally, hindlimb muscle mass was maintained with training. Staircase ascent training after incomplete SCI has beneficial effects on task-specific as well as nonspecific motor and sensorimotor activities.
On the feasibility of automatic detection of range deviations from in-beam PET data
NASA Astrophysics Data System (ADS)
Helmbrecht, S.; Santiago, A.; Enghardt, W.; Kuess, P.; Fiedler, F.
2012-03-01
In-beam PET is a clinically proven method for monitoring ion beam cancer treatment. The objective is predominantly the verification of the range of the primary particles. Because different processes lead to dose and activity, evaluation is done by comparing measured data to simulated data. Up to now, the comparison has been performed by well-trained observers (clinicians, physicists). This process is very time consuming and has low reproducibility, so an automatic method is desirable. A one-dimensional algorithm for range comparison has been enhanced and extended to three dimensions. System-inherent uncertainties are handled by means of a statistical approach. To test the method, a set of data was prepared. Distributions of β+-activity calculated from treatment plans were compared to measurements performed in the framework of the German Heavy Ion Tumor Therapy Project at the GSI Helmholtz Centre for Heavy Ion Research, Darmstadt, Germany. Artificial range deviations in the simulations served as test objects for the algorithm. Range modifications of different depths (4, 6 and 10 mm water-equivalent path length) can be detected. Even though the sensitivity and specificity of a visual evaluation are higher, the method is feasible as the basis for the selection of patients from the data pool for retrospective evaluation of treatments and treatment plans and correlation with follow-up data. Furthermore, it can be used for the development of an assistance tool for a clinical application.
The USEPA has been very active in membrane research. The following areas are currently being investigated: in-house fouling research, Information Collection Rule (ICR) treatment studies, inorganic scaling modeling, Environmental Technology Verification (ETV) program implementati...
78 FR 57162 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-17
... that must be capable of verification by qualified auditors. Besides determining program reimbursement...). Insurers, underwriters, third party administrators, and self-insured/self-administered employers use the...
Jinno, Shunta; Tachibana, Hidenobu; Moriya, Shunsuke; Mizuno, Norifumi; Takahashi, Ryo; Kamima, Tatsuya; Ishibashi, Satoru; Sato, Masanori
2018-05-21
In inhomogeneous media, there is often a large systematic difference in the dose between the conventional Clarkson algorithm (C-Clarkson) for independent calculation verification and the superposition-based algorithms of treatment planning systems (TPSs). These treatment site-dependent differences increase the complexity of the radiotherapy planning secondary check. We developed a simple and effective method of heterogeneity correction integrated with the Clarkson algorithm (L-Clarkson) to account for the effects of heterogeneity in the lateral dimension, and performed a multi-institutional study to evaluate the effectiveness of the method. In the method, a 2D image reconstructed from computed tomography (CT) images is divided according to lines extending from the reference point to the edge of the multileaf collimator (MLC) or jaw collimator for each pie sector, and the radiological path length (RPL) of each line is calculated on the 2D image to obtain a tissue maximum ratio and phantom scatter factor, allowing the dose to be calculated. A total of 261 plans (1237 beams) for conventional breast and lung treatments and lung stereotactic body radiotherapy were collected from four institutions. Disagreements in dose between the on-site TPSs and a verification program using the C-Clarkson and L-Clarkson algorithms were compared. Systematic differences with the L-Clarkson method were within 1% for all sites, while the C-Clarkson method resulted in systematic differences of 1-5%. The L-Clarkson method showed smaller variations. This heterogeneity correction integrated with the Clarkson algorithm would provide a simple evaluation within the range of -5% to +5% for a radiotherapy plan secondary check.
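The core quantity in the L-Clarkson correction, the radiological path length (RPL) along each line from the reference point toward the collimator edge, is a density-weighted line integral over the CT-derived 2D image. A minimal sketch, assuming a 2-D relative-density grid with 1 cm voxels and simple midpoint sampling (both illustrative assumptions, not the authors' implementation):

```python
import math

def radiological_path_length(grid, start, end, n_steps=100):
    """Accumulate relative electron density along a straight ray.

    grid: 2-D list of relative electron densities (water = 1.0) with
    1 cm square voxels, indexed grid[row][col]; start and end are (x, y)
    coordinates in cm. The ray is sampled at n_steps midpoints.
    """
    (x0, y0), (x1, y1) = start, end
    length = math.hypot(x1 - x0, y1 - y0)
    step = length / n_steps
    rpl = 0.0
    for i in range(n_steps):
        t = (i + 0.5) / n_steps
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0)
        rpl += grid[int(y)][int(x)] * step  # density times geometric step
    return rpl
```

On a uniform water grid the RPL equals the geometric length; replacing part of the path with lung-like density 0.5 shortens it proportionally, which is what shifts the tissue maximum ratio and phantom scatter factor lookups in the dose calculation.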
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toltz, Allison; Hoesl, Michaela; Schuemann, Jan
Purpose: A method to refine the implementation of an in vivo, adaptive proton therapy range verification methodology was investigated. Simulation experiments and in-phantom measurements were compared to validate the calibration procedure of a time-resolved diode dosimetry technique. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification by correlating properties of the detector signal to the water-equivalent path length (WEPL). The implementation of this system requires a set of calibration measurements to establish a beam-specific diode response to WEPL fit for the selected 'scout' beam in a solid water phantom. This process is both tedious, as it necessitates a separate set of measurements for every 'scout' beam that may be appropriate to the clinical case, and inconvenient, due to limited access to the clinical beamline. The diode response to WEPL relationship for a given 'scout' beam may instead be determined within a simulation environment, facilitating the applicability of this dosimetry technique. Measurements for three 'scout' beams were compared against detector response simulated with Monte Carlo methods using the Tool for Particle Simulation (TOPAS). Results: Detector response in water-equivalent plastic was successfully validated against simulation for spread-out Bragg peaks of range 10 cm, 15 cm, and 21 cm (168 MeV, 177 MeV, and 210 MeV) with an adjusted R^2 of 0.998. Conclusion: Feasibility has been shown for performing the calibration of detector response for a given 'scout' beam through simulation for the time-resolved diode dosimetry technique.
SU-E-J-115: Graticule for Verification of Treatment Position in Neutron Therapy.
Halford, R; Snyder, M
2012-06-01
Until recently, treatment verification for patients undergoing fast neutron therapy at our facility was accomplished through a combination of neutron beam portal films aligned with a graticule mounted on an orthogonal x-ray tube. To eliminate uncertainty with respect to the relative positions of the x-ray graticule and the therapy beam, we have developed a graticule which is placed in the neutron beam itself. For a graticule to be visible on the portal film, the attenuation of the neutron beam by the graticule landmarks must be significantly greater than that of the material in which the landmarks are mounted. Various materials, thicknesses, and mounting points were tried to gain the largest contrast between the graticule landmarks and the mounting material. The final design involved 2-inch steel pins of 0.125-inch diameter captured between two parallel plates of 0.25-inch-thick clear acrylic plastic. The distance between the two acrylic plates was 1.625 inches, held together at the perimeter with acrylic sidewall spacers. This allowed the majority of the length of the steel pins to be surrounded by air. The pins were set 1 cm apart and mounted at angles parallel to the divergence of the beam, dependent on their position within the array. The entire steel pin and acrylic plate assembly was mounted on an acrylic accessory tray to allow for graticule alignment. Despite the inherent difficulties in attenuating fast neutrons, our simple graticule design produces the required difference in attenuation between the array of landmarks and the mounting material. The graticule successfully provides an in-beam frame of reference for patient portal verification. © 2012 American Association of Physicists in Medicine.
Shafi, Shahid; Barnes, Sunni; Ahn, Chul; Hemilla, Mark R; Cryer, H Gill; Nathens, Avery; Neal, Melanie; Fildes, John
2016-10-01
The Trauma Quality Improvement Project of the American College of Surgeons (ACS) has demonstrated variations in trauma center outcomes despite similar verification status. The purpose of this study was to identify structural characteristics of trauma centers that affect patient outcomes. Trauma registry data on 361,187 patients treated at 222 ACS-verified Level I and Level II trauma centers were obtained from the National Trauma Data Bank of the ACS. These data were used to estimate each center's observed-to-expected (O-E) mortality ratio with 95% confidence intervals using multivariate logistic regression analysis. De-identified data on the structural characteristics of these trauma centers were obtained from the ACS Verification Review Committee. Centers in the lowest quartile of mortality based on the O-E ratio (n = 56) were compared to the rest (n = 166) using Classification and Regression Tree (CART) analysis to identify institutional characteristics independently associated with high-performing centers. Of the 72 structural characteristics explored, only three were independently associated with high-performing centers: annual patient visits to the emergency department of fewer than 61,000; a proportion of patients on Medicare greater than 20%; and continuing medical education for the emergency department physician liaison to the trauma program of between 55 and 113 hours annually. Each 5% increase in the O-E mortality ratio was associated with an increase in total length of stay of one day (r = 0.25; p < 0.001). Very few structural characteristics of ACS-verified trauma centers are associated with risk-adjusted mortality. Thus, variations in patient outcomes across trauma centers are likely related to variations in clinical practices. Therapeutic study, level III.
NASA Astrophysics Data System (ADS)
Pötzi, W.; Veronig, A. M.; Temmer, M.
2018-06-01
In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013. The event detection algorithm was upgraded in September 2017, and all data back to 2014 were reprocessed using the new algorithm. In order to evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. To overcome the problem of rare events, which biases the verification measures, we introduce a new event-based method: we divide the timeline of the Hα observations into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The hit rate for the new algorithm reached 96% (just five events were missed), with a false-alarm ratio of 17%. This is a significant improvement, as the original system had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm; originally, they were at 0.5. The mean flare positions are accurate to within ±1 heliographic degree for both algorithms, and the peak times improve from a mean difference of 1.7 ± 2.9 minutes to 1.3 ± 2.3 minutes. The flare start times, which had been systematically late by about 3 minutes as determined by the original algorithm, now match the visual inspection to within -0.47 ± 4.10 minutes.
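The measures quoted here (hit rate, false-alarm ratio, true skill score, Heidke skill score) are standard categorical verification statistics derived from a 2×2 contingency table of detected versus observed events. A minimal sketch (the counts in the test below are illustrative, not KSO's):

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 contingency-table forecast verification measures."""
    h, m, f, c = hits, misses, false_alarms, correct_negatives
    pod = h / (h + m)        # hit rate (probability of detection)
    far = f / (h + f)        # false-alarm ratio
    tss = pod - f / (f + c)  # true skill score (Hanssen-Kuipers discriminant)
    n = h + m + f + c
    # Heidke skill score: improvement in correct classifications over
    # the number expected by chance.
    expected = ((h + m) * (h + f) + (c + m) * (c + f)) / n
    hss = (h + c - expected) / (n - expected)
    return pod, far, tss, hss
```

Unlike the hit rate and false-alarm ratio, the two skill scores also use the correct negatives, which is why the event-based splitting of the timeline into flaring and quiet periods matters for rare events.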
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties; the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing the chains, densities, and correlations obtained using DRAM, DREAM, and direct evaluation of Bayes' formula. We also perform a similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by the different methods for the HIV model; the energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model, but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models.
To accommodate the nonlinear input-to-output relations typical of such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of these techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection. The objective of active subspace methods is to determine the subspace of inputs that most strongly affects the model response and to reduce the dimension of the input space. The major difference between the two approaches is that parameter selection identifies individual influential parameters, whereas subspace selection identifies linear combinations of parameters that significantly impact the model responses. We employ the active subspace methods discussed in [22] for the HIV model and verify that the active subspace successfully reduces the input dimension.
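The active subspace construction can be sketched as follows: estimate the matrix C = E[∇f ∇fᵀ] by Monte Carlo sampling of gradients, then take its dominant eigenvectors as the active directions. The toy model below, a response that varies only along one linear combination of two inputs, is an invented illustration, not the HIV model from the text.

```python
import math
import random

random.seed(0)

# Toy model whose response varies only along the direction (0.7, 0.3):
# f(x1, x2) = exp(0.7*x1 + 0.3*x2), so grad f is parallel to (0.7, 0.3).
def f_grad(x1, x2):
    v = math.exp(0.7 * x1 + 0.3 * x2)
    return (0.7 * v, 0.3 * v)

# Monte Carlo estimate of C = E[grad f grad f^T] over [-1, 1]^2.
a = b = c = 0.0
M = 2000
for _ in range(M):
    g1, g2 = f_grad(random.uniform(-1, 1), random.uniform(-1, 1))
    a += g1 * g1; b += g1 * g2; c += g2 * g2
a, b, c = a / M, b / M, c / M

# Dominant eigenvector of the symmetric 2x2 matrix [[a, b], [b, c]].
lam = 0.5 * ((a + c) + math.sqrt((a - c) ** 2 + 4 * b ** 2))
v1, v2 = b, lam - a
n = math.hypot(v1, v2)
v1, v2 = v1 / n, v2 / n

# The recovered active direction should align with (0.7, 0.3) normalized.
print(v1, v2)
```

In this rank-one toy case the eigenvector recovers the active direction exactly; for realistic models the eigenvalue gap tells you how many directions to keep.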
Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.
Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K
2014-10-07
We introduce the automation of the range difference calculation deduced from particle-irradiation-induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability by monitoring algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on minimizing the absolute differences between two activity depth profiles, shifted against each other, in their distal parts. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or to measured data from different treatment sessions. In comparison to previous work, the proposed approach automatically identifies the distal region of interest for each pair of PET depth profiles, taking the planned dose distribution into account, and returns the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematically large range differences. Furthermore, additional patient-specific uncertainty factors are calculated from available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. Measured PET activity distributions are compared with predictions obtained from Monte Carlo simulations or with measurements from previous treatment fractions. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and the Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively.
Calculated range differences between the compared activity distributions are reported in a 2D map in beam's-eye view. In comparison to previously proposed approaches, the new most-likely-shift method gives more robust results for assessing the range in vivo from strongly varying PET distributions caused by differing patient geometries, ion beam species, beam delivery techniques, PET imaging concepts, and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to understanding the reliability of observed range differences and the complexity of predicting activity distributions. The proposed method promises a feasible technique for routine clinical PET-based range verification.
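A minimal sketch of the most-likely-shift idea: slide one depth profile against the other and take the shift that minimizes the summed absolute difference over the distal region of interest. The sigmoid profiles, the fixed distal window, and the 1 mm binning below are simplifying assumptions for illustration; the paper's method identifies the distal region automatically and weights the result by an uncertainty estimate.

```python
import math

# Two synthetic activity depth profiles on 1 mm bins; the second falls
# off 3 mm deeper, standing in for a measured vs. simulated pair.
def profile(depth, fall_off):
    # flat plateau with a sigmoid distal fall-off
    return 1.0 / (1.0 + math.exp(depth - fall_off))

ref  = [profile(z, 100.0) for z in range(150)]
meas = [profile(z, 103.0) for z in range(150)]

def most_likely_shift(p, q, distal, max_shift=10):
    """Shift q against p; return the shift (in bins) minimizing the
    summed absolute difference over the distal region of interest."""
    best, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost = sum(abs(p[z] - q[z + s]) for z in distal)
        if cost < best_cost:
            best, best_cost = s, cost
    return best

distal = range(90, 120)   # distal fall-off region (assumed known here)
print(most_likely_shift(ref, meas, distal))  # -> 3  (range difference in mm)
```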
Escobedo, Patricia; Cruz, Tess Boley; Tsai, Kai-Ya; Allem, Jon-Patrick; Soto, Daniel W; Kirkpatrick, Matthew G; Pattarroyo, Monica; Unger, Jennifer B
2017-09-11
Limited information exists about the strategies and methods used on brand marketing websites to transmit pro-tobacco messages to tobacco users and potential users. This study compared age verification methods, themes, interactive activities, and links to social media across tobacco brand websites. The study examined 12 tobacco brand websites representing four tobacco product categories: cigarettes, cigar/cigarillos, smokeless tobacco, and e-cigarettes. Website content was analyzed by tobacco product category, and data from all website visits (n = 699) were analyzed. Adult smokers (n = 32) coded the websites during a one-year period, indicating whether or not they observed any of 53 marketing themes, seven interactive activities, or five external links to social media sites. Most (58%) websites required online registration before entering; e-cigarette websites, however, used click-through age verification. Compared to cigarette sites, cigar/cigarillo sites were more likely to feature themes related to a "party" lifestyle, and e-cigarette websites were much more likely to feature themes related to harm reduction. Cigarette sites featured greater levels of interactive content than other tobacco product sites. Compared to cigarette sites, cigar/cigarillo sites were more likely to feature activities related to events and music, and both cigar and e-cigarette sites were more likely to direct visitors to external social media sites. These marketing methods and strategies normalize tobacco use by providing website visitors with positive themes combined with interactive content, and they are an area for future research. Moreover, all tobacco products under federal regulatory authority should be required to use more stringent age verification gates.
The findings indicate that the Food and Drug Administration (FDA) should require the brand websites of all tobacco products under its regulatory authority to use more stringent age verification gates, requiring all visitors to be at least 18 years of age and to register online prior to entry. This is important given that marketing strategies may encourage experimentation with tobacco or deter quit attempts among website visitors. Future research should examine the use of interactive activities and social media on a wide variety of tobacco brand websites, as interactive content is associated with more active information processing. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
77 FR 21616 - Agency Information Collection Activities: Proposed Request and Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-10
... disability payments. SSA considers the claimants the primary sources of verification; therefore, if claimants... or private self-insured companies administering WC/PDB benefits to disability claimants. Type of...
Creep fatigue life prediction for engine hot section materials (ISOTROPIC)
NASA Technical Reports Server (NTRS)
Nelson, R. S.; Schoendorf, J. F.; Lin, L. S.
1986-01-01
The specific activities summarized include: verification experiments (base program); thermomechanical cycling model; multiaxial stress state model; cumulative loading model; screening of potential environmental and protective coating models; and environmental attack model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swart, Peter K.; Dixon, Tim
2014-09-30
A series of surface geophysical and geochemical techniques are tested in order to demonstrate and validate low-cost approaches for Monitoring, Verification and Accounting (MVA) of the integrity of deep reservoirs for CO2 storage. These techniques are (i) surface deformation by GPS; (ii) surface deformation by InSAR; (iii) passive-source seismology via broadband seismometers; and (iv) soil gas monitoring with a cavity ring-down spectrometer for measurement of CO2 concentration and carbon isotope ratio. The techniques were tested at an active EOR (Enhanced Oil Recovery) site in Texas. Each approach has demonstrated utility. Assuming Carbon Capture, Utilization and Storage (CCUS) activities become operational in the future, these techniques can be used to augment more expensive down-hole techniques.
NASA Astrophysics Data System (ADS)
Hendricks Franssen, H. J.; Post, H.; Vrugt, J. A.; Fox, A. M.; Baatz, R.; Kumbhar, P.; Vereecken, H.
2015-12-01
Estimation of net ecosystem exchange (NEE) by land surface models is strongly affected by uncertain ecosystem parameters and initial conditions. A possible approach is to estimate plant functional type (PFT)-specific parameters at sites with measurement data such as NEE, and to apply those parameters at other sites with the same PFT and no measurements. This upscaling strategy was evaluated in this work for sites in Germany and France. Ecosystem parameters and initial conditions were estimated with NEE time series of one year in length, or time series of only one season. The DREAM(zs) algorithm was used for the estimation of parameters and initial conditions; DREAM(zs) is not limited to Gaussian distributions and can condition on large time series of measurement data simultaneously. DREAM(zs) was used in combination with the Community Land Model (CLM) v4.5. Parameter estimates were evaluated by model predictions at the same site for an independent verification period. In addition, the parameter estimates were evaluated at other, independent sites situated >500 km away with the same PFT. The main conclusions are: (i) simulations with estimated parameters better reproduced the NEE measurement data in the verification periods, including the annual NEE sum (23% improvement), the annual NEE cycle, and the average diurnal NEE course (error reduction by a factor of 1.6); (ii) estimated parameters based on seasonal NEE data outperformed estimated parameters based on yearly data; (iii) those seasonal parameters were often also significantly different from their yearly equivalents; and (iv) estimated parameters were significantly different if initial conditions were estimated together with the parameters. We conclude that estimated PFT-specific parameters significantly improve land surface model predictions at independent verification sites and for independent verification periods, demonstrating their potential for upscaling.
However, the simulation results also indicate that the estimated parameters may mask other model errors, which would imply that their application at climatic time scales would not improve model predictions. A central question is whether the integration of many different data streams (e.g., biomass, remotely sensed LAI) could solve the problems indicated here.
Code Verification of the HIGRAD Computational Fluid Dynamics Solver
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.
2012-05-04
The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at Los Alamos National Laboratory and used to simulate phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step in establishing the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented, without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) passive advection, (iii) passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization, and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected and that the quality of the solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities; a more comprehensive evaluation of HIGRAD may be beneficial for future studies.
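The convergence-rate step of such a study can be illustrated generically: compute the error against an exact solution at successive mesh refinements and recover the observed order of accuracy p = log(e_h / e_{h/2}) / log 2. The midpoint-rule quadrature below is only a stand-in for a PDE solver; it is formally second-order, so the observed order should approach 2.

```python
import math

# Code-verification style convergence study: compare a numerical result
# against an exact one at several mesh sizes and estimate the observed
# order of accuracy (here: midpoint-rule quadrature, formally 2nd order).
def midpoint_integral(f, a, b, n):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

exact = math.e - 1.0                        # integral of exp(x) on [0, 1]
errors = []
for n in (16, 32, 64):
    errors.append(abs(midpoint_integral(math.exp, 0.0, 1.0, n) - exact))

# Observed order between successive refinements (h halves each time):
orders = [math.log(errors[k] / errors[k + 1]) / math.log(2.0)
          for k in range(len(errors) - 1)]
print(orders)  # each entry should be close to 2
```

A solver whose observed order falls short of its formal order on such a study is a red flag for an implementation mistake, which is exactly the evidence code verification is meant to produce.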
The use of robots for arms control treaty verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalowski, S.J.
1991-01-01
Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted; in fact, it was encountered only once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous, or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security, and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.
Families of Functions and Functions of Proof
ERIC Educational Resources Information Center
Landman, Greisy Winicki
2002-01-01
This article describes an activity for secondary school students that may constitute an appropriate opportunity to discuss with them the idea of proof, particularly in an algebraic context. During the activity the students may experience and understand some of the roles played by proof in mathematics in addition to verification of truth:…
NASA Technical Reports Server (NTRS)
Gernand, Jeremy M.
2004-01-01
Experience with the International Space Station (ISS) program demonstrates the degree to which engineering design and operational solutions must protect crewmembers from health risks due to long-term exposure to the microgravity environment. Risks to safety and health from degradation in the microgravity environment include crew inability to complete emergency or nominal activities, increased risk of injury, and inability to complete a safe return to the ground due to reduced strength or embrittled bones. Without controls, these risks slowly increase in probability over the length of the mission and become more significant for increasing mission durations. Countermeasures to microgravity include hardware systems that place a crewmember's body under elevated stress to produce an effect similar to daily exposure to gravity. The ISS countermeasure system is predominantly composed of customized exercise machines. The historical treatment of microgravity countermeasure systems as medical research experiments unintentionally reduced their foreseen importance, and therefore the capability of the systems to function in a long-term operational role. Long-term hazardous effects and steadily increasing operational risks due to non-functional countermeasure equipment require a more rigorous design approach and the incorporation of redundancy into seemingly non-mission-critical hardware systems. Variations in the rate of health degradation and responsiveness to countermeasures among the crew population drastically increase the challenge of design requirements development and verification of the appropriate risk control strategy. The long-term nature of the hazards and severe limits on logistical resupply mass, volume, and frequency complicate the assessment of hardware availability and the verification of an adequate maintenance and sparing plan.
Achieving medically defined performance requirements in microgravity countermeasure systems and incorporating adequate failure tolerance significantly reduce these risks. Future implementation of on-site monitoring hardware for critical health parameters, such as bone mineral density, would allow greater responsiveness, efficiency, and optimized design of the countermeasure system.
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
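The Horn-clause encoding can be sketched propositionally: verification conditions become clauses, and the program is safe iff a designated error atom is not derivable from them. SeaHorn actually works over constrained (first-order) Horn clauses discharged by SMT-based engines; the forward-chaining checker below is only a propositional toy illustrating the derivability question, with invented atom names.

```python
# A verification condition encoded as propositional Horn clauses:
# each clause is (body, head); the program is "safe" iff the
# distinguished atom "error" is not derivable from the clauses.
def derivable(clauses, goal):
    facts = set()
    changed = True
    while changed:                      # forward chaining to a fixpoint
        changed = False
        for body, head in clauses:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return goal in facts

# Toy program: init establishes inv; inv is preserved; inv excludes error.
safe_vc  = [([], "init"), (["init"], "inv"), (["inv"], "inv_next")]
# A buggy variant where the invariant no longer excludes the error state:
buggy_vc = safe_vc + [(["inv"], "error")]

print(derivable(safe_vc, "error"), derivable(buggy_vc, "error"))  # False True
```

The design payoff mentioned in the abstract follows from this shape: once VCs are Horn clauses, any Horn-clause solver can be plugged in behind the same interface.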
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
NASA Astrophysics Data System (ADS)
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On the one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage; on the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving of digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to secure the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques such as digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems that public-key cryptography brings when applied to the verification of digital evidence.
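A minimal sketch of the seal-and-verify idea behind such models: hash the evidence, bind the hash to a timestamp, and authenticate the resulting record. Here a keyed MAC (HMAC) stands in for the asymmetric digital signature and trusted time-stamping authority that a PKI-based system like PKIDEV would actually use; the key name and record format are invented for illustration.

```python
import hashlib
import hmac
import json
import time

SECRET = b"examiner-device-key"   # hypothetical symmetric key (stand-in for PKI)

def seal_evidence(payload: bytes, when: float) -> dict:
    # Bind the evidence hash to an acquisition timestamp, then MAC the pair.
    digest = hashlib.sha256(payload).hexdigest()
    record = {"sha256": digest, "timestamp": when}
    canon = json.dumps(record, sort_keys=True).encode()
    record["mac"] = hmac.new(SECRET, canon, hashlib.sha256).hexdigest()
    return record

def verify_evidence(payload: bytes, record: dict) -> bool:
    # Recompute the hash and MAC; any bit flip in the payload breaks both.
    body = {"sha256": hashlib.sha256(payload).hexdigest(),
            "timestamp": record["timestamp"]}
    canon = json.dumps(body, sort_keys=True).encode()
    mac = hmac.new(SECRET, canon, hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, record["mac"])

evidence = b"disk image bytes ..."
rec = seal_evidence(evidence, time.time())
print(verify_evidence(evidence, rec), verify_evidence(evidence + b"x", rec))
# -> True False
```

Replacing the HMAC with a signature under the examiner's private key is what moves the scheme from shared-secret integrity to the third-party-verifiable authenticity a court requires.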
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bentefour, El H., E-mail: hassan.bentefour@iba-group.com; Prieels, Damien; Tang, Shikui
Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which is the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. This paper reports a validation study of an in-vivo range verification method that can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom is used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range-modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon and read by an ADC system at 100 kHz. The method is validated against beam range measurements by dose extinction, first in a water phantom and then in the pelvic phantom for both open-field and treatment-field configurations. The beam range results are then compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when two of the diodes that show signs of significant range mixing are discarded, the two methods agree to within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison with the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact underestimation depends on the strategy used to evaluate the WEPL results; by our best evaluation, XIO underestimates the treatment beam range by between 1.7% and 4.1%.
Conclusions: The time-resolved dose measurement method satisfies the two basic requirements necessary for clinical use, WEPL accuracy and minimum dose, demonstrating its potential for in-vivo proton range verification. Further development is needed, namely devising a workflow that takes into account the limits imposed by proton range mixing and the susceptibility of the comparison of measured and expected WEPLs to errors in the detector positions. The method may also be used for in-vivo dosimetry and could benefit various proton therapy treatments.
7 CFR 989.77 - Verification of reports and records.
Code of Federal Regulations, 2010 CFR
2010-01-01
... representatives, shall have access to any handler's premises during regular business hours and shall be permitted... advertising activities conducted by handlers under § 989.53. Each handler shall furnish all labor and...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2014 CFR
2014-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2010 CFR
2010-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2013 CFR
2013-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2012 CFR
2012-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
6 CFR Appendix B to Part 5 - Public Reading Rooms of the Department of Homeland Security
Code of Federal Regulations, 2011 CFR
2011-01-01
...-proliferation and verification research and development program. The life sciences activities related to... book) 11. Former components of the General Services Administration: For the Federal Computer Incident...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
...: VA Form Letter 21-914 is used to verify whether Filipino veterans of the Special Philippine Scouts, Commonwealth Army of the Philippines, organized guerrilla groups receiving service-connected compensation...
Static and Dynamic Verification of Critical Software for Space Applications
NASA Astrophysics Data System (ADS)
Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.
Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation, and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error-free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation; one of the main reasons is the immaturity of this field with respect to its application to the increasing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques.
The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. This work is being performed under the project STADY (Applied Static and Dynamic Verification of Critical Software), ESA/ESTEC Contract No. 15751/02/NL/LvH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Qingdu; Guo, Jianli; Yang, Xiao-Song, E-mail: yangxs@hust.edu.cn
We present some rich new complex gaits in the simple walking model with upper body by Wisse et al. [Robotica 22, 681 (2004)]. We first show that the stable gait found by Wisse et al. may become chaotic via period-doubling bifurcations. Such period-doubling routes to chaos exist for all parameters, such as foot mass, upper body mass, body length, hip spring stiffness, and slope angle. We then report three new gaits with periods 3, 4, and 6; for each gait, there is also a period-doubling route to chaos. Finally, we show a practical method for finding a topological horseshoe in the 3D Poincaré map and present a rigorous verification of chaos from these gaits.
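A period-doubling route to chaos of the kind described can be illustrated generically with the logistic map standing in for the walker's Poincaré return map (this is not the walking model itself; the map, parameter values, and tolerances are illustrative).

```python
# Detect the period of the settled orbit of the logistic map x -> r*x*(1-x)
# at a given parameter r; period doubling appears as r increases.
def attractor_period(r, tol=1e-6, max_period=32):
    x = 0.5
    for _ in range(4000):              # transient: settle onto the attractor
        x = r * x * (1 - x)
    x0 = x
    for p in range(1, max_period + 1):
        x = r * x * (1 - x)
        if abs(x - x0) < tol:
            return p
    return None                        # no short period found: chaotic band

# Period doubles as the parameter increases: 1 -> 2 -> 4 -> ... -> chaos.
print([attractor_period(r) for r in (2.8, 3.2, 3.5)])  # -> [1, 2, 4]
print(attractor_period(3.9))           # deep in the chaotic regime
```

In the walking model the same diagnostic is applied to the step-to-step return map, with the gait's state sampled once per stride instead of once per map iteration.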
Witnessing effective entanglement over a 2 km fiber channel.
Wittmann, Christoffer; Fürst, Josef; Wiechers, Carlos; Elser, Dominique; Häseler, Hauke; Lütkenhaus, Norbert; Leuchs, Gerd
2010-03-01
We present a fiber-based continuous-variable quantum key distribution system. In the scheme, a quantum signal of two non-orthogonal weak optical coherent states is sent through a fiber-based quantum channel. The receiver simultaneously measures conjugate quadratures of the light using two homodyne detectors. From the measured Q-function of the transmitted signal, we estimate the attenuation and the excess noise caused by the channel. The estimated excess noise originating from the channel and the channel attenuation, including the quantum efficiency of the detection setup, are investigated with respect to the detection of effective entanglement. The local oscillator is considered in the verification. We witness effective entanglement with a channel length of up to 2 km.
Theoretical Insight into Dispersion of Silica Nanoparticles in Polymer Melts.
Wei, Zhaoyang; Hou, Yaqi; Ning, Nanying; Zhang, Liqun; Tian, Ming; Mi, Jianguo
2015-07-30
Silica nanoparticles dispersed in polystyrene, poly(methyl methacrylate), and poly(ethylene oxide) melts have been investigated using a density functional approach. The polymers are regarded as coarse-grained semiflexible chains, and the segment sizes are represented by their Kuhn lengths. The particle-particle and particle-polymer interactions are calculated with the Hamaker theory to reflect the relationship between particles and polymer melts. The effects of particle volume fraction and size on particle dispersion have been quantitatively determined to evaluate the dispersion/aggregation behavior in these polymer melts. It is shown that the theoretical predictions are generally in good agreement with the corresponding experimental results, providing a reasonable verification of particle dispersion/agglomeration and polymer depletion.
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location-based applications, and biometric authentication such as fingerprint verification and face and handwritten-signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes to use built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time, i.e. ensuring that the authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
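The PIN-seeded integrity signature described above can be sketched as follows. This is a hypothetical reconstruction, not the SecurePhone implementation: it stretches the PIN into a key with PBKDF2 and uses an HMAC over the verification module's bytes as the self-test signature.

```python
import hashlib, hmac

def derive_key(pin: str, salt: bytes) -> bytes:
    # Stretch the short PIN into a key (PBKDF2 with a per-device salt).
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def sign_module(key: bytes, module_bytes: bytes) -> bytes:
    # Reference signature, computed once over the trusted verification code.
    return hmac.new(key, module_bytes, hashlib.sha256).digest()

def self_test(key: bytes, module_bytes: bytes, reference: bytes) -> bool:
    # Built-in self-test: recompute the signature and compare in constant time.
    return hmac.compare_digest(sign_module(key, module_bytes), reference)

salt = b"device-unique-salt"                  # illustrative value
key = derive_key("1234", salt)
trusted = b"verification-routine-bytes"       # stand-in for the module image
ref = sign_module(key, trusted)
assert self_test(key, trusted, ref)                 # untampered module passes
assert not self_test(key, trusted + b"\x90", ref)   # patched module fails
```

Because the key is derived from the user's PIN, an imposter who modifies the verification code cannot regenerate a matching signature without also knowing the PIN.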
NASA's Approach to Software Assurance
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2015-01-01
NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life-cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline: software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for the safety and mission success assurance of NASA's software, software assurance covers a wide area and is structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk-based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.
NASA Astrophysics Data System (ADS)
Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders
2013-08-01
The purpose of this study was to investigate in vivo verification of radiation treatment with high-energy photon beams, using PET/CT to image the induced positron activity. Measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. Total doses of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo based on the distribution of the induced positron emitters produced by photonuclear reactions in tissue, mapped onto the associated dose distribution of the treatment plan. The results showed that the spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions, but less so in blood- and oxygen-rich soft tissues. For the preoperative rectal cancer patient, however, a 2 (± 0.5) cm misalignment was observed in the cranial-caudal direction between the induced activity distribution and the treatment plan, indicating a patient setup error relative to the beam. No misalignment of this kind was seen in the prostate cancer patient. However, due to a hurried patient setup in the PET/CT scanner, a slight mis-positioning of the patient was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the positron emitters induced by high-energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat, allowing portal verification of the delivered treatment beams.
Measurement of the induced activity in the patient 7 min after receiving 5 Gy involved count rates about 20 times lower than those of a patient undergoing a standard 18F-FDG examination. When using a combination of short-lived nuclides such as 15O (half-life: 2 min) and 11C (half-life: 20 min) at low activity, clinical reconstruction protocols are not optimal. It might thus be desirable to further optimize reconstruction parameters, and to address hardware improvements, in realizing in vivo treatment verification with PET/CT in the future. A significant improvement with regard to 15O imaging could also be expected by locating the PET/CT unit close to the radiation treatment room.
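The low count rates follow directly from the half-lives quoted above. A quick decay calculation (standard exponential-decay arithmetic, not taken from the paper) shows how much of each nuclide's activity survives the 7 min delay before imaging starts:

```python
import math

def remaining_fraction(half_life_min: float, delay_min: float) -> float:
    # A(t) = A0 * exp(-ln(2) * t / T_half)
    return math.exp(-math.log(2) * delay_min / half_life_min)

# Fractions of the initial induced activity left when imaging starts (7 min delay):
f_o15 = remaining_fraction(2.0, 7.0)   # 15O, half-life 2 min
f_c11 = remaining_fraction(20.0, 7.0)  # 11C, half-life 20 min
print(f"15O: {f_o15:.3f}, 11C: {f_c11:.3f}")  # → 15O: 0.088, 11C: 0.785
```

Only about 9% of the 15O activity remains at the start of imaging, while roughly 78% of the 11C survives, which is why moving the PET/CT unit closer to the treatment room would chiefly benefit 15O imaging.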
NASA Astrophysics Data System (ADS)
Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.
2017-12-01
Ion beam irradiations can deliver conformal dose distributions that minimize damage to healthy tissues, thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainty, so treatment verification is a critical issue. During treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used for treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate radiation transport and interaction with matter. In this work, the FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered to phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads and designed to be installed along the beam line, so that data can also be acquired during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors' response for activity range verification.
The PLAID graphics analysis impact on the space program
NASA Technical Reports Server (NTRS)
Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.
1994-01-01
An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of the project might depend on the complete verification of a particular stage. Currently, several software packages at JSC provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize some ongoing studies in kinematics, lighting, and EVA activities, and discuss various applications in the mission planning of current Space Shuttle flights and the assembly sequence of Space Station Freedom, with emphasis on the redesign effort.
Test and training simulator for ground-based teleoperated in-orbit servicing
NASA Technical Reports Server (NTRS)
Schaefer, Bernd E.
1989-01-01
For the post-IOC (In-Orbit Construction) phase of COLUMBUS, it is intended to use robotic devices for the routine operations of ground-based teleoperated in-orbit servicing. A hardware simulator for verification of the relevant in-orbit operations technologies, the Servicing Test Facility, is necessary; it will mainly support the Flight Control Center for the Manned Space Laboratories in operational tasks such as system simulation, training of teleoperators, parallel operation simultaneous with actual in-orbit activities, and verification of the ground operations segment for telerobotics. The present status of the definition of the facility's functional and operational concept is described.
Towards Formal Verification of a Separation Microkernel
NASA Astrophysics Data System (ADS)
Butterfield, Andrew; Sanan, David; Hinchey, Mike
2013-08-01
The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.
NASA Astrophysics Data System (ADS)
Rausch, Peter; Verpoort, Sven; Wittrock, Ulrich
2017-11-01
Concepts for future large space telescopes require an active optics system to mitigate aberrations caused by thermal deformation and gravitational release. Such a system would allow on-site correction of wave-front errors and ease the requirements for thermal and gravitational stability of the optical train. In the course of the ESA project "Development of Adaptive Deformable Mirrors for Space Instruments" we have developed a unimorph deformable mirror designed to correct low-order aberrations and intended for use in the space environment. We briefly report on the design and manufacturing of the deformable mirror and present results from performance verification and environmental testing.
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2013-12-01
The ADR (advection-dispersion-reaction) equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear, so numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and to check for rigorous discretization of the PDEs and correct implementation of initial/boundary conditions. In computational PDE work, verification is not a well-defined procedure with a clear path. Verification tests should therefore be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be analyzed mathematically to distinguish between an inherent limitation of an algorithm and a coding error. Code verification is therefore something of an art, in which innovative methods and case-based tricks are common. This study presents the full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections. We convey our experiences in finding several errors that were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, starting from simple tests and building up to the most sophisticated level.
Appropriate verification metrics are defined for the required capabilities of the solver: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. We thereby provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests, which compare the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of these code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into an ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
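The mesh convergence test described above compares the observed order of accuracy against the formal order, and Richardson extrapolation supplies a benchmark when no exact solution exists. A minimal sketch of both calculations (generic verification arithmetic, not the authors' code):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    # Observed order of accuracy from solutions on three grids with a
    # constant refinement ratio r (f_fine is on the smallest spacing).
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    # Extrapolated (higher-accuracy) value assuming order-p convergence.
    return f_fine + (f_fine - f_medium) / (r ** p - 1)

# Synthetic second-order data: f(h) = f_exact + C*h^2 with f_exact = 1.0
f = lambda h: 1.0 + 0.3 * h ** 2
p = observed_order(f(0.4), f(0.2), f(0.1))
print(p)                                           # ≈ 2.0
print(richardson_extrapolate(f(0.2), f(0.1), p))   # ≈ 1.0 (recovers f_exact)
```

If the observed p falls well below the scheme's formal order on smooth benchmark problems, that mismatch is the signal of a discretization or implementation bug that this test suite is designed to catch.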
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2011 CFR
2011-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
18 CFR 281.213 - Data Verification Committee.
Code of Federal Regulations, 2010 CFR
2010-04-01
... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Data Verification....213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification...
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcomes. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. A clearer understanding of how verification is itself a political problem, and of how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
David L. Chichester; James T. Johnson; Edward H. Seabury
2012-07-01
Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary approach to these other techniques by providing a means to qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and the detector. At a simpler level, the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision-making and reporting systems protected behind information barriers.
This paper reports recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and experiments using fission-spectrum neutron sources to assess neutron transmission through composite low-Z attenuators.
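The transmission measurement described above rests on simple exponential attenuation. The sketch below shows the arithmetic with placeholder cross-section values, not evaluated nuclear data:

```python
import math

# Transmission of fast neutrons through a slab: T(E) = exp(-N * sigma(E) * t),
# where N is the atomic number density (atoms/cm^3), sigma(E) the energy-
# dependent total cross section (cm^2), and t the slab thickness (cm).
N_CARBON = 1.13e23          # atoms/cm^3 for graphite (density ~2.26 g/cm^3)
BARN = 1e-24                # cm^2 per barn

def transmission(sigma_barn: float, thickness_cm: float,
                 n_density: float = N_CARBON) -> float:
    return math.exp(-n_density * sigma_barn * BARN * thickness_cm)

# Near a resonance feature the cross section changes sharply with energy,
# so the ratio of transmissions at two energies encodes the areal density
# of the intervening element. Sigma values below are placeholders only.
t = 5.0  # cm
print(transmission(2.6, t))   # off-resonance (placeholder sigma, in barns)
print(transmission(1.4, t))   # in a cross-section minimum (placeholder)
```

In the paper's scheme, comparing the measured spectrum against a fission-spectrum starting distribution at several such energies is what allows the stoichiometric composition of the attenuator to be estimated.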
NASA Technical Reports Server (NTRS)
Cleveland, Paul E.; Parrish, Keith A.
2005-01-01
A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale of the observatory, which allows for radiator sizes compatible with the expected power dissipation of the instruments and the large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large-scale observatory features, which make passive cooling viable, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly.
After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.
Day length is associated with physical activity and sedentary behavior among older women.
Schepps, Mitchell A; Shiroma, Eric J; Kamada, Masamitsu; Harris, Tamara B; Lee, I-Min
2018-04-26
Physical activity may be influenced by one's physical environment, including day length and weather. Studies of physical activity, day length, and weather have primarily used self-reported activity, broad meteorological categorization, and limited geographic regions. We aim to examine the association of day length and physical activity in a large cohort of older women, covering a wide geographic range. Participants (N = 16,741; mean (SD) age = 72.0 (5.7) years) were drawn from the Women's Health Study and lived throughout the United States. Physical activity was assessed by accelerometer (ActiGraph GT3X+) between 2011 and 2015. Day length and weather information were obtained by matching weather stations to the participants' locations using National Oceanic and Atmospheric Administration databases. Women who experienced day lengths greater than 14 hours had 5.5% more steps, 9.4% more moderate-to-vigorous physical activity, and 1.6% less sedentary behavior, compared to women who experienced day lengths less than 10 hours, after adjusting for age, accelerometer wear, temperature, and precipitation. Day length is associated with physical activity and sedentary behavior in older women, and needs to be considered in programs promoting physical activity as well as in analyses of accelerometer data covering wide geographic regions.
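Day length for a given location can be computed from latitude and date with the standard sunrise equation. The sketch below uses a simple cosine model for solar declination and is only an approximation (the study itself drew day length from NOAA databases):

```python
import math

def day_length_hours(latitude_deg: float, day_of_year: int) -> float:
    # Standard sunrise-equation approximation (no atmospheric refraction).
    # Solar declination in degrees, simple cosine model:
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    lat, dec = math.radians(latitude_deg), math.radians(decl)
    cos_h = -math.tan(lat) * math.tan(dec)
    cos_h = max(-1.0, min(1.0, cos_h))   # clamp for polar day/night
    # Sunrise-to-sunset hour angle (degrees) converted to hours (15 deg/hour):
    return 2.0 * math.degrees(math.acos(cos_h)) / 15.0

# Boston area (42.4 N): near the summer vs winter solstice (approximate)
print(day_length_hours(42.4, 172))  # around 15 h
print(day_length_hours(42.4, 355))  # around 9 h
```

This illustrates why the study's >14 h and <10 h day-length categories roughly correspond to summer and winter at mid-northern US latitudes.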
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify, in part, the SIFT ultrareliable aircraft computer. Topics covered include: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
VERIFICATION AND VALIDATION OF THE SPARC MODEL
Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-31
... to determine Filipino Veterans or beneficiaries receiving benefit at the full-dollar rate continues... approved collection. Abstract: VA Form Letter 21-914 is used to verify whether Filipino Veterans of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-07
... to determine Filipino veterans or beneficiaries receiving benefit at the full-dollar rate continues... approved collection. Abstract: VA Form Letter 21-914 is used to verify whether Filipino veterans of the...
Code of Federal Regulations, 2010 CFR
2010-01-01
... (including verification of identity based on fingerprinting), employment history, education, and personal..., training, or education to effectively utilize the specific Safeguards Information in the proceeding. Where... performing active operations on material such as chemical transformation, physical transformation, or transit...
Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.
Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M
2000-02-01
Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). The experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by the National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to check their own blood pressures and electrocardiograms, and to view instructions from the ground station and listings of work procedures. Specifically, the HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on the JEM.
Verification study of an emerging fire suppression system
Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...
2016-01-01
Self-contained fire extinguishers are a robust, reliable, and minimally invasive means of fire suppression for gloveboxes. However, plutonium gloveboxes present harsh environmental conditions for polymer materials, including radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with the chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.
Optimizing IV and V for Mature Organizations
NASA Technical Reports Server (NTRS)
Fuhman, Christopher
2003-01-01
NASA is intending for its future software development agencies to have at least a Level 3 rating in the Carnegie Mellon University Capability Maturity Model (CMM). The CMM has built-in Verification and Validation (V&V) processes that support higher software quality. Independent Verification and Validation (IV&V) of software developed by mature agencies can therefore be more effective than for software developed by less mature organizations. How does Independent V&V differ with respect to the maturity of an organization? Knowing a priori the maturity of an organization's processes, how can IV&V planners better identify areas of need, choose IV&V activities, and so on? The objective of this research is to provide a complementary set of guidelines and criteria to assist the planning of IV&V activities on a project, using a priori knowledge of the measurable maturity levels of the organization developing the software.
Russian-US collaboration on implementation of the active well coincidence counter (AWCC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mozhajev, V.; Pshakin, G.; Stewart, J.
The feasibility of using a standard AWCC at the Obninsk IPPE has been demonstrated through active measurements of single UO2 (36% enriched) disks and through passive measurements of plutonium metal disks used for simulating reactor cores. The role of the measurements is to verify the passport values assigned to the disks by the facility, and thereby facilitate the mass accountability procedures developed for the very large inventory of fuel disks at the facility. The AWCC is a very flexible instrument for verification measurements of the large variety of nuclear material items at the Obninsk IPPE and other Russian facilities. Future work at the IPPE will include calibration and verification measurements for other materials, both in individual disks and in multi-disk storage tubes; it will also include training in the use of the AWCC.
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...
This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...
Kostanyan, Artak E; Erastov, Andrey A
2016-09-02
The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Equations are presented for the peaks of individual cycles, for the transport of single peaks and complex chromatograms inside the recycling closed-loop, for the resolution between single solute peaks of neighboring cycles, for the resolution of peaks in the recycling chromatogram, and for the resolution between the chromatograms of neighboring cycles. It is shown that, unlike conventional chromatography, increasing the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment. Copyright © 2016 Elsevier B.V. All rights reserved.
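For Gaussian peaks, the resolution quantities discussed above reduce to the standard chromatographic definition Rs = 2(t2 - t1)/(w1 + w2). A minimal Python sketch (with purely illustrative retention times and widths, not values from the paper) shows why recycling can improve separation: the peak spacing grows linearly with the number of cycles.

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution of two peaks.
    t1, t2: retention times; w1, w2: baseline peak widths."""
    return 2.0 * (t2 - t1) / (w1 + w2)

def recycled_retention(t0, tc, n):
    """Retention time after n cycles, each adding the column time tc
    plus the extra-column (recycling line) residence time t0."""
    return n * (tc + t0)

# Illustrative numbers: two solutes after 3 cycles, 2 min extra-column time
tA = recycled_retention(t0=2.0, tc=10.0, n=3)  # 36.0 min
tB = recycled_retention(t0=2.0, tc=12.0, n=3)  # 42.0 min
Rs = resolution(tA, 4.0, tB, 4.0)              # 1.5
```

Because the spacing tB - tA grows in proportion to the cycle count while Gaussian peak widths grow only with its square root, resolution improves with each cycle until peaks of neighboring cycles begin to overlap.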
MIT's interferometer CST testbed
NASA Technical Reports Server (NTRS)
Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard
1990-01-01
The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.
Engineering support activities for the Apollo 17 Surface Electrical Properties Experiment.
NASA Technical Reports Server (NTRS)
Cubley, H. D.
1972-01-01
Description of the engineering support activities which were required to ensure fulfillment of objectives specified for the Apollo 17 SEP (Surface Electrical Properties) Experiment. Attention is given to procedural steps involving verification of hardware acceptability to the astronauts, computer simulation of the experiment hardware, field trials, receiver antenna pattern measurements, and the qualification test program.
ERIC Educational Resources Information Center
Marinis, Theodoros; Saddy, Douglas
2013-01-01
Twenty-five monolingual (L1) children with specific language impairment (SLI), 32 sequential bilingual (L2) children, and 29 L1 controls completed the Test of Active & Passive Sentences-Revised (van der Lely 1996) and the Self-Paced Listening Task with Picture Verification for actives and passives (Marinis 2007). These revealed important…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... of the ETA-9016 (OMB Control No. 1205-0268) on Alien Claims Activity Report; Comment Request on... the U.S. Citizenship and Immigration Services (USCIS), Systematic Alien Verification for Entitlement.... In addition, data from the Alien Claims Activity report is being used to assist the Secretary of...
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary
2018-01-01
Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…
NASA Technical Reports Server (NTRS)
Cornford, S.; Gibbel, M.
1997-01-01
NASA's Code QT Test Effectiveness Program is funding a series of applied research activities focused on utilizing the principles of physics and engineering of failure and those of engineering economics to assess and improve the value-added by the various validation and verification activities to organizations.
Using Aerial Photography to Estimate Riparian Zone Impacts in a Rapidly Developing River Corridor
NASA Astrophysics Data System (ADS)
Owers, Katharine A.; Albanese, Brett; Litts, Thomas
2012-03-01
Riparian zones are critical for protecting water quality and wildlife, but are often impacted by human activities. Ongoing threats and uncertainty about the effectiveness of buffer regulations emphasize the importance of monitoring riparian buffers through time. We developed a method to rapidly categorize buffer width and landuse attributes using 2007 leaf-on aerial photography and applied it to a 65 km section of the Toccoa River in north Georgia. We repeated our protocol using 1999 leaf-off aerial photographs to assess the utility of our approach for monitoring. Almost half (45%) of the length of the Toccoa River was bordered by buffers less than 50 ft wide in 2007, with agricultural and built-up lands having the smallest buffers. The percentage of river length in each buffer width category changed little between 1999 and 2007, but we did detect a 5% decrease in agricultural land use, a corresponding increase in built-up land use, and an additional 149 buildings within 100 ft of the river. Field verification indicated that our method overestimated buffer widths and forested land use and underestimated built-up land use and the number of buildings within 100 ft of the river. Our methodology can be used to rapidly assess the status of riparian buffers. Including supplemental data (e.g., leaf-off imagery, road layers) will allow detection of the fine-scale impacts underestimated in our study. Our results on the Toccoa River reflect historic impacts, exemptions and variances to regulations, and the ongoing threat of vacation home development. We recommend additional monitoring, improvements in policy, and efforts to increase voluntary protection and restoration of stream buffers.
30 CFR 250.913 - When must I resubmit Platform Verification Program plans?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
NASA Astrophysics Data System (ADS)
Damle, R. M.; Atrey, M. D.
2015-01-01
The aim of this work is to develop a transient program for the simulation of a miniature Joule-Thomson (J-T) cryocooler to predict its cool-down characteristics. A one dimensional transient model is formulated for the fluid streams and the solid elements of the recuperative heat exchanger. Variation of physical properties due to pressure and temperature is considered. In addition to the J-T expansion at the end of the finned tube, the distributed J-T effect along its length is also considered. It is observed that the distributed J-T effect leads to additional cooling of the gas in the finned tube and that it cannot be neglected when the pressure drop along the length of the finned tube is large. The mathematical model, method of resolution and the global transient algorithm, within a modular object-oriented framework, are detailed in this paper. As a part of verification and validation of the developed model, cases available in the literature are simulated and the results are compared with the corresponding numerical and experimental data.
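The transient formulation described above can be illustrated with a much simpler one-dimensional sketch: an explicit upwind time step for a single fluid stream exchanging heat with a solid wall. Everything here (geometry, coefficients, the lumped exchange term) is a hypothetical simplification, not the model from the paper.

```python
def step_fluid_temperature(T, T_wall, v, dx, dt, h_coeff):
    """One explicit time step for a 1-D fluid stream:
    upwind advection plus lumped heat exchange with the wall."""
    T_new = T[:]
    for i in range(1, len(T)):
        advect = -v * (T[i] - T[i - 1]) / dx      # upwind convection
        exchange = h_coeff * (T_wall[i] - T[i])   # heat exchange with wall
        T_new[i] = T[i] + dt * (advect + exchange)
    return T_new  # T_new[0] is held at the inlet temperature

# Warm inlet at 300 K flowing past a 250 K wall: the stream cools downstream
T = [300.0] * 10
wall = [250.0] * 10
for _ in range(100):
    T = step_fluid_temperature(T, wall, v=1.0, dx=0.1, dt=0.01, h_coeff=5.0)
```

The time step obeys the explicit stability limit (v·dt/dx = 0.1 here); a full cool-down model would add the second stream, wall conduction, property variation, and the distributed J-T term.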
A detailed experimental study of a DNA computer with two endonucleases.
Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz
2017-07-14
Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is scaling up its complexity, that is, increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We have presented a detailed experimental verification of its feasibility. We described the effect of the number of states, the length of input data, and the nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths such as ab, aab, aaab, ababa, and of an unaccepted word ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.
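The automaton itself is implemented biochemically, but its abstract behavior is that of an ordinary finite automaton. The sketch below simulates a toy DFA over the alphabet {a, b}: a hypothetical machine accepting words that begin with 'a', chosen only because it reproduces the accepted/rejected examples listed above; it is not the paper's six-state molecular automaton.

```python
def run_dfa(word, transitions, start, accepting):
    """Simulate a deterministic finite automaton on an input word."""
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting

# Toy DFA (hypothetical): accepts exactly the words starting with 'a'
T = {
    ("s0", "a"): "s1",   ("s0", "b"): "dead",
    ("s1", "a"): "s1",   ("s1", "b"): "s1",
    ("dead", "a"): "dead", ("dead", "b"): "dead",
}
for w in ["ab", "aab", "aaab", "ababa"]:
    assert run_dfa(w, T, "s0", {"s1"})   # accepted example words
assert not run_dfa("ba", T, "s0", {"s1"})  # rejected example word
```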
Advanced applications of cosmic-ray muon radiography
NASA Astrophysics Data System (ADS)
Perry, John
The passage of cosmic-ray muons through matter is dominated by the Coulomb interaction with electrons and atomic nuclei. The muon's interaction with electrons leads to continuous energy loss and stopping through the process of ionization. The muon's interaction with nuclei leads to angular diffusion. If a muon stops in matter, other processes unfold, as discussed in more detail below. These interactions provide the basis for advanced applications of cosmic-ray muon radiography discussed here, specifically: 1) imaging a nuclear reactor with near horizontal muons, and 2) identifying materials through the analysis of radiation lengths weighted by density and secondary signals that are induced by cosmic-ray muon trajectories. We have imaged a nuclear reactor, type AGN-201m, at the University of New Mexico, using data measured with a particle tracker built from a set of sealed drift tubes, the Mini Muon Tracker (MMT). Geant4 simulations were compared to the data for verification and validation. In both the data and simulation, we can identify regions of interest in the reactor including the core, moderator, and shield. This study reinforces our claims for using muon tomography to image reactors following an accident. Warhead and special nuclear materials (SNM) imaging is an important thrust for treaty verification and national security purposes. The differentiation of SNM from other materials, such as iron and aluminum, is useful for these applications. Several techniques were developed for material identification using cosmic-ray muons. These techniques include: 1) identifying the radiation length weighted by density of an object and 2) measuring the signals that can indicate the presence of fission and chain reactions. By combining the radiographic images created by tracking muons through a target plane with the additional fission neutron and gamma signature, we are able to locate regions that are fissionable from a single side. The following materials were imaged with this technique: aluminum, concrete, steel, lead, and uranium. Provided that there is sufficient mass, U-235 could be differentiated from U-238 through muon induced fission.
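The angular diffusion underlying the radiation-length analysis is commonly approximated by the Highland/PDG multiple-scattering formula, θ0 ≈ (13.6 MeV / βcp) · √(x/X0) · [1 + 0.038 ln(x/X0)] for a unit-charge particle. The sketch below applies it with approximate textbook radiation lengths; the formula and values are standard references, not numbers taken from this work.

```python
import math

def highland_theta0(p_mev, beta, thickness_cm, x0_cm):
    """RMS multiple-scattering angle (radians) from the Highland/PDG
    approximation for a unit-charge particle."""
    t = thickness_cm / x0_cm  # thickness in radiation lengths
    return 13.6 / (beta * p_mev) * math.sqrt(t) * (1 + 0.038 * math.log(t))

# Approximate radiation lengths in cm (standard tabulated values)
X0 = {"aluminum": 8.897, "iron": 1.757, "lead": 0.5612, "uranium": 0.3166}

# A ~3 GeV cosmic-ray muon (beta ~ 1) traversing 10 cm of each material
angles = {m: highland_theta0(3000.0, 1.0, 10.0, x0) for m, x0 in X0.items()}
# Denser, shorter-X0 materials scatter more: uranium > lead > iron > aluminum
```

This ordering of scattering angles is what lets density-weighted radiation length separate high-Z materials such as uranium from iron or aluminum.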
30 CFR 250.909 - What is the Platform Verification Program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.
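As a software analogy for property-driven verification (not the chip flow described above), the sketch below checks an "implementation" against a reference model by asserting one design property over randomized inputs; the toy ALU and all names are hypothetical.

```python
import random

def alu_model(op, a, b):
    """Reference model of a tiny 8-bit ALU (stand-in for a specification)."""
    return (a + b) & 0xFF if op == "add" else (a - b) & 0xFF

def alu_dut(op, a, b):
    """'Design under test'; here equivalent by construction."""
    return (a + b) % 256 if op == "add" else (a - b) % 256

def check_property(trials=1000):
    """Property: the DUT output matches the reference model on every input."""
    random.seed(0)  # deterministic for reproducibility
    for _ in range(trials):
        op = random.choice(["add", "sub"])
        a, b = random.randrange(256), random.randrange(256)
        if alu_model(op, a, b) != alu_dut(op, a, b):
            return False
    return True
```

The point of expressing verification components as properties is that the same property can drive stimulus generation, checking, and coverage, rather than being re-derived at each step.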
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, in which a master data-verification program calls multiple special-purpose subroutines and reads a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
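A minimal sketch of such a computerized screening routine, assuming a hypothetical record layout and screen-file structure (not the WATSTORE format):

```python
def verify_records(records, screen):
    """Flag hydrologic records that fail screen-file range criteria.

    records: list of (station_id, value) pairs
    screen:  {station_id: (low, high)} acceptable-range criteria
    Returns the records that need manual review.
    """
    flagged = []
    for station, value in records:
        low, high = screen[station]
        if not (low <= value <= high):
            flagged.append((station, value))
    return flagged

# Hypothetical daily-value records and per-station range criteria
data = [("0123", 4.2), ("0123", -1.0), ("0456", 250.0)]
limits = {"0123": (0.0, 100.0), "0456": (0.0, 100.0)}
bad = verify_records(data, limits)
```

A production version would add the statistical checks the abstract mentions (e.g., comparison against historical distributions) as further special-purpose subroutines driven by the same screen file.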
Banda, Jorge A; Haydel, K Farish; Davila, Tania; Desai, Manisha; Bryson, Susan; Haskell, William L; Matheson, Donna; Robinson, Thomas N
2016-01-01
To examine the effects of accelerometer epoch lengths, wear time (WT) algorithms, and activity cut-points on estimates of WT, sedentary behavior (SB), and physical activity (PA). 268 7-11 year-olds with BMI ≥ 85th percentile for age and sex wore accelerometers on their right hips for 4-7 days. Data were processed and analyzed at epoch lengths of 1-, 5-, 10-, 15-, 30-, and 60-seconds. For each epoch length, WT minutes/day was determined using three common WT algorithms, and minutes/day and percent time spent in SB, light (LPA), moderate (MPA), and vigorous (VPA) PA were determined using five common activity cut-points. ANOVA tested differences in WT, SB, LPA, MPA, VPA, and MVPA when using the different epoch lengths, WT algorithms, and activity cut-points. WT minutes/day varied significantly by epoch length when using the NHANES WT algorithm (p < .0001), but did not vary significantly by epoch length when using the ≥ 20 minute consecutive zero or Choi WT algorithms. Minutes/day and percent time spent in SB, LPA, MPA, VPA, and MVPA varied significantly by epoch length for all sets of activity cut-points tested with all three WT algorithms (all p < .0001). Across all epoch lengths, minutes/day and percent time spent in SB, LPA, MPA, VPA, and MVPA also varied significantly across all sets of activity cut-points with all three WT algorithms (all p < .0001). The common practice of converting WT algorithms and activity cut-point definitions to match different epoch lengths may introduce significant errors. Estimates of SB and PA from studies that process and analyze data using different epoch lengths, WT algorithms, and/or activity cut-points are not comparable, potentially leading to very different results, interpretations, and conclusions, misleading research and public policy.
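The epoch-length effect is easy to reproduce: summing 1-second counts into longer epochs before applying a cut-point changes the classification of intermittent activity. The cut-point values below are purely illustrative, not the ones evaluated in the study.

```python
def reaggregate(counts_1s, epoch_s):
    """Sum 1-second accelerometer counts into epochs of epoch_s seconds."""
    return [sum(counts_1s[i:i + epoch_s])
            for i in range(0, len(counts_1s), epoch_s)]

def classify(counts, cutpoint_per_epoch):
    """Label each epoch against a counts-per-epoch cut-point."""
    return ["active" if c >= cutpoint_per_epoch else "sedentary"
            for c in counts]

# 60 s of intermittent movement: 30 s at 50 counts/s, then 30 s at rest
raw = [50] * 30 + [0] * 30
# Naive rescaling of an illustrative 100-counts-per-minute cut-point
short = classify(reaggregate(raw, 1), 100 / 60)   # 1-s epochs
long_ = classify(reaggregate(raw, 60), 100)       # 60-s epochs
# The 60-s epoch averages the burst into one "active" minute, while the
# 1-s epochs split the same data into 30 active and 30 sedentary seconds
```

This is exactly why converting cut-points between epoch lengths by simple rescaling, as the abstract warns, can produce non-comparable estimates of sedentary time and activity.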
WFF TOPEX Software Documentation Overview, May 1999. Volume 2
NASA Technical Reports Server (NTRS)
Brooks, Ronald L.; Lee, Jeffrey
2003-01-01
This document provides an overview of software development activities and the resulting products and procedures developed by the TOPEX Software Development Team (SWDT) at Wallops Flight Facility, in support of the WFF TOPEX Engineering Assessment and Verification efforts.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... a TWIC and a voluntary customer satisfaction survey. DATES: Send your comments by August 15, 2011. A... identification verification and access control. TSA also conducts a survey to capture worker overall satisfaction...
Test/QA Plan for Verification of Microcystin Test Kits
Microcystin test kits are used to quantitatively measure total microcystin in recreational waters. These test kits are based on enzyme-linked immunosorbent assays (ELISA) with antibodies that bind specifically to microcystins or phosphate activity inhibition where the phosphatas...
Proving autonomous vehicle and advanced driver assistance systems safety : final research report.
DOT National Transportation Integrated Search
2016-02-15
The main objective of this project was to provide technology for answering : crucial safety and correctness questions about verification of autonomous : vehicle and advanced driver assistance systems based on logic. : In synergistic activities, we ha...
Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.
Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M
2013-05-21
This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging of up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic technique but retains fast imaging speed with the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm s(-1) with an automatic edge-tracking algorithm at an accuracy better than 1.0 mm, even at the fastest imaging speed. Evaluation of the measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for the delivery of complex treatments such as VMAT.
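A generic half-maximum edge-finding step (a common sub-pixel technique, not necessarily the paper's algorithm) illustrates how a leaf edge can be located in a 1-D image profile to better than one pixel:

```python
def leaf_edge_position(profile, half_max=None):
    """Locate a rising field edge in a 1-D image profile at sub-pixel
    accuracy by linear interpolation at the half-maximum crossing."""
    if half_max is None:
        half_max = 0.5 * (max(profile) + min(profile))
    for i in range(1, len(profile)):
        lo, hi = profile[i - 1], profile[i]
        if lo < half_max <= hi:
            # fractional pixel position of the crossing
            return (i - 1) + (half_max - lo) / (hi - lo)
    return None  # no rising edge found

# A step edge between pixels 4 and 5, slightly blurred
profile = [0, 0, 0, 0, 0.2, 0.8, 1.0, 1.0, 1.0, 1.0]
edge = leaf_edge_position(profile)  # 4.5 pixels
```

Applied frame by frame at hundreds of frames per second, such an estimator yields a leaf-position trace whose accuracy is limited by noise and blur rather than by the pixel pitch.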
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewberry, R.; Ayers, J.; Tietze, F.
The Analytical Development (AD) Section field nuclear measurement group performed six 'best available technique' verification measurements to satisfy a DOE requirement instituted for the March 2009 semi-annual inventory. The requirement of reference 1 yielded the need for the SRNL Research Operations Department Material Control & Accountability (MC&A) group to measure the Pu content of five items and the highly enriched uranium (HEU) content of two. No 14Q-qualified measurement equipment was available to satisfy the requirement. The AD field nuclear group has routinely performed the required Confirmatory Measurements for the semi-annual inventories for fifteen years using sodium iodide and high purity germanium (HpGe) {gamma}-ray pulse height analysis nondestructive assay (NDA) instruments. With appropriate {gamma}-ray acquisition modeling, the HpGe spectrometers can be used to perform verification-type quantitative assay for Pu isotopics and HEU content. The AD nuclear NDA group is widely experienced with this type of measurement and reports content for these species in requested process control, MC&A booking, and holdup measurement assays Site-wide. However, none of the AD HpGe {gamma}-ray spectrometers have been 14Q-qualified, and the requirement of reference 1 specifically excluded a {gamma}-ray PHA measurement from those it would accept for the required verification measurements. The requirement of reference 1 was a new requirement for which the Savannah River National Laboratory (SRNL) Research Operations Department (ROD) MC&A group was unprepared. The criteria for exemption from verification were: (1) isotope content below 50 grams; (2) intrinsically tamper-indicating or TID-sealed items which contain a Category IV quantity of material; (3) assembled components; and (4) laboratory samples. Therefore all SRNL Material Balance Area (MBA) items with greater than 50 grams total Pu or greater than 50 grams HEU were subject to a verification measurement. The pass/fail criteria of reference 7 stated 'The facility will report measured values, book values, and statistical control limits for the selected items to DOE SR...', and 'The site/facility operator must develop, document, and maintain measurement methods for all nuclear material on inventory'. These new requirements exceeded SRNL's experience with prior semi-annual inventory expectations, but allowed the AD nuclear field measurement group to demonstrate its excellent adaptability and superior flexibility in responding to unpredicted expectations from the DOE customer. The requirements yielded five SRNL items subject to Pu verification and two SRNL items subject to HEU verification. These items are listed and described in Table 1.
Intermediate Experimental Vehicle (IXV): Avionics and Software of the ESA Reentry Demonstrator
NASA Astrophysics Data System (ADS)
Malucchi, Giovanni; Dussy, Stephane; Camuffo, Fabrizio
2012-08-01
The IXV project is conceived as a technology platform that takes a step forward with respect to the Atmospheric Reentry Demonstrator (ARD) by increasing system maneuverability and verifying critical technology performance against a wider re-entry corridor. The main objective is to design, develop and perform an in-flight verification of an autonomous lifting and aerodynamically controlled (by a combined use of thrusters and aerodynamic surfaces) reentry system. The project also includes the verification and experimentation of a set of critical reentry technologies and disciplines: Thermal Protection System (TPS), for verification and characterization of thermal protection technologies in a representative operational environment; Aerodynamics and Aerothermodynamics (AED-ATD), for understanding and validation of aerodynamic and aerothermodynamic phenomena with improvement of design tools; Guidance, Navigation and Control (GNC), for verification of guidance, navigation and control techniques in a representative operational environment (i.e., reentry from Low Earth Orbit); and Flight Dynamics, to update and validate the vehicle model during actual flight, focused on stability and control derivatives. The above activities are being performed through the implementation of a strict system design-to-cost approach with a proto-flight model development philosophy. In 2008 and 2009, the IXV project activities reached the successful completion of the project Phase-B, including the System PDR, and early project Phase-C. In 2010, following a re-organization of the industrial consortium, the IXV project successfully completed a design consolidation leading to an optimization of the technical baseline including the GNC, avionics (i.e., power, data handling, radio frequency and telemetry), measurement sensors, hot and cold composite structures, and thermal protections and control, with significant improvements of the main system budgets. The project successfully closed the System CDR during 2011 and is currently running Phase-D, with the target of being launched on Vega from Kourou in 2014. The paper will provide an overview of the IXV design and mission objectives in the frame of the overall atmospheric reentry activities, focusing on the avionics and software architecture and design.
Peneva-Reed, Elitsa I.; Romijn, J. Erika
2018-05-31
This report was written as a collaborative effort between the U.S. Geological Survey, SilvaCarbon, and Wageningen University with funding provided by the U.S. Agency for International Development and the European Space Agency, respectively, to address a pressing need for enhanced result-based monitoring and evaluation of delivered capacity-building activities. For this report, the capacity-building activities delivered by capacity-building providers (referred to as “providers” hereafter) during 2011–15 (the study period) to support countries in building measurement, reporting, and verification (MRV) systems for reducing emissions from deforestation and forest degradation (REDD+) were assessed and evaluated.Summarizing capacity-building activities and outcomes across multiple providers was challenging. Many of the providers did not have information readily available, which precluded them from participating in this study despite the usefulness of their information. This issue led to a key proposed future action: Capacity-building providers could establish a central repository within the Global Forestry Observation Initiative (GFOI; http://www.gfoi.org/) where data from past, current, and future activities of all capacity-building providers could be stored. The repository could be maintained in a manner to continually learn from previous lessons.Although various providers monitored and evaluated the success of their capacity-building activities, such evaluations only assessed the success of immediate outcomes and not the overarching outcomes and impacts of activities implemented by multiple providers. Good monitoring and evaluation should continuously monitor and periodically evaluate all factors affecting the outcomes of a provided capacity-building activity.The absence of a methodology to produce quantitative evidence of a causal link between multiple capacity-building activities delivered and successful outcomes left only a plausible association. 
A previous publication argued that plausible association, although not a precise measurement of cause and effect, was a realistic tool. Our review of the available literature on this subject did not find a similar assessment of capacity-building activities for supporting countries in building an MRV system for REDD+. Four countries from the main forested regions of Africa, the Americas, and Asia were chosen as subjects for this report based on the length of time SilvaCarbon and other providers have delivered capacity-building activities toward an MRV system for REDD+: Colombia (the Americas), the Democratic Republic of the Congo (DRC; Africa), Peru (the Americas), and the Republic of the Philippines (referred to as “the Philippines” hereafter; Asia). Several providers were contacted for information to include in this report, but, because of various constraints, only SilvaCarbon, the Food and Agriculture Organization of the United Nations (FAO), and the World Wildlife Fund (WWF) participated in this study. These three providers supported various targeted capacity-building activities throughout Africa, the Americas, and Asia, including the following: technical workshops at national and regional levels (referred to as “workshops” hereafter), hands-on training, study tours, technical details by experts, technical consultation between providers and recipients, sponsorship for travel, organizing network meetings, developing sampling protocols, assessing deforestation and degradation drivers, estimating carbon stock and flow, designing monitoring systems for multiple uses, promoting public-private partnerships to scale up investments in MRV systems for REDD+, and assisting with the design of national forest monitoring systems. Their activities were planned in coordination with key partners in each country and region and with the support and assistance of other providers.
Note that several other organizations and institutions assisted the providers to deliver capacity-building activities, including Boston University, Conservation International, Stanford University, University of Maryland, and Wageningen University & Research.
Hailstorms over Switzerland: Verification of Crowd-sourced Data
NASA Astrophysics Data System (ADS)
Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia
2016-04-01
The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, the time of hail precipitation, and the hailstone size are included in the crowd-sourced data, which were assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria were applied in order to remove false reports from the crowd-sourced data, and neighborhood methods were introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing categorical verification to be used. Verification scores (e.g., hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed, and the relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
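The categorical verification described above can be sketched in a few lines. The score definitions (hit rate / probability of detection, false alarm ratio, critical success index, proportion correct) are standard, but the counts below are invented for illustration and are not from the Swiss study.

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Standard scores from a 2x2 (event observed vs. event detected) contingency table."""
    total = hits + misses + false_alarms + correct_negatives
    return {
        "POD": hits / (hits + misses),                 # probability of detection (hit rate)
        "FAR": false_alarms / (hits + false_alarms),   # false alarm ratio
        "CSI": hits / (hits + misses + false_alarms),  # critical success index
        "PC": (hits + correct_negatives) / total,      # proportion correct
    }

# Invented counts: radar-algorithm detections vs. filtered crowd-sourced reports.
scores = verification_scores(hits=42, misses=8, false_alarms=14, correct_negatives=136)
```

Thresholding (e.g., a minimum reported hailstone size or a minimum POH value) decides which cells of the table each radar/report pair falls into; the scores then summarize agreement in a single number each.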
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools, including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations of motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in the open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This toolset included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations of motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
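The over-plot/difference-plot comparison can be illustrated with a deliberately simple stand-in: two runs of the same free-fall scenario at different integration steps play the role of independently implemented tools, and the largest pointwise difference summarizes their agreement. The dynamics and numbers are hypothetical, not one of the NASA check-cases.

```python
def drop_altitude(h0, g, dt, steps):
    """Euler-integrated altitude of a point mass in free fall (toy 1-DOF 'tool')."""
    h, v, out = h0, 0.0, [h0]
    for _ in range(steps):
        v -= g * dt   # update velocity from gravitation model
        h += v * dt   # propagate altitude
        out.append(h)
    return out

# Two "tools": the same model at different step sizes emulates independent solvers.
coarse = drop_altitude(1000.0, 9.81, 0.1, 100)
fine = drop_altitude(1000.0, 9.81, 0.05, 200)[::2]  # sampled at the coarse times

# Difference-plot summary: the worst-case trajectory disagreement.
max_diff = max(abs(a - b) for a, b in zip(coarse, fine))
```

In the actual assessment the compared quantities are full 6-DOF state trajectories from distinct codes, but the acceptance logic is the same: compute pointwise differences over a common time base and judge whether they stay within a tolerance band.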
Mihailov, Rossen; Stoeva, Dilyana; Pencheva, Blagovesta; Pentchev, Eugeni
2018-03-01
In a number of cases the monitoring of patients with type I diabetes mellitus requires measurement of exogenous insulin levels. For the purpose of a clinical investigation of the efficacy of a medical device for the application of exogenous insulin aspart, a verification of the method for measurement of this synthetic analogue of the hormone was needed. The information in the available medical literature on the measurement of the different exogenous insulin analogues is insufficient. Thus, verification was required to be in compliance with the active standards in the Republic of Bulgaria. A manufacturer's method developed for the ADVIA Centaur XP immunoassay (Siemens Healthcare) was used, which we verified using standard solutions and a patient serum pool spiked with the appropriate quantity of exogenous insulin aspart. The method was verified in accordance with the bioanalytical method verification criteria and regulatory requirements for using a standard method: the CLIA chemiluminescence immunoassay ADVIA Centaur® XP. The following parameters were determined and monitored: intra-day precision and accuracy, inter-day precision and accuracy, limit of detection and lower limit of quantification, linearity, and analytical recovery. The routine application of the method for measurement of immunoreactive insulin using the ADVIA Centaur® XP analyzer is directed to the measurement of endogenous insulin. The method is applicable for measuring different types of exogenous insulin, including insulin aspart.
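Two of the verification parameters listed above, intra-day precision (as a coefficient of variation) and analytical recovery after spiking a serum pool, reduce to short calculations. The replicate and spike values below are invented for illustration; they are not the study's data.

```python
import statistics

def cv_percent(replicates):
    """Intra-day precision as %CV of same-day replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def recovery_percent(measured_spiked, measured_base, spiked_amount):
    """Analytical recovery: fraction of the added (spiked) analyte that is measured back."""
    return 100.0 * (measured_spiked - measured_base) / spiked_amount

# Hypothetical replicate readings and spike experiment (arbitrary units).
precision = cv_percent([98.2, 101.5, 99.8, 100.4, 99.1])
recovery = recovery_percent(measured_spiked=148.0, measured_base=50.0, spiked_amount=100.0)
```

Acceptance criteria (e.g., a maximum allowed %CV and a recovery window around 100%) come from the applicable bioanalytical verification standard, not from the calculation itself.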
Independent Validation and Verification of automated information systems in the Department of Energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunteman, W.J.; Caldwell, R.
1994-07-01
The Department of Energy (DOE) has established an Independent Validation and Verification (IV&V) program for all classified automated information systems (AIS) operating in compartmented or multi-level modes. The IV&V program was established in DOE Order 5639.6A and described in the manual associated with the Order. This paper describes the DOE IV&V program, the IV&V process and activities, the expected benefits from an IV&V, and the criteria and methodologies used during an IV&V. The first IV&V under this program was conducted on the Integrated Computing Network (ICN) at Los Alamos National Laboratory, and several lessons learned are presented. The DOE IV&V program is based on the following definitions. An IV&V is defined as the use of expertise from outside an AIS organization to conduct validation and verification studies on a classified AIS. Validation is defined as the process of applying the specialized security test and evaluation procedures, tools, and equipment needed to establish acceptance for joint usage of an AIS by one or more departments or agencies and their contractors. Verification is the process of comparing two levels of an AIS specification for proper correspondence (e.g., security policy model with top-level specifications, top-level specifications with source code, or source code with object code).
SMAP Verification and Validation Project - Final Report
NASA Technical Reports Server (NTRS)
Murry, Michael
2012-01-01
In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. The survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component of systems engineering and is vital to the success of any space mission. V&V is a process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.
Adaptation of the length-active tension relationship in rabbit detrusor
Almasri, Atheer M.; Bhatia, Hersch; Klausner, Adam P.; Ratz, Paul H.
2009-01-01
Studies have shown that the length-tension (L-T) relationships in airway and vascular smooth muscles are dynamic and can adapt to length changes over a period of time. Our prior studies have shown that the passive L-T relationship in rabbit detrusor smooth muscle (DSM) is also dynamic and that DSM exhibits adjustable passive stiffness (APS) characterized by a passive L-T curve that can shift along the length axis as a function of strain history and activation history. The present study demonstrates that the active L-T curve for DSM is also dynamic and that the peak active tension produced at a particular muscle length is a function of both strain and activation history. More specifically, this study reveals that the active L-T relationship, or curve, does not have a unique peak tension value with a single ascending and descending limb, but instead reveals that multiple ascending and descending limbs can be exhibited in the same DSM strip. This study also demonstrates that for DSM strips not stretched far enough to reveal a descending limb, the peak active tension produced by a maximal KCl-induced contraction at a short, passively slack muscle length of 3 mm was reduced by 58.6 ± 4.1% (n = 15) following stretches to and contractions at threefold the original muscle length, 9 mm. Moreover, five subsequent contractions at the short muscle length displayed increasingly greater tension; active tension produced by the sixth contraction was 91.5 ± 9.1% of that produced by the prestretch contraction at that length. Together, these findings indicate for the first time that DSM exhibits length adaptation, similar to vascular and airway smooth muscles. In addition, our findings demonstrate that preconditioning, APS and adaptation of the active L-T curve can each impact the maximum total tension observed at a particular DSM length. PMID:19675182
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
NASA Astrophysics Data System (ADS)
Gaeris, Andres Claudio
The Stimulated Brillouin Scattering (SBS) instability is studied in moderately short scale-length plasmas. The backscattered and specularly reflected light resulting from the interaction of a pair of high-power picosecond-duration laser pulses with solid silicon, gold, and Parylene-N (CH) strip targets was spectrally resolved. The first, weaker laser pulse forms a short scale-length plasma while the second, delayed one interacts with the isothermally expanded, underdense region of the plasma. The pulses are generated by the Table Top Terawatt (TTT) laser operating at 1054 nm (infrared) with intensities up to 5×10^16 W/cm^2. Single laser pulses show only Lambertian scattering on the target critical surface. Pairs of pulses with high intensity in the second pulse show an additional backscattered, highly blueshifted feature associated with SBS. Increasing this second pulse intensity further leads to the appearance of a third feature, even more blueshifted than the second, resulting from the Brillouin sidescattering of the laser pulse reflected on the critical surface. The SBS threshold intensities and enhanced reflectivities for P-polarized light are determined for different plasma density scale-lengths. These measurements agree with the convective thresholds predicted by the SBS theory of Liu, Rosenbluth, and White using plasma profiles simulated by the LILAC code. The spectral positions of the Brillouin back- and sidescattered features are determined. The SBS and Doppler shifts are much too small to explain the observed blueshifts. The refractive index shift is of the right magnitude, although more detailed verification is required in the future.
NASA Technical Reports Server (NTRS)
Windley, P. J.
1991-01-01
In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
Hydrogen and Storage Initiatives at the NASA JSC White Sands Test Facility
NASA Technical Reports Server (NTRS)
Maes, Miguel; Woods, Stephen S.
2006-01-01
NASA WSTF Hydrogen Activities: a) Aerospace Test; b) System Certification & Verification; c) Component, System, & Facility Hazard Assessment; d) Safety Training. Technical Transfer: a) Development of Voluntary Consensus Standards and Practices; b) Support of National Hydrogen Infrastructure Development.
Self-verification motives at the collective level of self-definition.
Chen, Serena; Chen, Karen Y; Shaw, Lindsay
2004-01-01
Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
Bifurcation and chaos in the simple passive dynamic walking model with upper body.
Li, Qingdu; Guo, Jianli; Yang, Xiao-Song
2014-09-01
We present some rich new complex gaits in the simple walking model with upper body by Wisse et al. in [Robotica 22, 681 (2004)]. We first show that the stable gait found by Wisse et al. may become chaotic via period-doubling bifurcations. Such period-doubling routes to chaos exist for all parameters, such as foot mass, upper body mass, body length, hip spring stiffness, and slope angle. Then, we report three new gaits with period 3, 4, and 6; for each gait, there is also a period-doubling route to chaos. Finally, we show a practical method for finding a topological horseshoe in 3D Poincaré map, and present a rigorous verification of chaos from these gaits.
Calculation of streamflow statistics for Ontario and the Great Lakes states
Piggott, Andrew R.; Neff, Brian P.
2005-01-01
Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey’s SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and PERL programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
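Two statistics of the kinds named above, a flow-duration exceedance value and an n-day minimum mean flow, can be sketched as follows. The streamflow series is synthetic, and this is a simplified stand-in for the SWSTAT computations, not a reproduction of them.

```python
def exceedance_flow(daily_flows, percent):
    """Flow equaled or exceeded `percent` of the time (a flow-duration statistic)."""
    ranked = sorted(daily_flows, reverse=True)          # highest flow first
    idx = int(round(percent / 100.0 * (len(ranked) - 1)))
    return ranked[idx]

def n_day_min(daily_flows, n):
    """Minimum n-day moving-average flow (the basis of n-day frequency statistics)."""
    means = [sum(daily_flows[i:i + n]) / n for i in range(len(daily_flows) - n + 1)]
    return min(means)

# Hypothetical daily mean flows (e.g., m^3/s); real records span decades.
flows = [12.0, 10.0, 8.0, 30.0, 25.0, 9.0, 7.0, 6.5, 14.0, 11.0]
q90 = exceedance_flow(flows, 90)   # low-flow end of the duration curve
low7 = n_day_min(flows, 7)         # 7-day minimum mean flow
```

The frequency part of an n-day frequency statistic (e.g., the 7-day, 10-year low flow) additionally fits a probability distribution to the annual series of these minima, which is omitted here.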
NASA Astrophysics Data System (ADS)
Wang, Qi; Song, Huaqing; Wang, Xingpeng; Wang, Dongdong; Li, Li
2018-03-01
In this paper, we demonstrated thermally tunable 1-μm single-frequency fiber lasers utilizing loop mirror filters (LMFs) with unpumped Yb-doped fibers. The frequency selection and tracking was achieved by combining a fiber Bragg grating (FBG) and a dynamic grating established inside the LMF. The central emission wavelength was at 1064.07 nm with a tuning range of 1.4 nm, and the measured emission linewidth was less than 10 kHz. We also systematically studied the wavelength-tracking thermal stability of the LMF with separate thermal treatment upon the FBG and LMF, respectively. Finally, we presented a selection criterion for the minimum unpumped doped fiber length inside the LMF with experimental verification.
Vector wind profile gust model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1981-01-01
To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods are presented for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed. Observed gust modulus is drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution, and zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described, and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representation of gust component variables is established.
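A minimal sketch of the gamma-distribution hypothesis for a single gust variable: for a gamma(k, θ) variable the mean is kθ and the variance is kθ², so method-of-moments estimates of shape and scale can be recovered from a sample and compared against hypothesized values. The sample here is simulated, not observed gust data, and this moment check is far simpler than the formal hypothesis tests in the study.

```python
import random
import statistics

random.seed(2)
k_true, theta_true = 3.0, 2.0  # hypothesized gamma shape and scale

# Simulated "gust component magnitude" sample standing in for observations.
sample = [random.gammavariate(k_true, theta_true) for _ in range(20000)]

m = statistics.fmean(sample)      # sample mean      -> estimates k*theta
v = statistics.pvariance(sample)  # sample variance  -> estimates k*theta^2
theta_hat = v / m                 # method-of-moments scale estimate
k_hat = m / theta_hat             # method-of-moments shape estimate
```

A formal verification would follow this with a goodness-of-fit test (e.g., chi-square or Kolmogorov-Smirnov) of the fitted gamma distribution against the empirical distribution.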
NASA Technical Reports Server (NTRS)
Winget, C. M.; Deroshia, C. W.; Markley, C. L.; Holley, D. C.
1984-01-01
This review discusses the effects, in the aerospace environment, of alterations in approximately 24-h periodicities (circadian rhythms) upon physiological and psychological functions and possible therapies for desynchronosis induced by such alterations. The consequences of circadian rhythm alteration resulting from shift work, transmeridian flight, or altered day lengths are known as desynchronosis, dysrhythmia, dyschrony, jet lag, or jet syndrome. Considerable attention is focused on the ability to operate jet aircraft and manned space vehicles. The importance of environmental cues, such as light-dark cycles, which influence physiological and psychological rhythms is discussed. A section on mathematical models is presented to enable selection and verification of appropriate preventive and corrective measures and to better understand the problem of dysrhythmia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolme, David S; Tokola, Ryan A; Boehnen, Chris Bensing
Automatic recognition systems are a valuable tool for identifying unknown deceased individuals. Immediately after death, fingerprint and face biometric samples are easy to collect using standard sensors and cameras and can be easily matched to ante-mortem biometric samples. Even though post-mortem fingerprints and faces have been used for decades, there are no studies that track these biometrics through the later stages of decomposition to determine the length of time the biometrics remain viable. This paper discusses a multimodal dataset of fingerprints, faces, and irises from 14 human cadavers that decomposed outdoors under natural conditions. Results include predictive models relating time and temperature, measured as Accumulated Degree Days (ADD), and season (winter, spring, summer) to the predicted probability of automatic verification using a commercial algorithm.
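Accumulated Degree Days, the time-temperature covariate used in the predictive models above, is the running sum of daily mean temperatures above a base value, so a single pass over the temperature record computes it. The temperature series below is invented for illustration.

```python
def accumulated_degree_days(daily_mean_temps_c, base=0.0):
    """Cumulative ADD series; days at or below the base temperature contribute 0."""
    total, series = 0.0, []
    for t in daily_mean_temps_c:
        total += max(t - base, 0.0)  # only warmth above the base accumulates
        series.append(total)
    return series

# Hypothetical daily mean temperatures (deg C) over five days of exposure.
add = accumulated_degree_days([15.0, 18.5, -2.0, 10.0, 22.5])
```

The predictive models then regress verification probability on ADD (and season) rather than on elapsed days, since decomposition advances faster in warm weather.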
2001-02-03
In the Space Station Processing Facility, workers help guide the Multi-Purpose Logistics Module Donatello as it moves the length of the SSPF toward a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004
Wick, David V.
2005-12-20
An active optical zoom system changes the magnification (or effective focal length) of an optical imaging system by utilizing two or more active optics in a conventional optical system. The system can create relatively large changes in system magnification with very small changes in the focal lengths of individual active elements by leveraging the optical power of the conventional optical elements (e.g., passive lenses and mirrors) surrounding the active optics. The active optics serve primarily as variable focal-length lenses or mirrors, although adding other aberrations enables increased utility. The active optics can either be LC SLMs, used in a transmissive optical zoom system, or DMs, used in a reflective optical zoom system. By appropriately designing the optical system, the variable focal-length lenses or mirrors can provide the flexibility necessary to change the overall system focal length (i.e., effective focal length), and therefore magnification, that is normally accomplished with mechanical motion in conventional zoom lenses. The active optics can provide additional flexibility by allowing magnification to occur anywhere within the FOV of the system, not just on-axis as in a conventional system.
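The leverage described above can be sketched with the standard two-thin-lens combination formula 1/f = 1/f1 + 1/f2 - d/(f1·f2): a small amount of optical power added by the active element shifts the effective focal length of the whole system. The focal lengths and separation below are illustrative, not taken from the patent.

```python
def combined_focal_length(f1, f2, d):
    """Effective focal length of two thin lenses of focal lengths f1, f2 separated by d."""
    return 1.0 / (1.0 / f1 + 1.0 / f2 - d / (f1 * f2))

# Conventional lens f1 plus an active optic f2 (units: mm).
f_nominal = combined_focal_length(f1=100.0, f2=1e9, d=50.0)    # active optic essentially flat
f_tuned = combined_focal_length(f1=100.0, f2=2000.0, d=50.0)   # slight active optical power
```

Here a weak 2000 mm focus on the active element moves the system focal length by a few percent, illustrating how small focal-length changes in the active optics, multiplied through the passive elements, produce the overall magnification change without mechanical motion.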
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.
NASA Technical Reports Server (NTRS)
Johnston, Shaida
2004-01-01
The term verification implies compliance verification in the language of treaty negotiation and implementation, particularly in the fields of disarmament and arms control. The term monitoring, on the other hand, in both environmental and arms control treaties, has a much broader interpretation, which allows for use of supporting data sources that are not necessarily acceptable or adequate for direct verification. There are many ways that satellite Earth observation (EO) data can support international environmental agreements, from national forest inventories to use in geographic information system (GIS) tools. Though only a few references to satellite EO data and their use exist in the treaties themselves, an expanding list of applications can be considered in support of multilateral environmental agreements (MEAs). This paper explores the current uses of satellite Earth observation data which support monitoring activities of major environmental treaties and draws conclusions about future missions and their data use. The scope of the study includes all phases of environmental treaty fulfillment - development, monitoring, and enforcement - and includes a multinational perspective on the use of satellite Earth observation data for treaty support.
Control/structure interaction design methodology
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.; Layman, William E.
1989-01-01
The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.
Hiraishi, Kunihiko
2014-01-01
One of the significant topics in systems biology is to develop control theory of gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (activated) by manipulating external stimuli and expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), which is one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for the control theory of GRNs. PMID:24587766
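The abstract above concerns context-sensitive probabilistic Boolean networks. As a rough illustration of the model class only (a toy network with hypothetical genes g0-g2, not the paper's WNT5A network or its PRISM encoding), the following sketch performs synchronous Boolean updates with random context switching:

```python
import random

# Two hypothetical contexts, each a full set of Boolean update rules.
# The network switches context with probability q at every step.
CONTEXTS = [
    {  # context 0
        "g0": lambda s: s["g1"] and not s["g2"],
        "g1": lambda s: s["g0"] or s["g2"],
        "g2": lambda s: not s["g0"],
    },
    {  # context 1
        "g0": lambda s: not s["g1"],
        "g1": lambda s: s["g2"],
        "g2": lambda s: s["g0"] or s["g1"],
    },
]

def step(state, ctx, q=0.1, rng=random):
    """Advance one synchronous update; switch context with probability q."""
    if rng.random() < q:
        ctx = rng.randrange(len(CONTEXTS))
    rules = CONTEXTS[ctx]
    new_state = {g: rules[g](state) for g in state}
    return new_state, ctx

state, ctx = {"g0": True, "g1": False, "g2": False}, 0
for _ in range(10):
    state, ctx = step(state, ctx)
```

Verification in the paper is done by model checking such dynamics with PRISM rather than by simulation; this sketch only shows the state-transition semantics being checked.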
NASA Astrophysics Data System (ADS)
Reed, Joshua L.
Permanent implants of low-energy photon-emitting brachytherapy sources are used to treat a variety of cancers. Individual source models must be separately characterized due to their unique geometry, materials, and radionuclides, which all influence their dose distributions. Thermoluminescent dosimeters (TLDs) are often used for dose measurements around low-energy photon-emitting brachytherapy sources. TLDs are typically calibrated with higher energy sources such as 60Co, which requires a correction for the change in the response of the TLDs as a function of photon energy. These corrections have historically been based on TLD response to x ray bremsstrahlung spectra instead of to brachytherapy sources themselves. This work determined the TLD intrinsic energy dependence for 125I and 103Pd sources relative to 60Co, which allows for correction of TLD measurements of brachytherapy sources with factors specific to their energy spectra. Traditional brachytherapy sources contain mobile internal components and large amounts of high-Z material such as radio-opaque markers and titanium encapsulations. These all contribute to perturbations and uncertainties in the dose distribution around the source. The CivaString is a new elongated 103Pd brachytherapy source with a fixed internal geometry, polymer encapsulation, and lengths ranging from 1 to 6 cm, which offers advantages over traditional source designs. This work characterized the CivaString source and the results facilitated the formal approval of this source for use in clinical treatments. Additionally, the accuracy of a superposition technique for dose calculation around the sources with lengths >1 cm was verified. Advances in diagnostic techniques are paving the way for focal brachytherapy in which the dose is intentionally modulated throughout the target volume to focus on subvolumes that contain cancer cells. 
Brachytherapy sources with variable longitudinal strength (VLS) are a promising candidate for use in focal brachytherapy treatments given their customizable activity distributions, although they are not yet commercially available. This work characterized five prototype VLS sources, developed methods for clinical calibration and verification of these sources, and developed an analytical dose calculation algorithm that scales with both source length and VLS.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Nutrition Assistance Program Prisoner and Death Match Requirements AGENCY: Food and Nutrition Service (FNS.... SUPPLEMENTARY INFORMATION: Title: Supplemental Nutrition Assistance Program Prisoner and Death Match... verification and death matching procedures as mandated by legislation and previously implemented through agency...
40 CFR 92.504 - Right of entry and access.
Code of Federal Regulations, 2010 CFR
2010-07-01
... manufacturer's or remanufacturer's production line testing or auditing program or any procedure or activity... service accumulation, emission test cycles, and maintenance and verification of test equipment calibration... not limited to, clerical, copying, interpretation and translation services; the making available on an...
42 CFR 457.380 - Eligibility verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380... Requirements: Eligibility, Screening, Applications, and Enrollment § 457.380 Eligibility verification. (a) The... State may establish reasonable eligibility verification mechanisms to promote enrollment of eligible...
The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...
NASA Astrophysics Data System (ADS)
Moteabbed, M.; Trofimov, A.; Sharp, G. C.; Wang, Y.; Zietman, A. L.; Efstathiou, J. A.; Lu, H.-M.
2017-03-01
Proton therapy of the prostate by anterior beams could offer an attractive option for treating patients with hip prosthesis and limiting the high-dose exposure to the rectum. We investigated the impact of setup and anatomy variations on the anterior-oblique (AO) proton plan dose, and strategies to manage these effects via range verification and adaptive delivery. Ten patients treated by bilateral (BL) passive-scattering proton therapy (79.2 Gy in 44 fractions) who underwent weekly verification CT scans were selected. Plans with AO beams were additionally created. To isolate the effect of daily variations, initial AO plans did not include range uncertainty margins. The use of fixed planning margins and adaptive range adjustments to manage these effects was investigated. For each case, the planned dose was recalculated on weekly CTs, and accumulated on the simulation CT using deformable registration to approximate the delivered dose. Planned and accumulated doses were compared for each scenario to quantify dose deviations induced by variations. The possibility of estimating the necessary range adjustments before each treatment was explored by simulating the procedure of a diode-based in vivo range verification technique, which would potentially be used clinically. The average planned rectum, penile bulb and femoral heads mean doses were smaller for initial AO compared to BL plans (by 8.3, 16.1 and 25.9 Gy, respectively). After considering interfractional variations in AO plans, the target coverage was substantially reduced. The maximum reduction of V79.2/D95/Dmean/EUD for AO plans (without distal margins) (25.3%/10.7 Gy/1.6 Gy/4.9 Gy, respectively) was considerably larger than for BL plans. The loss of coverage was mainly related to changes in water equivalent path length of the prostate after fiducial-based setup, caused by discrepancies in patient anterior surface and bony-anatomy alignment. 
Target coverage was recovered partially when using fixed planning margins, and fully when applying adaptive range adjustments. The accumulated organs-at-risk dose for AO beams after range adjustment demonstrated full sparing of femoral heads and superior sparing of penile bulb and rectum compared to the conventional BL cases. Our study indicates that using AO beams makes prostate treatment more susceptible to target underdose induced by interfractional variations. Adaptive range verification/adjustment may facilitate the use of anterior beam approaches, and ensure adequate target coverage in every fraction of the treatment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy
The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a “universal” GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.
Taming active turbulence with patterned soft interfaces.
Guillamat, P; Ignés-Mullol, J; Sagués, F
2017-09-15
Active matter embraces systems that self-organize at different length and time scales, often exhibiting turbulent flows apparently deprived of spatiotemporal coherence. Here, we use a layer of a tubulin-based active gel to demonstrate that the geometry of active flows is determined by a single length scale, which we reveal in the exponential distribution of vortex sizes of active turbulence. Our experiments demonstrate that the same length scale reemerges as a cutoff for a scale-free power law distribution of swirling laminar flows when the material evolves in contact with a lattice of circular domains. The observed prevalence of this active length scale can be understood by considering the role of the topological defects that form during the spontaneous folding of microtubule bundles. These results demonstrate an unexpected strategy for active systems to adapt to external stimuli, and provide a handle to probe the existence of intrinsic length and time scales. Active nematics consist of self-driven components that develop orientational order and turbulent flow. Here Guillamat et al. investigate an active nematic constrained in a quasi-2D geometrical setup and show that there exists an intrinsic length scale that determines the geometry in all forcing regimes.
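The single length scale revealed by the exponential distribution of vortex sizes can be estimated from data in a simple way: for exponentially distributed samples, the maximum-likelihood estimate of the scale parameter is just the sample mean. A minimal sketch on synthetic data (the scale value and function name are illustrative, not taken from the experiments):

```python
import random
import statistics

def active_scale(vortex_areas):
    """MLE of the scale of an exponential distribution: the sample mean."""
    return statistics.mean(vortex_areas)

# Synthetic "vortex areas" drawn from an exponential distribution with
# a chosen scale of 250 (arbitrary units); the estimator should recover it.
rng = random.Random(42)
synthetic = [rng.expovariate(1 / 250.0) for _ in range(10000)]
est = active_scale(synthetic)
```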
Shuttle propellant loading instrumentation development
NASA Technical Reports Server (NTRS)
Hamlet, J.
1975-01-01
A continuous capacitance sensor was developed and an analog signal conditioner was evaluated to demonstrate the acceptability of these items for use in the space shuttle propellant loading system. An existing basic sensor concept was redesigned to provide capability for cryogenic operation, to improve performance, and to minimize production costs. Sensor development verification consisted of evaluation of sensor linearity, cryogenic performance, and stability during vibration. The signal conditioner evaluation consisted mainly of establishing the effects of the variations in temperature and cable parameters and evaluating the stability. A sensor linearity of 0.04 in. was achieved over most of the sensor length. The sensor instability caused by vibration was 0.04 percent. The cryogenic performance data show a maximum instability of 0.19 percent at liquid hydrogen temperature; a theoretical calibration can be computed to within 1 percent. The signal conditioner evaluation showed that, with temperature compensation, all error sources typically contribute much less than 1 percent. An estimate of the accuracy achievable with the sensor and signal conditioner shows an rss estimate of 0.75 in. for liquid oxygen and 1.02 in. for liquid hydrogen. These are approximately four times better than the shuttle requirements. Comparison of continuous sensor and discrete sensor performance shows the continuous sensor to be significantly better when there is surface activity due to sloshing, boiling, or other disturbances.
Hanft, Laurin M; McDonald, Kerry S
2010-08-01
According to the Frank-Starling relationship, increased ventricular volume increases cardiac output, which helps match cardiac output to peripheral circulatory demand. The cellular basis for this relationship is in large part the myofilament length-tension relationship. Length-tension relationships in maximally calcium activated preparations are relatively shallow and similar between cardiac myocytes and skeletal muscle fibres. During twitch activations length-tension relationships become steeper in both cardiac and skeletal muscle; however, it remains unclear whether length dependence of tension differs between striated muscle cell types during submaximal activations. The purpose of this study was to compare sarcomere length-tension relationships and the sarcomere length dependence of force development between rat skinned left ventricular cardiac myocytes and fast-twitch and slow-twitch skeletal muscle fibres. Muscle cell preparations were calcium activated to yield 50% maximal force, after which isometric force and rate constants (k(tr)) of force development were measured over a range of sarcomere lengths. Myofilament length-tension relationships were considerably steeper in fast-twitch fibres compared to slow-twitch fibres. Interestingly, cardiac myocyte preparations exhibited two populations of length-tension relationships, one steeper than fast-twitch fibres and the other similar to slow-twitch fibres. Moreover, myocytes with shallow length-tension relationships were converted to steeper length-tension relationships by protein kinase A (PKA)-induced myofilament phosphorylation. Sarcomere length-k(tr) relationships were distinct between all three cell types and exhibited patterns markedly different from Ca(2+) activation-dependent k(tr) relationships. Overall, these findings indicate cardiac myocytes exhibit varied length-tension relationships and sarcomere length appears a dominant modulator of force development rates. 
Importantly, cardiac myocyte length-tension relationships appear able to switch between slow-twitch-like and fast-twitch-like by PKA-mediated myofibrillar phosphorylation, which implicates a novel means for controlling Frank-Starling relationships.
Physical activity and telomere length: Impact of aging and potential mechanisms of action
Arsenis, Nicole C.; You, Tongjian; Ogawa, Elisa F.; Tinsley, Grant M.; Zuo, Li
2017-01-01
Telomeres protect the integrity of information-carrying DNA by serving as caps on the terminal portions of chromosomes. Telomere length decreases with aging, and this contributes to cell senescence. Recent evidence supports that telomere length of leukocytes and skeletal muscle cells may be positively associated with healthy living and inversely correlated with the risk of several age-related diseases, including cancer, cardiovascular disease, obesity, diabetes, chronic pain, and stress. In observational studies, higher levels of physical activity or exercise are related to longer telomere lengths in various populations, and athletes tend to have longer telomere lengths than non-athletes. This relationship is particularly evident in older individuals, suggesting a role of physical activity in combating the typical age-induced decrements in telomere length. To date, a small number of exercise interventions have been executed to examine the potential influence of chronic exercise on telomere length, but these studies have not fully established such relationship. Several potential mechanisms through which physical activity or exercise could affect telomere length are discussed, including changes in telomerase activity, oxidative stress, inflammation, and decreased skeletal muscle satellite cell content. Future research is needed to mechanistically examine the effects of various modalities of exercise on telomere length in middle-aged and older adults, as well as in specific clinical populations. PMID:28410238
EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...
40 CFR 1066.240 - Torque transducer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...
Compliance Verification Paths for Residential and Commercial Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Makela, Eric J.; Fannin, Jerica D.
2011-10-10
This report looks at different ways to verify energy code compliance and to ensure that the energy efficiency goals of an adopted document are achieved. Conformity assessment is the body of work that ensures compliance, including activities that can ensure residential and commercial buildings satisfy energy codes and standards. This report identifies and discusses conformity-assessment activities and provides guidance for conducting assessments.
NASA Astrophysics Data System (ADS)
Arndt, J.; Kreimer, J.
2010-09-01
The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation this manned laboratory forms part of the International Space Station(ISS). Depending on the objectives of the Mission Increments the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes as foreseen by the Mission Increment planning but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject of the safety verification is on one hand the on orbit configuration itself including the hardware and software products, on the other hand the related Ground facilities needed for commanding of and communication to the on-orbit System. But also the operational products, e.g. the procedures prepared for crew and ground control in accordance to increment planning, are subject of the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related Safety required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement including hardware owned by International Partners is the Integrated Experiment Hazard Assessment(IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. 
A major challenge is the implementation of a Safety process rigorous enough to provide reliable verification of on-board Safety, yet flexible enough to support manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learned have already been implemented, especially in the IEHA, that improve the flexibility of on-board operations without degradation of Safety.
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2012 CFR
2012-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2011 CFR
2011-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2013 CFR
2013-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
Joint ETV/NOWATECH verification protocol for the Sorbisense GSW40 passive sampler
Environmental technology verification (ETV) is an independent (third party) assessment of the performance of a technology or a product for a specified application, under defined conditions and quality assurance. This verification is a joint verification with the US EPA ETV schem...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
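The verification step described above, comparing freshly generated sample-problem outputs against reference outputs and flagging any difference for closer inspection, can be sketched as follows. The function name and directory layout are hypothetical, not part of the MCNP V and V package itself:

```python
import filecmp
from pathlib import Path

def verify_outputs(new_dir, ref_dir):
    """Compare new sample-problem output files against reference outputs.

    Returns the list of reference file names that are missing or differ.
    A non-empty list flags a verification error, which (as the document
    notes) warrants manual inspection rather than automatic rejection,
    since differences may be benign.
    """
    new_dir, ref_dir = Path(new_dir), Path(ref_dir)
    mismatches = []
    for ref in sorted(ref_dir.iterdir()):
        new = new_dir / ref.name
        # shallow=False forces a byte-by-byte content comparison.
        if not new.exists() or not filecmp.cmp(new, ref, shallow=False):
            mismatches.append(ref.name)
    return mismatches
```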
Operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
1983-01-01
The major operational areas of the COSMIC center are described. Quantitative data on the software submittals, program verification, and evaluation are presented. The dissemination activities are summarized. Customer services and marketing activities of the center for the calendar year are described. Those activities devoted to the maintenance and support of selected programs are described. A Customer Information system, the COSMIC Abstract Recording System Project, and the COSMIC Microfiche Project are summarized. Operational cost data are summarized.
NASA Astrophysics Data System (ADS)
Lin, Yen-Hui
2017-11-01
A non-steady-state mathematical model system for the kinetics of adsorption and biodegradation of 2-chlorophenol (2-CP) by attached and suspended biomass in an activated carbon process was derived. The mechanisms in the model system included 2-CP adsorption by activated carbon, 2-CP mass transport diffusion in the biofilm, and biodegradation by attached and suspended biomass. Batch kinetic tests were performed to determine the surface diffusivity of 2-CP, adsorption parameters for 2-CP, and biokinetic parameters of the biomass. Experiments were conducted using a biological activated carbon (BAC) reactor system with a high recycle rate to approximate a completely mixed flow reactor for model verification. Concentration profiles of 2-CP from model predictions indicated that the biofilm bioregenerated the activated carbon by lowering the 2-CP concentration at the biofilm-activated carbon interface as the biofilm grew thicker. The removal efficiency of 2-CP by biomass was approximately 98.5% when the 2-CP concentration in the influent was around 190.5 mg L-1 at a steady-state condition. The concentration of suspended biomass reached up to about 25.3 mg L-1, while the thickness of attached biomass was estimated to be 636 μm at a steady-state condition by model prediction. The experimental results agree closely with the results of the model predictions.
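As a rough illustration of the biokinetic component that such a model couples with adsorption and biofilm diffusion, the sketch below integrates simple Monod growth kinetics for a batch test. The parameter values are placeholders, not the 2-CP biokinetic parameters fitted in the study:

```python
def simulate_batch(S0, X0, mu_max=0.2, Ks=50.0, Y=0.5, dt=0.01, t_end=100.0):
    """Explicit-Euler integration of Monod kinetics in a batch reactor:

        dX/dt =  mu_max * S/(Ks + S) * X      (biomass growth)
        dS/dt = -(1/Y) * mu_max * S/(Ks+S)*X  (substrate consumption)

    S0, X0 in mg/L; mu_max in 1/h; Ks in mg/L; Y is the yield coefficient.
    Returns final substrate and biomass concentrations.
    """
    S, X, t = S0, X0, 0.0
    while t < t_end:
        growth = mu_max * S / (Ks + S) * X
        S = max(S - (growth / Y) * dt, 0.0)  # clamp to avoid negative S
        X = X + growth * dt
        t += dt
    return S, X

# Influent-like substrate level from the abstract, arbitrary initial biomass.
S_end, X_end = simulate_batch(S0=190.5, X0=5.0)
```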
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH VErification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until encountered by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
TQAP for Verification of Qualitative Lead Test Kits
There are lead-based paint test kits available to help home owners and contractors identify lead-based paint hazards before any Renovation, Repair, and Painting (RRP) activities take place so that proper health and safety measures can be enacted. However, many of these test kits ...
Summary of the 2014 Sandia V&V Challenge Workshop
Schroeder, Benjamin B.; Hu, Kenneth T.; Mullins, Joshua Grady; ...
2016-02-19
A discussion of the five responses to the 2014 Sandia Verification and Validation (V&V) Challenge Problem, presented within this special issue, is provided hereafter. Overviews of the challenge problem workshop, workshop participants, and the problem statement are also included. Brief summations of the teams' responses to the challenge problem are provided. Issues that arose throughout the responses that are deemed applicable to the general verification, validation, and uncertainty quantification (VVUQ) community are the main focal point of this paper. The discussion is oriented and organized into a big-picture comparison of data and model usage, VVUQ activities, and the differentiating conceptual themes behind the teams' VVUQ strategies. Significant differences are noted in the teams' approaches toward all VVUQ activities, and those deemed most relevant are discussed. Beyond the specific details of VVUQ implementations, thematic concepts are found to create differences among the approaches; some of the major themes are discussed. Lastly, an encapsulation of the key contributions, the lessons learned, and advice for the future are presented.
Execution of the Spitzer In-orbit Checkout and Science Verification Plan
NASA Technical Reports Server (NTRS)
Miles, John W.; Linick, Susan H.; Long, Stacia; Gilbert, John; Garcia, Mark; Boyles, Carole; Werner, Michael; Wilson, Robert K.
2004-01-01
The Spitzer Space Telescope is an 85-cm telescope with three cryogenically cooled instruments. Following launch, the observatory was initialized and commissioned for science operations during the in-orbit checkout (IOC) and science verification (SV) phases, carried out over a total of 98.3 days. The execution of the IOC/SV mission plan progressively established Spitzer capabilities, taking into consideration thermal, cryogenic, optical, pointing, communications, and operational designs and constraints. The plan was carried out with high efficiency, making effective use of cryogen-limited flight time. One key component of the success of the plan was the pre-launch allocation of schedule reserve in the timeline of IOC/SV activities, and how it was used in flight both to cover activity redesign and growth due to continually improving spacecraft and instrument knowledge, and to recover from anomalies. This paper describes the adaptive system design and evolution, implementation, and lessons learned from IOC/SV operations. It is hoped that this information will provide guidance to future missions with similar engineering challenges.
[Does action semantic knowledge influence mental simulation in sentence comprehension?].
Mochizuki, Masaya; Naito, Katsuo
2012-04-01
This research investigated whether action semantic knowledge influences mental simulation during sentence comprehension. In Experiment 1, we confirmed that the words of face-related objects include the perceptual knowledge about the actions that bring the object to the face. In Experiment 2, we used an acceptability judgment task and a word-picture verification task to compare the perceptual information that is activated by the comprehension of sentences describing an action using face-related objects near the face (near-sentence) or far from the face (far-sentence). Results showed that participants took a longer time to judge the acceptability of the far-sentence than the near-sentence. Verification times were significantly faster when the actions in the pictures matched the action described in the sentences than when they were mismatched. These findings suggest that action semantic knowledge influences sentence processing, and that perceptual information corresponding to the content of the sentence is activated regardless of the action semantic knowledge at the end of the sentence processing.
Rassier, Dilson E; Herzog, Walter; Wakeling, Jennifer; Syme, Douglas A
2003-09-01
Stretch-induced force enhancement has been observed in a variety of muscle preparations and on structural levels ranging from single fibers to in vivo human muscles. It is a well-accepted property of skeletal muscle. However, the mechanism causing force enhancement has not been elucidated, although the sarcomere-length non-uniformity theory has received wide support. The purpose of this paper was to re-investigate stretch-induced force enhancement in frog single fibers by testing specific hypotheses arising from the sarcomere-length non-uniformity theory. Single fibers dissected from frog tibialis anterior (TA) and lumbricals (n=12 and 22, respectively) were mounted in an experimental chamber with physiological Ringer's solution (pH=7.5) between a force transducer and a servomotor length controller. The tetanic force-length relationship was determined. Isometric reference forces were determined at optimum length (corresponding to the maximal, active, isometric force), and at the initial and final lengths of the stretch experiments. Stretch experiments were performed on the descending limb of the force-length relationship after maximal tetanic force was reached. Stretches of 2.5-10% (TA) and 5-15% (lumbricals) of fiber length were performed at 0.1-1.5 fiber lengths/s. The stretch-induced, steady-state, active isometric force was always equal to or greater than the purely isometric force at the muscle length from which the stretch was initiated. Moreover, for stretches of 5% fiber length or greater, initiated near the optimum length of the fiber, the stretch-enhanced active force always exceeded the maximal active isometric force at optimum length. Finally, we observed a stretch-induced enhancement of passive force.
We conclude from these results that the sarcomere length non-uniformity theory alone cannot explain the observed force enhancement, and that part of the force enhancement is associated with a passive force that is substantially greater after active compared to passive muscle stretch.
78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...
78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
P.C. Weaver
2008-06-12
Conduct verification surveys of available grids at the DWI 1630 in Knoxville, Tennessee. A representative with the Independent Environmental Assessment and Verification (IEAV) team from ORISE conducted a verification survey of a partial area within Grid E9.
40 CFR 1065.395 - Inertial PM balance verifications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...
40 CFR 1065.395 - Inertial PM balance verifications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by the...
22 CFR 123.14 - Import certificate/delivery verification procedure.
Code of Federal Regulations, 2010 CFR
2010-04-01
... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...
22 CFR 123.14 - Import certificate/delivery verification procedure.
Code of Federal Regulations, 2011 CFR
2011-04-01
... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
24 CFR 5.512 - Verification of eligible immigration status.
Code of Federal Regulations, 2010 CFR
2010-04-01
... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...
Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation
NASA Technical Reports Server (NTRS)
Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna
2000-01-01
This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.
WE-EF-303-10: Single- Detector Proton Radiography as a Portal Imaging Equivalent for Proton Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doolan, P; Bentefour, E; Testa, M
2015-06-15
Purpose: In proton therapy, patient alignment is of critical importance due to the sensitivity of the proton range to tissue heterogeneities. Traditionally, proton radiography is used for verification of the water-equivalent path length (WEPL), which dictates the depth protons reach. In this work we propose its use for alignment. Additionally, many new proton centers have cone-beam computed tomography in place of beamline X-ray imaging, so proton radiography offers a unique patient alignment verification similar to portal imaging in photon therapy. Method: Proton radiographs of a CIRS head phantom were acquired using the Beam Imaging System (BIS) (IBA, Louvain-la-Neuve) in a horizontal beamline. A scattered beam was produced using a small, dedicated range modulator (RM) wheel fabricated from aluminum. The RM wheel was rotated slowly (20 sec/rev) using a stepper motor to compensate for the frame rate of the BIS (120 ms). Dose rate functions (DRFs) over two RM wheel rotations were acquired. Calibration was made with known thicknesses of homogeneous solid water. For each pixel, the time width, skewness, and kurtosis of the DRFs were computed. The time width was used to compute the object WEPL. In the heterogeneous phantom, the excess skewness and excess kurtosis (i.e., the difference from the homogeneous case) were computed and assessed for suitability for patient setup. Results: The technique allowed the simultaneous production of images that can be used for WEPL verification, showing few internal details, and excess skewness and kurtosis images that can be used for soft tissue alignment. The latter images highlight areas where range mixing has occurred, correlating with phantom heterogeneities. Conclusion: The excess skewness and kurtosis images contain details that are not visible in the WET images. These images, unique to the time-resolved proton radiographic method, could be used for patient setup according to soft tissues.
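As a rough illustration of the per-pixel statistics described above, the time width, skewness, and kurtosis of a sampled dose rate function can be computed as ordinary moments. This sketch treats the samples as an empirical distribution and uses made-up numbers; it is not the BIS processing pipeline:

```python
def moments(samples):
    """Width (standard deviation), skewness and kurtosis of a sampled
    dose-rate function, treating the samples as an empirical distribution.
    Illustrative only; the actual per-pixel DRF analysis is more involved."""
    n = len(samples)
    mean = sum(samples) / n
    sd = (sum((x - mean) ** 2 for x in samples) / n) ** 0.5
    skew = sum(((x - mean) / sd) ** 3 for x in samples) / n
    kurt = sum(((x - mean) / sd) ** 4 for x in samples) / n
    return sd, skew, kurt

# Excess skewness/kurtosis = difference from the homogeneous (calibration) case.
_, skew_homog, kurt_homog = moments([1.0, 2.0, 3.0, 4.0, 5.0])   # symmetric DRF
_, skew_mixed, kurt_mixed = moments([1.0, 2.0, 3.0, 4.0, 9.0])   # range mixing
excess_skew = skew_mixed - skew_homog
excess_kurt = kurt_mixed - kurt_homog
```

A symmetric (homogeneous) sample gives zero skewness, while the asymmetric tail introduced by range mixing produces positive excess skewness and kurtosis, mirroring the contrast the authors exploit.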
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kawai, D; Takahashi, R; Kamima, T
2015-06-15
Purpose: The accuracy of a dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may vary for lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans as described in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average±2SD) of the dose differences between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error of 2.9±3.2% than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC), and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of the inhomogeneity correction. The AAA also showed a significant variation. Thus, we must consider the difference in inhomogeneity correction as well as the dependence on the dose calculation engine.
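The confidence-limit criterion used above (CL = average ± 2SD of the percent dose differences) is straightforward to compute. The sketch below uses hypothetical per-beam differences and an assumed 5% action level, not the study's data:

```python
import statistics

def confidence_limit(percent_diffs):
    """Confidence limit in the TG-114 sense: average +/- 2*SD of the
    percent dose differences between the TPS and the independent check."""
    mean = statistics.mean(percent_diffs)
    two_sd = 2 * statistics.stdev(percent_diffs)
    return mean, two_sd

def flag_beams(percent_diffs, action_level):
    """Indices of beams whose |difference| exceeds the chosen action level."""
    return [i for i, d in enumerate(percent_diffs) if abs(d) > action_level]

# Hypothetical per-beam % differences; not data from the study.
diffs = [1.2, -0.8, 3.1, 0.4, -6.5, 2.0]
mean, two_sd = confidence_limit(diffs)
outliers = flag_beams(diffs, action_level=5.0)   # -> [4]
```

Beams flagged this way would get a manual secondary check; the CL itself is what each institute compares against its chosen tolerance level.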
SU-E-T-32: A Feasibility Study of Independent Dose Verification for IMAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamima, T; Takahashi, R; Sato, Y
2015-06-15
Purpose: To assess the feasibility of independent dose verification (Indp) for intensity modulated arc therapy (IMAT). Methods: An independent dose calculation software program (Simple MU Analysis, Triangle Products, JP) was used in this study, which can compute the radiological path length from the surface to the reference point for each control point using the patient’s CT image dataset; the MLC aperture shape was simultaneously modeled from the MLC information in the DICOM-RT plan. Dose calculation was performed using a modified Clarkson method considering MLC transmission and the dosimetric leaf gap. In this study, a retrospective analysis was conducted in which IMAT plans from 120 patients at two sites (prostate / head and neck) from four institutes were analyzed to compare the Indp to the TPS using patient CT images. In addition, an ion-chamber measurement was performed to verify the accuracy of the TPS and the Indp in a water-equivalent phantom. Results: The agreements between the Indp and the TPS (mean±1SD) were −0.8±2.4% and −1.3±3.8% for the prostate and the head and neck regions, respectively. The measurement comparison showed similar results (−0.8±1.6% and 0.1±4.6% for prostate and head and neck). The variation was larger in the head and neck because the number of segments in which the reference point was under the MLC increased, and the modified Clarkson method cannot consider the smooth falloff of the leaf penumbra. Conclusion: The independent verification program would be practical and effective as a secondary check for IMAT, with sufficient accuracy in both the measurement and the CT-based calculation. The accuracy would improve if the falloff of the leaf penumbra were considered.
Delamination Assessment Tool for Spacecraft Composite Structures
NASA Astrophysics Data System (ADS)
Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert
2012-07-01
Fortunately, only a few cases are known in which failure of spacecraft structures due to undetected damage has resulted in the loss of a spacecraft and launcher mission. However, several problems related to damage tolerance, and in particular delamination of composite materials, have been encountered during structure development of various ESA projects and qualification testing. To avoid such costly failures during development, launch, or service of spacecraft, launcher, and reusable launch vehicle structures, a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool”, which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and reusable launch vehicle (RLV) structures, addressing analytical and numerical methodologies; material, subcomponent, and component testing; and non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test, and inspection methods, and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space, and INVENT GmbH.
An Efficient Location Verification Scheme for Static Wireless Sensor Networks.
Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok
2017-01-24
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can resolve these situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and their requirements, such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.
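The core geometric idea, that a claimed location must fall inside the region covered by both the claimant and the verifier, can be sketched as a simple intersection test. This omits the protocol's message exchanges and is only a stand-in for MSRLV's actual verification logic; all coordinates and the radius are illustrative:

```python
import math

def mutually_shared(claim, verifier_pos, claimant_pos, radio_range):
    """True if the claimed location falls in the intersection of the two
    sensors' communication disks of radius `radio_range`, i.e. inside the
    mutually-shared region. A geometric stand-in for the MSRLV check."""
    return (math.dist(claim, verifier_pos) <= radio_range and
            math.dist(claim, claimant_pos) <= radio_range)

# A claim inside both disks passes; a distant (spoofed) claim fails.
ok = mutually_shared((1.0, 1.0), (0.0, 0.0), (2.0, 0.0), 2.0)    # -> True
spoofed = mutually_shared((5.0, 5.0), (0.0, 0.0), (2.0, 0.0), 2.0)  # -> False
```

A verifier that only accepts claims inside the shared region needs no dedicated hardware or third party for this part of the check, which is the overhead reduction the abstract emphasizes.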
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
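One automated check such a tool might run, sketched here on a hypothetical miniature model (the node names and the specific check are illustrative, not NASA's FFM tooling), is confirming that every failure mode propagates to at least one observable effect:

```python
def reachable(graph, start):
    """All nodes reachable from `start` in a directed failure-propagation graph."""
    seen, stack = set(), [start]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def undetectable_failures(graph, failure_modes, observables):
    """Failure modes with no propagation path to any observable effect,
    one consistency check an automated FFM verifier might perform."""
    return [f for f in failure_modes
            if not reachable(graph, f) & set(observables)]

# Hypothetical miniature model: a pump failure propagates to a pressure alarm,
# while a valve failure propagates only to an unobserved state.
ffm = {
    "pump_fail": ["low_flow"],
    "low_flow": ["pressure_alarm"],
    "valve_fail": ["stuck_closed"],
}
print(undetectable_failures(ffm, ["pump_fail", "valve_fail"], ["pressure_alarm"]))
# -> ['valve_fail']
```

Running such graph queries automatically replaces the error-prone manual walk through each component model that the paper describes.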
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report, entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification, as well as tactics used in the proofs.
Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K
2009-11-06
N,N-Dialkylamino alcohols, N-methyldiethanolamine, N-ethyldiethanolamine, and triethanolamine are the precursors of VX-type nerve agents and three different nitrogen mustards, respectively. Their detection and identification are of paramount importance for verification analysis under the Chemical Weapons Convention. GC-FTIR is used as a complementary technique to GC-MS analysis for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents bearing trimethylsilyl, trifluoroacyl, and heptafluorobutyryl groups on an imidazole moiety were screened. The derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity in GC-FTIR detection; 60-125-fold sensitivity enhancements were observed for the analytes by HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming a uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. A limit of detection (LOD) of 10-15 ng was achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests.
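The relative molar extinction coefficients mentioned above follow from the Beer-Lambert law; with a uniform path length and equal concentrations they reduce to ratios of absorbances. A minimal sketch with illustrative numbers (not the paper's measurements):

```python
def molar_extinction(absorbance, concentration, path_length):
    """Beer-Lambert law A = eps * c * l, rearranged to eps = A / (c * l)."""
    return absorbance / (concentration * path_length)

# With a uniform path length and equal concentrations, the relative extinction
# coefficient of a derivative vs. the underivatized analyte reduces to a ratio
# of absorbances. All values below are illustrative only.
eps_derivative = molar_extinction(0.80, 1e-4, 1.0)   # derivatized analyte
eps_parent = molar_extinction(0.01, 1e-4, 1.0)       # underivatized analyte
relative_eps = eps_derivative / eps_parent
print(round(relative_eps))   # -> 80
```

A ratio in this range is what would correspond to the tens-of-fold sensitivity gains the authors report after HFBI derivatization.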
LLCEDATA and LLCECALC for Windows version 1.0, Volume 3: Software verification and validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFadden, J.G.
1998-09-04
LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of databases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank from which it originates. LLCECALC reads the EDF and the gamma assay file (AV2) that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, which discusses system limitations and provides recommendations for the LLCE process. Volume 3 documents the verification and validation of LLCEDATA and LLCECALC. Two of the three installation test cases from Volume 1 are independently confirmed. Databases used in LLCEDATA are verified and referenced. Both phases of LLCECALC processing, gamma assay and characterization, are extensively tested to verify that the methodology and algorithms used are correct.
Characterizing proton-activated materials to develop PET-mediated proton range verification markers
NASA Astrophysics Data System (ADS)
Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.
2016-06-01
Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.
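The paper's linear model, with visibility score as the response, can be sketched as an ordinary least-squares fit. The covariates kept here (foil volume, dose, scan time) and every number below are illustrative assumptions, not the study's data:

```python
import numpy as np

# Rows: [foil volume (mm^3), dose (Gy), PET scan time (min)]; illustrative.
X = np.array([[10, 1, 20],
              [20, 2, 20],
              [30, 3, 30],
              [50, 5, 40],
              [40, 2, 30]], dtype=float)
y = np.array([1.2, 2.0, 3.0, 4.8, 3.2])   # visibility scores (1-5 scale)

A = np.column_stack([np.ones(len(X)), X])       # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares coefficients

def predict(volume, dose, scan_time):
    """Predicted visibility score under the fitted linear model."""
    return float(coef @ [1.0, volume, dose, scan_time])

# Smallest foil volume reaching adequate visibility (score >= 3) at a fixed
# 5 Gy dose and 20 min scan time, scanning candidate volumes:
adequate = [v for v in range(10, 55, 5) if predict(v, 5.0, 20.0) >= 3.0]
```

Inverting the fitted model this way, solving for the volume that reaches a target score at a given dose and scan time, mirrors how the authors derive their guideline marker volumes.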
Biomechanical simulation of thorax deformation using finite element approach.
Zhang, Guangzhi; Chen, Xian; Ohgi, Junji; Miura, Toshiro; Nakamoto, Akira; Matsumura, Chikanori; Sugiura, Seiryo; Hisada, Toshiaki
2016-02-06
The biomechanical simulation of the human respiratory system is expected to be a useful tool for the diagnosis and treatment of respiratory diseases. Because the deformation of the thorax significantly influences airflow in the lungs, we focused on simulating the thorax deformation by introducing contraction of the intercostal muscles and diaphragm, which are the main muscles responsible for the thorax deformation during breathing. We constructed a finite element model of the thorax, including the rib cage, intercostal muscles, and diaphragm. To reproduce the muscle contractions, we introduced the Hill-type transversely isotropic hyperelastic continuum skeletal muscle model, which allows the intercostal muscles and diaphragm to contract along the direction of the fibres with clinically measurable muscle activation and active force-length relationship. The anatomical fibre orientations of the intercostal muscles and diaphragm were introduced. Thorax deformation consists of movements of the ribs and diaphragm. By activating muscles, we were able to reproduce the pump-handle and bucket-handle motions for the ribs and the clinically observed motion for the diaphragm. In order to confirm the effectiveness of this approach, we simulated the thorax deformation during normal quiet breathing and compared the results with four-dimensional computed tomography (4D-CT) images for verification. Thorax deformation can be simulated by modelling the respiratory muscles according to continuum mechanics and by introducing muscle contractions. The reproduction of representative motions of the ribs and diaphragm and the comparison of the thorax deformations during normal quiet breathing with 4D-CT images demonstrated the effectiveness of the proposed approach. This work may provide a platform for establishing a computational mechanics model of the human respiratory system.
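A Hill-type active force-length relationship of the kind introduced above can be sketched as follows; the Gaussian bell shape, the width parameter, and the normalization are common modeling choices and are not taken from the paper:

```python
import math

def active_force(length, activation, l_opt=1.0, width=0.45, f_max=1.0):
    """Hill-type active muscle force: maximal isometric force scaled by the
    activation level and a bell-shaped force-length factor. The Gaussian
    shape and the width value are generic modeling choices, not the paper's
    constitutive parameters."""
    fl = math.exp(-(((length / l_opt) - 1.0) / width) ** 2)
    return f_max * activation * fl

print(active_force(1.0, 1.0))   # peak force at optimal fiber length -> 1.0
print(active_force(1.3, 1.0) < active_force(1.0, 1.0))   # falls off -> True
```

Scaling this active term by a clinically measured activation signal, and adding it along the fiber direction to a passive hyperelastic response, is the general pattern behind the continuum skeletal muscle model the authors use for the intercostals and diaphragm.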
NASA Astrophysics Data System (ADS)
Everett, Dominique Tresten
Environmental pollution has increased dramatically since the industrial revolution as technological advances have introduced innovative materials; through the manufacturing and fabrication processes of modern technology, society has been exposed to contamination from production byproducts. Addressing this worldwide issue requires greater effort in order to minimize potentially detrimental public health effects and improve environmental preservation. This research study contributes to efforts to minimize wastewater pollution through the fabrication and characterization of a complex porosity-gradient fibrous membrane that purifies via particle size exclusion, photocatalysis, and physical adsorption. The membrane consists of a nano/micro-fibrous composite network fabricated by side-by-side electrospinning, the initial aim of this study. The experimental setup yielded a novel morphological structure with exceptional catalytic responsiveness in visible light compared to conventional materials currently in use. Subsequently, a thermally bonded discontinuous polymeric microfibrous mat incorporating activated carbon granules serves as a superior mechanical stability agent with high physical adsorption capability. The second aim was to investigate the dependence of mechano-morphological properties on fiber length while retaining adequate activated carbon during processing when the resulting mat was subjected to post-fabrication thermal bonding. The third aim was to fabricate the complex construct by combining the methods of the first two aims to assemble a system that filters through two water purification mechanisms (photocatalysis and physical adsorption) simultaneously.
Characterization and verification covered morphological analyses, crystallographic assessments, and mechanical testing, while construct functionality was defined by examining adsorption and photodegradation performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.
This thesis is the culminating project for my participation in the OECD NEA International School of Nuclear Law. This paper will begin by providing a historical background to current disarmament and denuclearization treaties. This paper will discuss the current legal framework based on current and historical activities related to denuclearization and nuclear disarmament. Then, it will propose paths forward for future efforts and describe the necessary legal considerations. Each treaty or agreement will be examined with respect to its requirements for: 1) limitations and implementation; and 2) verification and monitoring. Then, lessons learned in each of the two areas (limitations and verification) will be used to construct a proposed path forward at the end of this paper.
Effects of computerized prescriber order entry on pharmacy order-processing time.
Wietholter, Jon; Sitterson, Susan; Allison, Steven
2009-08-01
The effect of computerized prescriber order entry (CPOE) on the efficiency of medication-order-processing time was evaluated. This study was conducted at a 761-bed, tertiary care hospital. A total of 2988 medication orders were collected and analyzed before (n = 1488) and after CPOE implementation (n = 1500). Data analyzed included the time the prescriber ordered the medication, the time the pharmacy received the order, and the time the order was completed by a pharmacist. The mean order-processing time before CPOE implementation was 115 minutes from prescriber composition to pharmacist verification. After CPOE implementation, the mean order-processing time was reduced to 3 minutes (p < 0.0001). The time that an order was received by the pharmacy to the time it was verified by a pharmacist was reduced from 31 minutes before CPOE implementation to 3 minutes after CPOE implementation (p < 0.0001). The implementation of CPOE reduced the order-processing time (from order composition to verification) by 97%. Additionally, pharmacy-specific order-processing time (from order receipt in the pharmacy to pharmacist verification) was reduced by 90%. This reduction in order-processing time improves patient care by shortening the interval between physician prescribing and medication availability and may allow pharmacists to explore opportunities for enhanced clinical activities that will further positively impact patient care. CPOE implementation reduced the mean pharmacy order-processing time from composition to verification by 97%. After CPOE implementation, a new medication order was verified as appropriate by a pharmacist in three minutes, on average.
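The reported percentage reductions follow directly from the abstract's mean times; a quick arithmetic check:

```python
# Reproducing the reported reductions from the abstract's mean
# order-processing times (in minutes).
def pct_reduction(before, after):
    return round(100 * (before - after) / before)

pct_reduction(115, 3)  # composition-to-verification: 97 (%)
pct_reduction(31, 3)   # pharmacy receipt-to-verification: 90 (%)
```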
Reactive system verification case study: Fault-tolerant transputer communication
NASA Technical Reports Server (NTRS)
Crane, D. Francis; Hamory, Philip J.
1993-01-01
A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.
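The decoupler idea, a buffer process introduced between communicating processes so that system behavior stays within specification, can be caricatured outside any formal framework; the sketch below is plain Python with invented names, not the paper's Ostroff-framework model or actual transputer code.

```python
import queue
import threading

# Illustrative sketch only (not the paper's formal model): a "decoupler"
# process sits between a producer and a consumer link and buffers messages,
# constraining system behaviour like a small discrete-event controller so
# that neither side blocks the other indefinitely.
def decoupler(inbox, outbox, n_messages):
    for _ in range(n_messages):
        outbox.put(inbox.get())  # forward each message in arrival order

inbox = queue.Queue()
outbox = queue.Queue(maxsize=4)  # bounded buffer: the behavioural constraint
worker = threading.Thread(target=decoupler, args=(inbox, outbox, 3))
worker.start()
for msg in ("a", "b", "c"):
    inbox.put(msg)
worker.join()
received = [outbox.get() for _ in range(3)]  # arrival order is preserved
```

A formal verification framework would state the corresponding specification (e.g., "every sent message is eventually delivered, in order") in a temporal logic and discharge it with decision procedures rather than by testing.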
7 CFR 272.8 - State income and eligibility verification system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...
24 CFR 985.3 - Indicators, HUD verification methods and ratings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...
78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...
30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?
Code of Federal Regulations, 2010 CFR
2010-07-01
... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
10 CFR 9.54 - Verification of identity of individuals making requests.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Verification of identity of individuals making requests. 9... About Them § 9.54 Verification of identity of individuals making requests. (a) Identification... respecting records about himself, except that no verification of identity shall be required if the records...
Verification test report on a solar heating and hot water system
NASA Technical Reports Server (NTRS)
1978-01-01
Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performances, the efficiencies, and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.
46 CFR 61.40-3 - Design verification testing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...
Working Memory Mechanism in Proportional Quantifier Verification
ERIC Educational Resources Information Center
Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria
2014-01-01
The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
Implication of forage particle length on chewing activities and milk production in dairy goats.
Lu, C D
1987-07-01
Twenty-four primiparous Alpine does fed a high-concentrate ration were utilized to study the effect of forage particle length on chewing activity, ruminal components, and milk composition. Treatments were Bermudagrass hay with mean particle lengths of 2.38 and 3.87 mm. Forage particle length was determined with an oscillating-screen particle separator. Feeding forage with a 3.87-mm mean particle length to lactating dairy goats resulted in higher total chewing and rumination times, slightly higher milk fat content, and higher fat-corrected milk production. Results from this experiment support the hypothesis that forage particle length affects chewing activities and the production of milk fat precursors in the rumen, and alters milk fat content and the output of fat-corrected milk. Forage particle length appears to be an important index of forage quality, and a quantitative approach to establishing a system relating forage particle length to milk production in dairy goats could be feasible.
Wu, Yu-Tzu; Luben, Robert; Wareham, Nicholas; Griffin, Simon; Jones, Andy P
2017-01-01
A wide range of environmental factors have been related to active ageing, but few studies have explored the impact of weather and day length on physical activity in older adults. We investigate the cross-sectional association between weather conditions, day length and activity in older adults using a population-based cohort in England, the European Prospective Investigation into Cancer and Nutrition (EPIC) Norfolk study. Physical activity was measured objectively over 7 days using an accelerometer and this was used to calculate daily total physical activity (counts per minute), daily minutes of sedentary behaviour and light, moderate and vigorous physical activity (LMVPA). Day length and two types of weather conditions, precipitation and temperature, were obtained from a local weather station. The association between these variables and physical activity was examined by multilevel first-order autoregressive modelling. After adjusting for individual factors, short day length and poor weather conditions, including high precipitation and low temperatures, were associated with up to 10% lower average physical activity (p<0.01) and 8 minutes less time spent in LMVPA but 15 minutes more sedentary time, compared to the best conditions. Day length and weather conditions appear to be important factors related to active ageing. Future work should focus on developing potential interventions to reduce their impact on physical activity behaviours in older adults.
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
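The Method of Manufactured Solutions mentioned above can be illustrated on a far simpler problem than the LAVA solver's equations; the sketch below assumes a 1-D Poisson problem with a second-order central-difference scheme (illustrative choices, not the paper's actual setup): pick an exact solution, derive the forcing term analytically, solve numerically, and confirm the observed order of accuracy under grid refinement.

```python
import numpy as np

# MMS sketch: manufactured solution u(x) = sin(pi x) on (0,1) with u=0 at
# the boundaries, so the forcing for -u'' = f is f(x) = pi^2 sin(pi x).
def mms_error(n):
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)        # interior nodes
    f = np.pi**2 * np.sin(np.pi * x)      # manufactured source term
    # second-order central-difference operator for -u''
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))  # max-norm error vs. exact

e_coarse, e_fine = mms_error(32), mms_error(64)
# ratio of grid spacings is (1/33)/(1/65) = 65/33
observed_order = np.log(e_coarse / e_fine) / np.log(65.0 / 33.0)
```

For a correctly implemented second-order scheme the observed order approaches 2, which is exactly the kind of automated check the paper folds into continuous integration.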
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-19
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0406] Proposed Information Collection... any VA-guaranteed loans on an automatic basis. DATES: Written comments and recommendations on the... written comments on the collection of information through the Federal Docket Management System (FDMS) at...
78 FR 67204 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... action to submit an information collection request to the Office of Management and Budget (OMB) and... Verification System (LVS) has been developed, providing an electronic method for fulfilling this requirement... publicly available documents, including the draft supporting statement, at the NRC's Public Document Room...
78 FR 66365 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF Data Report, the SSP-MOE Data Report, the Caseload Reduction Documentation Process, and the Reasonable...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... Enterprise, Department of Veterans Affairs. ACTION: Notice. SUMMARY: The Center for Veterans Enterprise (CVE... veterans owned businesses. DATES: Written comments and recommendations on the proposed collection of... online through the Federal Docket Management System (FDMS) at http://www.Regulations.gov . FOR FURTHER...
77 FR 6094 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-07
....-International Atomic Energy Agency Additional Protocol. Under the U.S.-International Atomic Energy Agency (IAEA...-related activities to the IAEA and potentially provide access to IAEA inspectors for verification purposes. The U.S.-IAEA Additional Protocol permits the United States unilaterally to declare exclusions from...
Code of Federal Regulations, 2010 CFR
2010-07-01
... confidentiality of, Statements of Account, Verification Auditor's Reports, and other verification information... GENERAL PROVISIONS § 201.29 Access to, and confidentiality of, Statements of Account, Verification Auditor... Account, including the Primary Auditor's Reports, filed under 17 U.S.C. 1003(c) and access to a Verifying...
Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.
1987-06-01
... depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat ... signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold.
Code of Federal Regulations, 2011 CFR
2011-10-01
...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...
Code of Federal Regulations, 2013 CFR
2013-10-01
...; verification of identity of individuals making requests; accompanying persons; and procedures for... Procedures and Requirements § 802.7 Requests: How, where, and when presented; verification of identity of... which the record is contained. (d) Verification of identity of requester. (1) For written requests, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: 3206-0215, Verification of Full-Time School...) 3206-0215, Verification of Full-Time School Attendance. As required by the Paperwork Reduction Act of... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: RI 25-49, Verification of Full-Time School...
25 CFR 61.8 - Verification forms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult... 25 Indians 1 2010-04-01 2010-04-01 false Verification forms. 61.8 Section 61.8 Indians BUREAU OF...
Turkoglu, Ahu N; Huijing, Peter A; Yucesoy, Can A
2014-05-07
Recent experiments involving muscle force measurements over a range of muscle lengths show that the effects of botulinum toxin (BTX) are complex; e.g., force reduction varies as a function of muscle length. We hypothesized that altered conditions of sarcomeres within the active parts of partially paralyzed muscle are responsible for this effect. Using finite element modeling, the aim was to test this hypothesis and to study principles of how partial activation as a consequence of BTX affects muscle mechanics. To model the paralyzing effect of BTX, only 50% of the fascicles (most proximal, middle, or most distal) of the modeled muscle were activated. For all muscle lengths, the vast majority of sarcomeres in these BTX cases were at greater lengths than identical sarcomeres of the BTX-free muscle. Due to this "longer sarcomere effect", activated muscle parts show an enhanced potential for active force exertion (up to 14.5%). Therefore, a muscle force reduction originating exclusively from the paralyzed muscle fiber populations is compromised by the changes in active sarcomeres, leading to a smaller net force reduction. Moreover, this "compromise to force reduction" varies as a function of muscle length and is a key determinant of the muscle length dependence of the force reduction caused by BTX. Due to the longer sarcomere effect, muscle optimum length tends to shift to a lower muscle length. Muscle fiber-extracellular matrix interactions occurring via their mutual connections along full peripheral fiber lengths (i.e., myofascial force transmission) are central to these effects. Our results may help improve our understanding of the mechanisms by which the toxin secondarily affects the muscle mechanically.
Sivanesam, Kalkena; Shu, Irene; Huggins, Kelly N. L.; Tatarek-Nossol, Marianna; Kapurniotu, Aphrodite; Andersen, Niels H.
2016-01-01
Versions of a previously discovered β-hairpin peptide inhibitor of IAPP aggregation that are stabilized in that conformation, or even forced to remain in the hairpin conformation by a backbone cyclization constraint, display superior activity as inhibitors. The cyclized hairpin, cyclo-WW2, displays inhibitory activity at sub-stoichiometric concentrations relative to this amyloidogenic peptide. The hairpin binding hypothesis stands confirmed. PMID:27317951
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
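Two of the criteria the survey reports most widely, verification (reference) limits and the delta check, can be sketched as a toy rule set; the test names, limits, and dispositions below are hypothetical illustrations, not values from the survey.

```python
# Hypothetical autoverification rules (all numbers illustrative): a result
# is auto-released only if it falls within verification limits AND passes a
# delta check against the patient's previous value.
VERIFICATION_LIMITS = {"glucose": (40, 500), "potassium": (2.0, 7.0)}
DELTA_LIMITS = {"glucose": 100, "potassium": 1.5}  # max change vs. prior result

def autoverify(test, value, previous=None):
    lo, hi = VERIFICATION_LIMITS[test]
    if not (lo <= value <= hi):
        return "manual review"          # outside verification limits
    if previous is not None and abs(value - previous) > DELTA_LIMITS[test]:
        return "manual review"          # delta check failed
    return "auto-released"

autoverify("potassium", 4.1, previous=4.0)  # "auto-released"
autoverify("potassium", 6.5, previous=4.0)  # delta check fails -> "manual review"
```

A production system layers the survey's other criteria (instrument flags, sample deterioration, concordance between parameters) on top of this same pass/fail pattern.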
Piezoelectric pushers for active vibration control of rotating machinery
NASA Technical Reports Server (NTRS)
Palazzolo, Alan B.; Kascak, Albert F.
1988-01-01
The active control of rotordynamic vibrations and stability by magnetic bearings and electromagnetic shakers have been discussed extensively in the literature. These devices, though effective, are usually large in volume and add significant weight to the stator. The use of piezoelectric pushers may provide similar degrees of effectiveness in light, compact packages. Tests are currently being conducted with piezoelectric pusher-based active vibration control. Results from tests performed on NASA test rigs as preliminary verification of the related theory are presented.
Piezoelectric pushers for active vibration control of rotating machinery
NASA Technical Reports Server (NTRS)
Palazzolo, A. B.; Lin, R. R.; Alexander, R. M.; Kascak, A. F.; Montague, J.
1989-01-01
The active control of rotordynamic vibrations and stability by magnetic bearings and electromagnetic shakers have been discussed extensively in the literature. These devices, though effective, are usually large in volume and add significant weight to the stator. The use of piezoelectric pushers may provide similar degrees of effectiveness in light, compact packages. Tests are currently being conducted with piezoelectric pusher-based active vibration control. Results from tests performed on NASA test rigs as preliminary verification of the related theory are presented.
Install active/passive neutron examination and assay (APNEA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1996-04-01
This document describes activities pertinent to the installation of the prototype Active/Passive Neutron Examination and Assay (APNEA) system built in Area 336 into its specially designed trailer. It also documents the basic theory of operation, design and protective features, basic personnel training, and the proposed characterization site location at Lockheed Martin Specialty Components, Inc., (Specialty Components) with the estimated 10 mrem/year boundary. Additionally, the document includes the Preventive Change Analysis (PCA) form, and a checklist of items for verification prior to unrestricted system use.
CFD Aerothermodynamic Characterization Of The IXV Hypersonic Vehicle
NASA Astrophysics Data System (ADS)
Roncioni, P.; Ranuzzi, G.; Marini, M.; Battista, F.; Rufolo, G. C.
2011-05-01
In this paper, in the framework of the ESA technical assistance activities for the IXV project, the numerical activities carried out by ASI/CIRA to support the development of aerodynamic and aerothermodynamic databases, independent of those developed by the IXV industrial consortium, are reported. A general characterization of the IXV aerothermodynamic environment has also been provided for cross-checking and verification purposes. The work covers the first-year activities of the Technical Assistance Contract between the Italian Space Agency/CIRA and ESA.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks, and at the end we check that the scribe frame design conforms to the alignment- and inspection-mark specifications. Recently, in COT (customer-owned tooling) business or new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. We therefore tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and other tools, and a mark library is created for pattern matching. Next, DRC verification, which includes pattern matching using the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by using pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and that the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
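The rule-creation and pattern-matching flow described above can be caricatured in a few lines; everything in the sketch below (mark names, dimensions, frame width, spacing rule) is invented for illustration and is not the paper's actual mark library or rule deck.

```python
# Hypothetical sketch of rule-based scribe-frame checking: each mark is
# matched against a library entry (the pattern-matching step), then checked
# against simple geometric design rules (the DRC step).
MARK_LIBRARY = {"ALIGN_X": (8.0, 8.0), "INSPECT_A": (4.0, 4.0)}  # name -> (w, h) in um

def check_scribe_marks(marks, frame_width=80.0, min_spacing=2.0):
    violations = []
    for name, x, y in marks:
        if name not in MARK_LIBRARY:                 # unknown pattern
            violations.append((name, "unknown mark"))
            continue
        w, h = MARK_LIBRARY[name]
        if x < 0 or x + w > frame_width:             # placement rule
            violations.append((name, "outside scribe frame"))
    # spacing rule between consecutive known marks (simplified to 1-D)
    xs = sorted((x, name) for name, x, y in marks if name in MARK_LIBRARY)
    for (x1, n1), (x2, n2) in zip(xs, xs[1:]):
        if x2 - (x1 + MARK_LIBRARY[n1][0]) < min_spacing:
            violations.append((n2, "spacing violation"))
    return violations
```

An empty result means the frame passes; a real deck would express these constraints in a DRC tool's rule language and run them over the full layout database in parallel.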
Gain-assisted broadband ring cavity enhanced spectroscopy
NASA Astrophysics Data System (ADS)
Selim, Mahmoud A.; Adib, George A.; Sabry, Yasser M.; Khalil, Diaa
2017-02-01
Incoherent broadband cavity-enhanced spectroscopy can significantly increase the effective path length of light-matter interaction to detect weak absorption lines over a broad spectral range, for instance to detect gases in confined environments. Broadband cavity enhancement can be based on the decay-time or the intensity-drop technique. Decay-time measurement requires a tunable laser source, which is expensive and suffers from long scan times. Intensity-dependent measurement is usually reported with a broadband source and a Fabry-Perot cavity, enabling short measurement times but suffering from the alignment tolerance of the cavity and the cavity insertion loss. In this work we overcome these challenges by using an alignment-free ring cavity made of an optical fiber loop and a directional coupler, with a gain medium pumped below the lasing threshold to improve the finesse and reduce the insertion loss. Acetylene (C2H2) gas absorption is measured around the 1535 nm wavelength using a semiconductor optical amplifier (SOA) gain medium. The system is analyzed for different ring-resonator forward coupling coefficients and losses, including the insertion loss of the 3-cm-long gas cell and the fiber connector losses used in the experimental verification. The experimental results are obtained for a coupler ratio of 90/10 and a fiber length of 4 m. The broadband source is the amplified spontaneous emission of another SOA, and the output is measured using an optical spectrum analyzer with 70 pm resolution. The absorption depth and the effective interaction length are improved by about an order of magnitude compared to the direct absorption of the gas cell. The presented technique provides an engineering method to improve the finesse and, consequently, the effective length, while relaxing the technological constraints on high-reflectivity mirrors and free-space cavity alignment.
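The way below-threshold gain raises the finesse can be sketched with the textbook all-pass ring-resonator relation F = pi*sqrt(x)/(1 - x), where x is the round-trip amplitude factor; the decomposition of x into a coupler through-transmission t, a passive loss a, and a gain g, and all the numbers below, are illustrative assumptions, not the paper's measured values.

```python
import math

# Illustrative ring-cavity finesse estimate (assumed numbers, not measured
# values): x = t * a * g is the round-trip amplitude factor for coupler
# through-transmission t, passive loss a, and below-threshold gain g.
def ring_finesse(t, a, g=1.0):
    x = t * a * g
    assert x < 1.0, "t*a*g must stay below the lasing threshold"
    return math.pi * math.sqrt(x) / (1.0 - x)

t = math.sqrt(0.9)                       # 90/10 coupler, as in the paper
lossy = ring_finesse(t, a=0.8)           # gas cell + connector losses, no gain
boosted = ring_finesse(t, a=0.8, g=1.2)  # SOA gain partially offsets the loss
```

As x approaches 1 the finesse, and with it the effective interaction length, grows rapidly, which is the mechanism the paper exploits while staying below threshold.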
Model development and verification for mass transport to Escherichia coli cells in a turbulent flow
NASA Astrophysics Data System (ADS)
Hondzo, Miki; Al-Homoud, Amer
2007-08-01
Theoretical studies imply that fluid motion does not significantly increase the molecular diffusive mass flux toward and away from microscopic organisms. This study presents experimental and theoretical evidence that small-scale turbulence modulates enhanced mass transport to Escherichia coli cells in a turbulent flow. Using the technique of inner region and outer region expansions, a model for dissolved oxygen and glucose uptake by E. coli was developed. The mass transport to the E. coli was modeled by the Sherwood (Sh)-Péclet (Pe) number relationship with redefined characteristic length and velocity scales. The model Sh = 1 + Pe^(1/2) + Pe agreed well with the laboratory measurements. The Péclet number that quantifies the role and function of small-scale turbulence on E. coli metabolism is defined by Pe = (?) where E_zz is the root mean square of fluid extension in the direction of local vorticity, η_K is the Kolmogorov length scale, L_c is the length scale of E. coli, and D is the molecular diffusion coefficient. An alternative formulation for the redefined Pe is given by Pe = (?) where ? = 0.5(ɛν)^(1/4) is the Kolmogorov velocity averaged over the Kolmogorov length scale, ɛ is the dissipation of turbulent kinetic energy, and ν is the kinematic viscosity of the fluid. The dissipation of turbulent kinetic energy was estimated directly from measured velocity gradients and was within the range reported for engineered and natural aquatic ecosystems. The specific growth of E. coli was up to 5 times larger in a turbulent flow in comparison to the still water controls. Dissolved oxygen and glucose uptake were enhanced with increased ɛ in the turbulent flow.
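Taking the abstract's closure Sh = 1 + Pe^(1/2) + Pe at face value, the enhancement over pure diffusion (Sh = 1) is easy to evaluate; a small sketch with illustrative Péclet numbers:

```python
# Sherwood number from the Peclet number, using the abstract's relationship
# Sh = 1 + Pe**0.5 + Pe.  Sh = 1 is the pure-diffusion limit, so Sh - 1 is the
# turbulence-driven enhancement of mass transport to the cell.

def sherwood(pe):
    if pe < 0:
        raise ValueError("Peclet number must be non-negative")
    return 1.0 + pe**0.5 + pe

# Illustrative values: even a modest Pe produces a measurable enhancement.
for pe in (0.0, 0.1, 1.0):
    print(pe, sherwood(pe))
```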
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luke, S J
2011-12-20
This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing possibly sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that span the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime.
To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology once the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach, coined Information Loss Analysis, might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.
Three-temperature plasma shock solutions with gray radiation diffusion
Johnson, Bryan M.; Klein, Richard I.
2016-04-19
Here, the effects of radiation on the structure of shocks in a fully ionized plasma are investigated by solving the steady-state fluid equations for ions, electrons, and radiation. The electrons and ions are assumed to have the same bulk velocity but separate temperatures, and the radiation is modeled with the gray diffusion approximation. Both electron and ion conduction are included, as well as ion viscosity. When the material is optically thin, three-temperature behavior occurs. When the diffusive flux of radiation is important but radiation pressure is not, two-temperature behavior occurs, with the electrons strongly coupled to the radiation. Since the radiation heats the electrons on length scales that are much longer than the electron–ion Coulomb coupling length scale, these solutions resemble radiative shock solutions rather than plasma shock solutions that neglect radiation. When radiation pressure is important, all three components are strongly coupled. Results with constant values for the transport and coupling coefficients are compared to a full numerical simulation with a good match between the two, demonstrating that steady shock solutions constitute a straightforward and comprehensive verification test methodology for multi-physics numerical algorithms.
A phenomenological continuum model for force-driven nano-channel liquid flows
NASA Astrophysics Data System (ADS)
Ghorbanian, Jafar; Celebi, Alper T.; Beskok, Ali
2016-11-01
A phenomenological continuum model is developed using systematic molecular dynamics (MD) simulations of force-driven liquid argon flows confined in gold nano-channels at a fixed thermodynamic state. Well-known density layering near the walls leads to the definition of an effective channel height and a density deficit parameter. While the former defines the slip plane, the latter relates the channel-averaged density to the desired thermodynamic state value. Defining these new parameters requires a single MD simulation, performed for a specific liquid-solid pair at the desired thermodynamic state and used to calibrate the model parameters. Combined with our observations of constant slip length and kinematic viscosity, the model accurately predicts the velocity distribution and the volumetric and mass flow rates for force-driven liquid flows in nano-channels of different heights. The model is verified for liquid argon flow at distinct thermodynamic states and using various argon-gold interaction strengths. Further verification is performed for water flow in silica and gold nano-channels, exhibiting slip lengths of 1.2 nm and 15.5 nm, respectively. Excellent agreement between the model and the MD simulations is reported for channel heights as small as 3 nm for various liquid-solid pairs.
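The abstract does not give the model's closed form, so as a stand-in, the classical slip-corrected plane-Poiseuille solution illustrates how a constant slip length b amplifies flow in a channel of effective height h. The formula, parameter values, and function names below are illustrative assumptions; only the slip lengths (1.2 nm for silica, 15.5 nm for gold) come from the abstract:

```python
# Slip-corrected plane Poiseuille flow (a standard stand-in, not the paper's model).
# Body-force-driven liquid between parallel walls a distance h apart, with a
# Navier slip length b at each wall:
#   u(z) = (f / (2*nu)) * (z*(h - z) + b*h)
#   Q    = f * h**3 / (12*nu) * (1 + 6*b/h)   # flow rate per unit width

def flow_rate_per_width(f, nu, h, b):
    return f * h**3 / (12.0 * nu) * (1.0 + 6.0 * b / h)

h = 3e-9             # channel height (m), the smallest case in the abstract
f, nu = 1e12, 1e-6   # illustrative body force per unit mass and kinematic viscosity

# Slip lengths from the abstract: silica 1.2 nm, gold 15.5 nm.
q_silica = flow_rate_per_width(f, nu, h, b=1.2e-9)
q_gold = flow_rate_per_width(f, nu, h, b=15.5e-9)
print(q_gold / q_silica)
```

The 6b/h term shows why slip dominates at this scale: with h = 3 nm, the gold wall's 15.5 nm slip length multiplies the no-slip flow rate roughly thirty-fold.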
Perspectives on continuum flow models for force-driven nano-channel liquid flows
NASA Astrophysics Data System (ADS)
Beskok, Ali; Ghorbanian, Jafar; Celebi, Alper
2017-11-01
A phenomenological continuum model is developed using systematic molecular dynamics (MD) simulations of force-driven liquid argon flows confined in gold nano-channels at a fixed thermodynamic state. Well-known density layering near the walls leads to the definition of an effective channel height and a density deficit parameter. While the former defines the slip plane, the latter relates the channel-averaged density to the desired thermodynamic state value. Defining these new parameters requires a single MD simulation, performed for a specific liquid-solid pair at the desired thermodynamic state and used to calibrate the model parameters. Combined with our observations of constant slip length and kinematic viscosity, the model accurately predicts the velocity distribution and the volumetric and mass flow rates for force-driven liquid flows in nano-channels of different heights. The model is verified for liquid argon flow at distinct thermodynamic states and using various argon-gold interaction strengths. Further verification is performed for water flow in silica and gold nano-channels, exhibiting slip lengths of 1.2 nm and 15.5 nm, respectively. Excellent agreement between the model and the MD simulations is reported for channel heights as small as 3 nm for various liquid-solid pairs.
Study on numerical simulation of asymmetric structure aluminum profile extrusion based on ALE method
NASA Astrophysics Data System (ADS)
Chen, Kun; Qu, Yuan; Ding, Siyi; Liu, Changhui; Yang, Fuyong
2018-05-01
Using the HyperXtrude module, based on the Arbitrary Lagrangian-Eulerian (ALE) finite element method, the paper successfully simulates the steady extrusion process of an asymmetric-structure aluminum die. A verification experiment was carried out to check the simulation results. Analysis of the stress-strain field, temperature field, and extrusion velocity of the metal confirms that the simulation predictions and the experimental results are consistent. Schemes for die correction and optimization are discussed last: by adjusting the bearing length and core thickness, and by adopting a feeder-plate protection structure, a short shunt bridge in the upper die, and a three-level bonding container in the lower die to control the metal flow, a qualified aluminum profile can be obtained.
NASA Technical Reports Server (NTRS)
Kellogg, E.; Brissenden, R.; Flanagan, K.; Freeman, M.; Hughes, J.; Jones, M.; Ljungberg, M.; Mckinnon, P.; Podgorski, W.; Schwartz, D.
1992-01-01
Advanced X-ray Astrophysics Facility (AXAF) X-ray optics testing is conducted with VETA-I, which consists of six nested Wolter type I grazing-incidence mirrors; VETA's X-ray Detection System (VXDS) in turn measures the imaging properties of VETA-I, yielding the FWHM and encircled energy of the X-ray image obtained, as well as its effective area. VXDS contains a high-resolution microchannel-plate imaging X-ray detector and a pinhole scanning system in front of proportional-counter detectors. VETA-I's X-ray optics depart from the AXAF flight configuration in that they use a temporary holding fixture; the mirror elements are not cut to final length and are not coated with the metal film used to maximize high-energy reflection.
CMM Interim Check Design of Experiments (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montano, Joshua Daniel
2015-07-29
Coordinate Measuring Machines (CMMs) are widely used in industry, throughout the Nuclear Weapons Complex, and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length and include a weekly interim check to reduce risk. The CMM interim check makes use of Renishaw's Machine Checking Gauge, an off-the-shelf product that simulates a large sphere within a CMM's measurement volume and allows for error estimation. As verification of the interim check process, a design-of-experiments investigation was proposed to test two key factors (location and inspector). The results from the two-factor factorial experiment showed that location influenced the results more than the inspector or the interaction.
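The two-factor factorial analysis mentioned above can be sketched with standard 2^2 contrast arithmetic; the replicate data below are invented for illustration and merely mimic the reported finding that location dominates:

```python
# Two-factor factorial effect estimation (sketch; the data are invented, not LANL's).
# Factors: probe location (A) and inspector (B), each at two coded levels -1/+1.
# A 2^2 effect is the contrast-weighted sum of cell means divided by 2^(k-1) = 2.

import statistics

# results[(a, b)] -> replicate interim-check errors (micrometres), illustrative
results = {
    (-1, -1): [1.9, 2.1], (+1, -1): [3.0, 3.2],
    (-1, +1): [2.0, 2.2], (+1, +1): [3.1, 2.9],
}

def effect(contrast):
    """Estimate an effect from its +/-1 contrast function of the coded levels."""
    return sum(contrast(a, b) * statistics.mean(y)
               for (a, b), y in results.items()) / 2.0

location = effect(lambda a, b: a)
inspector = effect(lambda a, b: b)
interaction = effect(lambda a, b: a * b)
print(location, inspector, interaction)
```

With these made-up numbers the location effect is an order of magnitude larger than the inspector and interaction effects, the same qualitative conclusion the abstract reports.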
NASA Astrophysics Data System (ADS)
Valenziano, L.; Gregorio, A.; Butler, R. C.; Amiaux, J.; Bonoli, C.; Bortoletto, F.; Burigana, C.; Corcione, L.; Ealet, A.; Frailis, M.; Jahnke, K.; Ligori, S.; Maiorano, E.; Morgante, G.; Nicastro, L.; Pasian, F.; Riva, M.; Scaramella, R.; Schiavone, F.; Tavagnacco, D.; Toledo-Moreo, R.; Trifoglio, M.; Zacchei, A.; Zerbi, F. M.; Maciaszek, T.
2012-09-01
Euclid is a future ESA mission mainly devoted to cosmology. Like WMAP and Planck, it is a survey mission, to be launched in 2019 and injected into an orbit far from the Earth, with a nominal lifetime of 7 years. Euclid carries two instruments, the Visible Imager (VIS) and the Near-Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active thermal control, and a high-performance Data Processing Unit, and requires periodic in-flight calibrations and monitoring of instrument parameters. To fully exploit the capability of the NISP, careful control of systematic effects is required. From previous experiments we have built the concept of an integrated instrument development and verification approach, in which the scientific, instrument, and ground-segment expertise interact strongly from the early phases of the project. In particular, we discuss the tight integration of test and calibration activities with the Ground Segment, starting from early pre-launch verification activities. We report here the expertise acquired by the Euclid team in previous missions, citing the literature for detailed reference, and indicate how it is applied in the framework of the Euclid mission.
Project W-314 specific test and evaluation plan for transfer line SN-633 (241-AX-B to 241-AY-02A)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hays, W.H.
1998-03-20
The purpose of this Specific Test and Evaluation Plan (STEP) is to provide a detailed written plan for the systematic testing of modifications made by the addition of the SN-633 transfer line by the W-314 Project. The STEP develops the outline for test procedures that verify the system's performance against the established Project design criteria. The STEP is a lower-tier document based on the W-314 Test and Evaluation Plan (TEP). This STEP encompasses all testing activities required to demonstrate compliance with the project design criteria as they relate to the addition of transfer line SN-633. The Project Design Specifications (PDS) identify the specific testing activities required for the Project. Testing includes Validations and Verifications (e.g., Commercial Grade Item Dedication activities), Factory Acceptance Tests (FATs), installation tests and inspections, Construction Acceptance Tests (CATs), Acceptance Test Procedures (ATPs), Pre-Operational Test Procedures (POTPs), and Operational Test Procedures (OTPs). It should be noted that POTPs are not required for testing of the transfer line addition. The STEP will be utilized in conjunction with the TEP for verification and validation.
Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency
NASA Technical Reports Server (NTRS)
Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey
2012-01-01
The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. The system uses a range of satellite data products and surface observations to generate the land analysis products at global 1/4 degree spatial resolution, with a model analysis generated every 3 hours. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET), a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community, and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.
The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...
49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?
Code of Federal Regulations, 2010 CFR
2010-10-01
... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...
40 CFR 1065.550 - Gas analyzer range verification and drift verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
.... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Verification of Who Is Getting Payments, RI... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments...
Automated verification of flight software. User's manual
NASA Technical Reports Server (NTRS)
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. The measurement results strongly demonstrate that these methods not only meet the precision requirements of the verification regulation but also improve the reliability and efficiency of the verification system.
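The abstract names a new pixel-equivalent calibration but does not describe it, so the sketch below shows only the generic idea: derive the mm-per-pixel scale from a known reference length, then convert pixel distances between graduation lines into length errors. All names and numbers are hypothetical:

```python
# Pixel-equivalent calibration sketch (generic idea only; names and numbers are
# hypothetical).  A reference length imaged by the camera fixes the mm-per-pixel
# scale; graduation errors are then pixel distances converted to millimetres.

def pixel_equivalent(reference_mm, measured_px):
    if measured_px <= 0:
        raise ValueError("measured pixel distance must be positive")
    return reference_mm / measured_px

def graduation_error_mm(nominal_mm, measured_px, mm_per_px):
    """Deviation of a graduation interval from its nominal length."""
    return measured_px * mm_per_px - nominal_mm

mm_per_px = pixel_equivalent(10.0, 2000.0)        # 10 mm spans 2000 px -> 5 um/px
err = graduation_error_mm(1.0, 198.5, mm_per_px)  # a 1 mm interval read as 198.5 px
print(mm_per_px, err)
```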
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... (ETA) sponsored information collection request (ICR) titled, ``Income and Eligibility Verification... this request to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-ETA..., the ETA issued a final rule regarding the Confidentiality and Disclosure of State Unemployment...
Code of Federal Regulations, 2010 CFR
2010-01-01
..., procedures, and other arrangements that control reasonably foreseeable risks to customers or to the safety... other suspicious activity related to, a covered account; and (5) Notice from customers, victims of... policies and procedures regarding identification and verification set forth in the Customer Identification...
UAS Integration in the NAS Project: Part Task 6 V & V Simulation: Primary Results
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor
2016-01-01
This is a presentation of the preliminary results on final V and V (Verification and Validation) activity of [RTCA (Radio Technical Commission for Aeronautics)] SC (Special Committee)-228 DAA (Detect and Avoid) HMI (Human-Machine Interface) requirements for display alerting and guidance.
NGA West 2 | Pacific Earthquake Engineering Research Center
A multi-year research program to improve Next Generation Attenuation (NGA) models for active tectonic regions and earthquake engineering, including modeling of directivity and directionality; verification of NGA-West models' epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors.
IT Project Success w\\7120 and 7123 NPRs to Achieve Project Success
NASA Technical Reports Server (NTRS)
Walley, Tina L.
2009-01-01
This slide presentation reviews management techniques to assure information technology development project success. Details include the work products, the work breakdown structure (WBS), system integration, verification and validation (IV&V), and deployment and operations. An example, the NASA Consolidated Active Directory (NCAD), is reviewed.
78 FR 32356 - United States-Korea Free Trade Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-30
... possesses the required origin information; (2) the consistent application of the rules of origin to UKFTA... finding of either (a) repeated unlawful activity or (b) willful presentation of inaccurate origin... that a claim of origin for a textile or apparel good is accurate) or Article 4.3.5 (verification to...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-27
... FURTHER INFORMATION CONTACT: Denise McLamb, Enterprise Records Service (005R1B), Department of Veterans... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0673] Agency Information Collection (One-VA..., Security, and Preparedness, Department of Veterans Affairs, will submit the collection of information...
Test/QA Plan for Verification of Coliform Detection Technologies for Drinking Water
The coliform detection technologies to be tested use chromogenic and fluorogenic growth media to detect coliforms and E. coli based on the enzymatic activity of these organisms. The systems consist of single-use sample containers that contain pre-measured reagents and can be u...
Environmental Technology Verification Program Quality Management Plan, Version 3.0
The ETV QMP is a document that addresses specific policies and procedures that have been established for managing quality-related activities in the ETV program. It is the “blueprint” that defines an organization’s QA policies and procedures; the criteria for and areas of QA appli...
A Comparison of the Effects of Two Instructional Sequences Involving Science Laboratory Activities.
ERIC Educational Resources Information Center
Ivins, Jerry Edward
This study attempted to determine whether students learn science concepts better when laboratories are used to verify concepts already introduced through lectures and textbooks (verification laboratories) or whether achievement and retention are improved when laboratories are used to introduce new concepts (directed discovery learning laboratories). The…
A significant challenge in environmental studies is to determine the onset and extent of MTBE bioremediation at an affected site, which may involve indirect approaches such as microcosm verification of microbial activities at a given site. Stable isotopic fractionation is cha...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services Agency Information...), U.S. Citizenship and Immigration Services (USCIS) will be submitting the following information... sponsoring the collection: Form I-9. U.S. Citizenship and Immigration Services. (4) Affected public who will...
Formal verification of software-based medical devices considering medical guidelines.
Daw, Zamira; Cleaveland, Rance; Vetter, Marcus
2014-01-01
Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to their internal elements, without considering the interaction that these elements have with other devices or with the application environment in which they are used. Medical guidelines define clinical procedures, which contain the information necessary to verify medical devices completely. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. The medical devices are developed using the model-driven method Deterministic Models for Signal Processing of Embedded Systems (DMOSES). This method uses Unified Modeling Language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows, and the functionality of the medical devices is abstracted as a set of actions modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for a navigation-guided biopsy, demonstrating the capability to identify errors and optimization points both in the workflow and in the system design of the navigation device. In addition, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are verified automatically using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be verified formally and automatically.
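The kind of workflow check described above can be illustrated in miniature as reachability analysis over a finite transition system (the untimed core of what a model checker such as UPPAAL automates); the workflow states below are hypothetical, not the paper's model:

```python
# Miniature model-checking sketch: exhaustive reachability over a finite
# transition system, the untimed core of what a tool like UPPAAL automates.
# The workflow and state names are hypothetical, for illustration only.
from collections import deque

transitions = {
    "idle":       ["registered"],
    "registered": ["navigating"],
    "navigating": ["verified", "error"],
    "verified":   ["biopsy"],
    "error":      ["idle"],      # recovery path returns to the start
    "biopsy":     [],
}

def reachable(start):
    """All states reachable from `start` via breadth-first search."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in transitions[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Liveness-style query: the biopsy state is reachable from the initial state.
print("biopsy" in reachable("idle"))
# Safety-style query: the only immediate predecessor of "biopsy" is "verified",
# so a biopsy can never begin without a successful verification step.
preds = [s for s, nxt in transitions.items() if "biopsy" in nxt]
print(preds == ["verified"])
```

Real timed automata add clocks and guards to these transitions; the exhaustive-exploration principle shown here is the same.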